Hey Sharon, great post! Re. dwell time – I’ve read conflicting opinions: some say Google DOES treat it as an ‘important’ ranking signal, while others say it doesn’t, because dwell time can be a misleading indicator of content quality. For example, when a user searches for something specific and finds the answer immediately on the recommended page (meaning the content is actually spot on), they return to the SERPs very quickly. I haven’t been able to find any definitive statement (written or spoken) from anyone at Google suggesting that dwell time IS still a ranking factor, but it makes sense (to me, anyway) that it should be. Do you have any ‘proof’ one way or the other on whether Google definitely considers dwell time?
QUOTE: “One other specific piece of guidance we’ve offered is that low-quality content on some parts of a website can impact the whole site’s rankings, and thus removing low-quality pages, merging or improving the content of individual shallow pages into more useful pages, or moving low-quality pages to a different domain could eventually help the rankings of your higher-quality content.” GOOGLE
One of the things Google looks at when ranking a page is the content on that page – the actual words on the page. Now picture this: if every word in, for instance, a blog post about a digital piano is used exactly twice, then all words appear equally important. Google won’t have a clue which of those words matter and which don’t. The words you use are clues for Google; they tell Google and other search engines what the page or post is about. So if you want Google to understand what your page is about, you need to use your focus keyword fairly often.
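The intuition above – that the words repeated most often stand out as a page’s topic – can be sketched with a simple term-frequency count. This is only a rough illustration of the idea, not how Google actually ranks; the `top_terms` helper and the sample sentence are my own.

```python
import re
from collections import Counter

def top_terms(text, n=3):
    # Count how often each longer word appears: a crude proxy for the
    # "which words on this page matter" clue described above.
    # Words shorter than 4 letters are dropped as likely stop words.
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if len(w) >= 4]
    return Counter(words).most_common(n)

post = ("Choosing a digital piano: a digital piano with weighted keys "
        "feels closer to an acoustic piano than a basic keyboard.")
print(top_terms(post))  # "piano" dominates, so the topic is clear
```

Run against the sample sentence, “piano” (used three times) and “digital” (twice) rise to the top, which is exactly the kind of emphasis the paragraph describes.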
Great SEO is increasingly dependent on having a website with a great user experience. Making your user experience great requires carefully tracking what people do, so that you always know where to improve. But what do you track? In this 15-minute talk, I’ll cover three effective and advanced ways to use event tracking in Google Analytics to understand a website’s users.
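As a concrete example of event tracking, GA4’s Measurement Protocol accepts a JSON body containing a client ID and a list of named events. The sketch below only builds that body; the event name `scroll_depth` and its parameters are illustrative examples of my own, not standard GA events, and actually sending the payload would require a real `measurement_id` and `api_secret`.

```python
import json

def build_event_payload(client_id, name, params):
    # Assemble a GA4 Measurement Protocol request body for one event.
    # (Sketch only: posting it to Google Analytics needs credentials.)
    return {
        "client_id": client_id,
        "events": [{"name": name, "params": params}],
    }

# A hypothetical custom event: how far down the page a visitor scrolled.
payload = build_event_payload("555.1234", "scroll_depth", {"percent": 75})
print(json.dumps(payload, indent=2))
```

Custom events like this are what let you answer “where should we improve?” with data rather than guesswork.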
Let's say, for example, you're researching the keyword "how to start a blog" for an article you want to create. "Blog" can mean a blog post or the blog website itself, and the searcher's intent behind that keyword will influence the direction of your article. Does the searcher want to learn how to write an individual blog post? Or do they want to know how to actually launch a website domain for the purposes of blogging? If your content strategy is only targeting people interested in the latter, you'll need to confirm the keyword's intent before committing to it.
Search engines are expanding - When someone mentions search engines, do you automatically assume they’re talking about Google? The tech giant has such a big share of the market that 'Googling' has become a verb. However, a significant portion of searches take place on alternative sites, such as Microsoft’s Bing. Make a point to search for your site on Google alternatives to see where you rank. Just improving social media engagement and adding meta tags might be all it takes to boost you a couple ranks on Bing.
And finally, the other really important bucket is authority. Google wants to show sites that are popular. If they can show the most popular t-shirt seller to people looking to buy t-shirts online, that’s the site they want to show. So you have to convince Google - send them signals that your site is the most popular site for the kind of t-shirts that you sell.
QUOTE: “Another problem we were having was an issue with quality and this was particularly bad (we think of it as around 2008 2009 to 2011) we were getting lots of complaints about low-quality content and they were right. We were seeing the same low-quality thing but our relevance metrics kept going up and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our in our vision of the world so we thought we were doing great our numbers were saying we were doing great and we were delivering a terrible user experience and turned out we weren’t measuring what we needed to so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality it’s not the same as relevance …. and it enabled us to develop quality related signals separate from relevant signals and really improve them independently so when the metrics missed something what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story (VIDEO)
Google is looking for a “website that is well cared for and maintained”, so you need to keep content management systems updated and check for broken image links and HTML links. If you create a frustrating user experience through sloppy website maintenance, expect that to be reflected in some way with a lower quality rating. Google Panda in October 2014 went after e-commerce pages that were optimised ‘the old way’ and are now classed as ‘thin content’.
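The broken-link check mentioned above is easy to automate. Here is a minimal sketch: it pulls `href`/`src` URLs out of a page with a regex (fine for a quick audit, though a real crawler would use an HTML parser) and reports any URL whose HTTP status is 400 or higher. The `fetch_status` callable is injected so you can plug in `urllib.request` or `requests` yourself; the sample page and statuses below are made up for illustration.

```python
import re

def find_broken_links(html, fetch_status):
    # Extract link and image URLs from the HTML, then keep the ones
    # whose HTTP status code (from the injected callable) signals an error.
    urls = re.findall(r'(?:href|src)="([^"]+)"', html)
    return [u for u in urls if fetch_status(u) >= 400]

# Usage with a stubbed fetcher; in practice fetch_status could wrap a
# HEAD request via urllib.request or the requests library.
page = '<a href="/ok">fine</a><img src="/missing.png">'
statuses = {"/ok": 200, "/missing.png": 404}
print(find_broken_links(page, statuses.get))
```

Running a script like this on a schedule is one cheap way to catch the “sloppy maintenance” problems the paragraph warns about before a quality rater does.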
In short, nobody is going to advise you to create a poor UX on purpose, in light of Google’s algorithms and human quality raters, who are showing an obvious interest in this stuff. Google is rating mobile sites on what it classes as frustrating UX – although on certain levels what Google classes as ‘UX’ might be quite far apart from what a UX professional is familiar with, in the same way that Google’s mobile rating tools differ from, for instance, W3C mobile testing tools.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system that punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine. Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words. With regard to the changes this made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve these issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content from 'trusted' authors.
Great content is a critical component of everything we do. From blogging and site content to compelling social and PPC copy, Dallas SEO Dogs’ content writers improve client credibility and establish them as authorities in their business. Presenting your organization in a way that attracts and maintains interest affects search rankings, engagement, leads and conversions. That’s why content marketing is the most important aspect of our overall strategy.
The clear view of rankings and positions, the site audit tool for quick scans, and the backlink checker are very useful. I use it a lot, and I also use the lead generator to get a free scan for potential clients, which runs automatically when they fill in the form. The dashboard gives you a good view of changes in traffic and positions. The marketing plan is a bit simple, but it gives you some direction on what to do first on the website, and you can also check the boxes when you’ve finished a task, which works very well.