QUOTE: “Medium pages achieve their purpose and have neither high nor low expertise, authoritativeness, and trustworthiness. However, Medium pages lack the characteristics that would support a higher quality rating. Occasionally, you will find a page with a mix of high and low quality characteristics. In those cases, the best page quality rating may be Medium.” Google Quality Evaluator Guidelines, 2017
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][52] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[53] although the two are not identical.
Mobile-first design has been a best practice for a while, and Google is finally about to support it with mobile-first indexing. Learn how mobile-first indexing will give digital marketers their first real swing at influencing Google’s new AI (Artificial Intelligence) landscape. Marketers who embrace an accurate understanding of mobile-first indexing could see a huge first-mover advantage, similar to the early days of the web, and we all need to be prepared.

One of the things Google looks at when ranking a page is the content on that page – the words on the page. Now picture this: if every word in, for instance, a blog post about a digital piano is used twice, then all words are of equal importance, and Google won’t have a clue which of those words are important and which aren’t. The words you’re using are clues for Google; they tell Google and other search engines what the page or post is about. So if you want to make Google understand what your page is about, you need to use your focus keyword fairly often.
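The idea that repeated words act as topical clues can be illustrated with a minimal term-frequency sketch. This is only an illustration of the principle – the sample text and the simple tokeniser are made up, and Google’s actual parsing is far more sophisticated:

```python
import re
from collections import Counter

def term_frequencies(text: str) -> Counter:
    """Count how often each word appears in a page's text."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words)

# Hypothetical blog-post snippet about a digital piano
post = (
    "Choosing a digital piano can be hard. A digital piano differs "
    "from an acoustic piano in weight, price and maintenance."
)

top = term_frequencies(post).most_common(3)
# 'piano' is the most frequent term, signalling the topic of the page
```

If every word appeared exactly twice, `most_common` would give no useful signal – which is the point the paragraph above is making.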
"I just wanted to let you know that Ben has been so great with us. I know we were picky (to say the least) before/after our new site went live, but Ben was responsive the whole time. He continues to help us out with website stuff and we really appreciate everything he has done! Also, Chris has been wonderful with SEO stuff as well. He has been very helpful with the SEO project and helping me not let things fall through the cracks. You have a great team and we have enjoyed working with them!"
Sometimes we refer to those root keywords as "vanity keywords," because if you do just one search to see who seems to be winning the space, you are likely to pick the single broadest keyword and see who comes up ranked highly. In nearly every case, however, we have found it more successful – and a significantly better return on your SEM investment – to focus on the hundreds or even thousands of more specific keywords that more closely match the services, products, brands, and locations that you sell or serve.
Good news for web designers, content managers and search engine optimisers! Google clearly states, “If the website feels inadequately updated and inadequately maintained for its purpose, the Low rating is probably warranted,” although it does stipulate, again, that it’s horses for courses: if everybody else is poor, then you’ll still fly – not that there are many of those SERPs about these days.

The biggest advantage any one provider has over another is experience and resource. The knowledge of what doesn’t work and what will hurt your site is often more valuable than knowing what will give you a short-lived boost. Getting to the top of Google is a relatively simple process – one that is constantly in flux. Professional SEO is more a collection of skills, methods and techniques. It is more a way of doing things than a one-size-fits-all magic trick.
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[29]
I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me to use files on most websites I created (back in the day). Once subfolders are trusted, it’s six of one, half a dozen of the other as to what the actual difference is in terms of ranking in Google – usually, rankings in Google are more determined by how RELEVANT or REPUTABLE a page is to a query.
What’s your audience searching for? – Just a few years ago, the average user didn’t trust search engines to understand conversational questions. They were searching with clunky phrases like “flower delivery new york.” Now people feel comfortable typing in things like “who delivers roses near me?” Changes in searcher habits are usually subtle, but will affect which keywords will be most valuable for your site. Instead of focusing on keywords that get you more traffic, focus on those that translate into conversions, revenue and profits.
OBSERVATION – You can have the content and the links – but if your site falls short on even a single user satisfaction signal (even if it is picked up by the algorithm, and not a human reviewer) then your rankings for particular terms could collapse – OR – rankings can be held back – IF Google thinks your organisation, with its resources, or ‘reputation’, should be delivering a better user experience to users.
I like the competition analysis tools; they provide paid and organic data, which gives me an idea of how to catch up with and outrank the immediate competition for my clients. They also provide data on potential traffic, which helps show clients the potential gains of the campaign. And with the marketing plan, I know what needs to be improved in order to get results for my clients.
QUOTE: “I think there is probably a misunderstanding that there’s this one site-wide number that Google keeps for all websites and that’s not the case.  We look at lots of different factors and there’s not just this one site-wide quality score that we look at. So we try to look at a variety of different signals that come together, some of them are per page, some of them are more per site, but it’s not the case where there’s one number and it comes from these five pages on your website.” John Mueller, Google

Congrats Floyd! To answer your question: a big part of the success depends on how much your content replaces the old content… or is a good fit for that page in general. In the example I gave, my CRO guide wasn’t 1:1 replacement for the dead link. But it did make sense for people to add it to their pages because they tended to be “list of CRO resources” type things. Hope that helps.
Google knows who links to you, the “quality” of those links, and whom you link to. These – and other factors – help ultimately determine where a page on your site ranks. To make it more confusing – the page that ranks on your site might not be the page you want to rank, or even the page that determines your rankings for this term. Once Google has worked out your domain authority, sometimes it seems that the most relevant page on your site that Google HAS NO ISSUE with will rank.

Google states, “News articles, Wikipedia articles, blog posts, magazine articles, forum discussions, and ratings from independent organizations can all be sources of reputation information”, but they also state specifically that boasts about a lot of internet traffic, for example, should not influence the quality rating of a web page. What should influence the reputation of a page is WHO has shared it on social media etc. rather than just raw numbers of shares. CONSIDER CREATING A PAGE with nofollow links to good reviews on other websites as proof of excellence.

Yes, you need to build links to your site to acquire more PageRank, or Google ‘juice’ – or what we now call domain authority or trust. Google is a link-based search engine – it does not quite understand ‘good’ or ‘quality’ content – but it does understand ‘popular’ content. It can also usually identify poor, or THIN CONTENT – and it penalises your site for that – or – at least – it takes away the traffic you once had with an algorithm change. Google doesn’t like calling the actions it takes a ‘penalty’ – it doesn’t look good. They blame your ranking drops on their engineers getting better at identifying quality content or links, or the inverse – low-quality content and unnatural links. If they do take action against your site for paid links – they call this a ‘Manual Action’ and you will get notified about it in Webmaster Tools if you sign up.
Critics will point out that the higher the cost of expert SEO, the more cost-effective Adwords becomes, but Adwords will only get more expensive, too. At some point, if you want to compete online, you’re going to HAVE to build a quality website, with a unique offering to satisfy returning visitors – the sooner you start, the sooner you’ll start to see results.

Experience can teach you to recognise when a page is high-quality and yet receives no traffic. If the page is thin, but is not manipulative, is indeed ‘unique’ and delivers on a purpose with little obvious detectable reason to mark it down, then you can say it is a high-quality page – just with very little search demand for it. Ignored content is not the same as ‘toxic’ content.
If you are improving user experience by focusing primarily on the quality of the MC of your pages and avoiding – even removing – old-school SEO techniques, those certainly are positive steps to getting more traffic from Google in 2019 – and the type of content performance Google rewards is, in the end, largely about a satisfying user experience.
Google WILL classify your site when it crawls and indexes your site – and this classification can have a DRASTIC effect on your rankings. It’s important for Google to work out WHAT YOUR ULTIMATE INTENT IS – do you want to be classified as a thin affiliate site made ‘just for Google’, a domain holding page or a small business website with a real purpose? Ensure you don’t confuse Google in any way by being explicit with all the signals you can – to show on your website you are a real business, and your INTENT is genuine – and even more important today – FOCUSED ON SATISFYING A VISITOR.
The higher the search volume for a given keyword or keyword phrase, the more work is typically required to achieve higher rankings. This is often referred to as keyword difficulty and occasionally incorporates SERP features; for example, if many SERP features (like featured snippets, knowledge graph, carousels, etc) are clogging up a keyword’s result page, difficulty will increase. Big brands often take up the top 10 results for high-volume keywords, so if you’re just starting out on the web and going after the same keywords, the uphill battle for ranking can take years of effort.
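One way to make the notion of keyword difficulty concrete is to sketch a toy scoring function that combines search volume, SERP-feature clutter, and big-brand presence in the top 10. The weights and thresholds below are entirely hypothetical – real tools use proprietary models built on far more data:

```python
def keyword_difficulty(search_volume: int,
                       serp_feature_count: int,
                       big_brands_in_top10: int) -> float:
    """Toy keyword-difficulty score on a 0-100 scale.

    Weights are illustrative only, not how any real SEO tool works.
    """
    # Higher search volume -> more competition (caps at 10k searches/month)
    volume_factor = min(search_volume / 10_000, 1.0) * 40   # up to 40 points
    # Featured snippets, knowledge panels, carousels etc. crowd the page
    feature_factor = min(serp_feature_count, 5) / 5 * 30    # up to 30 points
    # Big brands occupying the top 10 raise the bar further
    brand_factor = min(big_brands_in_top10, 10) / 10 * 30   # up to 30 points
    return round(volume_factor + feature_factor + brand_factor, 1)

# A high-volume keyword with a crowded, brand-heavy results page
score = keyword_difficulty(50_000, 4, 8)  # 88.0
```

The point of the sketch is the shape of the trade-off, not the numbers: volume alone does not determine difficulty, which is why a low-volume keyword on an uncluttered results page can be far easier to rank for.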
Simple navigation reigns and quality content is king – a user-friendly website, with interesting and easy-to-find information, is what will boost your traffic. Each page needs to be built around keyword themes, with unique content, so search engines can easily index yours and rank you higher. Positive behaviour from site visitors is your best bet for a better ranking, so keep the content natural and focused; avoid jargon and keyword stuffing to keep users from leaving the site unhappy and hurting its ranking.
QUOTE: “Google will now begin encrypting searches that people do by default, if they are logged into Google.com already through a secure connection. The change to SSL search also means that sites people visit after clicking on results at Google will no longer receive “referrer” data that reveals what those people searched for, except in the case of ads.”

It is important you spread all that real ‘PageRank’ – or link equity – to your sales keyword / phrase rich sales pages, while as much as possible remains with the rest of the site’s pages, so Google does not ‘demote’ pages into oblivion – or ‘supplemental results’ as we old timers knew them back in the day. Again – this is slightly old school – but it gets me by, even today.
Some pages are designed to manipulate users into clicking on certain types of links through visual design elements, such as page layout, organization, link placement, font color, images, etc. We will consider these kinds of pages to have deceptive page design. Use the Lowest rating if the page is deliberately designed to manipulate users to click on Ads, monetized links, or suspect download links with little or no effort to provide helpful MC.
One thing Google has indicated it likes to do is penalize sites or stores or companies that consistently have poor reviews, so if you have many poor reviews, in time Google is going to figure out not to show your site in their rankings because Google doesn’t want to show those sites to their searchers. So prove to Google’s algorithm that you are trustworthy. Get other highly authoritative websites to link to you. Get newspaper articles, get industry links, get other trusted sites to link to you: partners, vendors, happy customers—get them to link to your website to show that you are highly credible and trustworthy.
Submit website to directories (limited use). Professional search marketers don’t submit the URL to the major search engines, but it’s possible to do so. A better and faster way is to get links back to your site naturally. Links get your site indexed by the search engines. However, you should submit your URL to directories such as Yahoo! (paid), Business.com (paid) and DMOZ (free). Some may choose to include AdSense (google.com/adsense) scripts on a new site to get their Google Media bot to visit. It will likely get your pages indexed quickly.
Ever since its April 2015 update, Google is now taking the user's mobile experience into consideration in its search results. This means that when a user is doing a search on a mobile device, Google's search results will favor websites that are mobile friendly over the ones that aren't. If you want to capture that mobile search audience, you will need to have a mobile version of your website in addition to your desktop version.

Google loves great websites with quality content. Our design and SEO teams work together to understand your business, audience, competition and target keywords. We use responsive design or mobile first design on every site—ensuring that they look great and function well on all devices. Every web design project results in the perfect blend of user-friendliness and Google-friendliness.
If there is a single concept that is the driver of much of the Internet's growth over the past decade – not to mention nearly all of Google's annual revenue of $25 billion – it is the concept of keywords. Keywords are what we type in when we are searching for products, services, and answers on the search engines, an act that Americans performed 15.5 billion times in April 2010 according to ComScore, the web research firm.
Now, some buckets are worth more than others, and the three main buckets that you need to be aware of for search rankings are quality, trust and authority. So quality: what Google is trying to measure when they’re trying to figure out what sites should rank is whether a site offers something valuable, unique or interesting to Google’s searchers. For example: good content – if you are selling t-shirts and you are using the same description that every other t-shirt seller is using on their website, then you are not offering anything unique to Google’s searchers.
You likely have a few keywords in mind that you would like to rank for. These will be things like your products, services, or other topics your website addresses, and they are great seed keywords for your research, so start there! You can enter those keywords into a keyword research tool to discover average monthly search volume and similar keywords. We’ll get into search volume in greater depth in the next section, but during the discovery phase, it can help you determine which variations of your keywords are most popular amongst searchers.
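The discovery step above can be sketched as a tiny ranking exercise. The variant keywords and monthly search volumes below are invented for illustration – in practice these numbers come from whatever keyword research tool you use:

```python
# Hypothetical keyword-tool output: variant -> average monthly searches
variants = {
    "flower delivery": 12_000,
    "flower delivery near me": 8_100,
    "same day flower delivery": 2_900,
    "cheap flower delivery": 1_600,
}

def most_popular(volumes: dict[str, int], top_n: int = 3) -> list[str]:
    """Rank keyword variants by average monthly search volume, highest first."""
    return sorted(volumes, key=volumes.get, reverse=True)[:top_n]

shortlist = most_popular(variants)
```

Sorting by raw volume is only the discovery step; as noted earlier in this piece, the variants you ultimately target should be the ones most likely to convert, not just the ones with the biggest numbers.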