Google is falling into a familiar pattern. First, they offer web publishers increased visibility and SERP display options. Next, they incent participation in specific formats and data structures. Finally, they take that data for themselves, changing the SERPs to favor advertising, their own properties, and/or instant answers that can reduce publisher traffic. For web marketers, it's a prisoner's dilemma. In this presentation, Rand will show data on how Google is being used today, how it's changing, then dive into strategic initiatives and specific examples of how savvy players can build a moat to protect against long-term risk.
NOTE: in 2019, the HTML title element you choose for your page may not be what Google chooses to display in your SERP snippet. The search snippet title and description are very much QUERY & DEVICE dependent these days. Google often chooses what it thinks is the most relevant title for your search snippet, and it can use information from your page, or from links to that page, to create a very different SERP snippet title.
We’ve used other tools in the past, but SE Ranking offers more up-to-date data and information, which benefits our agency and clients. SE Ranking allows us to access historical data with just a few clicks without ever having to leave the interface. From daily ranking updates to current search volume trends, there are numerous aspects that are essential when formulating client strategies, and with SE Ranking’s continuously updated system we are able to use this data to help our clients succeed.
I do not obsess about site architecture as much as I used to… but I always ensure the pages I want indexed are all reachable from a crawl starting at the home page – and I still emphasise important pages by linking to them where relevant. I always aim to get THE most important exact match anchor text pointing to the page from internal links – but I avoid abusing internals and avoid overtly manipulative internal links that are not grammatically correct, for instance.
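That "reachable from a crawl starting at the home page" check is easy to automate. Below is a minimal sketch of a bounded internal-link crawl, assuming the requests and beautifulsoup4 libraries are installed; the start URL and page limit are placeholders, and a real audit would use a dedicated crawler.

```python
# A minimal sketch: breadth-first crawl from the home page to confirm that
# important pages are reachable via internal links. START_URL and MAX_PAGES
# are placeholders; requests and beautifulsoup4 are assumed to be installed.
from urllib.parse import urljoin, urlparse
from collections import deque

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # placeholder home page
MAX_PAGES = 200                          # keep the sketch polite and bounded

def crawl_internal(start_url, max_pages=MAX_PAGES):
    domain = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

reachable = crawl_internal(START_URL)
print(f"{len(reachable)} pages reachable from the home page")
```

Any important URL missing from the returned set is not reachable from the home page via internal links and may need a link from a relevant page.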

We’ve combined the most important SEO techniques and created an advanced platform that’ll get you noticed online. Working together and learning from each other is the way SEO should be, and our local team of Leicester-based specialists works hard to ensure this happens. Most importantly, it’s an affordable digital marketing solution for today’s busy entrepreneurs. Our unique approach to SEO allows users to take full control of their digital marketing whilst supported by an experienced team.
Place strategic search phrases on pages. Integrate selected keywords into your website source code and existing content on designated pages. Apply a suggested guideline of one to three keywords/phrases per content page and add more pages to complete the list. Ensure that related words appear as a natural complement to your keywords; this helps the search engines quickly determine what the page is about. A natural approach works best. In the past, 100 to 300 words on a page was the recommendation. Many tests show that pages with 800 to 2,000 words can outperform shorter ones. In the end, the users, the marketplace, content and links will determine the popularity and rankings.
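As a rough illustration of the word-count and keyword-placement checks above, the sketch below counts the words on a page and the occurrences of a few target phrases. The URL and phrases are hypothetical examples, and occurrence counts are a sanity check, not a ranking predictor.

```python
# A rough sketch of the checks described above: count the words on a page and
# how often target phrases appear. The URL and phrases are placeholders;
# requests and beautifulsoup4 are assumed to be installed.
import re

import requests
from bs4 import BeautifulSoup

def page_text(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style"]):   # drop non-visible text
        tag.decompose()
    return soup.get_text(" ", strip=True).lower()

def keyword_report(url, phrases):
    text = page_text(url)
    words = re.findall(r"[a-z0-9']+", text)
    print(f"{url}: {len(words)} words")
    for phrase in phrases:
        print(f"  '{phrase}': {text.count(phrase.lower())} occurrences")

keyword_report("https://www.example.com/page", ["blue widgets", "widget prices"])
```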
Goals and Objectives. Clearly define your objectives in advance so you can truly measure your ROI from any programs you implement. Start simple, but don’t skip this step. Example: you may decide to increase website traffic from a current baseline of 100 visitors a day to 200 visitors a day over the next 30 days. Or you may want to improve your current conversion rate from one percent to two percent in a specified period. You may begin with top-level, aggregate numbers, but you must drill down into the specific pages that can improve products, services, and business sales.
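The arithmetic behind that example is worth making explicit: doubling traffic and doubling the conversion rate roughly quadruples conversions. A tiny worked example using the illustrative numbers from the text:

```python
# Worked example of the goal-setting arithmetic above; the figures are the
# illustrative ones from the text, not real benchmarks.
baseline = 100 * 0.01   # 1 conversion/day at 100 visitors and a 1% rate
target   = 200 * 0.02   # 4 conversions/day at 200 visitors and a 2% rate
print(f"Baseline: {baseline:.0f}/day, target: {target:.0f}/day "
      f"({target / baseline:.0f}x)")
```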
QUOTE: "Doorways are sites or pages created to rank highly for specific search queries. They are bad for users because they can lead to multiple similar pages in user search results, where each result ends up taking the user to essentially the same destination. They can also lead users to intermediate pages that are not as useful as the final destination." Google Webmaster Guidelines

A variety of methods can increase the prominence of a webpage within the search results. Cross linking between pages of the same website to provide more links to important pages may improve its visibility.[48] Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic.[48] Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element[49] or via 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
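For the canonicalization point above, a quick way to confirm that URL variants consolidate properly is to follow each variant's redirects and read its rel="canonical" link element. A minimal sketch, assuming requests and beautifulsoup4 are installed and using placeholder URLs:

```python
# A minimal sketch of the canonicalization check above: for each URL variant,
# follow redirects and report the final URL plus any rel="canonical" element.
import requests
from bs4 import BeautifulSoup

def canonical_report(url):
    resp = requests.get(url, timeout=10, allow_redirects=True)
    hops = [r.status_code for r in resp.history]            # e.g. [301]
    tag = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    canonical = tag["href"] if tag and tag.has_attr("href") else None
    print(f"{url} -> {resp.url} (redirects: {hops or 'none'}, "
          f"canonical: {canonical})")

for variant in ["http://example.com/page", "https://www.example.com/page/"]:
    canonical_report(variant)
```

If the variants do not resolve to the same final URL, or the canonical element disagrees with the redirect target, links to the different versions may not be consolidated onto one page.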
Ranking refers to the process search engines use to determine where a particular piece of content should appear on a SERP. Search visibility refers to how prominently a piece of content is displayed in search engine results. Highly visible content (usually the content that ranks highest) may appear right at the top of organic search results or even in a featured snippet, while less-visible content may not appear until searchers click through to page two and beyond.
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or that involve deception. One black hat technique uses hidden text, either as text colored similar to the background, in an invisible div, or positioned off screen. Another method serves a different page depending on whether the page is being requested by a human visitor or a search engine, a technique known as cloaking. Another category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid the site being penalized but do not focus on producing the best content for users. Grey hat SEO is entirely focused on improving search engine rankings.
Now, some buckets are worth more than others, and the three main buckets that you need to be aware of for search rankings are quality, trust and authority. So quality: what Google is trying to measure when it is figuring out which sites should rank is whether a site offers something valuable, unique or interesting to Google’s searchers. For example: good content – if you are selling t-shirts and you are using the same description that every other t-shirt seller is using on their website, then you are not offering anything unique to Google’s searchers.
I’ve always thought that if you are serious about ranking – do so with ORIGINAL COPY. It’s clear – search engines reward good content they haven’t found before. They index it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So – make sure each of your pages has enough text content that you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.
Link building is not JUST a numbers game, though. One link from a “trusted authority” site in Google could be all you need to rank high in your niche. Of course, the more “trusted” links you attract, the more Google will trust your site. It is evident you need MULTIPLE trusted links from MULTIPLE trusted websites to get the most from Google in 2019.
A page title that is highly relevant to the page it refers to will maximise usability, search engine ranking performance and user experience ratings, as Google measures these. It will probably be displayed in a web browser’s window title bar, in bookmarks, and in the clickable search snippet links used by Google, Bing and other search engines. The title element is the “crown” of a web page, with your important keyword phrase featuring AT LEAST ONCE within it.
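A simple audit of title elements can be scripted. The sketch below fetches a page, reads its <title>, and flags a missing keyword phrase or a title likely to be truncated; the URL, phrase and 60-character threshold are assumptions for illustration, not Google rules (and, as noted earlier, Google may rewrite the displayed snippet title anyway).

```python
# A minimal sketch for auditing title elements: fetch a page, read its <title>,
# and flag missing keyword phrases or titles likely to be truncated in snippets.
# The URL, phrase and 60-character threshold are illustrative assumptions.
import requests
from bs4 import BeautifulSoup

def check_title(url, keyword_phrase, max_chars=60):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.string.strip() if soup.title and soup.title.string else ""
    issues = []
    if not title:
        issues.append("missing <title>")
    if keyword_phrase.lower() not in title.lower():
        issues.append("keyword phrase not in title")
    if len(title) > max_chars:
        issues.append(f"longer than {max_chars} chars; may be truncated")
    print(f"{url}: '{title}' -> {issues or 'OK'}")

check_title("https://www.example.com/", "blue widgets")
```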
QUOTE: “alt attribute should be used to describe the image. So if you have an image of a big blue pineapple chair you should use the alt tag that best describes it, which is alt=”big blue pineapple chair.” title attribute should be used when the image is a hyperlink to a specific page. The title attribute should contain information about what will happen when you click on the image. For example, if the image will get larger, it should read something like, title=”View a larger version of the big blue pineapple chair image.” John Mueller, Google
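In the spirit of that advice, the small sketch below lists the <img> elements on a page that are missing a descriptive alt attribute. The URL is a placeholder, and requests plus beautifulsoup4 are assumed to be installed.

```python
# A small sketch: flag <img> elements missing a descriptive alt attribute.
# The URL is a placeholder; requests and beautifulsoup4 are assumed.
import requests
from bs4 import BeautifulSoup

def images_missing_alt(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    return [img.get("src", "(no src)") for img in soup.find_all("img")
            if not img.get("alt", "").strip()]

for src in images_missing_alt("https://www.example.com/"):
    print("Missing or empty alt:", src)
```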

Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from one page to another is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.
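To make the “link as a vote” idea concrete, here is a toy PageRank-style calculation over a made-up four-page link graph. It is a teaching sketch only – not Google’s actual algorithm, which uses hundreds of additional signals – but it shows how a page that collects more links ends up with a higher score.

```python
# A toy illustration of "links as votes": a few iterations of a simple
# PageRank-style calculation over a hypothetical link graph. Teaching sketch
# only; not Google's actual ranking algorithm.
links = {                      # hypothetical graph: page -> pages it links to
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "about"],
    "blog": ["products"],      # "products" collects an extra vote here
}
pages = list(links)
rank = {p: 1 / len(pages) for p in pages}
damping = 0.85

for _ in range(20):            # iterate until the scores roughly settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = rank[page] / len(outlinks)   # each page splits its vote
        for target in outlinks:
            new_rank[target] += damping * share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```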
Other research shows that exact-match domains that are deemed to be relevant, valuable, and high-quality can see a ranking boost because of it. However, if you already have an established website, you don’t need to go looking for an exact-match domain for your business; focus on a URL that reflects your business and optimize the heck out of it instead!
QUOTE: “Content which is copied, but changed slightly from the original. This type of copying makes it difficult to find the exact matching original source. Sometimes just a few words are changed, or whole sentences are changed, or a “find and replace” modification is made, where one word is replaced with another throughout the text. These types of changes are deliberately done to make it difficult to find the original source of the content. We call this kind of content “copied with minimal alteration.” Google Search Quality Evaluator Guidelines March 2017
Congrats Floyd! To answer your question: a big part of the success depends on how much your content replaces the old content… or is a good fit for that page in general. In the example I gave, my CRO guide wasn’t a 1:1 replacement for the dead link. But it did make sense for people to add it to their pages because they tended to be “list of CRO resources” type things. Hope that helps.
I used to think it could take more to get a subfolder trusted than, say, an individual file, and I guess this swayed me to use files on most websites I created (back in the day). Once subfolders are trusted, it’s six of one, half a dozen of the other as to what the actual difference is in terms of ranking in Google – usually, rankings in Google are determined more by how RELEVANT or REPUTABLE a page is to a query.

The basics of GOOD SEO haven’t changed for years – though the effectiveness of particular elements has certainly narrowed or changed in usefulness – you should still be focusing on building a simple site using VERY simple SEO best practices – don’t sweat the small stuff, while all the time paying attention to the important stuff – add plenty of unique PAGE TITLES and plenty of new ORIGINAL CONTENT. Understand how Google SEES your website. CRAWL it, like Google does, with (for example) the Screaming Frog SEO Spider, and fix malformed links or things that result in server errors (500), broken links (400+) and unnecessary redirects (300+). Each page you want in Google should serve a 200 OK header message.
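The crawl-and-check step can be approximated with a few lines of code while you wait for a full Screaming Frog crawl. The sketch below requests a list of placeholder URLs and reports anything that is not a clean 200 OK; it is a stand-in illustration, not a replacement for a proper crawler.

```python
# A minimal sketch of the status-code check described above: request each URL
# and report anything that is not a clean 200 OK. URLs are placeholders.
import requests

URLS = [
    "https://www.example.com/",
    "https://www.example.com/old-page",
    "https://www.example.com/missing",
]

for url in URLS:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=False)
    except requests.RequestException as exc:
        print(f"{url}: request failed ({exc})")
        continue
    status = resp.status_code
    if status == 200:
        print(f"{url}: 200 OK")
    elif 300 <= status < 400:
        print(f"{url}: {status} redirect -> {resp.headers.get('Location')}")
    else:
        print(f"{url}: {status} (needs fixing)")
```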
Use the Lowest rating for websites that have extremely negative or malicious reputations. Also use the Lowest rating for violations of the Google Webmaster Quality Guidelines. Finally, Lowest+ may be used both for pages with many low-quality characteristics and for pages whose lack of a single Page Quality characteristic makes you question the true purpose of the page. Important: Negative reputation is sufficient reason to give a page a Low quality rating. Evidence of truly malicious or fraudulent behavior warrants the Lowest rating.
QUOTE: “Duplicated content is often not manipulative and is commonplace on many websites and often free from malicious intent. Copied content can often be penalised algorithmically or manually. Duplicate content is not penalised, but this is often not an optimal set-up for pages, either. Be VERY careful ‘spinning’ ‘copied’ text to make it unique!” Shaun Anderson, Hobo, 2018

By relying so much on factors such as keyword density which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
Naturally, business owners want their website to rank for lots of keywords in the organic listings. The challenge for webmasters and SEOs is that Google doesn’t want business owners to rank for lots of keywords using autogenerated content, especially when that produces A LOT of pages on a website using (for instance) a list of keyword variations page-to-page.
While backlinks are still an important factor used by search engines to determine content quality, linkless mentions are given more weight than ever before. This is partially due to the perception that linkless mentions are more genuine, unlike black hat SEO techniques such as paid links. Social media mentions are also playing an increasingly important role in evaluating website quality.

How do you figure out what keywords your competitors are ranking for, you ask? Aside from manually searching for keywords in an incognito browser and seeing what positions your competitors are in, SEMrush allows you to run a number of free reports that show you the top keywords for the domain you enter. This is a quick way to get a sense of the types of terms your competitors are ranking for.