Google has completely rewritten its Quality Rater Guidelines, the resource its team of quality raters uses to evaluate websites for Google. This is a brand new version, rewritten from the ground up, so it isn't just a refresh of the old one (and most of it doesn't even resemble the earlier guidelines that SEOs scrutinized in detail whenever a copy leaked). It offers some great new insights into how Google is approaching the search results and what it takes to be a top ranked website.
The new version 5 is just three months old, released in mid-2014, and shows that Google is placing much more emphasis on the knowledge graph style results that seem to be taking over the search results, as well as on reputation, authority and, of course, the use of advertisements on a page.
Google is now putting a high emphasis on sites that demonstrate a high level of expertise, authoritativeness and trustworthiness. This is something Google has been working into its algorithms for some time, so it shouldn't come as much of a surprise to webmasters.
Lacking in E-A-T
The biggest new emphasis in the Quality Rater's Handbook is the idea of E-A-T: a website's "expertise, authoritativeness and trustworthiness".
Google stresses that sites lacking expertise, authoritativeness and trustworthiness should be given the Low rating when one of its quality raters assigns a rating to a page or site. More importantly, Google says that lacking a sufficient amount of E-A-T is, by itself, reason enough for a rater to give any page a low quality rating.
This means that webmasters will need to do what they can to ensure their sites pass the newly minted E-A-T test.
Google also warns quality raters about sites with user contributed content, such as forums or other sites that allow users to submit articles or information. It urges caution because pages on these sites may not be trustworthy, and many lack an appropriate amount of E-A-T.
That said, some types of user generated content have an extremely high level of expertise, such as forums frequented by experts on specific topics, and Google asks raters to gauge the experience and expertise of those authors when determining whether a page should be considered trustworthy.
What makes an expert?
As part of the new E-A-T criteria, Google stresses to raters that there are many kinds of experts, depending on the topic area. Not every subject area has a formal way to qualify expertise; it is easier to identify a medical expert than an expert for a hobby site.
Raters are also supposed to consider everyday expertise, even in "Your Money or Your Life" (YMYL) areas – something the last publicly available version of the guidelines stressed. For example, support forums can be an excellent resource for specific diseases, since they include input from those suffering from the disease. Those contributors might not be physicians, but their everyday experiences living with the disease make them experts at sharing those personal experiences. That said, specific medical advice should still come from doctors or other health professionals.
For SEOs, it is going to be crucial to establish your writers as authorities or experts in your field. You want people to trust your site, and Google is already clearly emphasizing the role of authority and expertise in websites, so taking this a step further isn't surprising in the least.
Knowledge Graph

A significantly larger portion of the guidelines covers Knowledge Graph results, and it shows that Google has its raters spend quite a bit of time rating the knowledge graphs alone, a sign that Google plans to continue its march on search results real estate.
Previous versions talked about Title Link Result Blocks (TLRB) and No Title Link Result Blocks (NTRB) as the way Google differentiates between two distinct styles of knowledge graphs the company uses. The first, TLRB, displays a clickable headline at the top of the area, while NTRB blocks do not.
Vital versus non-Vital Knowledge Graphs
Google also asks raters to assign a Vital rating to TLRBs whose landing page gives all the information needed for the query. This hints even more that Google is putting a lot of resources into knowledge graphing up the web and that it's here to stay (much to the delight of searchers, but not so much webmasters).
Many Ads = Low Quality
While the page layout algo has been attempting to reduce the search prominence of pages that are heavy on advertising, previous versions of the quality raters guide didn't place much emphasis on advertising unless it was deceptive or spammy. Now, the new guidelines explicitly ask raters to look at the advertising on a page and determine whether there is an overabundance of it.
Google specifically mentions layouts that are all advertising at the top and require scrolling to see the content – to the point where visitors could initially believe there is no content on the page at all. The same goes for advertising designed to look like navigation links or secondary content.
However, this is almost the opposite of what AdSense encourages its publishers to do. Perhaps not to the extreme where you need to scroll to see content, but AdSense is constantly reminding publishers that they can add additional ad units or replace smaller ad units with larger ones.
Google also states that pages should get the Lowest rating if the rater feels the design tries to manipulate users into clicking ads, inline ads or download links.
Supplementary Content

While the quality rater's guide previously focused on the main content of the page, with only a brief mention of supplementary content, there is now a new emphasis not only on supplementary content but on the types of supplementary content as well. Gone are the days when you could have a high quality page with nothing but navigation as the supplementary content.
While most people consider secondary content to be strictly navigation and perhaps the footer, Google now wants raters to look at other types of secondary content, with a particular emphasis on content that suggests related content on a site.
The quality of that content now matters more. Previously, Google specifically mentioned supplementary content in only two contexts: additional suggested videos on a page about a Saturday Night Live episode, and what many would consider important features of a recipe site – printable versions, reviews and nutritional information. Now, Google wants to see a wide variety of supplementary content on a page, and puts greater emphasis on it as an important and integral part of any page worthy of a High or Very High rating.
Now, Google considers valuable supplementary content to include not only those things but also features such as showing similar makes or models of an item on a shopping website, or the ability to multiply or divide a recipe.
Google also places a higher emphasis on supplementary content for pages that could be considered High or Highest quality.
Google's rule of thumb for quality raters when determining what counts as secondary content: anything on the page that isn't the main content or advertisements. Google considers it important to the overall user experience.
Google does caveat that some types of webpages lack secondary content but aren't necessarily low quality, such as community or local sites. Likewise, secondary content isn't needed for PDFs or images.
Bottom line: if you don't have useful secondary content on your website, you should begin working on it as soon as possible, because this is clearly the direction Google is headed.
Bad Supplementary Content
Google has long stressed that deceptive ad placements and pages that are very ad heavy are subject to the page layout algo. Now Google considers any site or page with bad supplementary content worthy of a Low rating. Primarily, the warning is about deceptive ad placement, where users can accidentally click on an ad or are lured into clicking one, believing it to be site content rather than an advertisement.
They also specifically mention advertisements placed under headers such as “Top Posts”, which has been against the Google AdSense terms for quite some time, but which we still see used with AdSense and other ad networks.
Essentially, if the secondary content is unhelpful or distracting, that’s a Low quality rating.
Poor Page Design
Google mentions the many typical things publishers do when trying to squeeze every last penny out of a webpage, such as popups, large quantities of ads with only minimal content, and text ads in navigation.
Google is also calling out a specific advertising tactic that many, many websites use: inserting advertisements several times in the middle of the main content, breaking it up into main content – ad – main content – ad – main content. This is used on many newspaper and magazine sites, and if it gets factored into the algorithm, you can be sure newspapers will complain (or add paywalls instead).
If you are using ads within content, it is probably a good idea to restrict them to one, as Google sees it as jarring when a user has to jump back and forth between content and ads too many times. And those fancy links in navigation that are really ads? Webmasters should probably start rethinking those too.
True Merchant Sites

Google previously had a pretty specific list of features raters should use to determine whether a site is a true merchant site, something that concerned companies that didn't offer them. For example, many legitimate merchants might not offer a wish list feature, a gift registry or a user forum, but some SEOs insisted these features were "needed" simply because the quality rater's guide included them on a list of features for true merchant websites.
In the new handbook, this list has been removed. Instead, Google asks raters to look for things such as contact information and return and exchange policies. This brings it more in line with what many smaller merchants offer, since they may not have the budget, technical know-how, or even the need for some of the features Google listed previously.
Rating Forums and Q&A
Google wants raters to determine not only the expertise of the site itself, but also that of the people participating in the particular thread being rated. Raters also consider whether new threads are being added, whether many people are participating, and whether the conversations are in-depth.
While many of us think of Q&A sites as low quality with crappy content, there are some gems out there, so Google does warn not to assume all Q&A sites are low quality.
Q&A Without the Answer
We have all ended up on a Q&A page only to discover there is no answer posted yet, even for questions asked years ago, or that the answer is hidden behind a paywall. Google considers pages of this type, without the answer given, to be Low quality.
Google did mention this previously in the older version of the guidelines, but removed the part stating that unhelpful answers should be given a Low quality rating – hopefully because it is common sense that an unhelpful answer is bad.
Inline Advertising

Are you one of the few people still using inline advertising? Its days could be numbered, as Google considers it distracting and notes it can make the main content difficult to read, which equals a poor user experience.
Inline advertising, those double underlined links that pop up an ad when you mouseover the link, has definitely lost popularity over the years, and it isn’t that common to see it anymore, except on spammy pages.
Thin Affiliate Sites

Curiously, references to "thin" affiliate sites have been removed from the new guide. Previously, Google warned that thin affiliate sites were not high quality, and explained how raters could determine whether a site was an affiliate site.
Is Google confident in how it currently ranks affiliate sites, keeping "thin" affiliate sites from ranking well? That doesn't seem to be the case, particularly in some competitive spaces. Panda did make a considerable dent in affiliate rankings, at least for those run by less savvy SEOs. But removing the warning altogether raises the question of whether Google feels spam affiliate sites are a thing of the past and don't rank today anyway, whether poor quality sites would get a low rating regardless, or whether Google is happy for affiliate sites to rank as long as they meet the other criteria for an above average rating.
Website Reputation

Website reputation has been given a boost in the new version of the guide, and it is clear that Google is putting a greater emphasis on reputation than it did before.
Of course, webmasters have to consider that reputation is likely to be given a boost in the search algorithm too, since Google clearly wants its raters to weigh it heavily when rating any website. Although Google has previously said that the raters themselves do not affect search results, changes within the guide often foreshadow changes with greater impact.
Google does include a caveat that small, local businesses or community organizations might not have any online reputation because their web presence is relatively small compared to word of mouth. But Google now expects to find reputation information for any large business or organization.
Most importantly, Google stresses that a webpage cannot be given a High rating if the site has a negative reputation.
They are also asking raters to give the Lowest rating to any page where there is sufficient evidence of fraudulent or malicious behavior on the part of the website.
With a greater emphasis on reputation in quality ratings, reputation from credible sources could be somehow added to the algorithm, or we might see a new penalty specifically targeting sites with negative reputations (hello, Puffin update?).
Where’s the Spam?
Another curious omission: all references to spam have been removed from the new guide. Previously there were sections describing spam and how to check for it, but in the new guide, the only references to spam concern sites with a large number of spam comments or forums that have been spammed.
Where’s the Cloaking?
There was previously an entire detailed section explaining what cloaking is and how raters can identify whether a site is cloaking or not. But this has also been removed from the new guide.
Yet another item removed from the examples of low quality content is the reference to "other distracting content". The likely reason for this removal is that Google now treats other types of distracting content as secondary content, and the guide is quite specific that pages should be rated lower when the rater is presented with distracting content, particularly advertising served as secondary content.
Low Quality Pages Now Never Acceptable
Previously, the guide stated: "Low quality pages may only be acceptable to users if there are no other higher quality pages." This has now been removed. Does Google believe there is always a higher quality page for every low quality page out there? Possibly, given the rate at which the web is still growing.
But clearly, Google doesn't want higher ratings given simply because no better quality alternative exists, as that could confuse engineers basing algorithm work on rater data.
Lack of Purpose
Google now refers to all gibberish and auto generated pages as pages that lack purpose, and says they should always be rated Lowest.