Google To Clarify E-E-A-T & Quality Rater Guidelines Documentation

Feb 6, 2024 - 7:41 am


Danny Sullivan, Google's Search Liaison, said we should expect to see more clarification from Google on its creating helpful content page, specifically the section "Get to know E-E-A-T and the quality rater guidelines."

Sullivan said on X, "You're almost certainly going to see more clarification come to that page, as well, in the near future." This came after he was asked, "But then say their raters are trained on it in the link."

The section currently reads:

Google's automated systems are designed to use many different factors to rank great content. After identifying relevant content, our systems aim to prioritize those that seem most helpful. To do this, they identify a mix of factors that can help determine which content demonstrates aspects of experience, expertise, authoritativeness, and trustworthiness, or what we call E-E-A-T.

Of these aspects, trust is most important. The others contribute to trust, but content doesn't necessarily have to demonstrate all of them. For example, some content might be helpful based on the experience it demonstrates, while other content might be helpful because of the expertise it shares.

While E-E-A-T itself isn't a specific ranking factor, using a mix of factors that can identify content with good E-E-A-T is useful. For example, our systems give even more weight to content that aligns with strong E-E-A-T for topics that could significantly impact the health, financial stability, or safety of people, or the welfare or well-being of society. We call these "Your Money or Your Life" topics, or YMYL for short.

Search quality raters are people who give us insights on if our algorithms seem to be providing good results, a way to help confirm our changes are working well. In particular, raters are trained to understand if content has strong E-E-A-T. The criteria they use to do this is outlined in our search quality rater guidelines.

Search raters have no control over how pages rank. Rater data is not used directly in our ranking algorithms. Rather, we use them as a restaurant might get feedback cards from diners. The feedback helps us know if our systems seem to be working. Reading the guidelines may help you self-assess how your content is doing from an E-E-A-T perspective, improvements to consider, and help align it conceptually with the different signals that our automated systems use to rank content.

Sullivan replied, reiterating that E-E-A-T is not a score, not a ranking factor, and not an algorithm, and that Google will keep repeating this message. He wrote:

You're almost certainly going to see more clarification come to that page, as well, in the near future.

We thought "While E-E-A-T itself isn't a specific ranking factor" was clear enough for the people who somehow believe we have an E-E-A-T "score." But some still have this misconception despite that we don't have some E-E-A-T ranking score we use. Not a thing. Not a ranking factor.

Raters use the *concept* to rate pages *in other ways* so we can *evaluate* how our search results perform. They don't assign an E-E-A-T "score" to pages; their ratings also aren't used directly in rankings.

Honestly, for all the worry some spend trying to figure out how to "prove" their pages have E-E-A-T, I would sincerely urge them to just ask themselves "if someone comes to my page from search, are they satisfied with what they get, from the content to the experience?"

Yesterday he added:

Honestly, I'm at a loss sometimes what else to say. We get asked things like:

Is EEAT a ranking signal?

And say "No, EEAT is not a ranking signal"

And people go "Well, what does ranking really mean. Maybe it's signals? They didn't say it's not signals!"

So do we have a single page experience signal? No. That's why we made a page that says "There is no single signal."

Oh but wait, so you have multiple signals? Yes, we anticipated this question which is why we have on that same page "Our core ranking systems look at a variety of signals."

Which leads to things like "So is CWV a signal and if I don't meet those, am I doomed?" Which is why that same page says "However, great page experience involves more than Core Web Vitals."

We don't list what is and isn't a ranking signal on that page because things change. Maybe something was once; maybe it shifts but aligns with other things we might do to understand page experience. We're trying to guide people toward some useful resources and things to think about with page experience but in the end -- do whatever you think is providing a great experience for your visitors.


This all stemmed from the updated Google SEO guide saying, "Thinking E-E-A-T is a ranking factor? No, it's not," under the "Things we believe you shouldn't focus on" section.

This then branched out into the topic we covered yesterday about Google not saying Core Web Vitals are a ranking factor.

Forum discussion at X.

 
