I have covered, countless times, Google saying that its quality raters (third-party consultants who are not even directly employed by Google) do not have any direct influence on Google Search rankings. I have reported Googlers saying this over and over, and at times I stopped reporting it simply because I had covered it so often already.
Here are some, though not all, of my past stories on this topic:
- Google Search Quality Raters Don't Directly Impact Rankings
- Google: Quality Raters Help Us Understand If Our Signals Produce Good Results
- Google: Quality Raters Guidelines Are Not One-To-One With The Ranking Algorithm
Heck, Jennifer Slegg, who is an expert on the quality raters, even discussed it with me on the vlog. Oh, and by the way, Bing has the same setup.
In any event, Danny Sullivan wrote a detailed blog post about this, yet again, on the Google blog. And even after he wrote it, people still asked how these raters influence the search results.
Google wrote:
Ratings are not used directly for search ranking

Once raters have done this research, they then provide a quality rating for each page. It’s important to note that this rating does not directly impact how this page or site ranks in Search. Nobody is deciding that any given source is “authoritative” or “trustworthy.” In particular, pages are not assigned ratings as a way to determine how well to rank them. Indeed, that would be an impossible task and a poor signal for us to use. With hundreds of billions of pages that are constantly changing, there’s no way humans could evaluate every page on a recurring basis.
Instead, ratings are a data point that, when taken in aggregate, helps us measure how well our systems are working to deliver great content that’s aligned with how people—across the country and around the world—evaluate information.
Then you have people who read this and still go on about how Google uses raters, actual humans, to manually adjust rankings. But Google said no, it does not.
You are not. It doesn't work that way. As the post explains, ratings are not used directly in rankings. pic.twitter.com/UBIEItumHM
— Danny Sullivan (@dannysullivan) August 4, 2020
They're not used directly in the sense that somehow we're having pages rated, and those ratings are used for how to rank individual pages. The feedback helps us indirectly measure how well our results seem to be. pic.twitter.com/GFAidzJv2J
— Danny Sullivan (@dannysullivan) August 4, 2020
If ratings show our results aren't doing well, we look to improve our systems overall, to hopefully improve them. We do not take particular ratings about a page or a site and use them to somehow to rank that particular page or site. pic.twitter.com/OwFZYaQ8xj
— Danny Sullivan (@dannysullivan) August 4, 2020
I think what you're trying to say is that people can do a lot of technical and referential SEO but those won't matter if human beings don't like your content. And yes, that's effectively true. But not because a human rater specifically doesn't like it....
— Danny Sullivan (@dannysullivan) August 4, 2020
Yes. If raters were like, "I searched for tomatoes and you gave me all corn results!," then we'd look to improve our systems to return tomatoes as people would expect. But we wouldn't go "Rater didn't find this particular corn page failed to match the search. Down with the page!"
— Danny Sullivan (@dannysullivan) August 4, 2020
They cannot. To be clear, as the post explains, ratings are not directly used in rankings in that way. pic.twitter.com/9IbmIJjgaS
— Danny Sullivan (@dannysullivan) August 4, 2020
I am kind of tired of reporting on it...
Forum discussion at Twitter.