Yesterday we covered the new quality raters guidelines and how they target and flag offensive, inaccurate and hateful content in the search results. Tons of people took it to mean that Google is censoring the internet. That is not really the case if you read the new raters guidelines.
Google is looking to make sure that if the searcher's intent is to find information, the information Google returns is not offensive, inaccurate or hateful. But if the searcher is specifically looking for offensive, inaccurate or hateful content, then Google does provide it.
Paul Haahr from Google explained a bit more on Twitter with these three tweets; I believe it is the only public statement he has made on Twitter on this topic.
.@jonahburke Sorry if it wasn't clear in the articles, but I think the guidelines do the right thing here. We ask raters to tell us both /1
— Paul Haahr (@haahr) March 15, 2017
.@jonahburke that Holocaust denial is wrong and that it's offensive. We want to learn both things from them. /2
— Paul Haahr (@haahr) March 15, 2017
.@jonahburke Wrong and offensive overlap in this case, but not always. Different types of low quality, of which there are many. /3 (fin)
— Paul Haahr (@haahr) March 15, 2017
It seems to me that Paul has taken on the role of search quality man, not for spam like Matt Cutts did, but for offensive, inaccurate or hateful results. Again, since this affects only 0.1% of queries, I am not sure how much it is needed, but Paul believes it is, and overall, if Google wants to spend its time on this problem, that is very commendable in my opinion.
Forum discussion at Twitter.