Google's Two-Pronged Approach To Search Spam: Ignore vs Penalize

Jun 13, 2017 - 7:41 am


We've known for some time now that Google sometimes chooses to ignore spam and sometimes chooses to penalize it (i.e., devalue vs. demote). The topic came up again when we covered how Gary from Google talked about how they ignore Forbes' link spam.

The topic came up in John Mueller's Google hangout yesterday, at the 46:32 mark in the video, where one webmaster described how this Forbes spam happens. The webmaster explains:

There is another thing, John, that you probably need to be aware of. There is a website called PeoplePerHour, I don't know if you've heard of it? People go there and say, for example, I can publish a story about you in, let's say, Forbes or Lifehacker or another website, and these websites, like Forbes, don't know anything about this. So these PeoplePerHour freelancers, they take money to write this story, and they go to Forbes and say, would you like a guest post from us, it's a free guest post, okay, and they look at it, it's a nice story, and they publish it. So for the links on that page, is Forbes, or websites like this, responsible for that? They have no idea that the person actually got paid, and maybe that's fair, because she or he has to be paid for the work.

John responds by explaining that Google recently issued a warning about this and that it has been an issue the search quality team has been aware of for a long time. He said Google takes a "two-pronged approach" with this type of spam. Specifically, "on the one hand manually we try to take action where we think it's necessary; on the other hand algorithmically we try to ignore things that we can kind of isolate," he said.


Here is the answer transcript:

Yeah, so this kind of guest posting isn't really new. It's something that has been around for quite a while, and we've been dealing with it for a while as well. At the end of May, we did a blog post about some of these issues, let me just copy that into the chat, with regards to especially these kinds of link schemes, where some company goes to different bloggers and says, hey, I would like to buy a guest post here, here and here, and I want a link to my website from your guest post. So that's something that our web spam team is well aware of, and our algorithms are also working to handle that algorithmically as well.

So in general, what happens in cases like this is, when we can recognize that this kind of activity is happening across a site, we tend to lose trust in those links. So for example, if, I don't know, on my blog there are constant guest posters and they're always linking to these random sites that they get paid for, then the web spam team might say, okay, we see you're doing this, it's your site, you can do whatever you want with it, but we're not going to trust any of those links. None of those links are going to provide any value for any of those sites. So essentially you're publishing this, but those other sites that are paying for these types of links are not getting any value. So that's kind of the approach we try to take there.

That is similar to other things where, when we can recognize that something is happening in a bad way and we can just ignore the bad part of it and focus on the good part, then we'll try to do that. So for example, with keyword stuffing, if we can recognize that a page is doing keyword stuffing, and we can just ignore the keyword-stuffed part and focus on the rest, then that for us is also kind of a reasonable approach. Because we can't fix all pages. We can inform the webmasters and say, hey, you're doing this wrong, we can send them notifications from the web spam team, but sometimes there is still useful information on those pages, and we want to rank those pages for the useful part, not for the kind of keyword stuffing bad part.

So that's kind of a two-pronged approach that we take there. On the one hand, manually we try to take action where we think it's necessary; on the other hand, algorithmically we try to ignore things that we can kind of isolate.

Forum discussion at YouTube.
