We've been covering the significant drop in Google AdSense earnings for some publishers for over a month now. We posted yesterday that Google says this is due to a change in how they buy on the Google Display Network and that now "Google will not automatically monetize sites or pages that the AdSense bot can't crawl."
One publisher said on WebmasterWorld that Google told him they "detected that your AdX/AdSense account is sending a significant number of ad requests from URLs that are not crawled." He then posted the full Google response, which reads:
Thanks for reaching out to our team. My name is xxxx. I'm happy to help you. :) As part of Google's efforts to increase brand safety for advertisers, AdWords and DoubleClick Bid Manager have adopted more restrictive bidding on ad requests coming from URLs that are uncrawled. This is necessary to avoid the risk of ads running on sensitive content.
We detected that your AdX/AdSense account is sending a significant number of ad requests from URLs that are not crawled. Below are some of the possible reasons why a URL might not be crawled:
- You may be using complex parameters or encoded strings in your URLs that are unique for each visit, instead of sending us the canonical URL, which is easier to crawl.
- Your URL may represent newly available content which had not been crawled before you sent ad requests. This is transient as your URL will be crawled shortly after your first ad request.
- You may be sending an incorrect URL to us because you are manually sending an incorrectly formatted URL in your ad request.
- You may be sending the URL of an iframe with an ad instead of the URL of the content page that hosts the iframe. (This typically applies to larger publishers.)
- You may have limits on how often they can be crawled (crawler rejects our crawl requests).

To avoid a potential revenue impact from this change, please consider the following best practices for ensuring URLs can be properly crawled:
AdSense Publishers, see:
- About the AdSense crawler
- How to fix AdSense crawler errors
- Display ads on login-protected pages
- Give access to our crawler in your robots.txt file

DFP and AdX Publishers, see Crawler Access.
Here are two additional tools that can help identify what adjustments you need to make.
- Fetch as Google - Fetch as Google is a tool that enables you to test how Google crawls or renders a URL on your site. You can use Fetch as Google to see whether Googlebot can access a page on your site, how it renders the page, and whether any page resources (such as images or scripts) are blocked to Googlebot. This tool simulates a crawl and render execution as done in Google's normal crawling and rendering process, and is useful for debugging crawl issues on your site.
- robots.txt Tester - The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search.
I hope this information helps clarify. After reviewing the information, let me know if you have a follow up question.
Honestly, that all reads very weird to me and doesn't seem like a nicely worded email - but what do I know.
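For what it's worth, the robots.txt item in that list is the easiest one to check yourself. The AdSense crawler identifies itself as Mediapartners-Google, so explicitly letting it through in robots.txt looks roughly like this (a minimal sketch - your real file will likely carry additional rules for other crawlers):

```
# Let the AdSense crawler fetch everything on the site
User-agent: Mediapartners-Google
Disallow:
```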
The folks in the thread don't really like Google's response and want Google to fix the issues on its end. Most publishers are saying they are not blocking crawler access and that something is wrong on Google's end, not the publishers' end.
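If you want to verify that claim for your own pages without waiting on the robots.txt Tester, a short script can run the same basic check against your live robots.txt. Here is a rough sketch in Python - the example.com URLs are placeholders, so swap in your own pages:

```python
# Rough sketch: check whether the AdSense crawler (Mediapartners-Google)
# and Googlebot are allowed to fetch a given page under your robots.txt.
from urllib.robotparser import RobotFileParser

page_url = "https://www.example.com/some-article"    # placeholder page URL
robots_url = "https://www.example.com/robots.txt"    # placeholder robots.txt URL

parser = RobotFileParser()
parser.set_url(robots_url)
parser.read()  # downloads and parses the robots.txt file

for agent in ("Mediapartners-Google", "Googlebot"):
    allowed = parser.can_fetch(agent, page_url)
    print(f"{agent} allowed to fetch {page_url}: {allowed}")
```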
Google should be reaching out to the publishers impacted by this, but thus far its communication around this has been horrible - in fact, so bad that I suspect a class action lawsuit is brewing.
Forum discussion at WebmasterWorld.