Google Crawl Rate Errors With Sitemap Indexing

Apr 27, 2011 - 8:16 am

Over the past few days there have been reports from webmasters that changing the crawl rate in Google Webmaster Tools has prevented Google from indexing their Sitemap files.

A Google Webmaster Help thread has reports of this issue dating back five days, with several more webmasters complaining since.

The error webmasters are seeing reads:

We were not able to download your Sitemap file due to the crawl rate we are using for your server. For information on increasing crawl rate, please see our Help Center.

Why is this happening? It seems to only occur when you change the crawl rate to a manual setting, but I cannot confirm that for certain.

One webmaster affected by this said:

I am having the same problem too. I got the message yesterday, and changed my crawl rate to manual and moved the bar all the way to the right to allow the fastest crawl rate, but I still see the same "crawl rate problem" message as I saw yesterday.

So far Google has not addressed these concerns in the forum.

Forum discussion at Google Webmaster Help.

Update: Googler JohnMu replied to the thread with more details. He said:

This message generally means that we'd like to crawl more (in this case, your Sitemap file) from your site if your server (or crawl-rate-setting) would allow it.

If you have a manually set crawl-rate in Webmaster Tools, you may wish to reset that back to "let Google determine my crawl rate," so that our systems can try to automatically raise it to match your server's accessibility (the manual setting is mostly to limit it even lower). Somewhat simplified: should we notice that crawling too much causes your server to slow down, we will generally reduce our crawl rate to avoid causing problems.

Should you notice that Googlebot is regularly crawling less than you would want, then you may want to consider these points:

* Work to reduce the number of crawlable URLs on your website. For example, if you have session-IDs in your URLs, or use complex, dynamically generated URLs, that will generate a large number of crawlable URLs. Normal canonicalization techniques can generally help in a case like that: http://www.google.com/support/webmasters/bin/answer.py?answer=139066

* Check your Webmaster Tools crawl-stats to see if crawling of your site (and also -- if you have access -- other sites on the same server) is particularly slow, and then work with your hoster and/or web-developer to see if that can be improved.

* Use the "Report a problem with Googlebot" form behind the "learn more" link next to the crawl-rate settings. Keep in mind that if Googlebot is not crawling as much as you'd want due to technical issues (too many URLs being crawled and/or server issues), then we'd really recommend fixing those first.

Hope this helps! Feel free to post back should you have any questions.
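To make JohnMu's first point more concrete, here is a minimal sketch of what that kind of canonicalization can look like in practice: collapsing session-ID style query parameters so that many crawlable URLs resolve to a single canonical URL. This is an illustration only; the parameter names (PHPSESSID, sid, sessionid, jsessionid) and the helper function are assumptions for the example, not something from the thread.

    # Illustrative only: strip session-ID style parameters so duplicate URLs
    # collapse to a single canonical form. The parameter names below are
    # assumptions for this example, not an official list.
    from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

    SESSION_PARAMS = {"phpsessid", "sid", "sessionid", "jsessionid"}

    def canonicalize(url: str) -> str:
        """Return the URL with session-style query parameters removed."""
        parts = urlparse(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
                if k.lower() not in SESSION_PARAMS]
        return urlunparse(parts._replace(query=urlencode(kept)))

    print(canonicalize("http://www.example.com/page?id=7&PHPSESSID=abc123"))
    # -> http://www.example.com/page?id=7

On the pages themselves, the preferred URL would typically also be declared with a rel="canonical" link element, as described in the Google help article JohnMu links to above.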
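For the crawl-stats point, a rough spot-check of how quickly your server answers requests can be a useful starting point before contacting your host. Keep in mind this only approximates what Googlebot sees from your own machine, and the URLs below are hypothetical placeholders to swap for a sample of your own pages.

    # Rough spot-check of server response times for a handful of pages.
    # Assumption: the URLs are placeholders; substitute a sample of your own.
    import time
    from urllib.request import urlopen

    URLS = [
        "http://www.example.com/",
        "http://www.example.com/sitemap.xml",
    ]

    for url in URLS:
        start = time.time()
        with urlopen(url, timeout=30) as resp:
            resp.read()
        print("%s: HTTP %s in %.2fs" % (url, resp.status, time.time() - start))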

 
