A Google Groups thread has a detailed discussion on the topic of Google's spider, GoogleBot, crawling too much. Servers can sometimes be overwhelmed by the traffic they receive, and automated crawlers such as GoogleBot can add a tremendous amount of stress to a server that is already strained. Most webmasters are not in a position to ban GoogleBot from their sites, so what can you do?
Here are some of the tips from the thread, including tips from Google representatives:
- Make sure GoogleBot is really GoogleBot and not some spammer (see the verification sketch after this list). More on that over here and here.
- If you have a large site, limit or instruct GoogleBot on what it can or cannot crawl via the robots.txt file (a sample robots.txt follows this list).
- Some URLs might be more "expensive" to crawl than others (e.g. static pages versus large, dynamic, graphics-rich pages).
- Do you have two or three times as many pages indexed by Google as you have actual product pages on your site? If so, why?
- Redirect any temporary URLs or tracking URLs using a 301 (see the redirect example after this list).
- Set the Google crawl rate in Webmaster Tools; more on that over here.
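
For the first tip, verifying that a visitor claiming to be GoogleBot really is GoogleBot, the standard approach is a forward-confirmed reverse DNS check: look up the host name for the requesting IP, make sure it belongs to googlebot.com or google.com, then resolve that host name back and confirm it includes the original IP. Here is a minimal Python sketch of that check; the function name and sample IPs are illustrative only.

```python
import socket

def is_real_googlebot(ip_address):
    """Forward-confirmed reverse DNS check for a claimed GoogleBot visit."""
    try:
        # Reverse lookup: IP address -> host name
        host, _, _ = socket.gethostbyaddr(ip_address)
        # Genuine GoogleBot hosts live under googlebot.com or google.com
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: the host name must resolve back to the same IP
        _, _, forward_ips = socket.gethostbyname_ex(host)
        return ip_address in forward_ips
    except (socket.herror, socket.gaierror):
        return False

# Usage: pass the remote address of the request claiming to be GoogleBot
print(is_real_googlebot("66.249.66.1"))   # a genuine GoogleBot IP should pass
print(is_real_googlebot("203.0.113.10"))  # a spoofed visitor should not
```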
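
For the robots.txt tip, here is a short sketch of what limiting GoogleBot might look like; every path is a hypothetical example, not something taken from the thread:

```
# Hypothetical robots.txt entries keeping GoogleBot away from expensive or
# low-value URLs while leaving the rest of the site crawlable
User-agent: Googlebot
Disallow: /search           # internal search result pages
Disallow: /tracking/        # temporary tracking URLs
Disallow: /*?sessionid=     # session-ID duplicates of product pages
```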
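
And for the 301 tip, one common way to permanently redirect temporary or tracking URLs to their canonical pages is at the web server. This Apache .htaccess sketch assumes mod_alias is enabled, and the URLs are made up for illustration:

```
# Hypothetical .htaccess rules: send temporary/tracking URLs to the
# canonical page with a 301 (permanent) redirect
Redirect 301 /promo/spring-sale http://www.example.com/products/widget
Redirect 301 /track/newsletter-link http://www.example.com/products/widget
```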
Forum discussion at Google Groups.