Google: Stop Using 403s or 404s To Reduce Googlebot Crawl Rates

Feb 17, 2023 - 7:51 am


Gary Illyes posted a new blog post on the Google Search Central site asking site owners to stop using 403 and 404 server status codes to reduce Googlebot's crawl rate. He said Google has seen an uptick in the number of sites and CDNs doing this, and it wants them to cut it out.

Gary wrote, "Over the last few months we noticed an uptick in website owners and some content delivery networks (CDNs) attempting to use 404 and other 4xx client errors (but not 429) to attempt to reduce Googlebot's crawl rate." He added, "The short version of this blog post is: please don't do that."

Instead, he said Google has documentation about how to reduce Googlebot's crawl rate. "Read that instead and learn how to effectively manage Googlebot's crawl rate," he added.

Gary also posted on LinkedIn saying, "Friday rumble... ramble? One of those. Anyway: the 403 and 404 status codes will not help you quickly reduce crawl rate. If anything, they might have the opposite effect. We have documentation about how to reduce crawl rate and unsurprisingly 403/404 is not in them."

There are more details in the blog post.

Forum discussion at LinkedIn.
