Google: Stop Using 403s or 404s To Reduce Googlebot Crawl Rates

Feb 17, 2023 - 7:51 am

Gary Illyes posted a new blog post on the Google Search Central site asking site owners to stop using 403 and 404 HTTP status codes to reduce Googlebot's crawl rate. He said Google has seen an uptick in the number of sites and CDNs doing this, and he wants them to cut it out.

Gary wrote, "Over the last few months we noticed an uptick in website owners and some content delivery networks (CDNs) attempting to use 404 and other 4xx client errors (but not 429) to attempt to reduce Googlebot's crawl rate." "The short version of this blog post is: please don't do that," he added.

Instead, he said Google has documentation about how to reduce Googlebot's crawl rate. "Read that instead and learn how to effectively manage Googlebot's crawl rate," he added.
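For illustration only, here is a minimal sketch of the kind of approach Google's crawl-rate documentation points to instead: signaling "slow down" with a 429 (or 500/503) response plus a Retry-After header, rather than serving 403s or 404s, which Google reads as statements about the page rather than about server load. This example is not from Gary's post; the thresholds, window, and Googlebot user-agent check are hypothetical and would normally live in a CDN or reverse proxy rule rather than application code.

```python
# Hypothetical sketch: rate-limit crawler traffic with 429 + Retry-After
# instead of 403/404. Thresholds and user-agent matching are illustrative.
from http.server import BaseHTTPRequestHandler, HTTPServer
import time

WINDOW_SECONDS = 60           # hypothetical rate-limit window
MAX_REQUESTS_PER_WINDOW = 30  # hypothetical per-window crawler budget

crawler_hits = []  # timestamps of recent crawler requests


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        global crawler_hits
        now = time.time()
        is_crawler = "Googlebot" in self.headers.get("User-Agent", "")

        if is_crawler:
            # Drop timestamps that have aged out of the window
            crawler_hits = [t for t in crawler_hits if now - t < WINDOW_SECONDS]
            if len(crawler_hits) >= MAX_REQUESTS_PER_WINDOW:
                # Over budget: ask the crawler to back off, rather than
                # pretending the URL is forbidden or gone
                self.send_response(429)
                self.send_header("Retry-After", str(WINDOW_SECONDS))
                self.end_headers()
                return
            crawler_hits.append(now)

        # Normal response for everything under the budget
        self.send_response(200)
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<html><body>Normal page content</body></html>")


if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()
```

A rate-limit rule in a CDN or reverse proxy that returns 429 or 503 accomplishes the same thing without touching application code, and is the more common setup for the sites and CDNs Gary is describing.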

Gary also posted on LinkedIn saying, "Friday rumble... ramble? One of those. Anyway: the 403 and 404 status codes will not help you quickly reduce crawl rate. If anything, they might have the opposite effect. We have documentation about how to reduce crawl rate and unsurprisingly 403/404 is not in them."

There are more details in the blog post.

Forum discussion at LinkedIn.