Update Robots.txt 24 Hours Prior To Adding New Content

May 31, 2010 - 8:29 am

On Friday, Google's JohnMu tweeted an important tip. Say you block a directory from being crawled in your robots.txt file, and now you want content in that directory to be crawled by Google. John explained that because Google caches your robots.txt, you should update the robots.txt file at least 24 hours before adding content to that directory.
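To picture why the lead time matters, here is a minimal sketch using Python's standard urllib.robotparser (the domain and the /new-section/ directory are hypothetical, and Googlebot's actual behavior is of course more involved). A crawler working from a previously fetched copy of robots.txt keeps treating the directory as blocked even after you change the live file:

```python
from urllib.robotparser import RobotFileParser

# The crawler fetched robots.txt earlier, while /new-section/ was still blocked.
cached = RobotFileParser()
cached.parse([
    "User-agent: *",
    "Disallow: /new-section/",
])

# You have since removed the Disallow line on the live site, but the crawler
# keeps working from its cached copy until it refetches robots.txt.
print(cached.can_fetch("Googlebot", "https://example.com/new-section/page.html"))
# -> False: the new content stays uncrawled until the cached rules expire
```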

Here is JohnMu's tweet:

Robots-tip: crawlers cache your robots.txt; update it at least a day before adding content that is disallowed. Q&A in Buzz.

Tedster in WebmasterWorld posted this tip and added:

I never thought about this before, but we certainly know that googlebot works off a cache of the robots.txt most of the time. Otherwise it would need to ping robots.txt right before each URL request, and that would get nasty pretty fast.
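Tedster's point is easy to see in a small sketch. The snippet below is my own illustration, not Googlebot's actual code: it keeps a parsed robots.txt around for 24 hours (the cache window John suggests) and only refetches it after that, rather than pinging robots.txt before every URL request.

```python
import time
from urllib.robotparser import RobotFileParser

CACHE_TTL = 24 * 60 * 60  # assume a 24-hour cache, per John's tip

_cache = {}  # host -> (fetched_at, parser)

def robots_for(host):
    """Return a parsed robots.txt for host, refetching only when the cache is stale."""
    now = time.time()
    entry = _cache.get(host)
    if entry and now - entry[0] < CACHE_TTL:
        return entry[1]
    parser = RobotFileParser()
    parser.set_url(f"https://{host}/robots.txt")
    parser.read()  # one fetch, reused for every URL on this host until the TTL expires
    _cache[host] = (now, parser)
    return parser

def allowed(host, url, agent="ExampleBot"):
    return robots_for(host).can_fetch(agent, url)
```

Under a scheme like this, a rule change you make right before publishing simply isn't seen until the next refetch, which is why updating robots.txt a day ahead is the safer order of operations.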

So apparently, 24 hours is the cache lifetime John is describing. Good to know. When previously disallowed content gets placed online, this is one precaution I never thought about.

The discussion also veers into whether you should use the robots.txt protocol to block content at all, but I won't get into that debate in this post.

Forum discussion at WebmasterWorld.

 
