Update Robots.txt 24 Hours Prior To Adding New Content

May 31, 2010 - 8:29 am

On Friday, Google's JohnMu tweeted an important tip. Say you block a directory from being crawled in your robots.txt file, and later you want content in that directory to be crawled by Google. John explained that since Google caches your robots.txt, you should update the robots.txt file at least 24 hours before adding content within that directory.

Here is JohnMu's tweet:

Robots-tip: crawlers cache your robots.txt; update it at least a day before adding content that is disallowed. Q&A in Buzz.
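To make the tip concrete, here is a minimal sketch using Python's standard urllib.robotparser. The /new-section/ directory, the example.com URL and the "Googlebot" user agent are my own illustrative assumptions, not details from John's tweet; the point is simply that as long as a crawler is still working off the old cached rules, pages in the blocked directory remain disallowed.

```python
# A minimal sketch of the tip, using Python's standard urllib.robotparser.
# The /new-section/ directory, example.com URL and "Googlebot" user agent are
# illustrative assumptions, not details from John's tweet.
from urllib.robotparser import RobotFileParser

# The robots.txt rules a crawler may still have cached (directory blocked).
cached_rules = """
User-agent: *
Disallow: /new-section/
""".splitlines()

# The updated robots.txt you would publish at least a day before adding content.
updated_rules = """
User-agent: *
Disallow:
""".splitlines()

def can_crawl(rules, url, agent="Googlebot"):
    parser = RobotFileParser()
    parser.parse(rules)                  # parse the rules as a crawler would
    return parser.can_fetch(agent, url)

url = "https://example.com/new-section/page.html"
print(can_crawl(cached_rules, url))   # False - the old cached rules still block it
print(can_crawl(updated_rules, url))  # True - allowed once the new file is picked up
```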

Tedster at WebmasterWorld posted this tip and added:

I never thought about this before, but we certainly know that googlebot works off a cache of the robots.txt most of the time. Otherwise it would need to ping robots.txt right before each URL request, and that would get nasty pretty fast.

So apparently, John is saying the cache time is 24 hours. Good to know. When disallowed content gets placed online, this is one precaution I never thought about.
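For illustration only, here is a rough sketch of the kind of time-based caching Tedster is describing, with the 24-hour figure from John's tip hard-coded as the lifetime. The function names and the idea of a simple per-host cache are my own assumptions about how a generic crawler might do this, not a description of Googlebot's actual implementation.

```python
# Rough illustration (not Google's actual implementation) of a crawler keeping
# robots.txt in a per-host cache with a 24-hour lifetime, so it does not have
# to re-fetch the file before every single URL request.
import time
import urllib.request

ROBOTS_TTL = 24 * 60 * 60   # assumed 24-hour cache time, per John's tip
_robots_cache = {}          # host -> (fetched_at, robots_txt_body)

def get_robots_txt(host):
    """Return robots.txt for a host, hitting the network only when the cached copy has expired."""
    entry = _robots_cache.get(host)
    if entry and time.time() - entry[0] < ROBOTS_TTL:
        return entry[1]                    # cached copy is still fresh
    body = urllib.request.urlopen(f"https://{host}/robots.txt").read().decode()
    _robots_cache[host] = (time.time(), body)
    return body
```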

The discussion then drifts into whether you should use the robots.txt protocol to block content at all, but I won't get into that debate in this post.

Forum discussion at WebmasterWorld.

 
