On Friday, Google's JohnMu tweeted an important tip. Let's say you block a directory from being crawled in your robots.txt file, and later you want to add content to that directory and have Google crawl it. John explained that since Google caches your robots.txt, you should update the robots.txt file at least 24 hours before adding content to that directory.
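As a hypothetical illustration (the /new-section/ directory name is made up), the robots.txt change is simply lifting the Disallow rule ahead of time:

Before (directory blocked):

User-agent: *
Disallow: /new-section/

At least a day before the new pages go live, remove the block:

User-agent: *
Disallow:

An empty Disallow value allows everything for that user-agent; you could also just delete the line entirely.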
Here is JohnMu's tweet:
Robots-tip: crawlers cache your robots.txt; update it at least a day before adding content that is disallowed. Q&A in Buzz.
Tedster in WebmasterWorld posted this tip and added:
I never thought about this before, but we certainly know that googlebot works off a cache of the robots.txt most of the time. Otherwise it would need to ping robots.txt right before each URL request, and that would get nasty pretty fast. So apparently, 24 hours is what John is saying is the length of the cache time. Good to know. When Disallowed content gets placed online, this is one precaution I never thought about.
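To picture what "working off a cache" means, here is a minimal sketch using Python's standard urllib.robotparser. The example.com URLs are placeholders, and this is only an analogy for the behavior Tedster describes, not how Googlebot itself is implemented:

from urllib.robotparser import RobotFileParser

# Fetch robots.txt once and keep the parsed rules in memory.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Every later URL check runs against the cached rules, with no new
# request for robots.txt. Until the cache is refreshed, a lifted
# Disallow rule won't be seen.
print(parser.can_fetch("Googlebot", "https://www.example.com/new-section/page.html"))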
The discussion also veers into whether you should use the robots.txt protocol to block content at all, but I won't get into that debate in this post.
Forum discussion at WebmasterWorld.