Double Check Your Robots.txt: Google Testing New Crawler Directives

Nov 21, 2007 - 7:54 am

In Validate your robots.txt - Googlebot becomes smarter, Sebastian reports official confirmation from Google that they are testing out new crawler directives.

He explains that adding "Noindex: /" to your robots.txt file will now deindex your entire site. Google told us about the new REP META tags protocol and X-Robots-Tag support a while back, so be careful with your old tags.
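
To illustrate, the directive he tested would look something like this in a robots.txt file (a sketch only: the User-agent grouping is standard robots.txt syntax rather than a quote from his post, and, as Google's comment below makes clear, support for the directive is experimental):

User-agent: Googlebot
# Experimental: asks Google to drop every URL on this host from its index
Noindex: /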

Google commented on Sebastian's post, saying:

Good catch, Sebastian. How is your experiment going? At the moment we will usually accept the “noindex” directive in the robots.txt, but we are not yet at a point where we are willing to set it into stone and announce full support.
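
For contrast, the already-announced page-level mechanisms mentioned above are the meta robots tag, placed in a page's HTML head:

<meta name="robots" content="noindex">

and the equivalent X-Robots-Tag HTTP response header, which also covers non-HTML files:

X-Robots-Tag: noindex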

Forum discussion at Sphinn.
