Sebastian's post, Validate your robots.txt - Googlebot becomes smarter, reports official confirmation from Google that they are testing new crawler directives.
He explains that adding "Noindex: /" to your robots.txt file will now deindex your entire site. Google told us about the new REP META tags protocol and X-Robots-Tag support a while back, so just be careful with your old tags.
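To illustrate, a robots.txt file using the experimental directive might look like the following (a hypothetical sketch; the Noindex line is the unsupported directive under test, not an officially documented one):

User-agent: Googlebot
Disallow: /private/
Noindex: /

The User-agent and Disallow lines are standard Robots Exclusion Protocol directives; Disallow only blocks crawling, while the experimental Noindex would also remove matching URLs from Google's index.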
Google commented on Sebastian's post, saying:
Good catch, Sebastian. How is your experiment going? At the moment we will usually accept the “noindex” directive in the robots.txt, but we are not yet at a point where we are willing to set it into stone and announce full support.
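Until that support is set in stone, the confirmed ways to keep a page out of Google's index remain the robots META tag in the page's HTML:

<meta name="robots" content="noindex">

or the equivalent X-Robots-Tag HTTP response header:

X-Robots-Tag: noindex

Both of these are the announced, supported mechanisms mentioned above, unlike the robots.txt directive being tested here.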
Forum discussion at Sphinn.