Google's URL Removal Tool Lacks Support for Wildcards

Jan 16, 2006 - 8:58 am

Dan Thies reports over at the Search Engine Watch Forums that Google's URL Removal tool doesn't support its robots.txt extensions. He explains that even though "Googlebot supports an extension to the robots.txt syntax, which allows webmasters to use wildcards in disallow directives," the URL removal tool does not support those same extensions. He said it "will generate an error message telling you that wildcards aren't allowed, if you feed it a robots.txt file which makes use of these extensions."
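
For illustration, a robots.txt file using Googlebot's wildcard extension might look like the sketch below (the paths are made up for this example, not taken from Dan's post):

User-agent: Googlebot
Disallow: /*.pdf$
Disallow: /private*/

Googlebot treats the * as "any sequence of characters" and the $ as "end of URL," but per Dan's report, feeding this same file to the URL removal tool produces the error that wildcards aren't allowed.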

Dan continues to explain that "Matt Cutts confirmed this... but it really shouldn't be a huge problem under normal circumstances, since it should only take a few days for Googlebot to pick up changes in the robots.txt file, and drop any pages that are disallowed."

So I would expect wildcard support to be added to the removal tool soon.

Forum discussion at Search Engine Watch Forums.
