Danny reports that WebmasterWorld has banned spiders from crawling the site, pointing to a thread Brett started named "lets try this for a month or three..."
Brett explains that the bots, mostly from unauthorized sources, are taking a toll on the server: "We have pushed the limits of page delivery, banning, ip based, agent based, and down right cloaking to avoid the rogue bots - but it is becoming an increasingly difficult problem to control."
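The thread doesn't quote the exact mechanism, but a site-wide crawler ban like this is typically expressed with a blanket disallow in robots.txt (a hypothetical sketch, not WebmasterWorld's actual file):

```
# Hypothetical robots.txt for a site-wide crawler ban:
# "*" matches every user agent, and "Disallow: /" blocks the whole site.
User-agent: *
Disallow: /
```

Of course, well-behaved search engine bots honor robots.txt while the rogue scrapers Brett describes ignore it, which is why he also mentions IP-based and agent-based banning and cloaking.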
Members are worried about how they will search WebmasterWorld in the meantime, since the internal search engine is sub-par. Brett hopes to allow the bots back within 60 days, but that depends on how things go. Best of luck, Brett!
Forum discussion at WebmasterWorld.