Seems somebody at Yahoo doesn't know how to train Slurp well enough to behave. Its misbehavior has a few webmasters on WMW talking about Slurp (the bot) taking down their servers after they bulk-submit keywords to Yahoo. The trouble starts when the spider comes by to verify the links in the ads: it keeps hitting a link to confirm it exists, and the load can take the server down. And if the site goes offline, the ads at Yahoo don't get approved, because the spider took down the advertiser's site, which makes this an efficiency problem for Yahoo to fix. I heard of another case like this last year, so I was glad to see a thread about it. One WMW member "spoke with someone at Yahoo who confirmed that they do have a process that clicks on the links and it is possible that we'd get a lot of concurrent clicks."
Another member goes on to say:
This is an issue that is well-known to any advertiser who has uploaded several thousand terms, and yet seems to be a mystery to Yahoo's 'support' people. In the past, I've sent them dozens of megabytes of our log files showing Slurp hammering our servers as it checks our submission. If you use the bulk upload to modify your listings, Slurp will check both the old and the new versions ... sometimes at rates up to 20 hits per second or more, and often continuing for many, many hours (over 32, in one case).
Y 'support' has no idea what is going on. We end up blocking the bad (terribly rude) spider for a few days and it finally stops coming. Seems to have no impact on whether the listings are approved or not.
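If Slurp is chewing through your server in the meantime, here's a rough sketch of how you might throttle or shut it out. Yahoo's crawler does honor the non-standard Crawl-delay directive in robots.txt; the ten-second delay and the Apache User-Agent block below are illustrative examples, not anything Yahoo prescribes:

    # robots.txt - ask Slurp to pause between requests
    # (Yahoo's crawler honors the non-standard Crawl-delay directive)
    User-agent: Slurp
    Crawl-delay: 10

    # .htaccess (Apache) - hard block by User-Agent if the bot won't slow down
    SetEnvIfNoCase User-Agent "Slurp" block_slurp
    Order Allow,Deny
    Allow from all
    Deny from env=block_slurp

The catch, of course, is that blocking the verification spider could also keep your listings from getting checked, though per the member above it seemed to have no impact on approval either way.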
Continued discussion at WebmasterWorld