There are tons of niche and general directories these days for everything under the sun, and a recent thread on SEOchat examines how they will survive and some of the problems owners face when a directory holds too many links. Try managing over 500,000 links without ever pointing to a bad neighborhood. One of the members, who owns a very large directory, is concerned about linking to these bad neighborhoods and is looking for ways to find links that could harm the site in the SERPs. Lots of good feedback in this thread.
One of the first suggestions, naturally, was to add the rel="nofollow" attribute to the links. This would tell search engines not to follow those links, but it would also be wholesale PR hoarding, which is notoriously unwise. Another member posts that instead of blocking the search engines from these links you should let them be spidered, arguing that your position in the SERPs will dramatically increase. I threw in the suggestion that you could drop the categories that are "at risk" for bad neighborhoods, or at least scan those categories for suspect links. Another member who also runs a large directory says: "If you really want to do something about it, get yourself a free Google API key, then write a PHP script that looks up each domain or page in Google's SERPs via the 'site:' command. Just remember to run it each night and not to exceed your API query limit."
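To make that last suggestion a bit more concrete, here is a minimal sketch of what such a nightly check could look like, written in Python rather than PHP for illustration. Everything in it is an assumption rather than the poster's actual script: the query_site() helper would need to be wired up to whatever Google API access you have, and the file names and daily query cap are placeholders you would adjust to your own setup.

```python
"""Nightly check: flag directory domains that show zero indexed pages
for a `site:` query, a rough signal the site may be banned or penalized."""

import time

DAILY_QUERY_LIMIT = 1000   # placeholder: set this to your API key's daily quota
PAUSE_SECONDS = 2          # small pause between queries to stay polite


def query_site(domain: str) -> int:
    """Hypothetical helper: return the number of results Google reports
    for `site:domain`. Wire this up to your own Google API client."""
    raise NotImplementedError("connect this to your search API of choice")


def check_domains(domains: list[str]) -> list[str]:
    """Return the domains that come back with zero indexed pages."""
    suspects = []
    for i, domain in enumerate(domains):
        if i >= DAILY_QUERY_LIMIT:      # stop before exceeding the daily quota
            break
        if query_site(domain) == 0:
            suspects.append(domain)
        time.sleep(PAUSE_SECONDS)
    return suspects


if __name__ == "__main__":
    # Hypothetical input file: one listed domain per line.
    with open("directory_domains.txt") as fh:
        domains = [line.strip() for line in fh if line.strip()]
    for suspect in check_domains(domains):
        print(suspect)
```

Run from a nightly cron job, a script along these lines would chip away at a 500,000-link directory a thousand domains at a time and leave you a short list of candidates to review by hand.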
Continued Discussion at SEOchat