Pierre Far from Google announced on his Google+ page that Google has significantly slowed down Googlebot, Google's web crawler, to reduce any negative impact the blackout might have on participating sites in Google's search results.
Google laid out the SEO considerations for blacking out your site yesterday - for those who wanted to do what Wikipedia and other sites are doing by preventing access to their site. Wikipedia did it in a search-engine-friendly way, much like how we did our April Fools joke a couple of years ago: overlaying a black cover page using JavaScript. Technically, it might be against Google's terms of service, but I am sure Google will let it slide. The reason this works well is that Googlebot won't see the JavaScript overlay, so crawling remains unaffected.
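For those curious what that kind of overlay looks like, here is a minimal sketch in TypeScript. To be clear, this is not Wikipedia's actual code; the element id, styles, and message text are made up for illustration. The key idea is that the black cover is injected client-side after the normal page loads, so a crawler that does not execute JavaScript still sees the original HTML underneath.

```typescript
// Minimal sketch of a client-side blackout overlay (hypothetical ids/text).
// The underlying page content stays in the HTML, so non-JS crawlers still see it.
function showBlackoutOverlay(): void {
  const overlay = document.createElement("div");
  overlay.id = "sopa-blackout-overlay"; // hypothetical id
  overlay.style.position = "fixed";
  overlay.style.top = "0";
  overlay.style.left = "0";
  overlay.style.width = "100%";
  overlay.style.height = "100%";
  overlay.style.backgroundColor = "#000";
  overlay.style.color = "#fff";
  overlay.style.zIndex = "9999";
  overlay.style.display = "flex";
  overlay.style.alignItems = "center";
  overlay.style.justifyContent = "center";
  overlay.textContent = "This site is blacked out in protest of SOPA/PIPA.";
  document.body.appendChild(overlay);
}

// Show the cover only after the original page content has loaded.
window.addEventListener("DOMContentLoaded", showBlackoutOverlay);
```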
That being said, many other sites are blacking out without thinking about what impact it may have on their Google rankings. So Google decided to drastically slow down Googlebot's crawl activity for today only, which should help the sites that are not taking SEO into account.
Here is what Pierre said:
Hello webmasters! We realize many webmasters are concerned about the medium-term effects of today's blackout. As a precaution, the crawl team at Google has configured Googlebot to crawl at a much lower rate for today only so that the Google results of websites participating in the blackout are less likely to be affected.
This is a pretty serious move from Google, but it is in all likelihood the best thing they can do today.
Oh, the picture above is Google's official picture of Googlebot, but I made him all dark for SOPA.
Forum discussion at Google+.