Google's John Mueller answered a question on Stack Exchange about whether Googlebot uses SPDY for crawling the web.
The question:
When SPDY is available for a site, does Googlebot use it? If so, sites which are SPDY enabled would appear faster to Googlebot.
After asking internally at Google, John Mueller said that Googlebot does not currently use it: "no, Googlebot doesn't use SPDY."
SPDY is an open networking protocol, developed by Google, for transporting web content. It manipulates HTTP traffic with the particular goals of reducing web page load latency and improving web security. SPDY achieves reduced latency through compression, multiplexing, and prioritization, although the gains depend on a combination of network and website deployment conditions.
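For illustration only, here is a minimal Python sketch of how a site owner might check whether a server advertises SPDY during the TLS handshake. It uses the standard library's ssl module and NPN (the negotiation mechanism SPDY relied on); the host name and protocol list are placeholders I've chosen, not anything from Google or the Stack Exchange thread, and it assumes a Python/OpenSSL build with NPN support.

```python
import socket
import ssl

HOST = "example.com"  # placeholder host; substitute the site you want to test

# Build a TLS context and advertise SPDY/3.1 alongside HTTP/1.1 via NPN.
if not ssl.HAS_NPN:
    raise SystemExit("This Python/OpenSSL build has no NPN support")

ctx = ssl.create_default_context()
ctx.set_npn_protocols(["spdy/3.1", "http/1.1"])

with socket.create_connection((HOST, 443)) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        # Returns the protocol the server selected, or None if it ignored NPN.
        print("Negotiated protocol:", tls.selected_npn_protocol())
```

If the script prints spdy/3.1, the server can speak SPDY to clients that ask for it; per John's answer above, Googlebot simply isn't one of those clients today.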
John added that Google may use it in the future:
With regards to potential future functionality, this kind of thing is something they would definitely consider, if they can see that it would have a significant impact. For example, if they see many servers that are overloaded with requests, where combining requests could allow better crawling, then maybe this would be an option. On the other hand, if it's only an option on servers that are already well-crawled, then it probably doesn't make that much sense.
John is specifically talking about how Googlebot crawls the web, not about ranking.
Sites that are really slow and suffer from a site speed issue can potentially use SPDY to improve their page speed and thus their rankings. But again, SPDY itself has no direct impact as a ranking factor.
Forum discussion at Stack Exchange.
Image credit to BigStockPhoto for rocket