This might not seem like a big deal, and most people won't even notice a difference, but Googlebot appears to be using a slightly different user agent that supports HTTP/1.1, including server-side compression of documents. That works out to roughly a 5x bandwidth saving for servers that have compression turned on.
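The saving comes from the server gzipping its responses when the client advertises support for it (the `Accept-Encoding: gzip` request header in HTTP/1.1). As a rough, made-up illustration of why that matters, here is Python's `gzip` module run over some sample repetitive HTML (the content and numbers are hypothetical, not measurements from Googlebot):

```python
import gzip

# Made-up sample page: HTML tends to be repetitive, so it compresses well.
html = ("<html><body>"
        + "<p>Some repetitive page content.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
ratio = len(html) / len(compressed)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes, "
      f"about {ratio:.0f}x smaller")
```

Real pages won't compress quite this well, but text-heavy documents routinely shrink several-fold, which is where the bandwidth saving comes from.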
They are still running the old spiders as well, so it might just be a test phase or something. Either way, it's something I've been wishing for, and it seems we now have it. There is some discussion on it over here.