The Google Webmaster Central Blog published a post named How Google defines IP delivery, geolocation, and cloaking. In the geolocation section, they explain that you should treat Googlebot like any other visitor from California; they also go over IP delivery, cloaking, and the Google News First Click Free program.
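For readers unfamiliar with the distinction Google draws, here is a minimal sketch in Python of IP delivery done the way the post describes, with a hypothetical lookup_country() helper standing in for a real GeoIP database. The served variant depends only on the visitor's IP location, so Googlebot crawling from a California IP gets exactly what any other California visitor gets:

```python
GREETINGS = {"US": "Howdy!", "FR": "Bonjour!", "DE": "Hallo!"}

def lookup_country(ip: str) -> str:
    """Hypothetical stand-in for a real GeoIP database lookup."""
    return "US"  # placeholder: a real site would query GeoIP data here

def page_for(ip: str) -> str:
    # The decision keys off the request IP alone; Googlebot crawling
    # from a California IP gets the US page, exactly like any human
    # visitor from that IP. No user-agent check anywhere.
    country = lookup_country(ip)
    return GREETINGS.get(country, GREETINGS["US"])  # sensible default
```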
I won't walk through the blog post itself; you can read it at Google. I will, however, link to the related Google Groups thread and show you how this post started yet another cloaking and IP delivery debate.
You can also watch a video that summarizes the blog post:
Here are select quotes, with quick sketches of the underlying techniques after them:
Hopefully this is legit, but if not, how in general should browser-specific HTML generation be handled? Should we pretend that robots are IE7 or Firefox, for example?
Re: cloaking: What about REST and the idea of different resource representations that is so prevalent throughout the web? Saying that one could run an MD5 hash on a resource URL to determine a change in content violates the principles of REST, which state that resources can exist at the same URL in different representations, the nature of the representation returned or received being determined by content negotiation.
Is what I'm doing really cloaking? I'm only trying to make it easier for Google to (efficiently) crawl the site which contains over 30 million posts. I'm not in any way trying to manipulate my rankings.
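On the browser-specific HTML question in the first quote, the safe pattern implied by Google's guidelines is to fall back to a standards-compliant default for any unrecognized user agent, bots included, rather than pretending bots are IE7 or Firefox. A rough sketch, with hypothetical template names:

```python
def pick_template(user_agent: str) -> str:
    # Key markup differences off known browser quirks and fall back to
    # a standards-compliant default for anything unrecognized, crawlers
    # included, instead of pretending bots are a particular browser.
    if "MSIE 7.0" in user_agent:
        return "page_ie7.html"   # hypothetical IE7-specific workarounds
    return "page_standard.html"  # default: bots see the same baseline HTML
```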
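As for the REST point in the second quote, HTTP content negotiation does let one URL return multiple representations: the representation is selected by request headers such as Accept, and the response should declare that dependency via Vary. A hedged sketch of the idea, with an illustrative body for each representation:

```python
def negotiate(accept_header: str) -> tuple[str, dict]:
    # One URL, several representations: the body is chosen from the
    # Accept header, and "Vary: Accept" tells caches (and, in
    # principle, crawlers) that the body depends on that header.
    if "application/json" in accept_header:
        body = '{"title": "How Google defines IP delivery"}'
        content_type = "application/json"
    else:
        body = "<h1>How Google defines IP delivery</h1>"
        content_type = "text/html"
    return body, {"Content-Type": content_type, "Vary": "Accept"}
```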
The cloaking debate never seems to end.
Forum discussion at Google Groups and WebmasterWorld.