Google's John Mueller said in a long Google Hangout yesterday that it is not problematic to cloak your XML Sitemap files for search engines. The reason is that XML Sitemap files are meant to be seen only by search engines, not by users, so serving users a different version isn't an issue. You can even vary them for Googlebot versus Yahoo, Bing, etc. - there is no issue with that.
John said this at the 19:30 mark in the video.
The question was:
As a way to deal with scrapers, I'd like to add the X-Robots-Tag to my XML Sitemaps and then permit crawling only to Googlebot and some other reputable crawlers, but not to rogue crawlers or to any users. Can it be perceived as cloaking?
John's answer:
No. I don’t think that would be in any way problematic. With sitemap files you can even cloak them directly to search engines. That is something that we would explicitly allow. Where if you test IP addresses and you see it is not a Google[bot] or Bing or Yandex or Yahoo or whatever else by IP address, you can serve like a not allowed page if you want to. So that’s the kind of situation where this content is explicitly only for search engines, so you can choose to really explicitly only show it to search engines.
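For the curious, here is a rough sketch of what that could look like in practice: verify that the requester really is a major crawler (using the reverse-then-forward DNS check Google documents for Googlebot) before serving the sitemap, and attach the X-Robots-Tag header from the question. The Flask route, file name and hostname suffixes below are my own illustrative assumptions, not something John or Google spelled out.

```python
# Sketch: serve sitemap.xml only to verified search engine crawlers.
# The allowed hostname suffixes and the route are illustrative assumptions.
import socket

from flask import Flask, abort, request, send_file

app = Flask(__name__)

# Reverse-DNS suffixes for major crawlers (Google documents this verification
# method for Googlebot; Bing and Yandex follow the same pattern).
ALLOWED_SUFFIXES = (
    ".googlebot.com",
    ".google.com",
    ".search.msn.com",
    ".yandex.com",
    ".yandex.net",
    ".yandex.ru",
)


def is_verified_crawler(ip):
    """Reverse-DNS lookup, then confirm the hostname resolves back to the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith(ALLOWED_SUFFIXES):
            return False
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False


@app.route("/sitemap.xml")
def sitemap():
    if not is_verified_crawler(request.remote_addr):
        abort(403)  # the "not allowed page" John mentions
    response = send_file("sitemap.xml", mimetype="application/xml")
    # Keep the sitemap itself out of search results, per the question.
    response.headers["X-Robots-Tag"] = "noindex"
    return response
```

The DNS double-check is preferable to a hard-coded IP list because crawler IP ranges change, and a plain user-agent check is not enough since scrapers routinely spoof Googlebot's user agent.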
Here is the video embed:
So if you are dying to do some cloaking, have fun with your XML Sitemaps. :) Just be careful: if you show yourself a different XML file than you show Googlebot, it can be hard to debug what Googlebot actually sees.
Forum discussion at Google+.