GoogleBot typically crawls from the United States, though not 100% of the time. In fact, Google recently began crawling on a limited basis from other countries, but only to check locale-aware features.
In any event, what if you have a web site that is not accessible to US users for legal or other reasons? Google says GoogleBot crawling from the US wouldn't be able to access it either, which will probably cause major indexing issues.
Google's John Mueller said this in a Google Webmaster Help thread yesterday. He wrote, "In general, our cloaking guidelines say that you must show Googlebot the same content as you would show other users from the region that it's crawling from. So if you're blocking users in the US, then you'd need to block Googlebot when it's crawling from the US (as is generally the case)."
He did offer one workaround: you can make some legally permissible content available to US users, and GoogleBot can then index that content. But without allowing US users onto your web site, you have to assume GoogleBot won't access it either, unless you do things that are against Google's Webmaster Guidelines. John said, "one suggestion would be to have content that's globally accessible, for both users & Googlebot from the US, which can then be indexed in search."
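To make the guideline concrete, here is a minimal sketch of a geo-block that follows Mueller's advice. Everything in it is an assumption for illustration: the `country_for_ip()` helper is hypothetical (a real site would query a GeoIP database such as one read via the geoip2 library, or use a CDN/edge rule), and Flask is just a convenient stand-in for whatever stack you run. The key design point is that the block keys only on the visitor's region and has no user-agent exception, since whitelisting GoogleBot would show it content US users can't see, which is cloaking.

```python
# Minimal sketch of a guideline-compliant geo-block (Flask).
from flask import Flask, request

app = Flask(__name__)

def country_for_ip(ip: str) -> str:
    # Hypothetical stand-in for a real GeoIP lookup. A real implementation
    # would query a GeoIP database or read a country header set by a CDN.
    # 203.0.113.0/24 is a documentation range used here purely as a stub.
    return "US" if ip.startswith("203.0.113.") else "XX"

@app.before_request
def block_us_traffic():
    # Decide by region only. GoogleBot crawling from a US IP is blocked
    # exactly like any other US visitor; adding a user-agent carve-out
    # here would violate Google's cloaking guidelines.
    if country_for_ip(request.remote_addr or "") == "US":
        return "Not available in your region.", 451  # Unavailable For Legal Reasons
```

Under this setup, GoogleBot simply sees the same 451 response as any US visitor, and pages behind the block stay out of the index, which is exactly the trade-off Mueller describes.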
This isn't a new topic; we actually wrote about it a few times, including in 2008 and 2011. The interesting part is that the advice hasn't changed, even since the January 2015 news that GoogleBot gained locale-aware crawling smarts.
Forum discussion at Google Webmaster Help.