A Google Webmaster Help thread has one webmaster asking why GoogleBot seems to max out at 10 requests per second when crawling his site. The webmaster even documented the behavior over time and plotted it in the graph below.
He wanted to know why there is a cap at 10 requests per second: why not crawl more if the server can handle it?
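If you want to reproduce the webmaster's measurement on your own site, here is a minimal sketch that buckets GoogleBot requests by second from a server access log. It assumes the common "combined" log format; the log path is hypothetical, and keep in mind that anyone can spoof the Googlebot user agent, so a reverse DNS check is advisable for serious analysis.

```python
import re
from collections import Counter

GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)
# Combined log format timestamp, e.g. [10/Oct/2013:13:55:36 -0700]
TIMESTAMP = re.compile(r"\[([^\]]+)\]")

def googlebot_qps(log_path):
    """Count Googlebot requests per one-second bucket from an access log."""
    per_second = Counter()
    with open(log_path) as log:
        for line in log:
            if not GOOGLEBOT.search(line):
                continue
            match = TIMESTAMP.search(line)
            if match:
                per_second[match.group(1)] += 1  # bucket by whole second
    return per_second

if __name__ == "__main__":
    counts = googlebot_qps("/var/log/nginx/access.log")  # hypothetical path
    if counts:
        print("Peak Googlebot requests in one second:", max(counts.values()))
```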
Google's John Mueller answered that there is indeed no cap; the rate is set automatically, and GoogleBot will crawl as much as it thinks it can without harming your server. John said there is "no hard-coded limit of 10 QPS for crawling." He said there is nothing for you to do, other than making sure you are not limiting GoogleBot yourself from crawling your site.
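On that last point, one quick self-check is to scan the same access log for throttling-style responses served to GoogleBot. The sketch below, under the same assumed log format and hypothetical path as above, counts 429 and 503 responses, which would suggest your server or a rate limiter is holding the crawler back.

```python
import re

GOOGLEBOT = re.compile(r"Googlebot", re.IGNORECASE)
# In the combined log format, the status code follows the quoted request line.
STATUS = re.compile(r'" (\d{3}) ')

def throttled_hits(log_path, codes=("429", "503")):
    """Count responses to Googlebot that suggest self-imposed rate limiting."""
    hits = 0
    with open(log_path) as log:
        for line in log:
            if GOOGLEBOT.search(line):
                match = STATUS.search(line)
                if match and match.group(1) in codes:
                    hits += 1
    return hits

if __name__ == "__main__":
    print(throttled_hits("/var/log/nginx/access.log"))  # hypothetical path
```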
If you think something is wrong, John said you can always click on the "Report a problem with Googlebot" link within Webmaster Tools to help Google debug the crawl issue.
Here is that chart:
Forum discussion at Google Webmaster Help.