Almost all SEOs know that a busy or slow server will result in GoogleBot slowing how it crawls your web site, and we also know that extremely slow sites and pages can be negatively impacted when it comes to ranking well in the Google search results.
But how slow is too slow?
Google's John Mueller called out a specific load time as being too slow for GoogleBot to crawl a site at its normal rate. He said in a Google Webmaster Help thread:
We're seeing an extremely high response-time for requests made to your site (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we'll crawl from your site, and you're seeing that in Fetch as Google as well. My recommendation would be to make sure that your server is fast & responsive across the board. As our systems see a reduced response-time, they'll automatically ramp crawling back up (which gives you more room to use Fetch as Google too).
He specifically called out response times of "over 2 seconds" to fetch a single URL on this site, which resulted in GoogleBot "severely limiting the number of URLs" it will crawl on that site.
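If you want a rough sense of where your own pages fall relative to that two-second mark, a quick timing check can help. Below is a minimal sketch in Python, assuming hypothetical example URLs (swap in pages from your own site); it simply times a full fetch of each URL, which is only a rough proxy for what GoogleBot measures.

```python
import time
import urllib.request

# Hypothetical URLs for illustration; replace with pages from your own site.
URLS = [
    "https://www.example.com/",
    "https://www.example.com/some-page/",
]

for url in URLS:
    start = time.monotonic()
    # Fetch the full response body, roughly as a crawler would.
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    elapsed = time.monotonic() - start
    flag = "  <-- over the ~2 second mark Mueller described" if elapsed > 2.0 else ""
    print(f"{url}: {elapsed:.2f}s{flag}")
```

Keep in mind this measures one fetch from one location; server response times vary with load, so repeated checks (or your server logs) give a better picture.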
Note, John is not mentioning anything about the PageSpeed algorithm here, just the crawling of the site.
We don't often hear specific numbers like this from Google.
Forum discussion at Google Webmaster Help.