Google's John Mueller hinted that you should aim for the "Time spent downloading a page" metric in Google Search Console to be under a few hundred milliseconds. He said he has seen some sites under 100 milliseconds and others well over 1,000 milliseconds. This is not specifically about page speed; it is more about how fast Googlebot can crawl your site.
John said on Twitter, "We don't have any hard guidelines on this, but anecdotally I see sites with a few 100ms time to fetch resources (some with <100ms), and other sites with way over 1000ms." "Faster usually lets us crawl more, should we need it (crawling is just a small piece of the picture though)," he added.
Later he said, "This is not about rendering / user-experience (though that's important too), it's really just about crawling for search." "The two sides are significantly different, and often mixed up," he added.
You can see this stat in the old Google Search Console, under the Crawl Stats section.
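If you want a rough sense of your own fetch times outside of Search Console, a quick spot-check can help. Below is a minimal Python sketch that times full page downloads; the URL list and the 300 ms threshold are illustrative assumptions, and keep in mind that Search Console's metric reflects Googlebot's own fetches, which will differ from measurements taken from your machine.

```python
import time
import urllib.request

# Hypothetical list of your own pages to spot-check (assumption, not from the article).
URLS = [
    "https://example.com/",
    "https://example.com/blog/",
]

def fetch_time_ms(url, timeout=10):
    """Return the time in milliseconds to download the full response body."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include body transfer, since the GSC metric covers the full download
    return (time.monotonic() - start) * 1000

for url in URLS:
    ms = fetch_time_ms(url)
    # 300 ms is an illustrative cutoff based on Mueller's "a few 100ms" anecdote.
    flag = "OK" if ms < 300 else "slow"
    print(f"{url}: {ms:.0f} ms ({flag})")
```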
Here are those tweets:
We don't have any hard guidelines on this, but anecdotally I see sites with a few 100ms time to fetch resources (some with <100ms), and other sites with way over 1000ms. Faster usually lets us crawl more, should we need it (crawling is just a small piece of the picture though).
— 🍌 John 🍌 (@JohnMu) September 11, 2018
Yeah, that's the stat shown in search console. This is not about rendering / user-experience (though that's important too), it's really just about crawling for search. The two sides are significantly different, and often mixed up.
— 🍌 John 🍌 (@JohnMu) September 11, 2018
Forum discussion at Twitter.