GoogleBot is actually pretty good, very good even, at not crawling most sites so heavily that it slows down page loading. GoogleBot is designed to detect that and back off. But if you have a specific page, or set of pages, that is super slow because your database queries are written inefficiently, Google says don't blame them.
Gary Illyes, the official House Elf and Chief of Sunshine and Happiness at Google (he has lots of titles), playing the official bad guy this time, wrote on Twitter on July 16, 2020: "if you have a page that uses a considerable amount of server cpu when accessed - maybe because database queries? -, you need to fix/optimize that. if Googlebot uses up your server quota because it fetches that page, that's on you, not on Googlebot."
Now if Google could just help you automatically optimize your queries, add some indexes, or something...
But the truth is, yes, Gary is 100% correct here. Make sure your database queries are designed to handle traffic, not just from GoogleBot but also from the users that Google Search may send your way.
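To make the point concrete, here is a minimal sketch of the kind of fix Gary is describing, using Python with SQLite and a hypothetical `pages` table looked up by `slug` on every request: without an index, each crawl or visit triggers a full-table scan; adding an index turns it into a cheap lookup.

```python
import sqlite3

# Hypothetical setup: a "pages" table that is queried by slug on every page view.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (id INTEGER PRIMARY KEY, slug TEXT, body TEXT)")
conn.executemany(
    "INSERT INTO pages (slug, body) VALUES (?, ?)",
    ((f"post-{i}", "body text") for i in range(100_000)),
)

# Without an index, this lookup scans the whole table on every request.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT body FROM pages WHERE slug = ?", ("post-54321",)
).fetchall()
print(plan)  # plan shows a SCAN of the pages table

# Adding an index on the filtered column makes the same query an index lookup.
conn.execute("CREATE INDEX idx_pages_slug ON pages (slug)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT body FROM pages WHERE slug = ?", ("post-54321",)
).fetchall()
print(plan)  # plan shows a SEARCH using idx_pages_slug
```

This is just an illustration of the general principle; on a real CMS the table, columns, and database engine will differ, but the idea is the same: pages that GoogleBot and your visitors hit repeatedly should not be running full-table scans.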
Forum discussion at Twitter.