Google's John Mueller said that since Google caches the robots.txt file for about 24 hours, it does not make much sense to dynamically update your robots.txt file throughout the day to control crawling.
Google won't necessarily see that you don't want it to crawl a page at 7am but then do want it to crawl that page at 9am.
John Mueller wrote on Bluesky in response to this post:
QUESTION:
One of our technicians asked if they could upload a robots.txt file in the morning to block Googlebot and another one in the afternoon to allow it to crawl, as the website is extensive and they thought it might overload the server. Do you think this would be a good practice? (Obviously, the crawl rate of Googlebot adapts to how well the server responds, but I found it an interesting question to ask you) Thanks!
ANSWER:
It's a bad idea because robots.txt can be cached up to 24 hours ( developers.google.com/search/docs/... ). We don't recommend dynamically changing your robots.txt file like this over the course of a day. Use 503/429 when crawling is too much instead.
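To illustrate the 503/429 approach Mueller points to, here is a minimal sketch using Flask. The load check, the Googlebot user-agent match, and the threshold are all hypothetical placeholders; the point is simply that the server answers crawlers with a temporary 503 plus a Retry-After header when it is under strain, while robots.txt itself stays stable.

```python
# Minimal sketch (Flask, hypothetical load check): instead of swapping
# robots.txt during the day, answer crawlers with 503 + Retry-After when
# the server is overloaded, so Googlebot backs off on its own.
import os

from flask import Flask, Response, request

app = Flask(__name__)

LOAD_THRESHOLD = 0.85  # hypothetical per-core load cutoff


def server_is_overloaded() -> bool:
    """Placeholder for a real health check (CPU, queue depth, etc.)."""
    load1, _, _ = os.getloadavg()  # Unix-only
    return load1 / (os.cpu_count() or 1) > LOAD_THRESHOLD


@app.before_request
def throttle_crawlers():
    # Only throttle bot traffic; regular visitors are served normally.
    user_agent = request.headers.get("User-Agent", "")
    if "Googlebot" in user_agent and server_is_overloaded():
        # 503 signals a temporary condition; Retry-After is a hint in seconds.
        return Response(
            "Service temporarily unavailable",
            status=503,
            headers={"Retry-After": "3600"},
        )


@app.route("/robots.txt")
def robots():
    # Keep robots.txt stable; don't flip Disallow rules during the day.
    return Response("User-agent: *\nAllow: /\n", mimetype="text/plain")
```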
This is not new; we covered this a decade ago in Google: Don't Make A Dynamically Generated robots.txt, and we knew about the 24-hour caching behavior back in 2010.
Forum discussion at Bluesky.