Gary Illyes from Google said on LinkedIn that if your server returns a 500/503 HTTP status code for your robots.txt file for an extended period of time, Google may remove your site completely from Google Search.
This applies even if the rest of your site is accessible and not returning a 500 or 503 status code.
It is not just a 500/503 HTTP status code that you need to worry about; network timeouts when Googlebot requests your robots.txt file cause the same problem.
Again, it has to be for an "extended period of time," which was not defined, but I assume it is more than just a day or two.
Gary wrote, "A robots.txt file that returns a 500/503 HTTP status code for an extended period of time will remove your site from search results, even if the rest of the site is accessible to Googlebot." "Same goes for network timeouts," Gary added.
Gary referred to the HTTP docs and added, "if we can't determine what's in the robotstxt file and the server doesn't tell us a robotstxt file doesn't exist, it would be way more hurtful to crawl as if everything was allowed (eg. we might index martin's awkward hat pictures accidentally)."
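Since an extended run of 5xx responses on robots.txt is the trigger, it may be worth monitoring that one URL separately from your general uptime checks. Below is a minimal sketch, assuming Python with the requests library and a hypothetical example.com site; it simply reports the status code of robots.txt and flags 5xx responses and timeouts, the two failure modes Gary called out.

```python
# Minimal robots.txt health check (sketch; assumes the `requests` library
# is installed and that https://www.example.com is your site).
import requests

ROBOTS_URL = "https://www.example.com/robots.txt"  # hypothetical URL

def check_robots_txt(url: str = ROBOTS_URL, timeout: float = 10.0) -> None:
    try:
        response = requests.get(url, timeout=timeout)
    except requests.Timeout:
        # Per Gary, network timeouts are treated like server errors.
        print(f"WARNING: {url} timed out after {timeout}s")
        return
    if 500 <= response.status_code < 600:
        # Extended 5xx periods on robots.txt can get the site dropped.
        print(f"WARNING: {url} returned {response.status_code}")
    elif response.status_code == 404:
        # A 404 is fine: Google treats a missing robots.txt as "crawl allowed."
        print(f"OK: {url} returned 404 (no robots.txt)")
    else:
        print(f"OK: {url} returned {response.status_code}")

if __name__ == "__main__":
    check_robots_txt()
```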
We know that Google recommends returning a 503 server status code when your website goes offline temporarily for less than a day (hours, not multiple days). If it will be down for longer, then try to serve a static version of your site in its place.
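To illustrate the short-outage case, here is a minimal maintenance-mode sketch, assuming Python with Flask; the catch-all route and the one-hour Retry-After value are illustrative choices, not Google requirements. The idea is to return a 503 with a Retry-After header while the outage lasts hours, not days.

```python
# Sketch of a temporary maintenance response (assumes Flask is installed;
# the route and Retry-After value are illustrative, not Google requirements).
from flask import Flask, Response

app = Flask(__name__)
MAINTENANCE = True  # flip this while the site is down for maintenance

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def catch_all(path: str) -> Response:
    if MAINTENANCE:
        # 503 tells crawlers the outage is temporary; Retry-After hints
        # when to come back. Keep this to hours, not days.
        return Response(
            "Down for maintenance, back soon.",
            status=503,
            headers={"Retry-After": "3600"},
        )
    return Response("Hello, world!", status=200)

if __name__ == "__main__":
    app.run()
```

The point of the 503 over a 200 maintenance page is that it signals "temporary" rather than "this is the new content," so nothing gets indexed or dropped prematurely.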
Just be careful with long downtime - not that you likely have a choice.
Forum discussion at LinkedIn.