We've all seen the "Page indexed without content" message in Google Search Console's index coverage report. Google's Gary Illyes said that when you see it, most of the time (but not all of the time) it is about "pages that are blocked by robots.txt."
The error is defined in Google's help document as "Page indexed without content: This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google can't index. This is not a case of robots.txt blocking. Inspect the page, and look at the Coverage section for details."
Gary Illyes was asked if this error can be caused by "heavy loading time or time-outs," but he said no. If it were a heavy loading time or time-out issue, you would likely see a soft 404 notice instead, Gary explained. He added that "this error is really just for pages that are blocked by robots.txt."
Here are those tweets:
no, we would likely just not use those pages if they time out. maybe we'd report them as soft404, depending on whether they time out for Googlebot or rendering.
— Gary 鯨理/경리 Illyes (@methode) March 20, 2021
this error is really just for pages that are blocked by robots.txt
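For reference, a page gets blocked from crawling when it matches a Disallow rule in the site's robots.txt file; Google can still index the URL from links alone, just without its content. A minimal illustrative example (the path is only a placeholder) might look like this:

    # Hypothetical example: Googlebot cannot crawl anything under /private/,
    # so a page there that Google indexes from links alone could surface as
    # "Page indexed without content."
    User-agent: Googlebot
    Disallow: /private/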
Forum discussion at Twitter.