Gary Illyes from Google posted a PSA (public service announcement) of sorts on Mastodon and LinkedIn for sites that are heavy with JavaScript. He said that if you have a JavaScript-heavy site, you should try to load the content, including the "marginal boilerplate," first.
He said this is recent advice based on a "bunch" of emails in his inbox from SEOs complaining that they see "lots of dups reported in Search Console." In those cases, the centerpiece content "took forever to load," "so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups," he added.
So if you see this issue in Google Search Console for your JavaScript-heavy site, try restructuring your JavaScript so the content loads first.
Here is what Gary posted:
Do you have a JavaScript-heavy site and you see lots of dups reported in Search Console? Try to restructure the js calls such that the content (including marginal boilerplate) loads first and see if that helps.

I have a bunch of emails in my inbox where the issue is that the centerpiece took forever to load, so rendering timed out (my most likely explanation) and we were left with a bunch of pages that only had the boilerplate. With only the boilerplate, those pages are dups.
Gary later added on Mastodon, "Search engines are in fact very similar to a user's browser when it comes to indexing, but a user doesn't access billions of pages (or however many search engines typically access) every day, so they must have stricter limits."
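To make the idea concrete, here is a rough, hypothetical sketch (not code from Gary's post) of what "content first" bootstrapping could look like on a site that renders its main content with JavaScript: the centerpiece content is fetched and put in the DOM before any non-essential scripts are queued. The API endpoint, selectors and script paths below are made-up placeholders.

```typescript
// Hypothetical sketch: render the centerpiece content before anything else,
// so a rendering timeout still leaves unique content (not just boilerplate).

async function renderMainContent(): Promise<void> {
  // Fetch and render the main article content first.
  const response = await fetch("/api/article?id=123"); // placeholder endpoint
  const article = await response.json();
  const main = document.querySelector("main");
  if (main) {
    // Illustrative only; a real app would sanitize/escape this markup.
    main.innerHTML = `<h1>${article.title}</h1><div>${article.body}</div>`;
  }
}

function loadNonEssentialScripts(): void {
  // Defer widgets, recommendations, analytics, etc. until the content is in place.
  for (const src of ["/js/comments.js", "/js/related.js"]) { // placeholder bundles
    const script = document.createElement("script");
    script.src = src;
    script.defer = true;
    document.body.appendChild(script);
  }
}

// Content first, everything else afterwards.
renderMainContent().then(loadNonEssentialScripts);
```

The point is simply the ordering: if rendering gets cut off partway through, the unique content is already on the page rather than only the shared boilerplate that makes pages look like duplicates.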