Google's John Mueller was asked how GoogleBot handles crawling, indexing and ranking SPAs, single page applications, on the web. John said on Twitter that it's "not always perfect, and certainly not easy, but for some sites it can work well, even if you rely on client-side-rendering."
So we know it is possible, but there are obviously many stumbling blocks you need to be aware of when ensuring Google can crawl and index the content on SPAs. Using Fetch as Google can help you determine what Google can see, but you might need to dig deeper into the code if you rely on client-side rendering.
John Mueller did add that relying on JavaScript alone, with no server-side rendering, can cause even more issues. So if you run an SPA, you need to test and test again to see what Google can pick up.
It's not always perfect, and certainly not easy, but for some sites it can work well, even if you rely on client-side-rendering (just JS, no server-side-rendering). YMMV :)
— John ☆.o(≧▽≦)o.☆ (@JohnMu) July 16, 2018
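One simple way to "test and test" is to check whether your critical content appears in the raw HTML a crawler receives before any JavaScript runs. Here is a minimal sketch of that idea; the helper function, sample HTML snippets, and phrases are all hypothetical, not an official Google tool:

```python
def content_in_initial_html(html: str, phrases: list[str]) -> bool:
    """Return True only if every critical phrase is already present
    in the raw HTML payload, i.e. visible without executing JS."""
    return all(phrase in html for phrase in phrases)

# A typical client-side-rendered shell: the real content only arrives
# after the JavaScript bundle runs in a browser.
csr_shell = (
    "<html><body><div id='app'></div>"
    "<script src='bundle.js'></script></body></html>"
)

# A server-side-rendered page: the content is in the initial response.
ssr_page = (
    "<html><body><div id='app'>"
    "<h1>Product specs</h1><p>Price: $19</p>"
    "</div></body></html>"
)

critical = ["Product specs", "Price: $19"]

print(content_in_initial_html(csr_shell, critical))  # False: empty JS shell
print(content_in_initial_html(ssr_page, critical))   # True: content survives without JS
```

GoogleBot does render JavaScript, so a failed check here doesn't mean the page won't be indexed, but it flags pages where indexing depends entirely on successful rendering, which is exactly where Mueller says results can vary.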
Google did say back in 2015 that using SPAs is not cloaking or against Google's guidelines.
Forum discussion at Twitter.