Google's John Mueller said Google web search can handle sites on IPv6 & HTTP/2 just fine. But John also recommends that the setup continue to support the older infrastructure, IPv4 & HTTP/1.x (which is typically how these are configured by default anyway), because it helps Googlebot and other crawlers & user-agents access your content, he said.
John wrote:
Sites on HTTPS, with support for IPv6 & HTTP/2, work fine in Google Search - lots of sites support that infrastructure. For Googlebot, and other crawlers & user-agents, it's important that IPv4 & HTTP/1.x support continues to exist on a site, but that's generally how this infrastructure is set up, especially if you're using a more common hoster / CDN.
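If you want a quick sanity check outside of Google's tools, a short script can confirm that your host still resolves over both IPv4 and IPv6 and still answers over plain HTTP/1.1. Here is a minimal sketch using only Python's standard library; example.com is a placeholder for your own hostname:

```python
import socket
import http.client

HOST = "example.com"  # placeholder; swap in your own hostname

# Check DNS resolution for both address families. A site that only
# publishes AAAA records would be unreachable for IPv4-only clients.
for family, label in ((socket.AF_INET, "IPv4 (A)"), (socket.AF_INET6, "IPv6 (AAAA)")):
    try:
        records = socket.getaddrinfo(HOST, 443, family, socket.SOCK_STREAM)
        print(f"{label}: {sorted({r[4][0] for r in records})}")
    except socket.gaierror:
        print(f"{label}: no records found")

# http.client speaks HTTP/1.1 only, so a successful response here
# confirms the HTTP/1.x fallback that crawlers may still rely on.
conn = http.client.HTTPSConnection(HOST, timeout=10)
conn.request("HEAD", "/")
resp = conn.getresponse()
print(f"HTTP/1.1 HEAD / -> {resp.status} {resp.reason}")
conn.close()
```

This is only a reachability check, not a crawl simulation, but if either address family fails to resolve or the HTTP/1.1 request fails, that is worth fixing before worrying about anything ranking-related.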
Then, if you want to make sure Googlebot is not having issues with your site, just use the Fetch and Render tool in Google Search Console and you should see right away if there are issues. John wrote:
My recommendation would be to use a tool like Fetch and Render in search console to double-check. You'll quickly see whether Googlebot can access your site or not. If Googlebot can access it normally, and you're seeing a change in ranking, it's possible that the infrastructure change isn't related to the ranking change. In that case, it can help to get feedback from the community here, with regards to what you might be able to focus on with your specific site.
Being in this industry, I find it notable that this person took the time to ask how moving to IPv6 & HTTP/2 would impact Googlebot. Most don't think about crawlers and just make the switch. Of course, Google needs to make sure Googlebot can handle it, but for the most part, it should not be an issue.
Forum discussion at Google Webmaster Help.