Google's John Mueller said it would be "extremely rare that Googlebot would submit a form" on your web site. He said Google did this more often in the old days, especially on governmental web sites where there was no site navigation and the only way to get to public content was to use a search box on the site. But nowadays most sites have good navigation through links, and it is rare for Google to have to submit forms.
He added that if you do see Googlebot submitting forms on your site, it might be a sign that you need to improve your site's navigation and architecture so Google can crawl the site normally.
He said this at the 15:08 mark into this video:
Here is the transcript:
It's extremely rare that Googlebot would submit a form. It's something where we primarily did this way in the beginning when websites were structured in a way that we could not crawl them properly. In particular, we saw this issue on a lot of government websites where there is a lot of content on the site but you had to go to a search form to actually find links to that content. And for sites like that, pretty much the only way to get to the detailed content was to go through the search form.
However, for pretty much every modern site we can crawl normally, and people are used to creating a structure that we can crawl with categories and subcategories, where essentially we never need to go through any of the forms.
So I would imagine, for most of the people who have sites and have logs that they can look at, if you look at the server logs and you look at Googlebot, you would probably never see Googlebot submitting any of the forms that you have on the site.
So that's something that's really extremely rare, and usually, when it does happen with a website, it's kind of a sign that we can't crawl normally. Where we realize there's a lot of content but we can't actually find that content at all.
So that's something where, if you're seeing this happening, I would kind of go down the direction of, like, what am I doing wrong, what could I be doing differently, with regards to my site's navigational structure. That's, I think, the primary aspect there. And with that in mind, adding more complexity like iframes or other domains, I suspect a lot of that would just not happen, just for practical reasons. Because we want to avoid running into a situation where we accidentally enter things like credit card numbers and accidentally Googlebot goes off and buys things or fills out some contact forms with, like, random information. All of that doesn't really make sense, and it causes almost more problems than it helps anyone. So that's something where I imagine, if you have a configuration with iframes and other domains, you would probably never see Googlebot go through that.
In 2008, Google said it can crawl content behind forms, but as recently as 2015 Google said it only does so if it cannot access your content via normal navigation.
Forum discussion at YouTube Community.