Google's John Mueller confirmed that Googlebot will crawl and pick up on URL patterns that simply do not work on your site. We have all seen this happen time and time again with sites we manage. But John added that the crawling of those broken patterns should slow down over time as Google picks up on this.
John said on Mastodon, "Usually our systems pick up on URL patterns that don't work and slow down crawling for them." He then added, "so it'll likely get a bit better (but you'll still see these warnings)."
This is in regard to sites receiving this Search Console notice:
Here is a screenshot of the conversation if you don't want to click through.
Forum discussion at Mastodon.