Back in 2007, Google documented a way for SEOs and site owners to verify that Googlebot is who it says it is using reverse DNS lookups. Now Google has also decided to publish a list of the IP addresses Googlebot uses to crawl your site.
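For reference, the 2007-era reverse DNS verification can be sketched roughly like this: do a reverse lookup on the visiting IP, check the hostname ends in googlebot.com or google.com, then do a forward lookup to confirm it maps back to the same IP. This is a minimal illustration, not Google's own code, and real verifiers should handle timeouts and caching.

```python
import socket

def verify_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS check for Googlebot.

    1) reverse-resolve the IP to a hostname,
    2) require a googlebot.com / google.com hostname,
    3) forward-resolve that hostname and confirm it maps back to the IP.
    """
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except OSError:
        return False  # no PTR record (or lookup failed): not verified
    if not (host.endswith(".googlebot.com") or host.endswith(".google.com")):
        return False  # hostname is not in a Google crawler domain
    try:
        forward = socket.getaddrinfo(host, None)  # forward lookup
    except OSError:
        return False
    # the forward lookup must include the original IP
    return any(entry[4][0] == ip for entry in forward)
```

An IP with no valid reverse DNS (such as one from the reserved TEST-NET range) simply fails the check.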
Google posted two different JSON files with the list of IP addresses Googlebot can use:
(1) You can identify Googlebot by IP address by matching the crawler's IP address to the list of Googlebot IP addresses in this JSON file.
(2) For all other Google crawlers, match the crawler's IP address against the complete list of Google IP addresses in this JSON file.
I assume these IP addresses may change from time to time, so it might make sense to check the JSON files daily for updates, at least if you automate any scripts that use this IP list.
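Putting that together, a script could fetch the JSON file on a schedule and match a visitor's IP against the published ranges. The sketch below assumes the file lists ranges as objects keyed by ipv4Prefix or ipv6Prefix, which is the layout Google has used for these files; the URL shown is Google's published Googlebot list, and you should confirm both against the current documentation.

```python
import ipaddress
import json
from urllib.request import urlopen

# URL of Google's published Googlebot ranges (confirm against Google's docs)
GOOGLEBOT_JSON = "https://developers.google.com/search/apis/ipranges/googlebot.json"

def parse_networks(data: dict) -> list:
    """Parse the JSON into ip_network objects.

    Assumes entries under "prefixes" carry either an ipv4Prefix
    or an ipv6Prefix key in CIDR notation.
    """
    networks = []
    for entry in data.get("prefixes", []):
        prefix = entry.get("ipv4Prefix") or entry.get("ipv6Prefix")
        if prefix:
            networks.append(ipaddress.ip_network(prefix))
    return networks

def fetch_googlebot_networks(url: str = GOOGLEBOT_JSON) -> list:
    """Download the current list; run this daily to pick up changes."""
    with urlopen(url) as response:
        return parse_networks(json.load(response))

def is_googlebot_ip(ip: str, networks: list) -> bool:
    """True if the IP falls inside any published Googlebot range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)
```

Separating the parsing from the download makes the match logic easy to test offline, and lets you cache the fetched list rather than hitting the URL on every request.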
It is great to have these officially published by Google. Keep in mind, there are other methods to verify Googlebot listed here.
I asked John Mueller of Google why they did this now, and why reverse DNS is not good enough. John responded on Twitter, saying "it makes it a bit easier for some sites (CDNs, etc), and the old issues / risks around cloaking seem to have mostly gone away, so..."
It makes it a bit easier for some sites (CDNs, etc), and the old issues / risks around cloaking seem to have mostly gone away, so
— 🧀 John 🧀 (@JohnMu) November 10, 2021
Forum discussion at Twitter.