Some sites hosted on certain CDNs (content delivery networks) are seeing a big spike in server response times for crawling, along with a drop in total crawl requests. So crawling has dropped, but Google is taking much longer to crawl a lot less. Reportedly, this started earlier this month and is still an issue for some sites.
This was discovered by Gianna Brachetti-Truskawa, who posted about it on both LinkedIn and Bluesky. She wrote:
Have you seen a recent drop in new users, and/or found that Google's crawl rate has dropped on your site while server response times seem to be higher than usual? Google have quietly updated their list of IP ranges used for crawling (as of 04.02.2025). If your website is delivered via a CDN, their WAF protecting your site from DDoS attacks might have Googlebot run into rate limiting or be blocked now - unless they updated their allowed IP ranges accordingly.
This did not affect every CDN; in fact, Cloudflare handled it fine, she said. But not all CDNs did. "Luckily, Cloudflare seems to be on top of it! But we found reports of a few websites delivered via other CDNs, including larger ones like Akamai Technologies, who run into the issue, suggesting that their CDN providers might not have updated their IP ranges for Googlebot yet," she wrote.
Here is a chart from a Google Webmaster Help Forum thread showing the issue. You can look at your crawl stats in Search Console over here:
Back in 2021, Google began publishing its Googlebot IP list, and I covered some of the times Google updated that list (then I stopped; it wasn't exciting - until now).
John Mueller from Google replied to the concerns on Bluesky, basically explaining that there is a JSON file to track these changes and that the crawling should settle down over time. He wrote:
We push the IP json files automatically -- changes happen from time to time. If you need to alert internally on those files, feel free to poll them. I checked the last three updates, they were each 2x IP blocks added (ipv6/v4). It's generally not a complete revamp. It's hard to know how the web will react to subtle infrastructure shifts, which is part of why we've been publishing these IP ranges automatically. Hopefully it was just a short-term blip!
I still track these changes, and normally they aren't that frequent and are pretty minor relative to the overall size of the document. But changes are changes - here are some of the more recent changes that I tracked:
You can see the JSON file here.
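If you want to take John up on his suggestion to poll the file, here is a minimal sketch of what that could look like in Python. It assumes the documented Googlebot IP range URL and a local snapshot path of your choosing; it simply diffs the current prefixes against the last copy you saved:

import json
import urllib.request
from pathlib import Path

# The file Google documents for Googlebot's crawl IP ranges.
GOOGLEBOT_IPS = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"
# Local copy from the previous run (path is an assumption, use whatever fits your setup).
SNAPSHOT = Path("googlebot-ips.json")

def prefixes(raw: bytes) -> set[str]:
    """Extract the IPv4/IPv6 prefixes from the JSON payload."""
    data = json.loads(raw)
    return {
        p.get("ipv4Prefix") or p.get("ipv6Prefix")
        for p in data.get("prefixes", [])
    }

with urllib.request.urlopen(GOOGLEBOT_IPS) as resp:
    raw = resp.read()

current = prefixes(raw)
previous = prefixes(SNAPSHOT.read_bytes()) if SNAPSHOT.exists() else set()

added = sorted(current - previous)
removed = sorted(previous - current)
if added or removed:
    print("Googlebot IP ranges changed")
    print("  added:  ", added)
    print("  removed:", removed)
else:
    print("No changes to Googlebot IP ranges")

# Keep the latest copy so the next run compares against it.
SNAPSHOT.write_bytes(raw)

Run it on a schedule (cron, a CI job, whatever you already have) and wire the print statements into your alerting of choice.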
Gianna Brachetti-Truskawa shared some tips on what you can do if you are impacted - she wrote (a quick spot-check sketch follows her list):
- Check with your CDN provider if they've updated their IP ranges for Googlebot. You can ask them to verify using Google's JSON file. If not, consider switching to a provider that keeps up with these changes.
- Consider monitoring changes yourself, or find snapshots of the file in the Wayback Machine. You can also save snapshots there on demand by yourself (I would not suggest to rely on infrastructure you don't own but it's one easy way!) and then compare the two files with your favourite method (eg. using Testomato or Little Warden - or a Compare plugin in Notepad++ if you're feeling old-school).
- Find more advice about CDNs in the comments.
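On the first tip, if you want to spot-check for yourself whether an IP from your server logs falls inside Google's published ranges before raising it with your CDN, a rough sketch using the same JSON file and Python's standard ipaddress module might look like this (the example IP is just an illustration, not a claim about any specific address):

import ipaddress
import json
import urllib.request

GOOGLEBOT_IPS = "https://developers.google.com/static/search/apis/ipranges/googlebot.json"

def googlebot_networks() -> list:
    """Load the published Googlebot prefixes as network objects."""
    with urllib.request.urlopen(GOOGLEBOT_IPS) as resp:
        data = json.load(resp)
    nets = []
    for p in data.get("prefixes", []):
        prefix = p.get("ipv4Prefix") or p.get("ipv6Prefix")
        nets.append(ipaddress.ip_network(prefix))
    return nets

def is_googlebot_ip(ip: str, networks=None) -> bool:
    """True if the IP falls inside one of the published Googlebot ranges."""
    networks = networks or googlebot_networks()
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks if net.version == addr.version)

# Example: check an address that showed up in your logs claiming to be Googlebot.
print(is_googlebot_ip("66.249.66.1"))

If a hit that identifies itself as Googlebot is not in these ranges, it is either a spoofed crawler or your copy of the ranges is stale - which is exactly the situation described above.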
Do you want me to cover the changes to this JSON file going forward? Would it be helpful to you?
Forum discussion at LinkedIn and Bluesky.
Update: There is now also a WebmasterWorld thread complaining about the same thing - here is a similar chart from there: