Over the past few days, there has been a spike in complaints about Google Search Console reporting an invalid sitemap error for sitemap URLs that should be valid. Some say adding an allow directive before the sitemap URL in the robots.txt file fixes the issue, but many are curious as to why.
A thread at the Google Webmaster Help Forums is gaining traction with complaints from numerous webmasters and SEOs. The first person who posted wrote, "I have valid sitemap and robots.txt even though tool showing bellow error on Search Console. Can some one please help on this, we tried all but nothing is working, this is really urgent fix for us."
Here is a screenshot of that error:
The workaround folks found was to add an allow directive before the sitemap URL line in your robots.txt file, replacing your existing sitemap line with this:
Allow: sitemap: https://"your-domain-name"/sitemap_index.xml
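As a sanity check on the reports above, a sketch using Python's standard library robots.txt parser shows that a plain `Sitemap:` line, with no `Allow:` prefix, is already valid syntax and is recognized by the parser. This supports the idea that the Search Console error was a reporting bug rather than a real robots.txt problem. The domain below is a placeholder, not one from the original reports:

```python
from urllib.robotparser import RobotFileParser

# A typical robots.txt with a standard Sitemap line (placeholder domain)
robots_lines = [
    "User-agent: *",
    "Disallow:",
    "Sitemap: https://example.com/sitemap_index.xml",
]

parser = RobotFileParser()
parser.parse(robots_lines)

# site_maps() (Python 3.8+) returns the sitemap URLs the parser found,
# or None if no Sitemap line was recognized.
print(parser.site_maps())
# → ['https://example.com/sitemap_index.xml']
```

This is the same kind of check the third-party validators mentioned below would perform, which is why they reported no errors even while Search Console did.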
Here are some posts on X with complaints about this as well:
@googlesearchc @rustybrick
— Milan Dhrangadharia 🇮🇳 (@milan_0015) November 30, 2023
Didn't updated anything since long in Robots.txt file but recently noticed syntax error in Search Console report.
Also validated robots file using 3rd party tool for errors but didn't find any.
Similar thread: https://t.co/heS5US2ylR pic.twitter.com/TrZt6hvTVV
Same issue here.
— Mobilanyheter (@mobilanyheter) December 3, 2023
Google has not yet chimed in on this topic in the forum thread.
But a Google top contributor, Dave, wrote yesterday, "There was a reporting bug in search console in this new report, it's apparently now been fixed so over the next couple of days you should see this error go away. Let us know if you're still seeing issues after that."
Many still see the issue this morning.
Forum discussion at Google Webmaster Help Forums.