As many of you know, Google recently set new quotas on the Fetch as Google submit-to-index feature because of spam and abuse. But it still seems there is a cat-and-mouse game going on between Google and spammers over that tool.
John Mueller of Google responded to the complaints about the tool spitting out error messages, saying Google has set "pretty aggressive limits there at the moment." He added on Twitter, "I suspect that'll settle down again over time, but in general, I'd recommend focusing on non-manual methods for normal crawling & indexing (like sitemap files :))."
yeah, we have some pretty aggressive limits there at the moment. I suspect that'll settle down again over time, but in general, I'd recommend focusing on non-manual methods for normal crawling & indexing (like sitemap files :)).
— John ☆.o(≧▽≦)o.☆ (@JohnMu) March 13, 2018
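For anyone not already using one, a sitemap is just an XML file that lists the URLs you want Google to crawl; you point to it from robots.txt or submit it in Search Console, and it removes the need for manual submissions. Here is a minimal sketch of writing one in Python, with placeholder URLs and file name you would swap for your own:

```python
# Minimal sketch: writing a basic XML sitemap.
# The URLs and output file name below are placeholders -- use your own pages.
from datetime import date

urls = [
    "https://www.example.com/",
    "https://www.example.com/blog/new-post/",
]

entries = "\n".join(
    f"  <url>\n    <loc>{u}</loc>\n    <lastmod>{date.today().isoformat()}</lastmod>\n  </url>"
    for u in urls
)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```

Once the file is live on your site, adding a `Sitemap:` line to robots.txt or submitting it in Search Console lets Google pick up new pages on its own schedule, without touching the Fetch as Google tool.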
Here is the error many are seeing when they use the tool:
Clearly, Google is still working out the right settings to stop those trying to abuse the tool.
So if you get this error, take a break, come back later and try again. Maybe by then the page will have been crawled naturally by Google anyway.
Forum discussion at Twitter.