Google's Gary Illyes posted a pop quiz for SEOs on Twitter the other day. The responses to the poll showed that SEOs are split on how Google would handle a contradictory robots.txt directive.
Gary asked "how will this affect crawling:"
User-agent: *
Disallow: /
Allow: /
He said try not to use a tool when answering.
After almost 2,500 responses to the poll, 52% said it would lead to everything being disallowed and 48% said it would lead to everything being allowed.
Here are the results:
Try answering without the help of a tool. How will this affect crawling:
— Gary 鯨理/경리 Illyes (@methode) July 2, 2020
User-agent: *
Disallow: /
Allow: /
Former Googler Pedro Dias said order matters, but a current Googler shared notes that may say otherwise:
Are you sure? https://t.co/7Me8PBbg47
— Edu Pereda (@epere4) July 2, 2020
Then Pedro showed the tool:
Yes :) pic.twitter.com/wG9KsXICq2
— Pedro Dias (@pedrodias) July 2, 2020
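If you want to check a snippet like this yourself, you can run it through any robots.txt parser, although different parsers may resolve the Disallow/Allow conflict differently, which is part of why the poll split. Here is a quick local check using Python's standard-library urllib.robotparser (the example.com URL is just a placeholder):

from urllib.robotparser import RobotFileParser

# The exact snippet from Gary's quiz.
robots_txt = """\
User-agent: *
Disallow: /
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Ask whether a generic crawler may fetch an arbitrary page.
# Depending on how this parser breaks the tie between the two rules,
# the answer may or may not match what Google's own parser does.
print(parser.can_fetch("*", "https://example.com/some/page"))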
So the answer should be clear cut, right? Honestly, if any SEO saw this in a robots.txt file, they would just clean it up and make sure the directives were clear. But it was a fun poll to see.
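For what it's worth, Google's robots.txt documentation says the most specific rule (the longest matching path) wins, and when an allow and a disallow rule are equally specific, the least restrictive rule, meaning the Allow, is used. By that reading, Disallow: / plus Allow: / should leave everything crawlable. A rough sketch of that precedence logic in Python, not Google's actual parser, might look like this:

# Rough sketch of the documented precedence rule, not Google's parser:
# the longest matching rule path wins; if an allow and a disallow
# match with equal length, the allow (least restrictive) is used.

def is_allowed(path, rules):
    # rules is a list of (directive, rule_path) pairs, e.g. ("disallow", "/")
    matches = [(len(rule_path), directive)
               for directive, rule_path in rules
               if path.startswith(rule_path)]
    if not matches:
        return True  # no rule matches, so crawling is allowed by default
    # Sort by path length, breaking ties in favor of "allow".
    matches.sort(key=lambda m: (m[0], m[1] == "allow"))
    return matches[-1][1] == "allow"

# The quiz snippet: both rules match every path with the same length,
# so the tie goes to Allow and everything stays crawlable.
print(is_allowed("/some/page", [("disallow", "/"), ("allow", "/")]))  # True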
Forum discussion at Twitter.