Google: Keep The Default CMS Robots.txt Setting

Dec 28, 2016 - 7:49 am


Google's John Mueller said in a Google Webmaster Help thread that it makes most sense to keep the default setting found within your content management system for your robots.txt access control.

This answer comes in response to someone asking if they should block or allow access to the admin portion of their WordPress site.

John said that, generally, you want to keep the default setting. This is especially true for WordPress sites, which are pretty search-engine friendly out of the box.
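For reference, the default (virtual) robots.txt that WordPress serves looks roughly like this — exact contents vary by WordPress version and installed plugins, so treat this as an illustrative sketch:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

It blocks crawlers from the admin area while still allowing the admin-ajax endpoint that front-end features rely on.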

John wrote:

For the most part, unless you're aware of something specific that you need to change, I'd recommend leaving the robots.txt file to the default that your website's content management system has. If you're curious about how the robots.txt file works, you might want to take a look at https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt too.

So use Search Console, and if you see issues, tweak it; otherwise, keep it as it is.
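If you do want to sanity-check a robots.txt tweak before deploying it, Python's standard-library `urllib.robotparser` can evaluate rules locally. The rules and URLs below are hypothetical examples, and note that Python's matcher applies rules in file order rather than Google's longest-match semantics, so treat it as a rough check:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt contents (modeled on a typical WordPress default)
rules = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether a crawler identifying as Googlebot may fetch specific paths
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))  # blocked
print(parser.can_fetch("Googlebot", "https://example.com/2016/12/sample-post/"))  # allowed
```

Google's own robots.txt tester in Search Console remains the authoritative check for how Googlebot itself interprets the file.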

Forum discussion at Google Webmaster Help.

 
