Google: Avoid Blocking Pages That Are Important Enough To Have Links To Them

Dec 14, 2020 - 7:21 am 0 by


Google's John Mueller advised that "if you see that these pages are important enough that people are linking to them, then I would try to avoid blocking them by robots.txt."

In short, if you have important or popular pages with a lot of links to them, make sure Google can access the page.

If you block that page via robots.txt, Google may drop the links, and those links won't help Google understand the true importance of your web site. That means your rankings can decline in Google Search.
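To see the effect of such a rule, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The robots.txt rules and URLs are hypothetical, chosen only to illustrate a well-linked page being blocked:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks a popular, well-linked page.
rules = """
User-agent: *
Disallow: /popular-guide/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is matched by the wildcard rule, so it cannot fetch the
# blocked page; links pointing at it stop helping Google evaluate the site.
print(rp.can_fetch("Googlebot", "https://example.com/popular-guide/"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/other-page/"))     # True
```

Removing the `Disallow` line (or scoping it away from pages people link to) is the fix Mueller is suggesting here.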

Of course, it all depends on the specific situation of that piece of content, so this is not blanket advice that applies in every situation.

Here is the embed where John said this at the 6:44 mark into Friday's video:

Forum discussion at YouTube Community.

 
