Google's Site Performance Shows Data For Some Blocked Pages

Dec 30, 2010 - 9:38 am
Filed Under Google Updates

A Google Webmaster Help thread has one webmaster asking why Google's Site Performance reports in Google Webmaster Tools show pages he blocked via robots.txt.
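For reference, blocking a page from GoogleBot in robots.txt typically looks something like this (the path here is purely illustrative):

    User-agent: Googlebot
    Disallow: /slow-page.html

A rule like this stops GoogleBot from crawling the page, but as explained below, that is not how Site Performance gathers its data.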

The reason is pretty simple: GoogleBot is not used to calculate the speed of a page.

Instead, Google uses Toolbar data collected from real users browsing your web pages with the Google Toolbar installed. Blocking GoogleBot does not stop those users from accessing your site, so their page speed data still shows up in the report.

So if you are trying to hide your slow pages from Google, I'd recommend other methods, such as hiding your slow pages from your users.

Forum discussion at Google Webmaster Help.

 
