Google Responds To Impact Of Blocking CSS & JS & Panda 4.0

Jun 24, 2014 - 8:23 am

Yesterday we covered some SEO theories around blocking JavaScript & CSS triggering Panda 4.0 issues. I honestly didn't believe there was a relation, based on the example provided and the very few other sites reporting the same effects, but now we have a response from a Googler.

Well, maybe the response is a bit Google-like and cloudy.

One webmaster posted the theory on Google Webmaster Help, and John Mueller responded to the specific case at hand, not necessarily to Panda 4.0 and how it relates to blocking CSS & JavaScript. But he did respond to the question of being hit by Panda while blocking content via external files.

John Mueller of Google wrote:

Allowing crawling of JavaScript and CSS makes it a lot easier for us to recognize your site's content and to give your site the credit that it deserves for that content. For example, if you're pulling in content via AJAX/JSON feeds, that would be invisible to us if you disallowed crawling of your JavaScript. Similarly, if you're using CSS to handle a responsive design that works fantastically on smartphones, we wouldn't be able to recognize that if the CSS were disallowed from crawling. This is why we make the recommendation to allow crawling of URLs that significantly affect the layout or content of a page. I'm not sure which JavaScript snippet you're referring to, but it sounds like it's not the kind that would be visible at all. If you're seeing issues, they would be unrelated to that piece of JavaScript being blocked from crawling.
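To make the mechanics concrete, this kind of blocking typically lives in robots.txt. The directory paths below are hypothetical, but a site disallowing its script and stylesheet directories like this would keep Googlebot from fetching those files when it tries to render a page:

    # Hypothetical robots.txt that keeps crawlers away from scripts and styles
    User-agent: *
    Disallow: /js/
    Disallow: /css/

Removing those Disallow lines, or adding explicit Allow: /js/ and Allow: /css/ rules, lets Googlebot fetch the resources it needs to render the page the way a browser would, which is the recommendation John is making above. With the Disallow rules in place, content pulled in via AJAX/JSON feeds or a responsive layout handled in CSS may simply be invisible to Google.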

So is John saying that if you block content, it may impact the Panda algorithm? Or is he saying that Google can't see the blocked content anyway, so it has no impact on Panda? Or maybe it may or may not have an impact, since Panda is about content and perhaps layout?

See how this can get confusing? What is your take?

Forum discussion at Google Webmaster Help.

Update: John responded again, basically implying it is not Panda-related. He wrote:

Looking at your site, those disallowed scripts are definitely not causing a problem -- it's primarily an issue of problematic links here. That's what I'd focus on first. Since there's a manual action involved, that's something which you can work on to resolve.

He then answers the specific question at hand head-on:

Regarding your more general question of whether disallowed scripts, CSS files, etc play a role in our Panda quality algorithm: our quality algorithms primarily try to understand the overall quality of a page (or website), and disallowing crawling of individual aspects is generally seen as more of a technical issue so that wouldn't be a primary factor in our quality algorithms. There's definitely no simple "CSS or JavaScript is disallowed from crawling, therefore the quality algorithms view the site negatively" relationship.

He goes on in more detail, so check out the thread.

Image credit to BigStockPhoto for Panda Java Mug

 
