Joost published a blog post showing how some of the sites he saw hit by Google Panda 4.0 recovered, and how, in one case, unblocking the CSS & JavaScript resulted in the site returning to its normal rankings.
Joost wrote, "they’ve returned on almost all of their important keywords. Just by unblocking Google from spidering their CSS and JS."
Well, I have an issue with this for a few reasons:
(1) Panda is an algorithm, and it needs to run again before any change can have an impact. So first you'd need to unblock your CSS & JavaScript, then wait for GoogleBot to recrawl the site and pick up the change, then Google needs to process all of that, and then Panda has to be rerun. (For what unblocking looks like in practice, see the robots.txt sketch after this list.)
(2) I haven't seen enough evidence from the community to prove this works.
(3) Much of his supporting evidence is Google recommending at SMX Advanced a couple of weeks ago that you not block CSS & JavaScript. Google has been saying that for years and years; they just keep repeating it.
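To be clear about what "unblocking" means here: it is a robots.txt change. Below is a minimal sketch, assuming (hypothetically) that a site had been blocking its CSS and JavaScript with directory-level Disallow rules; the actual paths and rules will vary from site to site:

    # Before (hypothetical): these rules stop GoogleBot from fetching CSS & JS
    User-agent: *
    Disallow: /css/
    Disallow: /js/

    # After: either delete those Disallow lines, or explicitly allow the
    # stylesheet and script files (GoogleBot supports * and $ wildcards)
    User-agent: Googlebot
    Allow: /*.css$
    Allow: /*.js$

Again, this only lets GoogleBot fetch and render those files; per point (1) above, Panda would still need to rerun before rankings could move.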
The truth is, a lot of sites that may have benefited from or taken a hit with Panda 4 actually saw reversals a week ago (last weekend). We have a story on that over here. So something did happen with Panda 4, but Google would not chime in on it.
I think it is unrelated to the blocking or allowing of CSS & JavaScript.
Robert Charlton, a moderator at WebmasterWorld, wrote in the thread:
He associated a site's drop with its accidental blocking of CSS and Javascript files. This was shortly after Google had announced its new Fetch and Render feature in Webmaster Tools. Assumption is that this is now being used in the page layout algorithm. Unblocking CSS and JS appeared to produce quick recoveries. This kind of association is entirely consistent with algorithmic changes I've seen over the years, where Google has been quick to make use of a capability that we first see in a reporting feature.
What do you think?
Forum discussion at WebmasterWorld.
Update: We have somewhat of a response on this from Google; new story over here.