Google is now sending out a rush of new warnings via Google Search Console (formerly Google Webmaster Tools) to notify webmasters that Googlebot cannot access the CSS and JavaScript (JS) files on their websites.
Michael Gray first notified me directly via Twitter, and I had planned on covering it tomorrow, but Google seems to be sending these warnings out at a rapid pace. Tons of webmasters are concerned after receiving them.
Here is a picture:
Google has been telling webmasters not to block CSS & JavaScript for years and years. Here is Matt Cutts in 2012 telling webmasters not to block it. The webmaster guidelines were updated to say not to block them. The new fetch and render tool warns you when you block CSS and JavaScript. We also know Google now renders the page as a user would see it, so blocking CSS/JS can impact that in a big way.
Like I said, it seems Google is sending these notices out en masse now. The message reads:
Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.
This is not a penalty notification, but rather a warning that if Google cannot see your whole site, your rankings may suffer.
If you get this message, talk to your developers and discuss what, if anything, you need to do. Use the fetch and render tool to diagnose the issue further as well.
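If robots.txt turns out to be the culprit, one common fix (a sketch, not official Google guidance) is a Googlebot-specific group that opens up stylesheets and scripts, since Googlebot supports the Allow directive along with the * and $ wildcards:

    User-agent: Googlebot
    Allow: /*.css$
    Allow: /*.js$

One caveat: Googlebot follows only the most specific group that matches it, so a group like this replaces your User-agent: * rules for Googlebot entirely; copy over any Disallow lines you still want applied, and verify the result with the fetch and render tool.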
Forum discussion at Twitter.
Update: I should add that many, many WordPress sites are getting this notification because their /wp-includes/ folder is blocked by robots.txt. Many popular CMS solutions also block their include files by default.
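To illustrate (a hypothetical robots.txt, not taken from any specific site), the WordPress setups triggering this warning often look something like this, with Allow lines added to unblock just the assets Google needs:

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /wp-includes/
    Allow: /wp-includes/*.css$
    Allow: /wp-includes/*.js$

Because the longer, more specific Allow rules take precedence over the shorter Disallow, Googlebot can fetch the CSS and JS inside /wp-includes/ while the rest of the folder stays blocked; re-run fetch and render to confirm.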
Update 2: I have more details from Google on this notification, which you can read over here.