A few months earlier, in October 2014, Google informed certain webmasters that Googlebot could not access their JavaScript and CSS files. If you have received the same message recently, do not panic, because it can be fixed quickly.

Google is sending this notification via Search Console, and it underlines that ignoring the warning can mean “suboptimal rankings.”

Here is the official warning message from Google:

“Google systems have recently detected an issue with your homepage that affects how well our algorithms render and index your content. Specifically, Googlebot cannot access your JavaScript and/or CSS files because of restrictions in your robots.txt file. These files help Google understand that your website works properly so blocking access to these assets can result in suboptimal rankings.”

Since October 2014, Google has been telling webmasters that websites should not block CSS or JavaScript files, and after monitoring the situation for a long time it has now started issuing warning messages.

If you have even a little knowledge of robots.txt, you can fix the issue in no time. Below are some directives to look for; if any of them appear in your robots.txt file, remove them, because they are what is blocking Googlebot from crawling these assets.

Disallow: /*.js$

Disallow: /*.inc$

Disallow: /*.css$

Disallow: /*.php$
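To see why directives like these block Googlebot, here is a small illustrative Python sketch of Google-style pattern matching, where `*` matches any sequence of characters and a trailing `$` anchors the end of the URL. The helper names `robots_pattern_to_regex` and `is_blocked` are hypothetical, written only for this example; this is not Google's actual implementation.

```python
import re

def robots_pattern_to_regex(pattern):
    """Convert a Google-style robots.txt path pattern to a regex.

    '*' matches any character sequence; a trailing '$' anchors the
    pattern to the end of the URL path. Illustrative sketch only.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as a wildcard.
    regex = re.escape(pattern).replace(r"\*", ".*")
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_blocked(path, disallow_patterns):
    """Return True if any Disallow pattern matches the URL path."""
    return any(robots_pattern_to_regex(p).match(path)
               for p in disallow_patterns)

patterns = ["/*.js$", "/*.css$"]
print(is_blocked("/assets/app.js", patterns))    # True: .js is blocked
print(is_blocked("/styles/site.css", patterns))  # True: .css is blocked
print(is_blocked("/index.html", patterns))       # False: not blocked
```

Running this shows that a single `Disallow: /*.js$` line keeps Googlebot away from every JavaScript file on the site, no matter which directory it lives in.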

After removing these directives, check whether the problem is fixed. To do so, use Google's Fetch and Render tool in Search Console. If the problem persists and Googlebot is still blocked, the tool provides further instructions on how to proceed. Additionally, you can use the robots.txt testing tool to identify crawling issues.
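If you cannot simply delete the Disallow lines (for example, because a CMS or plugin writes them), a commonly suggested alternative is to explicitly allow these assets for Googlebot, since more specific Allow rules take precedence. This is a sketch only; verify the result with the robots.txt testing tool before relying on it.

```
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
```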