Google Webmaster Guidelines Updated To Warn About Blocking CSS & JavaScript Files

Google announced they’ve updated their webmaster guidelines to specifically note that blocking your CSS or JavaScript files may have a negative impact on your indexing and search rankings in Google.

Pierre Far, Google’s Webmaster Trends Analyst, said the “new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use” for “optimal rendering and indexing.” If you instead block Googlebot’s access to these files, Pierre added, it “directly harms how well our [Google] algorithms render and index your content and can result in suboptimal rankings.”

How can you be certain Googlebot can render your web pages properly? Google created the fetch and render tool within Google Webmaster Tools four months ago specifically for this purpose. At the time, Google said you should make sure not to block these files, because Googlebot tries to render your full HTML.
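
In practice, the blocking that fetch and render flags usually comes from robots.txt rules that cover asset directories. A hypothetical example of the kind of rules the new guideline warns against (the directory names here are illustrative, not from Google's announcement):

User-agent: *
Disallow: /css/
Disallow: /js/
Disallow: /images/

Rules like these prevent Googlebot from fetching the stylesheets, scripts and images a page needs in order to render the way a user sees it.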

What changed in the guidelines exactly?

Before:

Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.

After:

To help Google fully understand your site’s contents, allow all of your site’s assets, such as CSS and JavaScript files, to be crawled. The Google indexing system renders webpages using the HTML of a page as well as its assets such as images, CSS, and Javascript files. To see the page assets that Googlebot cannot crawl and to debug directives in your robots.txt file, use the Fetch as Google and the robots.txt Tester tools in Webmaster Tools.
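
Going by the updated guideline, the fix is to remove those asset-blocking rules, or to explicitly allow Googlebot to fetch the asset paths. A minimal sketch, again with illustrative directory names:

User-agent: Googlebot
Allow: /css/
Allow: /js/
Allow: /images/

You can then confirm the files are reachable using the robots.txt Tester and the Fetch as Google (fetch and render) option in Webmaster Tools, as the guideline suggests.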

For more details on this update, see the Google blog.


About the author

Barry Schwartz
Staff
Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry can be followed on Twitter here.