Google Webmaster Update: Blocking JavaScript & CSS Can Affect Indexing

Google has updated its Webmaster Guidelines, which will likely affect sites that are blocking JavaScript or CSS files.

According to an announcement on Google’s Webmaster Central Blog, the tech giant has updated its indexing system to function more like a modern browser, with CSS and JavaScript enabled.

Google gave explicit advice on allowing Googlebot to access the JavaScript, CSS, and image files that a website uses:

“This provides you optimal rendering and indexing for your site. Disallowing crawling of JavaScript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.”
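
To make the robots.txt point concrete, here is a hedged before-and-after sketch; the /js/ and /css/ directory paths are hypothetical and should be replaced with wherever a site actually keeps its assets:

```
# Problematic: these directives hide the site's scripts and styles from all
# crawlers, including Googlebot. (The /js/ and /css/ paths are hypothetical.)
User-agent: *
Disallow: /js/
Disallow: /css/

# Fixed: remove the directives above, or explicitly permit the asset paths:
User-agent: Googlebot
Allow: /js/
Allow: /css/
```

After editing robots.txt, the Fetch as Google tool described below can confirm that the files are actually reachable.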

Updated Google Indexing Advice

The upgraded system will require some process changes from webmasters, as Google warns that its indexing system should no longer be thought of as a “text-only browser.” Below is the advice Google provided for this new phase:

  • Google’s rendering engine may not support all the technologies a page uses.
  • Design your website according to progressive enhancement principles so that Google’s systems can still see usable, supported content even when some technologies are unavailable (a minimal sketch follows this list).
  • Page load speed is still very important, both for users and for indexing.
  • Make sure your server can handle the additional load of serving JavaScript and CSS files to Googlebot.
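
As a loose illustration of the progressive enhancement bullet above, here is a minimal TypeScript sketch; the `site-nav` element id, the class names, and the menu behavior are all hypothetical. The base navigation is plain server-rendered HTML, and the script only layers extra behavior on top, so a crawler or browser that never runs it still gets the full content.

```typescript
// Minimal progressive-enhancement sketch. The #site-nav id and class names
// are hypothetical. The base content -- a plain list of links already present
// in the server-rendered HTML -- stays visible and indexable even if this
// script never runs.
document.addEventListener("DOMContentLoaded", () => {
  const nav = document.getElementById("site-nav");
  if (nav === null) return; // no nav on this page; the static markup stands alone

  // Enhancement layer: upgrade the static list into a toggleable menu only
  // when the script actually executes.
  nav.classList.add("js-enhanced"); // CSS can restyle the enhanced state
  const toggle = document.createElement("button");
  toggle.type = "button";
  toggle.textContent = "Menu";
  toggle.addEventListener("click", () => nav.classList.toggle("open"));
  nav.prepend(toggle);
});
```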

Fetch & Render

Additionally, Google has updated its Fetch as Google diagnostic tool, which is designed to let webmasters simulate how Google crawls a URL on their website.

How does it work? According to Google Support, the tool functions in the following ways:

“When the Fetch as Google tool is in fetch mode, Googlebot crawls any URL that corresponds to the path that you requested. If Googlebot is able to successfully crawl your requested URL, you can review the response your site sent to Googlebot. This is a relatively quick, low-level operation that you can use to check or debug suspected network connectivity or security issues with your site.

The fetch and render mode tells Googlebot to crawl and display your page as browsers would display it to your audience. First, Googlebot gets all the resources referenced by your URL, such as picture, CSS, and JavaScript files, running any code, to render or capture the visual layout of your page as an image. You can use the rendered image to detect differences between how Googlebot sees your page, and how your browser renders it.”
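
To get a rough sense of what the render mode checks, the sketch below (not Google’s actual pipeline; the URL and the naive regex extraction are illustrative assumptions) fetches a page with a Googlebot user-agent, pulls out the script, stylesheet, and image references, and confirms the server answers for each. It needs Node 18+ for the global fetch, and it does not evaluate robots.txt, so a robots.txt block would be an additional failure mode beyond what it reports.

```typescript
// Rough approximation of what "fetch and render" verifies, not Google's
// actual pipeline: pull a page, find the resources it references, and check
// that the server will hand them to a Googlebot user-agent.
const GOOGLEBOT_UA =
  "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

async function checkRenderResources(pageUrl: string): Promise<void> {
  const page = await fetch(pageUrl, { headers: { "User-Agent": GOOGLEBOT_UA } });
  const html = await page.text();
  console.log(`${pageUrl} -> HTTP ${page.status}`);

  // Naive extraction of script/style/image references; a real crawler parses
  // the DOM, but this is enough for a spot check.
  const refs = new Set<string>();
  const pattern =
    /(?:src|href)\s*=\s*["']([^"']+\.(?:js|css|png|jpe?g|gif|svg))["']/gi;
  for (const match of html.matchAll(pattern)) {
    const ref = match[1];
    if (ref) refs.add(new URL(ref, pageUrl).toString());
  }

  for (const url of refs) {
    const res = await fetch(url, { headers: { "User-Agent": GOOGLEBOT_UA } });
    // A non-200 response means Googlebot may render the page without this
    // resource (a robots.txt block would be a separate failure mode).
    console.log(`${res.status} ${url}`);
  }
}

checkRenderResources("https://example.com/").catch(console.error);
```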

Google released a post in May 2014 that informed webmasters that these changes were on the horizon. In this post, it provided examples of some potential issues that webmasters might encounter and ways to prevent them from occurring. These examples included:

  • If your website is blocking JavaScript or CSS, Google’s indexing system won’t be able to render the page the way an average user sees it.
  • There may be a negative impact on your website if your server is ill-equipped to handle the volume of crawl requests for these resources.
  • Your pages may not be rendered properly if the JavaScript is too complex.
  • In some instances, JavaScript may remove content from a page rather than add it, which will prevent proper indexing of the page.