Googlebot Learns to Read AJAX/JavaScript Comments

Google’s search robot has been improved: it can now read and understand certain dynamic comments implemented through AJAX and JavaScript. This includes Facebook comments left through services such as Facebook’s Comments social plugin.

As first noted by Digital Inspiration, the Googlebot seems to have picked up a neat trick: the ability to index the comments on specific pages, even when those comments are made through an AJAX/JavaScript format that wasn’t previously searchable.


This change was confirmed in a tweet from Matt Cutts. His comment (“Googlebot keeps getting smarter. Now has the ability to execute AJAX/JS to index some dynamic comments.”) hints at the update’s limitations. It appears the change was built for a specific sort of content: not all comments built in AJAX/JS will be indexed, and it is not a universal change that makes all dynamic content indexable.
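As a rough illustration of why this matters (the data shape and helper function here are hypothetical, not any real Facebook or Google API), AJAX-based comment widgets typically fetch comment data with a script and inject the resulting markup into the page. Until that script runs, the raw HTML contains no comment text at all, which is why a crawler has to execute JavaScript to see it:

```javascript
// Hypothetical sketch: turn a list of comment objects into the HTML
// a comment widget would inject into the page. Illustrative only.
function renderComments(comments) {
  return comments
    .map(c => `<div class="comment"><b>${c.author}</b>: ${c.text}</div>`)
    .join("\n");
}

// In a browser, a widget would fetch this data and insert the result:
//   fetch("/comments.json")
//     .then(res => res.json())
//     .then(data => {
//       document.getElementById("comments").innerHTML = renderComments(data);
//     });
// A crawler that does not execute this script sees an empty
// #comments container; one that does sees the full comment text.

const sample = [
  { author: "Alice", text: "Great post!" },
  { author: "Bob", text: "Thanks for sharing." },
];
console.log(renderComments(sample));
```

The point of the sketch is simply that the comment text lives only in JavaScript-generated DOM, never in the served HTML, so it was invisible to a crawler that did not execute scripts.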

It is a good indicator of things to come, however. As the Googlebot gets better at understanding dynamic page elements, JavaScript and Flash will become safer for designers to use. We are not at that point of complete safety just yet, but an advancement such as this one points to a light at the end of the tunnel.

In the meantime, webmasters can approach dynamic commenting options with less concern for the potential SEO consequences, and users will gain searchable access to the large amount of content posted through these AJAX/JS-based media.

It’s not yet certain what other SEO or user impact this will have; whether link juice is passed from these comments, whether the comments boost social search rankings, and whether these posts will appear in real-time search all remain to be tested.
