SEO Strategies for JavaScript-Heavy Single Page Applications or AJAX Sites

Extensive use of JavaScript to power site functionality is here to stay. Let’s look at some technical approaches to reaping the benefits of these sites without hurting your SEO.

Client-side JavaScript is great for making innovative and user-friendly web sites. Alongside AJAX, a well-established technique for loading page content on demand, Single Page Application (SPA) frameworks, which rely heavily on JavaScript, are seeing rapid adoption in the web developer community because they allow fast-to-market development of highly interactive, fast-loading web applications.

However, relying heavily on JavaScript to render your page content and provide navigation functionality brings with it well-known risks in terms of technical SEO, indexing and linkability challenges. Here are some proven strategies for creating AJAX sites and Single Page Applications that are not search disasters.

Background: Client Side JavaScript and URL Fragments

Client-side web frameworks make heavy use of a technology referred to as AJAX — Asynchronous JavaScript and XML. It turns static websites into dynamic web applications that can present rich and varied user experiences without refreshing the entire page and changing the URL. Different states of the application are typically denoted by appending a fragment identifier to the URL, recognizable by the hash (#) prefix (e.g. http://www.hugeinc.com/#aboutus).
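A minimal sketch of this pattern (the route names and states here are hypothetical): a function maps the current URL fragment to an application state, and in a browser it would be invoked on each `hashchange` event.

```javascript
// Fragment-based routing sketch: the fragment after "#" selects which
// application state (view) to render, without a full page load.
function resolveState(hash) {
  const routes = {
    '': 'home',
    '#aboutus': 'about',
    '#products': 'products',
  };
  return routes[hash] ?? 'not-found';
}

console.log(resolveState('#aboutus')); // → about
```

In a real SPA the router would also render the matching view, e.g. `window.addEventListener('hashchange', () => render(resolveState(location.hash)))`.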

The benefits in terms of user experience are clear — fast and responsive pages that only load the content that is necessary for the user experience. In fact, many next-generation interfaces are not easily done without the use of JavaScript as it removes the need for the user to actively click a link and for the browser to refresh the entire page.

Using these technologies, users can browse products and content in a fast, intuitive way via a clean interface. The URL, however, remains more or less the same. Hyperlinks, which include URL fragments, are used to load the requested product data at the time of user interaction, avoiding a page refresh.

The Problem

The downside is that the crucial content or key product data may not be visible to search engines, which often do not execute JavaScript. Because the fragment part of a URL is processed by the local browser but never sent to the server, standard bot indexing practices that follow links fail unless the indexing robot performs a sophisticated emulation of the JavaScript logic.
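This asymmetry is easy to demonstrate with the standard URL API: the fragment exists only on the client side and is not part of the HTTP request the server receives.

```javascript
// The fragment (everything after "#") is interpreted by the browser but
// never transmitted to the server as part of the HTTP request.
const url = new URL('http://www.hugeinc.com/#aboutus');

console.log(url.pathname); // what the server sees: "/"
console.log(url.hash);     // visible only client-side: "#aboutus"
```

To the server (and to a crawler that does not execute JavaScript), every application state behind that fragment looks like one and the same page.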

And while Google has been continuously evolving the JavaScript capabilities of its robots, Bing/Yahoo has lagged. And even if the engines do attempt to index the content, they often seem to have little confidence they can accurately represent the state most relevant to the search query based on the URLs in their index. The result: Poor search visibility relative to competitors.

One Solution: Fallback Pages

Fallback pages are HTML pages that display if the requesting client does not execute JavaScript. They are typically static pages that attempt to replicate the functionality and content of the JavaScript web application via server-side rendered pages. These pages hold the same content the JavaScript application would show and use standard, indexable links for navigation.

The Benefits

Fallback pages give search engines the content they need for important organic search landing pages. These pages are not intended for users unless they are using a limited or text browser. Going a step further, the Cadillac approach to the problem is often referred to as “Progressive enhancement” — a full site where users get as much functionality as their system can handle. This is also the most work, of course, as code needs to be written for each level of client functionality across the entire site.

The Downside

Building progressive enhancement sites or fallback pages increases initial build time and expense and adds ongoing maintenance workload. Furthermore, users may never see the fallback pages or their URLs — they will see the URL with the hash symbol — and these URLs will not accrue inbound links or social signals at the URL level. This may or may not be a problem, depending on whether the URLs are important organic landing pages.

Finally, as it may not be possible to fully replicate JavaScript functionality via static pages, you are in effect creating a separate, slightly different site for key organic landing pages, which again adds workload.

Implementing Fallback Pages

Several approaches exist for implementing fallback pages. The simplest is to put the important content in the <noscript> tag. This adds code bloat, can be awkward and is limited to key entry pages. If the tag is not present on pages deeper in the web app, that content will be lost to the engines.
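As an illustration, a key landing page might pair the script-driven app shell with crawlable fallback markup (a minimal sketch; the file names and copy here are hypothetical):

```html
<!-- App shell, populated by JavaScript for capable clients -->
<div id="app"></div>
<script src="/app.js"></script>

<!-- Fallback content for crawlers and clients without JavaScript -->
<noscript>
  <h1>About Us</h1>
  <p>Company history and team information.</p>
  <a href="/products.html">Browse our products</a>
</noscript>
```

Note that the `<noscript>` content must be maintained in step with the JavaScript views, or the two versions of the page will drift apart.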

Another approach is to handle it on the server side. In 2009, Google proposed a scheme — often referred to as “hashbang” (#!) — which translates fragment URLs into a query string for indexing by robots. Unlike the fragment part of a URL, query strings are visible to server-side systems and therefore allow web developers to render the appropriate content on the server side exclusively for search engines. This has the benefit of creating indexable URLs.

Details of the Google hashbang specification and its implementation can be found in Google’s AJAX crawling documentation. While this approach allows for indexing by Google, it has proved unpopular with SEOs and developers, mainly due to increased development workload and incomplete support by other search engines and third-party web services.

Other Solutions to the URL Problem

As mentioned above, depending on the extent to which SEO has been considered, a given UX state for AJAX or SPA sites may have visible URLs that are different from or invisible to search engines. This means search engines may see one set of URLs and users another.

Creating custom “share” UX elements and functionality, including sharable URLs, helps mitigate the inability of users to copy and paste URLs that accurately reflect the state the user is seeing:

[Image: UX elements can help mitigate the lack of sharable URLs on JavaScript-heavy sites]

Of course, not everyone will use these UX elements, and they require developing server-side logic that can force the application state via URL parameters and present it to the user.

Using pushState to Fix the URL Problem

HTML5 includes the ability to manipulate the path of the URL as seen in the browser, as well as the browser history, through JavaScript. This is useful for SPA and AJAX sites in that a search-engine-friendly, sharable and bookmarkable URL can be shown to the client while the JavaScript and URL fragments chug away in the background.

Implementing pushState

Adding pushState is fairly straightforward, and indeed many of the popular Single Page Application frameworks, like the open-source Ember framework or Google’s Angular framework, provide APIs to easily access the functionality. Even for web developers who prefer custom JavaScript development, the History API, part of the HTML5 specification, provides a simple interface to push full URL updates to the browser bar on the client side without resorting to URL fragments or forcing a page refresh.
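A minimal sketch of the technique (the path-mapping convention here is an assumption, not part of any framework): map the fragment-based state to a clean path, then push it into the browser history so the address bar shows a shareable URL.

```javascript
// Map a fragment-based state to a clean, shareable path.
function fragmentToPath(hash) {
  return '/' + hash.replace(/^#\/?/, ''); // "#aboutus" -> "/aboutus"
}

// Browser-only wiring, guarded so the sketch degrades gracefully in
// environments (or older browsers) without the History API:
if (typeof history !== 'undefined' && typeof history.pushState === 'function') {
  window.addEventListener('hashchange', () => {
    const path = fragmentToPath(location.hash);
    // Update the address bar without a page refresh.
    history.pushState({ hash: location.hash }, '', path);
  });
}
```

For this to work end-to-end, the server must also be able to answer requests for those clean paths directly (e.g. when a user bookmarks `/aboutus`), which is where the fallback pages described earlier come back in.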

Downsides

The best SEO implementations of pushState are on sites that are already accessible without JavaScript, with the AJAX version built “on top” as described above. PushState is then enabled to allow for the copying and pasting of links and all the other benefits of having URLs that reflect the user experience, like landing page destination URLs, for example. So, in effect, pushState is not a solution to the problem of AJAX sites and SEO all by itself, but it helps.

Implementing pushState adds development and maintenance workload. The variables and URLs referenced will need to be updated as the site evolves.

PushState also requires a modern browser, but then again, so do many modern AJAX and SPA sites, so you have most likely already decided to leave obsolete browsers and their users by the wayside.

Putting it Together

If budget and resources allow, a comprehensive approach to creating an SEO-friendly AJAX or SPA site would involve:

  • Fallback pages for key organic landing pages at a minimum or a fully progressive enhancement site that works with and without Javascript, which enables indexing.
  • Implementing pushState to fix the URL problem for JavaScript-enabled users or clients, which enables sharing and bookmarking and presents a single URL to all users.

AJAX and Single Page Applications are here to stay and bring with them many benefits that SEOs should not ignore. Excellent user experiences get linked to, talked about and generate social signals — all key for SEO. Brands and agencies should understand that building in the technical elements to ensure that JavaScript-heavy sites are indexable and linkable is non-negotiable moving forward, and should be part of the resources scoped into web design and build projects from the start.

Author’s note: Special thanks to Thomas Prommer, vice president of technology at Huge, for helping author this post.

Image by Ognian Mladenov, Flickr
