How to Conduct a Technical SEO Site Audit

After you’ve completed your on-page SEO audit, it’s time to go deeper by conducting a technical site audit. Here’s how you can identify whether some common technical issues are preventing your site from playing nicely with search engines.

Once you’ve finished your on-page SEO audit, the next step is to conduct a technical site audit. You don’t want technical issues to make it difficult for search engines to crawl and index your site, as that will make it tougher to rank highly and cost you traffic or potential business.

What follows is a checklist of items you should watch for when conducting a technical SEO audit.

Information Architecture

Information architecture can be slightly tricky to audit because it’s a broad topic that encompasses many different areas. At its core, information architecture describes how information is organized and flows around your site.

There are three main areas of information architecture to watch out for.

  • Site navigation: This should be intuitive to the user. Grouping similar content or product categories together helps keep it that way. If the site is intuitive and easy for users to navigate, then in all probability it will also be easy for search engine spiders to crawl.
  • Labeling and naming conventions: Labeling refers to how you name your categories and navigation links. Mislabeling them or failing to use targeted keywords can hurt your rankings, so it’s important to get this right.
  • Directory structures: This refers to the folder hierarchy on your site. A recommended approach is to keep the hierarchy no more than three folders deep before reaching the final product page. It may look like this: www.yoursite.com/maincategory/products/productpage.html (a quick way to check folder depth is sketched below this list).
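
If you have a crawl export handy, a few lines of Python can flag URLs that exceed the three-folder guideline. This is a minimal sketch using only the standard library; the example URL is the one from above.

```python
from urllib.parse import urlparse

def folder_depth(url: str) -> int:
    """Count how many folders deep a URL's path goes."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    # Treat a trailing "page.html"-style segment as a file, not a folder.
    if segments and "." in segments[-1]:
        segments = segments[:-1]
    return len(segments)

url = "http://www.yoursite.com/maincategory/products/productpage.html"
print(folder_depth(url))  # 2, within the three-folder guideline
```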

Server Response Codes

There are a few server response codes that can have an impact on your technical SEO audit, most notably 301, 302, and the dreaded 404. These numbers are status codes the server returns whenever a user requests a page; a 404, for example, means the requested page was not found.

A 301 response code means that the requested page has been permanently moved to a new location. A 302, by contrast, is a temporary, page-level redirect: the resource has been moved, but the redirect has not been configured as permanent at the server level.

While conducting your audit, it’s critical to look for 301s and 302s. Search engines will index the final destination URL of a 301 and pass PageRank from the old page to the new one; because a 302 is a temporary, page-level directive, crawlers won’t consolidate signals at the new URL. So if there are any redirects on your site, make sure they’re search-friendly 301s. A few tools you can use to check for redirects are Pingdom Tools and Webconfs.
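
To spot-check an individual URL yourself, you can request it without following redirects and inspect the raw status code. Here is a minimal sketch using Python’s standard library; the URL shown is a placeholder.

```python
import http.client
from urllib.parse import urlparse

def check_status(url: str) -> None:
    """Request a URL without following redirects and print its status code."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    print(url, "->", resp.status, resp.reason)
    if resp.status in (301, 302):
        # The Location header tells you where the redirect points.
        print("  redirects to:", resp.getheader("Location"))
    conn.close()

check_status("http://www.yoursite.com/old-page.html")  # hypothetical URL
```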

Identifying and fixing broken links to repair 404s definitely needs to be part of your audit checklist. Broken links can result in spider traps and wasted crawl budget. They can also signal to search engine spiders that the site hasn’t been updated recently, causing indexing issues.

It’s always a good idea to use a tool like Xenu Link Sleuth or Screaming Frog to check for broken links. Google Webmaster Tools also provides a detailed list of broken links on your site and pinpoints the exact pages from which they originate. If your site is verified in Google Webmaster Tools, you may see a message like the one below:

[Screenshot: URL errors report in Google Webmaster Tools]
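
For a quick one-page spot check between full tool runs, a short script can extract every link on a page and flag the ones returning a 404. This is a rough standard-library sketch; a dedicated crawler like the tools above handles far more edge cases.

```python
import urllib.error
import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect the href attribute of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def find_broken_links(page_url: str) -> None:
    """Fetch a page, then request each link on it and report any 404s."""
    html = urllib.request.urlopen(page_url).read().decode("utf-8", errors="replace")
    extractor = LinkExtractor()
    extractor.feed(html)
    for href in extractor.links:
        link = urljoin(page_url, href)  # resolve relative links
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, etc.
        try:
            urllib.request.urlopen(link)
        except urllib.error.HTTPError as e:
            if e.code == 404:
                print(f"Broken link on {page_url}: {link}")
```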

Crawling/Indexing Issues

If the pages on your site can’t be crawled, then they definitely can’t be indexed by search engines. Many common issues impede a search bot’s ability to crawl a site. Most notable among them are broken links (404s), content locked in Flash, AJAX, or iFrames, and links embedded inside JavaScript.

These are severe issues, so when conducting your technical SEO audit, make sure you create a checklist of all these items and verify that none of these technologies or scripts are blocking crawlers from indexing pages on your site.
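
As a rough aid for that checklist, you can scan a page’s markup for the usual suspects: iFrames, Flash embeds, and anchor tags whose destination lives only in an onclick handler. A minimal sketch follows; the file name is hypothetical.

```python
from html.parser import HTMLParser

class CrawlBlockerScanner(HTMLParser):
    """Flag markup patterns that commonly hide content or links from crawlers."""
    def __init__(self):
        super().__init__()
        self.findings = []
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "iframe":
            self.findings.append(f"iframe content: {attrs.get('src')}")
        elif tag in ("embed", "object") and "flash" in (attrs.get("type") or ""):
            self.findings.append("Flash object on page")
        elif tag == "a" and "href" not in attrs and "onclick" in attrs:
            self.findings.append(f"JavaScript-only link: {attrs.get('onclick')}")

scanner = CrawlBlockerScanner()
scanner.feed(open("page.html").read())  # hypothetical local copy of a page
for finding in scanner.findings:
    print(finding)
```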

Dynamic or Unfriendly URLs

If you have an e-commerce site, chances are your products are being pulled from a database and/or the URLs contain some kind of session ID. Any URL that looks like the one below can be termed an unfriendly URL: the special characters and parameters can trip up the bots and trap them in their crawling process.

As a result, engines prefer simple static URLs as opposed to dynamic ones like this: /edge/app/viewArticle?catId=FIN&articleId=1108lskolnik03

Depending on whether your server runs Apache or IIS, you can implement a URL rewrite solution. This belongs in your technical SEO recommendations document, not your on-page SEO audit document, since it falls squarely in the technical domain.
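
Before recommending rewrites, it helps to inventory which URLs are dynamic in the first place. The sketch below applies a simple heuristic (several query parameters, or a session-ID-style parameter); the marker list is an assumption, not exhaustive.

```python
from urllib.parse import parse_qs, urlparse

SESSION_MARKERS = {"sessionid", "sid", "phpsessid", "jsessionid"}

def is_unfriendly(url: str) -> bool:
    """Heuristically flag dynamic URLs: several query params or a session ID."""
    params = parse_qs(urlparse(url).query)
    has_session = any(k.lower() in SESSION_MARKERS for k in params)
    return len(params) > 1 or has_session

print(is_unfriendly("/edge/app/viewArticle?catId=FIN&articleId=1108lskolnik03"))  # True
```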

Duplicate Content Issues

The most common problem with duplicate content is internal competition with other pages on your site. When a search engine can access and index multiple versions of the same page, there is a strong likelihood that the engine will filter out all but one version. The problem here is the search engine is in control – not you.

The most common example of duplicate content is on e-commerce sites where the manufacturer’s product description is used by almost all retailers. The easiest way to determine if you have duplicate content is to check it on CopyScape.

To avoid this, it isn’t that tough to add a few of your own comments to the product description to make it unique and stand out among your competitors, who are probably using the same manufacturer-supplied descriptions.

The other, often more important, problems with duplicate content are wasted crawl budget and wasted PageRank. A few ways to eliminate these are using a rel=canonical tag as a temporary solution or, better, a 301 redirect as a more permanent long-term solution.
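
When auditing, it’s also worth verifying that your pages actually carry the canonical tag you think they do. Here is a minimal standard-library sketch that pulls rel=canonical out of a page’s markup; the sample markup is illustrative.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Extract the rel=canonical URL from a page's <head>, if one exists."""
    def __init__(self):
        super().__init__()
        self.canonical = None
    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

finder = CanonicalFinder()
finder.feed('<link rel="canonical" href="http://www.yoursite.com/product.html">')
print(finder.canonical)  # http://www.yoursite.com/product.html
```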

XML Sitemap issues

Submitting your XML sitemaps directly to the engines can help them discover new content on your site. All search engines provide a way for site owners to submit an XML sitemap, so it makes sense to utilize this feature. This should be a key part of your technical audit.

A couple benefits of doing this:

  • It helps engines discover all your content and reduces duplicate content issues, as your sitemap should contain the canonical URL for each page on your site.
  • It also allows you to access your sitemap and content data on the search engines’ Webmaster Tools.

You can use a free sitemap generator, but make sure you also manually scrutinize the generated XML to remove any duplicates the generator may have included (a small script can help; see below). Also, make sure you include your XML sitemap’s location in your robots.txt file.
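
The duplicate check is easy to script. Below is a minimal sketch that parses a sitemap with Python’s standard library and lists any URL appearing more than once; the file name is a placeholder. The robots.txt reference itself is a single line, e.g. Sitemap: http://www.yoursite.com/sitemap.xml.

```python
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_duplicates(path: str) -> list:
    """Return every URL that appears more than once in an XML sitemap."""
    seen, dupes = set(), []
    for loc in ET.parse(path).findall(".//sm:url/sm:loc", NS):
        url = (loc.text or "").strip()
        if url in seen:
            dupes.append(url)
        seen.add(url)
    return dupes

print(sitemap_duplicates("sitemap.xml"))  # hypothetical file name
```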

Site Loading Time

A slow-loading site results in a higher visitor drop-out rate and, eventually, fewer conversions. Reviewing page weight is important: excessive scripts, heavy images, and Flash files all increase it. Ideally, try to keep maximum page weight to around 130K for your home page and 180K for all other pages.

Google’s recent focus on faster-loading sites makes it critical for SEOs to include site loading time as part of a technical site audit. Site performance tools like Gomez and Pingdom can help you determine your loading times, and Google Webmaster Central also reports on how your site performs.
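
You can also get a rough first read without any external tool. The sketch below downloads a page’s HTML and reports its size and fetch time; note that it ignores images, scripts, and CSS, so treat the numbers as a lower bound on real page weight.

```python
import time
import urllib.request

def page_weight(url: str) -> None:
    """Download a page's HTML and report its size (KB) and fetch time."""
    start = time.time()
    body = urllib.request.urlopen(url).read()
    elapsed = time.time() - start
    print(f"{url}: {len(body) / 1024:.1f} KB of HTML in {elapsed:.2f}s")

page_weight("http://www.yoursite.com/")  # hypothetical URL
```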

Summary

A technical SEO audit can be exhausting, depending on the complexity of your site. It’s always a good idea to do this kind of audit in two phases: in the first, create a scorecard listing all the metrics and targets suggested above; in the second, start making recommendations for your IT team or clients to implement.

If you’re an in-house SEO, then you will quickly realize that creating a recommendations document with the items listed above is the easy part. The hard part? Getting your IT team to implement it!
