
End of Year Housekeeping for Robust SEO Campaigns

For most of us, December is the quietest time of the year in the SEO industry. Now is the perfect time to do some housekeeping. Here’s how to get all your on-page and techno-ducks in a row and help boost the effectiveness of your New Year campaigns.

[Image: giant rubber duck]

Ah, mid-December! Unless you’re Amazon, it’s the quietest time of the year for most of us in the SEO industry. Nobody wants a mortgage or a car or life insurance.

Of course, SEO life returns to its normal, breakneck pace come January, but for now let’s look at some pre-emptive tasks we can make time for that will help boost the efficacy of the New Year campaigns. You know, the sort of stuff you pore over diligently and rigidly when taking on a new client, but that tends to lapse unless you have very robust monthly processes in place.

Now is the perfect time to do some housekeeping; get all your on-page and techno-ducks in a row and reap the benefits come New Year.

Spot-Check Errors by Site

One of the easiest ways to do this is via Bing or Google Webmaster Tools, particularly if you’re an agency or freelancer managing multiple sites. In Google, select the first site and go to Health in the left-hand menu, then Crawl Errors. These are the errors that Googlebot encounters when trying to crawl pages on your website. Some are permanent and considered hard errors; others are temporary, such as errors caused by an overloaded server.

Having errors on your site won’t directly hurt your SEO, but there will be an indirect impact on what you’re trying to achieve. A lot of broken links is something of an annoyance for users, and it may also impede the path of crawlers around your site, making it a little harder to reach deeper or newer content in some cases.

Additionally, errors in the 500s can be similarly impeding for users and robots in equal measure. These are server-related: 500 indicates an internal server error; 502 (Bad Gateway) means the server received an invalid response from an upstream server; and 503 (Service Unavailable) means the server is temporarily unable to handle requests, typically because it is overloaded or down for maintenance (commonly seen if the site is under DDoS attack).

  • Become familiar with common HTTP response codes. If there’s an unusual one appearing in your crawl errors then it’s easy enough to look it up as you come to it.
  • If an error is still occurring, investigate why, checking the difference between the “Last crawled” and “First detected” dates.

[Screenshot: “URL not found” crawl error (Googlebot couldn’t crawl the URL)]

  • If you have quite a volume of errors, it might be easier to download them to a CSV and use Excel to analyze the errors by type, date, and other common features.
  • It is important to understand why errors are still occurring, particularly 404s. To do this, check where the error URL is linked from. It may be that content has moved or been deleted but is still referenced somewhere on your site. Re-point these links where appropriate, or delete the reference if it is intentional that the page is no longer available. (A quick way to re-check a batch of error URLs is sketched below.)
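If you’d rather not click through a long list of errors by hand, a short script can re-request each exported URL and group the results by current status code. This is a minimal sketch rather than a polished tool: it assumes a hypothetical crawl-errors.csv export with the URL in the first column and the third-party requests library installed, so adjust the file name and column layout to match your actual export.

    # Minimal sketch: re-check exported crawl-error URLs and group them
    # by their current HTTP status code. Assumes a "crawl-errors.csv"
    # export with the URL in the first column (adjust to your export).
    import csv
    from collections import Counter

    import requests  # third-party: pip install requests

    def recheck_errors(csv_path):
        counts = Counter()
        still_broken = []
        with open(csv_path, newline="") as f:
            reader = csv.reader(f)
            next(reader, None)  # skip the header row, if there is one
            for row in reader:
                url = row[0].strip()
                try:
                    # HEAD keeps requests light; some servers reject HEAD,
                    # in which case switch to requests.get
                    resp = requests.head(url, allow_redirects=True, timeout=10)
                    status = resp.status_code
                except requests.RequestException:
                    status = "connection error"
                counts[status] += 1
                if status != 200:
                    still_broken.append((url, status))
        return counts, still_broken

    if __name__ == "__main__":
        counts, still_broken = recheck_errors("crawl-errors.csv")
        print(counts)
        for url, status in still_broken:
            print(status, url)

Anything that now returns a 200 can usually be marked as fixed in Webmaster Tools; anything still returning a 404 is worth tracing back to the page that links to it.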

Check Internal Redirects

A redirect is a little like a change of address that you might organize with the post office when you move house. You’re no longer at the old address, but can redirect your post (users) to the new address if it is delivered to the old address.

A redirect means that users still get to content pages rather than being served a 404 page. You might have a redirect on your site for reasons including the following:

  • Non-preferred (essentially duplicate) URLs (e.g., http://example.com may redirect to a preferred homepage, such as the www version).
  • Discontinued stock items may redirect to a similar stock item, such as a replacement model.

There may be other reasons why redirects have been put in place historically, such as site structure changes, but internally it is important to make sure that links point cleanly to the new absolute URL rather than through a redirect.

  • Use a tool like Screaming Frog SEO Spider to spider the site and identify redirects. (The tool will check the header status of all URLs found on the site if you enter the root domain. You’re looking for the 301 and 302 status codes.)
  • Identify all redirects, making sure that none of them are 302 redirects, which don’t pass any link equity through the redirection and are therefore only helpful from a user perspective (a quick script for spot-checking this follows the list).
  • Instead, make sure all redirects are 301s, benefiting users and your search efficacy at the same time.
  • Check that all 301 redirects are in place for valid reasons, such as those in the bullet points above.
  • Make sure that structural-change 301s are not referenced internally (keep the 301 in place so that users are redirected, but correct the internal reference to point directly to the new clean URL).
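If you just want a quick spot-check of a handful of internal links without a full crawl, a few lines of Python will report the status code and redirect target for each URL. This is a minimal sketch assuming the third-party requests library; the URL list is a placeholder for whatever internal links you want to test.

    # Minimal sketch: flag URLs that answer with a 301 or 302 and show
    # where they redirect to. The URL list is a placeholder.
    import requests  # third-party: pip install requests

    urls = [
        "http://example.com/old-page",
        "http://example.com/discontinued-product",
    ]

    for url in urls:
        try:
            resp = requests.get(url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            print(f"{url} -> request failed ({exc})")
            continue
        if resp.status_code in (301, 302):
            target = resp.headers.get("Location", "(no Location header)")
            print(f"{resp.status_code}: {url} -> {target}")
        else:
            print(f"{resp.status_code}: {url}")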

Read Your Sites’ robots.txt File

Be honest: when was the last time you checked your robots.txt file? Always the source of schoolboy, forehead-slapping SEO mistakes, the robots.txt file is a site resource that no single professional can claim territorial possession of.

Site developers may need to use this file to disallow robots from certain URLs or site sections; a legal team may pass instruction to block pages without being aware of other stakeholders; a content manager may use the file to block a section of content that isn’t quite ready or perfected; or perhaps you’ve inherited the site from a so-called SEO guru who knows about as much about SEO as my Mum. One or two of these scenarios may sound far-fetched, but strange things do happen, particularly in this file.

It’s always a good idea to spot-check every now and again and take a time-stamped screen-dump every couple of months. Look out for the following:

  • Pages disallowed that you don’t recall requesting or that don’t appear to be of any sensitive nature.
  • Legacy instructions that do not serve any logical function, such as timed only-crawl requests.
  • Conflicting syntax that may not instruct in the sense in which it was intended.

In general, you just want to keep an eye on any changes or new additions to this file so that you’re informed and can think through any future attempts to limit a robot’s access to site pages according to the best methods for your objectives. A quick, scriptable sanity check is sketched below.
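Alongside the time-stamped screenshots, Python’s standard library can confirm that your most important URLs aren’t accidentally blocked by the live robots.txt. This is a minimal sketch; the domain, URL list, and user-agent are placeholders to swap for your own.

    # Minimal sketch: test whether key URLs are blocked by the live
    # robots.txt. Uses only the Python standard library.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser("http://example.com/robots.txt")
    parser.read()  # fetches and parses the live file

    important_urls = [
        "http://example.com/",
        "http://example.com/products/",
        "http://example.com/blog/",
    ]

    for url in important_urls:
        allowed = parser.can_fetch("Googlebot", url)
        print(("allowed " if allowed else "BLOCKED ") + url)

Run it against the handful of sections that actually earn you traffic; a single unexpected "BLOCKED" line is usually worth a conversation with whoever last touched the file.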

Check On-Page Optimization

Again, this is another optimization aspect that you will likely be all over like a bad simile when first taking a client on. Over time of course that can change, particularly when managing sites with multiple content producers and content owners who may be adding new content all the time.

Even if you’ve conducted training for your clients, new hires may not be in the loop, and those whose primary objectives aren’t SEO-related may not have on-page practice front of mind. It’s a good idea to spot-check that all your on-page elements are well optimized. Look out for the following (a quick spot-check script follows the list):

  • Poorly written title tags.
  • Missing titles and particularly descriptions.
  • Under-optimized image names or use of standard filenames.
  • General syntax and choice of terms (“informed” terms versus general usage terms).
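For a quick automated pass over a handful of key pages, a short script can flag missing or overlong titles and descriptions before you dig into the wording itself. This is a minimal sketch assuming the third-party requests and beautifulsoup4 packages; the URL list and the 70-character title threshold are illustrative assumptions, not hard rules.

    # Minimal sketch: flag pages with missing titles or meta descriptions,
    # or titles that look too long. URLs and thresholds are placeholders.
    import requests                # pip install requests
    from bs4 import BeautifulSoup  # pip install beautifulsoup4

    urls = [
        "http://example.com/",
        "http://example.com/services/",
    ]

    for url in urls:
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")

        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        desc_tag = soup.find("meta", attrs={"name": "description"})
        description = (desc_tag.get("content") or "").strip() if desc_tag else ""

        if not title:
            print(f"{url}: missing <title>")
        elif len(title) > 70:
            print(f"{url}: title may be too long ({len(title)} characters)")

        if not description:
            print(f"{url}: missing meta description")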

Perform Keyword Trend Analysis

The terms that might have been so wonderfully apt and voluminous when you first acquired a client may not be so successful further down the line. This is particularly true if you’re working with a client in an emerging industry.

Language is always evolving and can change for many reasons including macro-economic factors and cultural influences. It’s always a good idea to sanity check the various data clues for emerging terms such as:

  • Internal site search.
  • Webmaster Tools Search Queries (though this relies on your site appearing for a term and isn’t a true reflection of potential user appetites).
  • Google Trends.

Can You Say “Global Economic Recession?”

[Google Trends chart: interest over time for “payday loans”]
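If you’d rather pull this sort of trend data programmatically than eyeball the Google Trends interface, the unofficial pytrends package is one option. It is a community wrapper rather than an official Google API, so treat the calls below as an assumption to verify against the current package documentation before relying on them.

    # Minimal sketch: interest-over-time data for a query via the
    # unofficial pytrends wrapper (pip install pytrends). Not an official
    # Google API; the interface may change.
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-GB", tz=0)
    pytrends.build_payload(["payday loans"], timeframe="today 5-y")

    interest = pytrends.interest_over_time()  # returns a pandas DataFrame
    print(interest.tail())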

I’m sure I’ve missed plenty of sensible housekeeping tasks from this list, so please add them in the comments if you can think of any other essential spot-checks that we often mean to get to but that, despite the best intentions, sometimes fall by the wayside.

Author’s note: This post will be my last for Search Engine Watch. I’ve decided that I need to cut down on some editorial commitments. I’m looking forward to reading content from fresh voices in the industry. I’ve had a great couple of years writing content for Search Engine Watch and have learned a lot in the process. Special thanks to Danny and Jonathan. A happy Christmas to all!

Image Credit: Yoshimasa Niwa/Flickr
