Google Penalty or Algorithm Change: Dealing With Lost Traffic

There is nothing worse for a webmaster/site owner than to wake up one day and find Google Armageddon has taken out all of their site’s rankings and traffic. In most cases, mayhem ensues as they scramble around to try and figure out what happened or if they have run afoul of the mighty Google in one way or another.

This framework should help you keep your head should the unthinkable happen.

Google Penalty

Sadly, this topic isn’t written about or discussed much (go figure; people don’t want to talk publicly about a smackdown), so let’s get into some of the issues surrounding this particular, often poorly understood, phenomenon. We’ll also look at some preventative measures we can all take, just in case.

Have You Been Penalized?

Some of the basics first… we need to establish whether it is truly a penalty. As intuitive as it may seem to some, one of the more common things we see is people who think they’ve been penalized by Google when, in truth, that’s often not the case.


What often masquerades as a Google penalty?

  • Filters and dampening factors: Google may have changed how it treats your particular space. Be sure to always track not only your own rankings, but those of your competitors as well. Is there a shake-up in the space, or is it just you?
  • Data center anomalies: The other obvious one is that rankings can vary depending on location. Be sure to have others check the rankings (from target market region) and cross reference Google referrer traffic in your analytics.
  • Algorithm updates: Such as Panda, GooPLA, and so on (see below).

Remember, recent updates such as the Google Page Layout Algorithm (GooPLA) are not, in fact, penalties per se. They are more filters/dampening factors.

I hear ya saying, “anything that lowers my site in the rankings is a penalty, dammit!” This, though, is untrue.

You can’t go to Google Webmaster Tools reconsideration request form and say, “We removed the ads from our site, please put our rankings back”. Why? Because you weren’t penalized. It is an effect of the algorithm evolution. That’s an important distinction.

What Can You Get Penalized For?

Now that we know the difference between algorithmic changes and penalties, what exactly is a penalty?

A few of the better-known ones include:

  • Link manipulation: Paid links, hidden, excessive reciprocal, shady links etc.
  • Cloaking: Serving different content to users and Google.
  • Malware: Serving nastiness from your site.
  • Content: Spam/keyword stuffing, hidden text, duplication/scraping.
  • Sneaky JavaScript redirects.
  • Bad neighbourhoods: Links, server, TLD.
  • Doorway pages.
  • Automated queries to Google: Tools on your site, probably a bad idea.

As you can start to see, this is not a Panda event; that’s an algorithm change. A penalty is about breaking the guidelines more than anything. Another place to look is the Google Webmaster Guidelines.

Diagnosing a Google Penalty


This is usually the easy part. Most people notice that traffic and revenues have taken a nose dive, but that doesn’t really tell us if it is algorithmic or an actual penalty. The fastest way is to check if your site is in the index (use the [site:yourdomain.com] query). If it’s gone, then my friend, you are most certainly in hot water.

Sometimes, though, it can be less obvious, such as a penalty at the page level, but that’s not as common. You can also check brand searches (still getting those?) and exact-match searches for page titles that should be ranking. We have seen instances where a site was penalized but still ranked for its brand.

The next question: are there any of the above punishable elements being used on your site?

For most people this is usually the first one: link manipulation. You should also check Google Webmaster Tools to ensure you haven’t received an ‘unnatural linking’ notification. If you haven’t been active in any of the ways on the list that might get you in trouble, it’s unlikely you’ve been penalized.

But it’s not impossible. Another common scenario we see is sites that have been hacked. You should inspect your server and its logs, and even look at the query reports in Google Webmaster Tools (for more, read “Hacking for SEO” – important stuff).
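When combing logs for hack signs, a quick scripted pass helps. This is a minimal sketch, not a security scanner: the patterns and the sample log lines are hypothetical, so adapt them to your own stack.

```python
import re

# Hypothetical patterns that often surface in access logs after a hack:
# injected spam URLs, shell parameters, encoded PHP payloads. These are
# illustrative examples, not an exhaustive list.
SUSPECT_PATTERNS = [
    r"eval\(base64_decode",             # classic injected-PHP payload
    r"\.php\?(?:cmd|shell)=",           # remote-command parameters
    r"/(?:cheap|viagra|casino)[\w-]*",  # doorway-style spam URLs you never created
]

def suspicious_lines(log_lines):
    """Return log lines matching any suspect pattern (case-insensitive)."""
    regexes = [re.compile(p, re.IGNORECASE) for p in SUSPECT_PATTERNS]
    return [line for line in log_lines if any(r.search(line) for r in regexes)]

# Made-up sample log entries for demonstration.
sample_log = [
    '1.2.3.4 - - "GET /blog/post-1 HTTP/1.1" 200',
    '5.6.7.8 - - "GET /wp.php?cmd=ls HTTP/1.1" 200',
    '9.9.9.9 - - "GET /cheap-meds-online HTTP/1.1" 200',
]
hits = suspicious_lines(sample_log)
print(len(hits))  # 2: the two injected-looking requests
```

Anything this flags deserves a manual look; a clean pass proves nothing, but a hit is a strong lead.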

The other obvious measurement tool is your analytics. When the traffic died, was it site-wide (losses across all Google traffic)? Was it only at a page/keyword level? Once more though, first be sure whether it’s a penalty or an algorithmic change; there are two distinct approaches to rectifying the situation.
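Separating a site-wide collapse from a page-level one can be sketched with a few lines of code. The landing pages and session figures below are invented for illustration; the assumption is simply that you can export average daily Google organic sessions per landing page from your analytics package.

```python
# Average daily Google organic sessions per landing page, before and
# after the drop. Paths and numbers are hypothetical.
before = {"/": 500, "/widgets": 300, "/blog/guide": 200}
after  = {"/": 480, "/widgets": 40,  "/blog/guide": 190}

def pct_change(b, a):
    """Percentage change from b to a (0.0 when there was no traffic before)."""
    return (a - b) / b * 100 if b else 0.0

changes = {page: pct_change(before[page], after.get(page, 0)) for page in before}

# Losses concentrated on a few pages point at page-level filtering;
# a uniform collapse across all pages suggests a site-wide problem.
big_losers = [page for page, c in changes.items() if c <= -50]
site_wide = len(big_losers) == len(before)

print(big_losers)  # ['/widgets']
print(site_wide)   # False: this drop is page-level, not site-wide
```

The -50% threshold is an arbitrary starting point; the pattern of which pages fall below it matters more than the exact cutoff.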

At a Complete Loss?

In some more desperate situations we might also do a little deeper digging. The answers are usually much more obvious, but just in case, here are a few other ways to research the issues.

  • Spy On Web: Check for suspect sites on your server. In extreme cases, Google may consider nuking or dampening at this level if a given IP is infested.
  • Email blacklists: You can use tools like MX Toolbox to see if there are any listings related to the domain.
  • Malware: You can also use the Safe browsing diagnostic tool to ensure there hasn’t been any malware detected recently on the domain.

At this point you should likely have a sense of where your problems lie. You should have identified if you actually have a penalty or not. If you believe you have been penalized… let’s look at what we can do to get it fixed.

How to Deal With a Google Penalty


In many instances, Google will simply re-crawl the site and restore it if the offending content has been rectified (this is for algorithmic penalties).

For manual smackdowns, there is a time-out that is set (generally 30-60 days, I believe). Once that expires, you’re eligible to get back into the index. A reconsideration request, however, might help speed up that process. We have used this with great success in the past.

The thing with that is, if you have an algorithmic penalty, Google would likely take no action on the reconsideration request and just allow the algorithm to do its job (re-crawl and remove the penalty).

More here from Google’s Distinguished Engineer Matt Cutts.

The main goal here is to look at all the common areas and elements that can be involved when a site is penalized (or removed entirely in this case). The reason it is important to look at everything in these situations is that when one files a reconsideration request, we want to ensure we’re taking responsibility for what actually caused the removal.

In short, if we go to them and discuss elements that are unrelated to the removal, we run the risk of being profiled as trying to play games with or trick them. Not an ideal situation.

For more, here is a great interview about reconsideration requests with Tiffany Oberoi.

Dealing with Algorithm Changes

Now, if you’re fairly certain that you haven’t been penalized, the question remains – what the heck happened?

We now have to start considering the possibility that there isn’t really a penalty, but some form of filtering or dampening happening to the site or page. What can cause this? Plenty.


  • Changes in valuations of links (weight, segmentation, etc.)
  • Changes in freshness (temporal signals)
  • Seasonal changes
  • Major algorithm updates (Panda, et al)
  • Minor algorithm updates (see monthly posts from Google)
  • Trust algorithm changes (social graph, outbound links etc.)
  • Duplication (scraping and attribution)

We want to start tracking things at the keyword and page level. Analytics data is commonly the best place to turn for this, along with rank-checking monitors (always use more than one for accuracy).


  1. Is it one or more pages that have been affected?
  2. Are there common keyword elements/modifiers to terms affected?
  3. Are the ranking losses similar in severity?
  4. What (if anything) happened to competitors in the space?
  5. What changes have been made to the website?
  6. What known algorithm changes have happened of late?

We essentially need to identify the type of filtering or dampening that is at play. In most cases there are some common elements that will emerge as you start to look deeper at which pages have been affected.
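Parts of the checklist above can be automated. A rough sketch (keywords and positions below are invented) that diffs two ranking snapshots to surface affected terms and any modifier they share:

```python
from collections import Counter

# Two ranking snapshots (keyword -> Google position); figures invented.
last_month = {"blue widgets": 3, "cheap blue widgets": 5,
              "cheap widget deals": 6, "widget reviews": 4}
this_month = {"blue widgets": 4, "cheap blue widgets": 38,
              "cheap widget deals": 41, "widget reviews": 5}

# Flag keywords that dropped 10 or more positions (missing = position 100).
dropped = {kw: this_month.get(kw, 100) - last_month[kw]
           for kw in last_month if this_month.get(kw, 100) - last_month[kw] >= 10}

# Do the affected terms share a common modifier?
modifier_counts = Counter(word for kw in dropped for word in kw.split())

print(sorted(dropped))                 # ['cheap blue widgets', 'cheap widget deals']
print(modifier_counts.most_common(1))  # [('cheap', 2)]
```

Here both tanked terms share the modifier “cheap”, which would suggest a filter aimed at that class of query rather than the site as a whole.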

For example, we have seen only one (suspect) case of Panda affecting a single page rather than an entire site. If we note that a large number of pages have been affected, then we can start to consider algorithm changes that relate to entire sites, not just individual pages.

Obviously, within the limited space of this article, we can’t get into dealing with each and every potential instance. What is important for now is the differentiation between a penalty and an algorithmic effect.

One Last Consideration


It also bears mentioning that we have seen many cases where it’s none of the above, in the sense that the site was neither penalized nor feeling the effects of an algorithmic change. Often the cause is some fairly uncomplicated human error.

It is always a great idea to have your site developers/programmers keep a detailed change log of things being worked on for the site – the SEO team too, for that matter. We have seen things like:

  • Architectural changes
  • Canonical issues
  • Response codes (500 instead of 200)
  • Redirect issues
  • Crawl issues (see Webmaster Tools)
  • Content changes (unauthorized)
  • Title and meta data changes

You’d be amazed how many times developers or other staffers throw a monkey wrench into the works and nobody else has a clue it happened.
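Response codes in particular are easy to audit yourself. A small sketch using only the standard library; the bucketing messages are common-sense defaults, not Google policy, and the fetch helper is illustrative.

```python
from urllib import request, error

def classify(status):
    """Bucket an HTTP status code for a quick crawl-health report."""
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect: check it points where you intend"
    if status >= 500:
        return "server error: persistent 5xx can get pages dropped"
    return "client error: a 404 loses whatever rankings the URL had"

def check_url(url, timeout=10):
    """Fetch a URL and return (status, bucket). Note that urlopen follows
    redirects itself, so also compare resp.url against what you requested."""
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status, classify(resp.status)
    except error.HTTPError as e:
        return e.code, classify(e.code)

print(classify(200))  # ok
print(classify(500))  # server error: persistent 5xx can get pages dropped
```

Run `check_url` over a list of your key landing pages after any deployment; a page quietly returning 500 instead of 200 is exactly the kind of change nobody notices until the traffic graph does.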

What Data to Keep for the SEO Doctor

It doesn’t matter if you do it for your own purposes or for that poor ol’ consultant who has to come in someday to try and assist in recovery: data is going to be key.

Here’s what you should be collecting to ensure that, if it happens, you’re in a position to do analysis and get things back on track.

General data includes:

  • Google Analytics (or equivalent)
  • Google Webmaster Tools (lots of great data there)

Historical data includes:

  • On-site change log (programmers, SEO, anyone that edits the site)
  • Link building activity log (including dates, types, locations)
  • Paid links activity (including dates and locations)
  • Ranking reports (competitors and yourself)
  • Indexation levels (actual pages indexed)
  • Changes to .htaccess or robots.txt

If you have all of these, you will be in much better shape should the day come that you have to deal with traffic/rankings losses. And if you’re bringing in outside help, these data points will be invaluable to the person doing the analysis.
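For the last item on that list, keeping dated snapshots makes the diffing trivial. A sketch with Python’s difflib; the robots.txt contents and dates are hypothetical.

```python
import difflib

# Dated robots.txt snapshots, stored alongside the change log.
# Contents are made up; note the one-character difference.
snapshot_jan = "User-agent: *\nDisallow: /admin/\n"
snapshot_feb = "User-agent: *\nDisallow: /\n"   # now blocks the whole site

diff = list(difflib.unified_diff(
    snapshot_jan.splitlines(), snapshot_feb.splitlines(),
    fromfile="robots.txt@2012-01-01", tofile="robots.txt@2012-02-01",
    lineterm=""))

print("\n".join(diff))
blocked_all = any(line == "+Disallow: /" for line in diff)
print(blocked_all)  # True: a tiny edit that can kill crawling site-wide
```

The same snapshot-and-diff habit works for .htaccess, titles, and templates: when traffic moves, you can tie the date of the drop to the exact change.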

And that’s the basics for you. Remember, there is always an answer, one just has to find it.

“Muddy water, let stand becomes clear.” – Lao Tzu
