You're awesome. Your site is fantastic. You have great content. People think your site rocks – traffic is skyrocketing and you're attracting plenty of links.
Search engine optimization? Who needs SEO? Nothing to worry about here!
As you go about your merry way, unbeknownst to you, a host of issues is boiling beneath the surface, like a supervolcano waiting to explode.
Double, Double, Toil & Trouble!
Eventually, one fateful day, you open up Google Webmaster Tools and discover a penalty! And before you get your head around that – a warning! A few weeks later – thwap! – another penalty!
How could this be? Your head is spinning. What happened?
Welcome to the ugly side of the new SEO paradigm.
How can a site with great content, links, and traffic suddenly wake up and find itself in proverbial hot Google water? Easy! Because, despite what you may have heard or read, content marketing isn't the new SEO.
Great SEO is still as much about technical excellence, on-site fortitude, and understanding how easily the lack of a knowledgeable watchman can destroy your site as it always was. SEO has become increasingly complex, a Jenga tower more dependent on the layers beneath it than ever before.
Yes, in the wake of Google's Panda and Penguin updates, you must have amazing original, unique, and well-written content and natural-looking backlink profiles. Even that still isn't enough.
If you aren't paying attention, poor website quality indicators can torpedo your website. You'll watch in horror as your site quickly starts taking on SEO water – in the form of lost Google rankings – and eventually your battleship (and traffic) will sink into the abyss.
What is a Poor Website Quality Indicator?
A poor website quality indicator comes in many forms, usually in areas such as content and backlinks, but the majority come from unattended technical components.
Once you accumulate enough of these poor quality indicators (PQIs), you will earn a site penalty. This penalty might be an algorithmic filter at the keyword or page level, but more likely the penalty you receive will be manual, and these are often the hardest from which to recover.
Poor Quality Indicators to Watch Out For!
1. Comment Spam
One of the newer penalties that people are finding in their inbox is one for comment spam.
You know that person who leaves those annoying links at the bottom of your forum or blog? The one whose comments are made of spun (robot-generated) content, are stuffed with multiple links, or offer "Free Shipping" on your informational site? These are one of the newest and fastest paths to a Google penalty. It can take relatively few instances of comment spam to wind up with a manual penalty.
Captcha won't stop your spammers, so don't make it your "stop all" method. Add an authentication method such as Twitter/Facebook (never use just Facebook; many users won't log in with their Facebook accounts).
Also make sure to lock down the use of HTML in your comments. Use good spam-scrubbing software to flag bad comments, limit links, and make sure to have some method of human moderation. Flagging by other trusted users also helps!
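The moderation rules above can be sketched as a simple pre-moderation filter. This is a hypothetical example, not any particular plugin's API; the link limit and phrase list are assumptions you would tune for your own site.

```python
import re

# Hypothetical pre-moderation filter: flag comments that look like spam
# so a human moderator reviews them before they are published.
SPAM_PHRASES = ("free shipping", "buy now", "cheap meds")  # assumed list
MAX_LINKS = 2  # assumed threshold

def flag_comment(comment_html):
    """Return a list of reasons this comment should be held for review."""
    reasons = []
    links = re.findall(r"<a\s", comment_html, re.IGNORECASE)
    if len(links) > MAX_LINKS:
        reasons.append("too many links")
    lowered = comment_html.lower()
    for phrase in SPAM_PHRASES:
        if phrase in lowered:
            reasons.append("spam phrase: " + phrase)
    if re.search(r"<(script|iframe)", lowered):
        reasons.append("disallowed HTML")
    return reasons
```

A clean comment returns an empty list and can be published (or queued) normally; anything flagged goes to human moderation.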
2. Ad Creep
Do you have too many ads above the fold on your site? Did you even know Google has a penalty for the excessive use of ads above the fold? Of course, in this day of device chaos, exactly what counts as "above the fold" and just how much ad space is too much are open to interpretation, but you can get a penalty nonetheless.
Evaluate how much value you're really getting from those ads. With banner blindness at an all-time high, and users resenting the intrusion of ads into their content space, would the space you're giving to ads be better used for something else, say internal pathing?
If you're going to use ads, use a Post-it note as your guide. Hold one up to your screen. If your ads take up more of the top of your page than the note covers, you're probably using too much. Yeah, I know, not much to go on, but it's what we've got.
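The Post-it rule of thumb can be made a little more concrete as a rough area check. This is purely illustrative: the 10% threshold stands in for the Post-it heuristic and is an assumption, not a number Google publishes.

```python
# Rough sketch of the "Post-it note" heuristic: estimate what fraction of
# the above-the-fold area is covered by ads. The 10% cutoff is an assumed
# stand-in for the Post-it rule of thumb, not an official Google figure.

def ad_fraction_above_fold(viewport_w, fold_h, ad_boxes):
    """ad_boxes: list of (x, y, width, height) rectangles in page pixels."""
    fold_area = viewport_w * fold_h
    covered = 0
    for x, y, w, h in ad_boxes:
        visible_h = max(0, min(y + h, fold_h) - max(y, 0))
        visible_w = max(0, min(x + w, viewport_w) - max(x, 0))
        covered += visible_w * visible_h  # ignores overlapping ads (simplification)
    return covered / fold_area

def too_many_ads(ad_boxes, viewport_w=1280, fold_h=700):
    return ad_fraction_above_fold(viewport_w, fold_h, ad_boxes) > 0.10
```

A full-width 200px leaderboard on a 1280×700 fold covers almost 30% of the visible area, well past the assumed threshold; a small 300×100 unit is under 4%.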
3. Internal Linking
I have seen pages with as many as 400+ internal links. How, you ask? Thanks to newer techniques such as tagging and infinite scrolling, internal link structures can run away from a site before anyone even notices.
Make sure your internal link structures aren't splitting your site into infinite strands. While link sculpting is considered an outdated technique, keeping your page links relevant and on topic is not. Splitting your page information into 400 pieces of unrelated data makes it difficult for the spider to categorize.
An oldie but a goodie: no more than 100 internal links on a page (and really, the fewer the better). Don't split your site into hundreds of topics from one page to the next. Focus your internal links to tell a proper story and take your users down simple paths. Links should lead users through your site, not send them down an endless set of unrelated tunnels.
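Counting internal links is easy to automate, so pages creeping past the 100-link guideline can be caught early. A minimal sketch using Python's standard-library HTML parser; the domain passed in is whatever host you consider "internal":

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# Count internal links on a page so pages drifting past the ~100-link
# guideline can be flagged during a routine audit.

class LinkCounter(HTMLParser):
    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal = 0

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative URLs (no host) and same-host URLs count as internal.
        if host in ("", self.site_host):
            self.internal += 1

def count_internal_links(html, site_host):
    parser = LinkCounter(site_host)
    parser.feed(html)
    return parser.internal
```

Run it over your templates or a crawl dump and flag anything over 100.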
4. Crawler Issues
Is someone watching your Webmaster Tools? Are you checking your site for crawler issues?
Especially check for "404 page not found" and "500 server error" issues. If your site is giving Google 404 errors, you're losing site links and Google isn't able to serve those pages. This can be highly problematic if it affects a high percentage of site pages and isn't rectified quickly. If your site is returning 500 errors, it's a quick indicator to Google that you have site quality issues and no one is at your helm.
Check Google Webmaster Tools at least once a week for crawler issues. When you find them, address them immediately. Generally these are easy fixes: a 301 permanent redirect in the case of a 404 error, or a server check to find what is causing the 500 errors.
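That weekly triage can be scripted against a crawl export. A sketch, assuming you have (URL, status code) pairs from your crawl report; the exact export format will vary by tool:

```python
# Weekly crawler-error triage: separate the 404s (candidates for 301
# permanent redirects) from the 500s (server problems to escalate).

def triage_crawl_errors(results):
    """results: iterable of (url, http_status) pairs from a crawl report."""
    not_found, server_errors = [], []
    for url, status in results:
        if status == 404:
            not_found.append(url)       # fix with a 301 permanent redirect
        elif 500 <= status < 600:
            server_errors.append(url)   # investigate the server itself
    return not_found, server_errors
```

The 404 list becomes your redirect map; the 5xx list goes straight to whoever runs the server.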
5. Duplicate Content - Title/Meta Content
While most people know about duplicate site content, they often forget that duplicate title tags and meta descriptions are also considered duplicate content. When this happens across multiple pages, it becomes a site quality issue that can derail your site health.
Duplicate title tags and meta descriptions often happen as a result of CMS software or content teams that are unaware of the ramifications of this duplicate content issue. Make sure your CMS does not use a common title or description across site pages or site sections. (Blog CMSs are infamous for doing just this.)
Then make sure your content teams are writing original, unique titles and descriptions for each page of content. When this isn't possible and the titles and descriptions will be auto-generated, make sure the scripts are pulling the content from a unique area of each page.
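Finding these duplicates is a simple grouping job once you have a URL-to-title (or URL-to-description) mapping from a crawl. A minimal sketch:

```python
from collections import defaultdict

# Group URLs by their title tag (or meta description) and report any text
# shared by more than one URL, which Google can treat as duplicate content.

def find_duplicates(page_texts):
    """page_texts: dict mapping URL -> title or description string."""
    by_text = defaultdict(list)
    for url, text in page_texts.items():
        by_text[text.strip().lower()].append(url)  # normalize before comparing
    return {text: urls for text, urls in by_text.items() if len(urls) > 1}
```

Feed it titles first, then descriptions, and hand any non-empty result to the content team as a fix list.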
6. Duplicate Content – Tagging
A very popular technique these days is to add category tags to articles, posts, or blogs. These were originally meant to help categorize content and make it easier for users to find that content.
Unfortunately, these tags are often created on the fly, according to the imagination of the author, with no limit on the number of tags. The tags are then used to create alternative pages on the site. Since the tags are unique and author-created, you often wind up with many pages containing a single article and the same content (i.e., duplicate content on your own site).
Limit tagging to pre-set tags set by an SEO team or content manager, so you don't have an unlimited number of pages with ungrouped content. Tags were meant to group content together, so there should be a predefined set of descriptive tags.
Also, make sure you don't have tag pages that contain only one piece of content. Use scripting to keep such pages hidden from Google (i.e., noindex any page whose only content duplicates the original) until the page has more than one piece of content. After all, it isn't a tag group until it has more than one piece, right?
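That rule is easy to encode in whatever template renders your tag archives. A sketch of the decision, assuming your template can ask how many articles a tag currently groups:

```python
# Tag-page rule from above: a tag archive should carry a noindex meta tag
# until it actually groups more than one piece of content; otherwise it is
# just a duplicate of the original article.

def tag_page_robots_meta(articles_for_tag):
    """Return the robots meta tag for a tag page listing these articles."""
    if len(articles_for_tag) <= 1:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

The "follow" keeps link equity flowing even while the page itself stays out of the index.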
7. Page Speed
While Google states this issue affects only about 1 percent of queries, in real life we see the effect can be much greater in competitive markets. So how is your page speed? If you're scoring below 85, your site is showing quality issues; below 75, and no one is at the helm. Google isn't happy when it thinks no one is watching the site.
Run a page speed test (a plug-in you can get for Firebug or Chrome) and check your homepage plus your main landing pages. See how your page speed is doing. Then, if you use Google Analytics, check its Site Speed reports to see how Google measures your speed internally.
If your page speed is slow, work on your times. Often something as simple as proper image compression or server caching will make all the difference in the world.
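To see why server-side compression is such a cheap win, here is a quick demonstration of how much a repetitive HTML payload shrinks under gzip. The sample markup is made up for illustration; real savings depend on your actual pages:

```python
import gzip

# Demo of why enabling gzip on the server helps page speed: HTML is highly
# repetitive, so it compresses well and transfers far fewer bytes.

def gzip_savings(html_bytes):
    """Return (original_size, compressed_size) in bytes."""
    compressed = gzip.compress(html_bytes)
    return len(html_bytes), len(compressed)

# Hypothetical listing page: the same markup block repeated 200 times.
page = b"<div class='item'><a href='/product'>Product</a></div>" * 200
original_size, compressed_size = gzip_savings(page)
```

On a real site you would enable this in the web server configuration (e.g., mod_deflate on Apache) rather than in application code; the point is simply that the transfer size drops dramatically.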
8. Site Traffic
How is your site traffic? Where is it coming from? Is it where you think it should be coming from?
While for most, site traffic will be coming from where it should be, some industries attract spammers and “users” from countries where bots are more prevalent than people. I know of sites that have shut off traffic from entire countries for just this reason.
While your site metrics themselves won't affect your organic indicators, the time users spend away from and back at the search engine will, so check your traffic for anomalies in country, time on site, bounce rate, new visitor rate, etc. If a country sends an overabundance of poor quality traffic, it might be worth considering filtering it out.
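The anomaly check above can run as a script over an analytics export. A sketch, where all thresholds are assumptions to calibrate against your own baselines, not industry standards:

```python
# Flag countries whose traffic looks bot-like: high volume with extreme
# bounce rates or near-zero time on site. All thresholds are assumptions
# you would tune against your own site's normal numbers.

def suspicious_countries(stats, min_visits=500,
                         max_bounce=0.90, min_avg_seconds=5):
    """stats: {country: {"visits": n, "bounce_rate": 0-1, "avg_seconds": n}}"""
    flagged = []
    for country, s in stats.items():
        if s["visits"] < min_visits:
            continue  # too little data to judge
        if s["bounce_rate"] > max_bounce or s["avg_seconds"] < min_avg_seconds:
            flagged.append(country)
    return flagged
```

Anything flagged is worth a manual look before you decide whether filtering that traffic is justified.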
9. Malware & Site Hacks
The first place to check is your Webmaster Tools: see if Google has sent you a "WE FOUND BAD THINGS ON YOUR SITE" email. If you get one of these, do not pass go, do not go home; find the hack now and remove it, or you will find yourself in a heap of Googly issues.
However, there are many times when Google might not find a site hack right away. In those lucky cases, you get to remove it before Google notices and escape the red flag. So how does one find malware on a site?
Often malware or other attacks on your site will show as odd referrers in your analytics reports. Generally you will see URL strings that make no sense or you will click through your reports to see referrers that refer to scripting code or iframed references. When you see these, do not ignore the report. These anomalies are often indicators something is amiss.
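Scanning a referrer export for those patterns is straightforward to script. A sketch; the pattern list is an illustrative assumption, and a real scan would use a broader set:

```python
import re

# Scan referrer strings from an analytics export for the anomalies
# described above: script fragments, iframes, and remote-include style
# query strings. Pattern list is illustrative, not exhaustive.

SUSPECT_PATTERNS = [
    r"<script", r"<iframe", r"eval\(", r"base64,",
    r"\.php\?.*=(http|ftp)",  # query string pulling in a remote URL
]

def suspicious_referrers(referrers):
    flagged = []
    for ref in referrers:
        if any(re.search(p, ref, re.IGNORECASE) for p in SUSPECT_PATTERNS):
            flagged.append(ref)
    return flagged
```

Any hit deserves investigation rather than dismissal; as noted above, these anomalies are often the first visible sign something is amiss.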
If you can't hunt the source down, find someone on your team who can. There will almost always be someone with some skills in this area.
When in doubt about a page, you can also use the "Fetch as Googlebot" feature in your Webmaster Tools to check the page content. Crafty hackers will serve you different content than they serve your users. Use this tool to see what Google's spider is seeing.
10. Webmaster Tools – Penalties, Warnings and Stuff
It's really easy for a non-SEO to forget to check Webmaster Tools for warnings, penalties, and all the other relevant information it has to tell you about your site health. And even if you are checking, do you know how to interpret what it is telling you?
An SEO will check this daily, or at least every other day. They will check your site health not only through Webmaster Tools, but also through your analytics and tools of their own choosing, to make sure your site is hitting its quality indicators and you never receive one of those warning notices.
But if you do get a warning or, worse, a penalty, do you know how quickly you need to act? What actions signal to Google that you took it seriously and that you know what to do?
In one recent, well-known case, a gentleman received a backlink penalty, so he used the new disavow tool to disavow all his links and start fresh. Yet weeks later, the backlink penalty still followed him.
An experienced SEO could have told him this would not work, would you have done the same? Even if you wouldn’t have, do you know why not?
Having an experienced SEO on staff or in your bullpen is worth their weight in gold when a warning or penalty comes in, because they can help you find your issues quickly, react appropriately, and stay on top of issues as they arise.
Don’t Believe the Hype!
With every change to the algorithm, industry experts, analysts, and even SEOs love to wax philosophical about how far we have come from the original tenets of SEO and how all we know is dying, dead, or dead and gone. Don't believe the hype.
Content is extremely important to your site health, but it always has been. All that really changed is that you have a much harder time getting away with "crappy" (thin affiliate, spun, or just poorly written) content.
Lately, however, the focus has been on content almost to the exclusion of technical SEO, despite Google's Webmaster Quality Guidelines being almost entirely technical in some manner.
Google advises against:
- Automatically generated content
- Participating in link schemes
- Sneaky redirects
- Hidden text or links
- Doorway pages
- Scraped content
- Participating in affiliate programs without adding sufficient value
- Loading pages with irrelevant keywords
- Creating pages with malicious behavior, such as phishing or installing viruses, trojans, or other badware
- Abusing rich snippets markup
- Sending automated queries to Google
And asks you to engage in good practices like the following:
- Monitoring your site for hacking and removing hacked content as soon as it appears
- Preventing and removing user-generated spam on your site
So while content remains important, it also remains mostly the same, whereas technical SEO becomes increasingly complex to manage, maintain, and sustain.
SEO is Still SEO
Without an SEO to help navigate the technical shores, it's easy to start taking on SEO water and quickly succumb to warnings and penalties that were silently lurking beneath your site's calm surface.
Current convention, with its advocacy of content and links, often overlooks the importance of that technical layer.
However, in this day of Panda and Penguin and 1-2 changes to Google's algorithm every day, the technical portion of SEO can't be overlooked or denied. To do so is to overlook the lower half of the submerged iceberg and suffer the consequences when it hits you below the bow, unnoticed, until you tip upright and start to sink.