Perhaps you have received, or know someone who has received, this dreaded “unnatural links” message from Google:
There has been a lot of speculation about the messages. Are they part of a Google plan to get publishers to reveal bad links that Google isn't able to detect on its own? Hard to say.
At the recent Brighton SEO conference, on Friday the 13th of all days, Google UK's Pierre Far indicated that the messages were manually generated.
Given the report that more than 1 million websites received this message, that is pretty frightening stuff. One million manually applied messages? Chances are this results from algorithms flagging potential suspects, followed by some human examination for validation purposes before the messages are sent.
How many of your link building tactics will withstand human examination? Will a human be fooled by a spam post like this one?
Even though this particular post is better than the average machine-generated content, it doesn’t pass human review. Sadly, you can easily find these types of posts by poking around with random garbage search queries.
Worse still, there is growing evidence that these messages are leading to penalties.
To be fair, others think that no such penalties have occurred. This SEOmoz article suggests that the entire focus was on sites that were participating in link networks such as BuildMyRank. However, some of my company's clients have been served with "unnatural links" messages when their sites weren't participating in link networks, so I don't think that was the sole trigger.
Unnatural Links: Round 1?
Let's step back and look at this a bit more broadly. Consider that Google implemented some simple algorithm that flagged a bunch of sites for human examination. Maybe that algorithm was as simple as a list of sites that received links from BuildMyRank. This resulted in a large number of websites being tagged for having unnatural links. Let's accept the argument for the moment that it was due to participation in link-based blog networks, or even just BuildMyRank.
Google can write other algorithms. How easy would it be to write the code for one of these scenarios:
- Large percentage of blog/forum links.
- Links from obvious junk content posts like the one that I showed above - this one might be a bit tricky in terms of semantic analysis, but they could still try something here.
- Links from posts that exceed a threshold for spelling or grammar errors.
- Large percentage of footer links.
- Large percentage of right rail links.
- Unnatural anchor text distribution.
- Unnatural anchor text in a class of links (e.g., all the links you get from blogs are anchor text rich, or the right rail links are, or the footer links are ...).
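To see how simple the code for scenarios like these could be, here is a minimal sketch in Python. Everything in it is a hypothetical assumption for illustration - the metric names, the threshold values, and the link-record format are invented, and this is in no way Google's actual logic:

```python
from collections import Counter

# Invented thresholds for illustration only.
THRESHOLDS = {
    "blog_forum_ratio": 0.60,   # share of links coming from blogs/forums
    "footer_ratio": 0.40,       # share of links placed in page footers
    "top_anchor_ratio": 0.50,   # share of links using one exact anchor text
}

def profile_metrics(links):
    """links: list of dicts with 'source_type' and 'anchor' keys."""
    total = len(links)
    if total == 0:
        return {}
    types = Counter(link["source_type"] for link in links)
    anchors = Counter(link["anchor"] for link in links)
    top_anchor_count = anchors.most_common(1)[0][1]
    return {
        "blog_forum_ratio": (types["blog"] + types["forum"]) / total,
        "footer_ratio": types["footer"] / total,
        "top_anchor_ratio": top_anchor_count / total,
    }

def flag_for_review(links):
    """Return the names of metrics that exceed their thresholds."""
    metrics = profile_metrics(links)
    return [name for name, value in metrics.items()
            if value > THRESHOLDS[name]]

# Example: a profile dominated by exact-match anchors from blog posts.
suspect = [{"source_type": "blog", "anchor": "cheap widgets"}] * 8 + \
          [{"source_type": "editorial", "anchor": "example.com"}] * 2
print(flag_for_review(suspect))  # ['blog_forum_ratio', 'top_anchor_ratio']
```

A few Counter lookups and ratio comparisons are all it takes, which is the point: none of the scenarios above require sophisticated engineering.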
We can generalize the operation of this process to the following:
Bear in mind, like Panda, this is an offline process, not something that is part of the normal Google indexing algorithm, so it can be as complex as they like in how it processes content and data. Also, these algorithms would be used to trigger a manual verification stage, so the worry about false positives is greatly lessened by that.
I haven't seen a single instance of a false positive reported with the unnatural links campaign so far. Not one.
In addition, the incremental coding task is not a challenging one. The hard part was to commit the resources to manually review the programming output on a large scale, and it appears that Google has done that.
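The flag-then-verify flow described above can be sketched in a few lines. Again, the function names and the toy spam score are assumptions of mine, not anything Google has published - the only claim is the shape of the process: an offline batch pass flags candidates, and only human-confirmed sites get actioned:

```python
def offline_pass(sites, score_fn, threshold):
    """Batch-score every site offline; return those exceeding the threshold."""
    return [site for site in sites if score_fn(site) > threshold]

def review_queue(flagged, human_confirms):
    """Only sites that a human reviewer confirms get actioned."""
    return [site for site in flagged if human_confirms(site)]

# Toy run: the 'score' is just a precomputed spam score on each site.
sites = [{"domain": "a.com", "spam_score": 0.9},
         {"domain": "b.com", "spam_score": 0.2}]
flagged = offline_pass(sites, lambda s: s["spam_score"], 0.5)
confirmed = review_queue(flagged, lambda s: True)  # reviewer agrees
print([s["domain"] for s in confirmed])  # ['a.com']
```

The human review stage is the expensive part; the code around it is trivial, which is why the scale of the manual effort is the real commitment here.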
Learning From Panda
Recall what happened with Panda. The initial algorithm came out on February 23/24, 2011. Since then, we have seen a stream of additional Panda releases, each refining the algorithm and producing new winners and losers. Panda broke new ground when it introduced the concept of an offline processor adding an additional ranking component.
The unnatural links messages take this concept one giant step further. It is an offline processor that triggers a manual review for a specific type of problem. If this concept works in a manner that is acceptably scalable for Google, there are many millions of publishers out there who need to get to work on cleaning up their link profiles.
That is the real story here.
Google Has a New Tool in Their Toolbox
You don't have to like it, or dislike it, but you should know that it is there. And it shouldn't matter whether Google will punish you for the unnatural links, or simply disable them.
If you spend lots of time and money on a link building tactic that ultimately adds no value, it won't matter much whether you get penalized or the entire output of your campaign is simply disabled. It will still feel just like a penalty.