SEO, Why You Are Doing it Wrong

Trent Harbour's site was bitten by Penguin. This is a cautionary tale of missteps towards the problem of ranking on search engines. It is also an examination into why small businesses are completely at a loss as to what constitutes “ethical” SEO.

Like many webmasters affected by the Google Penguin update, Trent Harbour of Berkeyshop.com needed to find out why his site had dropped positions in the Google organic results and how he could reverse that. So he reached out to Search Engine Watch for some answers. What follows is a cautionary tale of missteps towards the complex problem of ranking on search engines and an examination into why small businesses are completely at a loss as to what constitutes “ethical” SEO.

Before he was worried about Google and its algorithms, Harbour was a portfolio manager and investor. However, during the 2008 financial crisis he “hit rock bottom” and an illness forced him to give up his job of 10 years. Clean drinking water was pivotal in managing his condition, so he decided to become a distributor of water filters in October 2010. To date, he said, he has invested $20K in the business, and turnover held at a consistent $600/day for the four weeks before Google’s Penguin update went into effect. Harbour told me his revenues were reduced overnight to “next to nil, since I was penalized” and unique visitors to the site are down by 55%.

What emerged from phone conversations with Harbour was that he really could not distinguish between good and bad links and did not have a fundamental understanding of what really constituted ‘good content’. His grasp of the key principles behind the Google algorithm relied heavily on received wisdom from webmaster forums. From my perspective, the maxims he followed for SEO were taken completely out of context from Google’s actual goals. Furthermore, from his perspective the competitive landscape seemed to endorse the idea of actively manipulating search engines, whilst Google’s algorithm actually seemed to reward him – albeit temporarily – for engaging in what were clearly webspam strategies that were against Google’s guidelines.

Another challenge: he did not understand the difference between the Panda and Penguin updates. He had received a warning letter, dated April 11, 2012, in Webmaster Tools, but never noticed it because he hardly ever logged in. The letter told Harbour that ‘unnatural links’ pointing to his site had been detected and that he should proceed to remove them, but it gave little further insight into what that meant to Google or what actions he could take to rectify the issue.

The warning letter about ‘unnatural links’, combined with the algorithm update announced on April 24 and deployed on May 1, 2012, made Harbour feel like he had been penalized by Google. In those early days of the update it was difficult to explain what had really happened, so I asked him to share more details with me so that I could conduct a deeper analysis of his site. Harbour kindly gave me access to his analytics account and to lists of the bad links that had been created.

WWW.WhatWentWrong.com?

It’s quite simple – Harbour bought article spinning services from a company in China. Article spinning, a form of webspam, violates Google’s guidelines.

Article spinning is a technique designed to fool Googlebot into thinking that hundreds of pages are linking to a particular website, when in fact every page published is machine-written junk or pure gibberish. Matt Cutts posted his own example of “spun content” on the official Google blog in the announcement of the algorithm update later dubbed “Penguin.”
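To make the mechanics concrete: spinning services typically generate these pages from “spintax” templates, in which interchangeable phrases are wrapped in braces. Below is a minimal, illustrative Python expander – the template text is invented, but real services push far larger templates through the same substitution loop to churn out hundreds of “unique” pages.

```python
import random
import re

def spin(template: str) -> str:
    """Expand one random variant of a spintax template.

    Spintax wraps alternatives in braces separated by pipes,
    e.g. "{clean|pure} water". Innermost groups resolve first,
    so nested spintax works too.
    """
    pattern = re.compile(r"\{([^{}]*)\}")
    while True:
        match = pattern.search(template)
        if match is None:
            return template
        choice = random.choice(match.group(1).split("|"))
        template = template[:match.start()] + choice + template[match.end():]

# Each call yields a slightly different, equally worthless "article".
print(spin("{Clean|Pure} water is {vital|crucial} for {your health|life}."))
```

Every variant reads as near-duplicate filler to a human, which is exactly why the anchor text buried inside it is worthless as a genuine endorsement.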

In a nutshell, the two months in which he commissioned article spinning, March and April 2012, happened to be exactly when Matt Cutts’ webspam team at Google resolved to deploy an algorithm update that specifically targeted spun content. Thus when the links from those spun-content sites were rendered useless, Harbour’s site was demoted in the rankings.

Impact of the Penguin algorithmic penalty

Wrongly, due to the April 11 warning message regarding ‘unnatural links’, Harbour felt as if he had been penalized by Google and filed a reconsideration request. However, despite Google providing a ‘feedback form’ regarding the Penguin update, this form was not a means by which any webmaster could have their website “reconsidered”. Unlike a manual penalty, Penguin was an algorithmic update, which meant sites were simply affected by the changed algorithm, for better or worse. Although this was not clear at the time, Google later said that reconsideration was not possible for the Penguin update.

Nonetheless Harbour took the blow to his Google rankings personally and wanted to appeal Google’s apparent ‘decision’. In further conversation with me, Harbour contended that he bought those services because ‘everybody is doing it’; he pointed to some examples from competitors that looked shady. Yet, as far as I could tell, whilst the content they were putting up was not of particularly high quality either, none of them had engaged in article spinning. No matter how much spam his competitors may have been investing in, Harbour’s website did the wrong thing at the wrong time.

However, in defense of Harbour, arguably Google rewarded him temporarily for engaging in webspam. During March and April of 2012 his site shot up the rankings for his target term and he was making $610 in daily sales. It’s a testament to how poor Google’s spam detection algorithms actually were that these tactics were rewarded at all. It also shows how Google’s webmaster policies are in conflict with Google’s technology. To many, in a very real way, what Google ‘says’ and what Google ‘does’ are two completely different things. Take a look at the stats below and you can see that search traffic increased within a month (i.e. April 2012) of implementing the webspam strategy.

Impact of the Penguin algorithmic penalty on search traffic

But to some degree Google’s problem is every webmaster’s problem. A poor understanding of the algorithm fosters even poorer marketing strategies, and those strategies push businesses to invest in spam as a kind of mimetic, competitive practice.

What’s more, newcomers don’t always fully understand Google’s guidelines, so they try to copy the winning techniques of long-established competitors. A deeper investigation into Harbour’s website shows that he had really been running the risk of his site being deemed spam by the search engines for over a year.

Forcing Anchor Text, Rather than Creating a Better User Experience for the Web

Among webmaster and SEO forums, it is well known that links with the right words (namely, anchor text) pointing to your website increase the likelihood of your site ranking for the same term. It’s also well known that as the volume of those links with the same term increases, so will your rankings. Put simply, anchor text matches from other domains seem to be a ranking factor on Google and other search engines.

Specifically, Google examines anchor text across all sites and uses it as a signal to determine the ‘substance’ of the content on the destination page (i.e. the linked page, not the page that links). Anchor text is implicitly tied into the user experience of a website and the site architecture, from which Google extrapolates the relevance of your site to the search query. Google intends anchor text to be used as easy-to-understand signposts to content for both the user and the search engines. Google does not intend webmasters to point as much matching anchor text at their site as possible simply for the purposes of ranking.

Therefore the measure of whether you are creating spam or not should always be whether a human user can predict what information they will find on the page they land on simply by reading the link.
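To see why the signal is so easy to game, consider how inbound anchor text might be aggregated. The Python sketch below is purely illustrative – the link data and domain names are invented, and this is not Google’s implementation – but it shows how a crawler-side index could build an anchor text profile per destination page, and why a pile of identical exact-match phrases stands out.

```python
from collections import Counter, defaultdict

# Hypothetical crawl output: (source page, anchor text, destination).
links = [
    ("blog-a.example/post-1", "berkey water filter", "shop.example/"),
    ("blog-b.example/post-7", "berkey water filter", "shop.example/"),
    ("blog-c.example/post-9", "berkey water filter", "shop.example/"),
    ("forum.example/t/42",    "this little shop",    "shop.example/"),
]

# Aggregate anchor phrases per destination as a crude relevance signal.
anchor_profile = defaultdict(Counter)
for source, anchor, destination in links:
    anchor_profile[destination][anchor.lower()] += 1

# A page whose profile is dominated by one exact-match phrase from many
# domains shows exactly the pattern link buyers try to manufacture.
for url, phrases in anchor_profile.items():
    print(url, phrases.most_common(3))
```

Of course, an unnatural concentration of one phrase is also precisely the pattern a Penguin-style filter can key on.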

The Anchor Text Test

A good user experience is one in which the link is placed on anchor text phrases that naturally prompt the reader to investigate other resources that will help clarify any questions raised. Equally, if the logical conclusion is to visit the product page of the product discussed (so that the reader can buy it), that is a fair use too.

In that sense, anchor text should be chosen based on the predictive signals it can pass to the reader. As anchor text has the inherent quality of being both a word and a button, you need to think about the linked text as a signpost and choose its placement as a means of signalling that you are pointing to something outside of the text.

In the case of creating links between web documents, if you find your writing does not have an immediately obvious place to include that link, then you should consider rewording your copy to make it more obvious that you are referring to another page and where you can find that page.

By that reckoning it should be easier to understand that a bad user experience is simply a poorly positioned signpost. In the case of links, a bad user experience would be one in which the anchor text does not sit comfortably within the overall rationale of the text or overall theme of the page it is on.

A bad user experience would be one in which the use of links is jarring to the point of seeming haphazard. Another bad user experience would be one in which it never becomes obvious why the linked document is relevant to the reader. Such links will naturally strike the reader as suspiciously placed because they have a suspicious context.

As a rule of thumb, anchor text links should not be randomly peppered all over the page unless that anchor text is a persistent design feature of the site – such as site navigation (in which case it would not be random anyway).

If a human user cannot explain what a page is about, or why it is linking to another page with those specific words then it is most likely spam.
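No algorithm fully replicates that human judgement, but a crude mechanical proxy makes the test concrete. The sketch below is an invented heuristic, not anything Google has published: it simply asks whether the anchor shares any meaningful word with the destination page’s title.

```python
STOPWORDS = {"the", "a", "an", "of", "for", "and", "to", "in", "on"}

def fails_anchor_text_test(anchor: str, destination_title: str) -> bool:
    """Crude proxy for the human test: could a reader predict the
    destination from the link text alone? Here we just check for any
    meaningful word shared between anchor and destination title."""
    anchor_words = {w for w in anchor.lower().split() if w not in STOPWORDS}
    title_words = {w for w in destination_title.lower().split() if w not in STOPWORDS}
    return not (anchor_words & title_words)

# A signpost that matches its destination passes; a forced one fails.
print(fails_anchor_text_test("water filter reviews", "Berkey Water Filter Reviews"))  # False
print(fails_anchor_text_test("best payday loans", "Berkey Water Filter Reviews"))     # True
```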

Linking Strategies That Constitute Spam

In conversation with Harbour, it became apparent that he could not clearly differentiate between a paid link and an organic link. In fact, the only way he recognized the difference was that he knew which links he had paid for and which he had not.

However, the distinction became more complicated when he tried to make the ethical argument that his site had been unfairly targeted by Google. His working definition of organic link building was that paid links were unethical and unpaid links were ethical. Yet a deeper analysis shows that his organic link building strategies would have been seen as unethical in most online communities – and eventually by Google.

In fact, what emerges is that his idea of an organic link was one in which he could not control the anchor text, and therefore could not generate results; thus the prevailing common sense made the unethical option seem like the only option.

The Cesspool of Links

In analyzing Harbour’s own link building strategies I was reminded of the time Eric Schmidt famously called the internet “a cesspool”.

For the sake of clarity, what follows is a list of the types of links that Harbour tried to create. The first set (1-3) are all paid links, which are clearly spam – but I’ve included them as an anatomical guide to webspam. Every one of these paid links is an example of deliberately exploiting the anchor text ranking signal.

The second set (4-6) are links Harbour believed he had achieved ‘organically’ and for that matter, ethically. What becomes clear is that he was not making strategic marketing decisions to guide his organic link building activities – he was merely doggedly following the rule that any link won (that hadn’t been paid for) was fair game. Making matters worse: he only had to look to his competitors to see they were doing exactly the same.

1. Spun content (Paid)

Pure gibberish (or as close to sense as you can get without actually making any), with links embedded in the text – that is spun content. Even where the anchor text is relevant to the site, the fact that all these links sit within written copy that barely makes sense is proof enough of their spam credentials.

Example of spun content

It is spam. Expect Google to penalize your site if it’s associated with rubbish like this.

Example of spun content

2. Splogs (Paid)

The characteristics of a spam blog, otherwise known as a splog, are poorly written posts about a wide array of disconnected topics that do not relate to any particular theme. In the example below you can see that the post author is nameless (simply ‘admin’), and the related posts are completely unrelated to any other content or theme of the site. To any webmaster or competitor this is clearly a splog, as any real site with a vested interest in ranking on search engines would show a clear thematic focus in its content and design.

Example of blog spam

Whilst Google’s web crawlers cannot fully comprehend the meaning of text, the average English-language user who came across the site above would instantly comprehend that it is junk just from the opening sentences, “Aѕ a living organism, уου need tο gеt аn easy access tο сlеаn water everlastingly. Clеаn water іѕ ѕο crucial fοr уουr life thаt уου саnnοt live аnd wіll die іf уου spend more thаn a week without drinking сlеаn water.” (Note the lookalike substituted characters, presumably used to dodge duplicate-content detection.)

Like, no shit Sherlock. It is spam. Expect this site to get penalized in the long run and your own rankings to suffer.

3. Free Directories (Paid)

With advances in search engines, directories have pretty much taken a back seat. However, there is a noteworthy irony here: online directories were once the de rigueur way to find information on the web.

Directories classify and categorize information and should perform an important service of organizing information and making it easier to find via a system of increasingly granular taxonomies.

If you as a user cannot navigate from your listing to the home page of a directory by clicking through a series of increasingly broad categories, then it is probably not useful information to any human being and is therefore spam. Put simply, if the directory has no obvious or useful taxonomy it is spam.

In the example below, a supposed directory of webmasters and their sites has no category or sub-category folders.

Example of directory spam

This is useful to nobody as nobody would have any hope of reaching this page without a search engine. It is spam.

4. User Profile Spam (Organic)

The “secret to SEO” that everyone knows is that your website needs links from other websites, and links from ‘authoritative’ sites – namely those with a high PageRank (PR) – are “the best”. The challenge is to get links from high PR sites which tend to be islands unto themselves. Sites that require users to create user profiles often allow users to include a link to their website. However, such a reciprocal promotion is made on the good faith that you will be an active participant in the community and, by extension, contribute value.

So, creating links in your user profile with the sole intent of pushing your own site is tantamount to spamming the community. Below is an example of a fake user profile used to push a website for no other reason than to get a link from a “high PR” site.

Example of profile spam

My personal position is that if you are not actively participating in a community as a real person then you do not deserve the ability to post a link or have a link on your profile. Creating a user account just to post a link is a tactic that creates no value for anybody. Reddit users would rightly be mad at Harbour for this pointless link from their domain to his site. Arguably he was not to know any better as his competitors are also doing something similar on Yelp.

5. Comment Spam (Organic)

Comment spam on a high-profile website does not mean you are getting a quality link. If your comment does not contribute to the conversation and drive the discussion forward, then it is not a useful link to humans and it fails our anchor text test. You might think it would be great to take the discussion off on a pointless tangent for the sake of dropping links, but that is not cool either. If your comment is purely for the sake of dropping links, or misleading in any way, then I would call that spam.

Harbour may have gotten a link from Mashable in the result below – but it is the worst example of comment spam. Full marks for creativity in dropping a link to your own site by pretending you are sympathetically discussing a personal experience with a competitor.

Example of comment spam

Why would anyone tell people who the competitor is that they lost their talent to, without at the very least mentioning their own company?

If Google ever manually reviewed comment spam, I would argue that the example above is justification enough for being labelled a spammer – which would probably lead to a penalty.

6. Personal Splogs (Organic)

In the hope of increasing the number of links from different domains to your site, you may hit upon the brainwave of creating a blog on a different domain and pointing a batch of links at your site. Thus you have total control of the inbound links to your website! Genius.

Example of personal blog spam in a round robin

Even more genius would be to link one personal splog to another, which then links to your website and so sends Googlebot on a digital “round robin.”

Example of personal blog spam round robin

Except it is not genius and clearly just more spam.
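Part of why it is not genius is that the pattern is mechanically trivial to spot. The toy sketch below – invented domains, not any real detection system – walks a small link graph and flags domains that sit inside a closed linking loop, which is exactly the fingerprint a round robin leaves.

```python
# Hypothetical link graph: two splogs link to each other and to the money site.
link_graph = {
    "splog-one.example": ["splog-two.example", "shop.example"],
    "splog-two.example": ["splog-one.example", "shop.example"],
    "shop.example":      [],
}

def in_link_loop(start: str, graph: dict) -> bool:
    """Depth-first search: does any path of links lead back to `start`?"""
    stack, seen = list(graph.get(start, [])), set()
    while stack:
        node = stack.pop()
        if node == start:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, []))
    return False

print(in_link_loop("splog-one.example", link_graph))  # True: round robin
print(in_link_loop("shop.example", link_graph))       # False
```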

When Google’s Problem Becomes Our Problem

At the heart of it, my personal opinion is that it is a lack of lateral thinking and creativity that is causing people to choose spam as a means to rank. However, my discussions with Harbour have given me a more sympathetic view toward those who do not live and breathe the Google guidelines.

The old adage “if you pay peanuts, you get monkeys” is as true in SEO as any other industry and in seeking to cut costs Harbour cut too many corners and played fast and loose with Google’s guidelines. Engaging in spam gets in the way of building a decent web eco-system that can survive the shifting sands of algorithm changes. Ultimately, what Harbour did not realize is that by winning with links from spam sites, he would eventually lose rankings when Google eliminated the ability of those sites to pass PageRank.

The rub for all webmasters is that Google’s definition of spam is a moving target. It is arguably hypocritical that Harbour is at the mercy of whatever Google’s definition of spam is, with no clear guidance from them as to what to do about it. Even if BerkeyShop.com was not receiving any paid links, the organic strategy was just as vulnerable to Google deeming it spam in the long term.

The converse is that to succeed on Google, you have to play by their rules, which to a greater or lesser extent means following their freemium business model whether you like it or not. The maxim small businesses must live by is “create lots of free value for users” and they just “might start to buy into your premium products and services”. But a freemium business model works best on an enterprise scale – or internet scale – which is outside the reach of most small businesses. Google has the luxury of enabling a freemium business model because they are an internet company – not a company selling water filters. When push comes to shove, most small businesses cannot grasp the concept of ‘creating value’ because there is no clear ‘editorial’ definition of what ‘value’ really is from Google. Yet there is a working value dynamic at the heart of Google’s own algorithm which creates a secondary market of services – namely the right anchor text from the most links.

But this is a simplistic, tactical view that obfuscates cornerstone elements of Google’s algorithm, yet it has currency because it is often validated by experience. It essentially conflates two key Google concepts into one in order to exploit the resultant behavior of the algorithm:

  • That inbound links are a measure of authority in the eyes of Google, because Google counts the unique domains pointing to a site (and the unique domains pointing to those, in turn) to calculate PageRank, a dynamic score that Google uses to determine the authority of any domain (a minimal sketch of the underlying computation follows this list).
  • That anchor text is a signpost for hyperlinks, from which Google can infer what the linked page is about.
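The PageRank idea itself is public: a page’s score is a weighted sum of the scores of the pages linking to it, computed iteratively. Below is a minimal power-iteration sketch of the published formula on an invented toy graph – illustrative only, and nothing like the scale or refinements of Google’s production system.

```python
def pagerank(graph: dict, damping: float = 0.85, iterations: int = 50) -> dict:
    """Minimal power-iteration PageRank over an adjacency dict."""
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, out_links in graph.items():
            if not out_links:  # dangling page: spread its rank evenly
                for other in nodes:
                    new_rank[other] += damping * rank[node] / n
            else:
                for target in out_links:
                    new_rank[target] += damping * rank[node] / len(out_links)
        rank = new_rank
    return rank

# Toy graph: two splogs funnel all of their rank into one money site.
toy = {"splogA": ["shop"], "splogB": ["shop"], "shop": []}
print(pagerank(toy))  # "shop" accumulates the lion's share of the rank
```

Seen this way, the quick-win formula is just an attempt to inflate both inputs at once: more linking domains for PageRank, and the same exact-match phrase for the anchor text signal.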

Ultimately it is the conflation of these two concepts that gets circulated online, packaged up and sold to webmasters as a quick-win formula. Buying ways to increase anchor text matches from other domains is seductive precisely because of its seemingly infallible logic and the sheer difficulty webmasters face in trying to get any meaningful link from anyone at all.

However, Google’s discussion of anchor text in their own Search Engine Optimization starter guide does not go far enough in terms of clarifying the long-term role anchor text plays in the ecology of the web. In fact, Google does not give sufficient advice to webmasters on how to think about the relationship between all websites.

So, there is no working explanation of relevance that anyone can look at. Instead we are furnished with advice that amounts to “how to make a good website”. At the very least, Google could provide webmasters with the clarity to understand that links and anchor text are intended to be two distinct quality signals and therefore require two different strategies.

In Google’s handling of the ‘unnatural link’ notices in webmaster tools, they did not provide clear guidance to all webmasters and treated webmasters as “guilty until proven innocent”. The entire process was badly handled.

Eric Schmidt once callously said, “We don’t actually want you to be successful… The fundamental way to increase your rank is to increase your relevance.” Indeed, Harbour ‘was relevant’ for four weeks and was then dumped. What is clear is that relevance and spam sit on the same sliding scale.

Google certainly does not owe webmasters a living, but it is a failure on Google’s part that a lot of this type of stuff did work. Google needs to be transparent with all interested parties and admit that relevance is, for all intents and purposes, as ephemeral as spam. Flaws in the algorithm need to be acknowledged because they are open to exploitation and thus beget exploitation. Arguably, the cost of the algorithm’s failure to pass its own Turing test is currently passed on to the webmaster. Google’s failure to be straight about what webmasters can do to succeed is letting these conditions persist.

This Achilles heel, unacknowledged, gives Google licence to dictate the web rather than be part of it.

An understanding of the strategic difference between anchor text and links would help webmasters differentiate between what Google understands as organic marketing and what it considers spam. In turn, this understanding would help webmasters choose a longer-term content development strategy and not be swayed by the allure of a short-term webspam strategy.

Harbour doesn’t naturally have the right to spam. But he does have the right to be upset about Google playing God.
