The Link Building Conundrum – Eric Ward’s View

Over the past year, there has been much talk about link building, in most cases specifically related to buying links. As Barry discussed and I commented on last week, the most recent hot topic has been reciprocal linking. According to some interpretations of the Google Blog post from last week, reciprocal linking is now an “official no-no” when it comes to Google’s ranking algorithm.

Last night’s show, “The Pulse,” included a guest speaker who is very well acquainted with linking: Eric Ward. Eric gave some great insight into his opinions on the subject of reciprocal linking and link building best practices.

Eric was welcomed to the show by Co-Hosts Barry Schwartz, Ben Pfeiffer, and me. He introduced himself and his history (find out more about his link building history), and discussed his approach to building links, which he feels is a bit different from the “norm” these days. He understands the fear of reciprocal linking that is being raised by some of Google’s recent statements. Yet he feels that if there is concern, then maybe the worried party should question why they built the reciprocal links to begin with.

The trouble, Eric feels, is when people enter into a reciprocal link engagement with the initial intent to fool or game the Google algorithm in order to improve the rank of the site. He feels that this was never the original intent of reciprocal linking, when the practice started before Google even existed. The key point, however, is that it is difficult for the search engines to determine intent in a reciprocal relationship without looking further.

Eric surmises that the algorithms need more than just the presence of a few reciprocal (A>B>A) links in order to determine intent. He gave some examples of ways that search engines may be able to look for what he calls “signals of intent,” such as looking for further links and finding the presence of link farms or other sites that have been flagged in the past. If they do, then they should possibly return to look more closely at other relationships.
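To make that idea concrete, here is a toy sketch of what looking beyond the bare A>B>A pattern might mean. This is purely illustrative and in no way Google’s actual algorithm; the link graph, the `flagged` set, and both helper functions are invented for the example.

```python
# Hypothetical link graph: each site maps to the set of sites it links to.
links = {
    "a.com": {"b.com", "news.org"},
    "b.com": {"a.com", "linkfarm.biz"},
    "news.org": {"a.com"},
}

# Hypothetical set of sites flagged in the past (e.g. known link farms).
flagged = {"linkfarm.biz"}

def reciprocal_pairs(links):
    """Return each reciprocal (A>B>A) pair exactly once."""
    pairs = set()
    for a, outs in links.items():
        for b in outs:
            if a in links.get(b, set()):
                pairs.add(tuple(sorted((a, b))))
    return pairs

def suspicious(pair, links, flagged):
    """A reciprocal pair warrants a closer look only if either side
    also links into a previously flagged neighborhood -- the extra
    'signal of intent' beyond the reciprocal link itself."""
    return any(links.get(site, set()) & flagged for site in pair)

for pair in sorted(reciprocal_pairs(links)):
    label = "closer look" if suspicious(pair, links, flagged) else "ok"
    print(pair, "->", label)
```

In this toy graph both `a.com`/`b.com` and `a.com`/`news.org` are reciprocal pairs, but only the first is flagged for further review, because `b.com` also links to a known link farm; the reciprocal link alone decides nothing.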

Eric also briefly introduced what he calls the “Matt Cutts rule,” asking webmasters to consider whether they would try to get a particular link with Matt Cutts sitting on their shoulder. When asked if there is a difference between penalization and devaluation (I remembered that word today), Eric felt that it would be “foolish and reckless” of Google to penalize a site without some “pretty heavy analysis.” He notes that he has seen sites drop in rankings due to what he believes is a temporary devaluation of some links that had helped the site in the past, while Google investigates those sites further.

The last question asked was how he would pick a high quality link. He brought up an example of getting links from .edu top-level domains. He suggested putting oneself in the search engine’s position: can they trust the link even if it comes from an .edu site? If the page has links to YouTube, MySpace, and Napster, for example, it is pretty likely a student page, which may be cause for concern if a commercial link appears alongside. However, if the page is obviously part of a veterinary school, for example, the “algorithmic footprint” left by the page’s other outbound links may indicate that it is more trustworthy. Great stuff.
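Eric’s “footprint” point can be sketched as a toy heuristic. Again, this is my own illustration under invented assumptions, not anything the engines have confirmed: the `SOCIAL` set, the suffix list, and the labels are all hypothetical.

```python
# Hypothetical signals: outbound domains that suggest a personal/student
# page versus an institutional one.
SOCIAL = {"youtube.com", "myspace.com", "napster.com"}
ACADEMIC_SUFFIXES = (".edu", ".gov", ".org")

def footprint(outbound_domains):
    """Roughly label an .edu page by the company its outbound links keep."""
    social = sum(1 for d in outbound_domains if d in SOCIAL)
    academic = sum(1 for d in outbound_domains if d.endswith(ACADEMIC_SUFFIXES))
    if social > academic:
        return "likely student page"
    return "likely institutional page"

# A page linking mostly to social sites looks like a student page...
print(footprint(["youtube.com", "myspace.com", "vet.school.edu"]))
# ...while one linking to academic and professional sites looks institutional.
print(footprint(["avma.org", "vetmed.school.edu", "nih.gov"]))
```

The point of the toy is that the same .edu link is read differently depending on what else the page links to, which is exactly the footprint idea.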

Eric summarized by saying that people should understand that any link building tactic can carry potential repercussions. He also recommends that there are places a site should have links regardless of whether the engines give credit for them.

Added: I just noticed Rand’s excellent post on this subject, which also led to Danny’s equally detailed opinion.
