Three years ago Google introduced the rel="nofollow" attribute to the Web. Under a title benign to anyone who has never run a blog and felt the headache, "Preventing comment spam," Google promised bloggers that comment spam would be greatly devalued by this simple bit of markup. The reason: this little bugger would effectively kill the SEO value of any link "protected" by it.
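For readers who haven't seen it in the wild, the markup in question is a single attribute on an ordinary link (the URL here is hypothetical):

```html
<!-- An ordinary link: passes PageRank and anchor-text value -->
<a href="http://example.com/">great widgets</a>

<!-- A "link condom": rel="nofollow" tells search engines not to
     count this link in their ranking calculations -->
<a href="http://example.com/" rel="nofollow">great widgets</a>
```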
The new nofollow attribute quickly became known as a "link condom," and experts at conferences around the world began hailing it for its simple yet powerful purpose. One of the people working on the comment spam problem within Google was a then-unknown engineer named Matt Cutts, now Google's famous, unofficial chief Web spam fighter. (Ironically, Google thinks "spamfighter" should be one word, yet when you search for it, the Matt Cutts-related results still outrank spamfighter.com -- ah, the power of the SEO community is strong.)
Matt's "next little ditty" came in September 2005: a post on his personal blog that sentenced link sellers to eternal damnation. He laid out a compelling argument that buying links solely for PageRank value was polluting the Web. Publishers began applying nofollow to the links they sold, to show Google that the sale was not based on passing SEO value from the publisher's site to the buyer's site.
The Nofollow Evolution
While the paid links debate raged on, the nofollow attribute was quietly being put to uses beyond blocking outbound links. Before we go there, let's check in on early 2007, when Sebastian "X" asked whether we really need this attribute at all. Sebastian succinctly highlighted the tag's apparent shortcomings and expressed the frustration, felt by many, that it was worthless when it came to stopping blog spam. Beyond that, he said, there was confusion as to its real purpose.
Not surprisingly, the "link condom" attribute -- much like the occasional cheap or damaged contraceptive device -- doesn't always work. Some feel Google still follows nofollow links and, in some cases, might not even strip out the "juice": depending on the trust of the landing domain, it may still assign value based on the anchor text.
Tests I've tried have been statistically inconclusive so far. I've noticed nofollow links at Wikipedia are scraped, dropping the attribute but keeping the link. (No names in order to protect the innocent.)
Spammers were still spamming, links were still being sold, and at least one war in the Middle East was still going on. Sebastian hinted at a better use for the tag -- specifically, to instruct spiders not to crawl the printer-friendly versions of pages. As it turns out, a new revolution led SEOs to turn their nofollow focus inward.
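Sebastian's suggested use can be sketched like this (the URLs are hypothetical). Note that nofollow is only a hint to devalue or skip the link; a meta robots tag on the duplicate page itself is the stronger tool for keeping it out of the index:

```html
<!-- Hint to spiders that the duplicate, printer-friendly copy
     isn't worth crawling or crediting -->
<a href="/articles/nofollow-history/print" rel="nofollow">Printer-friendly version</a>

<!-- On the printer-friendly page itself, a meta robots tag
     keeps the duplicate out of the index entirely -->
<meta name="robots" content="noindex, nofollow">
```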
The Nofollow Revolution
Eric Lander started an interesting debate over whether nofollow use triggers search engine algorithms to flag spammy sites. This theory has almost no legs to stand on. SEOs, though, certainly haven't forgotten about this little tag.
Often a home page gathers the lion's share of a site's inbound links. By adding nofollow to less important internal links, you can focus that link equity on the most important secondary and deeper pages -- benefiting internal pages that have relatively few inbound links of their own.
Some people find this practice effective. I've tested it, and I'm not completely sold yet. One commenter on Lander's post (who happens to work with me) took apparent offense at the idea that nofollow could not be used to help "craft the flow of Internal PageRank."
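This internal "sculpting" practice -- again, one my own testing hasn't conclusively validated -- might look something like this on a home page (the URLs are hypothetical):

```html
<!-- Boilerplate pages that don't need ranking help: nofollow
     keeps them from diluting the home page's link equity -->
<a href="/login" rel="nofollow">Log in</a>
<a href="/terms" rel="nofollow">Terms of service</a>

<!-- Important deep pages: left followed so they receive a
     larger share of the home page's link value -->
<a href="/products/widgets">Widgets</a>
<a href="/guides/widget-buying-guide">Widget Buying Guide</a>
```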
Bottom line: The rel="nofollow" attribute is here to stay, and a few more years of testing and a few dozen search engine algorithm updates from now, it will likely hold a far different value than it does today. A tag that started on a simple enough mission -- ridding the world of spammers -- has evolved into a tool to be used wisely and tested consistently. Please join us for discussion on this topic in the Search Engine Watch Forums.
Frank Watson Fires Back
Frank Watson: The initial intention of the nofollow tag was good -- but it was optional: bloggers could use the tag as they saw fit. The problem now is that Google is forcing people to use it.
As Michael Gray pointed out, "Google is not the government." Google isn't even the Internet police. Yet the fact that Google is the source of the largest part of Web traffic forces people to follow their edicts.
Google may even invite greater government intervention on the Web. Given Google's dominant share of Web searches and traffic, governments around the world may come to view this as a form of monopoly and start regulating Google, just as they reviewed its purchase of DoubleClick.
Will an antitrust case eventually be brought by someone who lost his source of income through a change to Google's algorithm? Let's wait and see. Great informative post, Chris!