
Giving Links Away

There are a few ways of controlling what pages of your site share their link love. PageRank "sculpting" and siloing are two methods that use the "nofollow" attribute to control which links are counted in search engine ranking algorithms.

We always talk about getting links in this column. However, links don’t just go one way. You could, and probably should, occasionally link out to other sites.

While you have no control over who links to you, you do have control over what sites you link out to. If you find yourself giving links to less-than-reputable sites, your search engine rankings could suffer for it.

This topic came up in the Google Webmaster Help Google Group. Reading that post reminded me of something called PageRank sculpting, which in turn got me thinking about “siloing.” Both techniques use the “nofollow” attribute to control the flow of “link juice” through a site’s outgoing links.

So, let’s try to sort all this stuff out, starting at the beginning.

Give Me An “L”…Give Me An “I”…

Google’s world revolves around links. Links are really what allowed Google to change the search engine landscape. Basically, if an important site links to you, that makes you a little more important. Additionally, if you’re important and you give a link to another Web site, that makes them a little more important.

Just imagine the head cheerleader or star football player in high school. They could make or break a kid’s social life. Important Web sites do the same thing in Google. A link from a “cool” Web site is as good as that football player saying, “Sage is all right. He’s pretty cool.” Ah, if only that happened way back when…

As soon as people got wind of this phenomenon, the links race was on. People tried to get links any way they could. Google fanned the frenzy by publishing PageRank, a 0-to-10 score of every page’s importance in Google’s index, in the Google Toolbar. And so PageRank hysteria was born.

Not only did people want links to make themselves more popular, they became worried that perhaps linking out to other sites could hurt them.

Enter Siloing and PageRank Sculpting

Both boil down to the same activity: controlling which pages of your site share their link love. You do this by adding a “nofollow” attribute to any link that you don’t want the search engines to give credit to.

Take the example Matt Cutts gives. Maybe you have a friend who is a total underground, blackhat, do-no-good, evil-empire, anarchist spammer. You know he’s bad to the bone. But you have a soft place in your heart for him and you want others to check out his site. All you have to do is add a nofollow attribute to the link. With a placeholder address, it would look like this:

<!-- the address is a placeholder -->
<a href="http://spammer.example.com/" rel="nofollow">a blackhat spammer</a>

In an article on the subject, Joost de Valk, a Dutch SEO and Web developer, quotes Matt Cutts as saying, “There’s no stigma to using nofollow, even on your own internal links; for Google, nofollowed links are dropped out of our link graph; we don’t even use such links for discovery.” Joost’s article explains PageRank sculpting in more detail if you find this topic fascinating.
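In practice, sculpting just means nofollowing your low-value internal links so more of your link juice flows to the pages you actually care about. A minimal sketch, with made-up URLs, might look like this:

<!-- Placeholder URLs; only the followed link passes PageRank -->
<a href="/products/">Our products</a>
<a href="/login/" rel="nofollow">Log in</a>
<a href="/privacy-policy/" rel="nofollow">Privacy policy</a>

The login and privacy pages get nofollowed because nobody is searching for them, leaving the products page to soak up the link juice.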

Joost’s article also talks about “siloing,” pointing to an article on BruceClay.com that discusses the concept in great detail.

Siloing is the idea of linking only to other pages on your site and outside resources that relate to a specific category or topic. So, if you had a cherry ice cream cone page, you would link only to resources discussing cherry ice cream cones. Information about chocolate ice cream cones and ice cream sundaes would either not be linked to at all or would be linked to using the nofollow attribute shown above.
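For instance, a well-siloed cherry ice cream cone page might carry links like these (all URLs invented for illustration):

<!-- On-topic links stay followed; off-topic links get nofollow -->
<a href="/cherry-cones/toppings.html">Cherry cone toppings</a>
<a href="/cherry-cones/recipes.html">Cherry cone recipes</a>
<a href="/sundaes/" rel="nofollow">Ice cream sundaes</a>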

Controlling Link Flow Using Robots.txt

Finally, there’s more than one way to block link love. You can also add this information to your robots.txt file. This handy file goes in the root folder of your Web server and tells the search engines how to not spider and index all sorts of things.

It’s easy to create. On the first line you specify which spiders the instructions apply to. To address all spiders, use an asterisk:

User-agent: *

You can also target specific spiders by name, such as Google’s Googlebot, Yahoo’s Slurp, or Microsoft’s MSNBot.
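For example, to give Google and Yahoo different instructions, you could address each spider in its own section (the directory names here are just placeholders):

User-agent: Googlebot
Disallow: /no-google/

User-agent: Slurp
Disallow: /no-yahoo/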

After the User-agent line, you can disallow specific pages, one per line:

Disallow: /bad-page.html
Disallow: /terrible/yucky.html
Disallow: /really/no/good/page.html

And you can disallow spiders from crawling entire directories, like this:

Disallow: /cgi-bin/
Disallow: /yuck/
Disallow: /bad/stuff/

So, the end result of blocking all those files and directories in a completed robots.txt file would look like this:

User-agent: *
Disallow: /bad-page.html
Disallow: /terrible/yucky.html
Disallow: /really/no/good/page.html
Disallow: /cgi-bin/
Disallow: /yuck/
Disallow: /bad/stuff/

A word of caution, though. Unless you never want a spider to visit your Web site again, never set up your robots.txt file like this:

User-agent: *
Disallow: /

That tells all robots to stay out of everything.
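Conversely, an empty Disallow value tells the spiders that nothing is off limits, which is the same as having no robots.txt file at all:

User-agent: *
Disallow: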

How This Applies to Your Site

All in all, it’s probably much ado about nothing. In a post on the subject, Jaan Kanellis reports on a conference call with members of the Google team discussing this topic, in which Matt Cutts states that you should probably focus your energy on things other than PageRank sculpting.

Remember, kids: knowledge is power, but with great power comes great responsibility.
