The Position Checking Wars


One of the most memorable moments at last month's Search Engine Strategies conference came during the "Meet The Search Engines" panel, when Google was asked about its blocking of the WebPosition Gold software. Should it be avoided? Yes, came the reply from Google -- it and other automated rank checkers should be shunned.

There was an audible collective gasp from the audience, and one person in the front exclaimed something to the effect of, "What are we to do?" At that point, Google backpedaled. Rank checking tools cost it resources, because they burden its servers without providing any compensation. However, the company also recognizes that people have a genuine need to perform rank checking. Thus, those using rank checking software in moderation are not likely to be blocked, and the company will also consider ways it might provide rank checking directly.

Several people have asked me what I think the solution is, so I'll share my thoughts. First and foremost, I recommend concentrating on log analysis rather than rank checking. Log analysis shows you exactly how people reached your site, rather than how you rank for terms you are only guessing they are interested in.

Log analysis also gives you a better long-term perspective. A program such as WebTrends, for instance, can show you the sum total of traffic you receive from any particular search engine. Focusing on changes to that sum total is better than fixating on changes to individual rankings, because you might discover that even though you've lost some key rankings, the impact on overall traffic is negligible.
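If you don't have a package such as WebTrends handy, you can get the same picture straight from your server logs: pull the referrer out of each log entry, note which search engine it came from, and tally the search terms carried in the referring URL. Here's a minimal sketch of that idea, assuming a combined-format log file called access.log; the engine hostnames and query parameters listed are illustrative, not exhaustive.

```python
# Minimal sketch: tally search engine referrals from a web server access log.
# Assumptions: a combined-format log named "access.log", and an illustrative
# (not exhaustive) list of engine hostnames and their query-string parameters.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

ENGINES = {
    "google.": "q",     # hostname fragment -> parameter carrying the search terms
    "altavista.": "q",
    "yahoo.": "p",
}

# In the combined log format, the referrer is the second-to-last quoted field.
referrer_re = re.compile(r'"(?P<ref>[^"]*)" "[^"]*"\s*$')

traffic = Counter()   # visits per engine
terms = Counter()     # visits per (engine, search phrase)

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        match = referrer_re.search(line)
        if not match:
            continue
        ref = urlparse(match.group("ref"))
        for fragment, param in ENGINES.items():
            if fragment in ref.netloc:
                traffic[fragment] += 1
                phrase = parse_qs(ref.query).get(param, [""])[0].strip().lower()
                if phrase:
                    terms[(fragment, phrase)] += 1
                break

for engine, visits in traffic.most_common():
    print(f"{engine:<12} {visits} visits")
for (engine, phrase), visits in terms.most_common(20):
    print(f"{engine:<12} {phrase!r}: {visits}")
```

The output gives you both the sum total per engine and the actual phrases people used, which is exactly the information a ranking report only guesses at.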

Of course, rank checking isn't going to go away. There are still large corporations which run reports in order to satisfy high-level executives that their sites can be found in the top results at search engines for brand names. There's certainly the larger market of web marketers who rely on ranking reports to measure the effectiveness of their work. Search engine optimization companies working on behalf of others may also find it difficult or impossible to get access to log files, so rank checking becomes essential.

Ideally, the search engines themselves would provide these reports, probably for a fee. The advantage to everyone is that terms of interest could be checked only once, which would reduce the skewing automated programs can cause in the popularity of terms. For example, is "sex" really such a popular search term, or are there instead 100,000 different companies trying to rank well for sex that run reports on the term several times per day, thus inflating its popularity?

In a perfect world, the search engines would know all the terms people want to check. They'd check each of those terms once per day, then send the results en masse to everyone who wanted to know where they stood for those terms.
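To make the efficiency argument concrete, here's a toy sketch of that "check once, report to many" idea. Everything in it is hypothetical -- the point is simply that each term gets queried a single time, no matter how many subscribers want a report on it.

```python
# Toy sketch of the "check once, report to many" idea. Every name here is
# hypothetical, and fetch_top_results() stands in for a single daily lookup
# against the engine's own index -- one query per term, however many subscribers.
from collections import defaultdict

subscriptions = defaultdict(set)   # term -> domains whose owners want a report
subscriptions["widgets"].update({"example.com", "rival.example"})
subscriptions["blue widgets"].add("example.com")

def fetch_top_results(term):
    """Stand-in for the engine's one daily check of this term."""
    return ["http://rival.example/widgets.html", "http://example.com/products/widgets.html"]

daily_results = {term: fetch_top_results(term) for term in subscriptions}   # one query per term

for term, results in daily_results.items():
    for domain in sorted(subscriptions[term]):
        rank = next((i for i, url in enumerate(results, 1) if domain in url), None)
        print(f"{domain}: {term!r} -> position {rank}")
```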

The only problem here is that it is inefficient for site owners to have to work with each search engine individually. That's where makers of rank checking services have an opportunity. If they cut deals with each of the different search engines, they could provide unified reports to their customers, while the search engines earn a portion of the revenues.

As a result, everyone should be happy -- except maybe for the web marketers. That's because the days of "buy it once, rank check forever, for free" would likely end. Instead, web marketers would probably have to pay an annual fee or a fee based on volume of queries.

Of course, the alternative is to see rank checking services possibly become extinct if search engines decide to act aggressively against them. Given that alternative, most web marketers would probably decide that paying a reasonable fee is worthwhile.

I'm planning to revisit this issue formally in the near future, so you can get a fuller account of the issues involved directly from both the search engines and the makers of rank checking services. For now, the main point, as I've written before, is that only those who run a large number of rank checking reports every day are likely to find their software blocked from hitting Google. If you limit your reports, you'll probably be OK.

There's also a related concern. It's one thing to have your rank checking software stopped. However, if you run a position report, are you then likely to find your web site itself removed from a search engine's index? After all, several people have written to me noting that the WebPosition.com site has been excluded from both Google and AltaVista. I find that particularly ironic, because it allows the many mirror sites run by WebPosition affiliates to rise to the top, rather than the official WebPosition site itself.

Most people needn't worry that their sites will be banned or demoted because they rank check. I've certainly never had a report from anyone who found running a rank check caused their site to be banned. Moreover, it would be especially hard for search engines to implement such penalties fairly. That's because anyone could run a report for their competitor's site, so innocent parties could be harmed.

To be absolutely safe, I would never include a company name or URL among the terms I was checking. Remember, it is only the terms themselves that are sent to search engines.

For instance, when you send WebPosition on a "mission," it asks you to provide the domain names you want checked. These domain names are only known to WebPosition itself. It does not send them to the various search engines it checks. Instead, it only sends the keyword information that you provide. When the answers for those keywords come back, then WebPosition scans the matches internally, without the search engines' involvement, to create a position report. Given this, as long as the keywords you specify contain no company names, you've passed along no identifying information to link the report to your site.
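To illustrate the mechanics, here's a rough sketch of that general approach: the keyword is the only thing sent over the wire, and the scan for your domain happens locally on whatever results come back. The results URL, parameter name and link pattern are illustrative assumptions -- this is not any particular engine's real interface, and it certainly isn't WebPosition's actual code.

```python
# Rough illustration of the approach described above: only the keyword is sent
# to the search engine, and the scan for your own domain happens locally on the
# results that come back. The results URL, parameter and link pattern below are
# illustrative assumptions, not any real engine's interface or WebPosition's code.
import re
import urllib.request
from urllib.parse import quote_plus

def check_position(keyword, domain, results_url, max_results=50):
    """Fetch one results page for `keyword` and look for `domain` in the links."""
    url = results_url + quote_plus(keyword)        # only the keyword travels over the wire
    with urllib.request.urlopen(url) as response:
        page = response.read().decode("utf-8", errors="replace")
    links = re.findall(r'href="(https?://[^"]+)"', page)
    for position, link in enumerate(links[:max_results], start=1):
        if domain in link:                         # the comparison happens on our side
            return position
    return None                                    # not found among the results scanned

# Hypothetical usage: the engine never learns which domain we care about.
print(check_position("widget reviews", "example.com", "http://engine.example/search?q="))
```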

The only exception is if you access the Internet from a static IP address that can be linked to your company. In that case, the search engine might guess at your identity. However, it is only likely to bother doing so for anyone who runs a large number of reports. If you run reports sporadically, you aren't making yourself a target. Also, you definitely don't want to make use of any URL verification features. These are specifically designed to run queries to find your pages on search engines, thus identifying them to the search engines themselves.

So far, I've stressed that using the position checking features of programs such as WebPosition isn't likely to cause problems. It's a completely different story if you use the doorway page creation features that some position checkers also offer. If a search engine can identify structures that mark a page as having been created by a doorway page template, it may choose to exclude those pages. This happened in the past with WebPosition at AltaVista, but I haven't had any reports of similar problems for some time. Basically, the main advice is this: the more distinctive and rich in content your pages are, the less likely they are to seem like spam.

And what exactly is spam? It's a question that always comes up at the Search Engine Strategies conference, and AltaVista gave an answer at the last one that all the search engines would agree with and which site owners should take to heart: "Spam is defined more by intent than any specific technique used."

In other words, rather than being overly worried about the rules governing the use of invisible text, text cloaked with cascading style sheets, text hidden in form areas, doorway pages and so on, search engines are far more concerned with why someone does something on a web page than with exactly what they do.

For instance, if you had an all-graphical web page and decided to give it some textual content by adding a description that wasn't visible because the text color matched the page background color, some search engines might technically see this as spamming. However, they'd be far more likely to allow it assuming you were fairly describing the page's content. Your intent was honest, so the page would probably be seen as honest.

In contrast, feed a search engine hundreds of near-duplicate pages all aimed at getting a top ranking for a particular term, and the search engine is likely to view your intent as hostile and take action against you as a spammer.

More on the issue of the position checking wars and WebPosition's past problems with doorway page identification can be found in the resources below. If you are interested in these issues, all are well worth reading.

Thoughts from the Search Engine Strategies Conference in Boston in March 2001
Online Web Training, April 2001
http://www.onlinewebtraining.com/information/sec_boston2001.html

Robin Nobles provides an excellent summary of the position checking wars, along with an "open letter" to all parties involved proposing solutions. She also provides further search engine optimization tips from the last Search Engine Strategies conference.

I-Search: WebPosition Gold Special Issue
I-Search, March 30, 2001
http://list.adventive.com/SCRIPTS/WA.EXE?A1=ind0103&L=i-search

This issue about WebPosition focuses primarily on complaints: that the program lacks coverage of non-US search engines, that it doesn't make doorway page creation as easy as it sounds, and that its doorway page templates got someone banned. However, defenders of the program also offer it support.

I-Search #315: WebPosition Responds
I-Search, April 2, 2001
http://list.adventive.com/SCRIPTS/WA.EXE?A1=ind0104&L=i-search

Brent Winters, president of First Place Software, which makes WebPosition Gold, constructively responds to complaints, criticisms and concerns about his product. He also touches specifically on the past problem of AltaVista identifying doorway pages created by WebPosition, which he says is no longer an issue.

Tapping Into Natural Traffic
http://searchenginewatch.com/subscribers/more/natural.html

In the special WebPosition Gold issue of I-Search above, you'll see many comments from people who are frustrated by their attempts to gain good rankings with doorway pages. For over five years now, I've repeatedly said: concentrate on building good content, and you'll be rewarded with traffic from search engines and elsewhere. If you've been feeling lost in the doorway page morass, this article revisits the simple, basic things you should be doing to the existing pages in your web site to make them search engine friendly.

Keywords Used To Find Your Web Site
http://searchenginewatch.com/subscribers/more/keywords.html

How to analyze your log files to determine what words people are using to find your site.

