Promoters Call For Certification


From The Search Engine Report
August 4, 1998

Principals from four major promotion and design firms have sent an open letter to the major search engines calling for the establishment of a certification program for optimization professionals.

The letter is the first such coordinated move from the web promotion community ever regarding search engine positioning issues. Those signing were from Beyond Interactive, MercurySeven, US Web and Web Ignite/AAA Internet Promotions.

The letter arose because of Infoseek's ban on pages that redirect to other pages. Firms like Web Ignite depend on this type of page for their optimization programs.

Under these programs, a client specifies the terms they would like to do well for. The optimization company creates targeted pages for the terms and submits them to the major search engines. Visitors finding the pages via a search engine click through and are automatically redirected to the client's site. They never see, or only see briefly, the target page which is hosted on the optimization company's site.

The ban on redirection hits these companies hard, because they are paid by the click, earning $0.25 or more per visitor. Without redirection, they cannot count the visitors they attract in order to be paid.
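The mechanism described above can be sketched in a few lines. This is a hypothetical illustration, not any firm's actual software: the names (`ClickTracker`, `record_and_redirect`, the client ID and URL) are invented, and the $0.25 rate is the figure quoted in the article.

```python
RATE_PER_CLICK = 0.25  # dollars billed per visitor, per the article

class ClickTracker:
    """Counts click-throughs per client so visitors can be billed."""

    def __init__(self):
        self.clicks = {}  # client id -> visitor count

    def record_and_redirect(self, client_id, target_url):
        # Count the visitor first -- this counting step is what
        # Infoseek's ban on redirect pages takes away -- then send
        # the visitor on to the client's real site.
        self.clicks[client_id] = self.clicks.get(client_id, 0) + 1
        # A standard HTTP redirect response pointing at the client's site.
        return ("302 Found", {"Location": target_url})

    def invoice(self, client_id):
        """Amount owed by a client for the visitors delivered so far."""
        return self.clicks.get(client_id, 0) * RATE_PER_CLICK

tracker = ClickTracker()
for _ in range(4):
    status, headers = tracker.record_and_redirect(
        "acme-autos", "http://www.example.com/")
print(status)                         # 302 Found
print(tracker.invoice("acme-autos"))  # 1.0
```

Without the redirect step, the visitor goes straight from the search engine to the client's site, and the counting (and therefore the billing) never happens.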

"That's the only way we can record our visitors and charge as an optimization company," said Paul Bruemmer, of Web Ignite, and one of those signing the letter.

Bruemmer believes Infoseek's ban emerged because the adult web site industry has heavily abused redirect pages by loading them with spam. In contrast, high profile optimization companies are hesitant to spam, for fear of scaring off important clients.

Infoseek product manager Jennifer Mullin hadn't yet read the letter, which just went out, but she said redirect pages were banned because of problems with them from both adult and other sites. Infoseek has no plans to change its policy, at the moment, she said.

Of course, some people may consider the entire concept of targeted web pages to be spamming, redirection or not. Search engines were originally designed to find the best pages from those "naturally" occurring on the web. The idea of manufacturing pages can seem too overt.

However, all of the major search engines allow targeted pages at the moment, even Infoseek, as long as the page contains no redirection and presents users with the same text that the Infoseek spider sees. In general, targeted pages must also not mislead visitors about the content at a site. Nor can they contain spam, such as word repetition in the body text or meta tags.

In fact, if these manufactured pages were not allowed, it would be impossible for some sites to ever be found. Good examples are sites with graphics-intensive pages, sites that use frames, or sites that are completely database driven. These sites can be invisible to the major search engines, in their current state, unless targeted pages represent them.

So there's a good reason for targeted pages, yet their very existence makes many of the common spam rules that have developed hypocritical. After all, what's the point of trying to keep a page from "inflating" its score when you readily acknowledge it has no "real" score to begin with?

These types of rules made sense when the web was primarily text-based. We had a relatively level playing field then, and the idea of penalizing a page that stuffed itself with extra terms made sense.

The situation is much more complex now. Rules have evolved and grown more complicated. For example, one of the most common questions I get is how much repetition is too much in a meta tag. Should a bookseller say "kids books, computer books, science books," for example, or are they repeating "books" too much? And if they don't use all those terms, do they reduce the chance of being found?

There is no correct answer, because the rules are unpublished and vary by search engine. The result is an incredibly confusing mess for site owners who want to know the basic things they should do to be found. They are forced to rely on guesswork, experimentation, rumor and the few bits of dependable information that search services choose to release.

Given this, it's easy to say that what's needed is a set of standard rules that everyone can follow. If we all abide by them, then we'll be back to the level playing field. Just tell us the rules, and we'll play fair.

The problem is two-fold. First, the more rules a search engine provides, the more ammunition is provided to those who want to bend the rules or break them altogether. The potential traffic payoff makes this a constant attraction, especially when it is easy to set up shop elsewhere on the web and start again if you find your domain gets banned.

Second, and more important, the idea of giving everyone rules to follow doesn't result in more relevant pages. It results in top ranked pages from people who are smarter at following the rules. Often, that corresponds with relevant pages. But the distinction is important.

Imagine a certification program with 15 companies all participating. Each of these companies has a client that wants to do well for "auto repair." Only 10 of them will make the top listings, in most places. That means the remaining five will have upset clients who are not happy with a second page listing. So they'll go back and rework their pages, albeit within the "rules," to secure a better placement. As a result, some of the other companies will lose positioning. Thus, they'll go back and rework their own pages, putting the cycle into a constant loop.

The result is not that the most "relevant" pages are being listed. Instead, the people on top will be those who are cleverest about putting words into a particular order on the crafted target pages they'll inevitably submit.

In contemplating solutions, we must recognize that all this fury over positioning is concentrated primarily on popular single and two word terms. Remove these from the equation somehow, and suddenly, many spam rules might be unnecessary. That's because it is not worth the time to spam for longer terms that don't bring in huge amounts of traffic.

Given this, it may make sense for the major search engines to consider experimenting with the GoTo model and accept paid links, especially for competitive terms.

This may sound like sacrilege, especially to professional researchers, but the reality is that paid links already exist indirectly. Sites listed for top terms are often there because they have paid an optimization company to put them there, or because they have invested significant time (and thus money) to achieve a position on their own.

It's even arguable that allowing paid links might increase relevancy. For one, the search engines would have more direct control over who's accepted. At GoTo, only sites truly relevant for terms are allowed to bid on them.
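The GoTo-style model the article proposes can be sketched simply, assuming the basic rule that advertisers bid a per-click price on a term and listings appear highest bid first. The site names and bid amounts below are invented for illustration.

```python
# Hypothetical bids on a search term: (site, dollars bid per click).
bids = {
    "auto repair": [
        ("acme-autos.example.com", 0.40),
        ("budget-brakes.example.com", 0.25),
        ("joes-garage.example.com", 0.31),
    ],
}

def ranked_listings(term):
    """Return the sites bidding on a term, highest bid first.

    In this model there is no rank to spam: position is set
    openly by the bid, not by crafting a page's text.
    """
    entries = sorted(bids.get(term, []), key=lambda pair: pair[1],
                     reverse=True)
    return [site for site, bid in entries]

print(ranked_listings("auto repair"))
# ['acme-autos.example.com', 'joes-garage.example.com',
#  'budget-brakes.example.com']
```

The point of the sketch is that position is determined by an open price rather than by hidden ranking rules, which is why the article suggests it could defuse the spam problem for competitive terms.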

Moreover, search engines now spend significant resources in combating spam. Allowing paid links might greatly defuse the spamming situation, allowing them to concentrate their resources on improving search technology.

Paid links certainly make more sense than trying to certify optimizers to follow rules that are already outdated. Even Bruemmer, who's leading the certification charge, likes the simpler idea of moving to paid links rather than certification.

"Certification is within a certain paradigm that everyone is under right now. If we could skip that paradigm and go right into a GoTo-type environment, then right on," he said. "What happened to Open Text is ancient history. An Internet month is like a year. At least 18 years have gone by," he added, speaking of the fallout from Open Text's experiment in 1996 with paid placement.

Alternatively, perhaps some new developments such as link analysis or user measurements, as described in the earlier article, will make spam rules needless. Another idea is that new forms of trusted metadata may emerge, provided by a third party. Any of these might provide a real level playing field, and one that cannot be influenced.

What do you think about the situation? Could certification or a code of standards make a difference in how things currently operate? What rules would you like? What rules do you think are outdated? Is the idea of paid links repugnant, or do you not care, as long as quality sites are listed? Visit the Search Engine Watch forum below and leave your thoughts. I and others look forward to seeing your comments.

Open Letter to Search Engines
http://www.clientdirect.com/Certification.html

Search Engine Watch Forum
http://trial.internet.com:8000/forums/Ultimate.cgi?action=intro

What Is A Bridge Page?
http://searchenginewatch.com/webmasters/bridge.html

Provides more details about different types of targeted pages, and how they are used.

GoTo Sells Positions
The Search Engine Report, March 3, 1998
http://searchenginewatch.com/sereport/9803-goto.html

The idea of paid links brings up the specter that small web sites will lose out on visitors, or that searchers will miss information they are looking for. Neither case is necessarily true, and this article about GoTo describes why in more detail.


