Desperately Seeking Search Engine Marketing Standards

It seems that every so often, someone makes a new push to suggest that the search engine marketing industry needs to establish standards of conduct. The idea usually dies away from a lack of support. However, a new effort is underway from several different parties that might have more luck. They'll need that luck, because the barriers to establishing standards remain substantial.

The chief challenge is that there is no definitive guidebook that "officially" defines search engine spam. That's because each search engine is an independent entity that decides for itself what it considers to be spam. For example, Google outright hates cloaking and will penalize for it. In contrast, both AltaVista and Inktomi allow it in certain circumstances.

There is a catalog of specific elements that are often associated with spamming. The use of invisible text, tiny text, excessive repetition, redirection, mirror sites and artificial links are all examples of things that one or more of the major search engines may penalize for. Unfortunately, they don't tell you exactly "how much is too much," which always prompts concerns about "accidental spam."

For example, if you decide to publish some of your HTML copy in font size 2, is that considered tiny text and likely to get you a spam penalty? Probably not, but the search engines don't tell you definitively what to avoid.
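For illustration only, here is a rough, hypothetical snippet (the wording and sizes are invented) showing the difference between small-but-legitimate copy and the kind of keyword-stuffed tiny text the engines actually worry about:

   <!-- Small but legitimate copy: visible and written for human readers -->
   <font size="2">Prices do not include shipping or applicable sales tax.</font>

   <!-- The sort of "tiny text" that can draw a penalty: barely readable
        size 1 type stuffed with repeated keywords -->
   <font size="1">wrenches cheap wrenches discount wrenches wrench sale wrenches</font>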

As I've written many times before, I don't expect we'll ever see such a rule book emerge. When I've talked to search engines on this topic, they generally come back to the issue that the more specific they are about what not to do, the more people push right up to the line or gain clues to other things that they can do to spam, which are yet "undefined."

That's the key thing people should take away. We have the laundry list of "bad" items now mainly because they were popular search engine optimization tactics that people overdid. However, it's not a complete list, nor will there likely ever be one, because search engines are always seeing new tricks or combinations of tactics that they feel are manipulative.

Even if the search engines won't define rules, perhaps the search engine marketing industry itself can. Indeed, the pressure to have some type of standards that marketers can say they adhere to seems to be rising, probably as a means for SEMs to set themselves apart from the crowd.

One example is the recently posted "Search Engine Optimization Code of Ethics," from long-time search engine marketer Bruce Clay. Most of the code consists of fairly common-sense items that few would disagree with: don't violate laws; don't violate published spam guidelines from search engines. The main bone of contention for some is that the code essentially says that those following it will not cloak. The term "cloaking" itself is not used, but the description of falsely representing a web site certainly covers it.

An associated "Emerging Standards" page from Clay makes clearer his opinion that cloaking is a "bad practice." He also defines some examples of link spam that he believes are wrong, as well as general types of page element spamming, though he doesn't itemize specifics.

Most interestingly, he considers "machine generated" doorway pages to be bad if they land someone on a page unrelated to the page's suggested topic. In other words, a page that purports to be about wrenches would be bad if it landed you at a porn site but OK if you ultimately ended up at a page about wrenches.

Machine generated or not, any type of "doorway" style page that misleads a user about the content they'll receive is pretty much one of the major no-go areas with search engines. They simply do not like anything that's misleading, and they don't care how it is done.

More specifically, search engines generally dislike "gibberish" pages, which are commonly generated by some automated tools. These doorway pages arrange words in an order that makes no sense to human readers but that the authors hope will please search engines.

This is changing with paid inclusion, however. With the programs offered by both Inktomi and AltaVista, such gibberish pages would be allowed, as long as the pages resolved somehow to an ultimate destination about the terms they were targeting. Cloaking would be a primary mechanism for hiding the gibberish and doing the redirection.
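As a rough sketch of the mechanics being described (the page, copy and URL below are hypothetical), a machine-generated doorway page might pair gibberish copy with a redirect, here a simple meta refresh, so a human never actually reads it; cloaking is another way to accomplish the same hand-off:

   <!-- Hypothetical machine-generated doorway page: gibberish for the spider,
        with an instant meta refresh so visitors land on the real page instead -->
   <html>
   <head>
   <title>torque wrench wrenches buy torque wrenches</title>
   <meta http-equiv="refresh" content="0; url=http://www.example.com/wrenches.html">
   </head>
   <body>
   wrench torque wrench buy wrenches discount torque wrenches sale online wrenches
   </body>
   </html>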

Search engine optimization company WebSeed offers its own "Search Engine Promotion Code of Ethics." As with Clay's guidelines, most of this is either common-sense things to avoid or a list of commonly accepted spam tactics that shouldn't be followed.

One point of contention might be the allowance of limited mirror sites. Similarly, the idea that "popularity-boosting and hyperlink-tag strategies are acceptable as long as they are used to promote a content-rich, highly-relevant web page" sounds like the code is saying that creating some artificial link structures is fine, if you think a page is really good. And as with Clay's guidelines, cloaking is seen as a no-no.

From e-Brand Management, which produces the Search Mechanics optimization tool, two recent white papers try to help webmasters understand whether they are engaging in search engine spam by examining their mindset, rather than specific actions. Nevertheless, the papers still end up getting bogged down in specific techniques considered bad.

One of the white papers, "The Classification of Search Engine Spam," succinctly states that search engine spam is "any attempt to artificially influence a search engine's ability to calculate relevancy."

No doubt many readers will immediately find this statement absurdly broad. For instance, even the search engines themselves advise site owners to take care when crafting page titles and body copy, and encourage them to build links. Technically, these are all "artificial" methods meant to influence search engines.

The author of the paper, e-Brand Management's chief technology officer Alan Perkins, anticipates this concern and immediately qualifies the statement: such actions are NOT spam if they are "anything that would still be done if search engines did not exist, or anything that a search engine has given written permission to do."

The key point Perkins is trying to get across in the paper is that site owners shouldn't be going to extreme methods to optimize pages for search engines.

"Suppose search engines did not exist. Would the technique still be used in the same way?" Perkins asks. Many would readily agree with some of his examples, though not all of them.

For example, inserting keywords into ALT tags, rather than a proper description of the associated image, is seen by Perkins as spam, because it abuses what the ALT tag was originally intended for.
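The distinction is easy to see in markup. A hypothetical example of each (the file name and wording are invented):

   <!-- What the ALT tag was designed for: describing the image -->
   <img src="torque-wrench.jpg" alt="1/2-inch drive torque wrench with rubber grip">

   <!-- What Perkins would class as spam: the ALT tag used purely as a keyword container -->
   <img src="torque-wrench.jpg" alt="wrench wrenches cheap wrenches buy wrenches online">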

However, would someone be spamming if they used an H1 tag, modified with font tags, to highlight the headline of their body copy? This is often done because some search engines may give a slight boost to H1 copy.

Because unstyled H1 text looks so oversized, wrapping it in font tags makes the headline more visually pleasing. This is something you'd do just for the search engines, but done in moderation -- and especially when used around a page's actual headline -- many would not consider it spam.
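In markup terms, that's roughly the hypothetical pattern below, where the font tag exists only to tame the H1's default appearance:

   <!-- An H1 headline, kept for the possible ranking boost, restyled with a
        font tag so it doesn't look so oversized to human visitors -->
   <h1><font face="Arial" size="3">Choosing the Right Torque Wrench</font></h1>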

Once again, cloaking gets called out as spam -- though what exactly cloaking is gets confused by the many definitions that are presented: agent-based delivery vs. agent-based spam vs. IP delivery vs. IP cloaking.

What I took away is this: if you are using a system of any type to deliver content to humans that is different than what a search engine spider sees, that's cloaking and considered by Perkins to be spam.

Another effort on the standards front is SeoPros.org, a new organization backed by long-time search engine marketer Terry Van Horne ("Webmaster T"). The group has about 100 search engine marketing individuals or companies registered as members. The aim is to promote "best practices" that search engine marketers should follow. An initial list of guidelines is planned to be released later this month.

The organization also seeks to compile a public database of search engine spammers, so that search engines and consumers can easily spot companies that are violating the organization's guidelines. Alongside this, it also offers a list of search engine marketing companies that will presumably follow the guidelines, once established.

A different organization, the World Association of Internet Marketers, also counts helping to establish search engine standards among its aims. The group met in September in the United Kingdom, and standards-related discussions are ongoing in its members forum.

The push for standards and ethics that don't try to "manipulate" search engines also came up in a thread at Webmaster World, where site owner Brett Tabke rejected the suggestion that some search engine marketers don't try to influence search engines.

"The adjustment of html page entities and content for the express purpose of ranking higher on search engines, eg: search engine optimization, is the manipulation of search engine rankings systems....I bring this up, because I've been reading a great deal lately from SEO 'experts' who are very confused about what we do for a living."

However, several follow-up posts by others in the thread still tried to push the idea that there is "good" optimization that helps search engines versus "bad" optimization that manipulates or misleads them.

As you can see, there's a variety of opinions about what constitutes spam. In particular, there's a schism between those who practice what I've always termed "natural" optimization -- generally, working with the existing pages at a web site to make them "search engine friendly," another term I coined ages ago -- and those who prefer to create "doorway" pages, which are usually designed to please search engines rather than humans.

To further complicate matters, it's not a perfect division between the "naturalists" and "doorwayists" (or would that be "doorwayers?"). At one end of the spectrum (let's say the left), a pure naturalist would tend to say that you should never make a page for purposes other than human visitors and that you should really only make basic changes to that page: altering the title tag, paying attention to meta tags and making body copy improvements.

A bit further to the right of the pure naturalist, you might find someone else who considers themselves a naturalist. However, this person might think it is perfectly fine to make light changes to an existing page and to use style sheets so that human visitors get a nicely designed version while crawlers tend to see a more ordinary textual experience.
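As a hypothetical sketch of that approach (the file names and copy are invented), the presentation gets pushed into an external style sheet, leaving the HTML source itself as fairly plain text for a crawler to read:

   <!-- Presentation lives in an external file; the page source stays simple text -->
   <head>
   <title>Torque Wrenches: A Buying Guide</title>
   <link rel="stylesheet" type="text/css" href="site.css">
   </head>
   <body>
   <h1>Torque Wrenches: A Buying Guide</h1>
   <p>How to choose a torque wrench, with advice on drive sizes, accuracy and calibration.</p>
   </body>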

Around the middle of the spectrum, you might have someone who finds it perfectly acceptable to create what are sometimes called "informational pages," a term coined by search engine marketers Detlev Johnson, Marshall Simmonds and Shari Thurow.

Informational pages are designed to serve both humans and crawlers, because they contain valuable information but were also created to please spiders, typically in the sense that they feature an especially search engine friendly design.

Moving still further right, you might get someone who decides to create a doorway page with no real human value at all. The page, if viewed by a human being, might not make any sense. However, through the use of cloaking or other methods, the searcher might be delivered to a relevant page for the search they performed.

Finally, I suppose the complete opposite of the pure naturalist would be a pure doorwayist. This is someone who will create pages, perhaps even taking them from other sites, and use them to generate traffic regardless of whether the pages are in any way relevant to the terms they appear for.

These are just some of the many points on the spectrum, but you can see how easy it is for different people to disagree. For example, a more naturalist person might hate the idea of cloaked doorway pages, but someone who uses them might argue that the naturalist's use of style sheets or informational pages is just another form of delivering tailored content.

While I doubt we'll see agreement in some areas, the desire for some type of standards is laudable. How can the search engines help? Certainly by providing as much information as they can, without feeling they are giving too much away.

Another idea is to offer tools that let people automatically check whether a page has suffered a spam penalty. That would help those who are concerned about having "accidentally" spammed. Another thought is to provide a way for the general public to find out if a search engine marketing firm is known to have caused problems for a particular search engine, so consumers considering firms can be more aware.

Bruce Clay's Search Engine Optimization Code of Ethics
http://www.bruceclay.com/web_ethics.htm

Bruce Clay's Emerging Standards Page
http://www.bruceclay.com/EmergingStandards.htm

WebSeed's Search Engine Promotion Code of Ethics
http://www.webseed.com/page1006.html

e-Brand Management: The Classification of Search Engine Spam
http://www.ebrandmanagement.com/whitepapers/spam-classification/

Relevancy, Spam, Technology, Cloaking and Ranking: The Ethical Guide
http://www.ebrandmanagement.com/whitepapers/spam1.htm

SeoPros.org
http://www.seopros.org

SEO Newsnet
http://www.seonews.net

There's nothing here yet, but later this month, the new site is supposed to serve as a resource for search engine marketers looking to explain why their work is of benefit to clients.

WAIM Cambridge Meeting, September 2001
http://www.webpr.co.uk/waim/moredetails.asp?articleid=16

Covers what was discussed at a recent World Association of Internet Marketers meeting.

WAIM Forums
http://www.internetmarketingforums.com

Discussions among WAIM members are here. You'll need to create an account, then wait until authorized, in order to view threads.

Webmaster World: Search Engine Optimization Defined
http://www.webmasterworld.com/forum5/875.htm

I-Search Spam Discussion
I-Search, Oct. 2001
http://list.adventive.com/SCRIPTS/WA.EXE?A1=ind0110&L=i-search

A discussion of standards and "what is spam" also happened in the I-Search mailing list, last month. Begin reading with issue 372, on Oct. 23.

Repositioning the Doorway: Part 1
ClickZ, Jan. 17, 2001
http://www.clickz.com/search/opt/article.php/835501

A call for standards issued earlier this year. The article emphasizes that within the search engine optimization industry, the definitions of terms such as "doorway pages" vary, depending on who you ask.

What is the difference between a doorway page and an information(al) page?
http://www.grantasticdesigns.com/seofaqs.html#q5

Glad you asked. Shari Thurow offers a brief explanation of the concept she codeveloped.

Locked Doorways?
Top Site Listings, Oct. 10, 2001
http://www.topsitelistings.com/ag101001.htm

I call it "natural" optimization while Andrew Gerhardt of search engine marketing firm Orbidex calls it "client side" optimization. But what Gerhardt explains is the same thing: if you build up decent content within your web site, you'll likely enjoy long-term, low maintenance success with search engines.

Promoters Call For Certification
The Search Engine Report, August 4, 1998
http://www.searchenginewatch.com/sereport/98/08-certification.html

Covers the first major push for search engine marketing standards, back in 1998. The same issues described here still remain true today. By the way, the forum link at the end no longer works.

Tapping Into Natural Traffic
http://searchenginewatch.com/subscribers/more/natural.html

Fast, simple things that you can and should do as part of your site building process, to ensure you are friendly to search engines.

