THE SEARCH ENGINE UPDATE
February 22, 2000 - Number 71
About The Update
The Search Engine Update is a twice-monthly update of search engine news. It is available only to those people who have subscribed to Search Engine Watch, http://searchenginewatch.com/.
Please note that long URLs may break into two lines in some mail readers. Cut and paste, should this occur.
In This Issue
+ About The Search Engine Watch site
+ Search Engine Strategies Conference
+ In Pursuit Of The Perfect Page
+ AltaVista Unveils New Search Centers, Help Files
+ MSN Search Gets Paid Placement, Sort Of
+ Subscribing/Unsubscribing Info
I've given the How The Open Directory Works page a complete update, and it would be well worth reviewing, even if you've read the newsletters regularly.
Within the public area of Search Engine Watch, the Search Spotlight page has a new feature about the most popular search terms at Lycos, including the top topics of 1999.
Links to both articles are below:
How The Open Directory Works
Forgot your password? The Finder will help you.
There's still time to sign up for the Search Engine Strategies seminar, to be held March 9th, in New York City. The one day conference features experts on search engine marketing issues and panelists from many of the major search engines, including confirmed speakers from AOL Search, Ask Jeeves, Direct Hit, FAST, LookSmart, Northern Light, The Open Directory and Snap. I'll also be doing two presentations and moderating throughout the day. If you are thinking of attending, please book now. The last conference sold out before the event day. If you wait until the last minute, you may miss out. Conference details are online at the URL below.
Search Engine Strategies New York 2000
In Pursuit Of The Perfect Page
Occasionally, I get questions about what "numbers" or "rules" should be followed to construct the perfect page for each crawler-based search engine. In other words, how many times should a term appear on a page for it to rank in the top results? How often should a term be repeated in a meta tag to attain success? How often can a term be repeated before a spam penalty ensues?
No one has these numbers, honest. Those who say they do are merely making educated guesses at reverse engineering the crawler-based search engines. You will always see exceptions to their perfect page formulas. Additionally, the twin rise of greater reliance on "off-the-page" ranking criteria and of human-compiled listings makes perfect page construction a much less worthwhile activity than in the past (see the additional articles about this at the end of this story). Those who are looking forward in the world of search engines are not worrying about "keyword densities." Instead, they are building content, building links and doing other activities that will benefit them in the future.
Similarly, I spend my time focusing on the general tips and trends that I feel will carry you through in the long term in relation to search engines. I feel this is especially important as there are other tools that stand ready to provide "perfect page" information to those who still wish to seek it. This article takes a look at some of those resources.
Before going on, let me stress again -- I am not encouraging you to go out and try to build perfect pages with these tools. In fact, it's the opposite. I want you to do everything possible to tap into your "natural" traffic rather than spend time pursuing it in a heavily "active" manner. But if you have already done the natural work, or if you feel you need to be more active and construct pages from scratch, then the tools below may prove helpful.
The clear leader in the perfect page numbers game is WebPosition. In mid-1998, the rank checking software was upgraded to incorporate doorway page building tools. It even guaranteed top placements when it launched. Sounds great -- sign me up!
Actually, the only "guarantee" that WebPosition offers is that if you aren't satisfied, you'll get your money back. The same is true for anyone who offers a guarantee in relation to search engine positioning. No one can guarantee a top ranking. They can only guarantee to refund your money or do additional work if you are not satisfied with your results.
Nevertheless, many people who use WebPosition are satisfied. In my opinion, this has less to do with any magical powers that WebPosition possesses and more to do with the fact that if you've never done any optimization work before, then practically anything you do will probably increase traffic and make you feel happy. For instance, I've had people tell me that all they did was add a term they wanted to be found for to their HTML title tags and suddenly, they gained a flock of visitors via search engines.
To me, the real genius to WebPosition is that it walks you through the process of building doorway pages from scratch. The software's "Page Generator" module has you select a search engine to target, enter the term you want to be found for, provide the name of your company or web site and add some body copy text. Hit the "Generate" button, and a doorway page will be made for that term, and for that search engine.
So now you have doorway pages -- but are they also "perfect" pages, ideally tuned for each search engine? Not at all. In fact, the pages produced for AltaVista and Inktomi are identical, and the Excite, Lycos and Northern Light pages simply omit the meta keyword tag used on the AltaVista and Inktomi pages. The Google page also omits the keywords tag but does insert a comment tag. Overall, these differences are so minimal that the Generator could have made a single page for all of these search engines.
In order to really distinguish the pages from each other, the "Page Critic" module should be used, as WebPosition itself prompts you to do. The Critic will examine your page and compare it against what it finds to be average for top ranking pages at particular search engines. For instance, you would currently be told that your page should have an overall word count of 519 to 555 words for Google, while Lycos likes an overall count of 196 to 279 words. You are also shown how your page places against various criteria, so that you can make changes and bring it closer to perfection.
But how does WebPosition know what perfection is? The company is constantly downloading top ranking pages for a variety of terms, and in particular for terms that are not especially popular. That's essential, because the more popular the term, the more likely that off-the-page criteria may be causing the page to rank well. Similarly, for popular terms, page cloaking may be used by some webmasters. There's no sense analyzing cloaked pages, because they aren't what the search engine itself saw.
This all sounds great, but the reality remains that ultimately, WebPosition is still guessing -- and there are a host of factors that can cause it to guess wrong. As the software itself advises, the numbers it provides are intended as a "starting point," rather than an exact recipe for perfection.
Also keep in mind that when WebPosition changes its knowledge base, all the work spent perfecting your pages may need to be redone. Of course, smart webmasters leave their older pages up, because sometimes a ranking change can cause these pages to score well. But my overall concern is that by continually pursuing the perfect page, you may be squandering time better spent by building new content.
Trying to get the perfect page with WebPosition is a lot of work. You have to constantly refine your basic doorway page to please its Page Critic suggestions. In contrast, PositionWeaver tries to make the process of getting the perfect page as painless as possible.
As with WebPosition, you enter some basic information, and then PositionWeaver creates doorway pages for various search engines, based on what it believes they like. Unlike WebPosition, PositionWeaver requires no second step to further tune these pages. Once generated, you have a series of very different pages for each major search engine.
The problem is that these pages are incredibly ugly. They are designed to please search engines, not human beings. In particular, the program generates body text that is repetitive and nonsensical, which is kept out of sight by pushing it to the bottom of the page. I have every reason to believe that any search engine that closely examined pages generated by this program would consider the pages spam, since they all tell me they do not like pages that are essentially gibberish.
That's a pretty strong statement, so I asked PositionWeaver's creator, David Gikandi, if he had any complaints or problems in this regard.
"Yes, the generated text does look a little messy. I haven't received any complaints from a search engine, either directly or through a client, so at this stage I don't have reason to believe that these pages will be penalized across the board," Gikandi said. "In fact, judging from the email we receive from customers detailing their success, I guess it would be safe to assume that a user who is using it legitimately, within their own topic, will be OK, while the one using it to blatantly spam the engines using unrelated keywords will be penalized if caught."
A free doorway page generator is just part of the search engine optimization toolset that consultant Bruce Clay offers via his web site. As with the previous tools I've mentioned, you enter terms, select options, then generate a page meant to be optimized for search engines. Unfortunately, you are given no guidance as to which options are best to use for which search engines, or even in general, such as how often to repeat terms in the keywords tag, which keyword density distributions are seen as helpful, and so on.
Another concern with this tool is the generic text stuck at the bottom of these pages -- it's there to help make the page look more "normal," but I suspect some search engines might consider this spamming. Of course, as with PositionWeaver, that's assuming they actually spot these pages.
The paid version of the doorway generator offers a slick "envelope" option which I've always privately termed "poor man's cloaking." It creates your doorway page code within a frames page. The user sees the main page of your web site (or whatever page you submit to the doorway generator), while the search engine reads the optimized code in the noframes area. This gives you one of the best benefits of cloaking, which is to hide the ugly doorway page text from visitors, without the disadvantage of using special software to feed spiders optimized code. However, it doesn't provide the other benefit of cloaking, which is to hide your code from the prying eyes of competitors.
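The "envelope" idea above can be sketched in a few lines. This is purely an illustration of the technique as described, not the generator's actual output; the URL and text are hypothetical, and the generated markup is a guess at the general shape of such a page.

```python
# Illustrative sketch of the "poor man's cloaking" envelope: a frameset
# page shows visitors the real site, while the noframes section carries
# the optimized doorway text that frame-blind crawlers index instead.

def build_envelope(visible_url, doorway_title, doorway_text):
    """Return an HTML frameset 'envelope' page as a string."""
    return f"""<html>
<head><title>{doorway_title}</title></head>
<frameset rows="100%,*" frameborder="no" border="0">
  <frame src="{visible_url}" name="main">
  <noframes>
    <body>
      <h1>{doorway_title}</h1>
      <p>{doorway_text}</p>
    </body>
  </noframes>
</frameset>
</html>"""

# Hypothetical example page.
page = build_envelope("http://www.example.com/",
                      "Discount Hiking Boots",
                      "We sell discount hiking boots for all terrains.")
```

Note that because the doorway code sits in plain view in the noframes area, competitors can still read it -- which is exactly the trade-off described above.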
Assuming you don't use doorway pages, you may still want to have some mechanism to automatically analyze your existing content, to see if it can be improved. To satisfy this demand, a number of page checking services have popped up over the past year.
The problem with all of these is that, once again, there are no actual rules for them to follow. It is quite possible for one checker to tell you that your page is great, while another reports serious problems. Basically, each checker is offering its own opinion, and just like humans, checkers can have different opinions about what they think is right.
For instance, I started getting messages from confused readers about a month ago after an advertiser began promoting their page checker on the Search Engine Watch home page. People would run my home page through this checker, only to wonder why "my" checker was reporting deficiencies in my own page.
Of course, it wasn't my checker -- it was the advertiser's -- and changes have since been made to make it clearer that this is an ad, not a service that Search Engine Watch is offering. Nevertheless, the confusion shows just how little you can depend on checker services as an absolute guide. Let's take a closer look.
The checker, run by ProBoost, reports that my title tag has a "serious problem," but it offers no clue as to what this could be. Maybe it is counting words and thinks there are too many. Perhaps it thinks the length of 82 characters is too long. I'm not concerned, because the search engines publish no particular word or character limits for title tags. The guidance I offer is to write a title of about 5 to 13 words that uses some of your key terms, looks attractive to humans, entices people to visit your site, and isn't misleading about your content. That's my opinion; I find it works, and so I'm not concerned about what this checker thinks.
There's also woe for my meta description tag. Another serious problem is reported, though exactly what isn't stated. Since the tag is formatted correctly, and I personally find nothing wrong with it, I'm again not worried about the checker's advice.
The checker does like my meta keywords tag, but why? The only guess I can make is that it has found that I do have the tag, and that it is formatted correctly. Great -- those are about the only firm rules that a checker can really look for. It might also try to look to see whether it feels I am being too repetitive, but any report on this would be its own opinion, not firm fact.
I especially like the analysis of my body copy. The checker is trying to guess as to what words I'm trying to be found for by looking for phrases between commas in my meta keywords tag. The problem is, commas are not required and indeed can be detrimental to search engine ranking. Why do I have them at all? Just for fun -- if anyone's looking at my meta tags and wants to copy some "secret" they expect to find there, I have commas placed randomly just to confuse them.
As a result, the checker gets confused, thinking for instance that I'm trying to be found for "listings search engine watch web site." Since it also can't find that phrase in my body copy, it reports that I don't have enough keyword matches and could suffer lower search engine ranking.
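The checker's confusion is easy to reproduce. This is a speculative sketch of the comma-splitting heuristic described above, not ProBoost's actual code; the keywords and body text here are invented for illustration.

```python
# Illustrative sketch: a checker that guesses your target phrases by
# splitting the meta keywords tag on commas, then looks for each phrase
# in the body copy. Randomly placed commas mislead it into testing
# phrases you never intended.

def guess_phrases(keywords_content):
    """Split a meta keywords value on commas into candidate phrases."""
    return [p.strip() for p in keywords_content.split(",") if p.strip()]

keywords = ("search engines, listings search engine watch web site, "
            "submission")
body = "Search Engine Watch covers search engines and submission tips."

phrases = guess_phrases(keywords)
for phrase in phrases:
    found = phrase.lower() in body.lower()
    print(f"{phrase!r}: {'found' if found else 'NOT found in body copy'}")
```

The accidental phrase "listings search engine watch web site" is never found, so a checker built this way reports a keyword deficiency that doesn't actually exist.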
Meta Tag Checkers
Meta tag checking tools are less ambitious than page checking tools but much more widespread. HitBox recently unveiled a new one, which I used to check the Search Engine Watch home page. Unlike ProBoost, HitBox's checker scored my tags a perfect 10.
HitBox told me that they are essentially looking to see if I have a title, meta description tag and meta keywords tag present. They also check to see that these are formatted correctly. Finally, they check to ensure that my lengths don't exceed general standards. For the record, most search engines that support the two major meta tags allow a description of 200 to 250 characters and a keywords element of 1000 characters. I'm assuming HitBox checks on whether tags are within these standard ranges.
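The kind of presence, format, and length check described above can be sketched briefly. This is not HitBox's code; the thresholds are the article's general figures (a 200 to 250 character description and a 1000 character keywords element at most engines), not any engine's official rule.

```python
# Minimal sketch of a meta tag sanity check using the article's general
# length figures. These limits are conventions, not official rules, and
# going over them typically means the excess is ignored, not penalized.

DESCRIPTION_LIMIT = 250
KEYWORDS_LIMIT = 1000

def check_meta(title, description, keywords):
    """Return a list of warnings; an empty list means the tags pass."""
    warnings = []
    if not title:
        warnings.append("missing title tag")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"description exceeds {DESCRIPTION_LIMIT} chars")
    if len(keywords) > KEYWORDS_LIMIT:
        warnings.append(f"keywords exceed {KEYWORDS_LIMIT} chars")
    return warnings

# Hypothetical tags for illustration.
result = check_meta("Search Engine Watch",
                    "News and tips about search engines.",
                    "search engines, submission, ranking")
print(result)  # an empty list: the tags pass
```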
Overall, the HitBox analysis isn't bad. Nevertheless, it would be even better if it reported exactly which criteria were checked. Also, it would be good if the meta tag building tool that HitBox also offers followed the same standards that the checking tool uses. The building tool limits a title to 69 characters, a number which matches no search engine limits that I know of -- and one that my perfect scoring page exceeds. The tool also encourages a description of 1000 characters, well over what the majority of search engines allow (FYI, there's no harm in going longer. The excess is just ignored).
The Bruce Clay SEOToolset also offers a meta tag checker, which tries to do more than basic syntax checking. Beyond just character length checking, it also tells you if your tags use too many words. The rules used presumably come from Clay's own experience or research. There certainly are no official rules about how many words meta tags can have.
The paid version goes beyond meta tag checking and reports on usage of header, comment, ALT and link text. I found none of this particularly useful. For instance, the comment usage on my home page was reported as adequate. Why? Simply because I had comment tags of a certain length, not because those tags contained any terms I may have been targeting. Indeed, the comment tags simply exist for ad purposes, not for any search engine optimization reasons.
In contrast, I quite liked the feature the paid version offers, which is to highlight in red any search terms that you use in your meta tags that don't also appear in your body copy. I always recommend using the meta keyword tag as a magnifying glass for terms on your page, and so this was a nice way to see any gaps.
Keyword Density Analyzers
Another tool sometimes wielded in the quest for the perfect page is the keyword density analyzer. These are tools that tell you how frequently a term is being used in relation to other words in your documents. The idea is that there is an ideal density figure that you should aim for.
Of course, as I've previously explained, there are no firm numbers that anyone can provide as to what's the perfect density. Given this, I don't recommend wasting your time trying to hit some particular target. But I do find the tool I occasionally use from GRSoftware, GRKDA, to be helpful for identifying pages to optimize.
For instance, imagine you have a web site with 30 or 40 existing pages. Using GRKDA, you can import all the files at once to be analyzed. Next, you can enter a search phrase, set the search mode to "whole phrase," and then you'll quickly be able to see which of your pages already use that search phrase with some frequency. These are the pages that you'll want to optimize for that phrase. Ensure that you use the search phrase as part of your page title and within your meta tags. Similarly, if you find that none of your pages actually use search phrases you want to be found for, start reworking your body text to include these terms.
I well understand the quest for the perfect page, and should you embark on that journey, I hope some of the tools I've mentioned will help. Nor do I mean to sound hostile to this quest or those who are making tools to help you. But please keep in mind that even armed with the right formula, your supposedly perfect page may not rank well. And if you've been spending all your time on doorways only to find little or no success, strongly consider a change of tactics. That doesn't mean abandoning doorway pages entirely. It does mean, however, to ensure you are balancing your efforts.
Enlarge your site to have real content on the terms you want to be found for. If you sell shoes, have articles about how to select different types of shoes. If you offer package holidays, provide some tourist information about your destinations. Build this "real" content and optimize it for your target terms. Then go out and link build. Find sites that are non-competitive with yours but on related topics and offer to swap links. These two activities are akin to building a house, while concentrating on doorways is similar to renting. Renting is easy and offers a lot of advantages, but at the end of the day, you don't own anything. Concentrate on building your house, and you should see traffic from search engines and other publicity venues over the long term.
Aside from the aforementioned doorway page building and critiquing components, WebPosition offers an outstanding position checking feature and can also submit pages you've generated with the program or that already exist on your site. Don't overlook the extensive help files for both the program and its Page Critic component; they contain extremely useful information. Save time and annoyance by downloading them in PDF format, rather than viewing each subject individually online. You'll find a URL to do this by going to Help | Contents, then selecting the first link, "How can I get a printed manual." The price is US $149 or $349, depending on the version you select. Those who are really into perfect page stats will also need to purchase a subscription to keep the software's knowledge base updated. The first 90 days are free; after that, an extra year of updates costs $99. Overall, this tool is a must for anyone serious about search engine marketing.
Want an at-a-glance look at what PositionWeaver feels are perfect page stats for each search engine? Go to the Advanced tab, select Enter Custom Text Statistics, then choose Step 2 and Step 3. You'll see the default text lengths and frequencies that PositionWeaver thinks each search engine likes. The price is $70 or $113, depending on the version you select.
Both the doorway page generator and the meta tag checker / keyword density analyzer can be found here. A weakness of the keyword density analyzer is that you can't specify particular terms to check in either the free or fee-based version.
Engenius is another tool that tries to create perfect pages for you. Like WebPosition, there's a "Doorway Pages" module that lets you build pages to target different search engines. Also like WebPosition, there is an analysis feature to help you refine your pages. That's where the similarities end. Engenius provides nowhere near the type of hard numbers about each search engine that WebPosition aims to deliver. Its keyword density and relevance tools do provide statistics about documents you've created, but there's no real specific advice about how to make use of these numbers. The price is $79 or $210, depending on the version you select.
Beyond checking your meta tags, the HitBox Doctor will also check your HTML syntax, spelling, load times and look for broken links.
You can try the checker that doesn't like my home page from this URL.
You can download the keyword density analyzer I mentioned from here. Price is $99.
Enter a URL, and this free web-based service will report back on the number of times and percentages various keywords appear within it. Unfortunately, as you can't specify actual phrases, I find its utility limited.
Specify a keyword or search phrase, and this free service will show you how often it appears on any two web pages that you select. On the results page, selecting the "Search Engine Profiles" option and rerunning the analysis shows you the estimated percentage for each major search engine, rather than for a generic search engine.
Software Blast: Web Promotion Tools
Still want more tools? Here's a list of packages. FYI, I intend to take a closer look at submission tools in the near future.
More About Meta Tags
Should you use commas? How many keywords is too many? How long can meta tags be? Answers to some of the more technical questions about using meta tags, along with details for each search engine.
Tapping Into Natural Traffic
A new article that I've added that focuses on a very few key essentials to keep in mind, especially if you are feeling overwhelmed by search engines. Especially suitable for those who are designers or site producers, rather than web marketers.
Search Engine Talk
Provides links to mailing lists, discussion areas and other venues about search engines, where you may get more advice on what seems to work for particular search engines. But keep in mind that many things are posted as fact, when in reality they are only speculation.
More About Doorway Pages
Basic guidelines to consider when creating doorway pages, in order to avoid spamming problems. Also links to a page that discusses issues with cloaking web pages.
Directories Power On
The Search Engine Update, Oct. 19, 1999
More about how results from human-powered directories are pushing crawler-based results to the center stage, which has a serious impact for those dependent on doorway pages.
Getting Away From Words-On-The-Page Relevancy
The Search Engine Report, March 3, 1999
An earlier article that explains a bit more about what I mean by "off-the-page" criteria being used more heavily than in the past.
AltaVista Unveils New Search Centers, Help Files
AltaVista has added a new MP3/Audio "search center" to the others that appear at the top of the search box on its home page. Additionally, the company has released a ton of new help material especially aimed at helping webmasters better understand how their sites are indexed. This can be found in the new Advanced Search area, which is being beta tested. Finally, I've been getting reports from a few webmasters who have had pages go missing from AltaVista, then been told "Too many URLs at that site have been submitted today" when they try to resubmit. I have a question outstanding to AltaVista on this issue, and I'll follow up on it and the other changes in the next newsletter.
Advanced Search Beta
MSN Search Gets Paid Placement, Sort Of
MSN Search is beta testing keyword-based text links that appear in the left-hand column of its search results page. Placement is by bid. More information can be found below.
bCentral / MSN Search Keyword Program
How do I unsubscribe?
+ Follow the instructions at the very end of this email.
How do I subscribe?
+ The Search Engine Update is only available to paid subscribers of the Search Engine Watch web site. If you are not a subscriber and somehow are receiving a copy of the newsletter, learn how to subscribe at: http://searchenginewatch.com/about/subscribe.html
How do I see past issues?
+ Follow the links at:
Is there an HTML version?
+ Yes, but not via email. View it online at:
How do I change my address?
+ Send a message to [email protected]
I need human help with my subscription!
+ Send a message to [email protected]. DO NOT send messages regarding list management or site subscription issues to Danny Sullivan. He does not deal with these directly.
I have feedback about an article!
+ I'd love to hear it. Use the form at
This newsletter is Copyright (c) internet.com corp., 2000