The Search Engine Update - Sept. 22, 1997 - Number 13



About The Update

The Search Engine Update is a twice-monthly update of search engine news. It is available only to those people who have subscribed to "Search Engine Watch."

Please note that long URLs may break into two lines in some mail readers. Please cut and paste, should this occur.

Site Changes

This is a much abbreviated version of the usual mid-month update. I've been spending a lot of time integrating material from the past newsletters into the site. This will make it much easier to find particular articles or locate material you may need. I'm also building a new Fact File section for the Subscribers-Only Area, as well as general updates throughout the main site.

The time needed for this updating means that I had to keep this newsletter brief. I'm going to highlight a few search engine notes mainly of interest to webmasters. The newsletter at the beginning of October will update everyone on general developments.

Additionally, I've posted a file addendum for the offline version of the guide. This has a few key files which you can unzip and replace in your existing offline version. There are so many changes and updates underway that this is more practical than posting an entire new offline version. You can get the addendum by visiting the Subscribers-Only Area.


The final version of the Yahoo Special Report was posted on Sept. 8. If you read the preview version posted earlier, be sure to review the final version. FYI, Yahoo has revised many of its online forms in response to the report.

Search Engine Notes

Infoseek Changes Meta Tag Limits

I've gotten preliminary confirmation that Infoseek has set a limit of three repetitions of a single word within the meta keywords tag. This is going to come as a shock to many people who have mistakenly assumed the limit was seven repetitions.

Before Sept. 1996, Infoseek said that up to seven repetitions were acceptable. Then it stopped specifying exactly how many repetitions were too many.

Despite Infoseek dropping an exact number, many people continued to assume that up to seven repetitions was acceptable. Many people also assumed that this applied to all search engines, not just Infoseek.

This is not the case. Each search engine is different, and none of them post public limits -- not even Infoseek. Its new three repetition limit was confirmed via its support staff but is not posted on its help pages.

Infoseek also reported that any form of a word will count in the repetition totals. So if you had a tag like this:

swim, swimming, swimmers, swims

technically the form "swim" is used four times and the tag would be over the limit.
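For reference, here is how such a keywords list would appear within a page. This is a hypothetical example, not an actual tag reported by Infoseek; by the counting described above, the four forms of "swim" would put it over a three repetition limit:

```html
<HTML>
<HEAD>
<TITLE>Swim School</TITLE>
<!-- Hypothetical example: "swim," "swimming," "swimmers" and "swims"
     are all forms of "swim," so Infoseek would reportedly count
     four repetitions -- over a three repetition limit -->
<META NAME="keywords" CONTENT="swim, swimming, swimmers, swims">
</HEAD>
</HTML>
```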

I am following up with Infoseek to get more specifics about the changes, such as the exact penalties applied, to verify that the word form counting described above is indeed correct, and to find out generally how people can avoid problems. I expect to have news by the next newsletter.

In the meantime, I would recommend not changing anything unless you've discovered that a page that was doing well has suddenly dropped.

If you should discover a drop, it would make sense to reexamine your tags and see if you may be going over the new limit. If so, make changes and resubmit.

However, don't assume that the change will suddenly bring your pages back to where they were. Infoseek has been very active in changing a variety of things in an attempt to control spamming, and it may be that a position drop is related to reasons other than the new meta tag limit.


Need For Standardized Rules

In the past, I've not been a big proponent of search engines having to publish specific rules about what constitutes spam. I used to be, back when I first started maintaining the predecessor to Search Engine Watch in early 1996, because I was fearful that without such rules, someone could accidentally or inadvertently spam a search engine.

As I did more reporting on search engines, it became clear that by and large, people do not accidentally spam the search engines. Search engines are very conservative about what they exclude or penalize.

Pages that do get penalized or banned almost always belong to people who are aggressively pushing to reach the top of the listings, no holds barred. I've heard about many examples and seen attempts first-hand, and I can assure you that the typical webmaster is not going to stumble into a spamming scenario.

Generally, publishing such anti-spam rules has seemed more like a guide to spamming than an effort to prevent it.

However, the Infoseek change is one of several recent changes that are making it clear that some shared guidelines among the search engines are needed.

The "seven repetition limit," as explained in the above item, was never a rule applied to all the search engines, or even to Infoseek for the past year. But it has grown into one of the major rules that people have lived by. If the new three repetition rule is indeed in effect, for the first time, a large number of people really will be in danger of accidentally spamming.

Additionally, there are very good reasons why you might use a word several times in a meta tag. For example:

learn to swim, swimming lessons, swimming sessions, swimming teachers

is a better meta tag than:

swim, swimming, lessons, sessions, teachers

This is because the first tag uses phrases while the second uses only single words. That means the second tag is not going to do as well for a search like "swimming lessons," for example.
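Written out as full tags, the two examples above look like this (hypothetical markup):

```html
<!-- Phrase-based tag: a better match for searches such as "swimming lessons."
     But note that by Infoseek's reported counting, forms of "swim" appear
     four times here -- over a three repetition limit -->
<META NAME="keywords" CONTENT="learn to swim, swimming lessons, swimming sessions, swimming teachers">

<!-- Single-word tag: safely under the limit, but a weaker match for phrase searches -->
<META NAME="keywords" CONTENT="swim, swimming, lessons, sessions, teachers">
```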

A three repetition rule will make it harder to use phrase-based meta tags, which can be helpful with other search engines. As a result, it will put pressure on designers to begin creating specifically tailored pages for individual search engines.

This is a tactic I have never recommended. It's a lot of work, it doesn't produce the results many people expect, and some search engines even consider it a spamming tactic.

However, pressure to create search engine-specific pages is rising. Both Excite and Lycos suggest the use of invisible text -- text in the same color as the background of a page -- as a way to help control the description returned for a web page or to help improve its relevancy.

The same tactic with Infoseek and some other search engines will cause a page to be banned for spamming.
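For those unfamiliar with the tactic, invisible text is simply body text set to the same color as the page background. A hypothetical example, using era-typical markup -- and remember, Infoseek and some other search engines may ban pages that use this:

```html
<BODY BGCOLOR="#FFFFFF">
<!-- White text on a white background: readable by search engine
     spiders but not visible to visitors -->
<FONT COLOR="#FFFFFF">swimming lessons for children and adults</FONT>
</BODY>
```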

Likewise, both Infoseek and HotBot allow submitting up to 50 pages per day via their Add URL forms. The reward is that pages submitted using the forms are listed quickly, within minutes to two days.

Do the same thing with AltaVista, however, and you could be branded a spammer. AltaVista considers submitting a high number of pages to be a spamming tactic and may exclude the pages.

In short, it's becoming exceptionally confusing for the average webmaster to know what's right and what's wrong.

While some common standards would be helpful, I'm not expecting that to happen. Things have become much more competitive between the search engines than when the original meta tag standards were drafted. I'm not certain that they would all want to get together and establish new, even-broader standards.

But you never know. I'll be following up on the issue, and as always, I'll report back on what I discover.


HotBot Instant Add Not So Instant

HotBot's instant Add URL feature is supposed to add pages within 48 hours of submission. However, I've found this has not been the case for over a week. If you have submitted, check and verify that your pages do appear. HotBot is readying a major relaunch in a few days, so it may be that things are not acting normally as a result.


WebCrawler Check URL Service Down

WebCrawler's Check URL service has been down for at least two days. The Add URL service continues to work fine, however.


Lycos Merges Lycos Pro

Last month, I mentioned that the Lycos and Lycos Pro catalogs were different. Pages listed in Lycos may not have been listed in Lycos Pro. This is no longer the case. In fact, Lycos Pro has now been folded into the main service, which was relaunched last week (more details in the next newsletter). There now appears to be only one index in use.

However, don't rely on the Check URL feature to assure you whether a page is listed. I found several examples of the Check URL feature reporting that pages were not listed, when indeed they were. Likewise, some pages that were reported by the Check URL feature as being present could not be found in the catalog.

If you are looking for your pages, the best method is to use the Lycos Pro Power Panel. Enter a page title, then set the page title slider to 100%. Otherwise, try searching for words unique to your pages.

End Notes

To unsubscribe from this list, send a message to, with the following in the subject field:

unsubscribe searchupdate

If you have problems, just send a message to

This newsletter is Copyright (c) Danny Sullivan, Calafia Consulting. It should not be distributed. If you are not a subscriber and somehow are receiving a copy of the newsletter, learn how to subscribe at: