The Search Engine Update, April 4, 2000, Number 74

By Danny Sullivan
Editor, Search Engine Watch

About The Update

The Search Engine Update is a twice-monthly update of search engine news. It is available only to those people who have subscribed to Search Engine Watch. Please note that long URLs may break into two lines in some mail readers. Cut and paste, should this occur.

In This Issue

+ About The Search Engine Watch site
+ Search Engine Strategies Conference

+ Search Satisfaction And Behavior Results Released
+ Google Adds Directory
+ Hiding JavaScript
+ 7Search Sells Results, Has Nice Features
+ The Problems With Rating Services
+ The Vortals Are Coming! The Vortals Are Coming!
+ Making Search Sticky (offline only)

+ Say No To "Next Results" With QuickBrowse
+ WebCrawler Loses Second Sight
+ Crawler-Powered Job Site Launches
+ Go's Meta Search Software Updated

Search Engine Articles
+ The usual round up of interesting articles relating to search engines.

+ List Info (Subscribing/Unsubscribing)

Site News

Hello Everyone--

Within the main site, the Search Assistance Features page has been updated with details of the new page clustering change at Google, and the Media Metrix Search Engine Ratings page has been updated. Links to both can be found via the What's New page:

What's New

Conference News

The next Search Engine Strategies seminar is less than a month away. It will be held on April 27, in London, and conference details are online at the URL below. The one day conference will feature both experts on search engine marketing issues and panelists from search engines, including confirmed speakers from Inktomi, LookSmart, Netscape/The Open Directory, Search UK and Voila. I'll be presenting and moderating throughout the day, and the conference will also have a look at regional and language issues.

Search Engine Strategies London 2000

Also, for those doing forward planning, the next seminar in the US will be on August 14, in San Francisco. Conference details will be available soon via the page below:

Seminars


Search Satisfaction And Behavior Results Released

For the past three years, NPD New Media Services has conducted what I'd consider to be the most extensive survey of search engine user satisfaction available. Every three months, some users at the major search engines are selected randomly by NPD and asked about their visits. The results are fascinating, but they've not been available to the general public until now.

The survey is funded and conducted on behalf of the major search engines, and the results have been used for their own internal purposes. Occasionally, bits and pieces have been released, usually when a search engine performs well in a particular area. For instance, after the latest survey was finished in December, Google announced that 95 percent of its users find what they are looking for all or most of the time, making it number one for that category.

As you'd expect, search engines that rank poorly in some areas do not announce those findings. In fact, some search engines have specifically insisted that comprehensive results of the survey not be released to the general public. As a result, I've been unable to report on the standings.

That's a shame. Numerous search engine representatives have spoken to me about the desire to have some type of mechanism to rate them in terms of relevancy. That's an incredibly difficult task, as relevancy can change by person and by search query. Yet the NPD survey provides one such measure, and I think the search engines should all agree to an open release of key data. It may be embarrassing to those that rank poorly, yet it would also serve as an incentive for them to do better.

While I can't provide comprehensive search engine-specific figures at the moment, I am now able to provide a recap of how the industry as a whole is doing. The answer? Not as well as in the past. Overall, the success rate of finding what you are looking for most of the time or all of the time has dropped to 77 percent, the lowest point since the survey began in the summer of 1997.

What's happened? According to Lisa Manuzza, Director of NPD New Media Services, the main culprit seems to be the emergence of more cluttered and complicated results pages.

"By leaps and bounds, what we hear is that there is too much on the pages," Manuzza said. "People are all confused. 'I just want to know what the results are,' they say."

Could other factors be behind the drop? For instance, if more new web users are being surveyed, they might be less sophisticated about searching, which could cause the success rate to plunge. But Manuzza said demographic factors like this don't appear to be to blame. Likewise, the addition of new search engines to the survey such as Google, Ask Jeeves and GoTo could potentially have caused a slump, if these services themselves earned bad scores. But that's not the case. In fact, these newer services have actually helped keep the overall success rate from slipping further, Manuzza said.

There is some good news. Preliminary results from the Winter 2000 survey are nearly complete, and they show a strong increase in the search success rate, bringing it to its highest point since Fall 1997. Expect a closer look at this change in the future, when the final results are available.

The survey also examines what people do when their searches fail. Nearly 80 percent of people try searching again in a different way at the first search engine they use, rather than trying the same search at a different search engine. This comes from a belief that all the search engines provide basically the same information. That's certainly not true, but the impression is strong enough to keep people from moving elsewhere.

"You'll even see that people are less likely to jump ship than in the past. 'I'm not going to get anything better anyway,' people have said," Manuzza explained. Another factor is the time investment users make, Manuzza said. They get comfortable with a service and understand how things are laid out, making them less inclined to leave.

That should be a warning to some search engines that are constantly changing how they operate. The more they tamper, the more likely they are to disturb the comfort level of their users.

Another search behavior surveyed is how we pose our questions. While asking questions is the least popular method, it has seen a huge increase recently due to what I'd call the "Ask Jeeves" factor.

All the major search engines can accept natural language, question-style queries. However, Ask Jeeves explicitly encourages its users to do so. Many users were first exposed to this encouragement when Ask Jeeves began providing some results to AltaVista just over a year ago. Even more have been exposed as the Ask Jeeves site itself has rocketed higher in terms of popularity. Now the Ask Jeeves factor of encouraging questions seems to have rubbed off everywhere.

"That was non-existent a year ago. I didn't have any numbers for any of the sites," Manuzza said about the rise in question asking. "AltaVista was the first to grow, and now, as more people are getting more comfortable with the question asking thing, this was the first year I was putting in numbers for practically every guide."

Multiple word queries are still the most popular method of seeking information, but they've lost some ground to single word searching. In the past, single word searching has been considered poor practice, since it doesn't provide enough information to help the search engine understand what the user is seeking. But now most search engines have "Related Searches" features. These allow you to enter a single word, then click on a list of multiple keyword suggestions in order to refine your search. This is one reason why the single word method seems to be growing in popularity. It lets users feel more comfortable, knowing that they'll narrow in on what they are looking for, Manuzza said.

NPD Search and Portal Site Study

Charts and more details from the latest survey.

NPD Group


Google Adds Directory

Users continue to rave about the quality of Google's search results, and that's testimony that its link analysis system works well at delivering relevant information. Now Google is applying link analysis in a new way, to the human powered information of the Open Directory.

All I can say is hurray! I've been chanting for an end to the tyranny of the alphabet when it comes to directory listings for some time. Why should sites that begin with an A appear at the top of the list, oppressing better quality sites that may simply begin with a letter further down in the alphabet? In many cases, this ordering offers no help to the searcher.

For instance, imagine you want to find a comparison shopping service. Over at Yahoo, "Shopping Agents" seems to be the most suitable category. In it, the widely-praised service mySimon appears 15th on the list, which is ranked in alphabetical order. The situation is worse when reviewing the Open Directory's "Price Comparisons" category, where mySimon appears 37th on the list.

The trouble here is that it already takes time for human editors to review and list sites. It takes even more time to rank them in order of quality, even though that's exactly what users would like. That's where Google's use of link analysis adds the perfect blend of technology to the human effort of the Open Directory.

For instance, that same Open Directory category mentioned above is now available in Google's edition of the Open Directory. It's basically the same core list of sites (the Open Directory list has a few more, as it is fresher), but Google has reranked the sites in the list so that those with link popularity come at the top. Result? mySimon rises to 5th on the list, which is appropriate for such a popular and well-regarded site.
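
The reranking idea can be sketched in a few lines of JavaScript. The sites and link counts below are invented for illustration, and a raw inbound-link count is only a stand-in for Google's actual, far more elaborate link analysis:

```javascript
// A toy directory category: each site carries a (made-up) count of
// inbound links, used here as a crude link popularity score.
const category = [
  { site: "AAA Price Finder", links: 12 },
  { site: "Bargain Bot", links: 85 },
  { site: "mySimon", links: 940 },
  { site: "Zippy Compare", links: 3 },
];

// Alphabetical order -- the "tyranny of the alphabet".
const alphabetical = [...category].sort((a, b) => a.site.localeCompare(b.site));

// Reranked so the most-linked-to (and so, presumably, best) sites rise.
const byPopularity = [...category].sort((a, b) => b.links - a.links);

console.log(byPopularity[0].site); // "mySimon"
```

The alphabetical ordering buries the best-linked site near the bottom; sorting by link counts surfaces it, which is essentially the effect described above.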

How to make use of this new directory? One means is to search Google, just as you might ordinarily. As before, that will bring back matching pages of information based on Google's automated crawling of the web. The difference is that you may now get suggested "categories" of information at the top of the search results. Clicking on one of these will take you into the Google Directory. You'll be shown a list of sites for that category that were compiled by the volunteer editors of the Open Directory and ranked according to Google's link analysis system.

When you search, you'll also discover that some pages now have new "Category" links appearing below their listings. For example, in a search for "comparison shopping," mySimon is listed in the top results, along with a link to "Home > Consumers > Consumer Information > Price Comparisons." Selecting this link will take you to the area within the Google Directory where mySimon is listed. Naturally, you'll also find other sites in this category along with mySimon. This means that you can use the category links as a way to find sites similar to those listed in the main search results.

You can browse the Google Directory by selecting the "browse web pages" link underneath the search box, on the Google home page. Browsing the directory also gives you the ability to focus Google's crawler-based results toward any topic you are interested in. This is an incredibly powerful feature that's worth exploring.

Let's say you live in the United Kingdom (I do) and are interested in information about genetically modified foods, which is a hot topic in the UK. You search for "gm foods" on Google, and now you are shown a list of matching pages from all over the web. Great, but now let's say you want to narrow your results to sites specifically related to the UK. Go to the Google Directory, click on Regional, then Europe, then United Kingdom. Now search for "gm foods" from within the category (you'll see that the option is set to "Search only in United Kingdom"). The results which come back will be much more specific to the UK.

What's happening is that Google will return any matching sites for that topic which the editors have listed, plus it goes beyond and brings back any additional relevant pages from these sites that it has found from crawling the web. "You can look through the entire contents of those web sites," explained Google President Sergey Brin.

So by navigating to a specific area of Google, you are essentially turning it into a specialty search engine of your choice. The possibilities are endless. Go to Home > Houses, search for "windows," and you get pages and sites about windows in your home, not Microsoft Windows. Go to World > Deutsch and search for "star trek," and you get links about the series written in German or aimed at German-speakers (trivia tip: Dr. "Bones" McCoy is called "Pille" when Star Trek is translated into German).

In other news, Google has announced that it is planning a family filter and has new international editions in the works. The service also says it has introduced a page clustering feature, so that at most, only two pages per web site will appear in its top results. New Google "buttons" let you search Google or use its GoogleScout feature directly from within your browser.

Finally, Google becomes the latest major service to pay affiliates 3 cents for each search request sent to them. With these programs becoming so widespread, it might even make sense (no pun intended) for libraries or even individuals to consider becoming affiliates of the search engines they use often. If you've got to search, why not get paid?

By the way, we might be seeing Yahoo make a move away from the alphabet. A sharp-eyed reader alerted me to a category where Yahoo first presented a "Most Popular Sites" area, followed by its standard listing format. That page has since gone back to normal. I'll bring more details, as they are available.


Google Directory

Google Buttons

Google Affiliate Program

Search Assistance Features

Explains more about page clustering and ways to find related pages, including using the GoogleScout feature.

Go Guides

Hate the alphabet? Go's volunteer directory also rates sites in order of popularity, with three stars being best.

AT&T WorldNet

Features a version of the Open Directory where results have been organized in terms of what people click on, using Direct Hit's technology.

Yahoo Shopping Agents

Open Directory Price Comparisons

Google Price Comparisons


mySimon

Need to comparison shop? This is a wonderful service, especially improved by the fairly recent and needed addition of a search box to the home page. You no longer need to navigate first to a category and then search.

Webbys Technical Achievement Nominees

Like Google? They've been nominated for a Webby -- you can vote for them here. So has AltaVista's translation service. Like another search engine? Write-ins are accepted, but don't get your hopes up.

Google: We're down with ODP
Salon, March 24, 2000

Another look at the Google-Open Directory partnership


Hiding JavaScript

In general, search engines tend to weight the text that appears at the top of your pages more heavily than that which comes further down. Think of it like reading a newspaper article. The first paragraph of the article tells you all the main points. Similarly, search engines may analyze your opening text to try and understand what your page is about.
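
As a sketch of that idea (real engines use far more elaborate scoring, and the decay formula here is invented for illustration), a term appearing in a page's opening words contributes more to its score than the same term buried lower down:

```javascript
// Toy position-weighted scoring: words near the top of the page
// contribute more to the score for a query term. The 1/(1 + i/10)
// decay is an arbitrary choice for demonstration.
function positionWeightedScore(pageWords, term) {
  let score = 0;
  pageWords.forEach((word, i) => {
    if (word.toLowerCase() === term.toLowerCase()) {
      // Earlier positions (small i) earn a larger weight.
      score += 1 / (1 + i / 10);
    }
  });
  return score;
}

const opensWithTerm = "travel deals and travel tips for europe".split(" ");
const buriesTerm = "welcome to our site about deals tips and travel".split(" ");

console.log(positionWeightedScore(opensWithTerm, "travel") >
            positionWeightedScore(buriesTerm, "travel")); // true
```

Under any scoring scheme like this, whatever text happens to sit at the top of the page, including script code, soaks up the heaviest weights.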

Consequently, using JavaScript at the top of your documents may impact your relevancy. Some search engines may index your code first, then your HTML copy next. That means they may put the biggest priority on scripting code, rather than your nice opening paragraphs that contain the important terms you hope to be found for.

Here's an example. I searched for "document.write" at AltaVista, which is a fairly typical piece of JavaScript code. My goal was to find any pages that AltaVista may have indexed that contained this text. The number three site was for the Black Entertainment Network, with a listing like this:

n'); } if ( ShockMode ) { document.write(''); document.write(' '); document.write(' '); document.write(' '); document.write(''); document.write('');...

As you can see, AltaVista clearly picked up some of the JavaScript from this page. Obviously, using a meta description tag would have solved the bad listing for the page, but that wouldn't have kept the JavaScript from still being indexed and possibly degrading relevancy.

One thing that should prevent JavaScript from being indexed is to surround the code with a comment tag. In fact, that's standard practice to keep non-JavaScript capable browsers from seeing JavaScript. Nevertheless, the code can still get through. One reason may be the use of a > symbol in the JavaScript itself. For instance, one piece of JavaScript on the BET home page above starts out with an opening comment tag, like this:

<!--

What is supposed to happen is that nothing between the opening <!-- tag and the closing --> tag would be indexed by a search engine that ignores comments. But remember that code string that contains the > character? Using an old browser to simulate a search engine, I found that this was interpreted to be the closing comment tag. Thus, everything from that point on was treated as HTML text.
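
The failure mode is easy to simulate. The "naive" parser below is an assumption about how such lenient old browsers behaved, matching the test described above: it ends a comment at the first > it sees, rather than requiring a full --> delimiter.

```javascript
// Buggy behavior: end each comment at the first ">" after "<!--".
function stripCommentsNaively(html) {
  return html.replace(/<!--[^>]*>/g, "");
}

// Correct behavior: end each comment only at a full "-->" delimiter.
function stripCommentsCorrectly(html) {
  return html.replace(/<!--[\s\S]*?-->/g, "");
}

// A page whose commented-out script contains a ">" symbol.
const page =
  '<!-- if (ShockMode > 0) { document.write("ad"); } --> Welcome to our site.';

console.log(stripCommentsCorrectly(page)); // only the visible page text survives
console.log(stripCommentsNaively(page));   // the script text leaks through
```

With the naive parser, everything after the stray > is treated as ordinary page text, which is exactly how document.write fragments end up in a search engine listing.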

If you can't depend on the comment tags to hide your code, what else can you do? I recommend moving JavaScript to the bottom of your page, whenever possible. That will at least help ensure that it won't be the first text encountered.

An even better solution is to make use of .js files. These are "external" files that contain your JavaScript code. You refer to them from within your page, and then the JavaScript code is only loaded by browsers that understand JavaScript. Since most search engines don't read JavaScript, they should never import the information. More resources about them are below.
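
As a sketch of the approach (the file name menu.js here is just an example), the document.write calls would move into an external file, leaving only a reference in the page:

```html
<!-- The script code itself now lives in the external file menu.js.
     Non-JavaScript browsers and most search engine crawlers simply
     never fetch it, so there is nothing for them to index. -->
<script language="JavaScript" src="menu.js"></script>
```

Browsers that understand JavaScript request menu.js and run it as usual; crawlers that don't read JavaScript are left with a single, empty tag.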

Finally, I've also been asked about using the NOSCRIPT tag as a means to hide JavaScript code. This won't work. That tag is only meant for displaying text to browsers that don't read JavaScript. For instance, say you had important information on your page that was only viewable to those with JavaScript browsers. NOSCRIPT lets you send an error message to those using non-JavaScript browsers, while browsers that can read JavaScript will ignore the text within the NOSCRIPT tags. Since search engines are generally like old, non-JavaScript browsers, the text you put in a NOSCRIPT tag is actually text you are explicitly trying to make them see.

Using .js files when creating HTML pages

Basic information about using .js files, from Netscape.

Loading External JS Files for DHTML Browsers
WebReference, Nov. 2, 1999

Long, comprehensive tutorial on the use of .js files, especially helpful in making sure you are HTML 4 compliant.

Footer Text

This code from The JavaScript Source is an actual example of how .js files can be used.

HTML 4.01 Specification: Scripts

Discusses the NOSCRIPT tag and the issue of hiding script with comment tags. It also suggests a solution to the closing comment tag bug that I described above.


7Search Sells Results, Has Nice Features

7Search is a paid listings search engine that came to my attention last month. According to PC Data Online, it still comes nowhere near the traffic of challengers Rocketlinks and FindWhat, pay-for-placement search engines themselves. PC Data estimates 7Search received only 76,000 visitors last month. Conclusion? Don't expect a lot of traffic, if you start a campaign there. Potentially, it may not even be worth the setup time, at the moment. I also found the service to be sluggish, which might discourage visitors.

On the plus side, I actually liked some of the features 7Search offers to searchers. It matches sites with their domain name information, so that it can display a geographical location below some listings. Even better, I liked the little "Links" section below each listing, which shows how many inbound links each page has. It's an easy way to quickly assess the possible quality of a site. In general, a high number of links will probably mean a high quality site.

Non-paid results at the service come from its own crawler. If you aren't listed, use the Submit URL link at the top of the home page to add your site. You can also use this link to alter your site if it is already in the non-paid listings, though this feature wasn't working when I tried it.


Paid Listings Search Engines

GoTo Gets Cloned
The Search Engine Update, Jan. 18, 2000

My recent look at other paid listing search engines similar to GoTo, such as Rocketlinks, Kanoodle, FindWhat & SimpleSearch.

Paid Search Results Are Here to Stay
Traffick, March 27, 2000

Another review of pay-for-placement search engines.


The Problems With Rating Services

I'm often asked which are the web's most popular search engines. To answer, I mostly rely on figures from the major ratings services, Media Metrix and Nielsen//NetRatings. Their figures are far from perfect, however. In fact, it's rather alarming how often their figures are cited as proof of a site's popularity without much qualification by journalists or analysts. Web ratings can be and are manipulated by web property owners far more than television ratings ever could be.

I don't mean to knock the numbers entirely. They are the best estimates we have and can be extremely useful. But as with any statistics, they can't be accepted at face value. Reporters and others who quote these figures to the public ought to be critically analyzing them, rather than just parroting back whatever top ten list appears in the latest press release they've been given. Since many of them are not, I thought it would be useful to go through the type of questions I have to consider when pondering the ratings for search engines. By doing so, I hope you'll be able to do your own questioning.

Let's start with Media Metrix, which is the web's oldest rating service. Media Metrix monitors a sample of at-home and at-work web surfers in order to estimate what we all are doing. Exactly how the sampling is done can itself affect whether the data is trustworthy. But let's set that aside. Just assume that the sampling is perfect, because there will still be enough problems remaining to take issue with the numbers.

Each month, Media Metrix releases data about top web properties and top web sites ("digital" properties not necessarily on the web are also measured). There's a world of difference between a "property" and a "site," yet it is not uncommon to see the terms used synonymously when the figures are reported. For instance, the San Francisco Chronicle article below talks about AltaVista changing its web site as part of an effort to "catch up" with more popular portal sites. The discussion is mostly about web sites, yet web property numbers are what's offered as proof that AltaVista is behind. As we'll see, if web site numbers had been used -- as they should have been -- AltaVista looks much better off.

"Web property" is defined as a collection of web sites owned by one company. For instance, Lycos owns several web sites, such as its Lycos flagship portal site, the HotBot search site and the Tripod home page building service. The combined traffic of all these and other Lycos-owned web sites make up the Lycos "web property" figures.

Let's compare that to television. A television network like NBC has many TV shows, such as ER, Third Watch and The Tonight Show. Ratings companies can estimate how many people tuned into the entire network during a given week versus estimating how many people watched a particular show. They are different numbers, and they both have their uses.

Web property numbers are like network numbers. They are especially helpful if you want to know how many people a web media owner can reach. Want to get your ad out in front of a lot of eyeballs? According to Media Metrix, Yahoo's network had 45 million of them in February. Lycos had 32 million, and AltaVista had 12 million.

Web site numbers are like television show numbers. They are useful if you want to know if a particular web site is popular among surfers, just as you might use television ratings to determine if a particular TV show is a hit with viewers. For example, I use web site numbers because I'm specifically interested in which search engines are popular. The Lycos web property number is useless, in this regard. It mixes in people who went to Lycos, HotBot, Tripod and other properties. Only the web site numbers will let you know if any of these sites is a hit.

Now comes the manipulation part. Media Metrix leaves it to the web property owners to decide what makes up their web site numbers. That's why in January 1999, the web site jumped from a 21 percent to a 34 percent reach among users surfing the web from both at home and at work. All Go sites, such as Disney and ESPN, were rolled up into the web site figure. In essence, the web site figure was simply allowed to be corrupted into a web property number.

More recently, the same thing happened with the Lycos web site figure. Since March 1999, the Lycos site reach among at home and at work surfers had been dropping from its high of 26 percent down to a low of around 22 percent. Then last December, it suddenly spiked to about 28 percent. It shot up further to about 39 percent in February.

What happened? I don't know exactly, but one major driver was the fact that Lycos decided to combine HotBot site figures with the Lycos figures, according to Media Metrix. These are two completely separate web sites, and it makes no sense to combine them. Nevertheless, this is allowed.

Can you imagine if the Fox television network were able to combine the ratings for shows like The Simpsons and Ally McBeal into the ratings for Titus, in order to show that it was more popular than NBC's Friends? No one would stand for it, yet this exact situation is being tolerated in the web ratings world.

It's crucial to know the hit shows on the web. As with television, hit shows can build or break a network. The combination of distinct sites into one site number can allow web media owners to potentially hide their bad shows. The Forbes article below talks about the success of Lycos, while the Chronicle article I mentioned generally looks down upon AltaVista. But the Lycos web site, as the network's flagship show, was on a downward trend until this latest change to how the Lycos web site numbers are calculated. AltaVista's flagship site was on a generally upward trend that threatened to surpass the Lycos site. The web property numbers don't reflect this. It's only when you look at the web site numbers that such patterns emerge.

Of course, even when you know the hit shows, you still don't necessarily know which are the hits for search. For instance, many people go to the Yahoo web site in order to access their email, and there's no way to break them out of the search-specific traffic. That's one reason you are beginning to hear some search engines quote the number of queries per day that they process. Those figures are a better measure of search-specific traffic. Numbers from StatMarket also provide another look at search engine popularity, based on the traffic they send to web sites.

For its faults, at least Media Metrix does regularly cite both web property and web site numbers. In contrast, the monthly releases from NetRatings focus only on web property numbers. You can get web site numbers, but you have to make a special request. That's why the NetRatings page that I maintain within Search Engine Watch is so out of date. I'm still waiting for figures to be sent back to me from NetRatings.

Ideally, I'd like to see NetRatings release both web property and web site numbers regularly, as Media Metrix does. In fact, the new format Media Metrix introduced last month is ideal. It shows the web property figure, then a web site breakdown for that property (though not a complete breakdown). But both companies also need to ensure that they, not their clients, define what makes up a web site, and that those definitions align with what the public generally perceives those sites to be.

Ratings, Reviews and Tests

You'll find numbers for major search-related sites from Media Metrix, NetRatings and StatMarket here, plus links to each of those companies from these pages.

PC Data Online Reports

Eventually, I also plan to add a page about search engine popularity according to PC Data Online. The company isn't as well known as the ratings kings of Media Metrix and Nielsen//NetRatings. However, it offers an unparalleled view into the popularity of smaller sites. You aren't just limited to a top 50 list. Enter any site, and if it is in the top 10,000 web sites in terms of traffic (as estimated by PC Data), you'll be shown stats.

Alexa 1000

A new resource from Alexa that I've only just begun to explore. It offers the ability to look up traffic for any web site and even to see traffic "rolled up" within those sites. For instance, Lycos is listed as the 10th most popular site for February. By drilling down, you can see that Alexa is combining any Lycos-owned sites into this number, making it really a web property number. You can also discover that Alexa estimates that HotBot draws more page views than the Lycos main site.

AltaVista Switches Web Portal Into High Gear
San Francisco Chronicle, March 27, 2000

Yet another article with the usual analyst quotes of how AltaVista is trying to catch up in the portal game, struggling against being "late" to the market. As a site, AltaVista is closing in on Excite, which the same analysts would acknowledge as being one of the portal leaders. AltaVista would also probably be doing the same to Lycos, if HotBot numbers weren't merged into the Lycos total.

Bob Davis, Lycos' Savior
Forbes, March 30, 2000

Much focus here on the recent European spin-off by Lycos, plus mention of how Lycos has grown.


The Vortals Are Coming! The Vortals Are Coming!

Specialty search engines, topical search engines, vertical portals or "vortals" -- these are web sites which focus on particular topics and which especially allow you to search for information relating to those topics. Many more may be coming, thanks to the emergence of new companies and services catering to the specialty search market.

FindLaw is a classic example of a vertical portal. It focuses on cataloging resources relating to the law and legal issues. Within it, there's even the LawCrawler search engine that collects pages just from legal web sites. The "vertical" term comes out of the idea that these are places where instead of searching horizontally, or broadly across a range of topics, you search vertically within only a narrow band of interest.

While there are plenty of topic-specific web sites, it's harder to find ones that also offer specialty search engines for the topics they cover. The sites may have link lists, perhaps even offer Yahoo-style directory search, but it is unusual to see them additionally offer a crawler-based subject search engine. New companies such as BetterGetter, SearchButton and especially EoExchange (formerly Aeneid) may change this, as they are all targeting the vertical market. Give these companies a list of sites related to a particular subject, and they'll spider pages just from those web sites. The result should be a search engine that produces high quality results for the subject that it covers. One of these services is currently beta testing its search technology with six web sites, and will then offer its product broadly to customers in about a month or so. Pricing will start from US $10,000 and go upwards to six figures.

BetterGetter is another such service operating in beta mode, with an official launch planned for June. SearchButton, which specializes in offering site-specific search for web sites, announced earlier this year its "CommunitySearch" service. Searchbutton operates a demonstration to show how CommunitySearch could be used by potential clients to build niche search engines. The service will be available to customers in the second quarter of this year, the company says.

EoExchange isn't new to vertical search. The company has been providing its "EoCenter" specialty search product for about a year, powering search at places such as Red Herring, American Banker and CMP Media.

"We've probably got over 100 different portals that we are deployed on right now," said Bob Ainsbury, EoExchange's chief technical officer. "In terms of specific focused search for specific industries, we're grandfathers in that list," he added, commenting on the entry of new companies targeting vortals.

EoExchange has manually classified thousands of sites across the web into categories. It then uses Inktomi technology to spider pages from those sites, and only those sites, into a database for its customers. When you search at a place like American Banker, you tap into this database, and only pages from sites classified as relevant to banking information are supposed to appear. About 18 million pages have been indexed this way, and the entire database is refreshed each week. About 10 percent of the index is deemed time-sensitive and refreshed each day.

The company has just added new page monitoring capabilities to its EoCenter product. It's a killer combination that benefits those doing regular research, and the "Making Search Sticky" article below takes a closer look at it.

Don't forget, the big players also remain interested in this market. AltaVista just released a new version of its search software for webmasters, which can be used to power vertical search. Inktomi will also create a specialty search solution for you, but the entry fee starts at about $250,000. This is why the moves by companies such as those I've named are exciting. They offer hope that custom search may become affordable for those running smaller web sites. The result may be that new, helpful search resources will mushroom across the web.

For webmasters, this change once again underscores the need to build good web sites. If you offer lots of good content on a particular issue, then a vortal may want to index you. The good news is that if you've already done the work to make your site search engine friendly for the major web-wide services, then vortal crawlers should also do well when visiting you.

Making Search Sticky
The Search Engine Report, April 4, 2000

EoExchange has announced new page monitoring tools that can be integrated into its search results, a feature that offers real benefits to web searchers and perhaps suggests a new way for the major search sites to encourage "stickiness" among their own visitors.


This specialty search engine allows you to search for matching pages on any topic from within US presidential candidates' web sites.



Another company with a specialty search product among its offerings.

AltaVista Business Solutions

More about how companies can use AltaVista's technology for their own needs.


Describes Inktomi's search engine product line.


This software lets you build custom Yahoo-like directories for a low price. I use it to manage the Search Links area of Search Engine Watch. A new version was just released.

Specialty Search Engines

My list of specialty search engines in various categories. Also be sure to see the references in the "Invisible Web" area. They'll get you to even more specialty search services.

Search Tools

Interested in more about site-specific search, including product reviews and evaluations? SearchTools is a must-visit site.


EoEnabled Associates Program

Not only is specialty search growing, but now you can even earn money by encouraging people to use EoExchange-powered search boxes for high-tech and finance research. Other topics are also planned.


Say No To "Next Results" With QuickBrowse

Want to get multiple pages of results from a search engine combined into one single page? QB-Search will quickly join up to 200 pages of listings from Yahoo, AltaVista, WebCrawler, Google or Go2Net's MetaCrawler. It's a simple, helpful tool that you'll no doubt need at some point. Another powerful offering is QB-Masterpage, which will combine URLs that you select into a single page for quick browsing. Both are available from QuickBrowse.



WebCrawler Voyeur Loses Second Sight

Sadly, one of the first services that let you see what people were searching for online, WebCrawler Voyeur, has closed. Excite@Home, which operates WebCrawler, says it might come back in the near future. If not, Magellan Voyeur pulls from the same stream of searches that WebCrawler Voyeur used but lacks the cool Java scrolling bar that WebCrawler had.

Magellan Search Voyeur

What People Search For

Find more live search places here.


Crawler-Powered Job Site Launches


This new site uses crawling technology to locate job postings from across the web.


Go's Meta Search Software Updated

Express Search

The free meta search tool from Go was recently upgraded with related search functionality and now has comparison shopping and job search categories.

Search Engine Articles

Launching a Search Engine Traffic Campaign
ClickZ, March 29, 2000

Outsourcing your search engine traffic work? Here are some things to ask the companies you contact.


Why leave your 'marks online?
Salon, March 28, 2000

A look at the growth of online bookmark companies, which failed to make a hit with writer Damien Cave.


Greybeard AltaVista aims to burnish image
Ad Age, March 2000

How AltaVista is trying to build its traffic with its first ever marketing campaign.


The Battle for Pocketbooks and Minds
Washington Post, March 24, 2000

A look at new and existing military portals.


Helping Webmasters Land in Search Engines' Nets
New York Times, March 23, 2000

Basic overview of the Search Engine Strategies conference that I hosted last month in New York. In particular, the article covers the challenge of bringing search engines and webmasters together. Plus, you can see me in a suit :)


Google Slices & Dices The Web Simply
Inter@ctive Week, March 23, 2000

Basic details on how Google works, plus a look at how the humans at Google are fed by a staff chef.


Interview With The Search Engine
FN Wire, March 2000

Humor site FN Wire interviews Ask Jeeves -- and I do mean Ask Jeeves -- not someone who works there.


Excite, iBeauty Lose Key Name Use Lawsuit
Newsbytes, March 13, 2000

Estee Lauder has won its fight to prevent Excite from selling banner ads linked to its trademarks, at least in Germany. Cases remain pending in France and the United States, and the German ruling is open to appeal.


Terminix Abandons Lawsuit Against Free Speech on the Internet
Public Citizen, March 10, 2000

Terminix has given up in its meta tag-related lawsuit against a site complaining about the pest control company's work. This is a press release from the legal group that represented the protest site. I also volunteered time on behalf of the protest site.


DoubleClick Stitches Together Search Sites
March 9, 2000

DoubleClick has developed a system to let you buy keyword-linked banner ads across all the search-related properties that it represents.

List Info

How do I unsubscribe?
+ Follow the instructions at the very end of this email.

How do I subscribe?
The Search Engine Update is only available to paid subscribers of the Search Engine Watch web site. If you are not a subscriber and somehow are receiving a copy of the newsletter, learn how to subscribe at:

How do I see past issues?
Follow the links at:

Is there an HTML version?
Yes, but not via email. View it online at:

How do I change my address?
+ Send a message to

I need human help with my subscription!
+ Send a message to DO NOT send messages regarding list management or site subscription issues to Danny Sullivan. He does not deal with these directly.

I have feedback about an article!
+ I'd love to hear it. Use the form at

How do I advertise?
+ To advertise in this newsletter or any of's other 100 newsletters, contact Frank Fazio, Director of Inside Sales, at (203) 662-2997 or via email at

This newsletter is Copyright (c) corp., 2000