THE SEARCH ENGINE UPDATE
June 2, 2000 - Number 78
About The Update
The Search Engine Update is a twice-monthly update of search engine news. It is available only to those people who have subscribed to Search Engine Watch, http://searchenginewatch.com/. Please note that long URLs may break into two lines in some mail readers. Cut and paste, should this occur.
In This Issue
+ About The Search Engine Watch site
+ Conference News
+ Google Gets Kid Safe, Suggests Maps
+ Free Listings Gone At LookSmart
+ Inktomi Reenters Battle For Biggest
+ More Than Just Music Search
+ Survey Reveals Search Habits
+ Snap Promotes Sites
+ Moreover: News Lover's Delight
+ Reunion Search Engine Available
+ WebBrain's Visual Directory
+ WebTop Offers Search Agent
+ Web Wombat Expands Listings
Search Engine Articles
+ Interesting articles relating to search engines.
+ List Info (Subscribing/Unsubscribing)
In the main site, I've posted a new page that describes how search engines create regional listings. In addition, just a reminder that I have an extensive listing of regional guides and other types of search engines in the Search Links area of the web site. I also have a large number of submissions I'll be processing over the coming weeks, so if you submitted, hang in there! Both pages can be found below:
How Search Engines Regionalize
Related to the regional listings page, I've posted a new page in the Subscribers-Only area that covers issues about how search engines deal with multilingual pages, as well as issues to consider when submitting to regional search engines. It's quite detailed, and I think it will answer many of the questions I've gotten on this topic in the past.
More About Countries And Languages
I've previously reported about various search engine affiliate programs that pay you for adding search boxes to your web pages. Now I've created a chart that summarizes the programs offered by the various services.
Search Engine Affiliate Programs
For the pages above, and some below, you'll need your subscriber's password. Forgot it? Use the Password Finder:
Finally, two quick notes. I've held off doing the longer look at the AltaVista web mapping paper I mentioned in the last newsletter, in order to cover some other stories. I do intend a longer write up on this, however. Additionally, RealNames appears to have made substantial changes to their pricing structure. I'll also be following up on this.
The final agenda for the next Search Engine Strategies conference has now been posted. The conference will be held in San Francisco on August 14. I'll be presenting and moderating sessions that feature experts on search engine marketing issues and panelists from the various major search engines themselves, including About.com, AltaVista, Ask Jeeves, Excite, Inktomi, LookSmart and Snap. In addition, there will be a special session on shopping search, which should be of interest to any online retailers. Details about the conference, for attendees or potential sponsors and exhibitors, can be found via the URL below.
Search Engine Strategies 2000 - San Francisco
Google Gets Kid Safe, Suggests Maps
Google has added a new feature to filter adult content out of its listings, and the search engine now also provides links to online maps in response to relevant queries.
"SafeSearch" is meant to prevent sites with pornography or adult content from appearing in Google's listings. To use it, do a search, then select the SafeSearch link that appears on the top of the results page. Any future searches will then be filtered.
Unfortunately, there is currently no way to switch filtering on from the Google home page, which means you cannot block adult content on your initial search. For this reason, you may wish to search on something you absolutely expect to produce clean results, then enable the filter for your subsequent searches.
It's likely that this oversight will be corrected in the future. In particular, Google says that the ability to remember your filtering preference may be added to the current page that allows you to set language preferences, making it more of a general settings page for the search engine.
Like other crawler-based search engines that offer filtering, Google is using a combination of blocking known adult content sites and also analyzing web pages for patterns of words identifying them as adult content. The block list comes from SurfWatch, and Google also analyzes the Open Directory's listings to identify adult web sites.
Another new feature is the map links that Google began suggesting in May. Google now watches for any query that looks like it contains an address. If so, links to Yahoo Maps or MapBlast will appear at the top of Google's results. The address is embedded inside these map links, so if you click on one, your address is passed on to the mapping service. As a result, you should get a map of the location you searched for. Map links are especially likely to appear if you use the name of a major city, a US state or a US zip code.
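As an illustration of how this sort of feature can work, the sketch below tests a query against a rough address pattern and, on a match, builds a link that carries the address through to a mapping service. The base URL, the "addr" parameter and the pattern itself are invented for this example; Google hasn't published its actual detection rules:

```python
import re
from urllib.parse import urlencode

# Rough pattern for a US-style street address: a street number, a street
# name, an optional city/state part, and an optional 5-digit ZIP code.
ADDRESS_RE = re.compile(r"^\d+\s+[A-Za-z][\w\s.]*(,\s*[\w\s]+)?(\s+\d{5})?$")

def map_link(query, base="http://maps.example.com/map"):
    """Return a map-service link if the query looks like an address or a
    bare ZIP code, otherwise None. The base URL and the 'addr' parameter
    are invented for this sketch."""
    q = query.strip()
    if ADDRESS_RE.match(q) or re.fullmatch(r"\d{5}", q):
        # Embed the address in the link, so clicking it hands the
        # address straight to the mapping service.
        return base + "?" + urlencode({"addr": q})
    return None
```

A query like "1600 Pennsylvania Ave, Washington DC" would get a map link, while an ordinary word like "sports" would not.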
In other Google news, I wrote last month that the service was beta testing new language search options. Those have officially gone live now -- see the article below for more information about language search. I also wrote about Google's new support for those with WAP browsers. I've since discovered a pleasant surprise. Google not only translates the original page that you select from its search results from HTML into a WAP format, but if you select a link from that page, Google continues to translate as you make your way through the web. Because of this, there's every reason for those using WAP browsers to consider making Google a first stop.
Of course, despite Google's impressive HTML to WAP conversion, there may still be times when you'd prefer to find WML pages specifically written for wireless devices, as can be done using FAST's WAP search engine. This may come to Google in the future, according to Google president Sergey Brin. Google has recently completed a crawl of WML pages, and it may integrate these into its main index and ensure that they rank more highly in response to searches from those using wireless browsers, Brin said.
Finally, a sharp-eyed reader noticed that Google was measuring clickthrough on its results, similar to how Direct Hit and some other services have done. Brin says only about one percent of its search results are measured, and it's done purely in order to measure the impact of any general changes to the service, such as how search results are presented, rather than as a mechanism to influence how pages are ranked.
Google Speaks Languages, WAP, Adds Other Features
The Search Engine Report, May 3, 2000
More about Google's language search, WAP search and university search features.
Google Add My Campus
Google is now accepting suggestions for universities and colleges to add to its list of university searches. Nominate your school via this page.
2000 Webby Awards: Technical Achievement
Google won Best Technical Achievement at the recent Webby Awards.
Google takes search acc't to San Francisco's Pickett
Ad Age, May 29, 2000
Google is about to begin its first advertising campaign, on radio and in print.
Google Bets The Ranch On Linux
TechWeb, May 30, 2000
Google may be running more Linux servers than anyone else in the world, with 4,000 machines operating and plans to increase to 6,000.
Kid Search Engines
This page lists search engines like Google that have filters designed to screen out porn sites.
Wild About WAP
The Search Engine Report, March 3, 2000
A bit more about wireless search engines and WAP. A link to the FAST WAP search engine and others can be found here.
Pagejacking Complaint Involves High-Profile Sites
A complaint has been filed with the US Federal Trade Commission involving the alleged theft of web pages from high profile sites such as Disney, CNET and the Discovery Channel. The complaint claims that the content was "pagejacked" to generate traffic via search engines for other high profile sites such as eToys and Barnes & Noble. You'll find the full story via the URL below.
Pagejacking Complaint Involves High-Profile Sites
SearchEngineWatch.com, May 12, 2000
Free Listings Gone At LookSmart
In mid-May, LookSmart ceased accepting free submissions to the US-edition of its directory. The company says that this is not yet a permanent move.
"We are in the middle of price testing on the Express Submit product," said spokesperson Liz Connaghan. Previously, LookSmart tested paid-only submissions for commercial sites for two days in April, then returned to allowing commercial sites to use its free submission option.
Currently, there is no free option at all for any type of site. Instead, sites must choose either a $199 "Express" option that promises to review and possibly list a site within 48 hours, with a refund if the site is deemed ineligible, or a $49 option promising a turnaround time of eight weeks or less, with no refund.
LookSmart is a very important directory, especially given that it powers the main results at MSN Search and provides Excite with most of its directory listings. Consequently, it is still extremely worthwhile paying to use the more expensive express service. Even if you are on a tight budget, you should still strongly consider using it because of the potential traffic you will quickly gain.
If you absolutely cannot pay the higher rate, then you may wish to keep checking back at LookSmart. During its testing, LookSmart has offered a non-express option for free, for $25 and now for $49. Given this, there is a chance you may find that the price drops or returns to being free. However, I'd strongly advise spending the extra money to use the faster express option. That price hasn't changed since it was first launched, and there's value inherent in getting listed quickly.
Should the change be made permanent, LookSmart would become the first major directory to use an all-paid submission model. This raises issues about whether some sites will be excluded because they cannot afford to pay. I appreciate having the paid submit option, because it makes life much easier for those trying to get sites listed promptly. However, I'd also like to see some type of free submit option remain, complete with a commitment to also process relevant sites this way in a prompt manner. The balancing act LookSmart faces is how to offer both, without everyone simply choosing the free option. One palatable solution may be to require businesses to use a paid submit service, while non-commercial sites can use a free option, a discounted express option, or both.
From a searcher's point of view, a paid submit system means the money generated can help support a staff of professional editors cataloging the web. Some people prefer this to the Open Directory model of making cataloging affordable by using volunteers, which has allowed some categories to be manipulated by editors with their own agendas. However, there's also a danger that by tapping into the cash paid submit offers, a directory may loosen its editorial control. LookSmart says it has no intention of doing this.
"This is one of the reasons submissions are handled by editorial staff," said Kate Wingerson, LookSmart's editor in chief. "They don't have any pressure on them to feel they have to accept sites." She added that LookSmart would continue to add any site it considered important as part of its normal editorial process, regardless of whether it had actively submitted to the directory or paid a listing fee.
If you submit to LookSmart, one slight change is that you no longer need to select a category. You can go straight to a submission page, enter your site details, and then LookSmart's editors will list your site in one or more categories, as appropriate.
However, I would suggest taking the time to research the appropriate category for your web site, then submit from within that category. When you do this, the category you selected will be passed on behind the scenes to the LookSmart editor that processes your submission, and they'll consider using it. Since the category you are listed in can influence the traffic you receive for particular queries, it is worth making this extra effort.
Should you find that you wind up in a category that doesn't seem right, don't be afraid to follow up with LookSmart and politely explain why you belong in another location. The editors may not have understood properly how your site should be classified, so you may be helping them.
"We do get customer service enquires after the fact," said Kristin Morse, LookSmart's director of ecommerce. "It's in our best interest that users get categorized where they belong." She cited an example of a company that was listed among cabin rentals. The company explained why, while they had cabins, they were more appropriately listed in a hotel category. As a result, the category was changed, Morse said.
Submission of sub-sections of your web site is also still acceptable. For example, if you had a web site that dealt with inline skating, you could submit the site's home page to the inline skating guides category, while also submitting a subsection about safety to the inline skating safety category.
New details of the submission process can be found here, and you can submit from this page if you choose not to navigate to a particular category. By the way, some readers have asked if I am aware that a quote from one of my previous articles recommending the service is prominently placed on the page. I am, and I am comfortable with them citing this, since it was part of a review and still reflects my opinion that using the option is beneficial.
How LookSmart Works
Some additional details about selecting the right category at LookSmart can be found here.
LookSmart Launches Express Submission Service
The Search Engine Update, Feb. 3, 2000
More details about LookSmart's express submission option, from when it launched.
Growing Pains At Volunteer Directories
The Search Engine Update, March 3, 2000
Discusses some problems that have happened with using volunteers to process submissions.
Inktomi Reenters Battle For Biggest
It's been over a year since one could consider Inktomi among the largest search engines on the web. The company's crawler-built index of web pages has stayed static at 110 million, while competitors such as AltaVista, FAST and Northern Light have pushed past the 200 and 300 million marks. Now Inktomi says it's back. Later this month, some of its partners should begin announcing that they are making use of Inktomi's new 500 million page index. When they do, users will have access to the largest searchable database of web pages that the Internet has ever seen.
Inktomi is not adding new documents to enlarge its existing database. Instead, it has created a second database that works in cooperation with the first one. If a query doesn't appear to be satisfied by the smaller database of 110 million pages, then a check will be made of the second database that contains the additional 390 million pages.
Why do this? Inktomi claims that breaking up the index helps it maintain relevancy while also keeping the costs down. "This allows us to be good, big and cheap, all at the same time," said Matthew Hall, vice president of engineering in Inktomi's Search and Directory Division.
Inktomi has worked hard over the past year creating a system that it feels allows it to have a small index of the most popular pages on the web, as explained further in the "Numbers, Numbers" article below. It is presuming that most popular queries will continue to be best satisfied by that smaller index, making it unnecessary to do a wider search against the entire corpus of pages for every query.
"Obviously, if someone submits sports, you don't want to go to all 500 million pages," Hall said.
Searching all 500 million pages only selectively also saves money, because matching every query against the entire database requires more hardware. This is especially true for Inktomi, which processes nearly 50 million queries per day.
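The two-tier arrangement can be sketched as a simple fallback: consult the small index of popular pages first, and widen the search to the large index only when the first pass comes up short. The data structures and the threshold below are invented for illustration; Inktomi hasn't said exactly how it decides when a query "doesn't appear to be satisfied":

```python
def search(query, small_index, large_index, min_results=10):
    """Two-tier lookup: try the small index of popular pages first, and
    fall back to the much larger index only when the first pass comes up
    short. Each index here is a plain dict mapping a query term to a
    list of matching page URLs."""
    hits = small_index.get(query, [])
    if len(hits) >= min_results:
        return hits  # popular query: the small index suffices
    # Obscure query: widen the search to the full corpus, skipping
    # any pages the small index already returned.
    return hits + [p for p in large_index.get(query, []) if p not in hits]
```

A popular query like "sports" never touches the large index, while an obscure one triggers the wider, more expensive pass.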
To some degree, I see this separation as an expansion of the caching that all major search engines do. When you perform a search, the search engine may check its fast RAM to see if it has already answered that question, rather than going through the much slower process of checking the index stored on disk. That means it might miss a relevant page, but this doesn't seem to be a real worry in practice.
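That kind of caching can be sketched as a small in-memory table of recent answers, checked before the slower disk index. This is a minimal illustration, not any particular engine's implementation:

```python
from collections import OrderedDict

class QueryCache:
    """Tiny LRU cache: recent query results are kept in fast memory, so
    a repeated search skips the slow disk index entirely."""
    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, query):
        if query in self.store:
            self.store.move_to_end(query)   # mark as recently used
            return self.store[query]
        return None                          # miss: caller hits the disk index

    def put(self, query, results):
        self.store[query] = results
        self.store.move_to_end(query)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict the least recently used
```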
The use of multiple indexes isn't new. A search at AltaVista, for instance, pulls up results that come from a variety of sources, such as Ask Jeeves, RealNames, the Open Directory and its own web crawler results. However, Inktomi is the first major service I know of to break apart its web page index. I don't see this as bad, but Inktomi-competitor FAST argues that it may cause people to miss important documents.
"This is a subtle form of search engine censorship, as there could be gems hidden in the second part of the index that would have been listed on the first one or two page of search results, if only Inktomi had let you search them," said FAST CTO John Lervik, who says that his company's search engine always checks its entire index for every query. FAST currently has an index of 340 million pages.
Whether FAST will continue to be able to do that remains to be seen. FAST feels its technology allows it to cheaply process a large number of queries against a huge index of documents. In industry lingo, FAST says it is scalable: it can rise to meet demand. However, FAST comes nowhere close to handling the amount of traffic that Inktomi currently processes, so its scalability claims haven't faced the same stress test.
For Inktomi's part, it describes the latest release of its search product as a "scalable search index." The odd thing here is that Inktomi said its technology was scalable from the start a year ago, so what's different now?
"We always could have as big of index as our customers were willing to pay for. We always had the ability to scale. It just meant buying more machines," said Hall.
So the old system could scale, but it would have been expensive. In the new system, resources are allocated differently, making the cost of going big more affordable.
The real test will be by consumers. Yes, a gem could be hidden within Inktomi's secondary database, but I see this as unlikely for popular and general queries. In contrast, when performing more obscure searches, you should be able to tap into the much wider collection of documents.
We should be able to see for ourselves later this month. Inktomi says its different partners themselves will announce when they go online with the new index. That also means that unless an Inktomi-powered service specifically says it is using a larger collection, you should assume that it still taps into only the smaller database of 110 million. In fact, some partners may not even search against the entire 110 million page index of documents. How deep to go is a decision left to Inktomi's partners.
Inktomi says that it plans to refresh its larger index at least once every three months, if not more frequently. Pages within the smaller index will be refreshed anywhere from every hour to, at worst, every 21 days, depending on the particular page.
For webmasters, the best way to be listed in Inktomi remains to use HotBot's Add URL page, which feeds into the Inktomi system. Perhaps someday Inktomi will create an Add URL system within its own site, which would certainly make life easier for those trying to understand how to be listed.
Numbers, Numbers -- But What Do They Mean?
The Search Engine Update, March 3, 2000
Explains how Inktomi hopes that its smaller index can satisfy general queries by focusing on popular documents.
Search Engine Sizes
A look at the size of various search engines, based on reported numbers, along with links to past articles on size issues.
More Than Just Music Search
Until now, the MP3 revolution had pretty much passed me by. I'm based in the United Kingdom, where flat rate Internet calls have only just become possible. Given this, I had no desire or incentive to pay by the minute to download large music files from the web. Consequently, I hadn't been that interested in the fury over Napster, which lets you locate MP3 files.
If you've been in a cave like me, the ZDNet article below is an excellent introduction to Napster and why it has raised the ire of those in the music industry. It was after reading it that I realized Napster wasn't some music-playing software but instead a search tool. In fact, I realized that Napster was putting into action the concept of "distributed search," which has been discussed for many years but not implemented in a mass way until now.
Currently, the major search engines operate under what could be considered a centralized system. The crawler-based services send out spiders, which bring back information to a central index, which you can search. The human-powered services do the same thing, using editors to seek out sites from across the web, though they get ample assistance from webmasters who submit sites.
The problem with centralized search is that information can easily get out of date. It takes time to revisit every page in an index of 100 million pages or more. By the time you complete the refresh, it can be time to start over again. Moreover, it can be costly to maintain the hardware to gather up information from all over the web. Wouldn't it be better if web sites themselves transmitted the information they possessed?
That's the concept behind distributed search. Rather than operating from a centralized base, you distribute the load across many sites. In effect, they crawl themselves and report back to a system that allows searching across unified listings.
Napster was an ideal application for distributed search. Existing MP3 search engines have a tough time maintaining listings, because many of the sites posting MP3 files may not exist for long, especially if they post illegally copied songs. In contrast, Napster gets its listings from those running its software. If you are online, and running Napster, then your computer tells Napster what information you have (if you choose). That allows someone to search Napster's listings and be connected to where the song can be downloaded.
Gnutella is an open-source software package like Napster that allows you to locate MP3 files or other types of files across the web. It goes a step beyond Napster, in that it distributes both the query and the index. In other words, Napster takes in information from various sources, but you still use its central index to search. With Gnutella, your query is sent out, bounced around all the computers in the network, which in turn report back to you directly about any finds.
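The distinction can be sketched in a few lines: each node indexes only its own files, and a query is flooded to neighbors with a hop limit, with every node reporting its own matches. This is a toy model of the Gnutella idea, not its actual wire protocol:

```python
class Node:
    """Toy model of a Gnutella-style peer: each node indexes only its
    own files and forwards queries to its neighbors."""
    def __init__(self, name, files):
        self.name = name
        self.files = set(files)
        self.neighbors = []

    def search(self, term, ttl=3, seen=None):
        """Flood the query through the network, decrementing a hop limit
        (TTL) at each forward; every reached node reports its matches."""
        seen = seen if seen is not None else set()
        if self.name in seen or ttl < 0:
            return []                       # already visited, or out of hops
        seen.add(self.name)
        hits = [(self.name, f) for f in self.files if term in f]
        for peer in self.neighbors:
            hits += peer.search(term, ttl - 1, seen)
        return hits
```

There is no central index at all: the query itself travels peer to peer, and the hop limit is what keeps a search from swamping the whole network.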
The technology in Gnutella has now been incorporated into a new tool called InfraSearch, which promises fresher search results and suggests it will be able to provide better results than our traditional set of search engines.
These new tools are exciting and offer some significant benefits to search. But don't count out the traditional search engines yet. At their core, these new services depend on users for their information, and what every major search engine will tell you is that you can't trust users. If there is an advantage for someone to lie or mislead a search engine, they will do so.
For instance, Metallica, which has filed a lawsuit against Napster, could hinder the music search service by launching 5,000 instances of Napster and flooding the company's database with fake listings for Metallica songs. If the listings were degraded, then the popularity of Napster as an efficient MP3 search tool would diminish. Similarly, if some Napster-like text search tool began accepting listings from across the web, it wouldn't take long until webmasters began flooding it with spam.
Moreover, it is one thing to do distributed search in order to return the location of a song. There's no strong relevancy mechanism that needs to be created. It is quite another to do distributed search and also try to determine which are the most popular documents on the web. It is even harder to do fast, distributed search against the full-text of documents, rather than a few lines that describe a file in abstract.
So, while there's potential in distributed search, don't expect your favorite search engine to disappear overnight, nor even at all. Instead, it is likely we'll see more distributed search applications for particular queries where they make sense, MP3 being one of them. And those seeking MP3 files have every reason to examine Napster, Gnutella and the like, especially as the "traditional" MP3 search engines seem to be losing their value.
For example, I jumped into the MP3 world by purchasing an MP3 player last month, during a visit to the US. Naturally, I wanted songs to play right then and there, but all my CDs were back in the UK. Yes, MP3.com offered some surprisingly good selections, but I also wanted music I knew and loved. Thus, I went in search of illegal copies of music that I owned at home, justifying to myself that this was music that I could have converted legally, if I only had the CD with me.
(By the way, this is the key concept behind MP3.com's "My MP3.com" program, which has been attacked by the music industry. When at home, you put a CD into your computer, and then MP3.com understands that you own that CD. Now, if you are away from your CD, you can listen to that music via MP3.com from anywhere, because you've already "proved" that you own it).
I tried the two biggest MP3 search services that I've written about, AltaVista's and the one at Lycos. They failed me miserably. I generally couldn't find the songs I wanted. I got lots of false hits, and I wasted lots of time. Using some smaller, lesser-known MP3 search engines wasn't any better. In contrast, locating and downloading with Napster was much easier. As for Gnutella, I gave up on it. In order for it to work, you have to connect with another user's IP address to begin with. The site lists some addresses, but after six or seven attempts, I'd had enough.
Finally, a last, non-search comment on this whole MP3 mess, if you'll indulge me. Ultimately, the solution to the pirated MP3 search problem would seem to be making music available online cheaply, not chasing some pie-in-the-sky dream of copy protection or filing expensive lawsuits. I would have loved to go to a music industry-backed site, knowing that I could search for and FIND legal copies of songs, which I might then pay 50 cents or $1 each for. I'm not alone in this. It would be convenient, acceptable and ultimately a possible money-maker for the music industry. After all, they wouldn't have to ship me a physical CD, with packaging, marketing costs, etc. Sure, I could share that song with others -- but I can do that now. The only difference is that the music industry never gave me or others an easy way to buy it from them first.
Plans to launch its own file-sharing search engine on Monday. The company's technology is already demoed on SpinFrenzy (see below).
Like Napster, allows you to search for music located on other people's computers. Unlike Napster, you do your search at a web site, then use a small software applet for downloading.
The Noisy War Over Napster
Newsweek, June 5, 2000
Nice overview of the issues surrounding Napster and its software cousins.
Napster-like technology takes Web search to new level
News.com, May 31, 2000
More about InfraSearch. The article is (and InfraSearch themselves are) incorrect in stating that traditional search engines are limited to static content. They can index dynamic content. They just tend to avoid this because of problems that can be encountered with "spider traps," where they might index the same page over and over, because it appears with slightly different URLs.
News.com, May 15, 2000
This special report examines how technology like Napster may change the economics and distribution of entertainment and other content.
The Value of Gnutella and Freenet
WebReview, May 12, 2000
Examines how Napster-like software can be put to other, helpful uses.
Napster: Net market destabilizer?
ZDNet, May 5, 2000
Great introduction to the concept of Napster and how it could go beyond just music search. One flaw comes up in the example involving auctions. You cannot have a Napster-like auction tool unless the auction sites themselves distribute information. Given eBay's recent action against auction search site Bidder's Edge, that can't be taken for granted. Centralized sites that control their own listings can protect those listings, and they can't be co-opted into distributed search. Of course, people might choose to abandon conducting auctions at centralized sites like eBay in favor of using purely distributed tools. But one cannot discount the value-add that centralized players bring in creating a friendly, organized environment for their users.
Gnutella News & Links
Links to information and articles about Gnutella and similar programs.
Streamlining the Search for Music
Wired, May 30, 2000
A look at some new music search engines that are launching.
The Search Engine Report, Oct. 6, 1997
Infoseek holds a patent relating to distributed search, though it covers the technique of ranking results properly among different sources, rather than distributed search engines in general.
One of the oldest proposals for distributed search, in relation to web-based text search engines, can be found here. It was developed with the cooperation of several major search engines back in 1996, but has never been implemented.
Survey Reveals Search Habits
A study commissioned by RealNames has provided a variety of statistics related to searching, including:
+ Over 75 percent of web users use search engines to traverse the web.
+ Half of web users spend 70 percent or more of their online time searching.
+ 70 percent of those surveyed know specifically what they are looking for when they use a search engine.
+ 44 percent of web users say they are frustrated with web navigation and search engine use.
+ When unable to find what they are looking for, most web users will try another search engine, but nearly 20 percent completely give up.
The study was conducted in April by Berrier Associates, for RealNames. Over 1,000 people aged 18-49 were surveyed by phone, with a nearly equal number of men and women sampled. Only those who use the web for five or more hours per week were included in the survey.
RealNames Survey Press Release
Has a few non-search stats, such as time spent online.
Snap Promotes Sites
Snap has become the latest directory to say goodbye to the alphabet and reorder sites based on popularity. When you enter a particular category, you'll find sites are now ordered in terms of how popular they are. Snap determines this by using its GlobalBrain technology, which measures clickthrough. Snap measures which sites users click on when they search, and sites with high clickthrough get a ranking boost. The time a person spends viewing a site is also taken into account, though oddly, being an editor's pick doesn't help, Snap says. Clickthrough data has also been used for some time to help rank the results presented after a keyword search at Snap.
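A clickthrough-weighted ordering of this kind can be sketched as below. The field names and the 70/30 blend of clickthrough and viewing time are invented for illustration; Snap hasn't published how GlobalBrain actually combines its signals:

```python
def rank_sites(sites):
    """Order directory sites by a popularity score built from
    clickthrough rate and average viewing time. Each site is a dict;
    the 0.7/0.3 weighting is an arbitrary illustration, not Snap's."""
    def score(site):
        ctr = site["clicks"] / max(site["impressions"], 1)
        dwell = min(site["avg_seconds_viewed"] / 60.0, 1.0)  # cap at 1 minute
        return 0.7 * ctr + 0.3 * dwell
    return sorted(sites, key=score, reverse=True)
```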
In addition, Snap "promoted" about 50,000 sites from its secondary "LiveDirectory" to its main directory at the beginning of May. If it happened to your site, and you were the one who submitted it, you would have received a email message informing you of this. In addition, Snap says that every other week, more promotions will be made.
There are two good ways to check on whether a site has been promoted, if you are wondering and didn't get an email message. First, visit the LiveDirectory site when logged into Snap. There, any sites you've submitted will be listed. Click on a site, and on the page that comes up next, there will be a "Listed In" area. Sites in the main directory will say "Snap Main Directory." Those in the LiveDirectory will say "LiveDirectory."
By the way, when you search or view a listing within the directory, you'll also find a "Find more sites about" line followed by search terms. These are the most popular terms used by people who selected that site after doing a search, which makes them useful for keyword research. Try searching for some competitors, and see which terms they rank well for. Some of those generic terms may also be applicable for describing your own site.
Snap Unveils LiveDirectory
The Search Engine Update, Dec. 6, 1999
Describes how the LiveDirectory submission service works at Snap.
Moreover: News Lover's Delight
I love news, and Moreover is one of the most impressive news products I've come across in some time. It gathers headlines from over 1,500 sources, then groups them into nearly 300 newsfeeds. It's one-stop shopping for virtually any area of interest. Use the drop-down boxes on the left side of the screen to select a newsfeed you are interested in. Then, when the headlines appear, enter your email address and choose whether to receive those headlines daily or weekly. A much-needed search capability was also added recently -- use the search box on the upper left-hand side of the screen. Webmasters, you can also create custom newsfeeds in a wide variety of formats for your own site. It's great content, free for the taking. I'll also follow up in the near future about ways to suggest your site as part of a feed.
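If you do pull one of those custom newsfeeds into your own site, the work on your end amounts to fetching the feed and walking its headline entries. Here is a minimal sketch of that idea in Python; note that the element names and the sample XML below are hypothetical, for illustration only, and are not Moreover's actual feed schema.

```python
# Minimal sketch of parsing a headline newsfeed.
# The feed structure below is hypothetical, not Moreover's actual format.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<newsfeed>
  <article>
    <headline_text>Search engine index sizes grow again</headline_text>
    <url>http://example.com/story1</url>
    <source>Example Wire</source>
  </article>
  <article>
    <headline_text>New directory launches</headline_text>
    <url>http://example.com/story2</url>
    <source>Example Daily</source>
  </article>
</newsfeed>"""

def parse_headlines(xml_text):
    """Return a list of (headline, url, source) tuples from the feed."""
    root = ET.fromstring(xml_text)
    items = []
    for article in root.findall("article"):
        items.append((
            article.findtext("headline_text", ""),
            article.findtext("url", ""),
            article.findtext("source", ""),
        ))
    return items

# Render each headline as a simple link line for a web page.
for headline, url, source in parse_headlines(SAMPLE_FEED):
    print(f"{headline} ({source}) -> {url}")
```

In practice you would fetch the XML from the feed URL on a schedule and cache it, rather than embedding a sample string as done here.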
Reunion Search Engine Available
You can discover old high school friends through a site called Classmates.com. You have to provide some basic information, and then you are able to browse listings for high schools in the United States and Canada. I was amazed to see how many people I knew -- if only more had added bios of where they are now! In addition to viewing your graduation year, you can also browse other years to find underclassmen you may have known. The company says over 6 million people have registered. Active and retired military personnel will also find an additional section for them, after they've enrolled with their high school data.
WebBrain's Visual Directory
There have been various attempts to let you search graphically or display relationships between search terms visually, none of which I've ever found particularly useful. WebBrain is another approach to the matter that has just launched, and it seems to succeed better. At least, I had fun playing with it. When you come to the site, your browser window splits into two parts. At the top are category listings, such as News, Recreation and Sports. Select a link, and a range of subcategories will fly out of it. Then, rather than "drilling down" through categories, you really do fly through the relationships. Unfortunately, it's not always immediately intuitive why the various subcategories arrange themselves the way they do. However, I do like that you can see a lot more information at once, since categories are displayed in a horizontal format. Also, when you select a category, links are displayed in the bottom window. The information comes from the Open Directory's listings, and it looks as if you must have a Java-capable browser to use the site.
The company behind WebBrain also makes informational navigation tools for your desktop and web sites.
Need A WebBrain to Net Search
Wired, May 26, 2000
WebTop Offers Search Agent
Technology used to power EuroFerret, a long-time European-based service, has been employed in the still relatively new WebTop service. In addition to allowing web searches, the site also offers the WebCheck tool (formerly called k-check), which is an Alexa-like search and discovery tool. When using the web service, results may be drawn from across the web, from news sources (in partnership with Moreover), from company information, and from WAP-related content. I hope to take a longer look at the service and the WebCheck tool in the near future. In the meantime, consider trying out both.
You may recall hearing WebTop described as the Dialog-owned search service when it launched last December. EuroFerret was run by UK-based search technology firm Muscat. In 1999, Muscat was acquired by Dialog, known to many information professionals for its proprietary research databases. Last month, Dialog was acquired by the Thomson Corporation. Thomson has retained key products such as the Dialog search service and the Dialog brand name, while the Dialog corporation itself and other products such as WebTop were spun off into a new company, called Bright Star. So the connection with Dialog the company remains, though the connection with the Dialog search service does not, aside from some distribution and other agreements. As for EuroFerret itself, it has been replaced by WebTop.
Web Wombat Expands Listings
Australian-based Web Wombat announced in May that its global search engine (first URL) now has an index of over 100 million web pages, and the company says it plans to grow even larger. More intriguing are the specialty search engines it offers, such as those for cricket, Indian cooking and Pokémon. Web Wombat also offers a search engine focused on Australia and New Zealand (second URL).
Search Engine Articles
Ask Jeeves president resigns amid stock lows
News.com, May 31, 2000
Ask Jeeves president Ted Briscoe has resigned from the company.
Disney to Pay GoTo.com $21.5 Million in Settlement
InternetNews.com, May 26, 2000
GoTo wins millions in the logo dispute against Disney's Go.
Judge Says a Spider Is Trespassing on EBay
New York Times, May 26, 2000
A judge has decided that spidering of eBay by Bidder's Edge constitutes trespassing. Bidder's Edge plans to appeal the decision.
Quantum Leap in Searching
Wired, May 25, 2000
If someone builds a quantum computer, then there's a search algorithm just waiting to run on it that might allow billions of documents to be processed quickly.
Souped-up search engines
Nature, May 2000
A look at the state of current web search technology.
Online Web Search Issue
Online, May 2000
Coverage of specialized search engines, the future of search engines, upcoming search technologies and a piece I wrote that is an overview of search engine submission basics.
How do I unsubscribe?
+ Follow the instructions at the very end of this email.
How do I subscribe?
+ The Search Engine Update is only available to paid subscribers of the Search Engine Watch web site. If you are not a subscriber and somehow are receiving a copy of the newsletter, learn how to subscribe at: http://searchenginewatch.com/about/subscribe.html
How do I see past issues?
+ Follow the links at:
Is there an HTML version?
+ Yes, but not via email. View it online at:
How do I change my address?
+ Send a message to [email protected]
I need human help with my subscription!
+ Send a message to [email protected]. DO NOT send messages regarding list management or site subscription issues to Danny Sullivan. He does not deal with these directly.
I have feedback about an article!
+ I'd love to hear it. Use the form at
How do I advertise?
+ To advertise in this newsletter or any of Internet.com's other 100 newsletters, contact Frank Fazio, Director of Inside Sales, at (203) 662-2997 or via email at mailto:[email protected]
This newsletter is Copyright (c) internet.com corp., 2000