About The Update
The Search Engine Update is a twice-monthly update of search engine news. It is available only to those people who have subscribed to Search Engine Watch, http://searchenginewatch.com/.
Please note that long URLs may break into two lines in some mail readers. Cut and paste, should this occur.
In This Issue
+ General Notes
+ AltaVista Launches Paid Listings
+ AltaVista Paid Listings Q&A (online only)
+ Father Of AltaVista Resigns
+ FAST Aims For Largest Index
+ Lycos Transforms Into Directory
+ Submitting To Lycos
+ Search Engine Notes
+ Search Engine Articles
+ Subscribing/Unsubscribing Info
It’s been another busy month for search engine developments. That, coupled with travel last month, has kept me from making some planned updates within the site.
My top priority this month is to revise the individual pages for each search engine within the Subscribers-Only Area. In the meantime, a new downloadable version of the site should be available by the time you receive this newsletter. It will include all the latest articles on search engine developments, so you can have them for handy reference. I’ll then be diligently working to transfer this information to the search engine specific pages, to make it even easier to focus on submission-specific issues.
We’ve also introduced a new password finder, to make it easier for all of you who forget your passwords to access the Subscribers-Only Area. I’ve listed a direct link to this below.
Please note that because of the newsletter’s length this month, a Q&A about AltaVista’s paid listings is only available online. You’ll find a link to it at the end of the AltaVista article, below. It’s designed for those who want more information about purchasing placement.
Also please keep an eye on the site’s What’s New page. I plan to post new search engine rating stats, revise the Search Engine EKGs and most importantly, introduce a new link management program to better organize the growing number of external resources that I catalog. If you have sent me a site submission in the past — yes, I do have it. If I think it’s worth being in the site, you should see it appear by the end of this month. After that, the new program should help me update these resources much more quickly.
Finally, the Fifth Annual Tenagra Awards for Internet Marketing Excellence were announced on April 28, and I was deeply honored to have received one in the category of Individual Contribution to Internet Marketing. The Tenagra Awards are the Internet marketing industry’s oldest peer-review awards program. My thanks to the committee members and my sincere apologies that my travel schedule prevented me from attending the ceremony in New York at Web Advertising ’99. Other award winners were eBay, LEGO, Broadcast.com, ZDNet, The Industry Standard and the World Wide Web Consortium’s XML standard. More details can be found via the link below.
The Tenagra Awards
Search Engine News
So it finally happened. One of the major search engines, AltaVista, followed GoTo.com’s lead and introduced paid listings in April. Always a controversial idea, AltaVista did itself no favors through the inept way it launched the paid listings program. The program doesn’t actually affect AltaVista’s crawler-based results, but by moving forward with advertisers before explaining things to the public, AltaVista let dangerous misconceptions fester.
DoubleClick, which handles advertising placement on AltaVista, sent letters to existing banner advertisers on April 7 telling them of the new program. “When users perform keyword searches on AltaVista, what is the first listing they see? Now it can be your company’s listing,” the pitch began. A screenshot mockup also made it appear there would be no delineation between the paid and non-paid search results, although the paid results were labeled “Preferred Placement.”
Almost immediately, word got out on the web. Concern was voiced on several mailing lists and web forums, while AltaVista stayed quiet for over a week. Not until April 15 did the company finally issue a formal press release informing the public and its users of its plans. Its silence fed into the impression that something sneaky or underhanded was going on.
In contrast, when GoTo.com shifted over to paid listings (no, it did not always have them), the company made a big public splash of embracing the model, saying that it firmly believed that the marketplace would improve relevancy better than any algorithm. AltaVista’s making a somewhat similar pitch with its own program, but the delay in going public hurt its reputation. The company admits the program’s introduction should have been handled better.
“I can’t tell you how deeply I regret that, how deeply Rod [Schrock, AltaVista’s President] regrets that. We very badly managed the initial communication,” said AltaVista’s marketing director Celia Francis. “I never really saw what went out,” she said, referring to the DoubleClick pitch. “We should have checked what they sent out to their advertisers, because it wasn’t how we would have liked to describe the program. We hadn’t even finished the design.”
The “AltaVista Relevant Paid Placements” program is still evolving, but here are the current details. AltaVista will display up to two paid links per search term, which appear within a box above the normal top ten crawler-based results. Advertisers bid against each other through an auction process to be listed within this box, pledging to pay at least 25 cents to AltaVista for each time someone clicks on their link.
The actual amount will depend on the price reached at the end of the auction process. Over at GoTo, which has a similar system, bid prices can be over $1 in some categories. Only existing banner advertisers could bid in the trial program for placements through the end of May, but by late June, anyone should be able to purchase placement through an online ordering system.
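The auction mechanics described above can be sketched in a few lines of code. This is a hypothetical illustration, not AltaVista's actual system: it simply assumes advertisers submit cost-per-click bids, bids below the stated 25-cent floor are rejected, and the top two bidders for a term win the boxed placements. All advertiser names and bid amounts are invented.

```python
# Illustrative sketch of a paid-placement auction like the one described:
# advertisers bid a cost-per-click (minimum 25 cents), and the two highest
# bids for a search term win the paid slots. Not AltaVista's real system.

MIN_BID = 0.25  # the stated floor: 25 cents per click
MAX_SLOTS = 2   # up to two paid links per search term

def winning_placements(bids):
    """bids: dict mapping advertiser name -> cost-per-click bid.
    Returns the winning (advertiser, bid) pairs, highest bid first."""
    valid = {ad: b for ad, b in bids.items() if b >= MIN_BID}
    ranked = sorted(valid.items(), key=lambda item: item[1], reverse=True)
    return ranked[:MAX_SLOTS]

# Hypothetical bids on the term "insurance":
bids = {"acme-insurance.com": 1.10, "cheapquotes.com": 0.40, "lowball.com": 0.10}
print(winning_placements(bids))
# The 10-cent bid is excluded for falling below the 25-cent minimum.
```

The key design point is that the advertiser, not the search engine, sets the price, with the marketplace bidding placement costs up on popular terms, which is how bids at GoTo climb past $1 in some categories.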
As at GoTo, AltaVista insists that the paid links be relevant in some way to the search terms. To see some examples of the paid placements, do a search for “insurance,” “auto insurance,” “cars” or “flowers.” All will display paid listings that appear within a box labeled as “AV Relevant Paid Links.”
The paid links mimic the look of AltaVista’s normal crawler-based results, but the surrounding box keeps them physically distinct from these results. So why the outcry? It’s because at all of the major services, there’s an area of the search results page that’s considered “editorial” copy which shouldn’t be touched. When search engines say they don’t sell listings, they mean that they don’t tamper with this editorial area for money. The impression was that AltaVista intended to manipulate this editorial area for payment, and moreover, that it might not disclose this.
In reality, AltaVista’s paid links have no impact on its “hands-off” editorial area. Its crawler-based search results haven’t changed one bit because of the new program, except to move further down the page. In fact, all this concern over AltaVista “selling results” is generally overlooking the issue that there is no one set of “normal” results for AltaVista to sell.
AltaVista displays results that come from crawling the web, but it may also provide matching information from several other data sources: RealNames, AskJeeves, LookSmart and now, the paid listings database. Despite these multiple data sources, it’s the crawler-based results that many people fixate on as the ones that should never be sold or even tampered with.
That’s misguided, because people have been influencing these results and those at other crawler-based search engines for some time. Manipulating crawler-based search results is increasingly a sophisticated, industrial-strength affair. Any service that depends on automation for its primary listings is vulnerable to people who will try to outwit its ranking systems.
Search engines expend serious time and energy mainly on keeping out obvious and irrelevant spam. But this doesn’t “level” the playing field. Sites with great content continue to be poorly ranked due to design issues, while some relevant sites get higher placement over other relevant sites simply because their webmasters know more about search engine optimization. In short, “pure” crawler-based results don’t exist. Any automated system will have problems that discriminate against some sites, while simultaneously serving to attract those who wish to manipulate the system.
Because of this, it’s probably time to eliminate crawler-based results as the primary source of information presented to users, unless relevancy is greatly increased. For one thing, it would immediately reduce the incentive to spam. More importantly, users continue to do extremely broad searches, and crawler-based results often don’t provide the best answers to popular queries such as “games,” “jokes” or “horoscopes.”
Even an excellent link popularity system like Google’s may downgrade new sites that haven’t had a chance to gain good links, while Direct Hit’s popularity ratings are useless if the underlying crawling system can’t index pages from a particular site for some reason. (Applied to directory listings, however, the approach offers huge potential to bring the best sites to the top; watch for this coupling from LookSmart.)
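The new-site problem with link popularity can be shown with a toy example. This is emphatically not Google's actual algorithm; it simply ranks pages by raw inbound link counts to illustrate why a relevant page with few links trails an established one, regardless of content quality. The URLs and counts are invented.

```python
# Toy illustration of why link popularity can downgrade new sites.
# NOT Google's algorithm: it just counts inbound links, showing that a
# brand-new but equally relevant page ranks below an established one.

def popularity_rank(pages, inbound_links):
    """pages: list of URLs; inbound_links: dict URL -> inbound link count.
    Returns pages ordered from most-linked to least-linked."""
    return sorted(pages, key=lambda url: inbound_links.get(url, 0), reverse=True)

inbound = {"established-site.com/cars": 5000, "brand-new-site.com/cars": 3}
print(popularity_rank(list(inbound), inbound))
# The new site trails, even if its content is just as relevant.
```

A real system weights links by the quality of the linking page rather than counting them flat, but the underlying bias against sites that haven't yet accumulated links is the same.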
Human editors can and should be constantly reviewing query logs to provide us with high-quality choices for the most popular searches. Nor is there any reason these choices must necessarily take the form of a linear “top ten” list. Do a search for “beanie babies” at Snap and then click on the “Beanie Babies Shop” link. The page that appears offers a multitude of choices.
We’re seeing some moves in this direction, such as the changes Excite has implemented that I wrote about last month. The growth of directories has also helped, another topic I’ve written on. But I think the best example of humans becoming involved is at Ask Jeeves. Instead of looking for an automated solution, editors at that service have diligently built up a huge database of answers to questions.
Ironically, it’s AltaVista that has brought this high quality information to more users than ever before by rebranding it as “Ask AltaVista” on its own pages. When it launched last October, Ask AltaVista information appeared all the time. Now the data has been downplayed and appears less frequently. Instead of users worrying about the new paid listings, they ought to put their efforts into asking AltaVista to return the Ask AltaVista results to their former prominence.
Perhaps we’ll see this happen. AltaVista is planning a wide-ranging series of changes to be introduced in the next few months. “We actually have a very exciting push coming,” Francis said. “There’s some stuff we are working on that will be significant.” An enlargement of the crawler-based index seems likely to be one of those changes. I’m getting indications that this has already been increased to around 170 million pages.
Don’t forget that running a major search service costs money. If including clearly labeled paid links on results pages helps fund them, so be it. When Infoseek launched, its business plan was for people to pay for searches. That failed miserably. If no one is going to pay these services to search, then they must have a revenue stream to stay in business. Banner ads are one means, but they are expensive and offer poor clickthrough to advertisers. Retailing partnerships are another means, but they largely exclude small businesses. Paid links are a direct and easy concept to understand that offer advertisers of all sizes the ability to reach a precisely targeted audience.
Paid links can also benefit users, because there is often a correlation between advertising budget and quality. Not always, but often. We’ve had yellow pages in the non-web world for ages, and they’ve successfully served phone users. There’s no reason that search services can’t successfully incorporate some of the things we like about yellow pages into their search results, coupled with strong editorial content.
In fact, if we want to see AltaVista try something innovative, how about donating some paid link space to non-profit and educational sites? Let them have some bidding credits to use. There will be plenty of terms that paid advertisers won’t be targeting but which will be of interest to these types of sites. Use the paid links program to let them get to the top of the list, so that the world can find their sites more easily.
By the way, some might fear that a shift away from crawler-based listings might mean the end of little sites being found. This isn’t the case. Crawler-based results should and will continue to be used as a backup when a primary data source, such as editorially-compiled listings, can’t provide a match. There’s every reason for sites to continue to optimize their pages, and there’s every chance that even those that don’t will continue to get traffic. That’s because a shift away from primarily serving crawler-based results would tend to have the most impact on popular, single-word queries where small sites already can’t compete with the growing trend toward large-scale search engine optimization.
AltaVista Debuts Search Features
The Search Engine Report, Nov. 4, 1998
More about Ask AltaVista and the service’s use of RealNames and LookSmart information in its results.
AltaVista Unveils Increased Search Relevancy
AltaVista, April 15, 1999
AltaVista’s press release on the new paid listings.
Is AltaVista on the take?
Salon, April 22, 1999
An excellent look at the need for more relevancy from Salon editor Scott Rosenberg.
Alta Vista Invites Advertisers to Pay for Top Ranking
New York Times, April 15, 1999
The second paragraph is wrong in implying that the AltaVista search algorithm is now being sold, but overall, it’s an interesting look at opinions from a variety of people.
AltaVista Hazy on Sold Searches
Wired News, April 16, 1999
Wired was the first major media outlet to run a story on the changes at AltaVista, and this follow up article reflects the confusion over the program that even an advertiser had.
I-Search Discussion List
Opposition to the change was great initially, but I’d say opinions are now more balanced as more details have emerged. Click on any of the files ending in “AVspecial.htm” to see comments.
Online Advertising Discussion List
Much discussion occurred here about the AltaVista listings. The online archives are currently down, but there is a way to get archives via email.
Compaq commotion may stir up AltaVista
News.com, April 19, 1999
Some analysts think the changes at Compaq may cause AltaVista to be sold off. Note the comment from one industry analyst that “AltaVista is in a market that could only support two or three competitors in this space down the road.” We’ve been hearing this conventional wisdom since 1996, and three long Internet years later, we have more services, not fewer. The web is not a supermarket, with only room on the shelves for Coke and Pepsi. The web market may be capable of supporting multiple portal offerings, especially if they pursue different business models. I have no doubt that by the end of this year, we will once again have more players, rather than fewer.
GoTo Going Strong
The Search Engine Report, July 1, 1998
A few months after relaunching with paid listings, GoTo was thriving rather than becoming a pariah of the search world.
Search engine GoTo.com files for IPO
News.com, April 16, 1999
Speaking of GoTo, the search service plans to go public.
Pay For Placement?
A summary of articles that deal with paying for listings at the major search engines.
AltaVista Paid Listings Q&A
Ready to purchase paid listings on AltaVista? The program is still being developed, but I’ve written a Q&A that should help bring you up to speed on what to expect. Because of the newsletter’s size this month, the Q&A is available online only, via the link above.
AltaVista’s Chief Technical Officer, Louis Monier, has left the company to pursue non-search engine related opportunities. His departure is not connected to the recent Compaq resignations nor the introduction of paid links at AltaVista.
“I think it was time to go,” Monier said. “Four years on the same thing is a lot of time.”
Monier is very much the father of the AltaVista search service. The idea of building a large search engine came up during a lunchtime conversation between Monier and two other Digital employees in spring 1995. Monier then led the team that built AltaVista, which debuted in December of that year and set a new standard for indexing the web. Having indexed over 20 million documents, AltaVista was ten times as large as any existing search engine on the web at that time.
Since AltaVista’s debut, Monier has continued as the technical leader of the search engine. In particular, he’s orchestrated the push toward making the service useful to non-English speakers and those living outside the United States, especially with the introduction of a search-by-language feature that has now spread to several other services. He’s also been the push behind search features introduced at the service over the past year, such as Related Searches and the introduction of RealNames information.
As a figurehead, Monier’s departure is as significant as if Yahoo founders Jerry Yang or David Filo were to depart from their service. It’s a gap AltaVista already feels.
“One thing that’s for sure is that if Louis ever wants to come back, he can. We think the world of him,” said AltaVista’s marketing director Celia Francis.
Both Francis and Monier himself stressed that the introduction of paid links was not behind his departure. “It’s pure coincidence,” Monier said. In fact, Monier had actually wanted the introduction of some type of paid links system.
“I pushed for some sort of paid results, but they’re not like what you see today,” he said. “The implementation doesn’t quite match what I had in mind.” Explaining, he added, “A true auction, for the little people, instead of the big guys through an ad agency. They may evolve into this.”
AltaVista says it does intend to open the listings up to general advertisers by next month.
As for the future, Monier has hooked up with venture capital firm Kleiner Perkins Caufield & Byers and is excited about doing something other than search. He promised more details in the coming weeks.
“I have plans, but I’m not ready to reveal them at this time,” he said.
Kleiner Perkins Caufield & Byers
Northern Light fired the first shot in this year’s search engine size wars, saying it aimed to be biggest on the web back in January. Now the battle has been joined with the entry of Fast Search & Transfer into the web search marketplace. The Norway-based company is aiming to compete with Inktomi for providing backend search services to major portals and corporate sites, with size its key selling point.
To demonstrate its technology, FAST has launched a new site called alltheweb.com. It claims to have already indexed 80 million web pages and plans to hit 200 million by July of this year. If reached, that would be a significant milestone. No one is at that level now.
Currently, AltaVista has the largest claimed index of the web, at 150 million documents. Northern Light is close behind, at 140 million pages — but it says that if index size were audited, it would be the clear winner. Moreover, Northern Light says that by July, it will be at 225 million pages indexed.
“We plan to keep growing the database aggressively, indefinitely. We still believe it to be the largest on the Internet today,” said Marc Krellenstein, Northern Light’s Director of Engineering.
For its part, Inktomi says that it’s relevancy first, size second, on its list of priorities. Its index has been stable at 110 million pages for over a year.
“In terms of the database size, there is no limit to the size of the database we can put up with the Inktomi search engine,” said Dennis McEvoy, Inktomi’s Vice President of Development and Support. “We’re really driven by our customers. We’re finding that people really care about the relevance of the results they get back, so what we’re doing is really spending our time looking at what makes relevance better.”
Ideally, it would be nice to get both size and relevancy in the same package. That’s what FAST says it will do. “We want to throw the gauntlet down,” said David Burns, CEO of FAST’s US operations. “You can’t bring up good sites if you don’t have them.”
Ultimately, Burns says FAST wants to index the 500 to 600 million pages that Forrester Research estimates to exist on the web.
“Our real milestone is to get to 550. We’re publicly saying we’re going to build a search engine to cover the whole web.” As for when this milestone is reached, Burns is less specific. “It’s not measured in multiple years,” he said, indicating that closing a major deal would accelerate the schedule.
One of FAST’s key claims is that its system scales cheaply. It can add capacity inexpensively, using ordinary Pentium III workstations. In fact, its partnership with Dell is meant to help highlight this fact. Dell is supplying FAST with equipment at a discounted rate, in return for being featured within FAST’s search implementations. There’s also the possibility that the relationship will grow stronger.
“We know it’s going to move beyond that,” Burns said, referring to the existing customer-supplier relationship. Certainly Dell made a major showing of support in issuing a joint press release with FAST about the alltheweb.com site.
Of course, Inktomi also claims that it can scale with the growth of the web, using relatively inexpensive Sun Solaris hardware, rather than large systems like the Digital Alphas that power AltaVista. It’s not backing away from those claims in the face of FAST’s challenge.
I’m not going to drag out the facts and figures here for several reasons. I was briefed on the FAST announcement last week under embargo, which means I couldn’t reveal the company’s plans to others, including Inktomi, prior to this story. I was able to follow up with Inktomi on a variety of specific technical issues about its architecture, as I have done with them and other services in the past. I came away feeling that the claims and counter-claims aren’t something I can cover this month. They involve some highly technical arguments, and it’s somewhat premature to do so, as FAST isn’t yet larger than the existing indexes from Inktomi, AltaVista and Northern Light (it does currently exceed Excite, Lycos and Infoseek’s claimed numbers, which are all around the 50 to 60 million web page mark).
The hardware architecture is an interesting topic, and I do hope to revisit it in the future. But I don’t want to get lost in the numbers. You don’t need them to take FAST seriously. The company has already implemented two specialty search services for Lycos, including the impressive MP3 search engine. At the core of its technical staff are students and professors from the Norwegian University of Science and Technology, who studied search before coming to FAST. I’m sure they can build an advanced search engine, just as the people at Inktomi have already done.
The real proof will be in the final implementation, of course. There, FAST has a way to go. Now that it’s live, I spent a little time at the existing alltheweb.com site. I didn’t come away overly impressed with the relevancy, and there were definitely some negatives that I saw. A search for “england” is a good example. You can clearly see duplicate pages that should be resolved somehow, and introducing clustering would help ensure that pages from one particular site don’t dominate the top results, as also occurs in this example.
Having said this, these are early days. I’d expect the service to be refined over time. More importantly, alltheweb.com is not planned to be a challenger to existing portals. FAST wants to power other services, in the way Inktomi currently powers HotBot. It could even be that its partners apply their own relevancy and formatting tweaks like clustering, in the same way that HotBot uses Direct Hit to refine results initially drawn from Inktomi data.
Direct Hit also underscores the need for relevancy over size, something dealt with in the Northern Light article below. Yes, it is very important that crawler-based systems keep pace with the growth of the web. But Inktomi was pushed into the backseat at HotBot by Direct Hit, which promised better results, not a bigger index size. Given this, it’s no wonder that the company is more concerned about increasing relevancy than simply gathering up pages. That’s as it should be. Relevancy is what users are looking for. But if players like FAST and Northern Light can grow and return relevant results, all the better.
All The Web
Northern Light Claims Largest Index
The Search Engine Report, Feb. 2, 1999
This article deals with Northern Light’s claim to having the biggest index of the web, along with a detailed examination on why the other major search engines are aiming more toward increased relevancy.
In an unprecedented move, Lycos changed itself in April from a crawler-based search engine into a Yahoo-like directory of web sites. Never before has a major crawler-based service abandoned automation in preference to human categorization. Even more dramatic is the fact that Lycos is getting its directory data free, courtesy of competitor Netscape.
To understand the change, a little history is in order. Many readers will recall when I first wrote about NewHoo, a web directory using volunteer editors that launched in June 1998. Netscape acquired NewHoo in November 1998, and the company pledged at the time that anyone would be able to use information from the directory through an open license arrangement. In fact, NewHoo was renamed the Open Directory to underscore the fact that its information was free for reuse. Naturally, Netscape became the first company to do this, using the Open Directory to power a branded version within the Netscape web site.
Why give the Open Directory information away? One reason is that NewHoo had very much a “work for the web” attitude that inspired its volunteers. It’s harder to keep them inspired about working for free when the directory is owned by a commercial enterprise. By pledging to make the directory free to anyone, Netscape fueled the impression that volunteers were working for more than just Netscape.
Now it’s no longer just an impression. Over 10,000 volunteer editors are now powering the main results at Lycos, in an implementation that goes far beyond what Netscape itself has done with the Open Directory information. Three months ago, I quoted Netscape as saying they thought it unlikely that a competitor would use the Open Directory in this way. Now it’s happened.
“At the time, every other major portal had their own directory initiative. We didn’t think that they would scrap their own efforts to adopt the Open Directory, especially when we were one-quarter of the size we are now. As the Open Directory has grown, gained momentum, and surpassed other smaller directories, it’s become more attractive to other portal sites to adopt as a standard platform,” said Rich Skrenta, a founder of the Open Directory who now leads engineering on the project.
Unlike some other Netscape portal competitors, Lycos had the big advantage of risking little by making such a transition. Portions of its web index have remained out of date for months. General search engine users may not have noticed this when doing popular queries, because the service has made a point of getting information from selected “premiere sites” into its index on a regular basis and boosting these pages within the search results. But for very specific queries, Lycos had serious freshness problems.
The move to using the Open Directory information neatly pushes those spidering issues aside. Suddenly, Lycos is no longer to be measured against other crawler-based search engines such as AltaVista and Excite. Instead, it is more appropriate to compare it to Yahoo. Crawler-based results are still available, but they are very much secondary to the directory information.
Don’t forget that Yahoo remains the most popular search site on the web, standing well above its nearest competitors. Human categorization has given it more relevant results, and users have responded to this. While directories such as Snap and LookSmart are now growing in popularity, they’ve always lacked Yahoo’s first mover advantage. In contrast, Lycos is one of the web’s oldest search services. It will be interesting to see if the change to directory results coupled with its already strong brand identity will allow Lycos to close in on Yahoo’s popularity.
Lycos isn’t just taking the Open Directory and plastering its own logos all over it. The company has excluded some material, such as news archives and encyclopedia data that is not directly related to web sites. More importantly, Lycos is applying its own relevancy mechanisms to sift through Open Directory information and rank the results. Lycos editors are also working to merge Open Directory information with Lycos content.
“I think this is a bold move for us and a very positive one,” said Ron Gamble, Director of Directories and Web Guides at Lycos. “We have server tools on our end that really allow our editorial force to integrate high quality Lycos content, and that’s really the ‘value add’ to the users.”
Lycos is hoping to go even further in reshaping the Open Directory core into a product of its own creation by introducing a means to rank sites within the directory, instead of relying on the usual alphabetical listings. This should happen in the next few months.
Now for some specifics at Lycos. When you do a search, you’ll often be able to access information from up to four main sources: matching categories from the Lycos directory, matching web sites from the Lycos directory, matching articles from news wires and matching web pages that come from crawling the web. A navigational bar that appears near the top of the results pages lets you move between information sources and also identifies which type you are viewing.
For instance, when you do a search for a general topic such as “cars,” you’ll first be shown matching categories. The word “Categories” in the navigational bar will be highlighted using reverse text, and matching categories themselves come after this, such as “Recreation > Autos”. Up to 10 category listings are displayed per page, and you can use the “Next” button to advance to a new page of listings. You can also click on links within the navigational bar to instantly move to matches from one of the other information sources mentioned above, if these are available.
Continuing with our “cars” example, on the next page, the category listings come to an end, which means you are next shown matching web sites. Notice that the words “Web Sites” will be highlighted in the navigational bar above these results.
If you clicked “Next” enough times, you’d eventually come to news information, then finally to matching web pages from the Lycos spider. In reality, most people probably won’t find news information by paging through the results. It will either be buried below pages of matching web sites from the directory, or your query will be so specific that there won’t be any news items at all. If you want the news information, use the navigational bar to go straight to it.
There isn’t always information from all sources. For instance, a search for “tonsils” brings up no categories and only one matching web site from the directory. After this, it’s information from the Lycos spider that serves as a backstop, in the same way that Inktomi information serves as secondary results to Yahoo’s own directory listings.
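The fallback behavior just described can be sketched as a simple merge across ordered sources: directory categories come first, then directory web sites, then news, with the spider results serving as a backstop when the editorial sources run dry. This is an illustrative model of the behavior described in this article, not Lycos's actual code; the source names and sample data are my own.

```python
# Sketch of the multi-source fallback described above: results are drawn
# from sources in a fixed order, so crawler ("spider") matches only surface
# after directory and news matches are exhausted. Illustrative only.

SOURCE_ORDER = ["categories", "web_sites", "news", "spider"]

def merged_results(matches_by_source):
    """matches_by_source: dict source name -> list of matches.
    Returns (source, match) pairs in the order a user pages through them."""
    out = []
    for source in SOURCE_ORDER:
        for match in matches_by_source.get(source, []):
            out.append((source, match))
    return out

# A narrow query like "tonsils": one directory site, then spider backstop.
narrow = {"web_sites": ["tonsils-info.org"], "spider": ["page1.html", "page2.html"]}
print(merged_results(narrow))
```

The same shape describes Yahoo's arrangement with Inktomi: editorial listings first, with a crawler index quietly filling in whenever the human-compiled sources come up empty.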
There are two other types of information you may see within search results. “First and Fast” material appears at the very top of the first search results page, in response to popular queries. The links here are handpicked as especially relevant to the search topic, such as listing New York City’s official web site in response to “New York.” Lycos also features its own relevant content such as message boards or travel guides within this area.
“Featured On Lycos” is somewhat similar to “First and Fast” in that it features other areas within the Lycos network that might be relevant to your search. This information, if available, appears immediately above the matching web site information or on category pages.
Now let’s go back and look more closely at those different information sources, starting with the categories. Remember that car category I cited above? Assuming you click on it, you’ll be taken to a page that first displays a variety of subcategories all related to cars. Next, the “Featured On Lycos” section mentioned earlier is displayed. Below this, you’ll be shown matching web sites. This format changes slightly as you drill down into the directory. In general, more web sites will be listed rather than subcategories, and you may discover that newsgroups are also listed.
Down at the bottom of the category pages is one of my favorite things about the Open Directory project — the category editors are listed. You can click on a name and send an editor feedback directly. It’s a great benefit for those tired of the idea that editors are kept squirreled away, never to be contacted directly by the public.
When viewing web site pages from the directory, you’ll notice a little “Related pages” link that appears at the end of the description. Clicking on this takes you to the category where that site is listed. It’s an easy way to see if there are more sites similar to the one you are interested in, especially if your search initially fails to bring up any category matches.
While results from spidering the web have been downplayed, Lycos says it does not intend to cease its web crawling activities.
“The focus on [crawler-based] search definitely is a little bit gone. Nevertheless, we’ll have to continue to do a good job on that,” said Dave Andre, Vice President of Engineering at Lycos. He said the service has just updated its index to eliminate dead links and that it will be scaling up its size in the near future.
Lycos isn’t the only one to be making use of Open Directory information. Lycos-owned HotBot also features a branded version, which replaces the LookSmart-powered directory that HotBot used to feature (links to LookSmart content do remain at the bottom of HotBot’s directory pages).
To access the directory at HotBot, simply select a top-level topic of interest from the home page. Be aware that you aren’t shown all of the information available from the Open Directory at HotBot. Listings appear to have been culled to display only the best sites, but I’ll have to confirm this in a future issue.
Now that Lycos has made its move, it will be interesting to see if Netscape and its parent AOL decide to embrace the Open Directory as strongly. They have every incentive. Excite continues to power the main search results of Netscape Search and AOL NetFind, but that agreement is apparently being reviewed in the wake of the Excite-@Home deal. Ideally, AOL could use a non-competitive partner to take over for Excite. If Lycos can leverage the Open Directory so dramatically, certainly AOL could do the same.
Such a move would leave the Open Directory primarily powering three major search services, and that gives it real credentials for taking the crown away from Yahoo as the web’s semi-official phone book. Yahoo has held this position because of the sheer number of people who use it. Webmasters know it’s crucial that they be listed within the guide. But now, it’s entirely conceivable that webmasters could begin caring more about being listed within the Open Directory than at Yahoo.
Yahoo still has more than twice the listings of the Open Directory, over 1.2 million versus about 500,000. It also has the advantage of being staffed by professional editors who aren’t promoting their own business interests, something the Open Directory has had trouble with in the past. But the Open Directory’s editor selection process has tightened up, its listings are showing more breadth, and impassioned volunteers knowledgeable about their topics can build a valuable resource. Even assuming there are a few bad apples, with over 10,000 editors, the Open Directory far exceeds the editing power that Yahoo can apply with its 150 or so editors on staff.
Understanding Lycos Search Results
A simple, illustrated guide explaining the results you see on Lycos.
Lycos Directory FAQ
Explains some of the formatting you’ll see as you explore the directory, such as how extremely good sites are shown in bold text.
Lycos to turn ranking system over to users
Boston Globe, April 19, 1999
I like the part that says Netscape has had difficulty convincing other portals to adopt the Open Directory, as if it had tried to get Lycos to use Open Directory information rather than being surprised by the move. The best part is at the end, with the mention of how AOL is facing complaints from volunteers who now say they should get paid for their work. Could the same thing happen with the Open Directory in the future?
Changes in Search Industry Create Strange Bedfellows
New York Times, April 28, 1999
Has comments from other directory sites about the change at Lycos.
The Open Directory
See the service that’s powering Lycos and others.
About The Open Directory
Learn how to become an editor here. It’s not a free-for-all. Editors are interviewed online, and only 10 percent make the cut, I’m told.
Netscape Integrates Directory
The Search Engine Report, Feb. 2, 1999
A look at how Netscape is using the Open Directory information within its own portal. There are links to other articles I’ve written about the Open Directory at the end.
Dogpile Open Directory
The metasearch service Dogpile has also added a branded version of the Open Directory to its service.
Much information about the complaint against AOL, provided by some former AOL community leaders, including more details about the issue and plenty of links to news articles.
Forget everything you ever learned about submitting to Lycos, because the situation is now radically different. First, it is imperative that you get your site listed within the Open Directory, since that’s now powering the main listings at Lycos. However, it’s probably best to do this via Lycos rather than directly at the Open Directory itself, for the moment. This will make more sense after we go through the submission tips below.
To start, imagine the top search term you’d like to be found for. Now search for it at Lycos. Look over the top categories that are listed and find one appropriate to your site, preferably as high on the list as possible. This is the category you want to be listed in, because chances are, that’s the category that your target audience will be selecting.
Click on the category. Down at the bottom of the page, you’ll see a link called “Add Website.” Click on this link to submit. A simple form from the Open Directory will appear within a frame. It may look like you are submitting to Lycos, but in reality, you’ll be submitting directly to the Open Directory. This is fine. Give your site a title and a description. If you follow basic Yahoo guidelines to optimize your submission, you’ll be OK.
Unlike Yahoo, there’s no limit to the length of a description you can submit. Each Open Directory editor decides what’s acceptable. Since most existing descriptions are in the 25-word range allowed by Yahoo, it makes sense to use this as a general guide.
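If you draft descriptions in bulk, the 25-word guide above is easy to sanity-check before you submit. Below is a small illustrative sketch (my own, not an official Lycos or Open Directory tool; the function name and the 25-word limit are assumptions based on the guideline above) that counts the words in a proposed description:

```python
# Illustrative sketch: check a proposed site description against the
# roughly 25-word, Yahoo-style guideline discussed above.
# (Hypothetical helper; not part of any Lycos/Open Directory tool.)

WORD_LIMIT = 25  # assumed guideline, per the 25-word range mentioned above

def check_description(description, limit=WORD_LIMIT):
    """Return (within_limit, word_count) for a proposed description."""
    words = description.split()  # split on any whitespace
    return len(words) <= limit, len(words)

ok, count = check_description(
    "A twice-monthly guide to search engine news, submission tips, "
    "and ranking strategies for webmasters and marketers."
)
print(ok, count)
```

Nothing stops you from submitting a longer description, of course; this just flags ones an editor is likely to trim.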
Push submit to send your site into the submission queue. That’s it! You’ve now submitted to the Open Directory. I don’t have current turnaround times, but I will get these shortly along with tips on what to do if you don’t get listed quickly.
There’s absolutely no need to go to the actual Open Directory site to do a submission if you’ve followed the procedure I’ve described above from within Lycos. In fact, if you were to submit to the same category from both Lycos and the Open Directory, you’d have actually submitted twice. Once is enough.
The reason I suggest you do the process from within Lycos is because Lycos will list categories differently than the Open Directory. The same search in both places won’t yield the same results because Lycos is applying its own ranking algorithm to sift through the directory information. Since Lycos receives far more traffic than the Open Directory, it makes sense to optimize your submission for how Lycos ranks categories. That will help ensure you get the most traffic possible out of your submission.
Another difference between the Open Directory and Yahoo is that there’s no limit on the number of categories you can submit to. You don’t want to overdo it, and I’d strongly urge not trying to submit to more than ten different categories unless you have a very large site. But there’s no reason not to submit key sections of your site to places where they are relevant within the directory. Consider spreading out your submissions over time, mainly to avoid the appearance of spamming. Again, I’ll bring more tips on this in the near future.
What about the Lycos spider? Submitting to the Open Directory does not submit to the Lycos spider. Instead, you need to use the usual Lycos Add URL page. Likewise, using this page does not submit your site to the Open Directory. To be listed there, follow the procedure above.
Lycos is promising a number of things regarding spidering. First, now that the Open Directory transition has been completed, Lycos says it has gone back and added many pages that were submitted to the spider over the past months. It also says that new web page submissions should appear within three to six weeks. Moreover, Lycos is contemplating an instant add system, which may also be integrated with the Open Directory submission process.
I haven’t yet checked to verify these claims, but I will be doing so shortly. But remember, even if the spider is now falling short of promises, the transition to using Open Directory information means webmasters have a new and effective way to get their sites listed within Lycos.
How Yahoo Works
If you understand how to properly submit to Yahoo, you’ll be well-prepared to submit to the Open Directory. This page has tips.
Search Engine Notes
Changes Coming From Infoseek
Just a heads-up that later this week, Infoseek is expected to unveil several new search features, including highlighting search terms within the search results, a related searches prompter similar to that which appears at AltaVista, a larger index size and faster response times. Expect more details next month, after the public release.
iAtlas Launches Filtered Search Solution
iAtlas has launched a search engine that allows users to search the web and narrow down their results by using filters. For instance, you could choose to retrieve results that only come from businesses in a specific geographical region, or which serve a particular industry. iAtlas does this by merging offline databases with a web page database provided by Inktomi. I hope to review the service and bring more details on it next month. Meanwhile, give it a try via the URL below.
Microsoft To Partner With RealNames?
Is Microsoft about to get tighter with RealNames? That’s the rumor, and it’s one that I’ve also heard independently of the article below. If you haven’t registered a name, this is all the more reason to do so. See the second link for more information on registering.
Microsoft may resell RealNames
News.com, April 19, 1999
How RealNames Work
Keeping Up On XML
XML is being used by many companies for internal search products. An easy way to keep up on new developments is by subscribing to XML News, a mailing list produced by Aeneid Corporation. To subscribe, send a blank email to email@example.com
IBM has also launched xCentral, a search engine which exclusively looks for XML data on the Web. This includes XML web pages, style sheets, bulletin board postings and more. See the article below for more details.
IBM Announces XML Specific Search Engine
InternetNews.com, April 28, 1999
WebPosition Gold Adds New Search Engines
A new version of WebPosition Gold now supports ranking checks at MSN Search, a first for this class of software. It is also now checking results from AOL NetFind separately from Excite, which is good, as the results can vary slightly depending on the search. Another plus is that the program’s extensive FAQ and help files that deal with search engine submission and ranking improvement can now be downloaded for offline reading.
Search Engine Articles
Latest Acquisitions Show Amazon Aims To Redefine Retailing
Internet World, May 3, 1999
What does Amazon want with search utility maker Alexa? It bought the company last week, and even Alexa says it’s uncertain. Whit Andrews tries to puzzle it out for us.
Cheap Search Engine Promotion
Andover News, April 27, 1999
I’m always being asked about search engine submission software — is it worth using? A series of reviews is on my to-do list, but generally, there are few enough major search engines that I think most people are served by manual submission to them. Here’s someone from Andover News who has tried various packages. Get his take on what to use.
Infoseek execs make an exit
News.com, April 27, 1999
Several top executives have left Infoseek. Infoseek says it’s no big deal, while an analyst says otherwise.
Will Disney’s Go Network pay off?
News.com, April 22, 1999
Have Infoseek and Disney diluted their web brands by merging them into the Go Network? Yes! No! Read the conflicting views.
Portal Commerce Primacy to Wane
Jupiter, April 6, 1999
Online retailers aren’t so enamored with portal deals, according to Jupiter Communications. Its research shows that while portal deals do drive sales, only five percent of commerce executives say they are “highly likely to renew” their current agreements.
Below are sponsor messages that ran in this month’s issue of the Search Engine Report, which may be of interest to Search Engine Update readers.
Submitting alone is not enough to bring new traffic to your site. People must be able to find your site in the top 10 to 30 matches. Check out the award-winning new product WebPosition GOLD which optimizes pages to rank in the top 10, submits, reports your positions, tracks your traffic, and much more. ZD Net 5 Star rating!
Get your FREE download at:
Make your dynamic sites visible to search engines! XBuilder automatically converts dynamic pages written in cgi, perl, or ASP into static HTML. Search engines can then index ALL your pages to get you the rankings you deserve, while improving performance and delivering content faster. 30 Day Free Trial – http://www.signmeup.com/x.asp?id=2
LIVE Statistics 24/7 – MediaHouse Statistics Server 4.2
SS 4.2 is the only log analyzer that provides a full range of up-to-the-minute Live Stats reports. Up to 1 year of statistical history, click-through tracking, demographics reports, licensed for 500 Web sites. Visit our Live Demo. Judge for yourself.
FREE TRIAL – http://www.mediahouse.com/ad/ser.htm
EASIEST SOLUTION FOR INTRANET DOCUMENT PUBLISHING Build a maintenance-free intranet today! Your intranet documents will look just like your desktop documents. Try the product rated 9.85 by NetworkWorld & InfoWorld’s HOTTEST (98 & 99).
Try FREE today & enter to WIN a Digital Camera!
NEW TOP POSITIONING Search Engine Software -> FREE ONLINE DEMOS
CLOAKbot – Prevent cyber theft of top SE positions – HIDE HTML
Makes it easier to get top positions and keep them
SEAbot – analyzes SE ranking systems to design top pages
TRACKbot – automated daily position tracking
10% discount valid via this URL
How do I unsubscribe?
+ Send a message to firstname.lastname@example.org with the following as the first line of the body:
How do I subscribe?
The Search Engine Update is only available to paid subscribers of the Search Engine Watch web site. If you are not a subscriber and somehow are receiving a copy of the newsletter, learn how to subscribe at: http://searchenginewatch.com/about/subscribe.html
How do I see past issues?
Follow the links at:
Is there an HTML version?
Yes, but not via email. View it online at:
How do I change my address?
+ Send a message to email@example.com
I need human help with my subscription!
+ Send a message to firstname.lastname@example.org. DO NOT send messages regarding list management or site subscription issues to Danny Sullivan. He does not deal with these directly.
I have feedback about an article!
+ I’d love to hear it. Use the form at
This newsletter is Copyright (c) Internet.com LLC, 1999