About The Report
The Search Engine Report is the email companion to Search Engine Watch, http://searchenginewatch.com/. It keeps you informed of changes to the site and general search engine news.
The report has 19,000 subscribers. High-speed email list hosting services are provided by SparkNET, http://www.SparkLIST.com/. You may pass this newsletter on to others, as long as it is sent in its entirety.
Please note that long URLs may break into two lines in some mail readers. Please cut and paste, should this occur.
In This Issue
+ General Notes
+ Site Updates
+ A Bridge Page Too Far
+ Turning Users Into Members
+ Microsoft Planning New Start Page
+ Lycos Buys Tripod
+ Northern Light Expands Content, Attacked By Writers
+ Keeping Tabs On AltaVista
+ Excite Search Software Bug Found
+ Excite To Acquire MatchLogic
+ Search Engines Report Earnings
+ OneSeek MetaCrawler Offers Side-By-Side Results
+ Search Engine Articles
+ Search Engine Notes
By the time you receive this, the site should have moved to a new, faster web server. While the site continues to look the same, I have reorganized pages into subdirectories, and file names have changed from ending in .htm to .html.
These changes should be transparent to most of you. If you enter an old URL, you should automatically be forwarded to the new one. The changes may also make it easier for those of you who wish to read everything within certain sections of the site. Now all the Webmaster's Guide material is in one location, as is the material from the other sections.
If you should stumble across a bad link, please let me know via the site's feedback form. I'll work quickly to correct it.
Search Engine Profits and Losses
Shows net income, cumulative losses, and spending in different categories over the past three years.
Search Engines And Capitalization
Many people worry about whether they need to capitalize terms in meta tags, and this page answers those questions. In short, a survey I did found 80% to 90% of people search in lowercase -- so relax about it.
Search Engine Reviews Chart
See who's been winning the reviews done by various magazines and publications. A related page also links to actual reviews.
Search Engine Report Archives
A number of articles from the past few months of the Search Engine Report have been broken out into individual pages. The archives page lists these, making it easy to find something you may have missed.
Search Engine News
A Bridge Page Too Far?
One of the more popular bits of advice that have been going around in the past few months is the idea of submitting "bridge" pages or "entry" pages to search engines.
These are pages that have been created to do well for particular phrases. They are also known as portal pages, jump pages and by other names. Regardless of the name, they are easy to identify in that they have been designed primarily for search engines, not for human beings.
Bridge pages are not a new technique. Spammers have used them for ages, setting up hundreds of pages to draw in traffic. However, they've now become much more widespread. I knew they'd hit a new benchmark when I recently discovered State Farm Insurance employing them in a highly sophisticated manner.
State Farm is the largest home and auto insurance company in the United States. Most large companies of its stature are loath to do anything related to spamming. But if State Farm is using bridge pages, does that mean they are spamming the search engines? And if a big company can do it, does that mean bridge pages are legitimate?
A survey of the search engines finds that they do not consider bridge pages necessarily to be spam. It all comes down to how exactly those pages are used.
In order to provide guidance, it's important to understand the various ways bridge pages are employed, both technically and in terms of the content they deliver. I can't cover the technical aspects in this article, because it would make the newsletter too long. However, a new page within Search Engine Watch does provide this information. A link to it is below.
In this article, I'll cover how bridge pages are used and abused, to help you avoid trouble if you go this route.
State Farm makes a good case study. Technically, State Farm is delivering pages to the search engines based on their IP addresses. This means each search engine sees something tailored for them, while everyone else gets a generic page.
State Farm has a domain used just for its search engine submissions: statefarmins.com, as distinct from the statefarm.com domain. About 50 tailored pages were submitted from that domain to both AltaVista and HotBot, and 18 were submitted to Infoseek. Each page was designed for different topics, such as "auto insurance," "boat insurance," and "find an agent for auto insurance." Multiple pages were submitted in hopes of hitting the right combination. Anyone following a submitted page is automatically rerouted to the site's home page.
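As a rough illustration of the mechanics, IP delivery boils down to checking the requesting address against a list of known spider addresses and serving a page accordingly. The sketch below is mine, not State Farm's actual setup: the address prefixes and file names are hypothetical, and a real spider list would have to be collected and maintained by the site operator.

```python
# Minimal sketch of IP delivery. All address prefixes and file
# names below are hypothetical, for illustration only.

# Hypothetical spider address blocks mapped to engine names.
SPIDER_PREFIXES = {
    "204.123.": "altavista",   # assumed prefix, not a real block
    "209.1.":   "hotbot",      # assumed prefix, not a real block
}

def identify_spider(client_ip):
    """Return the spider name if the IP falls in a known block."""
    for prefix, name in SPIDER_PREFIXES.items():
        if client_ip.startswith(prefix):
            return name
    return None

def select_page(client_ip):
    """Pick which page to serve: tailored for a spider,
    generic for everyone else."""
    spider = identify_spider(client_ip)
    if spider:
        return "tailored-for-%s.html" % spider
    return "generic-home.html"
```

Each search engine's spider thus sees a page built for it, while a human visitor's browser, arriving from an unlisted address, gets the generic page.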
State Farm wouldn't confirm any of this for "proprietary" reasons, though they were happy for me to speculate. There's nothing really proprietary going on -- a URL search easily reveals the fingerprints of multiple submissions, and testing showed pages were being IP delivered.
State Farm did well for what are no doubt its top terms. They were page one or two ranked for "auto insurance" with Excite, Infoseek, HotBot and Lycos. They were also page one ranked with Infoseek for "life insurance," "home insurance," "car insurance," and "boat insurance." They had page one or two positions for several of these terms with HotBot and Lycos.
At this point, you might think, isn't this spamming? Can you submit all these pages, push the ranking mechanism and tailor pages just for the spiders without grief?
The answer was mixed. Generally, the search engines don't have a problem with what State Farm was doing, because people were not being deceived in the end.
Consider someone searching for "auto insurance." They see the State Farm link and click through. There's no doubt the site is relevant to the topic. The user isn't going to complain, and so the search engines don't have a problem.
"They're not having what some would consider a negative impact on our web results," said David Pritchard, HotBot's marketing director. "I don't think anyone would doubt they are relevant."
HotBot's main concern in this situation is when someone submits so many pages that they crowd other relevant results out of the top listings. Others reflected a similar view, and Infoseek's "More Results" clustering system is meant to help ensure that one site does not dominate the top results.
But what happens when you get away from relevancy? Imagine you are an insurance agent, and you know people are searching the web for information about Monica Lewinsky. So, you create a page optimized for her name, put a misleading description on it and submit it using your sophisticated IP detection system.
You're going to have problems. Why? It will quickly become obvious to visitors that your site has nothing to do with Monica Lewinsky. They'll complain to the search engines, which take a very, very dim view of this type of behavior.
"The main issue for me is whether or not the site is deceiving my users. I have a responsibility to deliver good data to my users. If there's a situation where my spider sees one thing and my user sees something else, I have a problem with that, because it makes me look bad," said Steve Schneider, who oversees AltaVista's spidering process.
It's similar with Excite: "Where we are most concerned is when a company is using it in the wrong way. The consumer walks away thinking, 'Oh, Hotbot or Excite had the wrong information.' We have to be concerned with how that reflects on the brand," said Excite search product manager Kris Carpenter.
Now let's muddy the water. What if you create a page of links about Monica Lewinsky and post that as a service within your site? Now you have legitimate content, so the search engines shouldn't complain, right?
Or, imagine you have a big web site that's completely database driven. The search engines can't crawl your site because of how the database implements URLs. To make up for this, you create lots of pages for different topics, all relevant to your site. You shouldn't have a problem, should you?
Those are just two of the many situations that are impossible to assess generally. If your pages are relevant, people may not complain, and the search engines may not bother with them. However, if you go to excess, submitting many pages, your behavior may be seen as suspect.
I haven't dealt with the issue of using traditional spamming techniques on bridge pages. That's a no-no, all the search engines agree. They do not want you stuffing keywords, using invisible text (depending on the search engine), or doing a number of other things they consider hostile.
Unfortunately (depending on your view), IP delivery makes it harder to spot spam. People who wonder why their own sites don't appear for a particular term are often the best spam police. They investigate top ranked sites and discover any irregularities. IP delivery is a crippling blow to this discovery process.
Imagine you wanted to come up well for auto insurance. You saw that State Farm was doing well, but because of IP delivery, you'd have no way to check their source code and see if they were spamming. The most you could do would be to message the search engine.
The search engine support techs are busy people. They get messages like this all the time. Chances are, they'll look at the page and see exactly what you saw, the "real" page and not the page that was delivered to the spider.
This is exactly what happened with Infoseek, when it examined the State Farm pages in response to this story. At first glance, the pages seemed fine. But when they went back and looked at what the spider actually saw, they unhesitatingly called the pages spam and removed them from the index.
"It looked OK on the surface," said Sue LaChance Porter, Infoseek's Director of Technology Products. "When I actually saw the page we had spidered, you could see that they were spamming."
The other search engines may also have been spammed. Some did not check the pages at all, and of those that did, I suspect they may not have retrieved the actual IP delivered pages.
As for State Farm, it readily admits to experimenting with submissions to get the best results, though it said it did not intend to spam anyone.
"Competitively speaking, it just behooves us to do the best we can," said Bob Reiner, Manager of Interactive Marketing. "We're certainly not trying to do spam," he added. "Spamming to us is a bad term. We don't want to get involved in that."
As an individual, you can't tell whether someone using IP delivery is getting past the spam filters. But you can look for clues that may help you guide the search engines to follow up and perhaps keep the playing field more level.
Start with Infoseek. Do the page title and description match what's on the page? If not, you can do what spam vigilantes do and resubmit the page. Wait a few minutes (sometimes up to a day), then do a URL search and see what's now listed. Are the page title and description still different? If so, then spider-specific delivery is occurring.
Take a look at some of the other search engines. Are you seeing the same URL being listed, but with a different title on each search engine? That's more evidence of specific delivery. Also do a search for all URLs from that site. This will quickly show you if someone has submitted multiple pages to test the ranking mechanism.
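The title-comparison check described above can be roughly automated: fetch the live page yourself and compare its title against the title shown in the listing. The sketch below is only an illustration under my own assumptions; the function names are hypothetical, and actually retrieving pages and listings is left to the reader.

```python
import re

def extract_title(html):
    """Pull the <title> text out of a fetched page."""
    match = re.search(r"<title[^>]*>(.*?)</title>", html,
                      re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else None

def looks_spider_specific(listed_title, live_html):
    """Flag a listing whose title differs from the live page's
    title -- one clue that a different page was delivered to
    the spider than to human visitors."""
    live_title = extract_title(live_html)
    if live_title is None:
        return True
    return listed_title.strip().lower() != live_title.lower()
```

A mismatch is only a clue, not proof of spamming -- the page may simply have changed since it was indexed -- which is why the resubmit-and-recheck step above matters.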
By now, you have a good idea that someone is being very sophisticated. That doesn't mean that they are spamming, but it does help you better advise the search engines. Tell them that you are seeing spider-specific pages being delivered and explain that you want to ensure they check that spamming techniques are not being used.
If you do this, you may help improve the odds that those hiding spam behind IP delivery may be caught. But don't waste their time. Be certain that this is happening and that it is worth them following up.
At this point, you may also be thinking that bridge pages are a way to go. They are certainly attractive for some reasons. Some search engines have different rules about what they allow, so tailoring content may be helpful. Some sites depend on CGI or other mechanisms that keep their content invisible, so having these bridge pages may be the only way to be represented.
So why not do it, or at least why keep it to a minimum? There are some good reasons:
First, it takes a lot of time. Imagine you create a topic-specific page for each search engine. That's six pages. Now imagine you want to cover five different topics. Now you are balancing 30 pages. Now imagine you are doing this for very popular topics. You may have to change your pages on a daily basis, to keep up with others aggressively attacking these terms.
Some people decide it's worth this incredible time investment, but the vast majority would benefit more from developing their sites with real content, building links and performing other types of Internet publicity.
Second, bridge pages tend to be focused around specific phrases. However, people search for information in many, many ways. You can't anticipate all the ways, but sites with good content can tap into a lot of it. This is because their pages are naturally loaded with relevant terms. With the right tuning of a page title and reinforcement with meta tags, these pages can capture those other terms.
That's a key lesson to keep in mind. You don't necessarily have to create entry pages. Every page in a web site is a possible entry point. Optimize those pages, make sure they are properly submitted, and you may be very pleased with the results.
A further reason is that you could inadvertently misstep. To avoid this entirely, check with the search engines before starting, if you feel your site needs bridge pages because of particular problems.
Some final notes. All of the search engines are looking at ways to apply relevancy not just to a page, but to a site. That means if someone searches for "Hawaiian Vacations," they'd like to determine which sites have substantial (and legitimate) content related to this topic.
Lycos search manager Rajive Mathur said that this type of move is "inevitable," for Lycos and for the other search engines. As they transition to this, those finding bridge pages to be effective (and not everyone does) may be looking at lost traffic.
"I'm sure they're looking at [bridge pages” and saying, 'But I've having such good success,'" said Excite's Kris Carpenter. "In three to six months, it may have the opposite effect."
What Is A Bridge Page?
Turning Users Into Members
Yahoo took another large step in its online service evolution by announcing an agreement last month with MCI to offer Internet access under the name "Yahoo Online." Meanwhile, Excite made a partnership with Prodigy. Both moves raise some questions about whether the users of these services will slowly begin to think of themselves as "members."
It's a subtle change, best illustrated in this way. Ask someone who subscribes to AOL, CompuServe or MSN if they are a member. Chances are, they'll say yes. Now ask someone if they are a Yahoo, Excite or Lycos member. Chances are, they'll scratch their head and look a bit puzzled.
No doubt, people consider themselves users of these search-oriented services. But the addition of free email, free web pages, chat and the other trappings of online services hasn't yet produced the attitude change from user to member.
Brett Bullington, Executive Vice President of Strategic and Business Development at Excite, says his service has dedicated users that it considers members, but he doesn't expect they think of themselves this way.
It's similar at Yahoo. Do users consider themselves Yahoo members? "I honestly don't know," said Diane Hunt, Director of Corporate Communications.
Does it make a difference? One advantage the online services have from members is that they pay cash. The Ad Age article below speculates that the Yahoo Online service could bring Yahoo up to $20 million in revenue, presumably per year. That's about a quarter of what Yahoo earned in 1997. For a service heavily dependent on advertising, subscription fees would be a nice alternative stream of revenue.
One of the things Yahoo Online subscribers will get is custom content. Take that a step further and envision that Yahoo offers some premium content for a small fee to the 2.5 million or more people that have signed up for Yahoo Mail, or even to the 26 million or more visitors it receives per month. If significant numbers of these people signed up, Yahoo could have substantial subscriber revenue in addition to its advertising earnings.
The change from being a user to a member also has some intangible benefits. It can help build the brand, reinforces loyalty and expands a base of people that bring in new members, says Excite's Bullington.
So will we see an effort to change users into members, perhaps using the offer of premium services as the prime lever? Yahoo and Excite both say they have no current plans to do this to the range of personal services they currently offer.
"It's advertising supported and always has been, and that's how we plan to keep it," Yahoo's Hunt said.
Excite's Bullington says the same and goes a bit further. He raises the point that Excite and Yahoo are heading toward the online service model from a different place than where AOL and MSN started. Access and subscriber fees were never part of their original game plans, and they can build and profit from a membership without these.
"AOL has had years to build their model," Bullington said. "For ourselves and Yahoo, we're both still understanding what it means for someone to be a member or subscriber," he said. "I think its is better for us to find free ways through sponsorships or promotion and get those [services” in the hands of the consumer."
He uses television as a model, where people have loyalty to programs, and producers turn that audience into revenue.
"You don't feel like you subscribe to a TV show," he said. Similarly, people may never feel they subscribe to Excite. But if they behave like subscribers, returning regularly and viewing the Excite programming, then Excite can benefit from them without charging directly.
Yahoo/MCI Internet deal raises competitive issues
Advertising Age, Jan. 1998
Good quotes from other search engines and others.
New Yahoo, MCI Service May Pose Challenge to AOL
Web Week, Jan. 19, 1998
Fighting To Keep Eyeballs
Web Week, Jan 19, 1998
Deals with the continuing goal of search services to keep their visitors. Of interest is the resurrection of the idea we'll be seeing fewer search engines in 1998, rather than more. This is exactly what was expected for 1997, and the reverse happened. We ended up with more major players.
Excite, Prodigy do content deal
News.com, Jan. 21, 1998
Excite and Prodigy Pair Up
Media Central, Jan. 22, 1998
When a prospect does a search for a keyword related to your products or services, do you appear in the top 10 or does your competition? Submitting alone does nothing to ensure good visibility. WebPosition is the first software product to monitor and analyze your search positions in the top search engines. WebPosition has been compared to similar "services" on the web and has been overwhelmingly voted the best and most accurate tool for search position management.
With WebPosition you'll know your exact positions for an unlimited number of keywords. You'll know if you drop in rank. You'll know when a search engine FINALLY indexes you. You'll know when you've been dropped from an engine. You'll even know how you rank in relation to your competition!
Try WebPosition yourself for FREE at:
Microsoft Planning New Start Page
Rumors are emerging about Microsoft's plan to launch a new default page for its browser users and others called Microsoft Start. Central to it may be the Inktomi-powered search engine.
What will be key here is to see whether Microsoft will make such a page exclusive to its properties or continue to provide direct searching links to the major search engines. While it would seem obvious to drop these links, the search services have considerable traffic that can be directed back toward Microsoft. Thus, partnerships and cross-promotion are likely to continue.
Redmond Web strategy turns again
News.com, Feb. 3, 1998
Lots of analysts making the usual guesswork over whether Microsoft will be successful, and if so, who may lose as a result.
Microsoft to Unite Web Sites
LA Times, Feb. 2, 1998
Lycos Buys Tripod
Last month, Yahoo acquired a stake in free homepage provider GeoCities. This month, Lycos mimics the move and picks up provider Tripod for $58 million in stock.
Again, there are more similarities. Lycos and Tripod are rated as some of the most popular services on the web, as are Yahoo and GeoCities. These partnerships produce mega-audiences and complementary services.
This story broke as the report was going out, so there wasn't time to follow up on details. A key aspect will be how this move affects the Lycos - GeoCities partnership.
Lycos currently allows people to search "Personal Homepages" using a drop-down box on its front page. This is actually a search of GeoCities free pages. In return, GeoCities points users to Lycos for web-wide searches.
Both companies said no changes were planned to this set-up after the Yahoo partnership was announced. However, now it seems unlikely that GeoCities will continue to be the Personal Homepages choice. Perhaps Lycos will provide homepage searching of GeoCities and Tripod pages, or perhaps Tripod will be the only service searched with this option.
There are also plans to ensure that important Tripod community areas relevant to searches will appear pre-listed, probably much as Lycos already suggests visiting some of its other related content.
Lycos buys Tripod service
News.com, Feb. 3, 1998
Search deals enhance services
News.com, Feb 3, 1998
Yet another article about search engines acquiring services to broaden their appeal, but there are some good quotes on the costs.
Lycos Adds New Features, Reorganizes Suggested Links
The Search Engine Report, Jan. 9, 1998
Describes how Lycos and other services are promoting their own content in their listings.
GeoCities Lands Partnership With Second Search Engine
The Search Engine Report, Jan. 9, 1998
Northern Light Expands Content, Attacked By Writers
Northern Light now has material from nearly 3,000 publications available within its "Special Collections" area. It added more than 1,000 publications to the area at the end of January.
The Special Collections area allows people to search through articles from publications such as the Atlantic Monthly, The Economist and Rolling Stone. Matches are displayed, and users pay $1 to $4 to read the full-text of a story. Northern Light also offers a web-wide search service, which is free.
The Special Collections area also came under fire in January from freelance writers, concerned that their material may be resold without permission.
Northern Light is not the only service to offer article reprints. However, it does allow free searching of the material it sells. That makes it an easy target for writers looking for misuse.
Northern Light buys the right to resell its Special Collections material mostly from third parties, and it says that any dispute is with these content aggregators. The writers and their organizations say Northern Light and others reselling articles without a writer's permission are liable. The articles below go into the issue in more depth.
Northern Light in Hot Water with Freelancers
Wired News, Jan. 26, 1998
Summarizes the grievance writers have raised with Northern Light over finding their articles resold without permission.
Northern Light Web Site May be Violating Your Copyright
National Writers Union
The NWU is threatening action against Northern Light. This page has links to letters it has sent the search service.
Is Your Name Up In Lights?
American Society of Journalists and Authors, Jan. 23, 1998
A good summary of how Northern Light and others acquire reprint rights and how to tell if you, as a freelancer, have a real complaint.
Keeping Tabs On AltaVista
A series of tabs have been springing up above AltaVista's search box over the past month or so. These provide easy access to new services that AltaVista has been adding. There are currently four tabs: Translations, Browse By Subject, People Search and Business Search.
Browse By Subject is the newcomer, launched on January 20. This leads to a co-branded version of the LookSmart directory, called AltaVista Subject Search. Users can click on topics of interest and work their way through menus to find appropriate web sites.
LookSmart continues to grow as a rival to Yahoo, in terms of being a directory of reviewed web sites. Editors examine submissions, then categorize them into one of over 16,000 subjects. There are currently over 250,000 sites listed, compared to Yahoo's 750,000.
Unlike Yahoo, browsing through LookSmart and its AltaVista-branded version does not bring up a page devoted to each topic. Instead, selecting a topic causes a submenu of topics to load. Eventually, a selection of web sites is displayed. A blue arrow at the end of each topic indicates there are more subtopics, while an image of a text document shows if the next click will bring up a page of results.
In comparison to Yahoo, you sometimes have to click once or twice more to arrive at an actual web site. For example, from the Yahoo home page, you could go to Computers & Internet > Hardware and find at least some actual web sites listed, as opposed to more subcategories.
In LookSmart, you have to go one more level in Computers & Internet > Hardware > Best of The Web before you reach actual sites. On the other hand, with the constant display of where you've come from, it's easy to move around into related areas.
The regular LookSmart service remains searchable, of course. Those entering through the home page can perform a search, with matches from LookSmart reviews listed first, then matches from AltaVista's listings.
One odd note. I found that sometimes if I had gone to the AltaVista branded version, there was no way to go directly to LookSmart. Entering http://www.looksmart.com would cause me to be redirected to http://www.looksmart.com/altavista.html. I could sometimes get around this by dropping the www, entering just http://looksmart.com.
AltaVista's other tabs lead to services launched in December. Translations brings up the stand-alone web page translation feature, while People Search and Business Search bring up these specialty search features offered in partnership with Switchboard.
AltaVista also added some new spotlight boxes below its search box on Feb. 3. These point out what's new with the service and other items relating to Digital and AltaVista.
AltaVista Subject Search
Excite Search Software Bug Found
A security bug was found in the Excite search software that is used by many webmasters to index their web sites. Those running Excite For Web Servers 1.1 should download a free patch to correct it. The bug affects both Unix and Windows NT operating systems. A new bug-free version 1.1.1 is also now available for Unix and soon to be released for Windows NT.
The bug does not affect the Excite web site. Visiting the site or doing searches does not cause a security problem for users. This is an issue only for webmasters running EWS 1.1. Version 1.0 is unaffected.
The bug was first reported on BugTraq in December, in a message that was also copied to an Excite administrative address. The message was overlooked, causing the company to scramble when it was alerted to the bug on Jan. 12 by Wired News.
Excite readily admitted to being embarrassed that the message slipped through the cracks and pledged that such a thing wouldn't happen again.
"That was definitely something we have dealt with in a significant fashion," said product manager Kris Carpenter. "We're going to reduce the complexity of communicating with us and make it absolutely clear, 'This is how to reach us.'"
The bug allows those knowledgeable about system administration to execute commands and read files via information relayed through the search box, but only on systems with lax security.
"It would require that the webmaster left the server open more than normal," said Carpenter. "The extent of the possible impact is in most cases going to be minimal," though she added, "We definitely are very concerned the impact it could have had on the web community."
The person who discovered the bug, Marc Merlin, agreed that the impact would be limited on a secure system, but he noted that many systems are left unprotected.
"It is true that the impact for very well maintained systems is minimal, but there are too many Unix machines that are vulnerable one way or another," Merlin said.
Excite gets no income from the software. It has always been free, though a few support contracts were once sold. These expired at the end of 1996. Since then, it has been offered completely unsupported, as a benefit to webmasters.
Excite Security Notice
The patch for version 1.1, the patched full-version 1.1.1, FAQs and information from Excite about the bug.
Excite Bug Discovered
Webpedia, Jan. 1998
More technical details about the bug, and how to patch it.
CGI security hole in EWS (Excite for Web Servers)
BugTraq Archives, Dec. 1997
The original bug report, with technical details
Excite Moves to Patch Search Software
Wired, Jan. 14, 1998
Excite bug opens Unix servers
News.com, Jan. 13, 1998
Excite Search Bug Threatens Web Sites
Wired, Jan. 12, 1998
Survey Finds Only 30% Of Sites Use Meta Tags
A survey has found that only about 30% of commercial web sites make use of meta tags. SiteMetrics used a spider to visit 25,000 sites and check for tag usage. SiteMetrics plans to conduct the survey quarterly.
Broken down by tag, the meta description tag was used by 28.9% of sites. It controls the description that appears for a web page in search engine listings. The meta keywords tag was used by 31.7% of sites. That tag allows authors to associate words and phrases with a web page, to assist in determining relevancy. Most major search engines support the tags.
SiteMetrics also classified meta tag usage by industry. It found that travel sites were most likely to make use of meta tags, with 39% of them employing the meta keywords tag. Utility web sites had the lowest usage, with only 22% making use of the meta keywords tag.
A similar survey done almost a year ago found 12% of web sites used the meta keywords tags and 11% used the meta description tag. Links to both surveys are below, along with a link to Search Engine Watch's page about meta tags.
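A check along the lines of what a survey spider performs can be sketched in a few lines: scan a page's HTML for meta tags and record whether the description and keywords tags are present. This is a simplified illustration under my own assumptions, not SiteMetrics' actual method.

```python
import re

def meta_tags_used(html):
    """Report which of the two surveyed meta tags a page carries.
    Returns a set drawn from {"description", "keywords"}."""
    found = set()
    # Examine each <meta ...> tag, wherever its attributes appear.
    for tag in re.findall(r"<meta\b[^>]*>", html, re.IGNORECASE):
        m = re.search(r'\bname\s*=\s*["\']?(\w+)', tag, re.IGNORECASE)
        if m and m.group(1).lower() in ("description", "keywords"):
            found.add(m.group(1).lower())
    return found
```

Run over a sample of site home pages, tallies of the returned sets would yield per-tag usage figures like those reported in the survey.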
SiteMetrics' Web Content Survey
Full results of SiteMetrics' meta tag survey, released in Jan. 1998.
Meta Attributes By Count
Statistics from a special robot run done in March 1997 to count the type of meta tags used. Produced by the author of "A Dictionary of HTML META Tags," http://vancouver-webpages.com/META/.
How To Use Meta Tags?
Explanation of how to use meta tags on your pages, along with related links to meta tag generators and other resources.
Excite To Acquire MatchLogic
Excite announced Jan. 15 that it would acquire MatchLogic in a stock swap valued at $89 million. MatchLogic provides banner ad server technology and other online advertising services.
Excite said it is making the move to better serve its own advertisers and to gain revenue by providing MatchLogic's services to others. MatchLogic will begin serving ads on Excite and WebCrawler in March.
Excite Buys Ad Tracking Firm in Stock Deal Worth $120M
Web Week, January 19, 1998
Search Engines Report Earnings
Excite, Infoseek and Yahoo have released their latest quarterly figures. Including Lycos figures posted last year, the latest numbers on net income are:
Yahoo: -$1.3 million
Infoseek: -$4.1 million
Excite: -$7.7 million
More details over time can be found on the Search Engine Profits and Losses page, within the site.
Search Engine Profits and Losses
Yahoo earnings beat expectations
News.com, Jan. 14, 1998
Excite shrinks losses
News.com, Jan. 22, 1998
Infoseek Falls On 4Q Loss
TechInvestor, Jan. 23, 1998
Infoseek posts loss, to sell stock
News.com, Jan. 23, 1998
OneSeek MetaCrawler Offers Side-By-Side Results
OneSeek, a metacrawler that displays the results from two or three search engines side-by-side, launched in January. Viewing results this way seems a bit clunky to me, but some people may like it. The site also offers a "WebChains" service that lets you move between related sites using a control panel.
Search Engine Articles
The Search Engines Search For Answers
Yahoo Internet Life, Feb. 1998
A comprehensive look at the various issues surrounding the desires of advertisers and the need for search engines to maintain impartial listings. Quotes and examples of retailer partnerships, positioning issues, and more.
Search engine shoot-out: top engines compared
Cnet, Feb. 1998
Cnet gives HotBot top honors, especially for its fresh index. Infoseek ranks second in a photo finish and gets a perfect 5 for accuracy. However, Infoseek may well be entitled to first place. The review gave it a low score for lacking advanced search capabilities, which Infoseek actually has. AltaVista also gets an honorable mention. WebCrawler gets locked out, lumped in with directories, though it has a comparable size to Open Text and is far more up-to-date.
Yahoo Email Scam Resurfaces
Wired, Jan. 13, 1998
Yahoo gets hit by another email scam. As an attempt to curb this kind of activity in their free email service, email addresses at Yahoo Mail can no longer contain the words "winner" or "contest."
Compaq weighs options for AltaVista
InfoWorld, Jan. 29, 1998
Compaq buys Digital -- now what's it going to do with AltaVista? It's too early to say, but the article covers some of the issues, briefly.
Search Engine Notes
Alexa Has New Release And Bug Fix
Alexa 1.3 is now available online. Alexa is a site discovery tool that works with your browser. For more information about it, see the review below.
There is also a fix for the Internet Explorer 4 bug affecting those running Alexa 1.2. The bug prevents opening a link into a new window. To receive the fix, send a message to firstname.lastname@example.org, if it is not posted on the site by the time you receive this.
Alexa Download Page
Alexa: Searching Serendipity And More
WebCrawler Drops Meta Support Temporarily
WebCrawler is no longer indexing meta tag information, as a result of its conversion to Excite's searching technology. However, meta tag support will return sometime in the future. Excite is also considering adding it to the Excite search engine.
This newsletter is only sent to those who have requested it. To unsubscribe, use the form at http://searchenginewatch.com/list/unsubscribe.html
To subscribe, use the form at http://searchenginewatch.com/list/list.htm
The contents of this report and excerpts of past reports can be found online at http://searchenginewatch.com/sereport/
If you enjoyed this newsletter, consider showing your support by becoming a subscriber of the Search Engine Watch web site. It doesn't cost much and provides you with some extra benefits. Details can be found at http://searchenginewatch.com/about/subscribe.html
This newsletter is Copyright (c) Mecklermedia, 1998