THE SEARCH ENGINE REPORT
April 4, 2000 - Number 41
About The Report
The Search Engine Report is a monthly newsletter that covers developments with search engines and changes to the Search Engine Watch web site, http://searchenginewatch.com/.
The report has 130,000 subscribers. You may pass this newsletter on to others, as long as it is sent in its entirety.
Did you know that there's a longer, more in-depth version of this newsletter? The twice-monthly "Search Engine Update" newsletter is just one of the many benefits available to Search Engine Watch "site subscribers." Learn more about the advantages to becoming a site subscriber at this page:
Please note that long URLs may break into two lines in some mail readers. Cut and paste, should this occur.
In This Issue
+ About The Search Engine Watch site
+ Search Engine Strategies Conference
+ Search Satisfaction And Behavior Results Released
+ Google Adds Directory
+ The Problems With Rating Services
+ The Vortals Are Coming! The Vortals Are Coming!
+ Making Search Sticky (offline only)
+ Search Engine Marketers Prefer Manual Submission to Auto-Submit Tools
+ Say No To "Next Results" With QuickBrowse
+ WebCrawler Voyeur Loses Second Sight
+ Crawler-Powered Job Site Launches
+ Go's Meta Search Software Updated
Search Engine Articles
+ The usual round up of interesting articles relating to search engines.
+ List Info (Subscribing/Unsubscribing)
I've reorganized the Search Engine News page and added a new daily news feed of search engine articles. This may also be made available in email form in the near future.
Search Engine News
The Search Assistance Features page has been updated with details of the new page clustering change at Google, and the Media Metrix Search Engine Ratings page has been updated. Also, a new "Search Engine Index" page provides a compilation of interesting search engine-related statistics. Links to both can be found via the What's New page:
The next Search Engine Strategies seminar is less than a month away. It will be held on April 27, in London, and conference details are online at the URL below. The one day conference will feature both experts on search engine marketing issues and panelists from search engines, including confirmed speakers from Inktomi, LookSmart, Netscape/The Open Directory, Search UK and Voila. I'll be presenting and moderating throughout the day, and the conference will also look at regional and language issues.
Search Engine Strategies London 2000
Also, for those doing forward planning, the next seminar in the US will be on August 14, in San Francisco. Conference details will be available shortly via the page below:
Search Satisfaction And Behavior Results Released
For the past three years, NPD New Media Services has conducted what I'd consider to be the most extensive survey of search engine user satisfaction available. Every three months, some users at the major search engines are selected randomly by NPD and asked about their visits. The results are fascinating, but they've not been available to the general public until now.
The survey is funded and conducted on behalf of the major search engines, and the results have been used for their own internal purposes. Occasionally, bits and pieces have been released, usually when a search engine performs well in a particular area. For instance, after the latest survey was finished in December, Google announced that 95 percent of its users find what they are looking for all or most of the time, making it number one for that category.
As you'd expect, search engines that rank poorly in some areas do not announce those findings. In fact, some search engines have specifically insisted that comprehensive results of the survey not be released to the general public. As a result, I've been unable to report on the standings.
That's a shame. Numerous search engine representatives have spoken to me about the desire to have some type of mechanism to rate them in terms of relevancy. That's an incredibly difficult task, as relevancy can change by person and by search query. Yet the NPD survey provides one such measure, and I think the search engines should all agree to an open release of key data. It may be embarrassing to those that rank poorly, yet it would also serve as an incentive for them to do better.
While I can't provide comprehensive search engine-specific figures at the moment, I am now able to provide a recap of how the industry as a whole is doing. The answer? Not as well as in the past. Overall, the success rate of finding what you are looking for most of the time or all of the time has dropped to 77 percent, the lowest point since the survey began in Summer 1997.
What's happened? According to Lisa Manuzza, Director of NPD New Media Services, the main culprit seems to be the emergence of more cluttered and complicated results pages.
"By leaps and bounds, what we hear is that there is too much on the pages," Manuzza said. "People are all confused. 'I just want to know what the results are,' they say."
Could other factors be behind the drop? For instance, if more new web users are being surveyed, they might be less sophisticated about searching, which could cause the success rate to plunge. But Manuzza said demographic factors like this don't appear to be to blame. Likewise, the addition of new search engines to the survey such as Google, Ask Jeeves and GoTo could potentially have caused a slump, if these services themselves earned bad scores. But that's not the case. In fact, these newer services have actually helped keep the overall success rate from slipping further, Manuzza said.
There is some good news. Preliminary results from the Winter 2000 survey are nearly complete, and they show a strong increase in the search success rate, bringing it to its highest point since Fall 1997. Expect a closer look at this change in the future, when the final results are available.
The survey also examines what people do when their searches fail. Nearly 80 percent of people try searching again in a different way at the first search engine they use, rather than trying the same search at a different search engine. This comes from a belief that all the search engines provide basically the same information. That's certainly not true, but the impression is strong enough to keep people from moving elsewhere.
"You'll even see that people are less likely to jump ship than in the past. 'I'm not going to get anything better anyway,' people have said," Manuzza explained. Another factor is the time investment users make, Manuzza said. They get comfortable with a service and understand how things are laid out, making them less inclined to leave.
That should be a warning to some search engines that are constantly changing how they operate. The more they tamper, the more likely they'll impact the comfort level of their users.
Another search behavior surveyed is how we pose our questions. While asking questions is the least popular method, it has seen a huge increase recently due to what I'd call the "Ask Jeeves" factor.
All the major search engines can accept natural language, question-style queries. However, Ask Jeeves explicitly encourages its users to do so. Many users were first exposed to this encouragement when Ask Jeeves began providing some results to AltaVista just over a year ago. Even more have been exposed as the Ask Jeeves site itself has rocketed higher in terms of popularity. The Ask Jeeves factor of encouraging questions now seems to have rubbed off everywhere.
"That was non-existent, a year ago. I didn't have any numbers for any of the sites," Manuzza said, about the rise in question asking. "AltaVista was the first to grow, and now, as more people are getting more comfortable with the question asking thing, this was the first year I was putting in numbers for practically every guide."
Multiple word queries are still the most popular method of seeking information, but they've lost some ground to single word searching. In the past, single word searching has been considered a poor strategy, since it doesn't give the search engine enough information about what the user is seeking. But now most search engines have "Related Searches" features. These allow you to enter a single word, then click on a list of multiple keyword suggestions in order to refine your search. This is one reason why the single word method seems to be growing in popularity. It allows users to be more comfortable knowing that they'll narrow in on what they are looking for, Manuzza said.
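The "Related Searches" mechanism described above can be sketched in a few lines. This is only a hypothetical illustration, not any search engine's actual implementation: it assumes the suggestions come from a log of past multi-word queries containing the single word the user typed, and both the query log and the function name are invented for the example.

```python
# Hypothetical sketch of a "Related Searches" feature: given a
# single-word query, suggest multi-word refinements drawn from a
# log of past queries that contain that word. The log is invented.

QUERY_LOG = [
    "windows", "windows replacement cost", "windows xp",
    "double glazed windows", "windows cleaning tips",
]

def related_searches(word, log, limit=3):
    """Return multi-word logged queries containing the word, in log order."""
    word = word.lower()
    hits = [q for q in log if word in q.lower().split() and len(q.split()) > 1]
    return hits[:limit]

print(related_searches("windows", QUERY_LOG))
```

A real service would presumably rank suggestions by frequency or click-through rather than log order, but the core idea is the same: turn one vague word into several more specific candidate queries.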
NPD Search and Portal Site Study
Charts and more details from the latest survey.
Google Adds Directory
Users continue to rave about the quality of Google's search results, and that's testimony that its link analysis system works well at delivering relevant information. Now Google is applying link analysis in a new way, to the human powered information of the Open Directory.
All I can say is hurray! I've been chanting for an end to the tyranny of the alphabet when it comes to directory listings for some time. Why should sites that begin with an A appear at the top of the list, oppressing better quality sites that may simply begin with a letter further down in the alphabet? In many cases, this offers no help to the searcher.
For instance, imagine you want to find a comparison shopping service. Over at Yahoo, "Shopping Agents" seems to be the most suitable category. In it, the widely-praised service mySimon appears 15th on the list, which is ranked in alphabetical order. The situation is worse when reviewing the Open Directory's "Price Comparisons" category, where mySimon appears 37th on the list.
The trouble here is that it already takes time for human editors to review and list sites. It takes even more time to rank them in order of quality, even though that's exactly what users would like. That's where Google's use of link analysis adds the perfect blend of technology to the human effort of the Open Directory.
For instance, that same Open Directory category mentioned above is now available in Google's edition of the Open Directory. It's basically the same core list of sites (the Open Directory list has a few more, as it is fresher), but Google has reranked the sites in the list so that those with link popularity come at the top. Result? mySimon rises to 5th on the list, which is appropriate for such a popular and well-regarded site.
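The reranking described above is easy to picture in miniature. This is a hedged sketch, not Google's actual system: the site names other than mySimon and all the popularity scores are invented, standing in for whatever link-analysis values Google computes.

```python
# Hypothetical sketch: rerank an alphabetically ordered directory
# category by a link-popularity score, as Google does with the
# Open Directory. All scores here are made up for illustration.

# Alphabetical listing, as a traditional directory would show it,
# paired with an invented popularity score (higher = more linked-to).
category = [
    ("AAA Price Finder", 0.12),
    ("Bargain Bot", 0.31),
    ("mySimon", 0.97),
    ("PriceScan", 0.88),
    ("Zebra Shopper", 0.05),
]

# Rerank so the most popular sites rise to the top of the list.
reranked = sorted(category, key=lambda site: site[1], reverse=True)

for name, score in reranked:
    print(name, score)
```

The human editors still decide what belongs in the category; the algorithm only decides the order, which is exactly the division of labor that makes the Google Directory interesting.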
How to make use of this new directory? One means is to search Google, just as you might ordinarily. As before, that will bring back matching pages of information based on Google's automated crawling of the web. The difference is that you may now get suggested "categories" of information at the top of the search results. Clicking on one of these will take you into the Google Directory. You'll be shown a list of sites for that category that were compiled by the volunteer editors of the Open Directory and ranked according to Google's link analysis system.
When you search, you'll also discover that some pages now have new "Category" links appearing below their listings. For example, in a search for "comparison shopping," mySimon is listed in the top results, along with a link to "Home > Consumers > Consumer Information > Price Comparisons." Selecting this link will take you to the area within the Google Directory where mySimon is listed. Naturally, you'll also find other sites in this category along with mySimon. This means that you can use the category links as a way to find sites similar to those listed in the main search results.
You can browse the Google Directory by selecting the "browse web pages" link underneath the search box, on the Google home page. Browsing the directory also gives you the ability to focus Google's crawler-based results toward any topic you are interested in. This is an incredibly powerful feature that's worth exploring.
Let's say you live in the United Kingdom (I do) and are interested in information about genetically modified foods, which is a hot topic in the UK. You search for "gm foods" on Google, and now you are shown a list of matching pages from all over the web. Great, but now let's say you want to narrow your results to sites specifically related to the UK. Go to the Google Directory, click on Regional, then Europe, then United Kingdom. Now search for "gm foods" from within the category (you'll see that the option is set to "Search only in United Kingdom"). The results which come back will be much more specific to the UK.
What's happening is that Google will return any matching sites for that topic which the editors have listed, plus it goes beyond and brings back any additional relevant pages from these sites that it has found from crawling the web. "You can look through the entire contents of those web sites," explained Google President Sergey Brin.
So by navigating to a specific area of Google, you are essentially turning it into a specialty search engine of your choice. The possibilities are endless. Go to Home > Houses, search for "windows," and you get pages and sites about windows in your home, not Microsoft Windows. Go to World > Deutsch and search for "star trek," and you get links about the series written in German or aimed at German-speakers (trivia tip: Dr. "Bones" McCoy is called "Pill" when Star Trek is translated into German).
In other news, Google has announced that it is planning a family filter and has new international editions in the works. The service also says it has introduced a page clustering feature, so that at most, only two pages per web site will appear in its top results. New Google "buttons" let you search Google or use its GoogleScout feature directly from within your browser.
Search Assistance Features
Explains more about page clustering and ways to find related pages, including using the GoogleScout feature.
Hate the alphabet? Go's volunteer directory also rates sites in order of popularity, with three stars being best.
Features a version of the Open Directory where results have been organized in terms of what people click on, using Direct Hit's technology.
Open Directory Price Comparisons
Google Price Comparisons
Need to comparison shop? This is a wonderful service, especially improved by the fairly recent and needed addition of a search box to the home page. You no longer need to navigate first to a category and then search.
Webbys Technical Achievement Nominees
Like Google? They've been nominated for a Webby -- you can vote for them here. So has AltaVista's translation service. Like another search engine? Write-ins are accepted, but don't get your hopes up.
Google: We're down with ODP
Salon, March 24, 2000
Another look at the Google-Open Directory partnership
The Problems With Rating Services
I'm often asked which are the web's most popular search engines. To answer, I mostly rely on figures from the major ratings services of Media Metrix and Nielsen//NetRatings. Their figures are far from perfect, however. In fact, it's rather alarming how often their figures are cited as proof of a site's popularity without much qualifying by journalists or analysts. Web ratings can be, and are, manipulated by web property owners to a degree that would never be possible with television ratings.
I don't mean to knock the numbers entirely. They are the best estimates we have and can be extremely useful. But as with any statistics, they can't be accepted at face value. Reporters and others who quote these figures to the public ought to be critically analyzing them, rather than just parroting back whatever top ten list appears in the latest press release they've been given. Since many of them are not, I thought it would be useful to go through the type of questions I have to consider when pondering the ratings for search engines. By doing so, I hope you'll be able to do your own questioning.
Let's start with Media Metrix, which is the web's oldest rating service. Media Metrix has a sample of at home and at work web surfers that it monitors, in order to estimate what we all are doing. Exactly how that sampling is done can itself affect whether the data is trustworthy. But let's set that aside. Just assume that the sampling is perfect, because there will still be enough problems remaining to take issue with the numbers.
Each month, Media Metrix releases data about top web properties and top web sites ("digital" properties not necessarily on the web are also measured). There's a world of difference between a "property" and a "site," yet it is not uncommon to see the terms used synonymously when the figures are reported. For instance, the San Francisco Chronicle article below talks about AltaVista changing its web site as part of an effort to "catch up" with more popular portal sites. The discussion is mostly about web sites, yet web property numbers are what's offered as proof that AltaVista is behind. As we'll see, if web site numbers had been used -- as they should have been -- AltaVista looks much better off.
"Web property" is defined as a collection of web sites owned by one company. For instance, Lycos owns several web sites, such as its Lycos flagship portal site, the HotBot search site and the Tripod home page building service. The combined traffic of all these and other Lycos-owned web sites make up the Lycos "web property" figures.
Let's compare that to television. A television network like NBC has many TV shows, such as ER, Third Watch and The Tonight Show. Ratings companies can estimate how many people tuned into the entire network during a given week versus estimating how many people watched a particular show. They are different numbers, and they both have their uses.
Web property numbers are like network numbers. They are especially helpful if you want to know how many people a web media owner can reach. Want to get your ad out in front of a lot of eyeballs? According to Media Metrix, Yahoo's network had 45 million of them in February. Lycos had 32 million, and AltaVista had 12 million.
Web site numbers are like television show numbers. They are useful if you want to know if a particular web site is popular among surfers, just as you might use television ratings to determine if a particular TV show is a hit with viewers. For example, I use web site numbers because I'm specifically interested in which search engines are popular. The Lycos web property number is useless, in this regard. It mixes in people who went to Lycos, HotBot, Tripod and other properties. Only the web site numbers will let you know if any of these sites is a hit.
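The property-vs-site distinction above comes down to simple set arithmetic. The sketch below is purely illustrative, using invented panelist IDs rather than real Media Metrix data: each site's audience is a set of panel members, a "web site" number counts one site's audience, and a "web property" number is the deduplicated union across all sites an owner controls.

```python
# Hypothetical sketch of the property-vs-site distinction, using
# made-up panel data. Each site's audience is a set of panelist IDs.

lycos_sites = {
    "lycos.com":  {1, 2, 3, 4, 5},
    "hotbot.com": {4, 5, 6, 7},
    "tripod.com": {7, 8, 9},
}

# Site numbers: the audience of each individual site.
site_reach = {site: len(audience) for site, audience in lycos_sites.items()}

# Property number: unique visitors across all owned sites combined.
property_audience = set().union(*lycos_sites.values())

print(site_reach)               # per-site audiences
print(len(property_audience))   # property-wide unique visitors
```

Note that the property figure is not simply the sum of the site figures, since panelists who visit more than one site are only counted once; folding one site's audience into another site's "site" figure, as described next, blurs exactly this distinction.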
Now comes the manipulation part. Media Metrix leaves it to the web property owners to decide what makes up their web site numbers. That's why in January 1999, the Go.com web site jumped from a 21 percent reach to a 34 percent reach among users surfing the web from both at home and at work. All Go sites, such as Disney and ESPN, were rolled up into the Go.com web site figure. In essence, the Go.com web site figure was simply allowed to be corrupted into a web property number.
More recently, the same thing happened with the Lycos web site figure. Since March 1999, the Lycos site reach among at home and at work surfers had been dropping from its high of 26 percent down to a low of around 22 percent. Then last December, it suddenly spiked to about 28 percent. It shot up further to about 39 percent in February.
What happened? I don't know exactly, but one major driver was the fact that Lycos decided to combine HotBot site figures with the Lycos figures, according to Media Metrix. These are two completely separate web sites, and it makes no sense to combine them. Nevertheless, this is allowed.
Can you imagine if the Fox television network was able to combine the ratings for shows like The Simpsons and Ally McBeal into the ratings for Titus, in order to show how it was more popular than NBC's Friends? No one would stand for it, yet this exact situation is being tolerated in the web ratings world.
It's crucial to know the hit shows on the web. As with television, hit shows can build or break a network. The combination of distinct sites into one site number can allow web media owners to potentially hide their bad shows. The Forbes article below talks about the success of Lycos, while the Chronicle article I mentioned generally looks down upon AltaVista. But the Lycos web site, as the network's flagship show, was on a downward trend until this latest change to how the Lycos web site numbers are calculated. AltaVista's flagship site was on a generally upward trend that threatened to surpass the Lycos site. The web property numbers don't reflect this. It's only when you look at the web site numbers that such patterns emerge.
Of course, even when you know the hit shows, you still don't necessarily know which are the hits for search. For instance, many people go to the Yahoo web site in order to access their email, and there's no way to break them out of the search-specific traffic. That's one reason you are beginning to hear some search engines quote the number of queries per day that they process. Those figures are a better measure of search-specific traffic. Numbers from StatMarket also provide another look at search engine popularity, based on the traffic they send to web sites.
For all its faults, at least Media Metrix does regularly cite both web property and web site numbers. In contrast, the monthly releases from NetRatings focus on only web property numbers. You can get web site numbers, but you have to make a special request. That's why the NetRatings page that I maintain within Search Engine Watch is so out of date. I'm still waiting for figures to be sent back to me from NetRatings.
Ideally, I'd like to see NetRatings release both web property and web site numbers regularly, as Media Metrix does. In fact, the new format Media Metrix introduced last month is ideal. It shows the web property figure, then a web site breakdown for that property (though not a complete breakdown). But both companies also need to ensure that they, not their clients, define what makes up a web site, and that those definitions align with what the public generally perceives those sites to be.
Ratings, Reviews and Tests
You'll find numbers for major search-related sites from Media Metrix, NetRatings and StatMarket here, plus links to each of those companies from these pages.
PC Data Online Reports
Eventually, I also plan to add a page about search engine popularity according to PC Data Online. The company isn't as well known as the ratings kings of Media Metrix and Nielsen//NetRatings. However, it offers an unparalleled view into the popularity of smaller sites. You aren't just limited to a top 50 list. Enter any site, and if it is in the top 10,000 web sites in terms of traffic (as estimated by PC Data), you'll be shown stats.
A new resource from Alexa that I've only just begun to explore. It offers the ability to look up traffic for any web site and even to see traffic "rolled up" within those sites. For instance, Lycos is listed as the 10th most popular site for February. By drilling down, you can see that Alexa is combining any Lycos-owned sites into this number, making it really a web property number. You can also discover that Alexa estimates that HotBot draws more page views than the Lycos main site.
AltaVista Switches Web Portal Into High Gear
San Francisco Chronicle, March 27, 2000
Yet another article with the usual analyst quotes of how AltaVista is trying to catch up in the portal game, struggling against being "late" to the market. As a site, AltaVista is closing in on Excite, which the same analysts would acknowledge as being one of the portal leaders. AltaVista would also probably be doing the same to Lycos, if HotBot numbers weren't merged into the Lycos total.
Bob Davis, Lycos' Savior
Forbes, March 30, 2000
Much focus here on the recent European spin-off by Lycos, plus mention of how Lycos has grown.
The Vortals Are Coming! The Vortals Are Coming!
Specialty search engines, topical search engines, vertical portals or "vortals" -- these are web sites which focus on particular topics and which especially allow you to search for information relating to those topics. Many more may be coming, thanks to the emergence of new companies and services catering to the specialty search market.
FindLaw is a classic example of a vertical portal. It focuses on cataloging resources relating to the law and legal issues. Within it, there's even the LawCrawler search engine that collects pages just from legal web sites. The "vertical" term comes out of the idea that these are places where instead of searching horizontally, or broadly across a range of topics, you search vertically within only a narrow band of interest.
While there are plenty of topic-specific web sites, it's harder to find ones that also offer specialty search engines for the topics they cover. The sites may have link lists, perhaps even offer Yahoo-style directory search, but it is unusual to see them additionally offer a crawler-based subject search engine.
PinPoint.com, BetterGetter, SearchButton and especially EoExchange (formerly Aeneid) may change this, as they are all targeting the vertical market. Give these companies a list of sites related to a particular subject, and they'll spider pages just from those web sites. The result should be a search engine that produces high quality results for the subject that it covers.
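The "spider pages just from those web sites" idea is the defining mechanism of a vertical search engine, and it boils down to a crawler with a domain filter. The sketch below is a minimal illustration under stated assumptions: the seed host list is invented, fetching and parsing are stubbed out via a caller-supplied function, and no real product's architecture is implied.

```python
# Hypothetical sketch of a vertical (topic-restricted) crawler:
# given a list of subject-related sites, follow only links whose
# host belongs to one of those sites. Fetching is stubbed out;
# the point is the domain filter.

from urllib.parse import urlparse

ALLOWED_HOSTS = {"findlaw.com", "law.cornell.edu"}  # invented seed list

def in_scope(url: str) -> bool:
    """True if the URL's host is one of the subject sites (or a subdomain)."""
    host = urlparse(url).hostname or ""
    return any(host == h or host.endswith("." + h) for h in ALLOWED_HOSTS)

def crawl(seeds, fetch_links):
    """Breadth-first crawl, keeping only in-scope pages.
    fetch_links(url) -> list of outbound URLs (supplied by the caller)."""
    seen, queue = set(), [u for u in seeds if in_scope(u)]
    while queue:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        queue.extend(u for u in fetch_links(url) if in_scope(u))
    return seen
```

Because everything outside the subject sites is discarded at the link-filter stage, the resulting index contains only topical pages, which is why results from such a crawler can be of higher quality for its niche than a general-purpose search engine's.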
Pinpoint.com is currently beta testing its search technology with six web sites and will offer its product broadly to customers in about a month. Pricing will start at US $10,000 and range upward into six figures.
BetterGetter is another service like PinPoint.com that's operating in beta mode, with an official launch planned for June.
Searchbutton.com, which specializes in offering site-specific search for web sites, announced earlier this year its "CommunitySearch" service. Searchbutton operates Electionsearch2000.org to demonstrate how CommunitySearch could be used by potential clients to build niche search engines. The service will be available to customers in the second quarter of this year, the company says.
EoExchange isn't new to vertical search. The company has been providing its "EoCenter" specialty search product for about a year, powering search at places such as Red Herring, American Banker, CMP Media, and Office.com.
"We've probably got over 100 different portals that we are deployed on right now," said Bob Ainsbury, EoExchange's chief technical officer. "In terms of specific focused search for specific industries, we're grandfathers in that list," he added, commenting on the entry of new companies targeting vortals.
The company has just added new page monitoring capabilities to its EoCenter product. It's a killer combination that benefits those doing regular research, and the "Making Search Sticky" article below takes a closer look at it.
Don't forget, the big players also remain interested in this market. AltaVista just released a new version of its search software for webmasters, which can be used to power vertical search. Inktomi will also create a specialty search solution for you, but the entry fee starts at about $250,000. This is why the moves by companies such as those I've named are exciting. They extend hope that custom search may become affordable for those running smaller web sites. The result may be that new, helpful search resources will mushroom across the web.
Making Search Sticky
The Search Engine Report, April 4, 2000
EoExchange has announced new page monitoring tools that can be integrated into its search results, a feature that offers real benefits to web searchers and perhaps suggests a new way for the major search sites to encourage "stickiness" among their own visitors.
From Searchbutton.com, this specialty search engine allows you to search for matching pages on any topic from within US presidential candidates' web sites.
Another company with a specialty search product among its offerings.
AltaVista Business Solutions
More about how companies can use AltaVista's technology for their own needs.
Describes Inktomi's search engine product line.
This software lets you build custom Yahoo-like directories for a low price. I use it to manage the Search Links area of Search Engine Watch. A new version was just released.
Specialty Search Engines
My list of specialty search engines in various categories. Also be sure to see the references in the "Invisible Web" area. They'll get you to even more specialty search services.
Interested in more about site-specific search and product reviews and evaluations? SearchTools is a must-visit site.
Search Engine Marketers Prefer
Manual Submission to Auto-Submit Tools
A survey of web site marketers has found that most prefer to submit manually to search engines rather than use auto-submission tools. Most marketers said that this was because they "know" manual submission is more effective. Most respondents also said that they optimize their web pages for crawler-based search engines and submit only "when necessary." Those who do use auto-submit tools preferred software packages over web-based services.
About 300 readers of the I-Search mailing list responded to the survey, which was conducted in February. Results were presented by list moderator Detlev Johnson, at the recent Search Engine Strategies conference, in New York. Full results can be found via the link below.
Search Engine Marketers Prefer Manual Submission to Auto-Submit Tools
SearchEngineWatch.com, March 20, 2000
Say No To "Next Results" With QuickBrowse
Want to get multiple pages of results from a search engine combined into one single page? QB-Search will quickly join up to 200 pages of listings from Yahoo, AltaVista, WebCrawler, Google or Go2Net's MetaCrawler. It's a simple, helpful tool that you'll no doubt need at some point. Another powerful offering is QB-Masterpage, which will combine URLs that you select into a single page for quick browsing. Both are available from QuickBrowse.
WebCrawler Voyeur Loses Second Sight
Sadly, one of the first services that let you see what people were searching for online, WebCrawler Voyeur, has closed. Excite@Home, which operates WebCrawler, says it might come back in the near future. If not, Magellan Voyeur pulls from the same stream of searches that WebCrawler Voyeur used but lacks the cool Java scrolling bar that WebCrawler had.
Magellan Search Voyeur
What People Search For
Find more live search places here.
Crawler-Powered Job Site Launches
This new site uses crawling technology to locate job postings from across the web.
Go's Meta Search Software Updated
The free meta search tool from Go was recently upgraded with related search functionality and now has comparison shopping and job search categories.
Search Engine Articles
Why leave your 'marks online?
Salon, March 28, 2000
A look at the growth of online bookmark companies, which failed to make a hit with writer Damien Cave.
Greybeard AltaVista aims to burnish image
Ad Age, March 2000
How AltaVista is trying to build its traffic with its first ever marketing campaign.
The Battle for Pocketbooks and Minds
Washington Post, March 24, 2000
A look at new and existing military portals.
Helping Webmasters Land in Search Engines' Nets
New York Times, March 23, 2000
Basic overview of the Search Engine Strategies conference that I hosted last month in New York. In particular, the article covers the challenge of bringing search engines and webmasters together. Plus, you can see me in a suit :)
Google Slices & Dices The Web Simply
Inter@ctive Week, March 23, 2000
Basic details on how Google works, plus a look at how the humans at Google are fed by a staff chef.
Interview With The Search Engine
FN Wire, March 2000
Humor site FN Wire interviews Ask Jeeves -- and I do mean Ask Jeeves -- not someone who works there.
GoHip Answers Its Critics
Wired News, March 15, 2000
More about the search service that has garnered complaints about modifying people's email, including a contact number for complaints.
Excite, iBeauty Lose Key Name Use Lawsuit
Newsbytes, March 13, 2000
Estee Lauder has won in its fight to prevent Excite from selling banner ads linked to its trademarks, at least in Germany. Cases remain pending in France and the United States, and the Germany ruling is open to appeal.
Terminix Abandons Lawsuit Against Free Speech on the Internet
Public Citizen, March 10, 2000
Terminix has given up in its meta tag-related lawsuit against a site complaining about the pest control company's work. This is a press release from the legal group that represented the protest site. I also volunteered time on behalf of the protest site.
Boom Town: 'Grumpy' won't say what's next for Yahoo, but scenarios abound
Wall St. Journal, March 6, 2000
Various guesses about business deals Yahoo might cut, with Yahoo itself making no comments other than nothing is being ruled out.
DoubleClick Stitches Together Search Sites
InternetNews.com, March 9, 2000
DoubleClick has developed a system to let you buy keyword-linked banner ads across all the search-related properties that it represents.
The Web: Growing by 2 Million Pages a Day
Industry Standard, Feb. 28, 2000
Compilation of interesting statistics on the growth of the web, from a variety of sources.
How do I unsubscribe?
+ Use the form at http://searchenginewatch.com/sereport/unsubscribe.html or follow the instructions at the very end of this email.
How do I see past issues?
+ Follow the links at http://searchenginewatch.com/sereport/
Is there an HTML version?
+ Yes, but not via email. View it online at
I didn't get Part 1 or 2. Can you resend it?
+ No, but you can view the entire issue online, via the link above.
How do I change my address?
+ Unsubscribe your old one, then subscribe the new one, using the links above.
I need human help with a list issue!
+ Write to [email protected]. DO NOT send messages regarding list management issues to Danny Sullivan. He does not deal with these.
I have feedback about an article!
+ I'd love to hear it. Use the form at http://searchenginewatch.com/about/contact.html.
How do I advertise?
+ To advertise in this newsletter or any of Internet.com's other 100 newsletters, contact Frank Fazio, Director of Inside Sales, at (203) 662-2997 or via email at [email protected].
This newsletter is Copyright (c) internet.com Corp, 2000