Search Engine Coverage Study Published

From The Search Engine Report
August 2, 1999
(a longer version is available to site subscribers)

In 1998, a landmark study published in Science magazine found that search engines fell well short of listing everything on the web. Last month, a follow-up to this study was published in Nature magazine. Once again, search engines were found to contain only a small percentage of the web pages available for indexing.

Both studies made for great headlines about how search engines seemingly fail users, but the number of pages a search engine has indexed has nothing to do with whether it is any good. Size alone doesn't matter.

Don't believe me? Go to your favorite search engine and look for a company, any company. Was the company's web site listed first? That's what many users want to have happen. This example is just one of many that illustrate the value of relevancy over sheer size. Improving relevancy means being a smarter search engine, not just a bigger one.

Even the study itself explains that bigger doesn't mean better: "There are diminishing returns to indexing all of the web, because most queries made to the search engines can be satisfied with a relatively small database," the study's authors write.

In fact, study coauthor Dr. Steve Lawrence isn't saying search engines are bad. Lawrence credits them for making more information available to the public than ever before, but he thinks they could do better.

"The amount of information that you can quickly and efficiently search for using search engines has been increasing, but what has been decreasing is the amount you can search for versus what you could potentially search for," Lawrence said.

So, having provided some balance to the headlines you may have read recently, let's examine the study's details in more depth.

Conducted by scientists at the NEC Research Institute, the study found that there were 800 million indexable pages and 180 million images on the web as of February 1999. "Indexable" means pages that weren't hidden behind password protection, excluded from indexing by robots.txt files, locked away in databases, or otherwise inaccessible to search engines.
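A site can exclude pages from indexing in a standard way by placing a robots.txt file at its top level. As a minimal illustration (the directory name here is made up), a file like this tells well-behaved spiders to stay out of one part of a site:

    User-agent: *
    Disallow: /private/

Pages under /private/ would then be off-limits to crawlers and would not count as "indexable" in the study's terms.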

The first study found that there were 320 million pages as of December 1997, so it sounds as if the web has more than doubled in size in just over a year. However, the two studies cannot be compared fairly because they used completely different methods of estimating the size of the web.

Previously, the size estimate was derived by comparing the overlap between results at different search engines to extrapolate an overall figure for the web. In the latest study, the researchers used a variety of techniques to determine how many web servers were publicly available (2.8 million) and the mean number of pages per server (289). Multiplying these two figures together is where the 800 million web page count comes from.
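The arithmetic behind that figure is simple enough to sketch in a few lines of Python. The variable names are mine; the numbers are the ones reported by the study:

    # Back-of-the-envelope version of the study's web size estimate
    public_web_servers = 2800000     # publicly available web servers found
    mean_pages_per_server = 289      # mean number of pages per server

    estimated_pages = public_web_servers * mean_pages_per_server
    print(estimated_pages)           # about 809 million, reported as roughly 800 million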

From here, the next step was to determine how much coverage of the web each search engine provided.* You need a known starting point, and Northern Light was selected. The researchers did a search that told them Northern Light had an index of 128 million web pages.** Then they divided Northern Light's index size by the entire web size to find that Northern Light covers 16 percent of the web.

Unfortunately, you can't get a count of index size for some of the other crawler-based search engines, so the researchers instead ran a series of test queries -- 1,050 in all.*** They totaled how many web pages were returned for each search engine, and that told them proportionally how big or small each service was compared to Northern Light (which coincidentally led the pack in terms of coverage). Here's the complete rundown:

Northern Light: 16%
AltaVista: 15.5%
Inktomi (Snap): 15.5%
Inktomi (HotBot): 11.3%
Inktomi (MSN Search): 8.5%
Infoseek: 8.0%
Google: 7.8%
Inktomi (Yahoo): 7.4%
Excite: 5.6%
Lycos: 2.5%
Euroseek: 2.2%
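To make the benchmarking process more concrete, here is a minimal Python sketch of the arithmetic. Only the 128 million page Northern Light index and the 800 million page web estimate come from the study; the per-engine result totals below are placeholders I've made up for illustration, and the real calculation also had to deal with duplicate results and other complications not shown here:

    # Step 1: coverage of the benchmark engine, whose index size is known.
    web_size = 800000000
    northern_light_index = 128000000
    nl_coverage = northern_light_index / web_size   # 0.16, i.e. 16 percent

    # Step 2: engines that won't report an index size are scaled against the
    # benchmark by how many results they returned across the test queries.
    # These totals are hypothetical, for illustration only.
    results_returned = {
        "Northern Light": 50000,
        "AltaVista": 48400,
        "Lycos": 7800,
    }

    for engine, total in results_returned.items():
        relative_size = total / results_returned["Northern Light"]
        coverage = relative_size * nl_coverage
        print(engine, round(coverage * 100, 1), "percent of the web")

Because everything is scaled off a single benchmark, any error in the Northern Light count or in the overall web estimate would shift all of the other percentages along with it.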

Inktomi provided search results to several services when the survey was done, which is why it appears multiple times. It powered primary results at MSN Search and secondary results at HotBot, Snap and Yahoo. The variation in coverage reflects what I and others have written about in the past -- not all Inktomi partners tap completely into its 110 million page index.

Lawrence said the percentages for HotBot and Lycos might be slightly higher than shown, because those services will only display one page per web site in their results. That could have caused an undercount, he said.

The big winner was Northern Light, which saw its traffic triple just after the study was published. That had to be satisfying to CEO David Seuss, who said earlier this year that winning the title of biggest search engine would make the still relatively little-known service more popular. Of course, whether those users will stick with Northern Light remains to be seen, and challenger FAST has just ousted Northern Light from the number one spot with its new 200 million web page index. Excite is also about to weigh in later this month with a new index in the 250 million page range -- expect more details in the next newsletter. Northern Light is in the 165 million page range, followed by AltaVista's 150 million pages. All sizes are unaudited, self-reported numbers.

Combined, the search engines were found to cover only 42 percent of the web, meaning that by consulting several search engines in turn, you would reach more of the web's pages than by relying on any single service alone. In comparison, the last study found a combined coverage of 60 percent. So are things getting worse? Probably, but not conclusively. You can't accurately compare these figures, because the methods of estimating the size of the web were so different for each study.

Lawrence agrees that direct comparisons won't be exact, but he feels the overall finding that coverage is dropping is basically correct. "It wouldn't change any of the conclusions, though it might have changed the magnitude," he said of the drop, assuming the same size method had been used in both studies.

The study also reported on the number of dead links at each service, which are an indication of how fresh each search engine's index is. The lower the percentage, the fresher the index. Overall results were:

Lycos: 14.0%
Northern Light: 9.8%
Google: 7.0%
AltaVista: 6.7%
Infoseek: 5.5%
Inktomi (Yahoo): 2.9%
Inktomi (Snap): 2.8%
Excite: 2.7%
Euroseek: 2.6%
Inktomi (MSN Search): 2.6%
Inktomi (HotBot): 2.2%
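The study doesn't spell out its procedure in this summary, but the general idea behind a dead link figure is easy to sketch: take a sample of the URLs an engine returns and count how many no longer load. A rough Python illustration, assuming you have already collected a list of result URLs, might look like this:

    import urllib.request

    def dead_link_rate(result_urls):
        # Fraction of sampled result URLs that no longer load
        dead = 0
        for url in result_urls:
            try:
                urllib.request.urlopen(url, timeout=10)
            except Exception:   # 404s, timeouts, unreachable servers and so on
                dead += 1
        return dead / len(result_urls)

    # rate = dead_link_rate(sampled_urls)   # sampled_urls gathered from an engine's results
    # print(round(rate * 100, 1), "percent dead links")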

The poor showing by Lycos reflects another fact I've previously reported: its index was woefully out of date earlier this year. The situation at Lycos has greatly improved since then. Moreover, primary results at Lycos now come from the Open Directory, not from the spidered index reviewed in the study.

The study also found that it was more likely that pages from popular and well-known sites would be indexed. That's in part a function of how search engines work -- spiders are more likely to visit a site if they keep coming across links to it -- and it is also a conscious decision on the part of most of the major search engines, made as a way to provide better listings and combat spam.
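As a loose illustration of the first point -- that spiders gravitate toward heavily linked sites -- here is a minimal Python sketch of a crawl queue ordered by how many links to each page have been seen so far. It is a toy model of the general idea, not a description of how any particular engine's spider actually works:

    import heapq

    def crawl_order(link_graph):
        # link_graph maps each known page to the pages it links to;
        # counting inbound links gives a crude popularity score.
        inbound = {}
        for page, outlinks in link_graph.items():
            inbound.setdefault(page, 0)
            for target in outlinks:
                inbound[target] = inbound.get(target, 0) + 1

        # heapq is a min-heap, so negate the counts to pop popular pages first
        frontier = [(-count, page) for page, count in inbound.items()]
        heapq.heapify(frontier)
        while frontier:
            negcount, page = heapq.heappop(frontier)
            yield page, -negcount

    # A page linked from two places gets visited before pages nothing points to:
    # list(crawl_order({"a.com": ["c.com"], "b.com": ["c.com"], "c.com": []}))
    # -> [("c.com", 2), ("a.com", 0), ("b.com", 0)]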

The study's authors worry that this trend means high-quality but "unpopular" pages may not list well. In my opinion, this is less of a concern. If anything, I feel the trend toward popularity measures means that general users will get more relevant and useful results in response to common queries, which is desperately needed. Meanwhile, those performing more refined and focused queries, such as research professionals, should still be able to locate plenty of useful information.

The solution to better serving the second group probably won't be to create a single, super-comprehensive 800 million web page search engine. Instead, it's likely to be the creation of new specialty services that do in-depth coverage of sites by topic.

Long overdue is probably an academic search engine. I've talked to several people recently who all wish there was a way to do a search across university web sites, with the knowledge that the search engine would have done a deep and frequent crawl of the sites on its list. Whether such a service would make money for its owners is another issue, of course.

But even as specialty services arise, there are definite advantages for general purpose search engines in enlarging their indexes and keeping pace with the web. It means they can provide more comprehensive coverage for those users who search for unique or obscure information, such as details about a rare disease. And users themselves should consider search engine size when selecting among services. Just be wary about using size alone to decide which service is best.

Coverage Of The Web

The chart below shows how the study's estimates varied from the reported sizes of search engines in February 1999.

[Chart: reported vs. estimated search engine index sizes, February 1999]

If the bars are close together, it suggests that the size reported by the search engine was indeed accurate.

If the "Reported" bar is higher than the "Estimated" bar, it suggests that the reported size for that search engine was inflated. It could also indicate that the research was not correct for that search engine.

Where the "Estimated" bar is higher than the "Reported" bar is higher, it suggests that the research may not have been correct for that search engine, or that the search engine itself underrepresented its index.

More Resources

Accessibility and Distribution of Information on the Web
http://www.wwwmetrics.com/

Produced by the study's authors, this site lets you request a copy of the study via email, and more details are also promised for the future.

Search Engine Sizes
http://searchenginewatch.com/reports/sizes.html

You'll find comparison charts and links to past articles that deal with size issues, including the previous Science study, all on this page.

Notes:

* The researchers actually did test queries first, then used Northern Light as a benchmark, but I've reversed the steps to more easily explain the process.

** The researchers did a Boolean NOT search for a term they knew was not in the Northern Light index. As a result, a listing of all pages indexed by Northern Light was displayed.
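For example, a query along the lines of

    NOT qzxwbv

where the term is a made-up word assumed not to appear anywhere in the index, matches every page Northern Light has indexed, and the result count displayed gives the index size. The term shown here is purely illustrative.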

*** The search engines do provide self-reported size estimates, but for some of them there is no way to audit the figure via a search, as was done with Northern Light.

