A special report from the Search Engine Strategies conference in Dallas, Texas, December 11th and 12th, 2002.
Today's search engines are experiencing déjà vu, it seems, focusing on developing better relevance in search results instead of trying to entertain users as "portals". It's a return to the heady R&D days, before the boom and the IPO craziness clouded everyone's goals, and it's good news for search engine users.
At the recent Search Engine Strategies conference in Dallas, three of the major engines, Inktomi, Yahoo, and AltaVista, discussed the methodologies their teams utilize to help enhance their value to the end user.
Relevance is difficult to determine on an automated basis, because the concept itself is subjective. "One person's good result is different than another's," said Jan Pedersen, Chief Scientist at AltaVista. "Relevancy depends on the task of the user, the context, etc."
Internet searches usually fall into one of four types of tasks, Pedersen said:
- Navigational; that is, "get me to a known site" (e.g. the query 'Yahoo.com'). Interestingly, this type of search accounts for about 20-25% of searches
- Informational; where a user desires a list of authoritative sites on a topic such as 'breast cancer'
- Transactional; looking to buy
- Question Answering; e.g. 'What's the area code for Dallas?'
Measuring relevance is expensive and slow because of the need for human intervention. All three panelists said that they utilize panels of human editors to help measure and fine-tune their search results, but their methods varied somewhat by engine.
For example, Inktomi's team merges a large and random selection of queries taken from throughout the day, from its own and several competitors' sites. Human judges first grade, then optionally rank the URLs in each set without looking at the titles and descriptions. The metrics are reported in terms of percentages of highly relevant results, percentages of "topN" (e.g. Top 10, Top 50), discounted cumulative gain, and dead links.
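Discounted cumulative gain rewards result sets that place the most relevant URLs nearest the top. The article doesn't give Inktomi's exact formulation, so here is a minimal sketch of one common form of the metric, with illustrative relevance grades:

```python
from math import log2

def dcg(relevance_scores):
    """Discounted cumulative gain: each graded relevance score is
    discounted by the log of its rank, so a relevant result counts
    for more when it appears higher in the list."""
    return sum(rel / log2(rank + 1)
               for rank, rel in enumerate(relevance_scores, start=1))

# The same three grades in two orders: putting the most relevant
# URL first yields the higher score.
good_order = dcg([3, 2, 0])  # ~4.26
poor_order = dcg([0, 2, 3])  # ~2.76
```

The log discount means that swapping results far down the list barely moves the score, which matches how judges (and users) weight the top of a results page.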
"You need hard data to make quick decisions," said Doug Cook, Director of Engineering at Inktomi.
Yahoo's editors utilize a slightly different approach in grading results, taking into account the "visual relevance" of the results, including the title and description.
"Titles and descriptions should be immediately readable, clean, and unbiased," said Srinija Srinivasan from Yahoo This approach, she said, works towards ensuring a sense of trustworthiness, credibility and quality, as well as the ultimate aim of relevance.
Danny Sullivan, who moderated the session, expressed some opinions about the state of relevance. "Mega search is bad," he said. "Big numbers don't mean relevancy," a reference to recent bragging wars between the engines as to the size of their databases. He indicated that anecdotal evidence may not determine overall success at achieving relevance. Rather, one should look at overall levels of satisfaction by users.
Sullivan referenced a study by NPD New Media Services that measured user satisfaction over several years. Over 33,000 participants were asked to score various search engines on how frequently they found what they were looking for. Satisfaction levels seemed to bottom out in 1999 after a steady decline, then jumped in early 2000 to an average success rate of 81% (success defined as "Information found Every Time" and "Most of the Time" combined).
According to NPD, Google ranked highest at the last survey period, with an overall score of 97%, followed by Overture (then called GoTo) with 82%.
Sullivan said that there has not been another survey completed by NPD since 2000, so no further information is available on recent changes in search engine relevancy. One hopes that in the meantime, the engines will continue their quest for perfect-score relevancy.
As Yahoo's Srinivasan said, "You can never be complacent".
Dana Todd is a founding partner and Principal of SiteLab International Inc. She is a frequent speaker on Internet marketing topics, including search engine strategies and link-buying.