Can you trust search engines to deliver only fair and objective results? Maybe not, say the authors of a study seeking to measure how bias creeps into search engine results.
If we ask a reference librarian a question, we take it for granted that we'll be offered unbiased help. Librarians know not only how to find information, but how to assess its quality and authority.
Most of us assume the same is true with search engines. Putting aside the issue of paid placement (aka advertising) links that appear in most search results these days, we want to believe that calculated or computed search results are fair, authoritative, and free from bias.
But are they really? After all, search engines are just computer programs, and the people who create these programs have opinions and biases of their own. Not only that, the fundamental concept of identifying the "best" results for a particular query inherently requires that some sources of information be given preferential treatment over other sources. We wouldn't be happy with search results without this filtering and culling.
Two computer scientists from the City College of New York have set out to define and measure bias in search engines. Initial results of their work have been published in the journal Information Processing & Management.
The authors conclude that search engines do in fact display bias, but qualify this conclusion by saying that more study is needed to fully understand why. They also conclude that bias tends to disappear when more search terms are used in queries.
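The paper's actual metric isn't reproduced here, but the general idea behind this kind of study, treating "bias" as how far one engine's results deviate from a norm pooled across many engines, can be sketched roughly. Everything below is illustrative: the function names, the toy data, and the choice of total variation distance are assumptions for the sake of the sketch, not the authors' method.

```python
from collections import Counter

def norm_distribution(result_lists):
    """Pool the top results from several engines into a 'norm':
    each URL weighted by how often the engines returned it.
    (Illustrative stand-in for the paper's norm, not its definition.)"""
    counts = Counter(url for results in result_lists for url in set(results))
    total = sum(counts.values())
    return {url: c / total for url, c in counts.items()}

def engine_distribution(results):
    """Uniform distribution over one engine's returned URLs."""
    return {url: 1 / len(set(results)) for url in set(results)}

def bias_score(results, norm):
    """Total variation distance between one engine's distribution and
    the pooled norm: 0 = matches the norm, 1 = maximal deviation."""
    p = engine_distribution(results)
    urls = set(norm) | set(p)
    return 0.5 * sum(abs(p.get(u, 0) - norm.get(u, 0)) for u in urls)

# Toy example: three engines' top-3 results for the same query.
engines = {
    "A": ["x.com", "y.com", "z.com"],
    "B": ["x.com", "y.com", "w.com"],
    "C": ["x.com", "q.com", "r.com"],
}
norm = norm_distribution(list(engines.values()))
for name, results in engines.items():
    print(name, round(bias_score(results, norm), 3))
```

In this toy run, engine C scores higher than A or B because it shares fewer results with the pooled consensus, which is the intuition behind measuring bias against a norm rather than against some absolute standard of "correct" results.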
The paper is highly academic and sodden with mathematics, so it's rough going for most of us. I asked search guru Genie Tyburski what she thought of the study and its conclusions. Genie is well known for her own work on assessing quality on the Internet, and is a law librarian, well versed in the legal implications of bias.
Genie writes: "The authors have a point, namely that the potential for bias in search results is greater today than ever before. On the other hand, no search technology, or for that matter, paper finding tool exists without bias.
"Consider the cataloger who decides the topics covered in a book, and then forces them into existing OCLC categories. Consider a book index and the terms chosen to pinpoint topics covered by the book. Given that no finding aid exists without bias, does less of it make a better search engine?
"I'd guess, not necessarily. In my experience working with lawyers and librarians, search engine queries typically fail when a) the searcher enters incorrect syntax, b) the searcher fails to consider availability of the information, or c) the searcher enters terms that are too common or have multiple meanings. Search engines really can't address "b," and "a" and "c" have little to do with bias.
"Search engines are finding tools, but people often expect them to yield answers. If education fails to address this issue, then bias assessments may be helpful. Imagine an annual Consumer Reports issue on bias in search engines! On the other hand, if searchers can't distinguish bogus or erroneous information from authentic authoritative information, a little bias thrown into the mix hardly matters."
The paper, "Assessing Bias in Search Engines" by Abbe Mowshowitz and Akira Kawaguchi, is well worth a read even if you skip over the industrial strength math. It raises some interesting issues to think about the next time you wonder if your favorite search engine is giving you the straight scoop.
Assessing Bias in Search Engines
by Abbe Mowshowitz and Akira Kawaguchi
This paper appeared in the journal Information Processing & Management. It's available for free in PDF format if you register at the publisher's web site. To access the article, use the link below for the registration form. Once you've completed the free registration, you'll be redirected to the table of contents for the March issue; the link to the article is at the bottom of the page. The other articles in this issue are also highly recommended.
Information Processing & Management
Volume 38 Issue 1, March 2002 - Free Sample Issue
Evaluating The Quality Of Information On The Internet
This excellent resource from search guru Genie Tyburski offers a checklist for discovering quality in Web-based information, commentary on technical trickery, examples of bogus Web sites, and resources for learning more.
NOTE: Article links often change. In case of a bad link, use the publication's search facility, which most have, and search for the headline.