Where can I... watch movies online?
What is there... to do in Boston?
How much... does lasik cost?
Why do... clothes cling in the dryer?
Start typing a query into Google, and sometimes you'll be surprised by what you find. Google has been finishing our sentences ever since Google Suggest (now known as Autocomplete) debuted in 2004 and later became a default part of the search experience.
As personalized search has grown in complexity, the appearance of these suggestions raises the question: just how good has Google become at anticipating our needs?
Actually, not that good. While it might be a little eerie to see a suggested query that mirrors exactly what we're thinking, we can take comfort in the fact that we're a dynamic species that constantly takes its cues from the world unfolding around us. Even if Google knew all our past queries, that alone wouldn't help it anticipate our future ones.
And yet, there would be a tremendous commercial opportunity if search engines could adapt to do just that. According to the Pew Internet & American Life Project, 82 percent of Internet traffic starts on a search engine. Does this suggest that the overwhelming majority of digital marketers' opportunities reside in search?
Hardly. The problem with search behavior: it's rarely as objective as we'd like it to be.
Far too much information about searchers' motivations and decision-making processes is lost in the few seconds a typical search query takes. One crystal-clear metric, however, does get to the heart of the matter: search query length.
In 2003, a team of researchers from Rutgers University published a study called "Query Length in Interactive Information Retrieval," in which they tested the relationship between query length and search effectiveness in interactive information retrieval (what the layperson calls "using Google"). They found that the length of search queries correlated positively with users' satisfaction with the quality of their search results. In other words, the more words you type into Google, the more likely you are to be happy with the results.
This study was one of the first indications that search engine users had evolved from the primitive 1990s into sophisticated information-gatherers, and that search engines were rising to the challenge. The two were speaking to each other in plain language, not some cryptic code.
A year after the Rutgers study, iProspect found that the percentage of four-word search referrals had tripled over the prior year, and for marketers, these longer phrases were converting at a higher rate than their shorter counterparts.
The downside of this evolution, from the search engines' perspective, is that it has begun to erode, not improve, their ability to anticipate search behavior. By 2008, 20 percent of all queries on Google were either brand new (keywords never queried before) or hadn't been queried in the previous six months. By 2010, that number had risen to 33 percent or more of all queries.
The implications of this trend are profound. Today, most search marketers trade on their ability to pore through reams of data at the outset of the keyword research process. Only with seed lists of eye-popping volume could a campaign ever have a reliable foundation.
The web's many keyword tools are the go-to resources for building a keyword seed list. But those databases tend to be built from existing search engine data -- information from, *ahem*, past queries.
Two years ago, this process could still give you a lock on up to 80 percent of the search opportunity. Now that number is just 67 percent.
What would happen if this pattern were to continue? It might not be long before the split approaches 50-50.
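To make that "might not be long" concrete, here is a back-of-the-envelope sketch. It assumes (purely for illustration) that the share of never-before-seen queries keeps growing linearly at the 2008-to-2010 pace cited above; the function name and the linear-growth assumption are mine, not a forecast from the article's sources.

```python
# Illustrative extrapolation, not a prediction: if the share of brand-new
# queries grew from 20% (2008) to 33% (2010), and that linear pace held,
# when would it cross 50%?

def year_share_reaches(target, y0=2008, s0=0.20, y1=2010, s1=0.33):
    """Linearly extrapolate the new-query share using the article's figures."""
    rate = (s1 - s0) / (y1 - y0)  # roughly 6.5 percentage points per year
    return y0 + (target - s0) / rate

print(round(year_share_reaches(0.50), 1))  # ~2012.6 under this naive assumption
```

Even this crude arithmetic suggests the 50-50 point would arrive within a couple of years if the trend held, which is why the keyword databases built on past queries look increasingly shaky.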
Factor in the Pew statistic (82 percent of all Internet traffic begins on a search engine), and the plot thickens.
The deduction: if search engines remain the dominant gateway to traffic across the web (i.e., the trend reflected in the Pew statistic holds up over time), traditional search marketers are in for a shock.
Search marketing experts such as Aaron Wall and Stoney deGeyter have always urged keyword researchers to extend beyond just what they can get from the tools. They recommend scouring forums and blog comments, plowing through the thesaurus, even interviewing customers to generate an augmented seed list around which to build a campaign.
It's an arduous process that increases an already daunting workload. So it's no surprise that agencies and consultants looking to turn a profit in SEM might cut corners at this point in the process.
The next wave of winners in the search game will be the ones who find efficient ways to marry raw data from consumer research with sound campaign design and management methodology. Search is as much about customer service as it is about marketing -- and both essential business functions revolve around the customers' intent.
Heading into 2011 and beyond, being found will be just as important to brand integrity as being searched.
If you don't speak the consumer's language, how will they ever find you?