One of the highlights of SES NY took place on the final day. It's a shame really, because so many people left before the last day and missed the morning keynote speech by Andrew Tomkins, Yahoo's chief scientist for search.
Tomkins's session was refreshing because it offered a great deal of genuinely new material, and there are deep implications to what he said. For some excellent coverage of the keynote, see Lisa Barone's blog post on the Bruce Clay Blog.
Before I put my own spin on Tomkins's presentation, let's start by reviewing a little history.
How Meta Keywords Died
Long ago and far away (well 10 years ago), one of the primary ranking factors used by the search engines was the data in the keyword meta tag on the Web page. However, in about 1998, the first tidal wave of spam hit.
People started stuffing keywords in volume into their meta tags. These keywords often weren't even related to the content on the page. A porn site could, for example, rank for all kinds of terms, and get traffic from people who were really looking to buy blue widgets.
This is the definition of a horrible search experience. You click on a search result looking for information on blue widgets, and you end up at a porn site. Ouch.
This tactic was successful because the user didn't see the keywords on the Web page itself. A similar trick with the same properties was the stuffing of keywords into hidden text, or text at the bottom of the page. Since these things generally aren't visible, the on-page experience wasn't interrupted by the keyword stuffing activity.
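For illustration, those tactics looked something like this in the page source (the keywords and styles here are invented):

```html
<!-- Keyword-stuffed meta tag: the engine saw these terms, the visitor never did -->
<meta name="keywords" content="blue widgets, cheap blue widgets, buy widgets, free downloads, celebrities">

<!-- Hidden text: same idea, stuffed into the body instead of the head -->
<div style="display:none">blue widgets cheap blue widgets buy blue widgets online</div>

<!-- White text on a white background, a common variant of the same trick -->
<font color="#FFFFFF">blue widgets blue widgets blue widgets</font>
```

In every case the keywords live somewhere a crawler reads but a visitor never sees, which is exactly why the tactic was so easy to abuse.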
This is what ultimately led to the demise of AltaVista and the rise of Google. Google largely ignored the keyword meta tag (and similar tricks, such as hidden text) and instead relied on link-based signals such as its PageRank algorithm, providing quality results with significantly less spam in them.
Basically, the search engines had to stop trusting Webmasters to tell them about their site.
What Yahoo's Planning
In his presentation, Tomkins explained that Yahoo is beginning to turn back the clock on this. No, they aren't going back to meta tags, but they're looking at a whole series of ways to get information directly from Webmasters.
Yahoo says this information won't be used to affect ranking results. Yahoo wants to use the information to provide a better search listing in their results.
Instead of having to scan a Web page to try to figure out what title to put on the search result, and what description to offer, Yahoo will accept structured data from Web sites that tells them what to use for a title and description. This can range far beyond what you would do with your title tag and meta description tag today.
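Tomkins didn't describe a concrete format, so the sketch below is purely hypothetical (every element name is invented), but it conveys the idea of a site handing the engine its preferred listing data directly:

```xml
<!-- Hypothetical structured feed; Yahoo has published no actual schema -->
<listing url="http://www.example.com/widgets/">
  <title>Acme Blue Widgets - Hand-Made in Denver</title>
  <description>Family-owned maker of blue widgets since 1972.
    Free shipping on orders over $50.</description>
</listing>
```

The point isn't the syntax; it's that the engine would display what the site supplied rather than what its crawler guessed.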
Yahoo is actively working on this with a small number of highly trusted partners. The big news here is that this signals a move back toward search engines trusting Webmasters to tell them about their sites.
This is, of course, open to real abuse. After all, the title and description of your search listing can significantly affect clickthrough rates (CTRs).
The Return of Trust
The movement in this direction isn't new. Anyone who follows local search knows that Webmasters can update their search listings by providing accurate office location and contact information.
Yahoo Local has been using Microformats since June 2006. But there are many other methods search engines use to gather local search results and tailor the output results.
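The Microformat typically used for this kind of local data is hCard, which marks up name, address, and phone number with standard class names right in the page. A minimal listing for a fictional business might look like this:

```html
<div class="vcard">
  <span class="fn org">Acme Blue Widgets</span>
  <div class="adr">
    <span class="street-address">123 Main St.</span>
    <span class="locality">Denver</span>,
    <abbr class="region" title="Colorado">CO</abbr>
    <span class="postal-code">80202</span>
  </div>
  <span class="tel">555-555-0100</span>
</div>
```

Because the data is visible on the page and structured at the same time, it's harder to abuse than an invisible meta tag.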
For example, you can contact the search engines directly, claim your listing, and update your data. For people with a single office, or a small number of locations, this isn't that hard to do manually.
For more complicated businesses with many locations, services such as Localeze and RelevantAds exist. These services have earned the trust of the search engines, so the engines treat their information as solid.
One reason for this: these types of services go to great pains to validate the accuracy of the data provided to them by their clients. For example, they manually call the business and verify that the business location and contact information is correct.
A Future for Trust
Webmaster-supplied information can play a huge role in the search engines of the future. While it may not be scalable to build your entire search engine solely on human input, you can certainly use human input to refine the quality of a search engine.
Trust-based systems play a critical role in that. Keyword meta tags may be dead as a ranking signal, but there's no reason why a search engine can't implement something new and more robust (such as an extension of the Microformats standard) to allow the Webmaster to provide lots of information about their site.
To fully take advantage of that, the search engine needs to know how trustworthy the person supplying the information is. This type of system is already in place in the world of local search, and, as Tomkins outlined, Yahoo is taking another step in that direction.
Don't be surprised if the other search engines follow suit, and take it even further. While Yahoo doesn't plan to have this data affect rankings, it certainly could in future implementations by them – or another engine.
Think of the impact on sites that are thin on simple text. A trusted way to provide information about what can really be found there could be invaluable in improving search result quality.
The next trick is to figure out how to rank the trust of the data provided.