While Google, Yahoo, Microsoft and Ask.com compete fiercely for users and advertisers, they also find it in their own best interests to cooperate on some things. They have recently come together to present a unified solution to help webmasters, and therefore searchers. Yesterday, the four search engines revealed the latest enhancements to sitemaps at the Search Engine Strategies conference in New York, including a new auto-discovery protocol and the addition of Ask.com to the Sitemaps.org group.
“Publishers know their sites better than anyone, and can give unique insight to help search engines index and rank a site,” Tim Mayer, VP of global Web search, told Search Engine Watch. “I think every search engine wants to make it easy for webmasters to do that in a common format.”
A sitemap is an “inclusion protocol,” as Dan Crow, product manager at Google, describes it. It allows a webmaster to create an XML file that suggests to search engines which pages of a site are important, and to include optional metadata such as when each page last changed. In contrast, robots.txt is an “exclusion protocol,” because it tells a search engine which pages not to spider.
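As a sketch, a minimal sitemap file following the Sitemaps.org protocol looks like the fragment below (the site URL and metadata values are placeholders, not from the announcement):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- loc is the only required child element of url -->
    <loc>http://www.mysite.com/</loc>
    <!-- optional metadata hints for the crawler -->
    <lastmod>2007-04-10</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

The optional tags are hints, not commands: an engine may weigh them when scheduling crawls, but is free to ignore them.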
Sitemaps are most useful for helping to index a site with dynamic pages, which can confuse a search engine’s crawler and lead to multiple versions of the same page being indexed, or to some pages not being indexed at all, Mayer said.
From a search engine’s perspective, a sitemap is a more efficient way to learn when a page has been updated than recrawling the entire site, Mayer said. None of the engines plan to stop crawling the Web to index sites, but they will use the sitemaps protocol to crawl more efficiently.
Back in November, Google, Yahoo and Microsoft announced a common protocol for sitemaps. They formed a working group and promised to continue development of the sitemaps protocol, which was originally developed by Google in 2005.
This week, the three are joined by Ask.com in the effort, and the four search engines announced that all will begin supporting auto-discovery of a sitemap through a single line of code in a site’s robots.txt file, such as `Sitemap: http://www.mysite.com/sitemap.xml`. Google, Yahoo and Ask will begin using auto-discovery of sitemaps immediately, while Microsoft promises to do so sometime in 2007. All four will continue to accept direct sitemap submissions from webmasters as well.
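To make the mechanism concrete, here is a minimal sketch of how a crawler might extract sitemap URLs from a robots.txt file. The helper function and its name are our own illustration, not part of any engine’s published implementation; the only thing taken from the announcement is the `Sitemap:` directive format.

```python
def find_sitemaps(robots_txt: str) -> list[str]:
    """Return sitemap URLs declared via the auto-discovery directive."""
    sitemaps = []
    for line in robots_txt.splitlines():
        # Split on the first colon only, so the URL's own "http://" survives.
        name, sep, value = line.partition(":")
        # The directive name is conventionally treated as case-insensitive.
        if sep and name.strip().lower() == "sitemap":
            sitemaps.append(value.strip())
    return sitemaps


robots = """User-agent: *
Disallow: /private/
Sitemap: http://www.mysite.com/sitemap.xml"""

print(find_sitemaps(robots))  # ['http://www.mysite.com/sitemap.xml']
```

Because the directive is independent of any `User-agent:` section, it can appear anywhere in the file, which is why the sketch scans every line rather than parsing agent groups.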
Yahoo engineers came up with the idea while brainstorming ways to make the sitemaps protocol more useful, and to improve the experience for webmasters, according to Amit Kumar, senior engineering manager for Yahoo Web Search. Yahoo took the idea to the other engines, and they agreed that it made sense, he said.
The Sitemaps.org effort, and today’s extension of it, are part of a larger trend toward giving webmasters more control. Google’s Webmaster Central and Yahoo’s Site Explorer, for example, both provide tools to help a webmaster influence the way a site is indexed. Microsoft is planning its own webmaster tool, though it is not clear when that might launch.
We report the top search marketing news daily at the Search Engine Watch Blog. You’ll find more news from around the Web below.
- Ask.com To Launch New Search Algorithm Code Named Edison, Search Engine Roundtable
- Structure of a Click Fraud Botnet, Shumans.com
- New Local Search Power Structure, ClickZ
- Yahoo! Increases Quality Initiatives, Yahoo Search Marketing Blog
- A note on traffic exchange programs, Inside AdSense
- Technorati Acquisition Will Build Brand Services, ClickZ
- I Tie Up Mike McDonald & Interview Vanessa Fox… Again., SEOmoz
- Wisdom of Crowds Search, Online Marketing Blog
- Marketing to Social Networking Sites, Targeted, ClickZ Stats
- Tags Are Not the Answer for Search?, Biznology
- How to blog the SEO fishbowl, Cornwall SEO
- When Is It Okay To Cloak?, SearchRank
- Search Engines Unite On Sitemaps Autodiscovery, Search Engine Land
- SEO Site Analysis: Knowing the Score, Part 2, ClickZ
- Search Terms for Q1 2007, ClickZ Stats