Enhancements to Sitemaps Announced At SES New York

In November, Google, MSN, and Yahoo! announced that they would all support a unified protocol through which webmasters could notify the search engines of the URLs on their sites that they wanted crawled. As expected, enhancements have been made, and they were announced yesterday at SES. First, Ask is now supporting the Sitemaps protocol. Second, support for auto-discovery has been added. All it takes is a single line added to the robots.txt file. Here is an example:

Sitemap: http://www.mysitename.com/sitemap.xml
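
In context, the directive simply sits alongside whatever rules a robots.txt file already contains. The hypothetical file below, using the same example domain, is a sketch of how the two can coexist; the Disallow rule is illustrative only:

User-agent: *
Disallow: /private/

Sitemap: http://www.mysitename.com/sitemap.xml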

Since crawlers check the robots.txt file when they first visit a site, this directive tells them immediately where to look for the sitemap. Webmasters can also submit their sitemap via an HTTP request. For more on this, readers are urged to check Sitemaps.org for the most current details.
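
As a rough sketch of the HTTP submission method described at Sitemaps.org, the request is a simple GET to a search engine's ping address, with the sitemap's location passed as a parameter. The exact endpoint varies by engine, so the form below is generic rather than a specific engine's address:

<searchengine_URL>/ping?sitemap=http://www.mysitename.com/sitemap.xml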

During the SES session, Sitemaps and URL Submission, a show of hands indicated that quite a number of webmasters are still not submitting XML versions of their sitemaps and instead rely on the alternative text versions. It will be interesting to see whether the adoption rate has changed a year from now, as webmasters discover how useful this protocol is for submitting their sites.
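
For readers weighing the two formats, a minimal XML sitemap under the Sitemaps.org protocol looks like the sketch below (the URL and date are placeholders), while the text alternative is simply a plain file listing one URL per line:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysitename.com/</loc>
    <lastmod>2007-04-10</lastmod>
  </url>
</urlset>

The XML version takes a little more effort to produce, but it allows optional details such as the last-modified date shown above, which the plain text format cannot carry.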

About the author

Amanda Watlington is the owner of Searching for Profit, a search marketing consultancy focusing on the interaction of the consumer with businesses using search engines, RSS, blogs, podcasting or other new media to deliver their message. She is a frequent speaker at Search Engine Strategies, WebmasterWorld and other industry conferences. She's the author of three books and has written feature articles for over thirty magazines and journals. She has twenty years of experience as a communications, sales and business strategy consultant, and over ten years as a search marketer.