In November, Google, MSN, and Yahoo! announced that they would all support a unified protocol whereby webmasters could notify the search engines of the URLs on their sites that they wanted crawled. As expected, enhancements have since been made, and they were announced yesterday at SES. First, Ask now supports the Sitemaps protocol. Second, support for auto-discovery has been added: all it takes is a single line added to the robots.txt file. Here is an example:
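Per the Sitemaps protocol, the auto-discovery directive is a single `Sitemap:` line giving the full URL of the sitemap file (the domain below is illustrative):

```
Sitemap: http://www.example.com/sitemap.xml
```

The URL should be fully qualified, and the line can appear anywhere in the robots.txt file, independent of any `User-agent` sections.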
Since crawlers check the robots.txt file when they first visit a site, this directive immediately tells the crawler where to find the sitemap. Webmasters can also submit their sitemap directly via an HTTP request. For more details, readers are urged to check Sitemaps.org for the most current information.
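As a rough illustration of the HTTP submission route, the sketch below builds a ping-style request URL of the general form documented at Sitemaps.org. The endpoint host here is hypothetical; each search engine publishes its own actual ping URL.

```python
from urllib.parse import quote

# Hypothetical ping endpoint -- substitute the search engine's real one.
PING_ENDPOINT = "http://searchengine.example.com/ping"
SITEMAP_URL = "http://www.example.com/sitemap.xml"

# The sitemap URL must be URL-encoded when passed as a query parameter.
request_url = PING_ENDPOINT + "?sitemap=" + quote(SITEMAP_URL, safe="")
print(request_url)
```

Fetching the resulting URL (for example with `urllib.request.urlopen`) would notify the engine that the sitemap is ready to be retrieved.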
During the SES session, Sitemaps and URL Submission, a show of hands indicated that quite a number of webmasters are still not submitting XML versions of their sitemaps and instead rely on the alternative plain-text versions. It will be interesting to see whether, a year from now, the adoption rate has changed as webmasters discover how useful this protocol is for submitting their sites.