In response to the new caching techniques Jennifer reported last week, Matt Cutts posted a more detailed explanation of what he calls the "crawl caching proxy." In short, Google may use any of its spiders, Googlebot (the web search spider), the AdSense spider, the News spider, the blog search spider, and so on, for caching purposes. When any of these spiders crawls your pages, the fetched copies are stored in the crawl caching proxy, which is then used to retrieve a page's cache. My understanding is that when you conduct a search at Google.com, the cached page you see may have been fetched by the AdSense bot and served from the crawl caching proxy.
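To make the idea concrete, here is a minimal sketch of how such a shared crawl cache could work. This is purely illustrative and assumes nothing about Google's actual implementation; the class, bot names, and fetcher are hypothetical:

```python
# Hypothetical sketch of a "crawl caching proxy": several crawlers share one
# fetch cache, so a page fetched by any one bot can be served to the others
# without re-crawling the site.

class CrawlCachingProxy:
    def __init__(self, fetcher):
        self._fetcher = fetcher   # the real network fetch (stubbed out below)
        self._cache = {}          # url -> (bot that fetched it, page content)

    def fetch(self, url, bot_name):
        if url not in self._cache:
            # Cache miss: this bot does the actual fetch and stores the result.
            self._cache[url] = (bot_name, self._fetcher(url))
        # Cache hit: the copy may have been fetched by a different bot.
        return self._cache[url]


# Usage: the AdSense bot crawls the page first; the web search bot then
# receives the copy originally fetched by the AdSense bot.
proxy = CrawlCachingProxy(lambda url: f"<html>content of {url}</html>")
proxy.fetch("http://example.com/", "Mediapartners-Google")  # AdSense bot
fetched_by, content = proxy.fetch("http://example.com/", "Googlebot")
print(fetched_by)  # Mediapartners-Google
```

The key point the sketch shows is that the cache is keyed only by URL, not by which bot asked, which is why a search result's cached copy could come from a crawl done by a different spider.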