To Google, "cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index."
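The practice Google's definition describes can be sketched in a few lines. This is a hypothetical illustration only; the bot signatures and page bodies are invented for the example, not taken from any real site.

```python
# Illustrative sketch of user-agent cloaking as Google defines it:
# the server inspects who is asking and returns different content.
BOT_SIGNATURES = ("googlebot", "bingbot")

def select_page(user_agent: str) -> str:
    """Return different content depending on the requester --
    the behavior Google's guidelines flag as cloaking."""
    ua = user_agent.lower()
    if any(bot in ua for bot in BOT_SIGNATURES):
        return "keyword-rich static page served only to crawlers"
    return "dynamic page served to human visitors"
```

Because the crawler and the human never see the same page, the index entry no longer reflects the user experience, which is exactly the deception the guideline targets.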
Unfortunately, like Flash before it, the dynamic website seems to break these rules, though not with any "evil" intent. As Ralph Tegtmeier, a.k.a. Fantomaster, a well-known proponent of effective cloaking, notes: "almost everything qualifying for state-of-the-art these days cannot be spidered or indexed efficiently!"
The purity of the algorithm is taking on iconic status, much as the "cloaking device" did when the term entered the "Star Trek" lexicon back in 1968 (first used in "The Enterprise Incident"). But Google, in many cases, merely avoids the heavy lifting and either tags pages, relegates them to supplemental listings, or removes content, expecting website owners to make changes to meet its standards.
In a recently released video (embedded below), Matt Cutts discussed some of the problems for both sides of the cloaking issue. His viewpoint seems to imply that Google will err on the side of purity and penalize sites not following their directions.
Statements such as "there is no such thing as white hat cloaking" and "the page the user sees is the same page the Googlebot sees" show Google really needs a better understanding of how the web is developing.
Website owners looking to use the best technology to enhance their users' experience already have to deal with tracking and privacy issues - just like the search engines. Google knows webmasters are generally trying to give visitors good content, but the technology makes it difficult for the spiders to fully crawl and interpret this dynamically generated content.
Cutts' video recognized some of the issues I addressed last year regarding geotargeting offers based on location. Adapting pages based on search queries, referrer or cookie data can be manipulated for good or for bad - but on the whole it is done for profit. What Google needs to realize is that a site that is good at conversions must be doing something right; otherwise users wouldn't convert.
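The kind of adaptation described above might look like the following sketch, which tailors a landing page to the referring search query and a returning-visitor cookie. The offer names and cookie key are hypothetical, purely for illustration.

```python
# Hypothetical sketch of adapting a page from referrer and cookie data,
# the technique the paragraph describes. All names here are invented.
OFFERS = {
    "cheap flights": "budget-airfare landing page",
    "luxury hotels": "premium-hotel landing page",
}

def adapt_page(referrer_query: str, cookies: dict) -> str:
    """Pick a landing page from the referring search query,
    sweetening it for visitors a cookie marks as returning."""
    offer = OFFERS.get(referrer_query.lower(), "generic landing page")
    if cookies.get("returning") == "1":
        offer += " with a returning-visitor discount"
    return offer
```

Nothing here hides content from crawlers; it simply shortens the visitor's path to the offer they searched for - which is the distinction the article argues Google's "black hat" label misses.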
Calling these methods "black hat" really is erroneous. The vast majority of sites using them are just trying to simplify the visitor's path to what they want. True, there is a profit motive, but equating that with "evil" or "black hat" is inaccurate.
Surely with Webmaster Tools Google could create a platform that recognizes the elements of a dynamic site and lets webmasters declare which areas can change based on various criteria. Or perhaps a content tag could be established to mark the sacrosanct text - text that wouldn't be changed and could be associated with the page.
As Cutts outlined at the launch of Webmaster Tools back in August 2006, WMT allows sites to discover crawl errors, set a preferred domain (www or non-www) and even warns of spam penalties - and that was six years ago. The web has come a long way in those six years, and Google should be able to keep up.
Google wants to show people the best possible results in the search index, or so they say. A recent Reuters article notes, however, that Google's attempts at personalization may be hurting searchers from getting to what they really need to find.
"Google tries very hard to please you by finding you more stuff just like the other stuff you clicked on last time. That is the essence of Google's great cleverness. But that very brilliance is becoming more and more damaging to the shared view out to an objective fact-based world," Reuters noted.
If Google uses prior behavior to change the search results they present us, how does this not use the same techniques as cloaking? "We are now all living in what we believe to be the objective, self-evidently google-able truth. And we are not."
The problem is that Google is seen as an impartial entity by the general public, while marketers see it more as an adversary. Once upon a time Google was not as adversarial, but whether the spammers caused it to change, or government scrutiny shifted its posture, it is now seen as an opponent by most marketers.
Google really needs to move back into a supportive role for SEOs. The changes in how websites generate their pages, and the ability to have this content indexed, should be part of all search engines' agendas. It is in everyone's best interests.
But the system now favors Google - it is the engine, and as such everyone comes to it. The recent Senate hearings show people are becoming more aware of the power Google has. Google needs to join the efforts to address the growing use of dynamic elements; people want these new sites precisely because they take less time to find what they want.
We can redirect for browser languages, we can create mobile versions, and the engines can recognize these as non-cloaking. It really is time to go further.
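A language redirect of the kind the engines already accept might be as simple as the following sketch, which picks a localized URL from the browser's Accept-Language header. The domain and URL scheme are placeholders.

```python
# Minimal sketch of a browser-language redirect, one of the accepted
# non-cloaking adaptations mentioned above. example.com is a placeholder.
SUPPORTED = {"en", "de", "fr"}

def language_redirect(accept_language: str) -> str:
    """Choose a localized URL from an Accept-Language header,
    falling back to English when no supported language matches."""
    for part in accept_language.split(","):
        lang = part.split(";")[0].strip()[:2].lower()
        if lang in SUPPORTED:
            return f"https://example.com/{lang}/"
    return "https://example.com/en/"
```

Every visitor and every crawler gets the same rule applied to the same header, which is why this counts as adaptation rather than cloaking.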
The very nature of dynamic sites and their compartmentalized page areas should work well with the search crawlers. When the canonical tag was first launched, Joost de Valk had some lead time to develop plugins for the integration of the new tag into WordPress, Drupal and Magento.
I'm sure the people at the various CMS companies would be happy to work with Google on this. Unfortunately there is no API for the organic results, but perhaps WMT could work with developers to create a methodology.