Some people warn against redirects because Google can see them as “cloaking,” a “black hat” search engine optimization (SEO) tactic that shows a search engine’s spider different content than it shows users. But shouldn’t Google know the difference by now?
Is it cloaking, for example, if an e-commerce site geo-targets its visitors so it can show product availability for their location? Matt Cutts, Google’s spam czar, defines it this way: “Cloaking is serving different content to users than to search engines.”
By that definition, the method could get you banned by Google. But Matt’s article was written more than three years ago. Have things changed?
Well, according to Google’s Webmaster Guidelines: “Cloaking refers to the practice of presenting different content or URLs to users and search engines. Serving up different results based on user agent may cause your site to be perceived as deceptive and removed from the Google index.”
It seems they still haven’t realized that some user experiences can be improved by particular redirects.
The search engines would see the same content as users if they crawled from the same area, so would that earn you a pass? Hopefully, Matt will weigh in on the topic.
At this stage of online marketing, the use of all techniques that aren’t intended to spam the search engines should be allowed. And the engines themselves should know the difference.
Why do we need to tag our pages with a canonical meta tag? Why can’t we use selective content presentation?
The sad answer: Google rules. If you don’t do it their way, they have the right to exclude you from their listings.
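For context, the canonical tag Google asks for is a single link element in each page’s head, telling the engine which URL is the preferred version of duplicate or near-duplicate content (the URL below is a placeholder):

```html
<!-- Placed in the <head> of every duplicate page; the href is a placeholder URL. -->
<link rel="canonical" href="http://www.example.com/product-page" />
```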
Time and again, Matt has said the reason for taking action against cloaking and some types of redirects is to “deliver the most relevant results.” But that is the exact reason some sites would use geo-targeted content.
Vanessa Fox, former Google Webmaster Tools team leader, offered an interesting suggestion, though it’s one Google may take issue with:
If you have your site set to detect a visitor’s location and show content based on that, I would recommend the following:
- Serve a unique URL for distinct content. For instance, don’t show English content to US visitors on mysite.com and French content to French visitors on mysite.com. Instead, redirect English visitors to mysite.com/en and French visitors to mysite.com/fr. That way search engines can index the French content using the mysite.com/fr URL and can index English content using the mysite.com/en URL.
- Provide links to enable visitors (and search engines) to access other language/country content. For instance, if I’m in Zurich, you might redirect me to the Swiss page, but provide a link to the US version of the page. Or, simply present visitors with a home page that enables them to choose the country. You can always store the selection in a cookie so visitors are redirected automatically after the first time.
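The pattern Vanessa describes can be sketched as a small server-side routine: a stored cookie choice wins over geo detection, otherwise the detected country maps to a unique locale URL. This is a minimal illustration, not any real framework’s API; the function name, country-to-path mapping, and `/choose-country` fallback are all assumptions for the example.

```python
# Sketch of the redirect pattern described above: serve a unique URL per
# locale, and honor a stored cookie choice before falling back to geo
# detection. All names and paths here are illustrative placeholders.

def locale_redirect_url(country_code, cookie_locale=None):
    """Return the locale-specific path a visitor should be redirected to."""
    # Hypothetical mapping of detected country to a locale path.
    country_to_path = {
        "US": "/en",
        "FR": "/fr",
        "CH": "/ch",  # e.g., a visitor in Zurich gets the Swiss page
    }
    # A choice saved in a cookie wins over geo detection, so visitors are
    # redirected automatically after their first visit.
    if cookie_locale:
        return "/" + cookie_locale
    # Fall back to geo detection; unknown countries land on a chooser page
    # where the visitor picks a country themselves.
    return country_to_path.get(country_code, "/choose-country")
```

Because each locale lives at its own URL, the engines can index mysite.com/fr and mysite.com/en separately, and cross-links between versions keep every variant reachable by crawlers.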
We need to be able to give users the best experience and, at the same time, improve our conversion prospects. No doubt, Google would prefer we request a ZIP code on the first page and then show users the products available in their area as a workaround. But that would also increase bounce rates and lose you some business.
This is an issue that could use a little work on Google’s side. We now have instantly changing results, so why not allow the integration of geo-targeted results?
I’m sure webmasters would indicate which page versions to serve under some system Google could create. After all, we were good about the canonical tag.
Join us for SES Chicago 2010, the Leading Search & Social Marketing Event, taking place October 18-22! The conference offers 70+ sessions on topics including PPC management, keyword research, SEO, social media, local, mobile, link building, duplicate content, multiple site issues, video optimization, site optimization, usability and more.