
Take Out Duplication Without a Single Site Change

by Ben Goodsell

Duplicate content keeps your ranking pages from performing as well as they could. This is a common point of frustration among SEO professionals, especially those who manage ecommerce sites.

Too often, waiting on the client or development team to implement site changes that reduce excess crawling and indexation hurts the bottom line. What if you could bypass all that and help search engines identify your ranking pages? Google's and Bing's parameter handling tools let you do just that!

Warning: Changes made using parameter handling are treated as very strong "hints." Use with caution.

[Image: Google Webmaster Tools report of pages crawled per day]

The Scenario

Back in August, Googler Maile Ohye outlined some best practices and provided some much-needed insight into Google Webmaster Tools parameter handling. Using the Google Store as an example, she pointed out that while the store had only 158 products, Googlebot had identified more than 380,000 URLs!

These 380,000 URLs are most likely generated by:

  • The functionality that allows changing the way products are displayed through sorting, filtering, and browsing.
  • Tracking user behavior within internal linking.

1. URLs Generated by Changing the Way Products Are Displayed

Most ecommerce sites have hundreds of ways to change how products are displayed according to the user's preference. For example, a customer looking for a bargain can sort products by ascending price to display the least expensive first.

Using the Google Store example, here are URLs that show the parameters used to sort, filter, and browse (look through multiple pages of) products.
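The original URLs aren't reproduced here, but the sketch below shows how one base page can fan out into many parameterized URLs. The parameter names (`category`, `sortPrice`, `size`, `page`) are illustrative assumptions, not the Google Store's actual parameters:

```python
from urllib.parse import urlencode, urlparse, parse_qs

# Hypothetical base page and parameter names for illustration only --
# not the Google Store's actual parameters.
base = "http://www.googlestore.com/googlesearch.aspx"

variants = [
    {"category": "Wearables"},                      # browse a category
    {"category": "Wearables", "sortPrice": "asc"},  # sort by price, low to high
    {"category": "Wearables", "size": "M"},         # filter by size
    {"category": "Wearables", "page": "2"},         # paginate through results
]

urls = [base + "?" + urlencode(params) for params in variants]
for url in urls:
    print(url)

# All four URLs render overlapping product content, which is how a crawler
# can discover far more URLs than there are products.
print(parse_qs(urlparse(urls[1]).query))
```

Every combination of sort, filter, and page multiplies the URL count, which is how 158 products can balloon into hundreds of thousands of crawlable URLs.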

Navigating from the Google Webmaster Tools dashboard to URL Parameters (under Configuration) displays a list of parameters Google has identified as potential issues. By clicking Edit, you can specify how parameters like those listed above change content.

Details on the differences between Sorts, Narrows, Specifies, Translates, Paginates, and Other can be found in the parameter handling presentation Ohye created.

[Image: Google Webmaster Tools parameter handling settings]

By default, "Let Googlebot decide" is selected, but if you know Google can safely ignore the parameter, you can choose that instead. If only a particular value of the parameter can be safely ignored, that value can be specified as well.

2. Unique URLs Created By Adding Parameters Used for Tracking

The Google Store doesn't use tracking parameters, but it's a common tactic to include parameters in internal links to track user behavior. Ohye's examples included SID (session IDs), affiliateID, and tracking-id. Sites using Google Analytics will often have something like &utm_medium=320banner in their URLs.

URLs like these, and any other case where a parameter doesn't change the content of the page, can be specified as seen below.
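As a rough illustration of what telling an engine to "ignore" a parameter amounts to, the sketch below strips tracking parameters from a URL, leaving the content-determining URL that engines should consolidate on. The parameter names are assumptions drawn from Ohye's examples and common Google Analytics tags:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Assumed tracking parameter names, based on the examples in the article
# (session IDs, affiliate IDs, Google Analytics utm_* tags).
TRACKING_PARAMS = {"sid", "affiliateid", "tracking-id"}

def strip_tracking(url):
    """Remove query parameters that don't change the page's content."""
    parts = urlparse(url)
    kept = [
        (k, v) for k, v in parse_qsl(parts.query)
        if k.lower() not in TRACKING_PARAMS and not k.lower().startswith("utm_")
    ]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_tracking(
    "http://www.example.com/shoes?category=running&utm_medium=320banner&SID=abc123"
))
# http://www.example.com/shoes?category=running
```

The `category` parameter survives because it changes what the page displays; the tracking parameters don't, so dropping them yields the same content at one consolidated URL.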

[Image: Google Webmaster Tools, setting a tracking parameter to be ignored]

While Bing Webmaster Tools doesn't appear to let you specify how a parameter affects page content, it can ignore parameters outright. Parameters flagged as tracking parameters in Google Webmaster Tools can most likely be applied here as well.

[Image: Bing Webmaster Tools parameter handling]

Caveats

Parameters Must Be Eligible

One caveat with Google Webmaster Tools parameter handling is that the targeted parameter must be considered "eligible." Ohye explains this with the following examples.

Eligible URLs:

  • http://www.googlestore.com/googlesearch.aspx?category=office
  • http://www.googlestore.com/googlesearch.aspx?category=Wearables
  • http://www.googlestore.com/googlesearch.aspx?category=Wearables&size=M

Ineligible URLs:

  • http://www.example.com/Wearables++Youtube++size+M.axd
  • http://example.com/hotels/cancun/a7a141343.html
  • http://example.com/cancun+hotel+zone-hotels-1-23-a7a141343.html

Essentially, parameters must be displayed in the typical question mark, ampersand, and equal sign fashion.
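A quick way to test eligibility in this sense is to check whether a URL actually exposes key=value pairs in its query string. A minimal Python sketch (the helper name is made up for illustration):

```python
from urllib.parse import urlparse, parse_qsl

def has_eligible_params(url):
    """True if the URL exposes parameters as ?key=value&key=value pairs."""
    query = urlparse(url).query
    return len(parse_qsl(query)) > 0

# Eligible: parameters appear after ? as key=value pairs.
print(has_eligible_params(
    "http://www.googlestore.com/googlesearch.aspx?category=Wearables&size=M"))  # True

# Ineligible: the values are baked into the path itself, so there is
# nothing for the parameter handling tool to target.
print(has_eligible_params(
    "http://example.com/cancun+hotel+zone-hotels-1-23-a7a141343.html"))  # False
```

This is also why URL-rewriting schemes that fold parameters into the path put those pages out of reach of parameter handling.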

Not Tested

Another, more theoretical caveat is that there aren't many case studies detailing the effects sites have seen after implementing parameter handling. This can be a powerful tool and should be used with care, something Google itself notes with the message below, shown in the URL Parameters interface.

[Image: Google Webmaster Tools parameter handling warning]

Potential Advantages

Compared with other methods of limiting search engine crawling, such as robots.txt, parameter handling seems like a great option because directives like rel=canonical, rel=prev/next, rel=alternate, and the noindex tag will still be applied!

This means that even if you tell Googlebot to ignore certain parameters, a link pointing to a URL generated by filtering a product set will still pass equity to the canonical page, provided a rel=canonical is specified.

Cheers to the consolidation of indexing properties and success in the war against duplication and excess indexation!


