Is this worth a 4.0 on the SEO Richter scale? Probably not. Just a rumble, really, but nothing shakes up the SEM industry and gets SEOs chatting like a nice bug in Google's search results. Bring up the Supplemental Index and duplicate content, and the story gets even juicier.
As Google's Vanessa Fox has just confirmed, there is in fact something amiss with the current "site:" command. It is being rectified "as quickly as possible," and it is merely a display issue that shouldn't have any impact on search queries or rankings. (Special thanks to Vanessa for working with us on sorting out this issue and finding a solution so quickly!)
But let's dig deeper into why this is such a big deal in the SEO world.
The "site:" command tells you how many of your site's pages are indexed in Google. In Google's Webmaster Central, the official syntax is "site:domain.com", and many SEO experts treat the number it returns as authoritative.
So when Google suddenly starts returning wildly inconsistent results for that command, it raises a red flag in the industry, and the conspiracy theories fly. For SEOs and webmasters, the questions that immediately come to mind run along these lines:
- Is this the sign of a stronger "duplicate content" filter?
- Does it mean I'm really in the Supplemental Index or possibly banned for life?
- Did I mess up something on my site?
It's probably nothing to raise your blood pressure over, but this glitch is definitely an anomaly in Google's SERPs.
As is well documented here at SEW and other sites around the Web, typing "site:www.domain.com", "site: www.domain.com", or "site: domain.com" will return drastically different results. Note the differences when using a space after the colon, as well as when using the www vs. non-www version of a domain.
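To see the difference yourself, you can build the search URLs for each variant and compare the reported counts in your browser. This is a minimal sketch assuming only the standard Google search URL format; "domain.com" is a placeholder to swap for your own site, and the result counts must be checked manually.

```python
# Minimal sketch: build Google search URLs for the three "site:" query
# variants discussed above. "domain.com" is a placeholder domain.
from urllib.parse import quote_plus

def site_query_url(query: str) -> str:
    """Return a Google search URL for the given raw query string."""
    return "https://www.google.com/search?q=" + quote_plus(query)

# The three variants that can return drastically different result counts:
variants = [
    "site:www.domain.com",   # www version, no space after the colon
    "site: www.domain.com",  # www version, space after the colon
    "site: domain.com",      # non-www version, space after the colon
]

for v in variants:
    print(site_query_url(v))
```

Open each printed URL and note the "Results 1 - 10 of about N" count at the top of the page; the discrepancy between the variants is the glitch in question.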
At SEW, we were alerted to this problem yesterday when the effervescent David Naylor posted that something was amiss with the results for SEW. The query site:searchenginewatch.com shows only 1 page, with "about 268" similar pages whose results are omitted.
Rest assured, SEW still has a vibrant pulse: we have not experienced any significant drop in traffic due to this problem, so it's too early to plan a funeral. In fact, traffic at Search Engine Watch has been growing fairly steadily since January 1, and that deserves a post of its own.
As it turns out, Dave Naylor was not the first to discover this problem. As Danny Sullivan points out in his SEL post, Webmaster World has had a discussion going on this for almost a month now. Several large, authoritative sites, with indexed page counts reaching into the tens or hundreds of thousands, were seeing this result as well.
Because of the strange coincidences in the number of results, Danny Sullivan gets credit for dubbing this the "About 260" problem. That may not be an entirely accurate title, however, because on some datacenters the result is "about 359" for the same search. Try the searches in different browsers (Firefox/IE) and with personalized search on and off. While some results are not dramatically different and still fall into the "About 260" category, other searches return at least 100 more results.
SEW blogger Eric Enge dug up similar examples of other authoritative sites exhibiting this problem:
- Clickz.com (traffic is normal here too)