Google's Matt Cutts today announced the launch of a new algorithm intended to better detect and reduce spam in Google's search results and to lower the rankings of scraper sites and sites with little original content. The main target is sites that copy content from other sites while offering little useful, original content of their own.
Posting on Hacker News, Cutts wrote:
"The net effect is that searchers are more likely to see the sites that wrote the original content. An example would be that stackoverflow.com will tend to rank higher than sites that just reuse stackoverflow.com's content. Note that the algorithmic change isn't specific to stackoverflow.com though."
On his blog, Cutts wrote:
This was a pretty targeted launch: slightly over 2% of queries change in some way, but less than half a percent of search results change enough that someone might really notice. The net effect is that searchers are more likely to see the sites that wrote the original content rather than a site that scraped or copied the original site's content.
Cutts said the change was approved last Thursday and launched earlier this week. Cutts announced Google's intention to up the fight against spam in an Official Google Blog post last Friday.
In response to criticism that Google's results had deteriorated and shown more spam in recent months, Cutts said a newly redesigned document-level classifier will better detect repeated spammy words, such as those found in "junky" automated, self-promoting blog comments. He also said that spam levels today are much lower than they were five years ago.
At WebmasterWorld, there is already discussion of big drops in traffic. Are you seeing any changes as a result of this update?