Matt Cutts has confirmed Danny Sullivan's report that Google and Bing use social signals as a ranking factor. In a Google Webmaster Central video, Cutts said:
"I filmed a video back in May 2010 where I said that we didn't use that as a signal, and at the time, we did not use that as a signal, but now, we're taping this in December 2010, and we are using that as a signal."
A change has been suspected for a while: earlier this year, Richard Zwicky, writing for SearchEngineWatch, also identified a patent suggesting that Google was investigating author authority as a ranking factor.
Why Should the Social Graph Become the Link Graph?
The argument for why social sharing measures, such as Twitter retweets and Facebook 'Likes', should form the basis of the social graph rests on the thesis that linking no longer carries the voting power it once did because it is too easily manipulated. Creating a link used to be a manual activity that required someone to physically code it. The labor involved, although minimal, was time-consuming enough for links to reflect an implicit 'social graph' of relationships between websites.
Nowadays most websites are less likely to represent the social graph as they once did: most people now publish through their social media profiles rather than a blog, so the voting power of a link no longer mirrors public sentiment as reliably. Furthermore, the authority signal of inbound links has been diluted by link spam and automation. It has become so easy to create links that it is becoming harder to value them.
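As a reminder of how the 'voting' model works, here is a minimal PageRank-style iteration over a toy link graph - the classic textbook illustration of links as votes, not Google's production algorithm. The graph, damping factor, and iteration count here are arbitrary choices for the sketch:

```python
def pagerank(graph, damping=0.85, iters=50):
    """Toy PageRank. graph: dict mapping each page to the list
    of pages it links to. Returns a dict of page -> rank."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal rank
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for p, outlinks in graph.items():
            if not outlinks:
                # dangling page: spread its vote evenly across all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:
                # each outbound link casts an equal share of p's vote
                share = damping * rank[p] / len(outlinks)
                for q in outlinks:
                    new[q] += share
        rank = new
    return rank

# Tiny example: "b" is linked to by both "a" and "c",
# so it accumulates the most voting weight.
ranks = pagerank({"a": ["b"], "b": ["c"], "c": ["a", "b"]})
```

The point of the sketch is simply that a page's weight comes from who links to it - which is exactly the mechanism that manual link creation once kept honest and that automation now undermines.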
Social media, on the other hand, is perceived to be more resistant to spammers because most communities are self-regulating. As users post content to their friends, they are naturally less likely to publish low-quality content for fear of alienating those closest to them. The belief is that "friends don't spam friends", and so both Google and Bing are leaning on the social graph to, for want of a better word, 'unspam' the index.
Why Should Author Be an Authority Signal?
It shouldn't, necessarily. But if social sharing really does better reflect the popularity and worth of content on the web, then Google and Bing have a problem on their hands. Tweets and Facebook status updates provide little surrounding text and no anchor text. This means that neither search engine can assign any value to the originating page, and both have few contextual clues about the destination page. Therefore they have to invest in working out the 'authority' of an author as a means to generate more relevancy and context around the shared page.
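To illustrate what 'working out the authority of an author' might involve, here is a purely hypothetical toy score. The input signals and weights are invented for illustration and bear no relation to whatever Google or Bing actually compute:

```python
import math

def author_authority(followers, following, retweets_received, tweets):
    """Hypothetical toy score: reward a healthy follower/following
    ratio and a high rate of retweets per tweet. All signals and
    weights are invented for illustration only."""
    if tweets == 0 or following == 0:
        return 0.0
    follower_ratio = followers / following
    retweet_rate = retweets_received / tweets
    # log-damp the follower ratio so enormous accounts don't dominate
    return 0.5 * math.log1p(follower_ratio) + 0.5 * retweet_rate

# An author whose tweets are widely retweeted scores higher than a
# follow-everyone account that is rarely retweeted.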
Are Retweets the New Links?
Sullivan concludes his report with the statement that "retweets serve as a new form of link building" - and that a tweet from an authoritative source equates to a link from a respectable or high-profile website. Reading between the lines, however, there is little in Cutts's announcement to corroborate that theory. The emphasis seems to be on 'try', and the confirmation suggests that this feature really only applies to the Google Realtime index - which almost no one uses.
SEOmoz also put the theory to the test, albeit informally, but unfortunately the experiment does not specifically test the 'author signal'; instead it simply tests whether a page that gets retweeted ranks faster or higher than a page that is only linked to. Strictly speaking, this tests retweets vs. links rather than one author signal over another. The result is nonetheless interesting: the retweeted page ranks highest. However, the page with no retweets also ranks well. Furthermore, simply by trending on Twitter the destination page is picked up and linked to by Topsy and Tweetmeme - which do pass authority (ironically, this also illustrates the problem of automated linking).
So, if retweets are a new form of link building in and of themselves (as opposed to being a relationship management component of a link building strategy) then, currently, the mechanism is twofold:
- Googlebot crawling and counting links within Google Realtime and attributing weight via that index.
- Trending indexes like Topsy or Tweetmeme linking to the original source and lending weight based on their authority.
What Does It All Mean To The Industry?
Whilst Cutts's announcement may be taken as an indicator of what is to come, there is currently no evidence that author authority makes any difference to the main organic Google index - which is the business end of search marketing.
I hate to be the one that says this - and I give kudos to the industry leaders driving this conversation forward - but to my mind, the entire discussion plays directly into the hands of Google and Bing. Independently, they both have a need to be seen at the cutting edge of social media, the new wunderkind on the block. Bing has a lot to gain in terms of generating traction and awareness of its Facebook integration, and Google has nothing to lose by experimenting on a separate index.
Twitter can be crawled but does not have enough of a global presence on the web to be counted as an authority on enough topics to make a significant dent in the organic index. Conversely, most links shared on Facebook cannot be crawled and so little or no authority can be credited to them.
If Google cannot calculate influence on Facebook, then it is missing a vital component of the online social graph. Arguably, making Twitter updates carry more 'value' in the wider online ecosystem is a way of forcing Facebook's hand when it comes to access to its data - something Google has been after for a long time.