Penguin 4.0: a link builder’s perspective

It’s been two weeks since Penguin 4.0 was officially announced.

Dr. Pete continues to report high MozCast temperatures, and stories of recoveries are starting to spread across the net.

Today I want to take a step back and share my thoughts about Penguin 4.0, as a marketer at a link building agency.

As an industry we have been laser-focused on the rollout, recoveries, and any tidbits of information we can glean from Google and their representatives. This is all important information, but I want to share my broader thoughts about Penguin 4.0 now that it’s here, and what it reveals about the SEO industry as a whole.

In no particular order, here are my thoughts:

  • Penguin devaluing links instead of demoting a site is an important step to a better relationship with Google, for all marketers and site owners.
  • I don’t believe the move to devaluing spam will make grey hat or black hat more viable. Manipulation will be harder than ever, so focusing on good links will remain critical.
  • Penguin 4.0 counters Negative SEO. Devaluing the links removes the (algorithmic) barb from NSEO.
  • The era of Penguin is coming to a close. I expect to see fewer large algorithm updates moving forward.
  • Links are vital to Google’s algorithm, and the entire web. Penguin 4.0 reinforces that.

Is that enough to cover? Let’s see.

Penguin devaluing links instead of demoting (punishing) sites makes the web a better place

Google wants to punish spam. I understand that.

But algorithmic demotions shouldn’t have been part of the equation. Penguin never should have been punitive—it never should have demoted sites.

Building in a layer of penalization (even if Google didn’t call it such) led to fear, uncertainty, doubt, frustration, misinformation, and an overall worse relationship between Google and SEOs, marketers, entrepreneurs, and site owners.

This poor relationship was further exacerbated by confusing recovery information, uncertainty surrounding whether your site was affected, and lengthy, unknowable wait times between data refreshes (and updates), which was the only way to recover.

The Penguin 4.0 move to devalue spam links—as opposed to demoting entire sites—is the best news to come out of Penguin 4 so far. This is what SEOs, business owners, and marketers have wanted from Penguin since day one.

If Google doesn’t want spam to affect their results, then they should do just that: find a way to make sure spam doesn’t impact their results. Find a way to ignore spam.
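To make the distinction concrete, here’s a minimal sketch of the two approaches. This is a toy model of my own, not Google’s actual algorithm: the link values, the spam flags, and the 0.5 penalty factor are all invented for illustration.

# Toy model only -- an assumption for illustration, not Google's algorithm.
# "Demotion" punishes the whole page when spam is present;
# "devaluation" simply ignores the spammy links' contribution.

def score_with_demotion(links, penalty=0.5):
    # Old Penguin-style behaviour: any spam triggers a page-wide penalty.
    base = sum(link["value"] for link in links)
    has_spam = any(link["spam"] for link in links)
    return base * penalty if has_spam else base

def score_with_devaluation(links):
    # Penguin 4.0-style behaviour: spam links count for nothing.
    return sum(link["value"] for link in links if not link["spam"])

links = [
    {"value": 10, "spam": False},  # earned editorial link
    {"value": 8,  "spam": False},  # earned editorial link
    {"value": 5,  "spam": True},   # spammy directory link
]

print(score_with_demotion(links))     # 11.5 -- the good links suffer too
print(score_with_devaluation(links))  # 18 -- spam ignored, good links intact

Under demotion, one bad link drags down the value of every good link you’ve earned. Under devaluation, the spam simply counts for nothing.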

This comment from Danny Sullivan on Matt Cutts’ infamous rant against guest blogging in 2014 is the perfect summation:

 

[Screenshot: Danny Sullivan’s comment]

Danny’s right: Google can’t take a zero-tolerance policy on links. Not when links prove so vital to the web. Especially when Google’s advice everywhere else is promote, promote, promote.

Penguin 4.0, fully integrated into the core algorithm, is a good step in the right direction. Google shouldn’t seek to mold the web to match their algorithm—they should mold their algorithm to the web.

Penguin 4.0 will hopefully improve the relationship between Google, SEOs, site owners, and businesses.

Devaluation instead of demotion ISN’T going to lead to a new wave of manipulation

It’s an easy logic chain to follow:

Now that Google’s not punishing spammy backlinks, only devaluing them, there’s less risk in building said spammy links. Worst case scenario, the links are ignored by Google.

Add in the fact that Penguin happens in “real time” and is granular, and it’ll be easy to test which links are easy to secure but still add value to a page. From there all that’s left is to mass build those links and manipulate the algorithm. Right?

This is NOT reality. This will not be the case.

First, if the logic chain is easy to follow, I guarantee the ranking engineers and webspam team at Google have considered it and made sure such manipulation won’t be possible.

Second, manual penalties still exist. It would be easy for the webspam team to follow in the wake of large-scale Penguin link devaluations and hand out manual penalties.

Finally, I don’t believe it will be as easy as assumed above to observe and predict the effect of Penguin, for three reasons:

  1. Real time isn’t instantaneous.
  2. Devaluation has less impact than demotion.
  3. Granular is less noticeable than site-wide.

The release of Penguin 4 has had the lowest impact of any Penguin to date. Anecdotes and data are still only beginning to spread across the industry, despite the official release on September 23rd.

In my opinion, it’s going to be harder than ever before to accurately observe and measure the effect of Penguin. It’s going to be hard to track the impact and understand whether or not a link type is having the intended effect.

There will be instances where Penguin can be manipulated and tested in isolation. Even then, it will be difficult to be sure. And taking that data and applying it to the live web?

It’s not going to work.

Manipulation isn’t a good bet anymore. It hasn’t been since the first release of Penguin.

If you want real visibility in search, you’re better served spending time investing real value into your website and promoting appropriately.

So skip the manipulation. Do the hard work of creating value and promoting that value to the right audiences, in a way that will result in links. Optimize your opportunities and work strategically. Don’t try to manipulate the algorithm with tricks and hacks.

Negative SEO won’t work with the new Penguin

The entire concept of negative SEO was predicated on triggering a manual penalty or algorithmic demotion by pointing spammy links at a website.

If the new Penguin only devalues spammy links, then pointing those links at a competitor will achieve nothing.

Wait, scratch that. It will achieve something: it will waste your time.

Which is as it should be.

There shouldn’t be more potential benefit in harming a competitor than in building up your own brand and visibility. Although its existence was often questioned, the potential payoff of Negative SEO was too high. Spam links are easy to create and easy to control, and when they resulted in a manual action or algorithmic demotion, they had a high impact as well.

Penguin 4.0 and the move to devaluing bad links should end much of the value of Negative SEO. The only potential exception would be if you’re able to earn your competitor a manual penalty—but that’s something Penguin can’t fix. At least with this update, the Penguin factor in Negative SEO has been removed.

And I for one couldn’t be happier – I want to create, not destroy.

The era of Penguin is coming to a close

Large, manual updates are difficult for Google. They typically result in negative publicity, critical scrutiny of their algorithms, and unhappy site owners.

With Penguin 4.0, Google has officially moved Penguin into the core algorithm. With this change, Google has also made it clear there will be no further Penguin announcements.

The era of Penguin is coming to a close.

But more than that, I believe Google won’t unveil large, named updates to the public as often. Machine learning, deep learning, and AI continue to improve, and I suspect it will be harder and harder to tell when a new element is introduced to the algorithm.

Consider the past few years. Matt Cutts has moved on, to be nominally replaced by John Mueller and Gary Illyes. Gary was clearly burned in his attempt to coordinate a Penguin release date. RankBrain was announced only after it was solidly in the algorithm, and only then to shine a positive light on Google and their accomplishments. Amit Singhal has been replaced by the former head of AI.

There is very little benefit for Google in confirming or releasing large updates.

Traditionally, we only notice a handful of significant updates across the year, yet Google’s Inside Search page states they made 665 changes in 2012 alone.

I believe Penguin 4.0 marks the beginning of an era in which Google can update their algorithm with machine learning and AI, and not be forced to rely on large, manual updates that occur outside of the core algorithm.

If Google is able to reduce their large, noticeable updates, then we can certainly expect less communication as well.

Links will be vital to the web for the foreseeable future

Ever since the original release of Penguin in 2012, SEOs have been predicting the release of a new, vital signal into search.

And all we got was RankBrain.

Here we are in 2016, and Google continues to invest in fine-tuning their link quality algorithm.

In fact, a Google Search Quality Senior Strategist recently stated that links and content are the top two ranking factors.

It’s safe to say we can quit predicting the death of links. Links remain vital to the Internet, humanity’s ability to use the web, and search engines’ ability to serve quality results in search.

If driving traffic, building an audience, and growing relationships are important to your website, then links need to be addressed in your marketing strategy.

Penguin 1.0 didn’t change that, nor did Penguin 4.0. The only change is quality—today, securing real links that represent a vote of confidence is more important than ever before.
