On-page Optimization is Dead

The SEOmoz team caused an amusing furore when they calculated that a newly discovered metric, the LDA score, showed a stronger correlation with high rankings than any other factor they had measured.

LDA, or latent Dirichlet allocation, is effectively one way of measuring relevance without counting keywords, so a crawling tool can “read” two pieces of copy and rate which is more relevant to a topic. It does this by looking for related keywords rather than necessarily the exact keyword itself.

As a quick example, I know that a block of text mentioning “petrol money,” “excitement,” “anticipation,” and “gutted” is more relevant to a keyword search for [Liverpool FC fan blog] than a block of text mentioning “Anfield,” “Kenny Dalglish,” and “Champions.”

Search engines need this type of insight into content to make their results more accurate. It also has the benefit of picking out spammy, badly written, or spun content.

What was particularly amusing was how worked up some of the search engine optimization (SEO) community got about this factor. Setting aside for a second that the original calculations were wrong anyway, and that a high LDA score isn’t as much of a secret to top rankings as was first claimed, the important point remains that there are more than 200 ranking factors.

Some factors are certainly more influential on rankings than others, but in an increasingly competitive environment, it seems to me that it is better to do 100 things 1 percent better than to do one thing 100 percent better. It sounds like the type of advice I would get from my dad, actually, but it’s probably right for SEO.

Let’s look at some examples to highlight what we’re talking about here. Here are the top 10 results for [car insurance] and the number of times each ranking page mentions “car insurance” specifically:

[Chart: Car Insurance Top 10]

As you can see, there is actually an inverse relationship here: Google can tell the content is about the same theme without repeated exact matches, and could even be penalizing content that mentions the keyword too many times.
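The mention counts behind a chart like this are easy to reproduce yourself. Here is a rough sketch; the sample page text is a made-up placeholder, and real pages would need fetching and HTML stripping first.

```python
import re

def keyword_mentions(page_text: str, keyword: str) -> int:
    """Count case-insensitive, whole-phrase mentions of a keyword."""
    pattern = re.compile(r"\b" + re.escape(keyword) + r"\b", re.IGNORECASE)
    return len(pattern.findall(page_text))

# Placeholder copy standing in for a fetched, tag-stripped ranking page.
page = "Car insurance quotes. Compare car insurance and save on CAR INSURANCE."
print(keyword_mentions(page, "car insurance"))  # 3
```

Matching the whole phrase with word boundaries avoids counting “carpet insurance” or partial overlaps, which is the kind of detail that skews quick-and-dirty keyword-density checks.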

Moving on to a keyword search for [televisions], the same is true here: fewer mentions = better results:

[Chart: Televisions Top 10]

But here’s where it gets interesting! Doing the same for [dresses] shows the opposite: the more times the keyword was mentioned, the higher the ranking!

[Chart: Dresses Top 5]
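A quick way to see whether mentions and rank move together for a given keyword is a simple correlation over the chart’s data. The numbers below are invented placeholders, purely to show the check; the real counts would come from the ranking pages themselves.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

ranks = [1, 2, 3, 4, 5]        # position in the results
mentions = [2, 5, 7, 9, 12]    # keyword mentions per page (made-up numbers)

# A strongly positive r means worse-ranking pages mention the keyword
# MORE: the inverse relationship seen for [car insurance]. For [dresses]
# the sign would flip.
print(pearson(ranks, mentions))
```

With only five or ten data points per keyword, any such correlation is suggestive at best, which rather reinforces the article’s point about reading too much into a single metric.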

One thing we sometimes forget to do in SEO is to look at a much bigger picture than numbers of links, LDA scores, and keyword density, and remember to think about the full user journey, a human user journey, and optimize for that.

Google has a lot of very clever people working on a lot of very advanced algorithms. Rather than trying to keep up and discover ways of cheating the system, we should spend more time looking at how a human sees our sites and engages with them, at how a well-run advertising campaign or a brilliant social strategy would affect our link profile, and try to mirror that.

When they manage to crack the algorithm for identifying sarcasm, they will truly be able to build a picture of the web and what people are thinking — plus they can sell it to the Americans as an app (joking!).

So, by all means, keep up with the latest metrics for success. Read into how the search engines are evolving their algorithms. But don’t forget that no single metric is 100 percent of the picture.

No top ranking site is the best for every ranking factor. Each keyword will likely have a different search relevance profile.

Most importantly, don’t forget people have to actually use your site and read your content when they get there.

Also don’t forget on-page optimization is dead.

As you were.
