7 subtle on-site issues no SEO should miss

Your on-site SEO could be broken without you realizing it.

We all know some of the basics. Content is king. Users come first. Avoid thin content, keyword stuffing, above-the-fold advertising, etc.

But not all on-site SEO issues are so obvious.

Here are seven on-site issues that are very easy to miss.

1. Excessive listing (No, not this kind)

You are currently reading a “list post.” List posts grab attention. Survey research by Conductor suggests that headlines with list numbers are preferred over others. Likewise, CoSchedule analyzed 1 million headlines and found that list posts were by far the most likely to get shared.

So don’t stop making list posts.

But there’s a particular kind of listing that can just trash your site’s rankings.

In 2013, Matt Cutts from Google had this to say on how listing can be interpreted as keyword stuffing:

“Keyword stuffing is almost like a grab bag term to describe a lot of different things…You can be repeating…You can use different words. You know: so you’re talking about ‘free credit cards,’ ‘credit cards,’ ‘weight loss pill,’ you know, all sorts of stuff…it can even be almost gibberish-like…”

So while a list of facts, statements, or opinions can earn shares, simply listing a series of short phrases in succession can be harmful. Avoid long runs of bare phrases, whether separated by commas or strung into a numbered list. Search engines expect some degree of elaboration.

As with almost any rule, there are exceptions. Just be aware of the pitfalls, put UX first, and use discretion.

2. Accidental cloaking (even if users can see it)

Cloaking is bad, bad, bad.

Harry Potter may use an invisibility cloak for good, but as far as search engines are concerned, cloaking is inherently evil.

Google explicitly forbids cloaking, defining it as follows:

“…the practice of presenting different content or URLs to human users and search engines…”

It should be obvious why. If a search engine is suggesting a URL based on content that the user can’t see, the user is bound to be disappointed.

But cloaking doesn’t always happen on purpose.

Here are a few ways cloaking can happen by accident (all three are sketched in the snippet below):

  • Poorly formatted CSS places content off-screen, where users can’t see it
  • Text foreground and background colors are identical, or nearly so
  • Content is hidden or covered up by JavaScript
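
To make this concrete, here’s a hypothetical snippet showing all three patterns (the class names and IDs are made up for illustration):

<!-- Hypothetical examples of accidental cloaking -->
<style>
  .promo { position: absolute; left: -9999px; }   /* CSS pushes the text off-screen */
  .note { color: #ffffff; background: #ffffff; }  /* foreground matches background */
</style>
<p class="promo">Users never see this text, but crawlers read it.</p>
<p class="note">White-on-white text is effectively invisible.</p>
<p id="hidden-by-js">This paragraph disappears once the page loads.</p>
<script>
  // JavaScript hides content that was present in the HTML
  document.getElementById('hidden-by-js').style.display = 'none';
</script>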

I always recommend against giving search engines the benefit of the doubt. For example, avoid matching foreground and background colors, even if the text is visible to the user due to other formatting elements. Google can misinterpret it, or rightly consider it cloaking on devices that can’t handle the additional formatting.

Also, avoid the related issue of “sneaky redirects.” Here’s how Google describes them:

“It’s a violation of Google Webmaster Guidelines to redirect a user to a different page with the intent to display content other than what was made available to the search engine crawler.”

I’ve seen way too many sites unintentionally cloak content in this way. There are only two ways you should redirect a URL: with a 301 or, if it’s legitimately temporary, with a 302. Anything else may behave differently depending on the device or user-agent, which risks being seen as cloaking.
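
For example, a user-agent-dependent JavaScript redirect like this hypothetical one is exactly the pattern to avoid, because crawlers and users end up on different pages:

<!-- A hypothetical sneaky redirect -->
<script>
  // Crawlers identifying as Googlebot stay on this page,
  // while human visitors are silently sent somewhere else.
  if (!/Googlebot/i.test(navigator.userAgent)) {
    window.location.replace('https://example.com/different-content');
  }
</script>

A server-side 301 or 302, by contrast, sends every visitor, crawler included, to the same place.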

Don’t do it.

3. Links with no anchors

This is just another form of accidental cloaking, but it’s so common and so easy to miss that I’m placing it in its own section.

This is an easy mistake to make. You type an “href” link in HTML and forget to include the anchor text. Or maybe you intend to update the anchor text, delete the old text, and forget to add the new text.
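
The result looks something like this hypothetical snippet:

<!-- The anchor text was deleted and never replaced -->
<a href="https://example.com/great-resource"></a>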

Unfortunately, an innocent mistake like this will result in a link to a URL that is invisible to the user, but visible to the search engines.

You know what that means…it’s cloaked.

Keep an eye out for this one.

4. Excessive bolding and other formatting

Shoving your keywords into “bold” or “strong” tags doesn’t exactly happen by accident, but it’s the kind of thing that is encouraged often enough that otherwise innocent webmasters might do it, thinking that it’s just standard or even best practice.

It’s not.

You can find plenty of correlative studies showing an association between bolded keywords and rankings, but you need to keep something else in mind.

Bolding is often used as a feature of content structure.

If your content is structured around specific ideas, it’s only natural that you will have bolded subheadings featuring keywords related to those ideas, in much the same way that keywords show up naturally in h2 or h3 tags.

This does not mean that you should parse your content and bold your keywords.

Bolding and italicizing certainly play a role within primary content, but it is for emphasis (see?) or to make the content easier to skim. It should not be used to simply bold your keywords wherever they appear.
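
Here’s a hypothetical illustration of the difference:

<!-- Natural: bold structures and emphasizes the content -->
<h2>How to choose a running shoe</h2>
<p>Fit matters more than brand. <strong>Measure your feet in the evening</strong>, when they are at their largest.</p>

<!-- Unnatural: the keyword is bolded wherever it appears -->
<p>Our <strong>running shoes</strong> are the best <strong>running shoes</strong> for anyone shopping for <strong>running shoes</strong>.</p>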

Need evidence?

Brian Chang tried putting his keywords inside “strong” tags. The result? His rankings dropped by a few pages. Want to take a stab at what happened when he removed them? That’s right. His rankings recovered.

Inspired by this, Sycosure ran an experiment of their own and saw similar results: their rankings plummeted, although in their case the rankings recovered even before the formatting was removed.

In short, results are unpredictable, but bolding your keywords is unlikely to improve your rankings by any significant amount, and it may well hurt your site. It simply isn’t worth the risk.

5. Frames

Frames and iframes display content from other URLs on a single page, with one URL for each frame. This breaks the entire conceptual framework of the web: one page for each URL, one URL for each page.

Google cautions webmasters about frames and iframes:

“Google supports frames and iframes to the extent that it can. Frames can cause problems for search…Google tries to associate framed content with the page containing the frames, but we don’t guarantee that we will.”

Google’s support documentation goes on to recommend that if you must use frames, you should place alternate content within the noframes tag, but I would strongly recommend avoiding frames altogether. They break web standards and are likely to cause problems that extend well beyond SEO.
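
If you’re stuck with frames anyway, here’s a hypothetical sketch of that fallback approach (the file names are made up):

<!-- Old-style frameset with alternate content for clients that can't render frames -->
<frameset cols="30%,70%">
  <frame src="nav.html">
  <frame src="content.html">
  <noframes>
    <p>This site uses frames. Visit the <a href="sitemap.html">site map</a> for a frame-free version.</p>
  </noframes>
</frameset>

<!-- For an iframe, the fallback content goes between the opening and closing tags -->
<iframe src="widget.html">
  <p>Your browser does not support iframes. <a href="widget.html">View the widget directly</a>.</p>
</iframe>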

6. Links to penalized sites

Matt Cutts has stated quite explicitly that “Google trusts sites less when they link to spammy sites or bad neighborhoods.”

Unfortunately, it’s not at all obvious when you’ve linked to a bad neighborhood or a penalized site.

You may have linked to a site that appears authoritative but that uses a lot of spammy SEO tactics. You might have linked to a site that used to be trustworthy but was since acquired by nefarious people, or that otherwise went downhill. Or you may have simply been careless and used the first source you could find to support an argument, without realizing that the site was spam nirvana.

Whatever the reason, innocent webmasters end up linking to bad neighborhoods more often than they realize.

Google can be forgiving here but, as I keep saying, don’t give them the benefit of the doubt. They won’t always do the same for you.

This tool can help you catch links to bad neighborhoods on your site.

I also recommend taking these very important steps:

  • Except for your own social media profiles, hold site-wide links in your navigation to an extremely high standard of quality
  • Don’t allow trackbacks on your blog
  • Make your on-site content guidelines very concrete about the standards needed for citation: only cite authoritative or primary sources
  • Review your site’s external links periodically

That said, do not remove all outbound links on your site. This is a very bad idea. A case study by Reboot Online makes it very clear that authoritative outbound links are good for your site. Your site should not be a dead end for links.

7. Nofollow

Don’t confuse the nofollow and noindex tags!

I see this one a lot.

Most sites need to hide some pages from the search engines for one reason or another: to prevent duplicate content issues, to hide paywalled content, to hide back-end pages, to run split tests, or to handle any number of other case-by-case situations where content needs to exist for users but would be problematic for search engines.

Always, always, always use this if you have to block your own content from search engines:

<meta name="robots" content="noindex, follow">

Never, never, never use this on your own content:

<meta name="robots" content="noindex, nofollow">

Remember how I said earlier that almost all rules have exceptions? This one doesn’t.

The “nofollow” tag tells search engines not to pass PageRank through the links. It does not prevent PageRank from being divided by the number of links on the page, and you cannot stop the PageRank damping factor. In other words, if a page has ten links and you nofollow five of them, the remaining five still receive only a tenth of the page’s PageRank each; the nofollowed share simply evaporates.

When you use “noindex, nofollow”, you are telling Google to throw away all the PageRank that flows into the noindexed page. When you use “noindex, follow”, you are telling Google not to index the page, but to pass the PageRank on, so that links back to your site inherit it.

The same applies if you “nofollow” a link to one of your own pages instead of using “noindex” on that page. Don’t do it!
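
To spell out that last point, this hypothetical internal link throws away the PageRank it would otherwise pass:

<!-- Don't do this: the PageRank that would flow through this link evaporates -->
<a href="/thank-you" rel="nofollow">Thank you page</a>

<!-- Do this instead, on the /thank-you page itself -->
<meta name="robots" content="noindex, follow">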

So there you have it. Have your own on-site issues to share? Let’s hear them in the comments.
