
Are You Making These Three Silly SEO Mistakes?

Ignoring link equity, mismanaging exclusions and failing to communicate with search engines are three common, but easily avoidable, mistakes marketers make with their SEO.

While SEO is nothing new to most marketers, the limelight has lately been given to other topics, like content marketing. SEO will never die, though, because it governs how our sites get indexed and how search engines perceive them. I truly enjoy onboarding a client site into an SEO initiative, but I find that many site owners make the same common mistakes.

While your mind might be on how far you are going to catapult your organic visibility and traffic, you first have to ensure you are not making the silly mistakes that will keep you from getting there.

 

1. Letting Link Equity Die Off

As I often say, links are the right leg of SEO. Of course, the left leg is on-site SEO, but the point to understand is that your domain's credibility with search engines relies heavily on your link profile. Let's take a look at a few of the mistakes I see in link equity mismanagement.

The first mistake I often see is letting pages with link equity die off. Maybe in your recent site redesign you decided to consolidate pages, and instead of properly redirecting the old URLs, you are now letting them sit in a 404-error abyss. Neither users nor crawling search engines can pass from the linking site to yours, so the link equity does not pass through either.

Second, you were mindful enough to redirect these pages but did so with a 302 temporary redirect, so you are not passing the full amount of link equity through to the new destination page; a 301 permanent redirect is what you want.
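If your server is Apache, a permanent redirect can be a single line (a minimal sketch with placeholder paths, assuming mod_alias is enabled; nginx and other servers have their own equivalents):

    # .htaccess: permanently redirect a consolidated page to its new home
    Redirect 301 /old-page/ https://www.example.com/new-page/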

What do I do?

Take a look at the Top Pages section in Open Site Explorer. This will show you the pages of your site that have inbound links pointing to them, and whether those linked pages are returning a 404 or 302 status. Additionally, review the Crawl Information and Robots.txt Exclusion sections of your Bing Webmaster Tools account to understand which pages you are excluding from crawling (more on exclusions below) and how many links those pages possess.
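A quick way to spot-check the status of any linked URL is a HEAD request from the command line; curl works well here (the URL is a placeholder):

    # Fetch only the response headers; the first line shows the status code.
    # A 301 is ideal; a 302 or 404 means link equity is leaking away.
    curl -I https://www.example.com/old-page/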


2. Mismanagement of Exclusions

It is imperative that we show search engines the content we want to rank for. However, a mistake web teams often make is accidentally leaving Meta Robots noindex tags on pages. The typical culprit: you had pages in staging and did not want search engines to see them, so you placed a Meta Robots exclusion in the page source's head section. When the page rolled through to production, removing the tag was forgotten. Soon, these pages are dropped from search engine indices because of the noindex exclusion.
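For reference, the tag to hunt down looks like this (a minimal example; the exact content attribute on your pages may vary):

    <head>
      <!-- Tells search engines not to index this page. Appropriate while
           in staging; it must be removed before the page goes live. -->
      <meta name="robots" content="noindex, nofollow">
    </head>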

Crawling can also be cut off by improper robots.txt directives. If your site has a robots.txt file, directives in it may be telling search engines not to crawl certain pages, folders or page types. While this does not mean those pages will immediately fall out of a search engine's index, it does mean they will no longer be crawled and will eventually drop out. When reviewing the robots.txt file, ensure you are not blocking search-critical pages, site images, JavaScript, CSS or any supporting elements needed to render your pages.
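As an illustration, a robots.txt along these lines blocks only what you intend to block while leaving page-rendering assets crawlable (all paths here are hypothetical; adjust them to your own site):

    # robots.txt - directives for all crawlers
    User-agent: *
    # Block only what you genuinely don't want crawled
    Disallow: /staging/
    # Keep supporting assets crawlable so engines can render your pages
    Allow: /css/
    Allow: /js/
    Allow: /images/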

What do I do?

Utilize tools such as Screaming Frog, which will crawl your site and flag any pages carrying a Meta Robots tag. Additionally, review the Fetch as Googlebot section of your Google Webmaster Tools account; it will tell you which page elements are not visible to Google due to crawling exclusions. Also run your file through the Robots.txt Tester to surface any errors that may exist there.


3. No Communication with Engines

It boggles my mind when webmasters want search engines to see their content yet do nothing to help those crawlers understand which on-site pages are important or what the site's elements mean. There are two key tasks here: provide sitemaps to the search engines, and help them understand your site content and focus. Within Google Webmaster Tools and Bing Webmaster Tools you can tell the engines more about your site.

These include:

  • Validating your schema markup to confirm it is error-free (see the snippet after this list)
  • Using Data Highlighter to tell Google more about your products, events, etc.
  • Uploading your sitemaps to see how many of your pages are being indexed
  • Indicating your targeted country of focus
  • Excluding URL parameters from crawling
  • Demoting sitelinks in search results
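To illustrate the schema point above: JSON-LD is one of the schema.org formats these validators check, and a minimal sketch for a hypothetical product page looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Product",
      "name": "Example Widget",
      "description": "A placeholder product used purely for illustration."
    }
    </script>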

What do I do?

Make sure you create and verify profiles for your site in Google Webmaster Tools and Bing Webmaster Tools. At the very least, submit your sitemaps to the search engines there and begin the communication process, so that they understand what to crawl and you understand how much of your site content is getting indexed.
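If you have never built a sitemap, the XML format itself is straightforward; here is a minimal example with placeholder URLs, following the sitemaps.org protocol:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
      </url>
      <url>
        <loc>https://www.example.com/products/</loc>
      </url>
    </urlset>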


Check back next month for part two, with four more common but avoidable SEO mistakes.
