Nothing makes for a crappier morning than grabbing a coffee, sitting down at your analytics dashboard, and finding that your or your client’s organic traffic has hit the floor. Even worse, you then discover that the top-ranking pages are now nowhere to be found in the search engine results pages (SERPs).
Have you educated a client’s developer about the no-nos of search engine optimization (SEO) at a structural level? Let’s examine a few structural site elements that, without proper instruction, can lead to SEO disaster.
Robots.txt
The first stop for any crawling search engine is the robots.txt file. This file can be a great way to keep content that doesn’t belong in search results away from the eyes of the bots. But it also has the potential to wipe an entire site out of the search engines’ indices.
At the most basic level, any instance of “Disallow: /” will leave the site completely invisible to search engines. It’s also important to note that disallowing a top-level directory (e.g., “Disallow: /top-category/”) will force that category and every subcategory beneath it out of search engine view.
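To illustrate, the difference between blocking one directory and blocking the entire site comes down to a single path segment (the directory name here is hypothetical):

```text
# Blocks ONLY /private/ and everything beneath it, for all crawlers
User-agent: *
Disallow: /private/

# DANGER: this version hides the ENTIRE site from search engines
# User-agent: *
# Disallow: /
```

One trailing slash on the root is all it takes, which is why this file deserves review before every deployment.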
- Check out your robots.txt file directly on your site, in Google Webmaster Tools, or via other SEO tools such as Optispider.
Robots Meta Tags
While it’s possible to exclude specific pages at the robots.txt level, you can also exclude content and links at the page level via the robots meta tag. This tag is normally placed in the head of the source code, but wherever it appears in the page source it carries the same SERP-position-swiping power.
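As an illustration, a single robots meta tag in the head is enough to pull a page from the index and stop link value from flowing through it:

```html
<head>
  <!-- DANGER: removes this page from the index AND tells bots
       not to follow (or pass value through) any of its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

The `content` value can also be `noindex` or `nofollow` alone, which is exactly why the intent behind each usage needs to be documented and approved.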
Knowledgeable SEOs often use these specific tags to pass or withhold link value while providing or restricting content availability to search engines. Much like the robots.txt file, this tag can do a lot of damage to your SEO campaign’s standing in the SERPs. Any use of this code element should be passed to the site’s SEO resource for approval.
- Check out your internal nofollow robots meta tag usage at Open Site Explorer.
Nofollow Link Attribute
Misuse of this HTML attribute isn’t as common as misuse of the two elements previously mentioned. However, it deserves a mention because it falls into the same overall category: accidentally blocking the pathway or restricting access for search engines.
This attribute is found appended to link code as rel="nofollow". It is another tool that can be used to sculpt the flow of internal link juice. While it won’t have a devastating impact on SEO efforts, it can definitely be a hindrance if the tagging is put in place and the SEO team isn’t notified.
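In link code, the attribute looks like this (the URL and anchor text are hypothetical):

```html
<!-- A normal internal link: passes link value to the target page -->
<a href="/products/">Products</a>

<!-- A nofollowed link: search engines are asked not to
     pass link value through this specific link -->
<a href="/products/" rel="nofollow">Products</a>
```

Because the two links render identically to visitors, a stray nofollow can sit unnoticed in a template for months.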
- Check out your nofollow tag usage at Open Site Explorer or the SEO Toolbar.
These three elements are widely understood and considered commonplace in SEO. Each of them can help your SEO efforts if used correctly. So make sure that everyone involved in site maintenance understands that misusing these files and tags can drag organic search referrals down to dizzying lows.
Sometimes the simple mistakes are the ones that hurt the most. Assuming that all parties involved in an SEO campaign are as savvy as you are could lead to SEO disaster.