There are surely hundreds of different ways that people can mess up their sites. Based on our experience working with scores of clients, here are the 10 most common problems that we find:
- Broken information architecture - These are sites that fail to map the nature of the information they are providing into an understandable hierarchy. Bad for users and search engines.
- Poor site usability - Results in poor conversion rates, poor lifetime value per visitor, and a less attractive site for linking to.
- Mismanaged internal link juice - Some sites allocate their link juice poorly, resulting in not enough of it going to their most important pages.
- Content getting buried over time - A surprising number of sites that create huge amounts of content use publishing systems that bury stuff over time. Such a lost opportunity. People and crawlers are looking for this stuff, don't hide it!
- Bad redirects - Why is it that every engineer who does not have a background in SEO defaults to using a 302 redirect? OK, that's probably unfair (you might even say harsh), but it sure seems that way at times. A 302 is a temporary redirect, so search engines do not pass link juice to the new URL the way they do with a permanent 301.
- Poor titles and headers - Keyword tools are wonderful weapons, and valuable for far more than SEO: they tell you what language people actually use when referring to your products and services. Even in a world without web sites, this is something you would want to know. Use that language in your titles and headers, and make sure you have pages and content that address the major topic areas that relate to your business.
- Insufficient content - No content (or tools) means no links means no traffic. It's that simple. What unique value are you offering a visitor to your web site? Why would someone link to your site?
- Duplicate content - It's unbelievable how much duplicate content some sites create. It's a common killer, and it's a very big factor in poor PageRank management.
- All-Flash site - Very, very pretty, but not a great experience for a crawler. This does not mean you can't have an (almost) all-Flash site. Just make sure you offer text link navigation options and use a technique like Scalable Inman Flash Replacement (sIFR) to tell the crawler what is in the movie.
- Same meta description (and keywords) on every page - These elements are included in the duplicate content filter checks run by the search engines. And, of course, the meta description matters because search engines often use it as the description shown for your page in the results, so make sure it describes the unique information to be found on that page.
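To make the 301-vs-302 point above concrete, here is a minimal sketch in Python. Everything in it (the WSGI app, the URL mapping, the paths) is illustrative, not taken from any particular site; the only substantive point is the status code:

```python
# A hypothetical WSGI app for a page that has moved permanently.
# The common mistake is sending "302 Found" (temporary), which tells
# search engines NOT to transfer link juice to the new URL.
# "301 Moved Permanently" tells them the move is for keeps.

def redirect_app(environ, start_response):
    old_to_new = {"/old-page": "/new-page"}  # illustrative mapping
    path = environ.get("PATH_INFO", "/")
    if path in old_to_new:
        # 301, not 302: crawlers should follow and credit the new URL.
        start_response("301 Moved Permanently",
                       [("Location", old_to_new[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"Hello"]
```

The same idea applies whether the redirect lives in application code, an .htaccess file, or a load balancer config: if the old URL is gone for good, say so with a 301.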
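Duplicated meta descriptions are also easy to audit yourself. A rough sketch using only the Python standard library (the function name and the page inputs are illustrative, not a real tool):

```python
# Sketch: flag meta descriptions shared by more than one page.
from collections import defaultdict
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pulls the content of <meta name="description"> out of a page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def find_duplicate_descriptions(pages):
    """pages: dict mapping URL -> HTML source.
    Returns {description: [urls]} for descriptions used on 2+ pages."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = MetaDescriptionParser()
        parser.feed(html)
        if parser.description:
            seen[parser.description].append(url)
    return {desc: urls for desc, urls in seen.items() if len(urls) > 1}
```

Point this kind of check at a crawl of your own site; any description that shows up on more than one URL is a candidate for a rewrite.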
We have run into many other problems along the way, of course, but these are the most common offenders. What other common SEO mistakes do you see? Discuss this over in the SEW forums.