When you pursue a competitive market you quickly find out how hard it is to rank for the highest volume terms. In education, it's terms like "online degrees" and "nursing schools". In the mortgage space, it's terms like "mortgage" and "refinance". It's really hard to dislodge an entrenched competitor from high positions for these terms, not least because they are making lots of money from those positions and can afford to keep investing in staying there.
Here is one of the beautiful things about the web: looking at the real search terms entered by people really reveals something about the human mind. And, man, the human mind is all over the place. While a term like "refinance" is huge in search volume, the combined volume of all the related terms is much higher.
How big is this phenomenon? At Google's Universal Search announcement, Udi Manber put up a slide stating that 20% to 25% of the search queries Google sees every day are queries it has never seen before. Let that sink in for a moment. To me, that number was startlingly large.
If we look a bit more deeply at how this plays out, you will quickly see that every market space on the web exhibits this long tail phenomenon. In conventional terms:
The sum of the traffic on all the low volume terms ≈ the sum of the traffic on all the high volume terms.
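The rough equality above can be illustrated with a toy model. Query volumes are commonly assumed to follow a Zipf-like distribution (volume proportional to 1/rank); that distribution, the term counts, and the head/tail cutoff below are my assumptions for illustration, not figures from this article.

```python
# Toy model: Zipf-like search volumes (an assumed distribution, not real data).
def zipf_volumes(n_terms, s=1.0):
    # Hypothetical relative search volume for each term, by popularity rank.
    return [1.0 / (rank ** s) for rank in range(1, n_terms + 1)]

volumes = zipf_volumes(100_000)
head = sum(volumes[:100])   # the 100 highest-volume terms
tail = sum(volumes[100:])   # everything else: the long tail

print(f"head: {head:.1f}  tail: {tail:.1f}")
```

Under these assumptions the tail's total volume actually exceeds the head's, which is the point: the obscure terms collectively matter as much as the famous ones.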
Better still, ranking for the low volume terms is often much easier. So you can get a ton of traffic to your site in a competitive market without ever ranking highly for the most important terms. This is why you see so many people talking about the long tail today. There are basically two ways to chase it:
Write in depth articles. This gives you access to long tail terms simply through the natural combinations of words that the search engine will extract from your article. The scope of this is somewhat limited, of course, as there are only so many word combinations that can be extracted from one article.
Implement lots of pages all targeted at different terms. The trick with this approach is to make the pages unique and different from each other, so they are not seen as spammy duplicate content.
The ideal is to implement a site with thousands of pages, each with its own in depth article. This is not for the faint of heart, however. There are two major ways to go about achieving this goal:
User generated content. Social media sites that succeed in drawing an initial crowd have a strong potential to really take off, because the user generated content naturally creates a very strong long tail pull effect. It becomes a feeding frenzy: more user generated content causes the site to rank for more long tail terms in the search engines, which drives more traffic, which drives more content.
Of course, the trick here is to create the initial buzz around the site.
Machine generated content. Tricky waters here. I am not talking about machine generating crappy sites. The user experience still must rule if you plan to be in the game for the long term. But if you have access to a variety of large databases in your market space, the opportunity exists to present that data, and analysis of it, in a way that really does add value to the user.
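A minimal sketch of what that database-driven approach might look like. The mortgage-rate records, field names, and template here are entirely hypothetical; the point is that each generated page targets a distinct long tail term and carries data the user actually wants, rather than boilerplate.

```python
# Hypothetical example: generating one unique, data-backed page per
# database record. The records and field names are invented.
records = [
    {"city": "Austin", "avg_rate": 6.1, "lenders": 42},
    {"city": "Denver", "avg_rate": 6.4, "lenders": 31},
]

def render_page(rec):
    # Each page targets its own long tail query ("mortgage rates in <city>")
    # and differs from its siblings because the underlying data differs,
    # which helps it avoid being flagged as duplicate content.
    title = f"Mortgage rates in {rec['city']}"
    body = (
        f"{rec['lenders']} lenders in {rec['city']} currently offer "
        f"an average rate of {rec['avg_rate']}%."
    )
    return {"title": title, "body": body}

pages = [render_page(r) for r in records]
```

The uniqueness comes from the data itself; layering real analysis (trends, comparisons) on top of each page is what turns this from thin content into something useful.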
Neither of these things is easy to do. But chasing a highly competitive market any other way can be a lonely experience, and a not very profitable one.