Optimizing for highly competitive keywords requires a completely different strategy than optimizing for non-competitive ones. First, let’s clarify a few points. When I talk about long-tail or fat-head keywords, I am talking about the search demand for those particular keywords, not the supply (the number of sites competing for them). Although demand and competition are generally proportional, there are exceptions, such as unexploited niches.
In this post, let’s just explore the simple case: you are targeting non-competitive keywords, with decent demand but not a lot of competition. You may be asking yourself how you can tell competitive and non-competitive keywords apart.
My strategy is to perform the same search several times, using advanced modifiers to look for the keyword in the title, in the URL, in the body text, and in the incoming links. If the incoming-links search returns far more results than the other searches, that’s a good indicator of heavy competition. Why? Because, as I mentioned before, this tells me search engines are relying more on off-page metrics to rank the pages than on on-page ones.
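As a rough illustration, the four searches above can be boiled down to a simple ratio. This is a sketch of the heuristic, not a tool: the operator names (`intitle:`, `inurl:`, `intext:`, `inanchor:`) are the usual advanced-search modifiers, and the result counts are hypothetical numbers you would collect by hand from each search.

```python
# Competition check sketch: compare the number of results for the
# anchor-text (incoming links) search against the on-page searches.
# All counts are example inputs gathered manually, e.g. from queries like
#   intitle:widgets   inurl:widgets   intext:widgets   inanchor:widgets

def competition_ratio(inanchor_count, intitle_count, inurl_count, intext_count):
    """Ratio of anchor-text results to the average of the on-page results.
    A ratio well above 1 suggests rankings are driven by off-page (link)
    metrics, i.e. heavy competition."""
    on_page_avg = (intitle_count + inurl_count + intext_count) / 3
    if on_page_avg == 0:
        return float("inf")
    return inanchor_count / on_page_avg

# Hypothetical example: 90,000 inanchor results vs. ~10,000 on-page results.
ratio = competition_ratio(90_000, 12_000, 8_000, 10_000)
print(f"competition ratio: {ratio:.2f}")
```

With these example numbers the ratio is 9, far above 1, so the keyword would be flagged as heavily competitive.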
An alternative approach is to check the number of backlinks of the top search results. I don't use it because not every link to a site counts toward each specific ranking. For example, a site like Amazon has millions of backlinks, and the fact that some of its pages rank for particular product searches doesn't mean all those links contributed to that ranking. It is also important to note that, as I explained before, not all top-ten results will remain there. Identifying the web authorities is essential when doing your competitive research.
So now you have used my technique and found some really good keywords that don't have a lot of competition. The goal is to use on-page optimization techniques. First, we study how our chosen competitor (be sure it’s a web authority) incorporates the keywords into its content. Then we incorporate the keywords into our own content so that the metrics are the same or similar.
Relevance metrics are the quality signals or key differentiators search engines use to organize web pages. We need to look for density, prominence, weight, proximity and rarity at a minimum. Let me explain these metrics in more detail:
Density is how many times a keyword appears in the content relative to the total number of words.
Prominence is how high in the page the keywords appear compared to other keywords.
Weight is the relative size of the font and capitalization used for the keywords.
Proximity is only relevant for phrases of two words or more and is about how closely the words are placed together.
Rarity, also known as term weight in Information Retrieval science, measures how often the keyword appears in the document relative to how common it is across all other documents; rarer terms carry more weight.
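To make these definitions concrete, here is a minimal Python sketch that computes density, prominence, an IDF-style rarity for a one-word keyword, and proximity for a two-word phrase. The formulas are simplified illustrations of the ideas above, not the actual signals any search engine uses, and the weight metric is omitted because it requires parsing markup for font size and capitalization.

```python
import math

def keyword_metrics(text, keyword, corpus):
    """Simplified versions of the metrics above for a one-word keyword.
    `corpus` is a list of other documents, used only for rarity (IDF)."""
    words = text.lower().split()
    kw = keyword.lower()
    positions = [i for i, w in enumerate(words) if w == kw]

    # Density: occurrences relative to the total number of words.
    density = len(positions) / len(words)

    # Prominence: the earlier the first occurrence, the closer to 1.
    prominence = 1 - positions[0] / len(words) if positions else 0.0

    # Rarity: IDF-style weight -- the fewer documents contain the
    # keyword, the higher its weight.
    doc_freq = sum(1 for doc in corpus if kw in doc.lower().split())
    rarity = math.log((1 + len(corpus)) / (1 + doc_freq))

    return {"density": density, "prominence": prominence, "rarity": rarity}

def phrase_proximity(text, word_a, word_b):
    """Proximity for a two-word phrase: smallest gap (in words) between
    any occurrence of word_a and any occurrence of word_b."""
    words = text.lower().split()
    pos_a = [i for i, w in enumerate(words) if w == word_a.lower()]
    pos_b = [i for i, w in enumerate(words) if w == word_b.lower()]
    if not pos_a or not pos_b:
        return None
    return min(abs(a - b) for a in pos_a for b in pos_b)
```

For example, `keyword_metrics("seo tips for beginners seo guide", "seo", other_docs)` gives a density of 2/6 and, since the keyword opens the text, a prominence of 1.0.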
Most SEOs like to use fixed numbers for these metrics, but I prefer to use the same metrics as the competing web authority I am trying to match or beat. The idea is that search engines should see the same or a similar relevance profile when they evaluate your page for those keywords as when they evaluate your chosen competitor.
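Once you have metric profiles for both pages (for instance, from a density analyzer), matching the competitor is just a comparison. Here is a hypothetical sketch; the tolerance value and the dict-based profile format are my own assumptions, not part of any tool.

```python
def profile_gap(mine, theirs, tolerance=0.1):
    """Compare two metric profiles (dicts like {"density": 0.03, ...})
    and report which metrics fall outside a relative tolerance of the
    competitor's target value."""
    off_target = {}
    for metric, target in theirs.items():
        value = mine.get(metric, 0.0)
        if target and abs(value - target) / target > tolerance:
            off_target[metric] = (value, target)
    return off_target

# Example: our density is well below the authority page we are matching,
# while prominence is already close enough.
gaps = profile_gap({"density": 0.01, "prominence": 0.90},
                   {"density": 0.03, "prominence": 0.95})
```

In this example only `density` is reported, telling you which part of the page to rework first.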
This work is definitely easier to accomplish with a dedicated tool like the keyword density analyzer. The SEO suite I plan to release next month in public beta also incorporates such functionality.
How profitable your SEO efforts will be depends primarily on how much demand your keywords have and on your ability to meet that demand better than your competitors. How likely you are to get the necessary rankings in the first place, on the other hand, depends largely on how much competition you face on the SERPs and on developing a winning strategy.
For competitive keyword phrases we need a more involved strategy, which I am going to explain in a follow-up post…