Link Mass: How to determine how much effort it takes to rank for any particular keyword phrase

Based on the emails and responses I received for my contribution to the “Link Building Secrets” project, I know that I am not the only one who loves to use metrics to measure how close I am to my goals. Thanks to everyone for your emails and encouraging comments. In this post I want to reveal another useful metric I use for our internal and client projects.

When you check the backlinks of sites ranking for competitive keywords (terms with many search results), you see that those sites have a large number of links pointing to them. But if you count the links of the top ten results (using Yahoo Site Explorer, as the rest of the backlink checkers are not very useful), you notice that the results at the top don’t necessarily have more links than the ones at the bottom. This is because each link carries a unique rank-boosting weight (real PageRank and other link-value factors, in the case of Google) that contributes to the ranking of the page for that particular term. To simplify things, I like to refer to the combination of positive and negative link-value factors of a page as its Link Mass.

Understanding Link Mass

In order to explain link mass better, let me first outline a couple of fundamental concepts: importance and reputation. Importance (or PageRank) is a measure of the visibility of a page. It is typically determined by the sheer number of incoming links a page has and their respective importance. However, a page can be very important (highly visible) but not necessarily very reputable. For search engines, the reputation of a page depends primarily on how much it is associated with spam. Search engines internally label each page in their index as spam or not spam (using a range rather than just black or white). The best way to understand reputation is to think of it like your credit score: the more infractions you commit, the more points are taken from your score. Similarly, the more your site links to sites that are considered spammy, the more your score suffers.

There are other factors that can affect your reputation and flag your site as potentially spammy: getting too many links in a short period of time without natural signals to support it, having an artificial (unnatural) pattern in your link structure, evidence of massive link exchanges, or having the majority of your links come from very low-quality sources (already flagged as spam by the search engine). The idea is that your importance improves with the quantity and quality of your incoming links, while your reputation diminishes with any signal search engines can pick up that flags your site as guilty of link spam.

With these concepts clear, it is simple to explain link mass. The link mass of a page is the sum of the importance (or real PageRank) contributions from incoming links, minus the reductions in the reputation score for the reasons explained above. A few other things to note:

1. The more links you have and the higher their quality, the higher your link mass; the more negative elements in your site or in the sites linking to you, the lower your link mass.

2. Pages with higher link mass contribute more link mass to the pages they link to than pages with lower link mass.

3. Links coming from diverse sources contribute more link mass than links coming from affiliated sources.
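Putting the three points together, here is a minimal Python sketch of how link mass could be tallied. The importance scale, spam penalties, and the dampening factor for affiliated sources are all hypothetical; the point is the shape of the model, not the numbers.

```python
# A toy link mass model: importance contribution per link, discounted
# by spam penalties (point 1) and by repeated domains (point 3).
def link_mass(links):
    """Each link is a dict with 'importance' (e.g. toolbar PR, 0-10),
    'spam_penalty' (0.0 = clean, 1.0 = certain spam) and 'domain'."""
    seen_domains = set()
    mass = 0.0
    for link in links:
        contribution = link["importance"] * (1.0 - link["spam_penalty"])
        # A domain we have already counted contributes less: diverse
        # sources are worth more than affiliated ones.
        if link["domain"] in seen_domains:
            contribution *= 0.5  # arbitrary dampening factor
        seen_domains.add(link["domain"])
        mass += contribution
    return mass

links = [
    {"importance": 7, "spam_penalty": 0.0, "domain": "a.com"},
    {"importance": 4, "spam_penalty": 0.5, "domain": "b.com"},
    {"importance": 4, "spam_penalty": 0.0, "domain": "a.com"},  # repeat domain
]
print(link_mass(links))  # 7 + 2 + 2 = 11.0
```

Note that point 2 (stronger pages pass more link mass) is already captured by the importance value each link carries into the sum.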

Calculating Link Mass

Determining link mass requires a careful and thorough analysis of the link structure of a site or page. First you need to extract all the links to the site, as I explained in a previous post (BTW, I still plan to release the tool I promised in that post; I am just not very happy with its performance at the moment), and then you need to measure the importance and reputation of each incoming link. Measuring importance can be as simple as checking the toolbar PageRank. Determining the reputation of a page is harder because we don’t know for sure which pages Google or other search engines flag as spam. However, I think the technique I described in the link building project article offers a good approximation of the link mass of a page, because any negative factors are already accounted for when search engines decide how to prioritize the crawling and indexing of the site. After all, why would they crawl a page often if they believe it is not very reputable?
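As a rough illustration of that crawl-priority idea, one could treat the age of a page's search engine cache as a reputation proxy: pages the engine considers reputable get crawled, and re-cached, more often. The 30- and 90-day thresholds below are assumptions of mine, not published values.

```python
from datetime import date

def reputation_proxy(last_cached: date, today: date) -> float:
    """Map cache age to a crude 0-1 reputation estimate."""
    age = (today - last_cached).days
    if age <= 30:
        return 1.0   # fresh cache: the engine visits often
    if age <= 90:
        return 0.5   # stale cache: crawled only occasionally
    return 0.1       # very old cache: likely low priority, low trust

print(reputation_proxy(date(2008, 1, 1), date(2008, 1, 15)))  # 1.0
```

A value like this could feed directly into the spam-penalty term of the link mass tally above, one link at a time.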

Identifying link opportunities

On the Web there are content consumers and content producers. It is important to understand this because content producers and content consumers do not usually speak the same language. Content consumers are the ones who type queries into search engines. Search engine rankings, however, are primarily influenced by content producers via the links and endorsements they include in their content. I am not going to discuss here the motivations of content producers to give links (paid or editorial); rather, I want to offer a very simple way to identify link opportunities. When a content producer writes content naturally, it is unlikely that he or she will use the exact same words that somebody (the content consumer) types into the search box. In a previous blog post, I talked about how people type their problems into search engines by expressing the symptoms they have, not necessarily by typing in the solutions. This is why content producers need keyword research to write more successful content. It is also one of my favorite strategies for finding keywords with low link mass (and thus ones that require less effort to rank for).

How to determine the effort required and project the ROI

Link mass is therefore an important metric to consider when determining the amount of effort necessary to rank for a particular term. Just counting the sheer number of links for a top-ranking site doesn’t tell us the whole story. We need to consider the importance and reputation of each one of the links, because some links will contribute more link mass (ranking power) than others, and it is usually a good idea to try to get the hardest ones first. The easy ones will be easy. 🙂

Different link-building strategies require different amounts of effort, time and budget. If you are a smart link builder you will try to sync your keyword research with your link-building strategy. Among a group of relevant keywords for your site, you want to focus on the ones that have the same or similar demand (volume of searches or potential clicks) but require the least effort (lowest link mass), in order to begin seeing results sooner rather than later.
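A quick sketch of that prioritization, using made-up demand and link mass numbers:

```python
# Hypothetical keyword data: similar demand, very different effort.
keywords = [
    {"phrase": "debt help", "monthly_searches": 9000, "required_link_mass": 420},
    {"phrase": "debt advice", "monthly_searches": 8800, "required_link_mass": 150},
    {"phrase": "debt management", "monthly_searches": 9500, "required_link_mass": 900},
]

# Rank by demand per unit of link-building effort (higher is better).
keywords.sort(key=lambda k: k["monthly_searches"] / k["required_link_mass"],
              reverse=True)

for k in keywords:
    print(k["phrase"])
# "debt advice" comes first: nearly the same demand, a fraction of the effort.
```

The exact ratio is less important than the habit of comparing demand against effort instead of chasing raw search volume.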

It is best to keep a simple spreadsheet where you can tally those numbers. For example, let’s say that you are trying to get your site to reach a link mass requiring ten links with a PageRank of 7, five links with a PageRank of 4, and twenty links with a PageRank of 3. You also want to make sure that the sites where the links are placed do not have any visible spam signals, such as a very old search engine cache or a massive link-exchange directory. Then you measure how much time and effort it takes you to get a PageRank-7 link, a PageRank-4 and a PageRank-3. Please note that I am not talking about buying links, but about the effort involved (e.g. researching, writing and pitching a guest post to a site with those characteristics, or finding a dofollow blog and leaving a thoughtful comment). Knowing your target link mass along with these numbers will allow you to project the cost and time to reach a particular ranking. You can combine the data from your keyword research to determine the revenue potential of the keywords, and with some simple math calculate the ROI for yourself or your client.
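Here is what that spreadsheet math could look like in Python. The effort, cost and revenue figures are hypothetical placeholders; plug in what you actually measure.

```python
# Target link mass expressed as links needed at each PageRank level,
# plus the measured effort to earn one link at that level.
targets = {7: 10, 4: 5, 3: 20}          # PageRank level -> links needed
hours_per_link = {7: 40, 4: 10, 3: 3}   # measured outreach hours per link
hourly_cost = 50                        # what an hour of outreach costs you

total_hours = sum(n * hours_per_link[pr] for pr, n in targets.items())
total_cost = total_hours * hourly_cost

# Estimated from keyword demand and your conversion rate.
monthly_revenue = 4000
months_to_break_even = total_cost / monthly_revenue

print(total_hours)                     # 510 hours
print(total_cost)                      # $25,500
print(round(months_to_break_even, 1))  # 6.4 months
```

With these three numbers in hand, deciding whether a keyword is worth the campaign becomes straightforward arithmetic rather than guesswork.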

Final Thoughts

The two most fundamental aspects of any SEO project in terms of the return for time spent are keyword research and link building. You need the keyword research to make sure you are getting relevant traffic, but without links you won’t rank for any keyword that is going to send you search traffic, period. I hope this helps you with your own link-building strategies. Let me know in the comments.

23 replies
  1. Gavin Mitchell says:

    Interesting to see your scientific approach to the problem, Hamlet. I think most SEOs perform a similar analysis before starting a new campaign, but on a more informal, cursory basis.

    My main concerns would be the time it takes to manually perform analysis like this when the competition has extensive link profiles and the level of complexity you'd probably need to go into for a good level of accuracy.

    Is this something you're working on automating?

  2. Hamlet Batista says:

    Yes. I agree it takes a lot of work, but I think that it makes the SEO effort more predictable.

    This is one of the ideas that we have in research and development. I decided to start exposing them here to get early feedback and measure interest.

  3. Jez says:

    Regarding mass link exchange / purchase, what is your opinion on systems like

    Do you think they will survive?

    Also, in sectors like "debt management" the top performers all seem to be supported by large spam networks linking up through progressively better-quality sites. Looking at sectors like that increasingly makes me question Google's ability to deal with spam.

  4. Arnie says:

    Another excellent article. 99.9% of the businesses out there just do not understand how much work is involved in researching and obtaining high-quality links. Can't wait to see the new tool you are developing.

  5. Jeffrey L. Smith says:

    Impeccable post. I am glad to see others who give so freely with such prized gems. That technique is worth its weight in gold. I have a similar metric I use for determining the barrier to entry for search targets, but I doubt I could have described it so clearly. The narration is flawless. Thanks for dropping by this week and reading a few posts, it's an honor…

  6. Aldo N. Gomera Cruz says:

    I was searching through articles about Link Mass as part of an assignment for my Advanced E-Commerce Class and actually found what I was looking for in this excellent blog post.

    I really liked when you talked about the factors that can affect the reputation and flag a site as potentially spammy. I think this is the main reason that prevents the success of a lot of new sites in the market: they just want to get as many links as they can, without caring about the reputation of those links or projecting any ROI. That's why our list of potentially spammy sites grows faster every day.

    Cheers from a Dominican at Utah State University!

  7. Hamlet Batista says:

    Jez – I think that they will find more creative ways to hide their link networks. I've seen some link buying that is really hard for search engines to detect.

    Arnie – Thanks. I will make sure to let you know once it is ready.

    Loren – I'm glad you enjoyed the post.

    Jeffrey – Thank you very much for your kind words. You have excellent material on your blog. Keep up the good work.

    Aldo – Thanks and welcome to my blog. It is really interesting that you are studying this in class. Please stay in touch.

  8. David Hopkins says:

    Do you have any reference for Google giving pages a spam score? I have noticed that pages that get bombed with spam from the usual suspects (Viagra, gambling, etc.) often have no toolbar PR, when their siblings suggest they should have PR. This happens even when the spam has had its tags stripped.

    Looking forward to the possibility of your tool that can retrieve all links from Y!. Have you ever tried doing something like this: -site:serpsvilian .com

    I noticed there is also some argument in the Yahoo API that lets you ignore links from certain domains.

  9. Google Search Sucks says:

    Good point. I like the term Link Mass. However, one thing I have noticed is that in some sectors you have to embrace spam techniques simply to compete. Keeping yourself completely white hat for these terms is an uphill battle you cannot win. For example, one niche where I thought I could help a client win is not going well because everyone on the first page has tons of "review my site" types of backlinks. There is no way I could prove that all of those links were paid for, but any SEO can tell. Moral of the story: link mass varies by vertical and industry.

  10. Motivational Guru says:

    I do agree that link building and keywords are indeed important in SEO. It takes a lot of time and effort to make your page rank, but it is worth all of the effort if you are able to do it. I have taken some of your tips and will keep them in mind, since I am starting out with a new blog and need to have it rank.

  11. Scott says:

    Re the comment on Google giving a spam score, it would be very interesting to see if any major sites / keywords show up with a high spam score, a high PR and also a high position in the search results.

    I've been looking at a lot of what I guess you could call "spammy directories" with recent listings from major UK companies, one being It's interesting to note that for such a large company, its homepage PageRank is zero.

    One question I have while on the subject, which is probably answered elsewhere: why does Google only show very limited backlinks for a URL from the search results, but from Google's tools it's a completely different result?

  12. Paul says:

    Before, I was really confused about why the search engines always give the same sites the top spots. Now I understand that this is because of how many links they have. Thank you very much for the information.

  13. Money Merge Account says:

    i agree with what mr googleman had to say… them SEO peoples can easily tell (for the most part) which links are being paid for 😮 But yeah, knowing which key terms have the most traffic is crucial in successfully marketing your business.

  14. Palcom Web says:

    Search engine optimization is a time-consuming and tricky business. It requires a lot of effort and hard work to rank at the top. But the key phrases that rank well on one search engine may totally fail or be less effective on other search engines. All the major search engines differ from each other in some form or another. It is for this reason that some people create web pages for a particular search engine while the rest of the pages are created for other search engines. Usually only a slight difference is present in these pages, so when indexing takes place the search engine crawlers might find the slight difference and mark the pages as spam. To overcome these difficulties a robots.txt file is created, which is a simple text file that is uploaded to the root folder of your site.

    Write the following:
    User-agent: (spider name)
    Disallow: (file name)

    To disallow all engines from indexing a file you simply use the * character where the engine's name would usually be. However, beware that the * character won't work on the Disallow line.

  15. Marc Crouch (Alterit says:

    Very useful article, thank you for posting it!

    I was actually trying to explain this to somebody the other day, and was struggling to convey the importance of link quality. Your analogy with the credit scoring system is spot on; I will definitely use that in the future!

  16. Ryan Bassett says:

    Excellent article on determining the competitiveness of a competing website. I use SEO PowerSuite's SEO SpyGlass for this, although web tools like Majestic SEO might be better. This is super important so you don't waste your efforts trying to rank for a keyword that is outside your reach. I hope that people get to read this post.


Trackbacks & Pingbacks

  1. […] Hamlet Batista explains what Link Mass is […]
