In this blog, I’ve often spoken about different link-building strategies. Generally, we can break them down into two categories: chasing links vs. letting them chase you. Both methods have their pros and cons, and personally I’ve found that a mixed approach of link acquisition and link baiting is best. In this post I’m going to talk specifically about how each works and the strategies to employ. Whether you are a “chaser” or a “chasee” I’m going to tell you why you should make sure you’re doing both.
Based on the emails and response I received for my contribution to the “Link Building Secrets” project, I know that I am not the only one that loves to use metrics to measure how close I am to my goals. Thanks to everyone for your emails and encouraging comments. In this post I want to reveal another useful metric I use for our internal and client projects.
When you check the backlinks of sites ranking for competitive keywords (terms with many search results) you see that those sites have a large number of links pointing to them. But if you count the links of the top ten (using Yahoo Site Explorer, as the rest of the backlink checkers are not very useful), you notice that the results at the top don’t necessarily have more links than the ones at the bottom. That’s because each link carries a unique rank-boosting weight (real PageRank and other link-value factors in the case of Google) that contributes to the ranking of the page for that particular term. In order to simplify things, I like to refer to the combination of positive and negative link-value factors of a page as its Link Mass.
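To make the idea concrete, here is a toy sketch of how a "Link Mass" score might be computed. Every number, field name, and weight below is invented for illustration; nobody outside Google knows the real link-value factors or how they are combined.

```python
# Toy illustration of the "Link Mass" idea: each link contributes a
# positive weight (source authority x topical relevance) minus any
# negative factors. All fields and weights here are hypothetical.

def link_mass(links):
    """Sum the net value of a page's backlinks."""
    total = 0.0
    for link in links:
        positive = link["authority"] * link["relevance"]
        total += positive - link.get("penalty", 0.0)
    return total

# A page with two strong, on-topic links...
page_a = [
    {"authority": 0.8, "relevance": 0.9},
    {"authority": 0.6, "relevance": 0.3, "penalty": 0.1},
]
# ...versus a page with five weak, barely relevant links.
page_b = [{"authority": 0.3, "relevance": 0.4}] * 5

print(link_mass(page_a))  # 0.8
print(link_mass(page_b))  # 0.6 -- more links, yet less Link Mass
```

This is exactly why counting raw backlinks across the top ten is misleading: a smaller set of heavier links can outweigh a larger set of light ones.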
As the Web keeps growing, search phrases become more competitive, and the demand for links increases, the art of link building becomes far more difficult. It’s that much more difficult if you only know traditional link-building tactics. As we move forward, it’s going to be increasingly important to think outside the box and use our creativity to come up with new link-building ideas. Fortunately, as a regular reader of my blog, you won’t have such a problem. 😉
David Hopkins, a loyal reader, asked me last week if I had some advanced link-building strategies up my sleeve. As a matter of fact I do and, as you know by now, when a loyal reader asks I deliver. I have been overwhelmed lately, but luckily Paul sent me an e-mail yesterday unwittingly reminding me about this topic. Here is what he wrote:
I was reading about mingle2.com on SEOmoz and I was wondering how Mike [Matt] managed to get so many visitors in such a short period. A high position on ‘free dating online’? What do you think?
The post he is referring to is the one in which Matt says he is leaving SEOmoz. I had read the post too and found the numbers truly amazing. I also read an interview that provides more background information about Matt’s phenomenal success, but instead of explaining how he did it (Matt explains this in the interview) I think it would be more useful to generalize the concept and provide a solid framework so that you can build off of the idea.
This is the final post in the long tail vs fat head optimization strategies series. The focus of this post is to expand on the optimization strategy for highly competitive keywords. We are going to leverage our insights from the link analysis explained in Part 2 to build better and smarter links.
The purpose of link analysis is to identify the link sources that are providing the ranking boost to your competitor. I explained several principles that are very important when evaluating links. Ideally you will try to get the same sources that are linking to your chosen web authority to link to you as well (at least the most important or most authoritative ones). You will also want to make sure the links come with similar anchor text or textual context (the text surrounding the anchor), and complement all this effort with some traditional and out-of-the-box link-building tactics. Unfortunately, this is easier said than done.
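The prioritization step can be sketched in a few lines. Assuming you have exported the backlink source domains for several top-ranking competitors (the competitor and domain names below are made up), sources that link to more than one competitor are usually the most promising targets to pursue first:

```python
from collections import Counter

# Hypothetical backlink source lists for three top-ranking pages.
competitor_links = {
    "site-a.com": {"dmoz.org", "bigdirectory.com", "nicheblog.com"},
    "site-b.com": {"dmoz.org", "nicheblog.com", "forum.example.com"},
    "site-c.com": {"dmoz.org", "bigdirectory.com"},
}

# Count how many competitors each source links to.
counts = Counter()
for sources in competitor_links.values():
    counts.update(sources)

# Sources shared by two or more competitors go to the top of the
# outreach list; the rest can wait.
targets = [domain for domain, n in counts.most_common() if n > 1]
print(targets[0])  # dmoz.org -- it links to all three competitors
```

The same structure extends naturally: attach anchor-text data to each source and you can also check whether the textual context is worth replicating.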
We all know that building solid, natural and authoritative links to your site or blog is the best way to obtain unshakable rankings. There has been a lot of chatter lately about using social networking sites to help build traffic and eventually links. Those links are hard to get, unless of course you have a power user account at a popular site like Digg. Unfortunately, getting a power user account involves a lot of work that most bloggers are not willing to put in.
Power users carry more weight than regular ones; the main benefit is that you need far fewer votes to make it to the Digg homepage. One hundred power users account for around 50% of the stories that make it there. The traffic you get from the Digg homepage is not itself very profitable, but many of those eyes glaring at the screen belong to the linkerati – influencers who will link to and blog about your story, giving you a lot of very valuable natural and authoritative links.
The basic ingredients for success at Digg are: diggable posts/articles, a power user submitting your article, and a digg-friendly landing page. Most stories make it to the home page if they get over 50 votes in less than 24 hours. To help you achieve such numbers, I've seen many blogs directly or indirectly promoting a service called Subvert and Profit (S&P), designed to get all the votes needed to land on the Digg homepage. You basically pay $1 per vote, and they pay $0.50 of that to the Diggers. This means that if you create a diggable post and a digg-friendly landing page, you only need to invest around fifty bucks to make it to the Digg homepage. And if you can make it there, you can make it anywhere. Right.
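The back-of-the-envelope math behind that "fifty bucks" figure is simple. Using the prices quoted above ($1 per purchased vote, roughly 50 votes needed, $0.50 passed to the voter):

```python
# Rough cost model for the Subvert and Profit numbers quoted above.
PRICE_PER_VOTE = 1.00   # dollars you pay per vote
DIGGER_SHARE = 0.50     # dollars the voter receives per vote
VOTES_NEEDED = 50       # rough threshold to reach the Digg homepage

campaign_cost = VOTES_NEEDED * PRICE_PER_VOTE
digger_payout = VOTES_NEEDED * DIGGER_SHARE

print(campaign_cost)  # 50.0 dollars to buy your way to the homepage
print(digger_payout)  # 25.0 of that goes to the Diggers; S&P keeps the rest
```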
According to the number of Sphinns, it looks like a lot of SEOs agree it’s a myth. That’s understandable, as it would be very unfair for the search engines to allow this type of thing to happen.
Unfortunately the situation is not as simple as it first seems. As has been my practice on this blog, let's dig a little deeper to understand why—although difficult and possibly expensive—it is entirely possible to pull off this exploit. For those concerned, I explained how to counter this type of attack in a previous post about negative SEO. Check it out.
Link building is without a doubt the most time-consuming—but most rewarding—aspect of search engine optimization. It usually takes more effort to promote your content (build links) than to actually create it. As I have stressed repeatedly, compelling, useful content should make your link-building efforts much easier.
Before I go any further, let me note that I have a slightly different perspective when evaluating link-building tactics than most SEO consultants. I do SEO primarily for my own sites and my income depends on the ability of those sites to make money. That means that I try to build links that primarily offer long-term value. I still try to get the short-term and medium-term value links, but I like to build authority for my sites. If you’re working for a client or a boss that wants to see immediate results, your priorities will probably be different.
In situations where I have to pay or put in serious effort to get a link, the most important criterion is always: will the link send useful, converting traffic?
Why is this my most important criterion? Let's explore three different scenarios to illustrate:
Let's face it. We all like to check the Google PageRank bar to see how important websites, especially our own, are to Google. It tells us how cool and popular our site is.
Links that are too easy to get rarely help much, either in driving traffic or in building authority for search engine rankings.
If your link is placed on a page where there are several hundred links competing for attention, it is less likely that potential visitors will click than if the page only has a few dozen links.
The value of your link source is in direct relation to how selective that source is when placing links on the page and how much traffic the source gets. The value also declines with the number of links on the page.
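One way to make that relationship concrete (purely a hypothetical model, not a known search-engine formula) is to treat a link's expected value as the source page's traffic split across its outbound links, scaled by how selective the source is:

```python
def expected_link_value(page_traffic, links_on_page, selectivity):
    """Toy estimate: the page's traffic is split across all the links
    on it, and more selective sources pass more value per link.
    `selectivity` is a made-up 0..1 score, not a real metric."""
    return page_traffic * selectivity / links_on_page

# A curated page with a dozen links beats a link farm with 500 links,
# even though the link farm's page gets more raw traffic.
curated = expected_link_value(page_traffic=5000, links_on_page=12,
                              selectivity=0.9)
farm = expected_link_value(page_traffic=20000, links_on_page=500,
                           selectivity=0.1)
print(curated)        # 375.0 expected visitors from the curated page
print(farm)           # 4.0 from the link farm
print(curated > farm) # True
```

The division by `links_on_page` captures the "several hundred links competing for attention" problem directly: every extra link on the source page dilutes yours.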
Google is understood to use algorithms to measure the importance and quality of each page. PageRank, invented by Google's founders, measures the absolute importance of a page, while the TrustRank algorithm describes a technique for identifying trustworthy, high-quality pages. We cannot tell for sure to what extent Google uses TrustRank, if at all, or at least the publicly known version of it. What we can say, based on observation, is that Google definitely does not treat all links equally and does not pass authority to your page from all of your link sources.
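For readers curious what "measuring absolute importance" looks like mechanically, here is a minimal sketch of the published PageRank power iteration. The four-page link graph is invented for illustration, and Google's production ranking obviously uses far more signals than this textbook version:

```python
# Minimal PageRank power iteration on a tiny invented link graph.
# links[p] lists the pages that p links out to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

DAMPING = 0.85  # the damping factor from the original PageRank paper
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal rank

for _ in range(50):  # iterate until the ranks settle
    new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
    for page, outlinks in links.items():
        # Each page shares its current rank equally among its outlinks.
        share = DAMPING * rank[page] / len(outlinks)
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

# "c" collects links from three pages, so it ends up most important.
print(max(rank, key=rank.get))  # c
```

Even this toy version shows why links are not equal: a vote from a page that itself has high rank (or few other outlinks) moves the needle far more than a vote from an obscure, link-heavy page.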