Every SEO strategy has its pros and cons
Search engines have long kept their ranking formulas secret in order to give users the most relevant results and to preclude malicious manipulation. Despite all the secrecy, though, it remains extremely difficult for search engine companies to prevent either direct or indirect manipulation.
We can broadly classify search engine optimization (SEO) strategies into two camps: passive and active. Passive optimization happens automatically: search engine robots scour the web, finding and categorizing relevant sites. Most websites listed on the search engine result page (SERP) fall into this category. If it were up to search engine companies like Google and Yahoo, this would be the only type of optimization that existed.
Active optimization takes place when website owners purposefully engage in activities designed to improve their sites’ search engine visibility. This is the kind of search engine optimization we normally talk about when we think of SEO.
Let’s go one step further and classify these deliberate forms of optimization based upon the tactics employed. In general these are: basic principles, exploiting flaws, algorithm chasing, and competitive intelligence.
Basic principles are just that. We know that search engines, just like human beings, intuitively recognize that a website has a lot of information about, say, “classic cars” when that phrase appears many times in the text, has links to other sites about classic cars, and is linked to by similar sites. Using such basic principles, we can optimize our site by creating content-rich web pages that include the targeted keyword phrases. Moreover, we can attract high-quality, relevant links, and we can make our website navigation easy for both the search engine crawler and our visitors. This strategy is primarily used by so-called “white hat” SEOs. The problem is that search engines do not always rank every such relevant site highly. It seems unfair, but it’s true.
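To make the first of these principles concrete, here is a rough sketch (my own illustration, not anything a search engine actually runs) of how one might measure how often a target phrase appears in a page's text. The sample page text and the phrase “classic cars” are invented for the example:

```python
import re

def phrase_frequency(text: str, phrase: str) -> float:
    """Return how often `phrase` appears per 100 words of `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count every position where the phrase's words appear in sequence.
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits / len(words)

page = ("Classic cars are collectible. Restoring classic cars "
        "takes patience, and classic cars hold their value.")
print(round(phrase_frequency(page, "classic cars"), 1))  # prints 20.0
```

Of course, a real ranking algorithm weighs far more than raw phrase frequency (links, anchor text, site structure), which is precisely why this principle alone does not guarantee a high ranking.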
Exploiting flaws is a technique that involves altering a web page in such a way that it confuses search engine ranking algorithms and drives them to give a high ranking to an undeserving page. This strategy is primarily used by “black hat” SEOs. Of course, such tricks are often short-lived—search engine companies are on a constant lookout to plug these holes.
Algorithm chasing is an optimization technique based upon serious study of search engine patents, public research papers, and observable data from search engine result pages. The goal is to identify patterns that might give a near approximation of the ranking formulas employed by major search engines, and thus provide a clear path toward search engine optimization. This strategy is primarily used by what we call “gray hat” SEOs. The drawback to this approach is its inherent difficulty, compounded by the fact that search engine companies regularly change their algorithms.
To illustrate how difficult this actually is, take a moment to consider all the ranking factors listed by Seomoz.org (http://www.seomoz.org/article/search-ranking-factors). This is not a scientific deconstruction of search engine algorithms, but rather a poll of expert opinions—it’s what SEO experts believe is going on in those elusive data centers. Most estimations are no doubt based upon empirical evidence, but some are nothing more than educated guesses. Between 2005 and 2007, these perceptions changed many times. Of course, so did the algorithms employed by Google, Yahoo, Microsoft and others.
Competitive intelligence is based on the idea that optimization can be achieved by carefully examining high-ranking websites, their content and their links. With just a basic understanding of how search engines operate, we can deduce how those sites achieved their high rankings and then apply that knowledge to the sites we want to optimize. This strategy is also used by gray hat SEOs. The technique can be very effective, but it bears noting that websites with high rankings are listed prominently for varying reasons. Some rank high for just a few days or weeks, others for several years. Choosing the wrong sites for competitive intelligence can yield unreliable conclusions.
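A competitive-intelligence pass usually starts with a simple audit of a top-ranked page: what is its title, what does it link out to, what does its visible text say? Here is a minimal sketch of such an audit using Python's standard-library HTML parser. The `PageAudit` class and the HTML snippet are invented for illustration; in practice you would feed it the downloaded source of a real high-ranking page:

```python
from html.parser import HTMLParser

class PageAudit(HTMLParser):
    """Collect a page's title, outbound links, and visible text."""
    def __init__(self):
        super().__init__()
        self.links = []       # every href found in <a> tags
        self.text_parts = []  # visible text fragments
        self.title = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.text_parts.append(data)

sample = """<html><head><title>Classic Cars Guide</title></head>
<body><p>All about classic cars.</p>
<a href="http://example.com/restoration">Restoration tips</a>
<a href="http://example.com/values">Classic car values</a>
</body></html>"""

audit = PageAudit()
audit.feed(sample)
print(audit.title)       # the page's title tag
print(len(audit.links))  # how many outbound links it carries
```

Run against several competing pages, even a crude audit like this reveals patterns in titles, link targets, and content emphasis—the raw material competitive intelligence works from.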
The need for a more dependable SEO strategy
We know that a search engine’s job, first and foremost, is to rank highest the most relevant websites for each and every search we make. These sites are considered web authorities. For example, you couldn’t imagine a web search for “Apple Computer” not listing Apple Inc.’s homepage. Search engines need to find these web authorities and list them first.
The strategy that I’d like to propose combines competitive intelligence with web authorities. That is, we must devise techniques designed to identify such web authorities for the target terms we want to optimize, and apply competitive intelligence to understand why those sites get to be top-ranked. Only by optimizing our sites based on this knowledge can we truly succeed.
Let’s call this strategy SEO Intelligence. At first glance, our technique might seem gray or black hat. In reality, however, it is not at all. We must always keep in mind that, in order to optimize our site and become a web authority, we still need to provide truly useful content. Only then will visitors and the topical community link to our site and grant the authority we need.
A word of caution
One of the most frustrating aspects of learning search engine optimization is the large number of often conflicting ideas and suggestions from SEO experts in blogs and forums. Evidence of this is the “Most Controversial Factors” section of the 2007 ranking factors report. The problem is that some ideas and suggestions are “gut feelings”—conclusions based on incomplete experiments, a misunderstanding of basic principles, or the ever-changing search engine ranking algorithms. As a rule of thumb, it is generally good practice to follow advice that is backed by observable evidence or that you can test on your own to confirm or debunk.
In future posts I will discuss the tactics and techniques necessary to identify web authorities.
July 4, 2007 at 11:19 am
Wow! Great header, Hamlet. Keep improving your blog and I will do the same with mine.
July 4, 2007 at 12:40 pm
Paul, I am glad you like it. My web designer helped me improve the blog a little bit. He is currently working on a new and original theme for my blog. It's going to take a few days to be finished, but I think it is going to be worth it.
July 4, 2007 at 1:04 pm
Yes the header is a lot more appropriate, another really interesting post BTW...
July 4, 2007 at 1:40 pm
Thanks. I am working to improve the overall quality of the blog.
July 4, 2007 at 11:16 pm
I recommend you: <a href="http://pearsonified.com/" rel="nofollow">http://pearsonified.com/</a>. There is quite a lot of useful information there about improving your WP.
July 5, 2007 at 4:11 pm
Thanks Paul, we follow his work. We learned about him from his excellent work on Copyblogger's theme :-)
July 4, 2007 at 11:33 pm
"SEO Intelligence" - SEO has become a science, and there are so many different factors to take into consideration that being an SEO expert means research, analysis, and knowledge exchange. It is getting more difficult every day because there are more and more web pages on the Internet, but the SERP is still the same - 10 places. I wonder what the situation will be in 10 years. 3D SERPs? More advanced vertical search? I am happy with the trend - you can't just jump into SEO these days and get great results in one month. You need to learn a lot, work hard, stay motivated and keep improving your analysis skills every single day. Getting crowded, huh? That is why we need more sophisticated methods and ways to improve our websites. Thanks for helping us with this journey, Hamlet.
July 5, 2007 at 1:50 am
SEO is a science, thanks for the post :)
July 5, 2007 at 8:52 am
I agree, but that wasn't the case a few years ago, when every kid who knew something about it could jump to the first page. And search engines were not as smart as they are now.
July 5, 2007 at 10:17 am
I wouldn't say that they are any smarter. Just more complex. I still see a lot of poor quality sites that rank in the top 10. If they were smarter, they'd have improved their quality. They are certainly harder to manipulate than they used to be.
July 5, 2007 at 4:13 pm
nicknick, Thanks for your comment
July 5, 2007 at 11:49 am
For me, smarter and more complex are the same thing. <blockquote>I still see a lot of poor quality sites that rank in the top 10.</blockquote> That does not mean search engines are not getting smarter. They are not perfect. <blockquote>If they were smarter, they’d have improved their quality.</blockquote> How do you know they haven't? <blockquote>They are certainly harder to manipulate than they used to be.</blockquote> So they are smarter? :-)
July 7, 2007 at 1:41 pm
Very nice Hamlet. Good work. If you keep writing like this, I'll be a reader for a <strong>long</strong> time.
Web Design Newcastle
September 25, 2007 at 1:43 am
Wow. Pretty enticing title there, Hamlet. My lips are watering at the thought of installment two. Good job.
January 16, 2008 at 9:34 pm
SEO will just keep evolving and evolving as time passes on. Search engines are really young if you think about how long ago they were created. There will always be a constant battle between SEOs and SE programmers.
January 25, 2008 at 10:54 am
I have been trying SEO for the past year. Out of my 7 sites, only 1 worked for a while, and I have not been able to make it work for the others. I hope this information will help me understand further.
February 12, 2008 at 8:03 am
"One of the most frustrating experiences about learning search engine optimization is the large number of often conflicting ideas and suggestions from SEO experts in blogs and forums... The problem is that some ideas and suggestions are “gut feelings”....it is generally good practice to follow advice that is backed with observable evidence ...." I like this. Reading the SEO forums can be confusing and misleading. What we really need is a set of proven principles. I'm looking forward to your next post on this topic. One such example of steps that lead to SEO results is the wonderful story of Julian Paling, a hardworking London SEO, who describes "How I got to Number 1 in Google." I think you'll enjoy it. <a href="http://dfinitive.com/blog/seo/how-i-got-aerobed-to-number-1-in-google/" rel="nofollow">http://dfinitive.com/blog/seo/how-i-got-aerobed-t...</a>
February 17, 2008 at 9:29 am
Wow, you have a very good site. Well, talking about SEO, it is very hard to improve our SEO, because nowadays there are too many new sites/blogs..
February 20, 2008 at 5:05 am
Physics, Mathematics, Seo... Seo becomes science... :D
May 6, 2012 at 7:35 pm
Making that important distinction between passive and active search engine optimization -- and then using the advanced principles you list and describe -- is key to seeing a high search rank in an awfully competitive digital world. Thank you for sharing!