Quality is king in search engine rankings. Of course spam sites using the latest techniques still make their way to the top, but their rankings are fleeting and quickly forgotten. Quality sites are the only ones that maintain consistent top rankings.
This wasn’t always true. A few years ago it was very easy to rank highly for competitive search terms. I know that for a fact because I built my company from scratch using nothing but thin affiliate sites. Everybody knows there has been a drastic change since then. At least on Google, it is increasingly difficult for sites without real content to rank highly, no matter how many backlinks they acquire.
Why is this?
Search engines use “quality signals” to rank websites automatically. Traditionally, those signals were things you might expect: the presence of the searched words in the body of the web page and in the anchor text of links coming from other pages. There is increasing evidence that Google is looking at other quality signals to assess the relevance of websites.
Here are some of the so-called quality signals Google might be using to provide better results:
Voluntary Quality Raters. It is well known that Google pays more than 10,000 people to run searches and rate the results they see. It is safe to assume that these raters do not visit every single result to judge its quality. Writing great titles and meta descriptions is a good way to appeal to them. For example: if I am a Quality Rater and I see one listing that says “Advanced search engine marketing tips to succeed online” and another that says “Search engine optimization, SEO, PPC, SEM, Search engine positioning…”, which one do you think I'd rate as spam?
Involuntary Quality Raters. You are a Quality Rater, too—and you didn’t even know it! Each time you run a search, scan the results, and immediately retype the query, Google’s software takes notice. Even when you click on some of the results, shake your head, and keep looking, Google is listening. That behavior signals to Google that the results it gave you might not have been very relevant after all.
(As an aside: EGOL at Seomoz pointed out an interesting idea: Theoretically Google could use the personalized information they gather from us (Google Reader, Toolbar, Web history, etc.) to tell what we like—but also what we know about. For example, if my profile tells them that I read a lot about science, they might trust my clicks when I do searches for 'genetic research'. If my passion was celebrity gossip, maybe less so!)
Visitor actions. Are visitors arriving from a search and staying on your site, or are they hitting the back button and reformulating the query? You want to provide sticky content and, most important of all, content that matches the visitor’s query. You can offer the best page in the world about baseball caps, for example, but if the user was searching for baseball scores, that’s a strikeout.
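No one outside Google knows how such a behavioral signal is actually computed, but as a toy sketch, this “pogo-sticking” could be estimated from a click log. Everything here is my own invention for illustration: the event format (timestamp in seconds, query text, clicked URL or None for a search) and the 30-second threshold.

```python
def pogo_stick_rate(events, threshold=30):
    """Fraction of result clicks followed by a new search within `threshold` seconds.

    `events` is a chronological list of (timestamp_seconds, query, url) tuples,
    where url is None for a search event and a string for a result click.
    This format and the threshold are hypothetical, purely for illustration.
    """
    pogo = clicks = 0
    for (t1, _, url), (t2, _, next_url) in zip(events, events[1:]):
        if url is None:
            continue  # a search event, not a click on a result
        clicks += 1
        # A quick return to the search page counts as a pogo-stick.
        if next_url is None and t2 - t1 <= threshold:
            pogo += 1
    return pogo / clicks if clicks else 0.0
```

A high rate across many visitors would suggest the result did not satisfy the query, which is exactly the baseball caps vs. baseball scores mismatch described above.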
Social bookmarking/rating. Unique and useful content is what drives visitors to bookmark and thumb up your pages. Content that collects many bookmarks and votes is content that search engines can trust as quality content.
Analytics (visit length, bounce rates, return visits). This might come as a surprise to the overly naïve, but Google gives Google Analytics away for free for a reason. My opinion is that analytics provides quality information that no other method can (except maybe the Google Toolbar). High bounce rates, short visit lengths, and few return visits are big signals that scream: SPAM. Many webmasters don’t install analytics for this very reason; they are afraid. My opinion, though, is that Google uses the information in aggregate form, not against specific sites.
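To make the "aggregate form" idea concrete, here is a minimal sketch of how visit-level data could be rolled up into per-site signals. The record format (site, seconds on site, pages viewed) and the one-page definition of a bounce are assumptions of mine, not anything Google has published.

```python
from collections import defaultdict

def site_quality_signals(visits):
    """Aggregate hypothetical visit records into per-site quality signals.

    Each record is a (site, seconds_on_site, pages_viewed) tuple.
    A visit that views only one page is treated as a bounce.
    """
    stats = defaultdict(lambda: {"visits": 0, "seconds": 0, "bounces": 0})
    for site, seconds, pages in visits:
        s = stats[site]
        s["visits"] += 1
        s["seconds"] += seconds
        if pages == 1:
            s["bounces"] += 1
    # Roll the raw counts up into the two signals discussed above.
    return {
        site: {
            "avg_visit_seconds": s["seconds"] / s["visits"],
            "bounce_rate": s["bounces"] / s["visits"],
        }
        for site, s in stats.items()
    }
```

On data like this, a thin affiliate site would show short visits and a bounce rate near 1.0, while a content-rich site would show longer visits and more multi-page sessions.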
Feed subscribers/readers. One of the best measures of a blog's success is its number of RSS subscribers. A large subscriber count suggests the blog is of high quality. No wonder Google recently bought Feedburner and set it free: the data is worth a lot more than the subscription fees. I am experimenting with Feedburner’s FeedFlare, and I have an idea, which I plan to share later, to help you increase your RSS subscribers. Stay tuned.
What does it all mean?
The point I’m trying to get across is that chasing search engine algorithms to discover how they rank websites is a losing proposition. It is practically impossible to keep track of so many variables, and many of them are outside an SEO's control anyway.
The good news is that it keeps getting easier for sites that are good for users to rank high. As I mentioned before, your time is better spent building useful, original content for your blog. With good content and visitors recommending it, bookmarking it, and subscribing to your feed, links and high rankings will follow. Quality has always been king, and it looks like it's going to stay that way.