The Fascinating Role of AI in Accelerating SEO Success
In this post I’m going to officially introduce RankSense: its rebirth as an SEO Artificial Intelligence platform, why I built it, and the problems it aims to address.
I believe that SEO success should take a few weeks, not six months or more. Unfortunately, six months is the reality we live with now. Developing new content, building an audience around it, and waiting for Google to reward your efforts takes many months. But is that the only way to see SEO success?
When you operate a large site, such as an online retail or enterprise site with thousands or millions of pages, you have several factors that greatly delay SEO results. On the flip side, you also have other key SEO levers to pull besides content marketing. I covered one of those in detail in the popular Practical Ecommerce article “How to Detect and Correct Duplicate Content Pages.” Technical SEO remains a massive untapped opportunity for large sites.
In order to understand the current challenges, I’ll go through the best-case scenario I saw while running my SEO agency. The best-case scenario is an SEO-savvy company that understands that pages need to be indexed before they rank. This was not typically the case because most companies’ understanding of SEO is limited to ranking #1 for competitive head terms.
Here is the scenario:
An SEO consultant or in-house SEO starts with a Screaming Frog, Moz or Deep Crawl crawl of the site, which can take several days or weeks, depending on how big the site is.
Our SEO specialist takes the reports from those fine tools and writes a detailed list of action items, which are converted to JIRA tickets. The technical tickets get assigned to developers, and the content/writing tickets get assigned to marketing people or are performed directly by the SEO specialist.
Developers are generally busy with other projects that get higher priority because their ROI is clear and certain. SEO projects often don’t get prioritized. I believe most companies only invest “gambling” money on SEO: money that could have a big payoff but that they aren’t afraid to lose.
Developers publish changes according to their software development lifecycles and the complexity of the fixes. In some cases, the fixes or recommendations are impossible to implement because of limitations in the content management or e-commerce system.
Later, the changes get published to a staging environment, and the SEO consultant or in-house person reviews them before they are released to production.
The changes are now live, and we need to wait weeks for Google to pick up the changes.
So, in this best-case scenario where the company is SEO-savvy, the developers are available, the teams execute with precision, and the recommendations work, we have to wait around two months to see results.
Now, imagine the improvements added $5,000 per day in organic revenue. If we could accelerate the work by one month, we would save $150,000 in opportunity cost, money that could be invested in other marketing efforts.
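Using the figures above, the opportunity cost is a one-line calculation:

```python
# Hypothetical figures from the scenario above.
daily_revenue_lift = 5_000   # extra organic revenue per day
days_saved = 30              # shipping the fixes one month sooner

opportunity_cost_recovered = daily_revenue_lift * days_saved
print(opportunity_cost_recovered)  # 150000
```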
Accelerated results would also allow us to determine the effectiveness and ROI of the SEO strategy faster and more accurately.
There has to be a better way. Right?
That is what I thought.
I designed our artificial intelligence SEO software with the main goal of compressing time in the SEO cycle. This goal was anchored on three core principles:
- Tasks should be done in parallel.
- Repetitive, high-value tasks should be automated.
- Processes should be improved by a data-driven feedback loop.
In software engineering, we have the well-known concepts of batch and stream processing. Batch processing is essentially how SEO work is done now. All the tasks are completed in large chunks that must be performed sequentially because they depend on one another. This is the obvious choice when you have people performing the work. Each person specializes in a different type of task. One specialist, say a web developer, needs to wait on the completed work of another, such as an in-house SEO.
However, when you have machines doing the work, you can tell them to complete small portions of the work and collaborate to do independent tasks in parallel.
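To make the batch-versus-parallel distinction concrete, here is a minimal Python sketch. The task functions are purely illustrative, not RankSense code:

```python
from concurrent.futures import ThreadPoolExecutor

# Two independent audit tasks (illustrative stand-ins).
def check_canonicals(page):
    return f"canonical-ok:{page}"

def check_titles(page):
    return f"title-ok:{page}"

pages = ["/product/1", "/product/2", "/product/3"]

# Batch style: one task type at a time, sequentially.
sequential = [check_canonicals(p) for p in pages] + [check_titles(p) for p in pages]

# Parallel style: independent checks run concurrently as pages arrive.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(fn, p) for p in pages
               for fn in (check_canonicals, check_titles)]
    parallel = [f.result() for f in futures]

# Same work gets done; only the scheduling differs.
assert sorted(sequential) == sorted(parallel)
```

The point is not the thread pool itself but the scheduling: machines can interleave small, independent units of work instead of waiting for a specialist to finish a whole batch.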
Let me explain how RankSense completes the workflow I described above in parallel.
The first breakthrough idea is that we don’t need to simulate a spider crawl to audit sites like all other SEO tools do. We piggyback on the crawls from search engine crawlers that take place all the time. Think about that for a minute.
Google, Bing, and other search engines are actively crawling sites all the time, and if we could tap into those crawls to perform our audits, we would get at least three benefits:
- We would get an accurate picture of SEO issues as seen by search engines.
- We would not slow down sites with extra crawls or consume bandwidth.
- We would detect SEO issues in real time as they are discovered by search engines. (This is my favorite because it aligns with my goal of compressing time.)
That is exactly what RankSense does, and it works incredibly well.
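One simple way to illustrate the piggybacking idea, assuming you have access to the web server's logs, is to filter crawler requests out of the access log. (Real systems should also verify crawler IPs via reverse DNS, since user agents can be spoofed; this is a sketch, not our implementation.)

```python
import re

# Toy access-log lines; in practice these would stream from the live server.
LOG_LINES = [
    '66.249.66.1 - - [04/Jan/2018:10:00:00 +0000] "GET /product/1 HTTP/1.1" 200 "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [04/Jan/2018:10:00:01 +0000] "GET /cart HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0)"',
]

BOT_PATTERN = re.compile(r"Googlebot|bingbot")

def crawler_hits(lines):
    """Keep only the URLs requested by search engine crawlers."""
    hits = []
    for line in lines:
        if BOT_PATTERN.search(line):
            path = line.split('"')[1].split()[1]  # URL from the request line
            hits.append(path)
    return hits

print(crawler_hits(LOG_LINES))  # ['/product/1']
```

Every URL in that stream is a page a search engine just looked at, so auditing those URLs gives you the search engine's view of the site with zero extra crawling.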
When you are manually addressing SEO issues, getting real-time reports is not very helpful because there is still the bottleneck of the manual processes.
Now, I’ll need to introduce the next breakthrough that enabled our automated implementation.
I wrote my first AI system during my first programming class back in college many years ago. We were tasked with writing a Connect 4 game where the computer would play against the human player. I wrote about this a while ago in this post. My classmates designed their games by writing down a set of rules the machine would follow, but I wrote mine so the computer would look ahead to find the best possible moves. Computer chess programs work in a similar way. The advantage of a game is that you have complete information, but in many domains, like SEO, that is not the case. For our first generation, we are focusing on a rules-based approach, and using machine learning to rank which workflows (set of rules) work best over time.
A large number of high-value technical SEO fixes are repetitive. For example, many duplicate content issues can be addressed the same way using canonicals or 301 redirects. The big challenge is generalizing, since most sites are completely unique.
I borrowed the solution and our next breakthrough from two industries: marketing automation and cyber security. In most marketing automation tools, custom workflows are heavily used to personalize e-mail marketing campaigns. In the security industry, antivirus and firewall vendors use small “fingerprints” or signatures to detect viruses or hacking attacks. I combined both ideas: we treat SEO issues like viruses with “signatures” that are consistent across sites, and we use customizable workflows to guide precise recommended actions. As my goal is to compress time, we now have a growing list of predefined workflows to address the most common SEO problems and opportunities.
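The signature idea can be sketched as a list of predicate/fix pairs applied to crawled page data. Everything below is illustrative, not RankSense's actual rule format:

```python
# Each "signature" pairs a detection predicate with a prescribed fix,
# analogous to an antivirus signature. Names and rules are made up.

def missing_canonical(page):
    return page.get("canonical") is None

SIGNATURES = [
    ("missing_canonical", missing_canonical,
     "add rel=canonical pointing to the primary URL"),
]

def audit(pages):
    """Match every page against every signature and emit recommended actions."""
    actions = []
    for page in pages:
        for name, predicate, fix in SIGNATURES:
            if predicate(page):
                actions.append((page["url"], name, fix))
    return actions

pages = [
    {"url": "/a", "canonical": "/a"},
    {"url": "/a?ref=1", "canonical": None},
]
print(audit(pages))
# [('/a?ref=1', 'missing_canonical', 'add rel=canonical pointing to the primary URL')]
```

Because the signature is defined over page attributes rather than any one site's markup, the same rule generalizes across sites, which is exactly what makes the fixes automatable.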
If our client’s SEO specialist doesn’t agree with how we are prescribing solutions, they can create their own workflow with a visual drag and drop interface.
Here is our automation designer. We are still working to make it more intuitive to use, but it works really well.
Pause for a moment to think about the implications of these two key components of our system. We can audit sites in real time, and as the issues are identified, we can execute predetermined fixes in an instant. Just think about how much valuable time this saves. Massive. Right?
Let me explain the third, and most important breakthrough in our system. While saving time is valuable, it is hard to sell a solution based on opportunity cost alone. Nobody wakes up thinking they want to save opportunity cost. The reason people invest in SEO or SEO tools is because they’re trying to increase business with more traffic. So, a second key goal for me is that our recommendations actually work and increase revenue – every time. But, how do you do that when Google is a moving target, constantly changing its algorithm?
In order to appreciate our third breakthrough idea, imagine that you could:
- Make quick changes to your site, say title changes,
- Learn when those changes are picked up by Google and other search engines, and
- See the positive or negative impact of those changes directly inside your analytics package.
This would allow you to see the traffic before and after the change and the revenue impact. You’d be able to determine the effectiveness of the change pretty easily. Now, imagine if information about effective changes came not only from your own site, but from all the sites in the network. The crowd-sourced information would help prioritize changes with a proven track record of success. Plus, it would help you avoid changes that have a track record of failures. That would be super powerful. Right?
Well, this is our third and most valuable breakthrough.
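A minimal before/after readout, using made-up daily session counts, looks like this:

```python
from statistics import mean

# Hypothetical daily organic sessions; the change shipped at day index 7
# and was picked up by Google around the same time.
daily_sessions = [100, 98, 102, 101, 99, 103, 100,    # before the change
                  112, 118, 115, 120, 117, 119, 121]  # after the change
CHANGE_DAY = 7

before = daily_sessions[:CHANGE_DAY]
after = daily_sessions[CHANGE_DAY:]

lift = (mean(after) - mean(before)) / mean(before)
print(f"{lift:.1%}")  # 16.9%
```

With many sites running many changes, each change's measured lift becomes a data point, and the network-wide track record is what lets you prioritize fixes that have actually worked elsewhere.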
Now, let me show you the kind of amazing results that we’ve seen from combining these three ideas. This client implemented RankSense on January 4. Their organic rankings improved for significantly more keywords with RankSense than with manual efforts.
We dramatically increased the number of pages receiving traffic (blue line, below), while maintaining and growing the number of visitors per page. This has resulted in more than $400k in additional revenue in less than 4 months. I’ll cover the details of what worked so well for them in a separate post.
Recently, SEO teams such as Etsy’s have popularized an interesting approach to validate the SEO impact of site changes. It consists of treating SEO changes as scientific experiments. We started playing with this idea early this year. We had some great success, so we added the capability to run SEO experiments to our platform. We are specifically focusing on search snippet experiments because we see the biggest promise there. You never know what messaging will resonate best with search visitors, so it’s beneficial to be able to test different messaging. We have found that there’s massive opportunity in treating organic search snippets like ads.
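As a sketch of how a snippet experiment might be evaluated (the click and impression numbers below are made up, not client data), a two-proportion z-test on click-through rates works:

```python
from math import sqrt

def two_proportion_z(clicks_a, impressions_a, clicks_b, impressions_b):
    """z-statistic for the difference between two click-through rates."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    p_pool = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / impressions_a + 1 / impressions_b))
    return (p_b - p_a) / se

# Hypothetical numbers for two title/description variants.
z = two_proportion_z(clicks_a=200, impressions_a=10_000,
                     clicks_b=260, impressions_b=10_000)
print(round(z, 2))  # 2.83 -- |z| > 1.96 suggests a real CTR difference at ~95% confidence
```

This is the same statistics you would apply to an ad copy test, which is the sense in which organic snippets can be treated like ads.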
SEO A/B testing is the main focus of similar tools from our friends at Distilled and a Y Combinator startup that recently launched. We don't make it our main focus because our mission is to accelerate results; change validation is important, but it is only one piece of the puzzle.
It looks like we have good timing because many companies now know all the SEO work they need to do, but they are having difficulty getting it done fast.
I had a fun sales conversation a few weeks back where the prospect didn’t believe one bit of what our platform does, even while looking at a live demo. He told me that 301 redirects take a week or more to get implemented by their SI (systems integrator), and forget about URL rewrites – they are not possible. How can we address SEO issues or add new SEO features if the e-commerce platform doesn’t support them?
We came up with a solution for this in 2010 while working at Altruik. We simply reverse proxied the whole site, and made the changes on the fly on the reverse proxy. Before that, to my knowledge, most reverse proxies improving SEO were limited to a directory or subdomain. The adoption challenge we faced back then was that we added around 500ms to the page load time, which is too much.
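To illustrate the on-the-fly rewrite idea, here is the rewrite step in isolation, as a toy function rather than our actual proxy code; in production this logic would run inside the reverse proxy's response pipeline:

```python
def inject_canonical(html: str, canonical_url: str) -> str:
    """Insert a rel=canonical link right after <head>, if none exists."""
    if 'rel="canonical"' in html:
        return html  # respect an existing canonical tag
    tag = f'<link rel="canonical" href="{canonical_url}">'
    return html.replace("<head>", f"<head>{tag}", 1)

page = "<html><head><title>Widgets</title></head><body>...</body></html>"
print(inject_canonical(page, "https://example.com/widgets"))
```

Because the change happens between the origin server and the visitor, the underlying e-commerce platform never needs to support the feature at all.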
Content delivery networks like Cloudflare, which make on-the-fly security changes, solve this problem by focusing on site speed. They offload page resources (which account for most of the page load time) across a distributed network of caches, which leads to faster sites. The challenge is that building a CDN from scratch takes serious capital investment.
Google solved this problem for us beautifully with the introduction of Google Cloud CDN, which is directly integrated into its HTTP(S) load balancers and has high-speed connections to the major CDN providers (Akamai, Cloudflare, etc.). You can learn more about the advantages of Google Cloud CDN here.
The combination of Kubernetes and Google Cloud CDN allows us to deliver our SEO optimizations while speeding up our clients' sites instead of slowing them down. Our on-the-fly changes take around 10 milliseconds, but the speed gains more than make up for that. We cache all page resources on the CDN by default, and our infrastructure scales automatically thanks to Kubernetes. It also helps that most clients and potential clients are moving to the cloud.
Here is a third-party CDN performance report from Cedexis in which Google Cloud CDN outperforms the leaders: https://www.cedexis.com/google-reports/
Using this system, we can implement RankSense for new clients very fast, with only two DNS changes on their end:
- (Optional) One record to generate the Comodo SSL certificate, and
- A DNS change to activate our software.
Some of our clients that already have CDNs like Akamai, Cloudflare, etc., enjoy the fast interconnect partnership those CDNs have with Google Cloud.
We cache all page resources on their CDNs, and optimize their sites. The integration remains very simple.
Now, while our clients get a CDN with our service, we don't see ourselves as a CDN provider. The CDN is just one of our delivery mechanisms. For example, we have a direct API integration with Akamai to automate provisioning.
Because we run our infrastructure on Kubernetes, we can deliver our software on most cloud providers and on premises.
This novel approach of improving websites at the edge of the CDNs will open up all sorts of interesting applications outside of SEO and cybersecurity. We actually have a licensing partner that will leverage our technology to address content quality at very large enterprises (a partnership we're very excited about).
Who will benefit from automated SEO?
Sites with many pages will see the most benefit from the platform – the more pages, the better. We’ve focused primarily on e-commerce retailers (and needed to obtain PCI compliance to do so). If you operate an e-commerce or enterprise site, and you’d like to learn more about how your company could benefit from our platform, please feel free to sign up for a trial.
We also offer a white-label version of the platform to agencies serving enterprise and e-commerce clients; most agencies, with or without in-house SEO expertise, could benefit from our software. Please feel free to reach out to me directly about agency partnerships.