How Is Machine Learning Shaping the Future of SEO?


SEO has always been an area of constant change. From its early days, when webmasters first figured out they could manipulate results, to today's era of content and user-experience prioritization, it's commonly accepted that a new update or new technology could force a major change to your strategy at virtually any time.

But a new paradigm in search has opened up, beginning with the introduction of Google’s RankBrain and, in my estimation, expanding radically over the course of the next few years. We’re entering the machine learning generation of online search, and we need to be prepared for how it’s about to shape our understanding of SEO.

RankBrain: The Toe in the Water

Machine learning isn’t entirely new; in fact, rudimentary machine learning algorithms emerged and flourished in the 1990s (and their ancestors have been around as long as computers themselves). But until recently, these algorithms have had limited practical use, and hadn’t been integrated into a consumer product as widespread or popular as Google Search.

Up until this point, all Google updates were manual pushes; Google developers would engineer some new piece of code, then push it to the core algorithm (or keep it as a separate branch). This is why we saw massive volatility when major updates like Panda and Penguin hit. RankBrain, on the other hand, is designed to figure out what new features are needed, all on its own, and integrate them on a recurring basis.

In terms of scope, RankBrain isn’t that big. It’s a modifier working in conjunction with Hummingbird, the branch of Google’s algorithm responsible for identifying and interpreting natural language and query intent (semantic search). Specifically, it’s designed to help Google interpret the intent of ambiguously worded or otherwise complex queries. So far, it’s doing a great job—but you probably haven’t noticed anything dramatic. That’s because this was intentionally planned as a small-scope update, with potentially bigger machine learning advances to come.

Why Machine Learning Updates Are Different

You might be saying to yourself, “So what? It’s another Google update. We’ve survived tons of them before, and we’ll survive more in the future.” And to a degree, you’re right. On the surface, this update isn’t much different from any other. But take a look at the ways a conceptual machine learning algorithm could differ from a manual one:

  • It’s unpredictable. It’s easy to guess what a human might determine to be “good” or “relevant,” but because of our natural anthropocentric bias, it’s hard to guess what a machine might determine. That makes machine learning updates unpredictable, a scary notion for a search marketer.
  • It’s fast. When a machine learning algorithm comes up with an idea for how to improve a swath of code, it pushes it. No questions asked. Currently, there’s a supervisory system in place for RankBrain (and presumably there will be one for any future update), but the speed at which it works is unlike the world we’re used to, where it’s a year or two between major pushes. As a result, it will be harder to keep up with the changes than it was with human modifiers.
  • It’s always on. Machines don’t need breaks, and they don’t stop until you tell them to. Accordingly, RankBrain and its successors are constantly searching for new ways to improve, which has a compounding effect on the number of changes that are eventually pushed.

Collectively, these qualities make it likely that a major machine learning algorithm will quickly and constantly reassemble the traditional search structures, far outpacing human marketers, and in ways that are somewhat unpredictable. This might seem silly, given RankBrain’s current limitations, but keep in mind it’s a small portion of a small branch of the algorithm—what would happen if a similar, larger algorithm took over the core?

Reasons for Apathy

My intention isn’t to sound the alarm. In fact, there are some good reasons to stay calm about machine learning algorithms. The most important, in my mind, is that even though they evolve on their own, they’re still created by people with specific purposes in mind. Accordingly, it’s highly unlikely they’ll radically depart from structures that have already been established.

Second, because these algorithms are so fast and so constant, their updates are naturally harder to detect. That means we might have already seen the last of the major game-changers like Panda; instead, future changes would roll out so gradually we’d hardly even notice them.

Finally, even though Google has more faith in AI than most other corporations (or individuals, I might add), it seems unlikely they would turn their entire basis for profit over to a machine they couldn’t control. For the next several years, maybe even decades, there will likely be a firm human hand over any machine learning algorithms Google develops, which means their possible negative effects might be mitigated.

Looking Forward

It’s hard to say exactly when the next machine learning component might come to Google, or what exactly that component might look like, but I can almost guarantee machine learning algorithms will gradually replace what we’ve come to know as the “typical” means of search engine updating. Unfortunately, there isn’t much you can do in the meantime except pay attention to how Google develops RankBrain and future products, and hedge your bets by keeping your SEO strategies as future-focused and malleable as possible.

About Timothy Carter

Timothy Carter is the founder of the digital marketing agency OutrankLabs. He’s also the Director of Business Development for the Seattle-based content marketing agency AudienceBloom. When Timothy isn't telling the world about the great work his company does, he's planning his next trip to Hawaii while drinking some Kona coffee.
