SEO · 8 min read · January 12, 2024

How Google Uses AI in Search in 2022

Parthi Loganathan
CEO of Letterdrop, former Product Manager on Google Search

Google Search has changed a lot in the 24 years since its founding. In 2015, shortly before I joined the Search team as a Product Manager, Google started using AI in Search for the first time. Until then, it relied exclusively on rule-based algorithms such as PageRank. That meant a human could understand why a search query returned the results it did, which was very important to Google for ensuring the highest quality results. They didn't want a situation where someone searched for something and they couldn't explain why an irrelevant, offensive, or poor-quality page was returned. AI models aren't deterministic like rule-based algorithms: you can get different results each time you run them, and you won't know why.

Since then, we've seen a lot of advances in AI and natural language models, both within and outside Google. For example, Google introduced Transformers in 2017 (not the Michael Bay kind) to better understand queries. OpenAI introduced GPT-3 in 2020, which can produce sentences and answer questions for information retrieval. Google has since improved on it with its GLaM model.

Google still uses its rule-based algorithms, but it has started augmenting them with new AI models to handle edge cases they couldn't before. I remember a talk from Rajan Patel, then Director of Engineering, on how AI was being cautiously introduced into Search.

Here's a friendly overview of some of the AI models Google has started using since 2015. It broaches deeply technical concepts, but I only cover them at a high level. More importantly, I discuss the impact on Search that should be interesting to marketers and non-technical founders who agonize over SEO. My background is in product management and software engineering. I'm not a machine learning researcher myself, so don't worry, I can't confuse you too much. 😅


RankBrain Ties Together Concepts to Understand Queries Like a Human

RankBrain launched around when I joined Google. It was the first deep learning model that they deployed into Search.

Before, Google could only find the right pages if you used the right keywords. It had mappings of synonyms but couldn't understand descriptions or more complex synonyms.

RankBrain understands the semantics behind queries. It maps words to concepts by turning each word into a "word vector" using Google's patented Word2vec algorithm. This essentially creates a map of all words and phrases and captures their relationships based on how "close" they are to each other.

Word2vec builds relationships between words

RankBrain helps decide whether a page is relevant to your query even if you didn't use the "right" search terms. It simulates human-level understanding of what you intended to search for. For example, if you searched for "improve my mouse," it understands that you're thinking of fixing a broken computer mouse and not teaching your pet mouse new tricks... as cool as that may be.
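To make "word vectors" concrete, here's a toy sketch in plain Python. The words and vector values are made up purely for illustration (real Word2vec embeddings have hundreds of dimensions and are learned from billions of words); the point is just that "closeness" in vector space, measured with cosine similarity, is how a model decides which sense of a word you mean.

```python
import math

# Toy 3-dimensional "word vectors" -- values invented for illustration.
vectors = {
    "mouse":    [0.9, 0.1, 0.3],   # computer-peripheral sense
    "keyboard": [0.8, 0.2, 0.4],
    "cheese":   [0.1, 0.9, 0.2],
}

def cosine_similarity(a, b):
    """Similarity of two vectors: 1.0 = same direction, ~0.0 = unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "mouse" sits much closer to "keyboard" than to "cheese" in this toy
# space, which is how a model infers the computer-peripheral sense.
print(cosine_similarity(vectors["mouse"], vectors["keyboard"]))
print(cosine_similarity(vectors["mouse"], vectors["cheese"]))
```

In a real system the vectors come from training, not hand-tuning, but the geometry works the same way: nearby vectors mean related concepts.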

After content quality and links, RankBrain was the third most important factor in ranking back in 2015. RankBrain is still the major AI system that Google uses today in 2022.


Neural Matching Understands Vague Queries and Returns Relevant, Non-Gamed Pages

Google introduced Neural Matching in 2018, and it is used in 30% of all search queries. I had left Search and was working on messaging products in G Suite at the time, so I don't have first-hand knowledge of it.

Neural Matching better matches queries to pages. It helps Google understand synonyms better. This is great when you're describing something but not quite sure what it's called. Google's official Search Liaison, Danny Sullivan, explains how the word "change" can be interpreted differently in different contexts.

It uses a neural network, the Deep Relevance Matching Model, to find true synonyms and deal with fuzzy matches where you're not super clear on what you're searching for.

This algorithm exclusively uses your query and web pages to figure out what's relevant. It doesn't use the link structure of the web. This way Google can find a relevant page for you even if it's new and the site doesn't have many links to it. This means Google can cut through the noise of "spammy" content and web pages trying to game the system with fake backlinks and keyword stuffing and return the best results to you.

For example, if you searched for "how to manage a green," you might be confused. Isn't green a color? How do you manage a color? Google understands that "green" is one of the four personality types in a leadership management framework and that you want to learn more about working with someone who scores green in this personality test.


BERT Understands Your Entire Query, Not Just Keywords

Wrong BERT - this one is from Sesame Street

BERT, aka Bidirectional Encoder Representations from Transformers (not the one from Sesame Street), is a natural language model using neural networks that Google adopted in 2019. It's now used in every English search query.

BERT uses transformers to build a relationship between words in a sentence. This is helpful in long conversational queries where the order of words and prepositions matter in understanding what you're searching for.

Many machine learning models fail at parsing long sequential inputs whose parts depend on each other. They don't have a long "attention" span: the model parses information in pieces and downplays the importance of earlier pieces. So they can't handle long queries where a word at the start of the sentence impacts a word at the end.

BERT reads and parses sentences both forwards and backwards instead of just left-to-right, word by word. It uses this to understand whether there is a relationship between words in different parts of the sentence, so it doesn't drop important words that might be modifiers to your query.

For example, suppose you search for "2022 travel visa uk to china". In this case, it's essential to understand that the searcher is a UK citizen looking for a visa to China in 2022, and not a Chinese national looking to travel to the United Kingdom. You get very different results if you ignore the order and simply look at the keywords.
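As a rough illustration of attending in both directions, here's a minimal dot-product attention sketch in plain Python. The tokens and 2-d embeddings are invented for illustration and look nothing like BERT's real learned weights; the point is that the attention weights for a given word are computed over the whole sentence at once, including words that come after it, which a strictly left-to-right model would not yet have seen.

```python
import math

def softmax(xs):
    """Turn raw scores into weights that sum to 1."""
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention_weights(tokens, embeddings, focus):
    """How much `focus` attends to every token in the sentence --
    both the ones before it and the ones after it."""
    q = embeddings[focus]
    scores = [sum(a * b for a, b in zip(q, embeddings[t])) for t in tokens]
    return dict(zip(tokens, softmax(scores)))

# Made-up 2-d embeddings for the query "visa uk to china".
embeddings = {
    "visa":  [1.0, 0.2],
    "uk":    [0.9, 0.4],
    "to":    [0.1, 0.1],
    "china": [0.8, 0.5],
}

tokens = ["visa", "uk", "to", "china"]
weights = attention_weights(tokens, embeddings, "visa")
# "visa" attends strongly to both "uk" and "china" on either side of
# "to", so small direction words can bind distant tokens together.
```

Real BERT layers add learned query/key/value projections and many attention heads on top of this, but the core idea of weighting every position in the sentence is the same.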


MUM Understands Searches Holistically

In May of 2021, Google started using MUM or Multitask Unified Model. It's basically a supercharged BERT. While BERT simply generates a language model that can understand your queries better, MUM can also generate its own sentences based on your query. The goal of MUM is to help you find the information you need with fewer searches. Google tries to understand your intent with the initial query and guess what else you need to know, so you don't have to keep searching.

Not only does it understand your entire query, but it generates new related queries and anticipates what you're going to ask next. So if you were to search for "how to buy a car in paris," MUM could also figure out that you might search for "where to find deals on cars in france", "how to get car insurance in paris", "what cars have the best resale value in france", etc.

MUM understands not just text, but images too. So in the future, you might be able to just take a picture of a car and search "buy this" and it'll figure out the rest for you.

MUM reduces Google's dependence on keywords and gets Google closer to truly understanding what people want without using keywords as proxies. You might be thinking to yourself, "Oh no, how will I get my content on the first page of Google?" Don't worry. If you're a business, this is good news. This means that producing high-quality content will help you stand out from the SEO spam cluttering the internet.


As a Marketer, You Should Focus On Quality, Not Gaming Search

You don't really need to understand any of the above as a marketer. I wrote this because I want you to know that you can trust Google to do its job. Every year, it becomes better at understanding true human intent when it comes to Search.

What does that mean for you? Focus on creating high-quality content that is genuinely useful and easy to consume. Use keyword research as information that guides you; it isn't the be-all and end-all of creating content for SEO. You should still go through basic SEO checklists like the one built into Letterdrop and make sure you're doing your best to make your posts easily indexable by Google, but don't try to game search. It's enough to distill your learnings from running your business into educational pieces of content that can help more people and attract new customers to your business.

I hope this piece helped you understand that you can invest in creating useful content, say no to spammy low-quality SEO content, and not get too stressed about the technical details behind the ever-changing world of SEO. As Google's Search AI becomes more advanced, you can worry less about gaming it and invest more in great content.
