Getting embed with BERT

How does BERT affect SEO, and is there a way to optimize for it? AJ Kohn provided Coywolf with insights on how SEOs should respond to the latest Google update.

On October 25, 2019, Pandu Nayak, Vice President of Search at Google, announced that Google could understand searches better than ever before. Nayak highlighted that the breakthrough was the result of Google’s research into a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers (BERT). He described it as “models that process words in relation to all the other words in a sentence, rather than one-by-one in order.”

BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.
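To make that “full context” idea concrete, here is a minimal sketch, not from the article, using the open-source Hugging Face transformers library and the public bert-base-uncased checkpoint. It shows that the same word, “bank”, is assigned a noticeably different vector depending on whether the surrounding words describe a river or a deposit.

```python
# Illustrative sketch of bidirectional context (assumes the `transformers` and
# `torch` packages and the public "bert-base-uncased" checkpoint).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual vector BERT assigns to `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden_states = model(**inputs).last_hidden_state[0]  # (num_tokens, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden_states[tokens.index(word)]

# The same word ends up with different embeddings once the words before and
# after it are taken into account.
river = embedding_for("he sat on the bank of the river", "bank")
money = embedding_for("she deposited money at the bank", "bank")
print(torch.cosine_similarity(river, money, dim=0))  # noticeably below 1.0
```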

If you’re like me, the mention of neural networks, language processing, and bidirectional encoder representations from transformers makes your head spin a little. To gain some clarity about BERT, I reached out to AJ Kohn, who first wrote about BERT in November 2018. He said that BERT is about embeddings, and he discussed how SEOs can optimize for it.
