Google's BERT Update Crowns 'Context' as Queen & Marks the Death of 'Stop Words'


Last month Google announced:

...the biggest leap forward in the past five years, and one of the biggest leaps forward in the history of Search.

Dubbed BERT (Bidirectional Encoder Representations from Transformers), Google's latest advancement in the science of language understanding is the result of significantly improved machine learning in how Google understands queries.

What you need to know is that BERT now takes into account the effect that words like to, for, and no, and other qualifiers like a lot or none, have on the intent of the query.

Google's blog post provides multiple examples of how this alters the search results. Below we see what the change might look like in the wild...

Do estheticians stand a lot

In essence, BERT is all about interpreting context within your content. So, while content is still King, context is the Queen that adds nuance to your keywords. Here's another Before and After example that shows the effect of factoring in the contextual meaning of the word no in relation to the keywords curb and parking...

parking on a hill with no curb
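To make this concrete, here's a minimal sketch using the open-source Hugging Face transformers library (not Google's production system); the bert-base-uncased model and the mean-pooling step are illustrative assumptions. It shows that dropping the single word no from the query above shifts the query's entire BERT representation, which is exactly the kind of nuance older keyword matching would throw away.

```python
# Illustrative sketch: how a BERT model's contextual representation of a
# query changes when a "stop word" like "no" is removed. Uses the public
# Hugging Face `transformers` library, not Google's internal system.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text):
    # Encode the text and mean-pool the final hidden states into one vector.
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

with_no = embed("parking on a hill with no curb")
without_no = embed("parking on a hill with curb")

# A cosine similarity below 1.0 shows that the single word "no" shifts
# the representation of the whole query, not just one token.
similarity = torch.nn.functional.cosine_similarity(with_no, without_no, dim=0)
print(f"cosine similarity: {similarity.item():.3f}")
```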

Another contextual factor BERT uses is proximity. In the example above...
