New BERT Update Makes Google Search Smarter Using AI
25 OCT 2019
BERT Introduces “context” to the Search Query
In the past, Google would treat similar search queries the same without applying context:
What exactly has changed?
Simply put, the search engine results pages got a lot smarter when BERT was introduced into Google's algorithm. Powered by deep learning, this change allows Google to understand the “context” of a long-tail search query.
What does BERT affect?
Since BERT reads the entire sequence of words in a search query at once, as opposed to traditional directional models, which read the text input sequentially (left-to-right or right-to-left), it is safe to say this update will mostly impact “long-tail” search queries.
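A toy sketch can make this concrete. The snippet below (an illustration, not Google's implementation) contrasts the context available to a directional model with what a bidirectional model like BERT can attend to when processing one word of a long-tail query; the sample query is the “brazil traveler” example Google used when announcing the update.

```python
# Toy illustration: context visible to a directional model vs. a
# bidirectional model (like BERT) when processing the word "to".
query = "2019 brazil traveler to usa need a visa".split()
i = query.index("to")

# A left-to-right model only sees the words before "to".
left_to_right_context = query[:i]

# A bidirectional model attends to every other word in the query,
# on both sides of "to" -- which is what lets it tell a Brazilian
# traveling to the USA apart from an American traveling to Brazil.
bidirectional_context = query[:i] + query[i + 1:]

print(left_to_right_context)   # ['2019', 'brazil', 'traveler']
print(bidirectional_context)   # ['2019', 'brazil', 'traveler', 'usa', 'need', 'a', 'visa']
```

The word “to” is exactly the kind of preposition older models discarded, yet here it carries the direction of travel, so the surrounding two-sided context changes the meaning of the whole query.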
What is the BERT Model?
BERT (Bidirectional Encoder Representations from Transformers) is a language-model pre-training technique published in October 2018 by Google AI Language researchers Jacob Devlin, Ming-Wei Chang, Kenton Lee and Kristina Toutanova. It achieves state-of-the-art results on a number of natural language processing tasks.
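At pre-training time, BERT learns by playing fill-in-the-blank: roughly 15% of the input tokens are hidden behind a [MASK] token, and the model must recover them using context from both directions. The sketch below is a minimal, simplified illustration of that masking step only (the real implementation also sometimes swaps in random tokens and, of course, trains a Transformer to do the predicting); the sample sentence is the “estheticians” query from Google's announcement.

```python
import random

# Simplified sketch of BERT's masked-language-model pre-training input:
# randomly replace ~15% of tokens with [MASK]; the model is trained to
# predict the hidden originals from the context on both sides.
def mask_tokens(tokens, mask_rate=0.15, seed=1):
    rng = random.Random(seed)  # fixed seed so the example is reproducible
    masked, targets = [], {}
    for pos, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            targets[pos] = tok   # what the model must recover
        else:
            masked.append(tok)
    return masked, targets

sentence = "do estheticians stand a lot at work".split()
masked, targets = mask_tokens(sentence)
print(masked)    # ['[MASK]', 'estheticians', 'stand', 'a', 'lot', 'at', 'work']
print(targets)   # {0: 'do'}
```

Because the prediction target can sit anywhere in the sentence, the model cannot rely on left-to-right reading alone, which is precisely the “bidirectional” part of the name.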
Let’s Get into the Details
As you can see in the long-tail search terms below, Google previously mixed irrelevant information in with quality results because it got the context wrong.
In the example below, Google tried to confirm your intent by suggesting a contextually related search term.
So What is Next: Will There Be An Ernie Update?
This is a huge step up for quality search results on Google’s part, and I hope that Google keeps the ball rolling. Have you ever seen a Bert without an Ernie? My guess is that Google has left the door open for even more exciting changes to come, maybe even understanding the context of short-tail search queries in the future.
For the search term below, I would like to get the correct results without using quotes.
Did the BERT Update Affect Image Search?
As of the time of writing this article, image search does not appear to be affected by BERT.
The Google BERT Update is a significant algorithm update released by Google on 25 October 2019 in an effort to introduce “context” to long-tail search queries. This is done with BERT (Bidirectional Encoder Representations from Transformers), a pre-trained natural language processing model that can understand natural language.