In November 2019, Google rolled out a major update to its search algorithm built on BERT. With the inclusion of BERT, the intent behind user search queries should be better understood, and more relevant results returned.
All about the BERT Algorithm
Tech giant Google recently rolled out a major algorithm update: BERT. If there's one thing we've learned about Google Search over the past 15 years, it's that people's curiosity is endless. Every day, we see billions of searches, and 15 percent of those queries are ones we have never seen before, so we've built ways to return results for queries we can't anticipate. The Google BERT algorithm update is the latest step in that effort.
When people like you or I come to Search, we aren't always sure of the best way to formulate a query. We may not know the right words to use, or how to spell something, because we often come to Search to learn; we don't necessarily have the vocabulary to begin with.
At its core, Search is about understanding language. It's our job to figure out what you're looking for and to surface helpful information from the web, no matter how you spell or combine the words in your query.
While we have continued to improve our language understanding over the years, we still don't always get it right, especially with complex or conversational queries. In fact, that's one of the reasons people often use "keyword-ese," typing strings of words they think we'll understand rather than asking a question the way they naturally would.
With our research team's latest advances in language understanding, made possible by machine learning, we are making a significant improvement in how we understand queries, representing the biggest leap forward in the past five years and one of the biggest leaps forward in the history of Search.
Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or BERT for short. This technology enables anyone to train their own state-of-the-art question answering system.
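To make that concrete, here is a minimal sketch of a BERT-based question answering system. It uses the open-source Hugging Face `transformers` library and a publicly available BERT checkpoint fine-tuned on the SQuAD dataset; both choices are assumptions for illustration, since Google's original release shipped as TensorFlow research code rather than this API.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library
# (pip install transformers) and a public BERT checkpoint fine-tuned
# for question answering. Not Google's original release.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

context = (
    "BERT (Bidirectional Encoder Representations from Transformers) is a "
    "neural network-based technique for natural language processing "
    "pre-training, open-sourced by Google in 2018."
)

result = qa(question="Who open-sourced BERT?", context=context)
print(result["answer"], result["score"])  # e.g. "Google" with a confidence score
```

The pipeline extracts the answer span directly from the supplied context, which is exactly the task BERT-style question answering models are fine-tuned for.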
This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it, which is especially useful for understanding the intent behind search queries.
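A short sketch can show this bidirectionality in action. Below, two sentences share the exact same words to the left of a masked word; only the words after the mask differ, and BERT's prediction changes accordingly. The Hugging Face `transformers` library and the `bert-base-uncased` checkpoint are assumptions for illustration.

```python
# A minimal sketch of BERT's bidirectional context, assuming the
# Hugging Face `transformers` library and the public
# `bert-base-uncased` masked-language-model checkpoint.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# Identical left context; only the words AFTER [MASK] differ.
# Because BERT reads in both directions, the prediction changes.
for text in [
    "He went to the [MASK] to deposit his paycheck.",
    "He went to the [MASK] to catch his flight.",
]:
    best = fill(text)[0]  # highest-probability completion
    print(f"{text} -> {best['token_str']} ({best['score']:.2f})")
```

A left-to-right model seeing only "He went to the" would have to guess; BERT can use the trailing words ("deposit his paycheck" vs. "catch his flight") to pick a fitting word for each sentence.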