BERT will impact 1 in 10 search queries. This is the biggest change in search since Google released RankBrain.
BERT (Bidirectional Encoder Representations from Transformers) is the new open-source neural network technique behind Google's search engine. BERT processes natural language more effectively and will, in this way, affect the results of 1 out of every 10 searches. It thus complements RankBrain, a tool Google introduced five years ago.
BERT has already rolled out for English queries and will follow in other languages. It affects both regular search results and featured snippets. Unlike the previous word-by-word approach, this neural network processes each word in relation to all the other words in a sentence. In this way, BERT understands the full context of a sentence, which matters most in long, conversational queries.
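To make that contrast concrete, here is a minimal sketch showing how the same word gets a different vector depending on its sentence, because BERT reads the words on both sides of it. The article names no specific API; loading the open-sourced model through the Hugging Face `transformers` library is an assumption made for convenience.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library
# (not named in the article) to load the open-sourced BERT model.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def vector_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's context-dependent vector for `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

# "bank" gets a different vector in each sentence, because BERT
# encodes it together with every other word around it.
a = vector_for("he sat on the river bank", "bank")
b = vector_for("he deposited cash at the bank", "bank")
print(torch.cosine_similarity(a, b, dim=0))  # noticeably below 1.0
```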
With this, Google continues to improve its natural language understanding so that its search engine grasps what each user is looking for and can show the most relevant pages.
Rolling out. BERT started rolling out this week and will be fully live shortly. It applies to English-language queries now and will expand to other languages in the future.
Featured snippets. BERT will also impact featured snippets. Google said BERT is being used globally, in all languages, for featured snippets.
What is BERT? It is Google’s neural network-based technique for natural language processing (NLP) pre-training. BERT stands for Bidirectional Encoder Representations from Transformers.
It was open-sourced last year and written about in more detail on the Google AI blog. In short, BERT can help computers understand language a bit more like humans do.
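As a small illustration of that pre-training, the sketch below runs BERT's masked-word objective: the model predicts a hidden word from the context on both sides of it. Using the Hugging Face `transformers` pipeline is an assumption for convenience; Google's official release is a separate TensorFlow codebase.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library
# (not named in the article). BERT's pre-training fills in a masked
# word using context from both directions at once.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")
for prediction in unmasker("She cashed the check at the [MASK]."):
    print(f'{prediction["token_str"]:>10}  {prediction["score"]:.3f}')
```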
When is BERT used? Google said BERT helps Search better understand the nuances and context of words in queries and match those queries with more relevant results. It is also used for featured snippets, as described above.
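Google has not published how BERT is wired into ranking, so the sketch below is only a rough illustration of the general idea: scoring candidate results against a query with mean-pooled BERT vectors. The pooling trick, the candidate texts, and the helper name are hypothetical stand-ins, not Google's method; only the query is an example Google itself cited.

```python
# A rough, hypothetical sketch of query-result matching with BERT
# vectors; this is NOT Google's ranking system. Assumes the Hugging
# Face `transformers` library.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def sentence_vector(text: str) -> torch.Tensor:
    """Mean-pool BERT's token vectors into one crude sentence vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden.mean(dim=0)

query = "can you get medicine for someone pharmacy"  # example Google cited
candidates = [
    "how to pick up a prescription for a family member",  # hypothetical
    "pharmacy opening hours this weekend",                 # hypothetical
]
q = sentence_vector(query)
for text in candidates:
    score = torch.cosine_similarity(q, sentence_vector(text), dim=0)
    print(f"{score:.3f}  {text}")
```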
Source: searchengineland.com, ai.googleblog.com