BERT: An Update to the Google Search Algorithm
Fifteen percent of the billions of searches Google sees every day are queries it has never seen before. To keep up with its users, Google has designed a new way for the search engine to generate results for these unanticipated searches.
Since people generally go to Google Search when they need a piece of information, they don’t always know exactly what words to use. Using machine learning and the science of language understanding, the Google research team has improved how the search engine interprets queries. We’ve broken down the basics of the update coming to the Google Search algorithm.
What Is BERT?
Called one of the biggest leaps forward in the history of Google Search, Bidirectional Encoder Representations from Transformers (BERT) is a neural network-based technique for natural language processing. The open-source technology allows anyone to train their own question answering system.
Transformers are models that process each word in relation to all the other words in a sentence, rather than one at a time in order. This allows BERT to consider the full context of a search phrase before generating results.
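BERT itself is a large neural network, but the core idea of using context on both sides of a word can be sketched in plain Python. The example below is not BERT; it is a toy illustration, and every name in it (the clue sets, the `disambiguate` function) is hypothetical. It shows why looking at words both before and after an ambiguous term helps pick the right meaning.

```python
# Toy sketch of the "bidirectional context" idea behind BERT.
# This is NOT BERT -- just an illustration that clue words on
# BOTH sides of an ambiguous term help pick the right meaning.
# All names here are illustrative, not part of any real library.

# Hypothetical context clues for two senses of the word "bank".
SENSE_CLUES = {
    "river bank": {"river", "shore", "fishing", "water"},
    "financial bank": {"money", "loan", "deposit", "account"},
}

def disambiguate(words, target):
    """Score each sense of `target` by counting clue words found
    anywhere in the sentence -- to the left OR right of the target,
    which is what 'bidirectional' means in this sketch."""
    context = {w.lower() for w in words if w.lower() != target}
    scores = {
        sense: len(context & clues)
        for sense, clues in SENSE_CLUES.items()
    }
    # Pick the sense with the most overlapping clue words.
    return max(scores, key=scores.get)

print(disambiguate("I sat on the bank of the river fishing".split(), "bank"))
# -> river bank: the clues "river" and "fishing" sit on the far
#    side of "bank", so a left-to-right-only model would miss them.
```

A one-directional model reading left to right would reach "bank" having seen only "I sat on the", with no clue words yet; scanning the whole sentence at once is what resolves the ambiguity.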
How Will BERT Impact You?
BERT is being applied to both search ranking and featured snippets. It is anticipated to impact 1 in every 10 English-language searches in the U.S. and has been applied to featured snippets in over 20 countries. Using machine learning, BERT can take knowledge learned from one language and adapt it to others, allowing Search to improve for all languages over time.
The intention of BERT is to allow users to search in a way that feels natural while still getting the results they need.
When Is BERT Launching?
Google began rolling out BERT in mid-October and expects the launch to be complete not long after.
In short, this update to the Google Search algorithm will help the platform continuously generate better results for each query. If this information has you questioning your current SEO and SEM practices, Kraus Marketing can help keep your strategy up to date. Contact us today!