Understanding searches better than ever before

Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or BERT for short.

This technology enables anyone to train their own state-of-the-art question answering system.

Read more on Google
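As the announcement above notes, the open-sourced BERT models can be fine-tuned for tasks like question answering. A minimal sketch of what that looks like in practice, assuming the Hugging Face transformers library and a publicly released BERT checkpoint fine-tuned on SQuAD (neither is named in Google’s post, so treat both as illustrative assumptions):

```python
from transformers import pipeline

# Load a question-answering pipeline backed by a BERT model fine-tuned on
# SQuAD. The checkpoint name is an assumption: a public Hugging Face model,
# not something specified in Google's announcement.
qa = pipeline(
    "question-answering",
    model="bert-large-uncased-whole-word-masking-finetuned-squad",
)

# Ask an extractive question against a short context passage; the model
# returns the span of the context it believes answers the question.
result = qa(
    question="What does BERT stand for?",
    context=(
        "BERT, or Bidirectional Encoder Representations from Transformers, "
        "is a neural network-based technique for natural language "
        "processing (NLP) pre-training that Google open-sourced in 2018."
    ),
)

print(result["answer"], result["score"])
```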

Google Applies New BERT Model to Search Rankings, Affecting 1-in-10 Queries

Google is rolling out what it says is the biggest step forward for search in the past 5 years, and one of the biggest steps forward in the history of search altogether.

Google is using a new technology it introduced last year, called BERT, to understand search queries.

Read more on Search Engine Journal

A deep dive into BERT: How BERT launched a rocket into natural language understanding

Google describes BERT as the largest change to its search system since the company introduced RankBrain almost five years ago, and probably one of the largest changes in search ever.

The news of BERT’s arrival and its impending impact has caused a stir in the SEO community, along with some confusion about what BERT does and what it means for the industry overall.

Read more on Search Engine Land

Google BERT Update: Background and analysis

It’s the biggest change to Google’s algorithm in five years, affecting one in ten search queries.

With the Google BERT Update, Google aims to improve the interpretation of complex long-tail search queries and display more relevant search results.

Read more on Searchmetrics

The New Google BERT Update Explained In Plain English

The Google BERT update was announced on October 24, 2019, but Google reported that it had already been rolling out for a few days.

According to the announcement, the update will affect 10% of queries, which makes this one of the biggest Google updates of the last 5 years!

Read more on The Hoth