AI Term: BERT (Bidirectional Encoder Representations from Transformers)

BERT, which stands for Bidirectional Encoder Representations from Transformers, is a method for training artificial intelligence models to understand language.

Let’s think of BERT as a super reader. When you and I read a sentence, we look at the words before and after each word to understand what that word means in that sentence. BERT does something similar, but on a much larger scale.

Let’s say we have the sentence “I love to play ___.” You might guess that the missing word could be “soccer” or “piano” or any number of things. BERT is trained to play the same guessing game: during training, some words in millions of sentences are hidden (“masked”), and BERT learns by trying to predict what those hidden words should be. Researchers call this trick masked language modeling. It’s like filling in the blanks in a huge, complicated word puzzle.
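If you want to try this fill-in-the-blank game yourself, here is a minimal sketch using the Hugging Face transformers library and the publicly available bert-base-uncased checkpoint; these tools are one common way to run BERT, not part of BERT itself.

```python
from transformers import pipeline

# A minimal sketch of BERT's fill-in-the-blank game, assuming the Hugging Face
# "transformers" library is installed and using the public bert-base-uncased checkpoint.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# BERT marks the blank with a special [MASK] token.
predictions = fill_mask("I love to play [MASK].")

# Each prediction comes back with a suggested word and a confidence score.
for p in predictions:
    print(f"{p['token_str']:>10}  score={p['score']:.3f}")
```

Running this prints BERT’s top guesses for the blank, each with a score showing how confident the model is in that guess.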

The special thing about BERT is that it looks at the words on both sides of a blank at the same time. Earlier language models mostly read a sentence in one direction, left to right, so when they reached a word they could only use what came before it. BERT also uses what comes after. In the sentence “I love to play ___ at the concert hall,” the words before the blank and the words after it both help it guess something like “piano.” That’s why we say it’s “bidirectional.” This helps BERT understand the context of words in a sentence really well, which makes it good at understanding language.
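To see why reading in both directions matters, you can extend the sketch above. The words to the left of the blank are identical in the two sentences below; only the words to the right change, and that alone should change BERT’s guess (again a sketch assuming the transformers library and the bert-base-uncased checkpoint).

```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The words to the LEFT of the blank are identical in both sentences;
# only the words to the RIGHT differ.
sentences = [
    "I love to play [MASK] with my friends.",
    "I love to play [MASK] at the concert hall.",
]

for sentence in sentences:
    top_guess = fill_mask(sentence)[0]["token_str"]
    # Because BERT reads both directions, the right-hand context tends to
    # steer the first sentence toward a game and the second toward an instrument.
    print(f"{sentence} -> {top_guess}")
```

A one-direction model that only saw “I love to play” would have no way to tell these two sentences apart; BERT can, because the clue sits after the blank.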

BERT is used in many applications, like search engines, where it helps understand what people mean when they type in a search query.
