Large Language Models

BERT

Bidirectional Encoder Representations from Transformers - an encoder-only Transformer model pretrained with masked language modeling, so it builds a representation of each token using context from both the left and the right.

  • Transformer: BERT is built from the Transformer architecture, keeping only its encoder stack
  • Encoder-Only: unlike GPT-style decoder models, BERT has no autoregressive decoder; it encodes a whole sequence at once
  • Masked Language Modeling: during pretraining, a fraction of input tokens is hidden and the model learns to predict them
  • Bidirectional: each token's representation attends to both preceding and following tokens, not just the left context
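The masking scheme mentioned above can be sketched in a few lines. This is a toy illustration in plain Python, not BERT's actual implementation: following the published recipe, roughly 15% of tokens are selected for prediction, and of those, 80% are replaced with `[MASK]`, 10% with a random token, and 10% are left unchanged. The function name and vocabulary here are made up for the example.

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Toy sketch of BERT-style masking (hypothetical helper, not the
    real implementation). Selects ~mask_prob of tokens for prediction;
    of those, 80% become [MASK], 10% a random vocab token, 10% stay
    unchanged. Returns (masked_tokens, labels): labels hold the
    original token at selected positions and None elsewhere."""
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model is scored on predicting this token
            r = rng.random()
            if r < 0.8:
                masked.append("[MASK]")      # 80%: replace with mask token
            elif r < 0.9:
                masked.append(rng.choice(vocab))  # 10%: random token
            else:
                masked.append(tok)           # 10%: keep original token
        else:
            labels.append(None)  # position not selected; excluded from loss
            masked.append(tok)
    return masked, labels

tokens = "the cat sat on the mat".split()
masked, labels = mask_tokens(tokens, vocab=tokens)
```

Keeping 10% of selected tokens unchanged forces the model to produce useful representations for every position, since it cannot rely on `[MASK]` alone to signal which tokens need predicting.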

Why It Matters

BERT popularized the pretrain-then-fine-tune paradigm and remains a standard baseline for tasks such as text classification, named-entity recognition, and question answering. Understanding it builds a foundation for more advanced topics in AI and machine learning.

Learn More

This term is part of the comprehensive AI/ML glossary. Explore related terms to deepen your understanding of this interconnected field.

Tags

large-language-models transformer encoder-only masked-language-modeling

Added: November 18, 2025