Large Language Models

GPT

Generative Pre-trained Transformer - a decoder-only, autoregressive language model architecture that is pre-trained on large text corpora to predict the next token given the preceding context.
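
As a concrete illustration of single-step next-token prediction, here is a minimal Python sketch. It assumes the Hugging Face transformers library and the public "gpt2" checkpoint; any causal (decoder-only) language model would serve equally well.

    # A minimal sketch of single-step next-token prediction, assuming the
    # Hugging Face `transformers` library and the public "gpt2" checkpoint.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "Large language models predict the next"
    inputs = tokenizer(prompt, return_tensors="pt")

    with torch.no_grad():
        # Logits have shape (batch, sequence, vocabulary); the last position
        # scores every vocabulary item as a candidate next token.
        logits = model(**inputs).logits

    next_token_id = int(torch.argmax(logits[0, -1]))
    print(tokenizer.decode([next_token_id]))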

  • Transformer: GPT is built on the Transformer architecture, using self-attention to model relationships between tokens in the context.
  • Autoregressive: GPT generates text one token at a time, conditioning each prediction on all previously generated tokens (see the generation-loop sketch after this list).
  • OpenAI: The GPT series was developed and released by OpenAI, beginning with GPT-1 in 2018.
  • Decoder-Only: GPT uses only the decoder stack of the Transformer, with causal masking so each position attends only to earlier tokens.
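
The generation-loop sketch below illustrates the autoregressive, decoder-only points above: each predicted token is appended to the context and fed back into the model. The "gpt2" checkpoint and greedy argmax decoding are illustrative assumptions; real systems typically sample with temperature, top-k, or nucleus strategies instead.

    # A sketch of greedy autoregressive decoding: predict, append, repeat.
    # The checkpoint and the argmax (greedy) choice are illustrative assumptions.
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    input_ids = tokenizer("GPT generates text", return_tensors="pt").input_ids

    for _ in range(10):
        with torch.no_grad():
            logits = model(input_ids).logits
        # Take the highest-scoring next token and append it to the context,
        # so the following prediction is conditioned on everything so far.
        next_id = torch.argmax(logits[0, -1]).reshape(1, 1)
        input_ids = torch.cat([input_ids, next_id], dim=1)

    print(tokenizer.decode(input_ids[0]))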

Why It Matters

The GPT architecture underlies most of today's large language models, from OpenAI's GPT series to the many open-weight models that follow the same decoder-only, next-token-prediction design. Understanding it provides a foundation for more advanced topics such as prompting, fine-tuning, and scaling.

Learn More

This term is part of the comprehensive AI/ML glossary. Explore related terms to deepen your understanding of this interconnected field.

Tags

large-language-models transformer autoregressive openai

Related Terms

  • Transformer
  • Autoregressive
  • Decoder-Only
  • OpenAI

Added: November 18, 2025