Subword Tokenization

Breaking words into smaller units (subwords), balancing vocabulary size against representation granularity.

Subword tokenization lets a model cover arbitrary input with a fixed-size vocabulary: frequent words survive as single tokens, while rare or unseen words are decomposed into smaller known pieces, which avoids out-of-vocabulary failures. Byte-Pair Encoding (BPE) builds its vocabulary by repeatedly merging the most frequent pair of adjacent symbols in a corpus; WordPiece instead selects the merge that most increases the likelihood of the training data.
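
A minimal sketch of the BPE merge-learning loop in Python may help make this concrete. The function name and toy corpus below are illustrative assumptions, not any particular library's API; production tokenizers add byte-level fallback, special tokens, and far more efficient pair counting.

# Sketch of BPE merge learning on a toy whitespace-separated corpus.
# Illustrative only; real tokenizers handle bytes, special tokens,
# and use incremental data structures instead of full recounts.
from collections import Counter

def learn_bpe_merges(corpus, num_merges):
    # Start with each word as a sequence of single characters,
    # weighted by how often the word occurs.
    vocab = Counter()
    for word in corpus.split():
        vocab[tuple(word)] += 1

    merges = []
    for _ in range(num_merges):
        # Count every adjacent symbol pair across the corpus.
        pairs = Counter()
        for symbols, freq in vocab.items():
            for pair in zip(symbols, symbols[1:]):
                pairs[pair] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent pair wins
        merges.append(best)

        # Rewrite every word, fusing each occurrence of the best pair.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            merged, i = [], 0
            while i < len(symbols):
                if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == best:
                    merged.append(symbols[i] + symbols[i + 1])
                    i += 2
                else:
                    merged.append(symbols[i])
                    i += 1
            new_vocab[tuple(merged)] += freq
        vocab = new_vocab
    return merges

print(learn_bpe_merges("low low low lower lowest newer newest", 6))
# Prints learned merge rules such as ('l', 'o'), ('lo', 'w'), ...

Encoding a new word then replays the learned merges in order, greedily fusing characters into the largest units the training corpus supported.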

Related Terms

  • Tokenization
  • BPE
  • WordPiece

Tags

large-language-models tokenization bpe wordpiece

Added: November 18, 2025