Encoder-Decoder
An architecture in which an encoder processes the input sequence into an intermediate representation and a decoder generates the output sequence from it, widely used in machine translation and other sequence-to-sequence tasks.
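The split between the two halves can be sketched with a toy "translation" pipeline. Real models use learned neural encoders and decoders; this stand-in uses a hypothetical word lookup table purely to show the data flow from input, through an intermediate representation, to generated output.

```python
# Toy illustration of the encoder-decoder data flow (not a real model).
# LEXICON is a hypothetical stand-in for the decoder's learned parameters.
LEXICON = {"hello": "bonjour", "world": "monde"}

def encode(sentence):
    """Encoder: turn raw input into an intermediate representation.

    Here that representation is just a token list; in a neural model it
    would be a sequence of hidden-state vectors.
    """
    return sentence.lower().split()

def decode(context):
    """Decoder: generate the output sequence from the representation.

    A neural decoder would do this autoregressively with attention over
    the encoder states; here we map tokens through a lookup table.
    """
    return " ".join(LEXICON.get(token, token) for token in context)

print(decode(encode("Hello world")))  # → bonjour monde
```

The key point the sketch preserves: the decoder never sees the raw input, only the encoder's representation of it.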
Related Concepts
- Transformer: the original Transformer of 'Attention Is All You Need' is itself an encoder-decoder model
- Seq2Seq: the task framing (input sequence in, output sequence out) that encoder-decoder architectures implement
- T5: an encoder-decoder Transformer that casts every NLP task as text-to-text
- BART: an encoder-decoder Transformer pairing a bidirectional encoder with an autoregressive decoder
Why It Matters
The encoder-decoder pattern underlies machine translation, summarization, and other tasks where the input and output sequences differ in length or structure. Understanding it provides a foundation for more advanced topics in modern language modeling, including attention and the Transformer family.
Related Terms
BART
Bidirectional and Auto-Regressive Transformers - combines a BERT-like bidirectional encoder with a GPT-like autoregressive decoder for sequence-to-sequence tasks.
T5
Text-to-Text Transfer Transformer - frames all NLP tasks as text-to-text problems using a unified encoder-decoder architecture.
Transformer
A neural network architecture introduced in 'Attention Is All You Need' (2017) that relies entirely on self-attention mechanisms, becoming the foundation for modern LLMs.