Large Language Models
Sequence-to-Sequence
Models that transform an input sequence into an output sequence, used for translation, summarization, and text generation.
This framing underlies many large language models: the model reads an input sequence and generates an output sequence token by token, so tasks as different as translation, summarization, and question answering can be treated as the same problem.
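A minimal sketch of the idea, assuming the Hugging Face transformers library and the facebook/bart-large-cnn checkpoint (any pretrained sequence-to-sequence summarization model works similarly): the model consumes an input sequence and generates a new, shorter output sequence.

```python
# Sequence-to-sequence sketch: input text in, generated summary out.
# Assumes the Hugging Face `transformers` library; the checkpoint choice is illustrative.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

text = (
    "Sequence-to-sequence models read an input sequence with an encoder and "
    "generate an output sequence with a decoder, one token at a time. They are "
    "widely used for translation, summarization, and other generation tasks."
)

result = summarizer(text, max_length=40, min_length=10)
print(result[0]["summary_text"])  # a shorter sequence generated from the input
```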
Related Concepts
- Encoder-Decoder
- Machine Translation
- T5
Tags
large-language-models encoder-decoder machine-translation t5
Related Terms
Encoder-Decoder
An architecture in which an encoder processes the input and a decoder generates the output, used in translation and other sequence-to-sequence tasks.
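A toy PyTorch sketch of the structure, not a full model: GRU layers, the Encoder/Decoder names, and the hyperparameters are illustrative assumptions chosen for brevity.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Reads the input sequence and compresses it into a hidden state."""
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src):                      # src: (batch, src_len) token ids
        _, hidden = self.rnn(self.embed(src))
        return hidden                            # (1, batch, hidden_size)

class Decoder(nn.Module):
    """Generates the output sequence conditioned on the encoder's hidden state."""
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tgt, hidden):              # tgt: (batch, tgt_len) previous tokens
        output, hidden = self.rnn(self.embed(tgt), hidden)
        return self.out(output), hidden          # logits over the target vocabulary

# Toy usage with random token ids
enc, dec = Encoder(1000, 64), Decoder(1000, 64)
src = torch.randint(0, 1000, (2, 7))             # batch of 2 source sequences
tgt = torch.randint(0, 1000, (2, 5))             # batch of 2 target prefixes
logits, _ = dec(tgt, enc(src))
print(logits.shape)                              # torch.Size([2, 5, 1000])
```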
Machine Translation
Automatically translating text from one language to another using neural models (typically encoder-decoder architectures).
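A short example of neural machine translation with a pretrained encoder-decoder model; it assumes the Hugging Face transformers library and the Helsinki-NLP/opus-mt-en-de checkpoint, both illustrative choices.

```python
# English-to-German translation with a pretrained encoder-decoder model.
# Library and checkpoint are assumptions; other MT checkpoints work the same way.
from transformers import pipeline

translator = pipeline("translation_en_to_de", model="Helsinki-NLP/opus-mt-en-de")
result = translator("The encoder reads the sentence and the decoder writes the translation.")
print(result[0]["translation_text"])
```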
T5
Text-to-Text Transfer Transformer - frames all NLP tasks as text-to-text problems using a unified encoder-decoder architecture.
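A sketch of T5's text-to-text framing, assuming the Hugging Face transformers library and the t5-small checkpoint: the same model handles different tasks, selected purely by the text prefix.

```python
# T5 treats every task as text in, text out; the prefix selects the task.
# Assumes the Hugging Face `transformers` library and the t5-small checkpoint.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

prompts = [
    "translate English to German: The house is wonderful.",
    "summarize: T5 frames every NLP task as mapping an input string to an output string.",
]
for prompt in prompts:
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    output_ids = model.generate(input_ids, max_new_tokens=40)
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```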