Training & Optimization
LoRA
Low-Rank Adaptation - a parameter-efficient fine-tuning method that freezes the pretrained weight matrix W and trains only a small low-rank update, so the adapted weight is W' = W + BA with B ∈ R^(d×r), A ∈ R^(r×k), and rank r ≪ min(d, k). Because only A and B are trained, the number of trainable parameters drops by orders of magnitude while the base model stays untouched.
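A minimal sketch of the idea in PyTorch; the LoRALinear wrapper and its hyperparameters are illustrative assumptions for this glossary, not the API of any particular library:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a trainable low-rank update (illustrative)."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: int = 16):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)  # freeze pretrained weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        # Low-rank factors: only r * (d_in + d_out) parameters are trained.
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)  # small random init
        self.B = nn.Parameter(torch.zeros(base.out_features, r))        # zero init: update starts at 0
        self.scale = alpha / r

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = base(x) + scale * x A^T B^T, i.e. W' = W + scale * B A
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

# Usage: a 4096x4096 projection trains ~65k LoRA params instead of ~16.8M.
layer = LoRALinear(nn.Linear(4096, 4096), r=8)
```

Zero-initializing B means the model's output is unchanged at the start of training, so adaptation begins from the pretrained behavior.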
Related Concepts
- Fine-Tuning: LoRA is a form of fine-tuning that adapts a pretrained model to a new task without rewriting all of its weights.
- PEFT: LoRA is one of the most widely used parameter-efficient fine-tuning (PEFT) methods (a usage sketch follows this list).
- Adapter: like adapter layers, LoRA inserts small trainable modules into a frozen model; unlike sequential adapters, the low-rank update can be merged into the base weights after training, adding no inference latency.
- Parameter Efficiency: LoRA typically trains well under 1% of a model's parameters, shrinking gradient memory, optimizer state, and checkpoint size accordingly.
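In practice, LoRA is usually applied through a library rather than by hand. A minimal sketch using the Hugging Face peft library; the base model and target module names are illustrative assumptions and depend on your model's architecture:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Illustrative base model; any transformers model works similarly.
model = AutoModelForCausalLM.from_pretrained("gpt2")

config = LoraConfig(
    r=8,                        # rank of the low-rank update
    lora_alpha=16,              # scaling factor (effective scale = alpha / r)
    target_modules=["c_attn"],  # GPT-2's fused attention projection; model-specific
    lora_dropout=0.05,
)

model = get_peft_model(model, config)
model.print_trainable_parameters()  # reports trainable vs. total parameter counts
```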
Why It Matters
Understanding LoRA is essential for anyone training or adapting large models. Freezing the base weights cuts trainable parameters, gradient memory, and optimizer state by orders of magnitude: a 4096×4096 weight matrix has ~16.8M parameters, but a rank-8 LoRA update trains only 8 × (4096 + 4096) = 65,536 of them, roughly 0.4%. This makes fine-tuning billion-parameter models feasible on a single GPU and lets many small task-specific adapters share one frozen base model.
Learn More
This term is part of the AI/ML glossary. Explore the related terms above to see how LoRA fits into the broader landscape of parameter-efficient fine-tuning.
Tags
training-optimization fine-tuning peft adapter