Training & Optimization
Student Model
The smaller model in knowledge distillation that learns to mimic the teacher's behavior.
The student is trained on the teacher's soft output distributions rather than on hard labels alone, letting it approach the teacher's accuracy at a fraction of the size and inference cost.
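A minimal sketch of the soft-target objective a student model typically minimizes, following Hinton-style distillation with temperature-scaled softmax. Function names and the temperature value are illustrative, not from the source.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    # so gradients stay comparable across temperatures.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student's soft predictions
    return float((T ** 2) * np.sum(p * (np.log(p) - np.log(q))))
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.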
Related Concepts
- Knowledge Distillation
- Teacher Model
- Model Compression
Tags
training-optimization knowledge-distillation teacher-model model-compression
Related Terms
Knowledge Distillation
Training a smaller 'student' model to mimic a larger 'teacher' model, transferring knowledge while reducing size.
Model Compression
Techniques to reduce model size and computational requirements (quantization, pruning, distillation) for efficient deployment.
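One of the compression techniques named above, symmetric 8-bit quantization, can be sketched as follows; the per-tensor scaling scheme shown is a common simple choice, not a specific library's API.

```python
import numpy as np

def quantize_int8(weights):
    # Symmetric per-tensor quantization: map float weights into int8
    # using a single scale derived from the largest magnitude.
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights; error is bounded by scale / 2.
    return q.astype(np.float32) * scale
```

This cuts storage 4x versus float32 at the cost of a small, bounded rounding error per weight.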
Teacher Model
The larger, more accurate model in knowledge distillation that guides student training.