Training & Optimization

Distillation Temperature

A hyperparameter in knowledge distillation that controls how soft the teacher’s output distribution is. The teacher’s logits are divided by the temperature T before the softmax, so higher values of T spread probability mass across classes and expose the relative similarities the teacher has learned between incorrect answers.
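The effect of dividing logits by T, and the temperature-scaled KL loss used to match student to teacher, can be sketched as follows (a minimal NumPy illustration; the function names and example logits are hypothetical):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Softmax with the logits divided by the temperature first."""
    z = np.asarray(logits, dtype=float) / temperature
    z -= z.max()  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, temperature):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in Hinton et al. (2015)."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return (temperature ** 2) * np.sum(p * (np.log(p) - np.log(q)))

teacher_logits = [8.0, 2.0, 1.0]

hard = softmax(teacher_logits, temperature=1.0)  # nearly one-hot
soft = softmax(teacher_logits, temperature=4.0)  # flatter: smaller classes gain mass
```

At T = 1 the teacher’s distribution is almost one-hot; at T = 4 the same logits yield a much flatter distribution, which gives the student a richer training signal than hard labels alone.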

Related Terms

  • Knowledge Distillation
  • Temperature
  • Transfer Learning

Tags

training-optimization knowledge-distillation temperature transfer-learning

Added: November 18, 2025