L2 Regularization
Adding a scaled sum of squared weights to the loss function, penalizing large weights and improving generalization.
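In symbols, one common formulation adds a penalty scaled by a hyperparameter λ (the regularization strength):

$$
L_{\text{total}}(\mathbf{w}) = L_{\text{data}}(\mathbf{w}) + \lambda \sum_i w_i^2
$$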
Related Concepts
- Regularization: the broader family of techniques that L2 Regularization belongs to
- L1 Regularization: penalizes the sum of absolute weights instead of squared weights, yielding sparser solutions
- Ridge: ridge regression is linear regression trained with an L2 penalty; the classical statistics name for the same idea
- Weight Decay: shrinks weights multiplicatively during optimization; equivalent to L2 Regularization under plain SGD
Why It Matters
Understanding L2 Regularization is essential for anyone working with training & optimization: penalizing large weights is one of the most common defenses against overfitting, the squared penalty keeps the loss smooth and differentiable, and the same idea underlies ridge regression and weight decay.
Related Terms
L1 Regularization
Adding the sum of absolute weights to the loss function, promoting sparsity and feature selection.
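To make the contrast with L2 concrete, here is a minimal NumPy sketch of the two penalty terms; the function names, λ value, and placeholder data_loss are illustrative, not from any particular library:

```python
import numpy as np

def l2_penalty(weights, lam=0.01):
    # L2: sum of squared weights; shrinks all weights smoothly toward zero.
    return lam * np.sum(weights ** 2)

def l1_penalty(weights, lam=0.01):
    # L1: sum of absolute weights; tends to drive some weights exactly to zero.
    return lam * np.sum(np.abs(weights))

w = np.array([0.5, -1.2, 0.0, 3.0])
data_loss = 0.42  # stand-in for a model's unregularized loss
print(data_loss + l2_penalty(w))  # L2-regularized total loss
print(data_loss + l1_penalty(w))  # L1-regularized total loss
```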
Regularization
Techniques to prevent overfitting by adding constraints or penalties to the model (L1, L2, dropout, early stopping).
Weight Decay
A regularization technique that shrinks weights toward zero during optimization. Equivalent to L2 regularization in standard SGD, but differs when using adaptive optimizers like Adam.
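The equivalence under plain SGD can be seen in a single update step. Below is a minimal sketch with illustrative names (the factor of 2 comes from differentiating w²; some libraries fold it into λ). With adaptive optimizers such as Adam, the penalty gradient is rescaled per parameter, so the two forms diverge, which is the motivation for decoupled weight decay (AdamW):

```python
import numpy as np

def sgd_step_l2(w, grad, lr=0.1, lam=0.01):
    # L2 penalty folded into the gradient: d/dw (lam * sum(w**2)) = 2*lam*w.
    return w - lr * (grad + 2 * lam * w)

def sgd_step_weight_decay(w, grad, lr=0.1, lam=0.01):
    # Decoupled weight decay: shrink the weights directly, then take the step.
    return (1 - 2 * lr * lam) * w - lr * grad

w = np.array([1.0, -2.0])
g = np.array([0.3, 0.1])
print(sgd_step_l2(w, g))            # [ 0.968 -2.006]
print(sgd_step_weight_decay(w, g))  # identical under plain SGD
```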