Neural Networks & Deep Learning
Group Normalization
Group Normalization divides a layer's channels into groups and normalizes each group independently, computing mean and variance per sample rather than across the batch. Because its statistics do not depend on the batch dimension, it remains stable at small batch sizes where batch normalization degrades.
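The per-group computation can be sketched in NumPy as follows. This is a minimal illustration (the function name, shapes, and the omission of the learnable scale/shift parameters are choices for brevity, not from the source): channels are reshaped into groups, and mean/variance are taken over each group's channels and spatial positions for each sample separately.

```python
import numpy as np

def group_norm(x, num_groups, eps=1e-5):
    # x: (N, C, H, W); split the C channels into num_groups groups
    n, c, h, w = x.shape
    assert c % num_groups == 0, "channels must divide evenly into groups"
    g = x.reshape(n, num_groups, c // num_groups, h, w)
    # statistics per (sample, group) -> independent of batch size
    mean = g.mean(axis=(2, 3, 4), keepdims=True)
    var = g.var(axis=(2, 3, 4), keepdims=True)
    g = (g - mean) / np.sqrt(var + eps)
    return g.reshape(n, c, h, w)

x = np.random.randn(2, 8, 4, 4)
y = group_norm(x, num_groups=4)
```

Note the reduction axes exclude axis 0 (the batch), which is exactly why the result is unchanged whether the batch holds 1 sample or 100.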
Related Concepts
- Batch Normalization
- Layer Normalization
- Normalization
Tags
neural-networks-deep-learning batch-normalization layer-normalization normalization
Related Terms
Batch Normalization
A technique that normalizes layer inputs to stabilize and accelerate training by reducing internal covariate shift.
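For contrast with the group-wise version above, batch normalization pools statistics over the batch and spatial dimensions per channel. A minimal sketch (inference-time running statistics and the learnable parameters are omitted for brevity):

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # x: (N, C, H, W); statistics per channel, pooled over batch + spatial dims
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

x = np.random.randn(16, 8, 4, 4)
y = batch_norm(x)
```

Because axis 0 is inside the reduction, the estimates become noisy when N is small, which is the failure mode group normalization avoids.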
Normalization
Scaling features to a standard range (typically 0-1 via min-max scaling) to improve model training and convergence. Often used interchangeably with standardization (mean 0, std 1), though the two transformations are technically distinct.
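The distinction between the two transformations is easy to show concretely (the array values below are illustrative, not from the source):

```python
import numpy as np

feature = np.array([3.0, 7.0, 11.0, 19.0])

# min-max scaling: map values into [0, 1]
scaled = (feature - feature.min()) / (feature.max() - feature.min())

# standardization: shift to mean 0, rescale to std 1
standardized = (feature - feature.mean()) / feature.std()
```

Min-max scaling bounds the output range but is sensitive to outliers at the extremes; standardization centers the data but leaves its range unbounded.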