Training & Optimization
RMSprop
An optimizer that keeps an exponentially decaying moving average of squared gradients, v_t = ρ v_{t-1} + (1 − ρ) g_t², and scales each parameter's update by the inverse square root of that average: θ_t = θ_{t-1} − η g_t / (√v_t + ε). Unlike AdaGrad, which accumulates all past squared gradients and so drives effective learning rates toward zero, RMSprop's decaying average lets the effective rate recover, making it suitable for long training runs and non-stationary objectives.
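The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production optimizer; the function name and hyperparameter defaults (lr, decay, eps) are chosen here for clarity and are not from any particular library.

```python
import numpy as np

def rmsprop_update(params, grads, cache, lr=1e-3, decay=0.9, eps=1e-8):
    """One RMSprop step.

    cache holds the exponentially decaying moving average of squared
    gradients; eps guards against division by zero.
    """
    # v_t = decay * v_{t-1} + (1 - decay) * g_t^2
    cache = decay * cache + (1 - decay) * grads ** 2
    # theta_t = theta_{t-1} - lr * g_t / (sqrt(v_t) + eps)
    params = params - lr * grads / (np.sqrt(cache) + eps)
    return params, cache

# Example: minimize f(x) = x^2, whose gradient is 2x.
x = np.array([5.0])
cache = np.zeros_like(x)
for _ in range(300):
    grad = 2.0 * x
    x, cache = rmsprop_update(x, grad, cache, lr=0.1)
```

Because the gradient is divided by √cache, step sizes are roughly lr in magnitude regardless of the raw gradient scale, which is what keeps rates from vanishing the way AdaGrad's accumulated sum does.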
Related Concepts
- Optimizer
- AdaGrad
- Adam
Tags
training-optimization optimizer adagrad adam