Training & Optimization

RMSprop

An optimizer that keeps an exponentially decaying moving average of squared gradients and divides the learning rate by the square root of that average, giving each parameter its own adaptive step size and addressing AdaGrad’s continually diminishing learning rates.

RMSprop is a core concept in training and optimization: the same moving-average idea is carried forward by later optimizers such as Adam.
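As a rough illustration of the update rule, the sketch below implements an RMSprop step in NumPy. The function name, argument names, and default values (learning rate 1e-3, decay 0.9, epsilon 1e-8) are assumptions for illustration only, not any particular library’s API.

    import numpy as np

    def rmsprop_update(params, grads, cache, lr=1e-3, decay=0.9, eps=1e-8):
        # Illustrative sketch: names and defaults are assumptions.
        new_params, new_cache = {}, {}
        for k in params:
            # Exponentially decaying average of squared gradients.
            avg = decay * cache.get(k, np.zeros_like(grads[k])) + (1 - decay) * grads[k] ** 2
            new_cache[k] = avg
            # Scale the step by 1 / sqrt(avg): parameters with large recent
            # gradients take smaller steps, and because old gradients are
            # gradually forgotten the rate does not shrink forever as in AdaGrad.
            new_params[k] = params[k] - lr * grads[k] / (np.sqrt(avg) + eps)
        return new_params, new_cache

In a training loop, one would call params, cache = rmsprop_update(params, grads, cache) once per batch, carrying cache forward between steps.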

Related Terms

  • Optimizer
  • AdaGrad
  • Adam

Tags

training-optimization optimizer adagrad adam

Added: November 18, 2025