Training & Optimization

Batch Gradient Descent

Computing gradients using the entire dataset, providing stable but slow updates.

Because each update is computed over the full training set, every step follows the exact gradient of the training loss, so convergence is smooth and deterministic; the trade-off is that a single update requires a complete pass over the data, which becomes slow for large datasets. Mini-batch and stochastic gradient descent (SGD) trade this stability for cheaper, noisier updates. A minimal sketch of the update rule appears below.
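
A minimal sketch of batch gradient descent on least-squares linear regression, assuming synthetic data; the feature count, learning rate, and step count are illustrative choices, not part of the original entry.

```python
import numpy as np

# Hypothetical synthetic regression data (illustrative values only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                 # 200 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

# Batch gradient descent: every parameter update uses the gradient
# computed over the ENTIRE dataset, not a sampled subset.
w = np.zeros(3)
learning_rate = 0.1
for step in range(500):
    residuals = X @ w - y                     # predictions minus targets, all samples
    grad = X.T @ residuals / len(y)           # gradient of 0.5 * mean squared error over all samples
    w -= learning_rate * grad                 # one deterministic update per full pass

print(w)  # converges toward true_w
```

Each iteration touches all 200 samples before changing the weights, which is what makes the trajectory stable but each step expensive; mini-batch and SGD variants would replace the full `X` with a sampled subset per step.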

Tags

training-optimization gradient-descent mini-batch sgd

Related Terms

  • Gradient Descent
  • Mini-Batch
  • SGD

Added: November 18, 2025