Training & Optimization
ZeRO
Zero Redundancy Optimizer (ZeRO): a family of techniques for memory-efficient distributed training that partition training state across data-parallel workers instead of replicating it. Its three stages progressively shard optimizer states (stage 1), gradients (stage 2), and model parameters (stage 3).
By removing redundant copies of training state from each GPU, ZeRO lets standard data parallelism scale to models far larger than a single device's memory; it underpins DeepSpeed and influenced fully sharded approaches such as PyTorch FSDP.
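The savings can be illustrated with the per-GPU memory accounting used in the ZeRO paper. The sketch below assumes mixed-precision Adam, where each parameter costs 2 bytes (fp16 weights) + 2 bytes (fp16 gradients) + 12 bytes (fp32 master weights, momentum, and variance); `zero_memory_per_gpu` is a hypothetical helper name, not a library API.

```python
def zero_memory_per_gpu(num_params: float, world_size: int, stage: int) -> float:
    """Approximate per-GPU bytes for ZeRO stages 0-3 (stage 0 = plain data parallelism).

    Assumes mixed-precision Adam: 2 bytes/param for fp16 weights,
    2 bytes/param for fp16 gradients, 12 bytes/param for fp32 optimizer state.
    """
    params = 2 * num_params
    grads = 2 * num_params
    opt_states = 12 * num_params
    if stage >= 1:
        opt_states /= world_size   # ZeRO-1: partition optimizer states
    if stage >= 2:
        grads /= world_size        # ZeRO-2: also partition gradients
    if stage >= 3:
        params /= world_size       # ZeRO-3: also partition parameters
    return params + grads + opt_states
```

For the 7.5B-parameter, 64-GPU example from the paper, stage 0 needs 16 bytes per parameter, about 120 GB per GPU, while stage 3 shards everything and needs roughly 1.9 GB per GPU.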
Related Concepts
- Distributed Training
- Memory Efficiency
- Large Models
Tags
training-optimization distributed-training memory-efficiency large-models