Neural Networks & Deep Learning

He Initialization

A weight-initialization scheme designed for ReLU activations: weights are drawn from a zero-mean distribution with variance 2/n_in, where n_in is the layer's fan-in. This helps prevent vanishing or exploding gradients in deep networks.

Because ReLU zeroes out roughly half of its inputs, He initialization doubles the variance used by Xavier initialization (2/n_in instead of 1/n_in), keeping the variance of activations roughly constant from layer to layer. It was introduced by He et al. (2015) and is the standard choice for ReLU-based networks.
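A minimal NumPy sketch of the idea, assuming the normal-distribution variant (weights drawn from N(0, 2/n_in)); the function name `he_init` and the layer sizes are illustrative, not from the original text:

```python
import numpy as np

def he_init(fan_in, fan_out, rng=None):
    """He (Kaiming) normal initialization for a fan_in x fan_out weight matrix."""
    if rng is None:
        rng = np.random.default_rng(0)
    # std = sqrt(2 / fan_in) compensates for ReLU zeroing
    # roughly half of its inputs, preserving activation variance.
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

W = he_init(512, 256)
print(W.shape)            # (512, 256)
print(round(W.std(), 3))  # close to sqrt(2/512) ~ 0.0625
```

With 512 x 256 samples, the empirical standard deviation lands very close to the target sqrt(2/512); in practice, frameworks expose the same rule directly (e.g. Kaiming initialization in PyTorch).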

Related Terms

  • Weight Initialization
  • Xavier Initialization
  • ReLU

Tags

neural-networks-deep-learning weight-initialization xavier-initialization relu

Added: November 18, 2025