Neural Networks & Deep Learning

Leaky ReLU

A variant of ReLU that allows small negative outputs: f(x) = x if x > 0, else αx, where α ≈ 0.01. Because the gradient is nonzero for negative inputs, units cannot get permanently stuck at zero ("dead neurons").
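The piecewise definition above can be sketched in a few lines of NumPy; the function and parameter names here are illustrative, not from any particular library:

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    # f(x) = x where x > 0, alpha * x elsewhere.
    return np.where(x > 0, x, alpha * x)

def leaky_relu_grad(x, alpha=0.01):
    # Derivative: 1 for x > 0, alpha otherwise,
    # so the gradient never vanishes entirely.
    return np.where(x > 0, 1.0, alpha)

x = np.array([-2.0, 0.0, 3.0])
print(leaky_relu(x))       # negative input scaled by alpha, positives pass through
print(leaky_relu_grad(x))  # slope is alpha on the negative side, 1 on the positive side
```

With α = 0.01, an input of -2.0 maps to -0.02 rather than 0, which is the "leak" that distinguishes this activation from plain ReLU.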

Leaky ReLU is a common drop-in replacement for ReLU in deep networks when units risk becoming inactive during training.

Related Terms

  • ReLU
  • Activation Function
  • PReLU

Tags

neural-networks-deep-learning relu activation-function prelu

Added: November 18, 2025