ReLU
Rectified Linear Unit - an activation function that outputs its input when positive and zero otherwise: f(x) = max(0, x).
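A minimal NumPy sketch of this definition (the function name relu and the sample inputs are illustrative, not tied to any particular library):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Element-wise ReLU: f(x) = max(0, x).
    return np.maximum(0.0, x)

# Negative inputs are clamped to zero; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```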
Related Concepts
- Activation Function: ReLU is the most widely used activation function in modern deep networks
- Leaky ReLU: a ReLU variant that replaces the zero output for negative inputs with a small slope
- ELU: a smooth alternative to ReLU that takes negative values for negative inputs
Why It Matters
ReLU is the default hidden-layer activation in most modern neural networks: it is cheap to compute, and because its gradient is 1 for positive inputs it avoids the saturation that causes vanishing gradients with sigmoid or tanh. Its main drawback is the "dying ReLU" problem, where neurons whose inputs stay negative always output zero and stop learning, which motivates variants such as Leaky ReLU and ELU.
Related Terms
Activation Function
A non-linear function applied to neuron outputs, enabling networks to learn complex patterns; without one, stacked layers collapse to a single linear transformation.
ELU
Exponential Linear Unit - an activation function that behaves like ReLU for positive inputs but outputs α(e^x − 1) for negative inputs, keeping mean activations closer to zero and avoiding dead neurons.
Leaky ReLU
A variant of ReLU allowing small negative values (f(x) = x if x > 0, else αx where α ≈ 0.01), preventing dead neurons.
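For comparison, a sketch of the two ReLU variants above, using common default slopes (α = 0.01 for Leaky ReLU and α = 1.0 for ELU; actual defaults vary by framework):

```python
import numpy as np

def leaky_relu(x: np.ndarray, alpha: float = 0.01) -> np.ndarray:
    # Leaky ReLU: x if x > 0, else alpha * x (small non-zero slope for negatives).
    return np.where(x > 0, x, alpha * x)

def elu(x: np.ndarray, alpha: float = 1.0) -> np.ndarray:
    # ELU: x if x > 0, else alpha * (exp(x) - 1) (smooth, bounded negative part).
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(leaky_relu(x))  # [-0.02   -0.005   0.      1.5   ]
print(elu(x))         # [-0.8647 -0.3935  0.      1.5   ]
```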