Neural Networks & Deep Learning
ELU
Exponential Linear Unit - an activation function defined as f(x) = x if x > 0, else α(e^x − 1), typically with α = 1. Allowing smooth negative outputs helps mitigate vanishing gradients.
Because the negative branch saturates at −α for large negative inputs, ELU pushes mean activations closer to zero than ReLU, which can speed up learning and avoid dead neurons.
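As a minimal sketch, ELU could be implemented in NumPy as follows (the default α = 1 follows the common convention; np.expm1 is used for numerical accuracy near zero):

```python
import numpy as np

def elu(x, alpha=1.0):
    # Identity for positive inputs; alpha * (e^x - 1) for negatives.
    # np.expm1 computes exp(x) - 1 accurately near zero, and clamping
    # the exponent at 0 avoids overflow warnings for large positive x.
    return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0.0)))

x = np.array([-3.0, -1.0, 0.0, 2.0])
print(elu(x))  # negative inputs saturate smoothly toward -alpha
```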
Related Concepts
- Activation Function
- ReLU
- Leaky ReLU
Tags
neural-networks-deep-learning activation-function relu leaky-relu
Related Terms
Activation Function
A function applied to neuron outputs to introduce non-linearity, enabling networks to learn complex patterns.
Leaky ReLU
A ReLU variant that gives negative inputs a small non-zero slope (f(x) = x if x > 0, else αx where α ≈ 0.01), preventing dead neurons.
ReLU
Rectified Linear Unit - an activation function that outputs the input if positive, zero otherwise. f(x) = max(0, x).
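All three activations are available as ready-made modules in frameworks such as PyTorch; a quick comparison sketch (assuming torch is installed):

```python
import torch
import torch.nn as nn

x = torch.tensor([-3.0, -1.0, 0.0, 2.0])

# Compare how each activation treats negative inputs:
# ReLU zeroes them, Leaky ReLU scales them by a small slope,
# and ELU saturates smoothly toward -alpha.
for act in (nn.ReLU(), nn.LeakyReLU(negative_slope=0.01), nn.ELU(alpha=1.0)):
    print(act, act(x))
```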