Activation Function
A non-linear function applied to a neuron's weighted sum of inputs, enabling networks to learn complex patterns that a purely linear model cannot represent.
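As a minimal illustration (using NumPy and ReLU as the example activation; the input, weight, and bias values are made up):

```python
import numpy as np

def relu(z):
    """ReLU activation: max(0, z), applied elementwise."""
    return np.maximum(0.0, z)

# A single neuron: weighted sum of inputs plus a bias (illustrative values),
# then the activation applied to the result.
x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.4, 0.3, -0.6])   # weights (hypothetical)
b = 0.1                          # bias (hypothetical)

z = np.dot(w, x) + b             # pre-activation (the linear part)
y = relu(z)                      # neuron output after the non-linearity
print(z, y)                      # z is about -1.2, so the ReLU output is 0.0
```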
Related Concepts
- ReLU: the default activation in most modern deep networks
- Sigmoid: squashes values to the range (0, 1)
- Tanh: squashes values to the range (-1, 1); zero-centered, unlike sigmoid
- Softmax: turns a vector of scores into a probability distribution
Why It Matters
Activation functions are what give neural networks their expressive power: without a non-linearity between layers, any stack of linear layers collapses into a single linear transformation, as the sketch below demonstrates. Understanding them is foundational for anyone working with neural networks and deep learning.
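A small sketch of that collapse, assuming NumPy, with randomly generated weights purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 3))   # first layer weights (random, illustrative)
W2 = rng.standard_normal((2, 4))   # second layer weights
x = rng.standard_normal(3)

# Without an activation, two linear layers are equivalent to ONE linear layer
# with combined weight matrix W2 @ W1 -- depth adds no expressive power.
linear_stack = W2 @ (W1 @ x)
collapsed = (W2 @ W1) @ x
print(np.allclose(linear_stack, collapsed))      # True

# With a non-linearity (here ReLU) between the layers, the collapse no longer
# holds, which is what lets depth add representational power.
nonlinear_stack = W2 @ np.maximum(0.0, W1 @ x)
print(np.allclose(nonlinear_stack, collapsed))   # False (in general)
```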
Related Terms
ReLU
Rectified Linear Unit - an activation function that outputs its input when positive and zero otherwise: f(x) = max(0, x).
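A one-line NumPy sketch of this definition, applied elementwise to some illustrative values:

```python
import numpy as np

def relu(x):
    """f(x) = max(0, x): passes positive inputs through, zeros out the rest."""
    return np.maximum(0.0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```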
Sigmoid
An activation function that squashes values to the range (0, 1) via sigma(x) = 1 / (1 + e^(-x)); often used for binary classification and for the gates in LSTMs.
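A minimal NumPy sketch of that formula, evaluated on a few illustrative inputs:

```python
import numpy as np

def sigmoid(x):
    """sigma(x) = 1 / (1 + e^(-x)), squashing any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

print(sigmoid(np.array([-4.0, 0.0, 4.0])))  # approx [0.018 0.5   0.982]
```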
Softmax
An activation function that converts a vector of values into a probability distribution, commonly used for multi-class classification.
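A minimal NumPy sketch; subtracting the maximum before exponentiating is a common numerical-stability trick that does not change the result:

```python
import numpy as np

def softmax(logits):
    """Exponentiate and normalize so the outputs are positive and sum to 1."""
    shifted = logits - np.max(logits)   # stability: avoids overflow in exp
    exps = np.exp(shifted)
    return exps / exps.sum()

scores = np.array([2.0, 1.0, 0.1])      # illustrative class scores (logits)
probs = softmax(scores)
print(probs, probs.sum())               # approx [0.659 0.242 0.099], sums to 1.0
```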