Neural Networks & Deep Learning

ReLU

Rectified Linear Unit - an activation function that outputs the input if positive, zero otherwise. f(x) = max(0, x).

  • Activation Function: ReLU is one of the most widely used activation functions in modern deep networks.
  • Leaky ReLU: a variant that replaces the zero output for negative inputs with a small slope (e.g. 0.01x), so units are less likely to "die".
  • ELU: Exponential Linear Unit, a smooth variant that follows an exponential curve for negative inputs instead of clipping to zero.
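The definition above, together with the two variants listed, can be sketched in a few lines of NumPy. The `alpha` defaults below (0.01 for Leaky ReLU, 1.0 for ELU) are common conventions, not fixed requirements:

```python
import numpy as np

def relu(x):
    # ReLU: output the input if positive, zero otherwise: f(x) = max(0, x)
    return np.maximum(0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: keep a small slope alpha for negative inputs instead of zero
    return np.where(x > 0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: smooth exponential curve alpha*(e^x - 1) for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1))

x = np.array([-2.0, -0.5, 0.0, 3.0])
print(relu(x))        # negatives clipped to zero, positives unchanged
print(leaky_relu(x))  # negatives scaled by alpha rather than clipped
print(elu(x))         # negatives mapped onto a smooth exponential curve
```

Note that all three functions agree for positive inputs; they differ only in how they treat the negative half of the domain.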

Why It Matters

ReLU is the default activation in most modern architectures: it is cheap to compute, and its gradient is either 0 or 1, which mitigates the vanishing-gradient problem that affects sigmoid and tanh. Its main drawback is the "dying ReLU" problem, where units stuck in the negative region receive zero gradient and stop learning; variants like Leaky ReLU and ELU were introduced to address this.

Learn More

This term is part of the comprehensive AI/ML glossary. Explore related terms to deepen your understanding of this interconnected field.

Tags

neural-networks-deep-learning activation-function leaky-relu elu


Added: November 18, 2025