Neural Networks & Deep Learning

Swish

A smooth, non-monotonic activation function, swish(x) = x * sigmoid(beta * x), that often matches or outperforms ReLU in deep networks. It was found through automated search over candidate activation functions (Ramachandran et al., 2017, "Searching for Activation Functions"); the common beta = 1 case, x * sigmoid(x), is also known as SiLU.
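
As a concrete reference, here is a minimal NumPy sketch of the function; the swish name and beta parameter follow the general form above, and the printed example values are illustrative:

    import numpy as np

    def swish(x, beta=1.0):
        # swish(x) = x * sigmoid(beta * x); beta = 1.0 gives the SiLU form.
        return x * (1.0 / (1.0 + np.exp(-beta * x)))

    # Smooth near zero, approximately linear for large positive inputs:
    print(swish(np.array([-2.0, 0.0, 2.0])))  # ~[-0.238, 0.0, 1.762]

In practice, frameworks provide this directly, for example torch.nn.SiLU in PyTorch.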

Related Terms

  • Activation Function
  • ReLU
  • GELU

Tags

neural-networks-deep-learning activation-function relu gelu

Added: November 18, 2025