# Activation Function

In a neural network, an activation function transforms a node's weighted input into an output, which is then passed forward to the subsequent layer. Activation functions add non-linearity to the output, which enables neural networks to solve non-linear problems. In other words, a neural network without an activation function is essentially just a [linear regression](https://machine-learning.paperspace.com/wiki/linear-regression) model.
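To see why, here is a minimal NumPy sketch (the weight and bias names are illustrative, not from any particular framework) showing that two stacked layers *without* an activation function collapse into a single linear map:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two "layers" with no activation: y = W2 @ (W1 @ x + b1) + b2
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(2, 4)), rng.normal(size=2)

x = rng.normal(size=3)
two_layer = W2 @ (W1 @ x + b1) + b2

# ...which algebraically reduces to a single linear layer y = W @ x + b
W = W2 @ W1
b = W2 @ b1 + b2
one_layer = W @ x + b

assert np.allclose(two_layer, one_layer)
```

No matter how many linear layers are stacked, the composition is still linear; inserting a non-linear activation between layers is what breaks this collapse.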

### Activation Function Types

Common activation functions include **Linear**, **Sigmoid**, **Tanh**, and **ReLU**, but there are many others.

![](https://2327526407-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-LvBP1svpACTB1R1x_U4%2F-LvNWUoWieQqaGmU_gl9%2F-LvO3qs2RImYjpBE8vln%2Factivation-functions3.jpg?alt=media\&token=f96a3007-5888-43c3-a256-2dafadd5df7c)
