Activation Function

In a neural network, an activation function transforms a neuron's weighted input sum into an output that is passed forward to the next layer. Non-linear activation functions are what allow neural networks to solve non-linear problems: without them, a network of any depth collapses into a single linear transformation, making it essentially a linear regression model.
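The claim that a network without activation functions collapses into a linear model can be checked numerically. The sketch below (using NumPy, with arbitrary random weights) shows that two stacked linear layers are equivalent to one linear layer whose weight matrix is the product of the two:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3))  # first layer weights
W2 = rng.normal(size=(2, 4))  # second layer weights
x = rng.normal(size=3)        # an input vector

# Two linear layers with no activation in between...
two_layer = W2 @ (W1 @ x)

# ...equal a single linear layer with weights W2 @ W1
one_layer = (W2 @ W1) @ x

print(np.allclose(two_layer, one_layer))  # True
```

Inserting any non-linear function between the two layers breaks this equivalence, which is exactly what gives deep networks their extra expressive power.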

Activation Function Types

Common activation functions include **Linear**, **Sigmoid**, **Tanh**, and **ReLU**, but there are many others.
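The four functions named above can be sketched in a few lines of NumPy. These are the standard textbook definitions, applied element-wise:

```python
import numpy as np

def linear(x):
    # Identity: output equals input, no non-linearity added
    return x

def sigmoid(x):
    # Squashes input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes input into the range (-1, 1)
    return np.tanh(x)

def relu(x):
    # Zero for negative input, identity for positive input
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))      # [0. 0. 2.]
print(sigmoid(0.0)) # 0.5
```

ReLU is the most common default in modern hidden layers because it is cheap to compute and avoids the vanishing gradients that saturating functions like Sigmoid and Tanh suffer from at large input magnitudes.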