Activation Functions in Neural Networks: What Is an Activation Function?
An activation function is simply the function a node applies to its inputs to produce its output. It is also known as a transfer function.
Linear or Identity Activation Function

As you can see in the figure, the function is a straight line, i.e. linear. Therefore, the output of the function is not confined to any range.
Fig: Linear Activation Function
Equation: f(x) = x
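The identity activation can be sketched in a couple of lines. NumPy is assumed here purely for illustration, since the article names no particular library:

```python
import numpy as np

def linear(x):
    """Linear (identity) activation: f(x) = x."""
    return x

z = np.array([-2.0, 0.0, 3.5])
print(linear(z))  # [-2.   0.   3.5] -- the output is not confined to any range
```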
1. Sigmoid or Logistic Activation Function

The logistic sigmoid squashes its input into the range (0, 1). Because it saturates for large positive or negative inputs, its gradient there is nearly zero, which can cause a neural network to get stuck during training.

Equation: f(x) = 1 / (1 + e^(-x))
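A minimal NumPy sketch of the sigmoid and its saturating gradient:

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid: f(x) = 1 / (1 + e^(-x)), output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-10.0, 0.0, 10.0])
s = sigmoid(z)
print(s)            # values saturate near 0 and 1 at the extremes
print(s * (1 - s))  # the derivative s*(1-s) is nearly zero there
```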
2. Tanh or Hyperbolic Tangent Activation Function

tanh is similar to the logistic sigmoid, but often works better because its output is zero-centered. The range of the tanh function is (-1, 1). Like the sigmoid, tanh is sigmoidal (s-shaped).
Fig: tanh vs. Logistic Sigmoid
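A quick comparison of the two ranges, again assuming NumPy:

```python
import numpy as np

z = np.array([-2.0, 0.0, 2.0])
print(np.tanh(z))                # [-0.964  0.     0.964] -- range (-1, 1), zero-centered
print(1.0 / (1.0 + np.exp(-z)))  # [ 0.119  0.5    0.881] -- sigmoid stays in (0, 1)
```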
3. ReLU (Rectified Linear Unit) Activation Function

The ReLU is currently the most widely used activation function: it appears in almost all convolutional neural networks and other deep learning models.
As you can see, the ReLU is half rectified (from the bottom): f(z) is zero when z is less than zero, and f(z) is equal to z when z is greater than or equal to zero.

Equation: f(z) = max(0, z)

Range: [0, infinity)
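This piecewise definition is a one-liner in NumPy:

```python
import numpy as np

def relu(z):
    """ReLU: returns 0 for z < 0 and z for z >= 0."""
    return np.maximum(0.0, z)

z = np.array([-3.0, -0.5, 0.0, 2.0])
print(relu(z))  # [0.  0.  0.  2.] -- negative inputs are clipped to zero
```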
4. Leaky ReLU
The leak increases the range of the ReLU function: instead of outputting zero for negative inputs, it outputs the small value a*x, so the range becomes (-infinity, infinity). Usually, the value of a is 0.01 or so.

Equation: f(x) = x for x >= 0, and f(x) = a*x for x < 0
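A minimal sketch, assuming NumPy and the conventional parameter name a with the article's default of 0.01:

```python
import numpy as np

def leaky_relu(z, a=0.01):
    """Leaky ReLU: returns z for z >= 0 and a * z for z < 0."""
    return np.where(z >= 0, z, a * z)

z = np.array([-3.0, -0.5, 0.0, 2.0])
print(leaky_relu(z))  # [-0.03  -0.005  0.  2.] -- a small negative slope instead of a hard zero
```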