Activation Functions
There are a number of useful activation functions that can be used by the processing units of an ANN [27]. Three of the most commonly used are the threshold function, the linear function and the sigmoid function [41].
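Since only the sigmoid is developed in detail below, the following short Python sketch illustrates the other two; the slope parameter a and the function names are illustrative assumptions, not notation from this text.

    import numpy as np

    def threshold(v):
        # Threshold (step) activation: outputs 1 when the input is non-negative, else 0.
        return np.where(v >= 0.0, 1.0, 0.0)

    def linear(v, a=1.0):
        # Linear activation: passes the input through, scaled by the slope a.
        return a * v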
The log-sigmoid function is given as

$$\varphi(v) = \frac{1}{1 + e^{-av}}$$

where $a$ is a slope parameter. This is called the log-sigmoid because a sigmoid can also be constructed using the hyperbolic tangent function instead of this relation, in which case it would be called a tan-sigmoid. Here, we will refer to the log-sigmoid simply as the sigmoid.
The sigmoid has the property of being similar to the step function, but with the addition of a
region of uncertainty. Sigmoid functions in this respect are very similar to the input-output
relationships of biological neurons, although not exactly the same.

[Figure: graph of a sigmoid function]
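As a minimal sketch, the two sigmoid variants discussed above can be written in Python with NumPy as follows; the default slope a = 1 is an assumption, not a value fixed by the text.

    import numpy as np

    def log_sigmoid(v, a=1.0):
        # Log-sigmoid: squashes any real input into (0, 1); a controls the steepness.
        return 1.0 / (1.0 + np.exp(-a * v))

    def tan_sigmoid(v, a=1.0):
        # Tan-sigmoid: the hyperbolic-tangent variant, with outputs in (-1, 1).
        return np.tanh(a * v)

For large a the log-sigmoid approaches the threshold function, while for small a the region of uncertainty around v = 0 widens.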
3. Softmax Function
The softmax activation function is used predominantly in the output layer of a classification system. Softmax converts a vector of raw output values into posterior class probabilities, providing a measure of certainty. The softmax activation function is given as

$$\varphi(v_i) = \frac{e^{v_i}}{\sum_{j=1}^{N} e^{v_j}}, \qquad i = 1, \dots, N,$$

where $v_i$ is the raw value produced by output unit $i$ and $N$ is the number of output units.
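A small sketch of the softmax in Python with NumPy is given below; subtracting the maximum before exponentiating is a standard numerical-stability trick, not something stated in this text.

    import numpy as np

    def softmax(v):
        # Convert a vector of raw output values into posterior probabilities that sum to 1.
        shifted = v - np.max(v)  # guard against overflow in exp
        exps = np.exp(shifted)
        return exps / np.sum(exps)

    # Example: three raw output-layer values mapped to class probabilities.
    print(softmax(np.array([2.0, 1.0, 0.1])))  # approx. [0.659, 0.242, 0.099]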