
Activation Functions

Deep Learning MasterClass


Why do we need Activation Functions?

Without an activation function, the weights and biases can only apply a linear transformation, so the neural network is just a linear regression model. A linear equation is a polynomial of degree one, which is simple to solve but limited in its ability to model complex, higher-degree relationships.

By contrast, adding an activation function applies a non-linear transformation to the input, making the network capable of solving complex problems such as language translation and image classification.

In addition, activation functions are differentiable, which makes backpropagation possible: the gradient of the loss function can be computed with respect to the weights and used to optimize them.
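As an illustrative sketch (using NumPy; the array shapes and values are arbitrary assumptions, not from the slides), composing two linear layers without an activation collapses into a single linear transformation, while inserting a non-linearity between them breaks that equivalence:

import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))          # a small batch of 4 inputs with 3 features

# Two "layers" with only weights and biases (no activation).
W1, b1 = rng.normal(size=(3, 5)), rng.normal(size=5)
W2, b2 = rng.normal(size=(5, 2)), rng.normal(size=2)

# Stacking linear layers collapses into one linear transformation:
# (x @ W1 + b1) @ W2 + b2  ==  x @ (W1 @ W2) + (b1 @ W2 + b2)
two_linear = (x @ W1 + b1) @ W2 + b2
one_linear = x @ (W1 @ W2) + (b1 @ W2 + b2)
print(np.allclose(two_linear, one_linear))       # True: no extra expressive power

# Adding a non-linear activation (ReLU here) between the layers breaks
# this equivalence, letting the network model non-linear relationships.
with_activation = np.maximum(0, x @ W1 + b1) @ W2 + b2
print(np.allclose(with_activation, one_linear))  # False in general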
Different Activation Functions
Sigmoid Activation Function
The main reason we use the sigmoid function is that its output lies in the range (0, 1). It is therefore especially suited to models where we have to predict a probability as the output: since the probability of anything exists only between 0 and 1, sigmoid is a natural choice. The function is differentiable, meaning we can find the slope of the sigmoid curve at any point. The function is monotonic, but its derivative is not. The logistic sigmoid can cause a neural network to get stuck during training, because its gradient becomes very small for large positive or negative inputs. The softmax function is a more generalized logistic activation function, used to produce a probability distribution over multiple classes.

Thank You
