
ACTIVATION FUNCTION

PRESENTED BY
LALITHADEVI.B
RC2013003011017
FULL TIME RESEARCH SCHOLAR,
DEPT. OF COMPUTER SCIENCE ENGINEERING,
SRM INSTITUTE OF SCIENCE AND TECHNOLOGY.

Overview of Neural Networks

- Neural networks are made of interconnected neurons.

- Each neuron is characterized by its weights, bias, and activation function.

x = (weight * input) + bias

An activation function is then applied to this result (see the sketch below).

- The forward movement of information through the network is known as forward propagation.

- At the output layer, an error is calculated.

- Based on this error value, the weights and biases of the neurons are updated; this is back-propagation.
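
A minimal sketch of one neuron's forward pass. The input values, weights, and the choice of sigmoid here are illustrative assumptions, not from the slides:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def neuron_forward(inputs, weights, bias):
    x = np.dot(weights, inputs) + bias  # x = (weight * input) + bias
    return sigmoid(x)                   # activation applied to the result

inputs  = np.array([0.5, -1.2, 3.0])
weights = np.array([0.4, 0.7, -0.2])
print(neuron_forward(inputs, weights, bias=0.1))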

Types of Activation Functions

• Binary step function
• Linear function
• Sigmoid function
• Tanh function
• ReLU function
• Leaky ReLU function
• Softmax function
1. Binary Step Function

f(x) = 1, x >= 0
     = 0, x < 0

Range: returns either 0 or 1
def binary_step(x):
    # Returns 1 for non-negative inputs, 0 otherwise
    if x < 0:
        return 0
    else:
        return 1
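
For array inputs, NumPy's built-in step function gives a vectorized equivalent (this sketch is an addition, not from the original slides):

import numpy as np

def binary_step_vec(x):
    # np.heaviside(x, 1) returns 1 where x > 0, 1 at x == 0, 0 where x < 0
    return np.heaviside(x, 1)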

Derivative (Gradient)

f '(x) = 0, for all x

Because the gradient is zero everywhere, the binary step function cannot be trained with gradient-based methods.
2. Linear Function

f(x) = ax

def linear_function(x):
    # f(x) = a*x, with a = 4 in this example
    return 4 * x
Derivative (Gradient)

f '(x) = a
3. Sigmoid Function

f(x) = 1 / (1 + e^(-x))

Range: between 0 and 1

import numpy as np

def sigmoid_function(x):
    # Squashes any real-valued input into the open interval (0, 1)
    z = 1 / (1 + np.exp(-x))
    return z
Derivative (Gradient)

f '(x) = sigmoid(x) * (1 - sigmoid(x))
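
This identity lets the gradient be computed from the function's own output. A small added sketch (not in the original slides):

def sigmoid_derivative(x):
    # Uses the identity f'(x) = f(x) * (1 - f(x))
    s = sigmoid_function(x)
    return s * (1 - s)

For example, sigmoid_derivative(0) returns 0.25, the sigmoid's maximum slope.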
4. Tanh Function

tanh(x) = 2 * sigmoid(2x) - 1

tanh(x) = 2 / (1 + e^(-2x)) - 1

Range: between -1 and 1

def tanh_function(x):
    # A scaled and shifted sigmoid; equivalent to np.tanh(x)
    z = (2 / (1 + np.exp(-2 * x))) - 1
    return z
Derivative (Gradient)

f '(x) = 1 - tanh(x)^2
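
A short numerical sketch of this derivative (an addition, not from the slides):

def tanh_derivative(x):
    # Uses the identity f'(x) = 1 - tanh(x)**2
    t = np.tanh(x)
    return 1 - t * t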
5. ReLU Function

f(x) = max(0, x)

Range: [0, ∞) (returns either 0 or x)

def relu_function(x):
    # Passes positive inputs through unchanged; clips negatives to 0
    if x < 0:
        return 0
    else:
        return x
Derivative (Gradient)

f '(x) = 1, x >= 0
       = 0, x < 0
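
For arrays, a vectorized form and its gradient (an added sketch, following the slide's convention of gradient 1 at x = 0):

def relu_vec(x):
    # Element-wise max(0, x)
    return np.maximum(0, x)

def relu_derivative(x):
    # 1 where x >= 0, 0 elsewhere
    return np.where(x >= 0, 1.0, 0.0)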
6. Leaky ReLU Function

f(x) = 0.01x, x < 0
     = x,     x >= 0

def leaky_relu_function(x):
    # Like ReLU, but leaks a small slope (0.01) for negative inputs
    if x < 0:
        return 0.01 * x
    else:
        return x
Derivative (Gradient)

f '(x) = 1,    x >= 0
       = 0.01, x < 0
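
A vectorized one-liner (an added sketch) mirrors the piecewise definition:

def leaky_relu_vec(x, alpha=0.01):
    # alpha is the leak slope for negative inputs; 0.01 matches the slide
    return np.where(x >= 0, x, alpha * x)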
7. Softmax Function

Range: between 0 and 1, and the outputs sum to 1 (a probability distribution)

def softmax_function(x):
    # Exponentiate each score, then normalize so the outputs sum to 1
    z = np.exp(x)
    z_ = z / z.sum()
    return z_

If you have three classes, use 3 neurons in the output layer. Suppose the outputs from these neurons are [1.2, 0.9, 0.75]. Applying the softmax function over these values gives [0.42, 0.31, 0.27].
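
The slide's example can be checked directly; the stable variant below is an added note (subtracting the maximum avoids overflow for large scores but leaves the result unchanged):

import numpy as np

scores = np.array([1.2, 0.9, 0.75])
print(softmax_function(scores))  # -> approximately [0.42 0.31 0.27]

def softmax_stable(x):
    # exp(x - max(x)) / sum gives the same distribution without overflow
    z = np.exp(x - x.max())
    return z / z.sum()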


How to Choose a Hidden Layer Activation Function

Multilayer Perceptron (MLP): ReLU activation function.

Convolutional Neural Network (CNN): ReLU activation function.

Recurrent Neural Network (RNN): Tanh and/or sigmoid activation function.

How to Choose an Output Activation Function

Binary classification: Sigmoid activation.

Multiclass classification: Softmax activation.

Regression: Linear activation.

These pairings are illustrated in the sketch below.
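
A minimal Keras sketch of these choices (an added illustration; the use of TensorFlow/Keras, the layer sizes, and the 3-class setup are assumptions, not from the slides):

import tensorflow as tf

# Hidden layers use ReLU (the MLP recommendation above); the output layer
# uses softmax with 3 units for a 3-class problem. Sizes are illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(20,)),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

For binary classification, the last layer would become Dense(1, activation='sigmoid'); for regression, Dense(1, activation='linear').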
THANK YOU
