Lect 5-6: Activation Function
3. Recurrent network
[Figure: recurrent network; a single-layer, fully connected network whose outputs are fed back to the inputs through unit-delay operators (a multiple-layer variant is also pictured)]
RNN topology involves backward links from the output to the input and hidden layers. The notion of time is encoded in the RNN's information-processing scheme. RNNs are thus used in applications such as speech processing, where the inputs are time-sequence data.
• Feed-forward networks (SLP & MLP):
– Information flows in one direction only, from inputs to outputs
– No sense of time or memory of previous states
• Recurrency
– Nodes connect back to other nodes or themselves
– Information flow is multidirectional
– Sense of time and memory of previous state(s)
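A minimal sketch of the recurrence described above, assuming a plain NumPy implementation with made-up layer sizes (rnn_step, W_xh, W_hh are illustrative names, not from the lecture): the state at time t mixes the current input with the previous state, which is what gives the network its sense of time.

import numpy as np

def rnn_step(x_t, h_prev, W_xh, W_hh, b_h):
    # One time step: the new state depends on the current input
    # and, through W_hh, on the previous state (the backward link).
    return np.tanh(x_t @ W_xh + h_prev @ W_hh + b_h)

rng = np.random.default_rng(0)
n_in, n_hidden = 3, 4                        # toy dimensions
W_xh = rng.normal(size=(n_in, n_hidden))
W_hh = rng.normal(size=(n_hidden, n_hidden))
b_h = np.zeros(n_hidden)

h = np.zeros(n_hidden)                       # initial state: no memory yet
for x_t in rng.normal(size=(5, n_in)):       # a length-5 input sequence
    h = rnn_step(x_t, h, W_xh, W_hh, b_h)    # state carries over between steps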
To make the network work more efficiently and to obtain the exact output, an activation may be applied. This activation helps in achieving the correct output: the activation function is applied over the net input to calculate the output of the ANN.
Information processing consists of two parts:
• Input
• Output
The integration function is associated with the input of a processing element. This function serves to combine the incoming activations and information into a net input to the processing element.
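A sketch of one processing element under these definitions, assuming the usual weighted-sum integration (the names net_input and activation are illustrative, not from the lecture):

import numpy as np

def net_input(x, w, b):
    # Integration function: combine incoming activations into one net input.
    return np.dot(w, x) + b

def activation(v):
    # Activation applied over the net input (a simple step here;
    # other choices appear later in this lecture).
    return 1.0 if v >= 0.0 else 0.0

x = np.array([0.5, -1.0, 2.0])           # incoming activations
w = np.array([0.4, 0.3, -0.2])           # connection weights
y = activation(net_input(x, w, b=0.1))   # output of the processing element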
The nonlinear activation function is used to ensure that the actual response of a neuron is bounded, i.e., conditioned.
Activation functions for the hidden units are needed to introduce nonlinearity into the network; without them, a stack of layers collapses into a single linear mapping (demonstrated below).
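A small demonstration of the collapse mentioned above (arbitrary example weights): composing two linear layers gives exactly one linear layer, so without a nonlinear activation between them the extra layer adds no representational power.

import numpy as np

W1 = np.array([[1.0, 2.0], [0.5, -1.0]])   # first linear layer
W2 = np.array([[0.3, -0.7], [1.5, 0.2]])   # second linear layer
x = np.array([0.4, -1.2])

two_layers = W2 @ (W1 @ x)   # linear layer followed by linear layer
one_layer = (W2 @ W1) @ x    # the same map as a single linear layer
print(np.allclose(two_layers, one_layer))  # True: no nonlinearity, no gain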
Threshold (Step) Function
Yk = 1 if Vk >= θ
Yk = 0 otherwise
[Figure: step function; Yk jumps from 0 to 1 at Vk = θ, over Vk in (-∞, +∞)]
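A direct transcription of the threshold rule, with θ left as a parameter:

import numpy as np

def step(v, theta=0.0):
    # Threshold activation: output 1 once the net input reaches theta.
    return np.where(v >= theta, 1.0, 0.0)

print(step(np.array([-0.5, 0.0, 0.7])))  # [0. 1. 1.]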
Piecewise-linear Function
φ(v) = 1 if v >= +1/2
φ(v) = v if -1/2 < v < +1/2
φ(v) = 0 if v <= -1/2
[Figure: piecewise-linear function; the output Oj rises linearly from 0 to +1 across the interval (-1/2, +1/2)]
Signum (Bipolar Threshold) Function
Yk = +1 if Vk >= 0
Yk = -1 otherwise
[Figure: signum function; Yk switches from -1 to +1 at Vk = 0, over Vk in (-∞, +∞)]
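Sketches of both rules above (np.where mirrors the case-by-case definitions; the function names are illustrative):

import numpy as np

def piecewise_linear(v):
    # 1 above +1/2, 0 below -1/2, linear (equal to v) in between.
    return np.where(v >= 0.5, 1.0, np.where(v <= -0.5, 0.0, v))

def signum(v):
    # Bipolar threshold: +1 for nonnegative net input, -1 otherwise.
    return np.where(v >= 0.0, 1.0, -1.0)

print(piecewise_linear(np.array([-1.0, 0.25, 1.0])))  # [0.   0.25 1.  ]
print(signum(np.array([-0.3, 0.0, 2.0])))             # [-1.  1.  1.]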
Nonlinear Activation Function (Sigmoid)
Oj = 1 / (1 + e^(-inj))
[Figure: logistic sigmoid; Oj rises smoothly from 0 toward +1 as the net input inj increases]
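A sketch of the logistic sigmoid; unlike the step functions it is smooth and differentiable, which matters for gradient-based training:

import numpy as np

def sigmoid(v):
    # Logistic sigmoid: squashes any net input into the open interval (0, 1).
    return 1.0 / (1.0 + np.exp(-v))

print(sigmoid(np.array([-5.0, 0.0, 5.0])))  # ~[0.007 0.5 0.993]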
Hyperbolic Tangent Function
Yk = tanh(Vk) = (e^Vk - e^(-Vk)) / (e^Vk + e^(-Vk)), bounded in (-1, +1)
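A sketch using NumPy's built-in np.tanh; like the signum it is bipolar, but smooth:

import numpy as np

v = np.array([-3.0, 0.0, 3.0])
print(np.tanh(v))   # ~[-0.995  0.  0.995], bounded in (-1, +1)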