Perceptron: Neuron Model (Special Form of Single Layer Feed Forward)
The perceptron uses the sign (step) activation function:

φ(v) = +1 if v ≥ 0
φ(v) = −1 if v < 0

[Figure: perceptron neuron model — inputs x1, …, xn with weights w1, …, wn and a bias b feed a summing node v = w1·x1 + … + wn·xn + b, followed by the activation φ(v), which produces the output y.]
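The neuron model above can be sketched in a few lines of Python. This is an illustrative sketch (the function and variable names are not from the slides), computing y = φ(w·x + b) with the sign activation:

```python
# Sketch of the perceptron neuron model: weighted sum plus bias,
# passed through the sign (step) activation function.

def sign(v):
    """Sign activation: +1 if v >= 0, -1 otherwise."""
    return 1 if v >= 0 else -1

def perceptron_output(x, w, b):
    """y = phi(w . x + b) for input vector x, weights w, bias b."""
    v = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sign(v)

# Example: weights (1, 1), bias -0.5
print(perceptron_output((0, 0), (1, 1), -0.5))  # -1
print(perceptron_output((1, 0), (1, 1), -0.5))  # +1
```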
Perceptron for Classification
The Perceptron is used for binary classification.
To train a Perceptron for a classification task, find weights such that the training examples are correctly classified.
Geometrically, this means finding a hyperplane that separates the examples of the two classes.
The Perceptron can only model linearly separable classes.
When the two classes are not linearly separable, it may be desirable to
obtain a linear separator that minimizes the mean squared error.
Given training examples of classes C1 and C2, train the Perceptron so that:
If the output of the Perceptron is +1, the input is assigned to class C1.
If the output is −1, the input is assigned to class C2.
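The training procedure described above can be sketched with the classic perceptron learning rule. This is a minimal sketch (names, learning rate, and epoch count are illustrative, not from the slides), using the OR function from the next section as training data with targets in {−1, +1}:

```python
# Sketch of perceptron training: update weights only on misclassified
# examples, w <- w + lr * t * x and b <- b + lr * t.

def sign(v):
    return 1 if v >= 0 else -1

def train_perceptron(examples, lr=0.1, epochs=20):
    """examples: list of (inputs, target) with target in {-1, +1}."""
    n = len(examples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in examples:
            y = sign(sum(wi * xi for wi, xi in zip(w, x)) + b)
            if y != t:  # update only on misclassification
                for i in range(n):
                    w[i] += lr * t * x[i]
                b += lr * t
    return w, b

# OR is linearly separable: only (0, 0) belongs to the -1 class.
or_examples = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(or_examples)
preds = [sign(sum(wi * xi for wi, xi in zip(w, x)) + b) for x, _ in or_examples]
print(preds)  # [-1, 1, 1, 1]
```

Because OR is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating hyperplane in finitely many updates.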
Boolean function OR – Linearly separable
[Figure: OR plotted in the (X1, X2) plane — (0,0) is false; (0,1), (1,0), and (1,1) are true. A single straight line separates the false point from the three true points.]
Condition Used in Perceptron
The bias shifts the decision boundary away from the origin and does not
depend on any input value.
[Figure: XOR plotted in the (X1, X2) plane — (0,0) and (1,1) are false; (0,1) and (1,0) are true. No single straight line can separate the two classes.]
FFNN for XOR
The ANN for XOR has two hidden nodes that realize this non-linear
separation, using the sign (step) activation function.
Arrows from input nodes to two hidden nodes indicate the directions of
the weight vectors (1,-1) and (-1,1).
The output node is used to combine the outputs of the two hidden nodes.
Inputs | Hidden node outputs | Output node net | Output | X1 XOR X2
X1  X2 |     H1        H2    |                 |        |
 0   0 |      0         0    |      –0.5       |   0    |    0
 0   1 |     –1         1    |       0.5       |   1    |    1
 1   0 |      1        –1    |       0.5       |   1    |    1
 1   1 |      0         0    |      –0.5       |   0    |    0
Since we represent the two logical states by 0 (false) and 1 (true),
we map the negative outputs (–1, –0.5) of the hidden and output
layers to 0 and the positive output (0.5) to 1.
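The XOR network above can be sketched directly. The hidden weight vectors (1, −1) and (−1, 1) are from the text; the output node's bias (−0.5) matches the table; the mapping of negative values to 0 follows the note above. The helper names are illustrative:

```python
# Sketch of the two-hidden-node XOR network described in the text.

def hidden(x1, x2):
    h1 = 1 * x1 + (-1) * x2   # weight vector (1, -1)
    h2 = (-1) * x1 + 1 * x2   # weight vector (-1, 1)
    return h1, h2

def to_binary(v):
    """Map negative (and zero) values to 0 (false), positive to 1 (true)."""
    return 1 if v > 0 else 0

def xor_net(x1, x2):
    h1, h2 = hidden(x1, x2)
    # Output node combines the mapped hidden outputs with bias -0.5.
    net = to_binary(h1) + to_binary(h2) - 0.5
    return to_binary(net)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_net(x1, x2))
```

Running the loop reproduces the truth table above: only (0,1) and (1,0) produce output 1.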
The Backpropagation Algorithm
The total error for the neural network is the sum of these errors:
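The errors referred to are presumably the squared errors at the individual output nodes; a standard form of the total error, with t_k the target and o_k the actual output of output node k, is:

```latex
E = \frac{1}{2} \sum_{k} (t_k - o_k)^2
```

The factor 1/2 is conventional: it cancels when the derivative of E is taken during the backpropagation weight updates.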