PMR5406 Redes Neurais e Lógica Fuzzy: Lecture 3 - Single Layer Perceptron
Based on:
Neural Networks, Simon Haykin, Prentice-Hall, 2nd edition
Course slides by Elena Marchiori, Vrije Universiteit
Architecture
• We consider a feed-forward neural network architecture with a single layer.
• It is sufficient to study single layer perceptrons with just one neuron:
Perceptron: Neuron Model
• Uses a non-linear (McCulloch-Pitts) model of neuron:

[Figure: inputs x1, x2, ..., xm with weights w1, w2, ..., wm and bias b produce the induced local field v; the output is y = φ(v).]

φ is the sign function:

φ(v) = +1 if v ≥ 0
       -1 if v < 0
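As an illustration (not part of the original slides), here is a minimal sketch of this neuron model in Python; the function and variable names are my own:

import numpy as np

def sign(v):
    # Sign activation: +1 if v >= 0, -1 otherwise
    return 1.0 if v >= 0 else -1.0

def perceptron_output(x, w, b):
    # Induced local field v = w.x + b, then the hard limiter phi(v)
    v = np.dot(w, x) + b
    return sign(v)

# Example: m = 2 inputs, weights (2, -1), bias 0
print(perceptron_output(np.array([1.0, 1.0]), np.array([2.0, -1.0]), 0.0))  # +1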
The decision boundary is the hyperplane

Σ_{i=1}^{m} wi xi + b = 0

In two dimensions:

• decision region for C1: w1x1 + w2x2 + b > 0
• decision region for C2: w1x1 + w2x2 + b ≤ 0
• decision boundary: w1x1 + w2x2 + b = 0

[Figure: the line w1x1 + w2x2 + b = 0 in the (x1, x2) plane, separating the decision region of C1 from the decision region of C2.]
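As a worked example (with illustrative numbers, not from the slides): take w1 = 2, w2 = -1, b = 0, so the boundary is the line 2x1 - x2 = 0. The point (1, 1) gives 2·1 - 1 = 1 > 0 and falls in the region of C1, while (0, 1) gives -1 ≤ 0 and falls in the region of C2.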
Perceptron: Limitations
• A single perceptron can only separate classes that are linearly separable.
• Example (XOR): the points (0, 1) and (1, 0) belong to C1 (output +1), while (0, 0) and (1, 1) belong to C2 (output -1).

[Figure: the four corners of the unit square in the (x1, x2) plane labelled with these classes; no straight line separates C1 from C2.]
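A short argument for why no such line exists (added here for completeness): suppose some weights satisfied w1x1 + w2x2 + b > 0 exactly on C1. Then

(0, 1) ∈ C1:  w2 + b > 0
(1, 0) ∈ C1:  w1 + b > 0
(0, 0) ∈ C2:  b ≤ 0
(1, 1) ∈ C2:  w1 + w2 + b ≤ 0

Adding the first two inequalities gives w1 + w2 + 2b > 0, hence w1 + w2 + b > -b ≥ 0, contradicting the fourth inequality.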
Perceptron: Learning Algorithm
[Figure: training points in the (x1, x2) plane, marked + for class C1 and - for class C2, with the decision boundary 2x1 - x2 = 0.]
Proof:
• Suppose x ∈ C1 ⇒ output = +1 and x ∈ C2 ⇒ output = -1.
• For simplicity assume w(1) = 0 and learning rate η = 1.
• Suppose the perceptron incorrectly classifies x(1), ..., x(n), all belonging to C1; then wT(k) x(k) ≤ 0 for k = 1, ..., n.

Error-correction rule: w(k+1) = w(k) + x(k), i.e.

w(2) = w(1) + x(1)
w(3) = w(2) + x(2)
...
w(n+1) = w(n) + x(n)

Since w(1) = 0, this gives w(n+1) = x(1) + ... + x(n).
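A minimal sketch of the resulting learning algorithm in Python (illustrative code, not from the slides; it keeps a general learning rate eta instead of the η = 1 assumed in the proof, and updates a bias b alongside the weights):

import numpy as np

def train_perceptron(X, d, eta=1.0, epochs=100):
    # X: (N, m) array of inputs; d: (N,) desired outputs in {+1, -1}
    w = np.zeros(X.shape[1])  # w(1) = 0, as in the proof
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for x, target in zip(X, d):
            y = 1.0 if np.dot(w, x) + b >= 0 else -1.0
            if y != target:
                # Error-correction rule: add x for misclassified C1 points,
                # subtract x for misclassified C2 points
                w += eta * target * x
                b += eta * target
                errors += 1
        if errors == 0:   # all patterns correctly classified
            break
    return w, b

# Example: the (linearly separable) AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([-1, -1, -1, +1], dtype=float)
w, b = train_perceptron(X, d)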
Convergence theorem (proof)
• Let w0 be a solution weight vector and α = min_k w0T x(k) > 0.
• By the Cauchy-Schwarz inequality:

||w0||² ||w(n+1)||² ≥ [w0T w(n+1)]²

• Since w(n+1) = x(1) + ... + x(n), we have w0T w(n+1) ≥ nα, and therefore

||w(n+1)||² ≥ n²α² / ||w0||²    (A)

• On the other hand, expanding the squared norms of the updates and using wT(k) x(k) ≤ 0,

||w(n+1)||² ≤ Σ_{k=1}^{n} ||x(k)||²
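Combining the two bounds completes the standard argument (the symbol β is introduced here): with β = max_k ||x(k)||², the second inequality gives ||w(n+1)||² ≤ nβ; together with (A),

n²α² / ||w0||² ≤ nβ,  so  n ≤ β ||w0||² / α²

The number of corrections n is therefore bounded, i.e. the perceptron converges after at most n_max = β ||w0||² / α² weight updates.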
Consider now a linear neuron (the hard limiter is removed):

[Figure: inputs x1, ..., xm with weights w1, ..., wm feeding a linear output y.]

y(n) = Σ_{j=0}^{m} xj(n) wj(n)

(with the convention x0(n) = 1, so that w0(n) plays the role of the bias). With d(n) the desired response, the error and the error function are

e(n) = d(n) - Σ_{j=0}^{m} xj(n) wj(n)

E(w(n)) = ½ e²(n)
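Spelling out a step the slides leave implicit, the gradient of E with respect to each weight follows from the chain rule:

∂E/∂wj = e(n) · ∂e(n)/∂wj = -e(n) xj(n)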
• We can find the minimum of the error function E by the method of steepest descent.
• The gradient of E(w) is

∇E = ( ∂E/∂w1, ..., ∂E/∂wm )

• Make a small step in the direction opposite to the gradient (the direction of steepest descent):

w(n+1) = w(n) - η ∇E(w(n))
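A minimal sketch of this update for the linear neuron above (illustrative code; the data and the step size eta are my own). Substituting the gradient computed earlier yields the LMS-style rule wj := wj + η e(n) xj(n):

import numpy as np

def steepest_descent_step(w, x, d, eta=0.1):
    # x includes x0 = 1, so w[0] acts as the bias
    e = d - np.dot(w, x)      # e(n) = d(n) - sum_j xj(n) wj(n)
    grad = -e * x             # dE/dwj = -e(n) xj(n)
    return w - eta * grad     # small step against the gradient

# Example: fit y = 1 + 2*x from three samples
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])  # first column is x0 = 1
d = np.array([1.0, 3.0, 5.0])
w = np.zeros(2)
for _ in range(200):
    for x, target in zip(X, d):
        w = steepest_descent_step(w, x, target)
print(w)  # approaches [1.0, 2.0]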