EELU ANN ITF309 Lecture 07 Spring 2024
Individual Decisions
[Network diagram: two first-layer perceptrons (n^1_1/a^1_1 and n^1_2/a^1_2) each make an individual decision; a second-layer AND neuron combines the two decisions.]
EELU ITF309 Neural Network
3/19/2024 Lecture 7
Solved Problem P11.1
Design a multilayer network to distinguish these categories:

Class I:  p_1 = [1 1]^T,  p_2 = [-1 -1]^T
Class II: p_3 = [1 -1]^T, p_4 = [-1 1]^T

A single-layer perceptron would need a weight matrix W and bias b with
Wp_1 + b > 0 and Wp_2 + b > 0, but Wp_3 + b < 0 and Wp_4 + b < 0.
There is no hyperplane that can separate these two categories.
Solution of Problem P11.1
[Network diagram: a two-layer solution. Two first-layer neurons form individual decision boundaries; the second layer combines their outputs with AND/OR logic so that the combined boundary separates the two classes.]
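A two-layer solution of this kind can be sketched in code. The weights below are illustrative choices of ours (not taken from the slide), assuming the XOR-style assignment Class I = {[1,1]^T, [-1,-1]^T}, Class II = {[1,-1]^T, [-1,1]^T} and hard-limit neurons: each first-layer neuron detects one Class I corner, and a second-layer OR combines the detections.

```python
import numpy as np

def hardlim(n):
    """Hard-limit transfer function: 1 if n >= 0, else 0."""
    return (n >= 0).astype(int)

# Layer 1: two neurons, each forming one decision boundary.
# Neuron 1 fires near [1, 1]; neuron 2 fires near [-1, -1].
W1 = np.array([[ 1.0,  1.0],
               [-1.0, -1.0]])
b1 = np.array([-1.5, -1.5])

# Layer 2: a single OR neuron combines the two half-planes.
W2 = np.array([[1.0, 1.0]])
b2 = np.array([-0.5])

def classify(p):
    """Return 1 for Class I, 0 for Class II."""
    a1 = hardlim(W1 @ p + b1)
    a2 = hardlim(W2 @ a1 + b2)
    return int(a2[0])

for p in ([1, 1], [-1, -1], [1, -1], [-1, 1]):
    cls = "I" if classify(np.array(p, float)) else "II"
    print(p, "-> Class", cls)
```

No single hyperplane can produce this labeling, but the two boundaries plus the logical combination in the second layer can.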
Function Approximation
Two-layer, 1-2-1 network.

Transfer functions:

f^1(n) = 1 / (1 + e^{-n}) (log-sigmoid),   f^2(n) = n (linear).

Nominal parameter values:

w^1_{1,1} = 10, w^1_{2,1} = 10, b^1_1 = -10, b^1_2 = 10,
w^2_{1,1} = 1, w^2_{1,2} = 1, b^2 = 0.

[Plot: network response a^2 versus p over -2 <= p <= 2.]
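With the nominal values w^1 = [10, 10]^T, b^1 = [-10, 10]^T, w^2 = [1, 1], b^2 = 0, the shape of the response is easy to check; a minimal sketch (the helper names `logsig` and `response` are ours):

```python
import numpy as np

def logsig(n):
    """Log-sigmoid transfer function f1(n) = 1 / (1 + e^-n)."""
    return 1.0 / (1.0 + np.exp(-n))

# Nominal parameter values of the 1-2-1 example network.
W1 = np.array([10.0, 10.0])   # first-layer weights w1_{1,1}, w1_{2,1}
b1 = np.array([-10.0, 10.0])  # first-layer biases  b1_1, b1_2
W2 = np.array([1.0, 1.0])     # second-layer weights w2_{1,1}, w2_{1,2}
b2 = 0.0

def response(p):
    """Network output a2 = f2(W2 f1(W1 p + b1) + b2), with f2 linear."""
    a1 = logsig(W1 * p + b1)
    return W2 @ a1 + b2

# Each first-layer neuron contributes a sigmoid "step" centered where its
# net input is zero, i.e. at p = -b/w: here p = 1 and p = -1.
for p in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"p = {p:+.1f}  a2 = {response(p):.3f}")
```

The printed values show the two steps: the response rises from about 0 to about 2, with one step centered at p = -1 and one at p = 1.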
Function Approximation
The centers of the steps occur where
the net input to a neuron in the first
layer is zero.
n^1_1 = w^1_{1,1} p + b^1_1 = 0   =>   p = -b^1_1 / w^1_{1,1} = -(-10)/10 = 1
n^1_2 = w^1_{2,1} p + b^1_2 = 0   =>   p = -b^1_2 / w^1_{2,1} = -(10)/10 = -1
The steepness of each step can be
adjusted by changing the network
weights.
Effect of Parameter Changes
[Plots: network response over -2 <= p <= 2 as one parameter at a time is varied around its nominal value: b^1_2 from 0 to 20 (in steps of 5); w^2_{1,1} from -1 to 1; w^2_{1,2} from -1 to 1; b^2 from -1 to 1 (in steps of 0.5).]
The backpropagation algorithm was used to train the multilayer perceptron (MLP).
"MLP" is used to describe any general feedforward neural network (FNN), i.e. a network with no recurrent connections.
[Diagram: input layer with inputs x_1 ... x_n; hidden layer with weights w_ij; output layer with weights w_jk and outputs y_1 ... y_l. Input signals propagate forward through the layers; error signals propagate backward.]
The approximate steepest-descent update for the biases is

b^m(k+1) = b^m(k) - α s^m,

where the sensitivity of layer m is defined as s^m ≡ ∂F̂/∂n^m.
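One steepest-descent step can be sketched directly from this formula; the numbers below are illustrative values of ours (not the lecture's example), and the companion weight update W^m(k+1) = W^m(k) - α s^m (a^{m-1})^T is included for completeness:

```python
import numpy as np

alpha = 0.1                            # learning rate (alpha)

s_m = np.array([0.2, -0.5])            # sensitivity s^m (illustrative values)
b_m = np.array([0.1, 0.3])             # current biases b^m(k)
W_m = np.zeros((2, 3))                 # current weights W^m(k)
a_prev = np.array([0.5, -1.0, 0.25])   # previous-layer output a^{m-1}

# Bias update: b^m(k+1) = b^m(k) - alpha * s^m
b_next = b_m - alpha * s_m

# Weight update: W^m(k+1) = W^m(k) - alpha * s^m (a^{m-1})^T
W_next = W_m - alpha * np.outer(s_m, a_prev)

print("b^m(k+1) =", b_next)
print("W^m(k+1) =", W_next)
```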
Backpropagating the Sensitivity
Backpropagation: a recurrence relationship in which the sensitivity at layer m is computed from the sensitivity at layer m+1.

Jacobian matrix:

∂n^{m+1}/∂n^m =
[ ∂n^{m+1}_1/∂n^m_1          ∂n^{m+1}_1/∂n^m_2          ...  ∂n^{m+1}_1/∂n^m_{S^m}
  ∂n^{m+1}_2/∂n^m_1          ∂n^{m+1}_2/∂n^m_2          ...  ∂n^{m+1}_2/∂n^m_{S^m}
  ...
  ∂n^{m+1}_{S^{m+1}}/∂n^m_1  ∂n^{m+1}_{S^{m+1}}/∂n^m_2  ...  ∂n^{m+1}_{S^{m+1}}/∂n^m_{S^m} ]
Matrix Representation
The (i,j) element of the Jacobian matrix:

∂n^{m+1}_i/∂n^m_j = ∂( Σ_{l=1}^{S^m} w^{m+1}_{i,l} a^m_l + b^{m+1}_i ) / ∂n^m_j
                  = w^{m+1}_{i,j} ∂a^m_j/∂n^m_j
                  = w^{m+1}_{i,j} ḟ^m(n^m_j),   where ḟ^m(n) = ∂f^m(n)/∂n.

In matrix form:

∂n^{m+1}/∂n^m = W^{m+1} Ḟ^m(n^m),

where Ḟ^m(n^m) is the diagonal matrix diag( ḟ^m(n^m_1), ḟ^m(n^m_2), ..., ḟ^m(n^m_{S^m}) ).
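This identity is easy to check numerically; a sketch assuming a log-sigmoid layer (all helper names and the random values are ours):

```python
import numpy as np

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

rng = np.random.default_rng(0)
S_m, S_m1 = 3, 2                      # layer sizes S^m, S^{m+1}
W = rng.normal(size=(S_m1, S_m))      # W^{m+1}
b = rng.normal(size=S_m1)             # b^{m+1}
n_m = rng.normal(size=S_m)            # net input n^m

def n_next(n_m):
    """n^{m+1} = W^{m+1} f^m(n^m) + b^{m+1}, with f^m = logsig."""
    return W @ logsig(n_m) + b

# Analytic Jacobian: W^{m+1} Fdot^m(n^m), with f'(n) = a(1 - a) for logsig.
a = logsig(n_m)
J_analytic = W @ np.diag(a * (1.0 - a))

# Central finite-difference Jacobian for comparison.
eps = 1e-6
J_numeric = np.zeros((S_m1, S_m))
for j in range(S_m):
    d = np.zeros(S_m); d[j] = eps
    J_numeric[:, j] = (n_next(n_m + d) - n_next(n_m - d)) / (2 * eps)

print(np.max(np.abs(J_analytic - J_numeric)))  # close to zero
```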
Recurrence Relation
The recurrence relation for the sensitivity:

s^m = ∂F̂/∂n^m = ( ∂n^{m+1}/∂n^m )^T ∂F̂/∂n^{m+1} = Ḟ^m(n^m) (W^{m+1})^T s^{m+1}.

The sensitivities are initialized at the final layer:

s^M = -2 Ḟ^M(n^M) (t - a).
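The recurrence can be exercised on a small network; everything below is an illustrative sketch of ours (a 1-3-3-1 log-sigmoid/linear network with random weights and squared error F̂ = (t - a)^2), and the final lines check s^1 against a finite-difference gradient, using the fact that ∂F̂/∂b^1 = s^1 because n^1 = W^1 p + b^1:

```python
import numpy as np

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

rng = np.random.default_rng(1)

# A small 1-3-3-1 network: log-sigmoid hidden layers, linear output layer.
sizes = [1, 3, 3, 1]
Ws = [rng.normal(size=(sizes[m + 1], sizes[m])) for m in range(3)]
bs = [rng.normal(size=sizes[m + 1]) for m in range(3)]

p = np.array([0.5]); t = np.array([1.2])

# Forward pass, saving the layer outputs a^m.
a = [p]
for m, (W, b) in enumerate(zip(Ws, bs)):
    n = W @ a[-1] + b
    a.append(logsig(n) if m < 2 else n)   # last layer is linear

# Backward recurrence for the sensitivities.
# s^M = -2 Fdot^M(n^M)(t - a); Fdot^M = I for the linear output layer.
s = [None] * 3
s[2] = -2.0 * (t - a[-1])
for m in (1, 0):
    # s^m = Fdot^m(n^m) (W^{m+1})^T s^{m+1}; logsig derivative is (1 - a) a.
    fdot = (1.0 - a[m + 1]) * a[m + 1]
    s[m] = fdot * (Ws[m + 1].T @ s[m + 1])

# Check: s^1 should equal the gradient of F with respect to b^1.
def loss(b0):
    ai = p
    for m, (W, b) in enumerate(zip(Ws, [b0, bs[1], bs[2]])):
        n = W @ ai + b
        ai = logsig(n) if m < 2 else n
    return float(np.sum((t - ai) ** 2))

eps = 1e-6
g = np.array([(loss(bs[0] + eps * np.eye(3)[j]) -
               loss(bs[0] - eps * np.eye(3)[j])) / (2 * eps) for j in range(3)])
print(np.max(np.abs(g - s[0])))  # close to zero
```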
[Diagram: the full network from input p (R elements) through layers of sizes S^1, ..., S^m, S^{m+1}, ..., S^M; the output a^M is compared with the target to form the error e.]
1-2-1 Network Example
Initial Network Response:
[Plot: a^2 versus p over -2 <= p <= 2, shown against the sine-wave target.]
Forward Propagation
Initial input: a^0 = p = 1.
Propagating forward: a^1 = [0.321, 0.368]^T, a^2 = 0.446.

Error:

e = t - a = ( 1 + sin(π/4 · p) ) - a^2 = ( 1 + sin(π/4 · 1) ) - 0.446 = 1.261
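These numbers can be reproduced in a few lines. The initial weight and bias values below are assumptions (the common starting point for this textbook example; they are not shown on this slide), and the helper name `logsig` is ours:

```python
import numpy as np

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

# Assumed initial parameter values for the 1-2-1 example.
W1 = np.array([-0.27, -0.41]); b1 = np.array([-0.48, -0.13])
W2 = np.array([0.09, -0.17]);  b2 = 0.48

p = 1.0
a0 = p
a1 = logsig(W1 * a0 + b1)        # first layer: log-sigmoid
a2 = W2 @ a1 + b2                # second layer: linear
t = 1.0 + np.sin(np.pi / 4 * p)  # target g(p) = 1 + sin(pi/4 p)
e = t - a2
print(f"a1 = {a1}, a2 = {a2:.3f}, e = {e:.3f}")  # a2 = 0.446, e = 1.261
```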
Transfer Function Derivatives
ḟ^1(n) = d/dn ( 1/(1 + e^{-n}) ) = e^{-n} / (1 + e^{-n})^2 = ( 1 - 1/(1 + e^{-n}) ) ( 1/(1 + e^{-n}) ) = (1 - a^1)(a^1)

ḟ^2(n) = d/dn (n) = 1
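The convenience of the log-sigmoid is that its derivative can be written in terms of the output itself; a quick numerical check (helper names ours):

```python
import numpy as np

def logsig(n):
    return 1.0 / (1.0 + np.exp(-n))

n = np.linspace(-5, 5, 11)
a = logsig(n)

# Derivative via the identity f'(n) = (1 - a) * a ...
d_identity = (1.0 - a) * a

# ... versus a central finite difference of logsig itself.
eps = 1e-6
d_numeric = (logsig(n + eps) - logsig(n - eps)) / (2 * eps)

print(np.max(np.abs(d_identity - d_numeric)))  # close to zero
```

During training this means ḟ^1(n^1) can be computed from the already-available layer output a^1, with no extra exponentials.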
Illustrated Example 1
1-3-1 Network
g(p) = 1 + sin( iπ/4 · p )

[Plots: 1-3-1 network responses for i = 1, 2, 4, and 8, over -2 <= p <= 2.]
Illustrated Example 2
g(p) = 1 + sin( 6π/4 · p )

[Plots: responses of 1-2-1, 1-3-1, 1-4-1, and 1-5-1 networks, over -2 <= p <= 2.]
Generalization
[Plots: 1-2-1 and 1-9-1 network responses fitted to the same training points over -2 <= p <= 2. The 1-2-1 network generalizes well; the 1-9-1 network does not generalize well.]