Lecture Notes For Chapter 4 Artificial Neural Networks: Data Mining
Activation Function
X1 X2 X3 Y
1 0 0 -1
1 0 1 1
1 1 0 1
1 1 1 1
0 0 1 -1
0 1 0 -1
0 1 1 1
0 0 0 -1
$Y = \operatorname{sign}(0.3\,X_1 + 0.3\,X_2 + 0.3\,X_3 - 0.4)$

where $\operatorname{sign}(x) = \begin{cases} +1 & \text{if } x \ge 0 \\ -1 & \text{if } x < 0 \end{cases}$
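As a quick check, here is a small sketch that evaluates this model on the table above (Python is an assumption; the slides contain no code):

def sign(x):
    # ties (x == 0) go to +1, matching the convention the worked example uses
    return 1 if x >= 0 else -1

# (X1, X2, X3, Y) rows of the table above
data = [
    (1, 0, 0, -1), (1, 0, 1, 1), (1, 1, 0, 1), (1, 1, 1, 1),
    (0, 0, 1, -1), (0, 1, 0, -1), (0, 1, 1, 1), (0, 0, 0, -1),
]

for x1, x2, x3, y in data:
    y_hat = sign(0.3 * x1 + 0.3 * x2 + 0.3 * x3 - 0.4)
    assert y_hat == y  # the fixed weights classify every row correctly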
Perceptron Learning Rule
Intuition:
– Update weights based on the error: $e = y - \hat{y}$
– If $y = \hat{y}$, $e = 0$: no update needed
– If $y > \hat{y}$, $e = 2$: weights must be increased so that $\hat{y}$ will increase
– If $y < \hat{y}$, $e = -2$: weights must be decreased so that $\hat{y}$ will decrease
  (a code sketch of this update rule follows below)
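These bullets describe the perceptron weight update $w_j \leftarrow w_j + \lambda\,(y - \hat{y})\,x_j$. A minimal sketch (Python and the helper name perceptron_update are assumptions, not from the slides):

def perceptron_update(w, x, y, lam=0.1):
    # One perceptron step: w_j <- w_j + lam * (y - y_hat) * x_j,
    # where x includes the bias input x0 = 1 and lam is the learning rate.
    y_hat = 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1
    e = y - y_hat  # e is 0 (correct), +2 (increase w), or -2 (decrease w)
    return [wj + lam * e * xj for wj, xj in zip(w, x)]

Since $e = \pm 2$ on a mistake, each update moves weight $w_j$ by $\pm 2\lambda x_j$, which is exactly the step size seen in the trace on the next slide.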
Example of Perceptron Learning
Learning rate λ = 0.1; initial weights w = (0, 0, 0, 0).

Weight updates over the first epoch (iteration k shows the weights after presenting the k-th training instance):

X1 X2 X3   Y   Iteration   w0    w1    w2    w3
 1  0  0  -1       1      -0.2  -0.2   0     0
 1  0  1   1       2       0     0     0     0.2
 1  1  0   1       3       0     0     0     0.2
 1  1  1   1       4       0     0     0     0.2
 0  0  1  -1       5      -0.2   0     0     0
 0  1  0  -1       6      -0.2   0     0     0
 0  1  1   1       7       0     0     0.2   0.2
 0  0  0  -1       8      -0.2   0     0.2   0.2

Weight updates over all epochs (weights at the end of each epoch):

Epoch   w0    w1    w2    w3
  0      0     0     0     0
  1    -0.2    0     0.2   0.2
  2    -0.2    0     0.4   0.2
  3    -0.4    0     0.4   0.2
  4    -0.4    0.2   0.4   0.4
  5    -0.6    0.2   0.4   0.2
  6    -0.6    0.4   0.4   0.2
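A sketch of the full training loop (again assumed Python, reusing data and perceptron_update from the sketches above); with λ = 0.1, and ties $\mathbf{w} \cdot \mathbf{x} = 0$ predicted as +1 as in the trace, it reproduces the all-epochs table:

lam = 0.1
w = [0.0, 0.0, 0.0, 0.0]            # (w0, w1, w2, w3), initially zero
for epoch in range(1, 7):
    for x1, x2, x3, y in data:      # 'data' as defined in the earlier sketch
        w = perceptron_update(w, (1, x1, x2, x3), y, lam)
    print(epoch, [round(wj, 1) for wj in w])  # matches the all-epochs table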
Since $\hat{y}$ is based on a linear combination of the input variables, the decision boundary is linear. For example, the final weights above give the boundary $-0.6 + 0.4\,X_1 + 0.4\,X_2 + 0.2\,X_3 = 0$.
XOR Data
x1 x2   y
 0  0  -1
 1  0   1
 0  1   1
 1  1  -1
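No single line separates the +1 and -1 points of XOR, so the perceptron rule above cannot converge on this data. A short sketch (assumed Python, reusing perceptron_update) showing that the error count never reaches 0:

xor_data = [(0, 0, -1), (1, 0, 1), (0, 1, 1), (1, 1, -1)]
w = [0.0, 0.0, 0.0]                 # (w0, w1, w2)
for epoch in range(100):
    errors = 0
    for x1, x2, y in xor_data:
        x = (1, x1, x2)
        y_hat = 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0 else -1
        if y_hat != y:
            errors += 1
            w = perceptron_update(w, x, y)
    print(epoch, errors)            # errors stays >= 1 in every epoch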
Multi-layer neural networks are also referred to as
“feedforward neural networks”
Gradient descent weight update:

$$w_j \leftarrow w_j - \lambda\,\frac{\partial E(\mathbf{w})}{\partial w_j}$$

λ: learning rate
At output layer L:
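A standard form of the output-layer term, assuming squared-error loss $E = \frac{1}{2}\sum_i \bigl(a_i^{L} - y_i\bigr)^2$ and sigmoid activations (both are assumptions; these notes do not specify the loss or activation), is:

$$\delta_i^{L} = \bigl(a_i^{L} - y_i\bigr)\,a_i^{L}\,\bigl(1 - a_i^{L}\bigr), \qquad \frac{\partial E}{\partial w_{ij}^{L}} = \delta_i^{L}\,a_j^{L-1}$$

where $a^{L}$ is the output-layer activation, $y$ the target, and $a^{L-1}$ the previous layer's activation; each weight then moves by the gradient descent rule above.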