Neural Networks
Outline
Neural Network Paradigm
Definition of ANN
Biological Neural Networks
• A biological neuron has three main components: dendrites, a soma (or cell body), and an axon.
• The dendrites receive signals from other neurons.
• The soma sums the incoming signals. When sufficient input is received, the cell fires; that is, it transmits a signal over its axon to other cells.
Artificial Neurons
[Figure: a physical neuron]
• Learning comes from experience: examples / training data.
• The strength of the connection between neurons is stored as a weight value for the specific connection.
• Learning the solution to a problem = changing the connection weights.
[Figure: an artificial neuron]
[Figure: the four basic components of a human biological neuron, alongside the components of a basic artificial neuron]
Model Of A Neuron
[Figure: inputs X1, X2, X3 with weights Wa, Wb, Wc feeding an activation function f(), which produces output Y]
A neural net consists of a large number of simple processing elements called neurons, units, cells, or nodes.
Each neuron has an internal state, called its activation or activity level, which is a function of the inputs it has received. Typically, a neuron sends its activation as a signal to several other neurons.
[Figure: inputs x1, x2 arriving on dendrites with weights w1, w2; the neuron emits output y along its axon]
Activation function (step threshold):
    y_in = x1·w1 + x2·w2
    f(y_in) = 1 if y_in ≥ θ, and f(y_in) = 0 otherwise
• A neuron receives inputs, determines the strength (weight) of each input, calculates the total weighted input, and compares this total with a threshold value θ.
• If the total weighted input is greater than or equal to the threshold, the neuron produces an output; if it is less than the threshold, no output is produced.
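The threshold rule above can be written as a short sketch (function and variable names are illustrative, not from the slides):

```python
def threshold_neuron(inputs, weights, theta):
    """Step-activation neuron: fires (returns 1) iff the weighted
    input sum y_in reaches the threshold theta."""
    y_in = sum(x * w for x, w in zip(inputs, weights))
    return 1 if y_in >= theta else 0

# Two inputs with weights w1 = w2 = 1 and threshold 2
print(threshold_neuron([1, 1], [1, 1], 2))  # 1: total 2 >= 2, the neuron fires
print(threshold_neuron([1, 0], [1, 1], 2))  # 0: total 1 < 2, no output
```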
The First Neural Networks
[Figure: a McCulloch-Pitts unit Y with excitatory inputs X1 and X2 (weight 2 each) and inhibitory input X3 (weight -1)]
• Neurons in a McCulloch-Pitts network are connected by directed, weighted paths.
• Each neuron has a fixed threshold. If the net input into the neuron is greater than the threshold, the neuron fires.
• The threshold is set such that any non-zero inhibitory input will prevent the neuron from firing.
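A minimal sketch of a McCulloch-Pitts unit with this absolute-inhibition rule (the helper name and argument layout are my own):

```python
def mcp_neuron(excitatory, inhibitory, weights, threshold):
    """McCulloch-Pitts unit: any non-zero inhibitory input vetoes firing;
    otherwise fire iff the weighted excitatory sum reaches the threshold."""
    if any(inhibitory):
        return 0
    net = sum(x * w for x, w in zip(excitatory, weights))
    return 1 if net >= threshold else 0

# X1, X2 excitatory with weight 2 each; X3 inhibitory; threshold 2
print(mcp_neuron([1, 1], [0], [2, 2], 2))  # 1: net = 4 >= 2
print(mcp_neuron([1, 1], [1], [2, 2], 2))  # 0: the inhibitory input vetoes firing
```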
The First Neural Networks: Examples
AND function: weights w1 = w2 = 1, threshold(Y) = 2

X1  X2 | Y
 1   1 | 1
 1   0 | 0
 0   1 | 0
 0   0 | 0
OR function: weights w1 = w2 = 2, threshold(Y) = 2

X1  X2 | Y
 1   1 | 1
 1   0 | 1
 0   1 | 1
 0   0 | 0
AND NOT function: weights w1 = 2, w2 = -1, threshold(Y) = 2

X1  X2 | Y
 1   1 | 0
 1   0 | 1
 0   1 | 0
 0   0 | 0
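All three gates can be checked with a single threshold unit (a sketch; the helper name is mine, the weights and thresholds are the ones on the slides):

```python
def fires(x1, x2, w1, w2, threshold=2):
    """Single McCulloch-Pitts unit with two weighted inputs and threshold 2."""
    return 1 if x1 * w1 + x2 * w2 >= threshold else 0

for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    and_y = fires(x1, x2, 1, 1)      # AND: weights 1, 1
    or_y = fires(x1, x2, 2, 2)       # OR: weights 2, 2
    andnot_y = fires(x1, x2, 2, -1)  # AND NOT: weights 2, -1
    print(x1, x2, and_y, or_y, andnot_y)
```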
Bias of a Neuron
• A bias value is added to the weighted sum ∑ wj·xj so that the decision boundary can be shifted away from the origin:
    v = ∑ wj·xj + b, where b is the bias
[Figure: parallel decision lines x1 - x2 = -1, x1 - x2 = 0, and x1 - x2 = 1 in the (x1, x2) plane, shifted by the bias]
Bias as extra input
[Figure: inputs x0 = +1, x1, …, xm (attribute values) with weights w0, w1, …, wm feeding a summing function and then an activation function f(·), which produces the output class y]
• The bias is treated as an extra input x0 = +1 whose weight is w0 = b:
    v = ∑ (j = 0 to m) wj·xj, with w0 = b
• Equivalently, with u = ∑ (j = 1 to m) wj·xj, the output is y = f(u + b).
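A quick numeric check of the bias-as-extra-input trick (a sketch; names and values are illustrative):

```python
def net_with_bias(x, w, b):
    """v = sum_j w_j x_j + b, with the bias added explicitly."""
    return sum(xi * wi for xi, wi in zip(x, w)) + b

def net_extra_input(x, w, b):
    """Same sum with the bias folded in as an extra input x0 = +1, w0 = b."""
    x_ext = [1.0] + list(x)
    w_ext = [b] + list(w)
    return sum(xi * wi for xi, wi in zip(x_ext, w_ext))

x, w, b = [2.0, -1.0], [0.5, 0.25], 0.5
print(net_with_bias(x, w, b), net_extra_input(x, w, b))  # both 1.25
```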
Examples of Activation Functions
• ReLU(x) = max(0, x)
• Tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x))
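Both functions can be sketched directly from their definitions:

```python
import math

def relu(x):
    """Rectified linear unit: passes positive inputs, zeroes out negatives."""
    return max(0.0, x)

def tanh(x):
    """Hyperbolic tangent via its exponential definition."""
    return (math.exp(x) - math.exp(-x)) / (math.exp(x) + math.exp(-x))

print(relu(-2.0), relu(3.0))  # 0.0 3.0
print(round(tanh(1.0), 4))    # 0.7616
```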
Multilayer Networks
• Linearly separable functions (such as AND and OR above) can be computed by a single neuron.
• Linearly inseparable functions (such as XOR) cannot.
• Solution? Add a hidden layer: a multilayer network.
A Multilayer Feed-Forward Neural Network
[Figure: input nodes (input record xi) connect through weights wij to hidden nodes Oj, which connect through weights wjk to output nodes Ok (the output class); the network is fully connected]
A Multilayered Feed-Forward Network
• OUTPUT LAYER: corresponds to the class attribute. There are as many output nodes as classes (values of the class attribute): Ok, k = 1, 2, …, #classes.
• The network is fully connected, i.e. each unit provides input to each unit in the next forward layer.
A Multilayered Feed-Forward Network: the XOR function
[Figure: X1 connects to hidden unit Z1 with weight 2 and to Z2 with weight -1; X2 connects to Z1 with weight -1 and to Z2 with weight 2; Z1 and Z2 each connect to Y with weight 2; each unit has threshold 2]

X1  X2 | Y
 1   1 | 0
 1   0 | 1
 0   1 | 1
 0   0 | 0
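A sketch of this two-layer XOR network, assuming (as in the earlier single-neuron examples) that every unit uses threshold 2:

```python
def step_unit(inputs, weights, threshold=2):
    """Threshold unit, as in the single-neuron examples above."""
    return 1 if sum(x * w for x, w in zip(inputs, weights)) >= threshold else 0

def xor(x1, x2):
    z1 = step_unit([x1, x2], [2, -1])   # Z1 = X1 AND NOT X2
    z2 = step_unit([x1, x2], [-1, 2])   # Z2 = X2 AND NOT X1
    return step_unit([z1, z2], [2, 2])  # Y = Z1 OR Z2

for a, b in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(a, b, xor(a, b))
```

Note the design: XOR is built from two AND NOT units and one OR unit, which is why the single-neuron gates are introduced first.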
Example: combining logistic models
[Figure: inputs (-1, 2, 5) feed two hidden sigmoid units S (outputs 0.5744 and 0.6454), which feed a single output sigmoid unit S (output 0.6179); the displayed connection weights are 0.4, -0.5, 0.1, -0.2, 0.3, 0.1 and 0.5. Layers: independent variables → weights → hidden layer → weights → dependent variable (the prediction)]
Sigmoid(x) = 1/(1 + e^(-x))
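A forward-pass sketch of this example. The assignment of the displayed weights to individual connections is inferred from the displayed activations (0.5744, 0.6454, 0.6179), so treat the wiring below as a plausible reconstruction rather than the slide's definitive layout:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

inputs = [-1.0, 2.0, 5.0]        # independent variables from the slide
w_hidden = [[0.4, 0.1, 0.1],     # input -> hidden unit 1 (inferred assignment)
            [-0.5, -0.2, 0.1]]   # input -> hidden unit 2 (inferred assignment)
w_out = [0.5, 0.3]               # hidden -> output

hidden = [sigmoid(sum(x * w for x, w in zip(inputs, ws))) for ws in w_hidden]
output = sigmoid(sum(h * w for h, w in zip(hidden, w_out)))

print([round(h, 4) for h in hidden])  # ≈ [0.5744, 0.6457]
print(round(output, 4))               # ≈ 0.618 (shown as 0.6179 on the slide)
```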
The catch in this example: there is no target value for the hidden units, so their weights cannot be fitted directly. This is the problem that back-propagation solves.
Classification by Back-Propagation
A dataset:

Fields          Class
1.4  2.7  1.9   0
3.8  3.4  3.2   0
6.4  2.8  1.7   1
4.1  0.1  0.2   0
etc.
Adjust weights based on error: feed the training instance (1.4, 2.7, 1.9), whose class is 0, through the network. The network outputs 0.8, so the error is 0.8 - 0 = 0.8, and the weights are adjusted to reduce it.
Feed the next instance through to get an output: for (6.4, 2.8, 1.7), whose class is 1, the network outputs 0.9, so the error is 1 - 0.9 = 0.1.
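A minimal sketch of this adjust-on-error loop using a single sigmoid unit trained with the delta rule, rather than the full multilayer network on the slides (the starting weights and learning rate are illustrative, not from the slides):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training data from the slides: three fields plus a class label
data = [([1.4, 2.7, 1.9], 0),
        ([3.8, 3.4, 3.2], 0),
        ([6.4, 2.8, 1.7], 1),
        ([4.1, 0.1, 0.2], 0)]

w = [0.0, 0.0, 0.0]   # illustrative starting weights
b = 0.0
lr = 0.1              # illustrative learning rate

for _ in range(2000):
    for x, target in data:
        out = sigmoid(sum(xi * wi for xi, wi in zip(x, w)) + b)
        err = target - out                 # the error drives the weight update
        grad = err * out * (1.0 - out)     # delta rule for a sigmoid unit
        w = [wi + lr * grad * xi for wi, xi in zip(w, x)]
        b += lr * grad

def score(x):
    return sigmoid(sum(xi * wi for xi, wi in zip(x, w)) + b)

# After training, the class-1 instance should score higher than the class-0 ones
print(round(score([6.4, 2.8, 1.7]), 3), round(score([1.4, 2.7, 1.9]), 3))
```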
The decision boundary perspective
• Initial random weights
• Present a training instance / adjust the weights (repeated for each training instance)
Recap
Neural Network Architectures
• Completely connected
• Feedforward (directed, acyclic)
• Recurrent (feedback connections)
Network Types