Introduction To Neural Networks
• The Brain
– Pattern Recognition
– Association
– Complexity
– Noise Tolerance
• The Machine
– Calculation
– Precision
– Logic
The contrast in architecture
[Diagram: a biological neuron, with the axon carrying the output, alongside its artificial counterpart (ANN)]
• The “building blocks” of neural networks are the neurons.
• In technical systems, we also refer to them as units or nodes.
• Basically, each neuron
– receives input from many other neurons,
– changes its internal state (activation) based on the current input, and
– sends one output signal to many other neurons, possibly including its input neurons (recurrent network).
The Architecture of ANN
The task: categorize each image as one of four classes: solid, vertical, diagonal, or horizontal.
Input neurons: each input neuron holds the brightness of one pixel (in the running example: .50, 0.0, -.75, and .75).
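As a minimal sketch of this idea, a small image can be flattened into one brightness value per input neuron (the brightness scale here is an assumption; the slides show values between -1 and 1):

```python
# A 2x2 image flattened into four input values, one per pixel.
# The pixel values are the ones used in the slides' running example.
image = [[0.50, 0.00],
         [-0.75, 0.75]]

# Flatten row by row: each pixel becomes one input neuron's value.
inputs = [pixel for row in image for pixel in row]
print(inputs)  # [0.5, 0.0, -0.75, 0.75]
```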
A neuron first sums all of its inputs: .50 + 0.00 + (-.75) + .75 = .50.
Weights: each input is multiplied by a weight before the sum. With all weights equal to 1.0, nothing changes: .50 × 1.0 + 0.00 × 1.0 + (-.75) × 1.0 + .75 × 1.0 = .50.
With different weights (-.2, 0.0, .8, -.5) the same inputs give a very different sum: .50 × (-.2) + 0.00 × 0.0 + (-.75) × .8 + .75 × (-.5) = -1.075.
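The weighted sum above can be checked directly in code:

```python
# Weighted sum from the slides: multiply each input by its weight,
# then add everything together.
inputs  = [0.50, 0.00, -0.75, 0.75]
weights = [-0.2, 0.0, 0.8, -0.5]

weighted_sum = sum(x * w for x, w in zip(inputs, weights))
print(round(weighted_sum, 3))  # -1.075
```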
Squash the result: the weighted sum is passed through a squashing function, which keeps the output in a fixed range (in the slide, a sum of 1.075 squashes to .746).
Sigmoid squashing function: an S-shaped curve. The weighted sum goes in on the horizontal axis, and the squashed version comes out on the vertical axis. No matter how large or small the input, the answer stays between -1 and 1.
[Plot: sigmoid squashing function, inputs from -2.0 to 2.0 on the horizontal axis, outputs from -1.0 to 1.0 on the vertical axis]
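A minimal sketch of such a squashing function. The slides' curve is bounded by -1 and 1, which matches the hyperbolic tangent (the exact function used is an assumption; "sigmoid" is used loosely here for any S-shaped curve):

```python
import math

def squash(x):
    # Hyperbolic tangent: an S-shaped ("sigmoid") curve whose output
    # always stays strictly between -1 and 1.
    return math.tanh(x)

print(squash(0.0))     # 0.0
print(squash(100.0))   # 1.0 (saturates near +1 for large inputs)
print(squash(-100.0))  # -1.0 (saturates near -1 for very negative inputs)
```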
Weighted sum-and-squash neuron: the inputs (.50, 0.0, -.75, .75) are weighted, summed, and squashed to produce a single output (.746).
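Putting the two steps together, a complete sum-and-squash neuron looks roughly like this (tanh as the squashing function is an assumption, matching the -1 to 1 curve in the slides):

```python
import math

def neuron(inputs, weights):
    # Weighted sum-and-squash neuron: multiply each input by its weight,
    # add everything up, then squash the sum into (-1, 1).
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return math.tanh(weighted_sum)

# The slides' inputs with all weights set to 1.0: the sum is .50,
# and the output is the squashed version of .50.
out = neuron([0.50, 0.00, -0.75, 0.75], [1.0, 1.0, 1.0, 1.0])
print(round(out, 3))
```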
Make lots of neurons, identical except for their weights. To keep our picture clear, weights will either be 1.0 (white), -1.0 (black), or 0.0 (missing).
Receptive fields get more complex
Repeat for additional layers
Receptive fields get still more complex
Repeat with a variation
Rectified linear units (ReLUs): if your number is positive, keep it. Otherwise you get a zero.
[Plot: ReLU, flat at zero for negative inputs, then rising linearly for positive inputs]
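The ReLU rule is a one-liner:

```python
def relu(x):
    # If the number is positive, keep it; otherwise return zero.
    return x if x > 0 else 0.0

print(relu(0.75))    # 0.75
print(relu(-1.075))  # 0.0
print(relu(0.0))     # 0.0
```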
[Network diagrams: layers of sum-and-squash and ReLU neurons feeding four output neurons, one each for solid, vertical, diagonal, and horizontal]
Errors: compare the network's answer for each category with the truth; the error is the size of the difference.

             truth   answer   error
solid         0.      .5       .5
vertical      0.      .75      .75
diagonal      0.     -.25      .25
horizontal    1.     -.75     1.75
total                         3.25
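The table above can be reproduced with the error taken as the absolute difference between truth and answer:

```python
# Desired outputs (truth) and the network's actual answers, per category.
truth  = {"solid": 0.0, "vertical": 0.0, "diagonal": 0.0, "horizontal": 1.0}
answer = {"solid": 0.5, "vertical": 0.75, "diagonal": -0.25, "horizontal": -0.75}

# Error per category: the size of the difference between truth and answer.
errors = {name: abs(truth[name] - answer[name]) for name in truth}
total = sum(errors.values())

print(errors)  # {'solid': 0.5, 'vertical': 0.75, 'diagonal': 0.25, 'horizontal': 1.75}
print(total)   # 3.25
```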
Learn all the weights: gradient descent. Evaluate the error at the original weight, then at a slightly lower weight and a slightly higher weight, and move the weight in whichever direction reduces the error.
[Plot: error as a function of one weight, marked at the lower weight, the original weight, and the higher weight]
Numerically calculating the gradient this way is expensive: every probe of a weight requires re-running the whole network.
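A minimal sketch of that numerical probing, which makes the cost visible: one extra full error evaluation per weight (the quadratic error function below is a hypothetical stand-in for the network's error):

```python
def numerical_gradient(error_fn, weights, h=1e-5):
    # Finite-difference gradient: for EVERY weight, nudge it by h and
    # re-run the full error computation. One extra evaluation per weight
    # is what makes this approach expensive for large networks.
    base = error_fn(weights)
    grad = []
    for i in range(len(weights)):
        nudged = list(weights)
        nudged[i] += h
        grad.append((error_fn(nudged) - base) / h)
    return grad

# Toy stand-in for the network's error, minimized at w = [1, -2].
error = lambda w: (w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2

g = numerical_gradient(error, [0.0, 0.0])
print(g)  # approximately [-2.0, 4.0]
```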
Calculate the gradient (slope) directly instead:

slope = change in error / change in weight

Starting at the original weight, move along the curve: a change in weight of +1 produces a change in error of -2, so the slope is -2.