Characteristics of Artificial Neural Networks


Introduction to AI

Neural Networks
G5AIAI Neural Networks

Neural Networks
• McCulloch & Pitts (1943) are generally recognised as the designers of the first neural network
• Many of their ideas are still used today (e.g. many simple units combining to give increased computational power, and the idea of a threshold)

Neural Networks

• Hebb (1949) developed the first learning rule (on the premise that if two neurons were active at the same time, the strength of the connection between them should be increased)

Neural Networks
• During the 50’s and 60’s many researchers worked on the perceptron amidst great excitement
• 1969 saw the death of neural network research for about 15 years (Minsky & Papert)
• Only in the mid 80’s (Parker and LeCun) was interest revived (in fact Werbos had discovered the backpropagation algorithm in 1974)


Neural Networks

• We are born with about 100 billion neurons
• A neuron may connect to as many as 100,000 other neurons

Neural Networks
• Signals “move” between neurons via electrochemical signals
• The synapses release a chemical transmitter; the sum of these can cause a threshold to be reached, causing the neuron to “fire”
• Synapses can be inhibitory or excitatory

The First Neural Networks

McCulloch and Pitts produced the first neural network in 1943

Many of the principles can still be seen in the neural networks of today

The First Neural Networks

[Figure: a McCulloch-Pitts unit Y with three inputs: X1 (weight 2), X2 (weight 2) and X3 (weight -1)]

The activation of a neuron is binary. That is, the neuron either fires (activation of one) or does not fire (activation of zero).

The First Neural Networks

[Figure: the same McCulloch-Pitts unit Y]

For the network shown here, the activation function for unit Y is

    f(y_in) = 1 if y_in >= θ, else 0

where y_in is the total input signal received and θ is the threshold for Y
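As a concrete illustration of this firing rule, here is a minimal Python sketch of a McCulloch-Pitts unit (the function and variable names are illustrative, not from the slides):

    def mp_unit(inputs, weights, threshold):
        """Fire (return 1) if the weighted sum of the inputs reaches the threshold, else 0."""
        y_in = sum(w * x for w, x in zip(weights, inputs))
        return 1 if y_in >= threshold else 0

    # The AND unit from a later slide: both inputs weighted 1, threshold 2.
    print(mp_unit([1, 1], [1, 1], 2))   # 1 - fires only when both inputs are active
    print(mp_unit([1, 0], [1, 1], 2))   # 0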

The First Neural Networks

Neurons in a McCulloch-Pitts network are connected by directed, weighted paths

The First Neural Networks

If the weight on a path is positive the path is excitatory, otherwise it is inhibitory

The First Neural Networks

All excitatory connections into a particular neuron have the same weight, although differently weighted connections can be input to different neurons

The First Neural Networks

Each neuron has a fixed threshold. If the net input into the neuron is greater than or equal to the threshold, the neuron fires

The First Neural Networks

The threshold is set such that any non-zero inhibitory input will prevent the neuron from firing

The First Neural Networks

It takes one time step for a signal to pass over one connection

The First Neural Networks

AND Function

[Figure: X1 and X2 each connected to Y with weight 1]

X1  X2  Y
 1   1  1
 1   0  0
 0   1  0
 0   0  0

Threshold(Y) = 2

The First Neural Networks

OR Function

[Figure: X1 and X2 each connected to Y with weight 2]

X1  X2  Y
 1   1  1
 1   0  1
 0   1  1
 0   0  0

Threshold(Y) = 2

The First Neural Networks

AND NOT Function

[Figure: X1 connected to Y with weight 2, X2 connected to Y with weight -1]

X1  X2  Y
 1   1  0
 1   0  1
 0   1  0
 0   0  0

Threshold(Y) = 2

The First Neural Networks

XOR Function

[Figure: a two-layer network. X1 feeds Z1 with weight 2 and Z2 with weight -1; X2 feeds Z2 with weight 2 and Z1 with weight -1; Z1 and Z2 each feed Y with weight 2]

X1  X2  Y
 1   1  0
 1   0  1
 0   1  1
 0   0  0

X1 XOR X2 = (X1 AND NOT X2) OR (X2 AND NOT X1)
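Assuming the mp_unit helper sketched earlier and a threshold of 2 on every unit (as in the AND, OR and AND NOT slides), the decomposition can be checked directly:

    def mp_xor(x1, x2):
        # Z1 = X1 AND NOT X2, Z2 = X2 AND NOT X1, Y = Z1 OR Z2
        z1 = mp_unit([x1, x2], [2, -1], 2)
        z2 = mp_unit([x1, x2], [-1, 2], 2)
        return mp_unit([z1, z2], [2, 2], 2)

    for x1 in (0, 1):
        for x2 in (0, 1):
            print(x1, x2, mp_xor(x1, x2))   # matches the XOR truth table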

Modelling a Neuron

    in_i = Σj Wj,i aj

• aj : activation value of unit j
• Wj,i : weight on the link from unit j to unit i
• in_i : weighted sum of the inputs to unit i
• ai : activation value of unit i
• g : activation function

Activation Functions

• Step_t(x) = 1 if x >= t, else 0
• Sign(x) = +1 if x >= 0, else -1
• Sigmoid(x) = 1/(1 + e^-x)
• Identity function
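A minimal Python sketch of these activation functions, together with the unit model from the earlier Modelling a Neuron slide (the helper names are my own):

    import math

    def step(x, t=0.0):   # Step_t(x)
        return 1 if x >= t else 0

    def sign(x):          # Sign(x)
        return 1 if x >= 0 else -1

    def sigmoid(x):       # Sigmoid(x) = 1/(1 + e^-x)
        return 1.0 / (1.0 + math.exp(-x))

    def unit_output(activations, weights, g=step):
        """a_i = g(in_i), where in_i = sum_j W_j,i * a_j."""
        in_i = sum(w * a for w, a in zip(weights, activations))
        return g(in_i)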

Simple Networks

          AND          OR           NOT
Input 1   0  0  1  1   0  0  1  1   0  1
Input 2   0  1  0  1   0  1  0  1
Output    0  0  0  1   0  1  1  1   1  0

Simple Networks

[Figure: a perceptron computing AND: a bias input of -1 with weight W = 1.5, inputs x and y with weight W = 1, threshold t = 0.0]
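Assuming the weight on y is also 1 (only one W = 1 label survives in the figure), this unit reproduces the AND outputs, using the step function sketched above:

    for x in (0, 1):
        for y in (0, 1):
            out = step(-1 * 1.5 + x * 1 + y * 1)   # threshold t = 0.0
            print(x, y, out)                       # 1 only when x = y = 1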

Perceptron
• Synonym for single-layer, feed-forward network
• First studied in the 50’s
• Other networks were known about, but the perceptron was the only one capable of learning, and thus all research was concentrated in this area

Perceptron
• A single weight only affects one output, so we can restrict our investigations to a model as shown on the right
• The notation can then be simpler, i.e.

    O = Step_0(Σj Wj Ij)

What can perceptrons represent?

          AND          XOR
Input 1   0  0  1  1   0  0  1  1
Input 2   0  1  0  1   0  1  0  1
Output    0  0  0  1   0  1  1  0

What can perceptrons represent?


[Figure: the four input points (0,0), (0,1), (1,0), (1,1) plotted for AND and for XOR; a single straight line can separate the 1-outputs from the 0-outputs for AND, but not for XOR]

• Functions which can be separated in this way are called linearly separable
• Only linearly separable functions can be represented by a perceptron

What can perceptrons represent?

Linear separability is also possible in more than 3 dimensions, but it is harder to visualise

Training a perceptron

Aim: train a perceptron to compute the AND function

          AND
Input 1   0  0  1  1
Input 2   0  1  0  1
Output    0  0  0  1

Training a perceptron

[Figure: a perceptron with a bias input of -1 (weight W = 0.3), input x (weight W = 0.5) and input y (weight W = -0.4), threshold t = 0.0]

I1   I2   I3   Summation                                  Output
-1    0    0   (-1*0.3) + (0*0.5) + (0*-0.4) = -0.3        0
-1    0    1   (-1*0.3) + (0*0.5) + (1*-0.4) = -0.7        0
-1    1    0   (-1*0.3) + (1*0.5) + (0*-0.4) =  0.2        1
-1    1    1   (-1*0.3) + (1*0.5) + (1*-0.4) = -0.2        0
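These sums can be reproduced with the step function sketched earlier (the bias input I1 is fixed at -1):

    weights = [0.3, 0.5, -0.4]              # initial weights from the figure
    for i2 in (0, 1):
        for i3 in (0, 1):
            inputs = [-1, i2, i3]
            s = sum(w * i for w, i in zip(weights, inputs))
            print(inputs, s, step(s))       # matches the Summation and Output columns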

Learning
While epoch produces an error
    Present network with next inputs from epoch
    Err = T - O
    If Err <> 0 then
        Wj = Wj + LR * Ij * Err
    End If
End While

Epoch: presentation of the entire training set to the neural network. In the case of the AND function an epoch consists of four sets of inputs being presented to the network (i.e. [0,0], [0,1], [1,0], [1,1]).

Training value, T: when we are training a network we not only present it with the input but also with the value that we require the network to produce. For example, if we present the network with [1,1] for the AND function, the training value will be 1.

Error, Err: the amount by which the value output by the network differs from the training value. For example, if we required the network to output 0 and it output a 1, then Err = -1.

Output from neuron, O: the output value from the neuron.

Ij: the inputs being presented to the neuron.

Wj: the weight from input neuron (Ij) to the output neuron.

LR: the learning rate. This dictates how quickly the network converges. It is set by experimentation; it is typically 0.1.
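Putting the pseudocode and these definitions together, a minimal Python sketch of the training loop for the AND function (the helper name train_and is my own; step is from the earlier sketch, LR = 0.1 as suggested):

    def train_and(lr=0.1):
        data = [([-1, 0, 0], 0),            # each input vector starts with the fixed bias input -1
                ([-1, 0, 1], 0),
                ([-1, 1, 0], 0),
                ([-1, 1, 1], 1)]
        w = [0.3, 0.5, -0.4]                # initial weights from the earlier figure
        while True:
            error_in_epoch = False
            for inputs, t in data:          # one epoch = one pass over the training set
                o = step(sum(wj * ij for wj, ij in zip(w, inputs)))
                err = t - o
                if err != 0:
                    w = [wj + lr * ij * err for wj, ij in zip(w, inputs)]
                    error_in_epoch = True
            if not error_in_epoch:          # stop once an epoch produces no error
                return w

    print(train_and())                      # a weight vector that reproduces the AND function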

Learning

[Figure: the decision line in the (I1, I2) plane after the first epoch and at convergence, shown against the four training points (0,0), (0,1), (1,0), (1,1). Note: the line crosses the I1 axis at W0/W1 and the I2 axis at W0/W2.]
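Assuming the train_and sketch above, the axis crossings of the final decision line can be read off the converged weights:

    w0, w1, w2 = train_and()
    print(w0 / w1, w0 / w2)   # where the line crosses the I1 and I2 axes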

And Finally…

“If the brain were so simple that we could understand it, then we’d be so simple that we couldn’t”
                                                                            Lyall Watson
