AI Mod4 Session 8 Best Fit Line & ANN

ARTIFICIAL NEURAL NETWORKS

NEURAL NETWORKS
Some NNs are models of biological neural networks and some are
not, but historically, much of the inspiration for the field of NNs came
from the desire to produce artificial systems capable of sophisticated,
perhaps intelligent, computations similar to those that the human
brain routinely performs, and thereby possibly to enhance our
understanding of the human brain.

Most NNs have some sort of training rule. In other words, NNs learn
from examples (as children learn to recognize dogs from examples of
dogs) and exhibit some capability for generalization beyond the
training data.
NEURAL NETWORKS LEARN FROM EXAMPLES
• No requirement of an explicit description of the problem.
• No need for a programmer.
• The neural computer adapts itself during a training period,
based on examples of similar problems even without a
desired solution to each problem. After sufficient training the
neural computer is able to relate the problem data to the
solutions, inputs to outputs, and it is then able to offer a
viable solution to a brand new problem.
• Able to generalize or to handle incomplete data.
APPLICATIONS OF NEURAL NETWORKS

Classification
• In marketing: consumer spending pattern classification
• In defence: radar and sonar image classification
• In agriculture & fishing: fruit and catch grading
• In medicine: ultrasound and electrocardiogram image classification, EEGs, medical diagnosis

Recognition and identification
• In general computing and telecommunications: speech, vision and handwriting recognition
• In finance: signature verification and bank note verification
APPLICATIONS OF NEURAL NETWORKS

Assessment
• In engineering: product inspection monitoring and control
• In defence: target tracking
• In security: motion detection, surveillance image analysis and fingerprint matching

Forecasting and prediction
• In finance: foreign exchange rate and stock market forecasting
• In agriculture: crop yield forecasting
• In marketing: sales forecasting
• In meteorology: weather prediction
INTRODUCTION TO NEURONS

The brain is a collection of about 10 billion interconnected neurons. Each neuron is a cell that uses biochemical reactions to receive, process and transmit information.

Each terminal button is connected to other neurons across a small gap called a synapse.

[Figure: neurons]
ANNS – THE BASICS

ANNs incorporate the two fundamental components of biological neural nets:

1. Neurones (nodes)
2. Synapses (weights)
NEURONE VS. NODE

[Figure: a neuron in the brain alongside a neuron/node in an artificial neural network]

Structure Of A Node

A squashing function limits the node's output.
SYNAPSE VS. WEIGHT
ELEMENTS OF NEURAL NETWORKS

[Figure: inputs p₁, p₂, p₃ with weights w₁, w₂, w₃ plus a bias b feed a summing junction and transfer function f to produce the output a]

a = f(p₁w₁ + p₂w₂ + p₃w₃ + b) = f(Σᵢ pᵢwᵢ + b)

Neural computing requires a number of neurons to be connected together into a neural network. Neurons are arranged in layers.
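As a minimal sketch, the same computation in Python (the sigmoid is an assumed choice for f, and the input, weight and bias values are illustrative):

import math

# A single neuron: weighted sum of inputs plus bias, passed through a
# squashing function f (sigmoid assumed here; the slide leaves f generic).
def neuron_output(inputs, weights, bias):
    net = sum(p * w for p, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-net))

# Illustrative values for three inputs, matching the diagram above.
a = neuron_output([1.0, 0.5, -0.3], [0.2, -0.4, 0.1], bias=0.05)
print(a)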
ACTIVATION FUNCTIONS

The activation function is generally non-linear. Linear functions are limited because the output is simply proportional to the input.
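For illustration, a few standard activation functions in Python (these particular choices are common examples, not ones prescribed by the slides):

import math

def linear(v):
    return v  # output simply proportional to input: limited, as noted above

def step(v):
    return 1.0 if v >= 0 else 0.0      # hard-limiting threshold

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))  # non-linear, squashes output into (0, 1)

def tanh(v):
    return math.tanh(v)                # non-linear, squashes output into (-1, 1)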
NETWORK ARCHITECTURE

Three different classes of network architectures:

1. Single-layer feed-forward
2. Multi-layer feed-forward
3. Recurrent

In the feed-forward classes, neurons are organized in acyclic layers. The architecture of a neural network is linked with the learning algorithm used to train it.
Single Layer Feed-forward

[Figure: an input layer of source nodes connected directly to an output layer of neurons]
Multi Layer Feed-forward

[Figure: a 3-4-2 network: an input layer of 3 nodes, a hidden layer of 4 neurons, and an output layer of 2 neurons]
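A minimal sketch of a forward pass through a 3-4-2 network of this kind (the weights, inputs and sigmoid activation are all placeholder assumptions):

import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# One feed-forward layer: each row of `weights` holds one neuron's weights.
def layer_forward(inputs, weights, biases):
    return [sigmoid(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

# Placeholder weights for a 3-4-2 network: 4 hidden neurons with 3 inputs
# each, then 2 output neurons fed by the 4 hidden activations.
W_hidden = [[0.1, -0.2, 0.3] for _ in range(4)]
b_hidden = [0.0] * 4
W_out = [[0.25, -0.5, 0.1, 0.4] for _ in range(2)]
b_out = [0.0] * 2

x = [1.0, 0.5, -1.0]
hidden = layer_forward(x, W_hidden, b_hidden)
output = layer_forward(hidden, W_out, b_out)   # two output values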
Recurrent Network

Recurrent network with hidden neuron(s): the unit delay operator z⁻¹ implies a dynamic system.

[Figure: a recurrent network whose outputs feed back through unit delay (z⁻¹) elements]
The Single Neuron Structure

[Figure: input signals x₁, x₂, …, xₘ are multiplied by synaptic weights w₁, w₂, …, wₘ and combined with a bias b by a summing function to give the local field v, which the activation function φ(·) maps to the output y]
The Single Neuron Structure

• The neuron is the basic information processing unit of a NN. It consists of:

1. A set of synapses or connecting links, each link characterized by a weight: w₁, w₂, …, wₘ

2. An adder function (linear combiner) which computes the weighted sum of the inputs:
u = Σⱼ wⱼxⱼ, with j = 1, …, m

3. An activation function (squashing function) for limiting the amplitude of the output of the neuron:
y = φ(u + b)
BIAS AS EXTRA INPUT

• Bias is an external parameter of the neuron. It can be modeled by adding an extra input x₀ = +1 with weight w₀ = b.

[Figure: the single neuron structure redrawn with the bias folded in as the extra input x₀ = +1 weighted by w₀]
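A quick check of this equivalence in Python, with illustrative values:

# Folding the bias in as weight w0 on a constant input x0 = +1
# yields exactly the same local field v.
weights = [0.4, -0.3]
bias = 0.1
x = [0.8, 0.2]

v_explicit = sum(w * xi for w, xi in zip(weights, x)) + bias
v_extra_input = sum(w * xi for w, xi in zip([bias] + weights, [1.0] + x))
assert abs(v_explicit - v_extra_input) < 1e-12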
Feedforward Neural Networks

The basic structure of a feedforward neural network
FEED-FORWARD NETS

Information flow is unidirectional:
• Data is presented to the input layer
• Passed on to the hidden layer
• Passed on to the output layer

• Information is distributed
• Information processing is parallel
• Internal representation (interpretation) of data

Feeding data through the net:
(1 × 0.25) + (0.5 × (-1.5)) = 0.25 + (-0.75) = -0.5

Squashing: 1 / (1 + e^0.5) ≈ 0.3775
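The same calculation in Python, assuming the squashing function is the logistic sigmoid (consistent with the 0.3775 result above):

import math

net = (1 * 0.25) + (0.5 * -1.5)      # 0.25 + (-0.75) = -0.5
out = 1.0 / (1.0 + math.exp(-net))   # 1 / (1 + e^0.5)
print(round(out, 4))                 # 0.3775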
BACKPROPAGATION ALGORITHM

1. A set of examples for training the network is assembled. Each case consists of a problem statement (which represents the input into the network) and the corresponding solution (which represents the desired output from the network).

2. The input data is entered into the network via the input layer.

3. Each neuron in the network processes the input data, with the resultant values steadily "percolating" through the network, layer by layer, until a result is generated by the output layer.

4. The actual output of the network is compared to the expected output for that particular input. This results in an error value.

5. The connection weights in the network are gradually adjusted, working backwards from the output layer, through the hidden layer, and to the input layer, until the correct output is produced. Fine-tuning the weights in this way has the effect of teaching the network how to produce the correct output for a particular input, i.e. the network learns.
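As a rough sketch of step 5, here is one gradient step for a single sigmoid output neuron (the delta rule); a full backpropagation pass repeats this layer by layer, and the learning rate and values here are assumptions:

import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# Illustrative input, weights, target and learning rate.
x = [1.0, 0.5]
w = [0.25, -1.5]
b = 0.0
target = 1.0
lr = 0.1

y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
delta = (target - y) * y * (1 - y)                  # error term for a sigmoid unit
w = [wi + lr * delta * xi for wi, xi in zip(w, x)]  # adjust weights
b = b + lr * delta                                  # adjust bias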
FACE RECOGNITION
Handwritten digit recognition
EXAMPLE: VOICE RECOGNITION

Task: learn to discriminate between two different voices saying "Hello"

Data
• Sources: Steve Simpson, David Raubenheimer
• Format: frequency distribution (60 bins)
• Analogy: cochlea

Network architecture
• Feed-forward network
• 60 inputs (one for each frequency bin)
• 6 hidden neurons
• 2 outputs (0-1 for "Steve", 1-0 for "David")
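A sketch of how such a 60-6-2 network might be initialised in Python (small random starting weights are an assumed convention; biases are omitted for brevity):

import random

# Build one layer's weight matrix with small random initial values.
def make_layer(n_inputs, n_neurons):
    return [[random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
            for _ in range(n_neurons)]

hidden_weights = make_layer(60, 6)  # 60 frequency bins -> 6 hidden neurons
output_weights = make_layer(6, 2)   # 6 hidden -> 2 outputs ("Steve", "David")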
Presenting The Data

[Figure: the frequency distributions for "Steve" and "David" presented as input patterns]

Presenting The Data (Untrained Network)

Steve: outputs 0.43 and 0.26
David: outputs 0.73 and 0.55
Calculate Error

Steve: |0.43 – 0| = 0.43
       |0.26 – 1| = 0.74

David: |0.73 – 1| = 0.27
       |0.55 – 0| = 0.55
Backprop Error And Adjust Weights

Steve: |0.43 – 0| = 0.43
       |0.26 – 1| = 0.74
       total error: 1.17

David: |0.73 – 1| = 0.27
       |0.55 – 0| = 0.55
       total error: 0.82
Repeat process (sweep) for all training pairs:
• Present data
• Calculate error
• Backpropagate error
• Adjust weights

Repeat process multiple times.