1 - Introduction

Artificial Neural Networks (ANNs) are computational models inspired by biological neural networks. ANNs can perform tasks such as pattern recognition and classification through a learning process. The document discusses the history of ANNs, biological neurons, pattern recognition tasks performed by ANNs, and advantages of ANNs such as adaptive learning, parallel processing, and fault tolerance. It also compares biological neural networks to ANNs and the McCulloch-Pitts model, one of the earliest and simplest models of ANNs.


Artificial Neural Networks

Dr. Amod Kumar
Professor
National Institute of Technical Teachers Training & Research
Sector 26, Chandigarh - 160 019
Introduction
Pattern & Data

• Humans understand patterns - numbers, text, pictures, sounds - and can recall them even in the presence of noise.
• Conventional machines do not recognise patterns; they only recognise data.
• Humans continuously learn from examples.
• This learning process is not understood well enough to be implemented algorithmically in a machine.
Pattern Recognition Tasks
• Pattern Association
• Pattern Classification
• Pattern Mapping
Pattern Association

 Set of Pattern or pattern pairs


 Autoassociative and Heteroassociative
 Memory Function
 Accretive behavior

Ex: Printed Char, fixed symbols


Network Behavior
Pattern Classification

• Individual patterns or pattern pairs are not stored
• Captures the relation among patterns of one class
• Test and training patterns of a class differ
• Accretive behavior

Ex: speech spectra of steady vowels, hand-printed characters
Pattern Mapping

• Captures the relation between input (I/P) and output (O/P) patterns
• Generalization rather than memorization
• Test and training patterns differ
• Interpolative behavior

Ex: speech spectra of vowels, hand-printed characters
A Brief History of ANN
The history of ANN can be divided into the following three eras.
ANN during the 1940s to 1960s
Some key developments of this era are as follows:
1943 − The concept of neural networks is generally traced to the work of physiologist Warren McCulloch and mathematician Walter Pitts, who modeled a simple neural network using electrical circuits to describe how neurons in the brain might work.
1949 − Donald Hebb's book, The Organization of Behavior, put forth the idea that the connection between two neurons is strengthened each time one repeatedly activates the other.
1956 − An associative memory network was introduced by Taylor.
1958 − A learning method for the McCulloch-Pitts neuron model, named the Perceptron, was invented by Rosenblatt.
1960 − Bernard Widrow and Marcian Hoff developed models called "ADALINE" and "MADALINE".
A Brief History of ANN
ANN during the 1960s to 1980s

Some key developments of this era are as follows:

1961 − Rosenblatt proposed the "backpropagation" scheme for multilayer networks, though the attempt was unsuccessful.
1964 − Taylor constructed a winner-take-all circuit with inhibitions among output units.
1969 − Minsky and Papert published Perceptrons, demonstrating the limitations of single-layer perceptrons (multilayer perceptrons, MLPs, could not yet be trained effectively).
1971 − Kohonen developed associative memories.
1976 − Stephen Grossberg and Gail Carpenter developed Adaptive Resonance Theory.
A Brief History of ANN
ANN from the 1980s till the Present

Some key developments of this era are as follows:

1982 − The major development was Hopfield's energy approach.
1985 − The Boltzmann machine was developed by Ackley, Hinton, and Sejnowski.
1986 − Rumelhart, Hinton, and Williams introduced the Generalised Delta Rule.
1988 − Kosko developed the Bidirectional Associative Memory (BAM) and also introduced the concept of fuzzy logic in ANN.

This historical review shows that significant progress has been made in the field. Neural-network-based chips are emerging and applications to complex problems are being developed; today is clearly a period of transition for neural network technology.
ANN Definition
 An artificial neural network (ANN) may be defined as an
information-processing model that is inspired by the way biological
nervous systems, such as the brain, process information.
 This model tries to replicate only the most basic functions of the
brain. An ANN is composed of a large number of highly
interconnected processing elements (neurons) working in unison to
solve specific problems.
 Artificial neural networks, like people, learn by example. An ANN is
configured for a specific application, such as pattern recognition or
data classification through a learning process. In biological
systems, learning involves adjustments to the synaptic connections
that exist between the neurons.
 Many neural networks now being designed are statistically quite
accurate, but they still leave their users with a bad taste as they
falter when it comes to solving problems accurately. They might be
85—90% accurate. There exist certain real-world applications
which tolerate that level of error.
Biological Neuron
Cell body diameter = 10-80 μm
Synaptic junction gap = 200 nm
Length of neuron = 0.01-1 m
Frequency of pulses = 1-100 per second
Propagation speed (along neuron) = 0.5-2 m/s
Propagation delay (across junction) = 0.5 ms
Threshold potential for firing = 10 mV
Total neurons in cortex = 10^11
Total synaptic connections = 10^15
How the brain works
• Each neuron receives inputs from other neurons
  - A few neurons also connect to receptors.
  - Cortical neurons use spikes to communicate.
• The effect of each input line on the neuron is controlled by a synaptic weight.
• The weights can be positive or negative.
• The synaptic weights adapt so that the whole network learns to perform useful computations: recognizing objects, understanding language, making plans, controlling the body.
• You have about 10^11 neurons, each with about 10^4 weights.
• A huge number of weights can affect the computation in a very short time - much better bandwidth than a workstation.
Modularity and the Brain
• Different bits of the cortex do different things.
• Local damage to the brain has specific effects.
• Specific tasks increase the blood flow to specific regions.
• But cortex looks pretty much the same all over.
• Early brain damage makes functions relocate.
• Cortex is made of general-purpose stuff that has the ability to turn into special-purpose hardware in response to experience.
• This gives rapid parallel computation plus flexibility.
• Conventional computers get flexibility by having stored sequential programs, but this requires very fast central processors to perform long sequential computations.
ANN versus BNN
Before looking at the differences between Artificial Neural Networks (ANN) and Biological Neural Networks (BNN), let us look at the similarities in terminology between the two.

The following table shows the correspondence between BNN and ANN:

Biological Neural Network (BNN)    Artificial Neural Network (ANN)
Soma                               Node
Dendrites                          Input
Synapse                            Weights or interconnections
Axon                               Output
The Brain vs. Computer

• Speed: the brain is slow; the computer is faster than a neuron.
• Processing: the brain uses parallel processing; the computer uses sequential processing.
• Size: the brain has a very large number of computing elements; the computer has a limited number.
• Complexity: the brain is highly complex; the computer is less complex.
• Storage: brain storage is adaptable (new information is added by adjusting connections); computer storage is replaceable (new data overwrites old).
• Fault tolerance: the brain is fault tolerant; the computer is not.
• Control mechanism: the brain uses local control; the computer uses central control.
Advantages of Neural Networks

• Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, are used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques.

• A trained neural network can be thought of as an "expert" in the particular category of information it has been given to analyze.

Other advantages of working with an ANN include:

1. Adaptive learning: an ANN is endowed with the ability to learn how to do tasks based on the data given for training or initial experience.
2. Self-organization: an ANN can create its own organization or representation of the information it receives during learning.
Advantages of Neural Networks
3. Real-time operation: ANN computations may be carried out in parallel. Special hardware devices are being designed and manufactured to take advantage of this capability of ANNs.
4. Fault tolerance via redundant information coding: partial destruction of a neural network leads to a corresponding degradation of performance. However, some network capabilities may be retained even after major network damage.
• Currently, neural networks cannot function as a user interface that translates spoken words into instructions for a machine, but someday they may have this skill; appliances and word processors could then simply be activated by voice.
• Touch screens and voice editing would replace the word processors of today, and spreadsheets and databases would gain a level of usability pleasing to everyone. For now, however, neural networks are only entering the marketplace in niche areas where their statistical accuracy is valuable.
McCulloch-Pitts Model of ANN
In the general model of ANN, each input x_i reaches the neuron through a connection with weight w_i, and the net input is the weighted sum

y_in = x1·w1 + x2·w2 + ... + xn·wn

The output is calculated by applying the activation function over the net input:

y = f(y_in)
McCulloch-Pitts Model of ANN
• Activation of an M-P neuron is binary, i.e. at any point of time the neuron either fires or it does not. If the net input y_in reaches the threshold θ, the neuron fires; otherwise it does not:

y = 1 if y_in ≥ θ
y = 0 if y_in < θ

(Binary step activation: the output jumps from 0 to 1 at y_in = θ.)

• The weights associated with the links may be excitatory (+ve) or inhibitory (-ve).
• All the excitatory weights are the same, and all the inhibitory weights are also the same.
• If there is even one active inhibitory input, the neuron will not fire.
• The neuron will fire when there is no inhibitory input and y_in ≥ θ due to the excitatory links.
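The firing rules above can be sketched in code. This is an illustrative sketch, not from the slides: the function name, the default weight of 1, and the treatment of inhibition as absolute blocking follow the rules just listed.

```python
def mp_neuron(excitatory, inhibitory, theta, w=1):
    """McCulloch-Pitts neuron with binary (0/1) inputs and output.

    All excitatory links share the same weight w; any active
    inhibitory input blocks firing outright (absolute inhibition).
    """
    if any(inhibitory):            # even one active inhibitory input: no firing
        return 0
    y_in = w * sum(excitatory)     # net input from excitatory links only
    return 1 if y_in >= theta else 0
```

For instance, `mp_neuron([1, 1], [], theta=2)` fires, while the same excitatory inputs with one active inhibitory line do not.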
Linear Separability
An ANN does not give an exact solution for a nonlinear problem; it provides an approximate solution to nonlinear problems. Linear separability is the concept wherein the separation of the input space into regions is based on whether the network response is positive or negative.
A decision line is drawn to separate the positive and negative responses. The decision line may also be called the decision-making line, decision-support line, or linearly separable line. The concept of linear separability is needed in order to classify patterns based upon their output responses.
Generally, the net input to the output unit is given as

y_in = b + Σ_i x_i·w_i

⇒ The linear separability of the network is based on the decision-boundary line. If there exist weights for which all training input vectors with a positive (correct) response lie on one side of the decision boundary and all vectors with a negative (−1) response lie on the other side, then we can conclude that the problem is "linearly separable".
Linear Separability
Consider a single-layer network with two inputs x1, x2, weights w1, w2 and bias b.

The net input for this network is given as

y_in = b + x1·w1 + x2·w2

The separating line, which forms the boundary between the values of x1 and x2 for which the net gives a positive response on one side and a negative response on the other, is given as

b + x1·w1 + x2·w2 = 0
Linear Separability

If the weight w2 is not equal to 0, then we get

x2 = −(w1/w2)·x1 − b/w2

Thus, the requirement for a positive response of the net is

b + x1·w1 + x2·w2 > 0

During the training process, the values of w1, w2 and b are determined so that the net will produce a positive (correct) response for the training data. If, on the other hand, a threshold value is used instead of a bias, then the condition for obtaining a positive response from the output unit is

net input received > θ (threshold)
y_in = x1·w1 + x2·w2 > θ
Linear Separability

The separating line equation will then be

x1·w1 + x2·w2 = θ,  i.e.  x2 = −(w1/w2)·x1 + θ/w2   (for w2 ≠ 0)

During the training process, the values of w1 and w2 have to be determined so that the net will give a correct response to the training data. For such a correct response, the line may pass close to the origin; in certain situations, even for a correct response, the separating line does not pass through the origin.
Linear Separability
Example 1
Ex: Implement AND function using McCulloch-Pitts neuron with binary
inputs.
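One standard realization (the specific weights and threshold here are a common textbook choice, not necessarily the slide's): both excitatory weights equal to 1 and threshold θ = 2, so the neuron fires only when both binary inputs are 1.

```python
def mp_and(x1, x2):
    """AND via a single McCulloch-Pitts neuron: w1 = w2 = 1, theta = 2."""
    y_in = 1 * x1 + 1 * x2          # net input
    return 1 if y_in >= 2 else 0    # fires only for (1, 1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", mp_and(x1, x2))
```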
Example 2
Ex: Implement XOR function using McCulloch-Pitts neuron with binary inputs.
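XOR is not linearly separable, so no single M-P neuron can compute it; the usual construction (sketched here with my own helper names) uses two hidden neurons, z1 = x1 AND NOT x2 and z2 = x2 AND NOT x1, whose outputs feed an OR unit. The inhibition is modeled as a −1 weight, which for these binary inputs has the same blocking effect as absolute inhibition.

```python
def fire(y_in, theta):
    """Binary step: 1 if the net input reaches the threshold."""
    return 1 if y_in >= theta else 0

def mp_xor(x1, x2):
    # Hidden layer: the -1 weight suppresses firing when the other input is on
    z1 = fire(1 * x1 + (-1) * x2, theta=1)   # x1 AND NOT x2
    z2 = fire(1 * x2 + (-1) * x1, theta=1)   # x2 AND NOT x1
    # Output layer: OR of the hidden outputs
    return fire(1 * z1 + 1 * z2, theta=1)
```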
Example 3
Ex: Using the linear separability concept, obtain the response for the OR function with bipolar inputs and targets.

Sol:
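A numeric check of one valid solution (the values w1 = w2 = 1, b = 1 are a standard choice, assumed here rather than taken from the slide): with bipolar inputs the net input y_in = b + x1·w1 + x2·w2 is positive for every pattern whose OR target is +1 and negative only for (−1, −1), so the line x2 = −x1 − 1 separates the two classes.

```python
# OR with bipolar inputs and targets; w1 = w2 = 1, b = 1 (one valid choice)
w1, w2, b = 1, 1, 1
patterns = [(1, 1, 1), (1, -1, 1), (-1, 1, 1), (-1, -1, -1)]  # (x1, x2, target)

for x1, x2, t in patterns:
    y_in = b + x1 * w1 + x2 * w2
    y = 1 if y_in > 0 else -1     # sign of the net input is the response
    print(f"x=({x1:2d},{x2:2d})  y_in={y_in:2d}  response={y:2d}  target={t:2d}")
```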