Presentation On Neural Networks
Basics Of Neural Networks
• A neural network is a connectionist model that simulates the biophysical information processing occurring in the nervous system.
• It can also be defined as an interconnected assembly of simple processing elements, units, or nodes, whose functionality is loosely based on the animal neuron.
• It is also a cognitive information-processing structure based on models of brain function. In a more formal engineering context, it is a highly parallel dynamical system with the topology of a directed graph that can carry out information processing by means of its state response to continuous or initial input.
Benefits Of Neural Networks
• Non-linearity
• Input-output Mapping
• Adaptivity
• Evidential Response
• Contextual Information
• Fault Tolerance
• RESEARCH THOUGHT
• 1. Neural networks are highly parallel structures, which is true because the human brain functions in the same way.
• 2. But apart from being parallel, the brain exhibits priority-based parallelism.
• 3. Moreover, there is interaction between these parallel processes, and in the end one process may dominate while the others vanish or survive with a much lower priority.
Facts
• 1. Knowledge is acquired by the network from its environment through a learning process.
• 2. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
LEARNING IN NEURAL NETWORKS
RESEARCH THOUGHT
• 1. Normally the learning process is an iterative one in which neural networks consistently learn from their environment.
• 2. Neural networks must try for self-eradication of error by heuristically moving towards the goal state.
• 3. This means there should be a combination of heuristic knowledge and previous data to obtain the final result.
MEMORY BASED LEARNING
• Past experiences are explicitly stored in a large memory of correctly classified input-output examples.
• Xi is the input vector
• di denotes the desired response
• c1 and c2 are the two example classes
• New inputs are classified by retrieving and analyzing the stored training data and assigning them to class c1 or c2.
RESEARCH THOUGHT
• 1. Since memory-based learning is only a classification process, it is inaccurate because it does not account for long-term and short-term memory.
• 2. It should be a layered process where information is filtered from the forward layers to the backward layers.
• 3. The forward layers are short-term memory layers, whereas the back layers are long-term memory layers.
• 4. The neural network must operate by considering all the layers, giving short-term memory layers more priority than long-term memory layers.
Memory Based Learning (Working Example)
• In memory-based learning there is classification of input-output examples {(Xi, di)}, i = 1 to N, where Xi is the input vector and di is the desired response.
• A working example of memory-based learning is car movement. We classify all cases into two parts: 1 (car speeds up) and 0 (car slows down). The input signals are X1 -> road conditions, X2 -> traffic signal, X3 -> fuel efficiency, X4 -> road ascent.
• Now when a set of inputs is applied to X1, X2, X3, X4, the response is either speed up or slow down. Under memory-based learning all these cases can be stored during learning, and new cases can then be classified as either a speed-up or a slow-down of the car depending on the input conditions, as sketched below.
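The following is a minimal Python sketch of this car classifier, assuming a nearest-neighbour rule over the stored cases; the feature encodings and example vectors are illustrative assumptions, not part of the original slides.

```python
import math

# Stored input-output examples {(Xi, di)}: each Xi encodes
# (road condition, traffic signal, fuel efficiency, road ascent)
# as numbers in [0, 1]; di = 1 means "speed up", 0 means "slow down".
# The particular vectors below are made-up illustrations.
training_data = [
    ((0.9, 1.0, 0.8, 0.1), 1),  # good road, green signal -> speed up
    ((0.2, 0.0, 0.5, 0.7), 0),  # poor road, red signal, steep ascent -> slow down
    ((0.8, 1.0, 0.4, 0.3), 1),
    ((0.3, 0.0, 0.9, 0.8), 0),
]

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(x):
    """Memory-based learning: return the stored response of the
    example nearest to the new input vector x."""
    _, d = min(((euclidean(x, xi), di) for xi, di in training_data),
               key=lambda pair: pair[0])
    return d

print(classify((0.85, 1.0, 0.6, 0.2)))  # expected: 1 (speed up)
```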
HEBBIAN LEARNING
• When an axon of cell A is near enough to excite cell B and repeatedly takes part in firing it, some growth process or metabolic change takes place in one or both cells such that the efficiency of A, as one of the cells firing B, is increased.
• This means that when the two neurons on either side of a synapse are activated simultaneously, the strength of that synapse increases.
RESEARCH THOUGHT
• Hebbian learning should be classified into two parts:
• 1. A process in which there is a gradual shift toward strengthening of the synapse if the total input synaptic weights are below a threshold value.
• 2. If the input synaptic weights are above the threshold value, there is a fast shift in a single iteration.
Hebbian Learning (Working Example)
• Hebbian learning in mathematical terms can be expressed by considering a synaptic weight Wkj of neuron k, with pre-synaptic and post-synaptic signals denoted by Xj and Yk. If the pre-synaptic and post-synaptic signals are synchronous, then the weight increases. The adjustment applied to the synaptic weight Wkj at time step n is
• ΔWkj(n) = F(Yk(n), Xj(n))
• A working example is the introduction of a traffic signal: X1 -> Red, X2 -> Yellow, X3 -> Green. We can observe that the initial slow-down at a red signal, Y1(n), and the initial start-up at a green signal, Y2(n), are slow, but with time the response becomes stronger and faster. A sketch of the update rule follows below.
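A minimal sketch of the update above, assuming the simplest common choice of F, the activity product rule ΔWkj(n) = η·Yk(n)·Xj(n); the learning rate and signal values are illustrative.

```python
# Hebbian weight update using the activity product rule:
# dW = eta * Yk * Xj (a simple instance of F(Yk, Xj)).
eta = 0.1  # learning rate (illustrative value)

def hebbian_update(w_kj, x_j, y_k):
    """Strengthen the synaptic weight when the pre-synaptic signal
    x_j and post-synaptic signal y_k are active together."""
    return w_kj + eta * y_k * x_j

# Traffic-signal example: repeatedly pairing the red-signal input
# with the slow-down response strengthens the connection over time.
w = 0.0
for step in range(5):
    w = hebbian_update(w, x_j=1.0, y_k=1.0)  # both signals active
    print(f"step {step}: w = {w:.2f}")
```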
COMPETITIVE LEARNING
RESEARCH THOUGHT
• Boltzmann learning puts the neurons in only two states, +1 and -1, whereas they should actually take a number of states depending on the set of inputs and the previous states.
BOLTZMANN LEARNING
Boltzmann Learning (Working Example)
• A Boltzmann machine operates on the energy generated when a signal moves from neuron j to neuron k. This process continues till the system reaches thermal equilibrium or the desired state.
• A working example can be a thermostat which keeps a check on the heat energy released by various processes in a factory. The Boltzmann system can gradually learn the amount of heat energy released during all processes and then learn to adjust the weights to maintain an optimal temperature. In fact it can automatically guide temperature maintenance all the time.
• We can easily calculate the probability of a state change as P(change) = 1 / (1 + exp(-ΔE/Ti)), where ΔE is the change in energy and Ti is the operating temperature; a sketch follows below.
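A minimal sketch of the state-change rule above; the energy values and temperature schedule are illustrative assumptions.

```python
import math
import random

def p_change(delta_e, temperature):
    """Probability of a state change, P(change) = 1 / (1 + exp(-dE/T))."""
    return 1.0 / (1.0 + math.exp(-delta_e / temperature))

# Thermostat illustration (values invented): positive delta_e means the
# change moves the system towards lower overall energy. At high
# temperature, changes are nearly random; as the temperature is lowered,
# the machine settles towards the desired low-energy state
# (thermal equilibrium).
for t in (10.0, 1.0, 0.1):
    p = p_change(delta_e=2.0, temperature=t)
    print(f"T={t:>4}: P(change)={p:.3f}, changed={random.random() < p}")
```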
RESEARCH THOUGHT
• 1. Supervised learning should be object-based, in which we try to learn about an object from the environment.
• 2. This means there is a need to first learn the object's properties and then the object's methods.
• 3. Once the object has been learned, the neural network may simulate it for a set of inputs.
SUPERVISED LEARNING
Supervised Learning (Working Example)
• An aircraft control system is a good example of supervised learning, because the aircraft navigation system faces new environmental conditions all the time. These conditions are fed to the GPS, ground support, and other devices, which teach the system to deal with them.
• The system can use error-correction learning to stay on course and learn to manage itself. When the system has fully learned to automate itself, it can be put onto a pilotless vehicle for self-navigation with minimal outside help. A sketch of error-correction learning follows below.
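A minimal sketch of the error-correction (delta-rule) learning named above; the aircraft-style inputs and desired responses are invented for illustration.

```python
# Error-correction (delta-rule) supervised learning: a teacher supplies
# the desired response d for each input x, and the weights move in
# proportion to the error e = d - y.
eta = 0.05
weights = [0.0, 0.0, 0.0]

# (crosswind, thrust, drift-sensor) readings -> desired heading correction
# (all values invented for illustration)
samples = [
    ((1.0, 0.2, 0.5), 0.8),
    ((0.1, 0.9, 0.2), 0.3),
    ((0.7, 0.5, 0.9), 0.9),
]

for epoch in range(100):
    for x, d in samples:
        y = sum(w * xi for w, xi in zip(weights, x))   # actual response
        e = d - y                                       # teacher's error signal
        weights = [w + eta * e * xi for w, xi in zip(weights, x)]

print("learned weights:", [round(w, 3) for w in weights])
```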
UNSUPERVISED LEARNING
• In unsupervised learning there is no external teacher. Rather, provision is made for a task-independent measure of the quality of the representation that the network is required to learn.
• Various stochastic methods, such as standard-deviation analysis and regression, are used to obtain useful information from the data.
RESEARCH THOUGHT
• Since no supervision is required and all data is collected and then analyzed, it would be useful to first create a broad classification of the environment.
• Once the environment has been classified, data from the environment can be further classified to make the collected data more meaningful.
Unsupervised Learning (Working Example)
• In unsupervised learning no error-correction support is applied. The data is statistically classified into one or more classes.
• Unsupervised learning can be used in a weather forecast system. The data can be collected in the form of variable values T (temperature conditions), C (cloud formations), H (humidity readings in and around a place), and A (air flow readings).
• The system can then use unsupervised learning to learn to make a correct weather forecast as the output of the neural network; a clustering sketch follows below.
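A minimal sketch of such teacher-free classification, assuming a simple k-means clustering of the (T, C, H, A) readings; the observations are invented for illustration.

```python
import random

# Each observation is (T, C, H, A): temperature, cloud cover, humidity,
# air flow, scaled to [0, 1]. The readings are made-up illustrations.
observations = [
    (0.9, 0.1, 0.3, 0.4), (0.8, 0.2, 0.2, 0.5),   # hot, clear days
    (0.4, 0.9, 0.8, 0.7), (0.3, 0.8, 0.9, 0.6),   # cool, cloudy, humid days
]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(data, k=2, iters=20):
    """Simple k-means: discover k weather classes with no teacher."""
    centers = random.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in data:
            clusters[min(range(k), key=lambda i: dist2(p, centers[i]))].append(p)
        centers = [tuple(sum(v) / len(c) for v in zip(*c)) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

centers, clusters = kmeans(observations)
print("discovered weather classes:", centers)
```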
SINGLE LAYER PERCEPTRON
RESEARCH THOUGHT
• A single-layer perceptron should be clocked, acting as either a slower or a faster neuron.
• Further, the weights themselves should be a function of time and should depend on Δt.
Single Layer Perceptron (Working Example)
• A single-layer perceptron does binary classification and then does error correction, as per the learning rule, by modification of its weights.
• An example is a perceptron that estimates the price of a product. We can consider the input variables with some initial weights: X1 (market demand), X2 (input material prices), X3 (past growth), X4 (profit expected). This can be expressed as a linear equation, as sketched below.
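A minimal sketch of this price perceptron, assuming a threshold output (1 = raise price, 0 = lower price) trained by the perceptron error-correction rule; the training cases and encodings are illustrative assumptions.

```python
# Single-layer perceptron: weighted sum of X1..X4 plus bias, passed
# through a threshold, with perceptron error-correction updates.
eta = 0.1
weights = [0.0, 0.0, 0.0, 0.0]  # demand, material prices, past growth, expected profit
bias = 0.0

def predict(x):
    v = bias + sum(w * xi for w, xi in zip(weights, x))  # linear equation
    return 1 if v >= 0 else 0   # 1 = "raise price", 0 = "lower price"

# (X1, X2, X3, X4) -> desired decision; made-up training cases
data = [((0.9, 0.2, 0.8, 0.9), 1), ((0.1, 0.9, 0.2, 0.1), 0)]

for _ in range(20):
    for x, d in data:
        e = d - predict(x)  # error-correction learning rule
        weights = [w + eta * e * xi for w, xi in zip(weights, x)]
        bias += eta * e

print([predict(x) for x, _ in data])  # expected: [1, 0]
```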
RESEARCH THOUGHT
• As there are numerous layers in a multi-layer perceptron, they should be classified by asking:
• 1. Which layer is faster than the others?
• 2. Which layer has a higher priority?
• 3. Which layer is responsible for what part of the output?
Multi-Layer Perceptron (Working Example): XOR
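XOR is not linearly separable, so a single-layer perceptron cannot compute it; a hidden layer resolves this. The sketch below uses a standard hand-picked set of weights rather than a trained network, purely to illustrate the layered structure.

```python
def step(v):
    return 1 if v >= 0 else 0

def xor_mlp(x1, x2):
    """Two-layer perceptron computing XOR with fixed weights."""
    h1 = step(x1 + x2 - 0.5)        # hidden unit 1: OR
    h2 = step(x1 + x2 - 1.5)        # hidden unit 2: AND
    return step(h1 - 2 * h2 - 0.5)  # output: OR and not AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_mlp(a, b))  # 0, 1, 1, 0
```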
RADIAL BASIS FUNCTION
RESEARCH THOUGHT
• SVM can calculate the probability of each point being a part of
classification.
• It can further deduce the results as to validity of each input for a
classification.
SUPPORT VECTOR MACHINES
Support Vector Machines (Working Example)
• Support vector machines use the hyperplane equation to separate the examples into two classes, +1 and -1. Support vector machines use training data {(Xi, di)}, i = 1 to N, where di = +1 or di = -1 is the desired response from the neural network. The separating hyperplane is the set of points x satisfying wᵀx + b = 0, and a point is classified by the sign of wᵀx + b, as sketched below.
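A minimal sketch of the hyperplane rule, assuming a fixed weight vector w and bias b rather than a solved margin optimization; the points and parameters are illustrative.

```python
# SVM decision rule: classify by the sign of w . x + b, and check the
# margin constraint di * (w . xi + b) >= 1 on the training data.
w = [1.0, -1.0]   # illustrative separating-hyperplane normal
b = 0.0           # illustrative bias

def dot(a, c):
    return sum(x * y for x, y in zip(a, c))

def classify(x):
    return 1 if dot(w, x) + b >= 0 else -1

# Training data {(Xi, di)} with di = +1 or -1 (made-up points)
training = [([2.0, 0.5], 1), ([0.5, 2.0], -1)]

for xi, di in training:
    margin = di * (dot(w, xi) + b)
    print(xi, "class:", classify(xi), "margin:", margin,
          "ok" if margin >= 1 else "violated")
```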