Neuro Fuzzy - Session 1

This session covers the various components of human neuron and how the neurons communicate with each other.

18CSE352T

NEURO FUZZY AND GENETIC PROGRAMMING

SESSION 1
What is Soft Computing?
• Soft computing deals with approximate models
• It deals with partial truth, uncertainty and approximation
• Gives solutions to complex real-life problems
• The role model for soft computing is the human mind
• Soft computing is not a single method
• It is a combination of several methods such as
• Artificial Neural Networks
• Fuzzy Logic
• Genetic Algorithms
• Machine Learning
• Expert Systems
Hard Computing vs Soft Computing
Artificial Neural Networks (ANN)
• ANNs are information processing systems inspired by the way the biological
nervous system and the brain work
• They are usually configured for specific applications such as
• Pattern Recognition
• Data Recognition
• Image Processing
• Stock Market Prediction
• Weather Prediction
• Image Compression
• Aim: To bring the traditional computers a little closer to the way human brain
works
Where does the power of a human being lie?
• Computers are fast processing systems
• Still, there are many activities that a normal human being can perform in a
fraction of a second, while even the fastest computer would take ages
• Example: Recognizing a face
• We do it effortlessly, in spite of the infinite variations due to distance, angle of
vision, lighting, posture, distortion due to mood or emotion of the person, and so
on
• Occasionally, we can even recognize a face after a gap of 20 years
• Human beings do not think in terms of data, but in terms of patterns
• When we look at a face, we never think in terms of pixel values, but perceive the
face as a whole, as a pattern
• The structure of a human brain is drastically different from the architecture of a
computer
The Biological Neuron
• The biological neuron is the building block of a human brain
The Biological Neuron – contd.
• It consists of 3 primary parts:
• Dendrites
• Collect stimuli from the neighbouring neurons and pass them on to the soma
• Soma
• It is the main body of the cell
• It accumulates the stimuli received through the dendrites
• It ‘fires’ when sufficient stimulus has accumulated
• When a neuron fires, it transmits its own stimulus through the axon
• Axon
• It passes the stimulus on to the neighbouring neurons
The Biological Neuron – contd.
• There is a small gap between the end of an
axon terminal and the adjacent dendrite of
the neighbouring neuron
• This gap is called synapse
• A nervous stimulus is an electric impulse
• It is transmitted across the synaptic gap by means of an electrochemical process
• The synaptic gap has an important role to play in the activities of the
nervous system
• It scales the input signal by a weight
• If the input signal is x, and the synaptic weight is w, then the stimulus that
finally reaches the soma due to input x is the product x × w
• This weight, together with the other synaptic weights, embodies the knowledge
stored in the network of neurons
The Biological Neuron – contd.
The Artificial Neuron
• An artificial neuron is a computational model based on the structure and
functionality of a biological neuron.
• It consists of
1. a processing element
2. a number of inputs
3. weighted edges connecting each input to the
processing element.
• A processing unit is represented by a circle
• The input units are shown with boxes to distinguish
them from the processing units
The Artificial Neuron – contd.
Notational Convention
Symbol Used   Description
Xi            The ith input unit
Y             The output unit. If there is more than one output unit, the jth
              output unit is denoted Yj
xi            Signal to the input unit Xi
wi            Weight on the interconnection between input unit Xi and the output
              unit Y. If there is more than one output unit, wij denotes the
              weight between input unit Xi and the jth output unit Yj
y_in          The total (or net) input to the output unit Y; the algebraic sum of
              all weighted inputs to Y
y_out         Signal transmitted by the output unit Y; known as the activation
              of Y
The Artificial Neuron – contd.
Example of a Neural Network with more than one output unit
The Artificial Neuron – contd.
• The net input y_in to the processing element Y is obtained as

    y_in = x1 w1 + x2 w2 + … + xm wm = Σ (i = 1 to m) xi wi

• If there is more than one output unit, the net input to the jth output unit Yj,
denoted y_inj, is given by

    y_inj = x1 w1j + x2 w2j + … + xm wmj = Σ (i = 1 to m) xi wij

• The weight wi associated with the input Xi may be positive or negative.


• A positive weight means the corresponding input has an excitatory effect on Y
• If the weight is negative, then the input Xi is said to have an inhibitory effect on Y.
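The weighted-sum definition above can be sketched in a few lines of Python (the function name and the sample inputs are illustrative, not from the source):

```python
def net_input(x, w):
    """Net input y_in: the algebraic sum of all weighted inputs xi * wi."""
    return sum(xi * wi for xi, wi in zip(x, w))

# Three inputs with a mix of excitatory (positive) and
# inhibitory (negative) weights
y_in = net_input([1.0, 0.5, 2.0], [0.3, -0.2, 0.1])  # ≈ 0.4
```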
The Artificial Neuron – contd.
• The output signal transmitted by Y is a function of the net input y_in

    y_out = f(y_in)

• The function f() is known as the activation function of the neuron
• The output y_out is referred to as the activation of Y
• In the simplest case, f() is a step function
• A binary step function for the output unit is defined as

    y_out = f(y_in) = 1, if y_in > 0
                      0, if y_in ≤ 0

• Equivalently, in terms of the weighted inputs,

    y_out = 1, if Σ (i = 1 to m) xi wi > 0
            0, if Σ (i = 1 to m) xi wi ≤ 0
The Artificial Neuron – contd.
• When a non-zero threshold θ is used, the equations become

    y_out = f(y_in) = 1, if y_in > θ
                      0, if y_in ≤ θ

    y_out = 1, if Σ (i = 1 to m) xi wi > θ
            0, if Σ (i = 1 to m) xi wi ≤ θ

• We will also learn about various other activation functions later
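A minimal sketch of the thresholded binary step function above (the function name and the default θ = 0 are assumptions for illustration):

```python
def binary_step(y_in, theta=0.0):
    """Fires (returns 1) only when the net input exceeds the threshold theta."""
    return 1 if y_in > theta else 0

# Net input above the threshold fires; at or below it does not
binary_step(0.7, theta=0.5)   # -> 1
binary_step(0.5, theta=0.5)   # -> 0
```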


Note:
• The structure of an artificial neuron is simple
• Its processing power is also very limited
• However, a network of artificial neurons (an ANN) has remarkable capabilities
The Artificial Neuron – A Summary
• A 2-input neuron looks like:
• 3 things are happening here
1. Each input is multiplied by a weight

    x1 → x1 * w1
    x2 → x2 * w2

2. All the weighted inputs are added together with a bias b

    (x1 * w1) + (x2 * w2) + b

3. Finally, the sum is passed through an activation function

    y = f((x1 * w1) + (x2 * w2) + b)

• The activation function is used to turn an unbounded input into an output that
has a nice, predictable form
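The three steps above can be sketched as a minimal Python neuron with a step activation (the particular weights and bias shown, which make the neuron compute a logical AND, are an illustrative assumption):

```python
def neuron(x1, x2, w1, w2, b):
    """2-input neuron: weight each input, sum with bias b, apply a step activation."""
    y_in = (x1 * w1) + (x2 * w2) + b   # steps 1 and 2
    return 1 if y_in > 0 else 0        # step 3: binary step activation

# With w1 = w2 = 1 and b = -1.5, the neuron fires only when both inputs are 1,
# i.e. it computes a logical AND
neuron(1, 1, 1, 1, -1.5)   # -> 1
neuron(1, 0, 1, 1, -1.5)   # -> 0
```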
Artificial Neural Network
• A neural network is a bunch of neurons connected together.

• A hidden layer is any layer between the input (first) layer and the output (last) layer.
• Hidden layers are layers of mathematical functions, each designed to produce an
output specific to an intended result
• A hidden layer is where weights are applied to the inputs and an activation
function is applied
• There can be multiple hidden layers
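As an illustrative sketch, a tiny network with one hidden layer can be written as follows (all weights, biases, and the choice of a sigmoid activation are assumed values, not from the source):

```python
import math

def sigmoid(y_in):
    """A common smooth activation mapping any net input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-y_in))

def layer(inputs, weights, biases):
    """One layer: each neuron weights all inputs, adds its bias, then activates."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

def network(x):
    # One hidden layer of two neurons feeding a single output neuron
    hidden = layer(x, weights=[[0.5, -0.6], [0.3, 0.8]], biases=[0.1, -0.1])
    out = layer(hidden, weights=[[1.0, -1.0]], biases=[0.0])
    return out[0]

y = network([1.0, 2.0])  # a value strictly between 0 and 1
```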
Characteristics of the Brain
• The most striking feature of the brain: its extremely parallel and decentralized
architecture
• It consists of roughly 100 billion interconnected neurons
• Each neuron is connected to its neighbours, but not to neurons far away
• There is no centralized control in the brain
• The brain is very slow compared to present-day computers: neurons operate in the
millisecond range, while modern VLSI microchips process signals at the nanosecond
scale
• The power of the brain lies in the concurrent activities of 100 billion neurons
• The brain has fault-tolerance capability. As knowledge is stored in the brain in
a distributed manner, it can recover even when a portion of the brain is
damaged
Essential Features of the Brain
Aspect Description
Architecture The average human brain consists of about 100 billion neurons. There
are nearly 10^15 interconnections among these neurons. Hence
the brain’s architecture is highly connected.

Mode of operation The brain operates in an extremely parallel mode. The power of
the brain lies in the simultaneous activity of billions of neurons and their
interactions.

Speed Very slow, and also very fast. Very slow in the sense that neurons
operate at millisecond speeds, miserably slow compared to present-day VLSI chips
that operate at nanoseconds; computers are tremendously fast and flawless in
number crunching and data processing compared to human beings. Still, the brain
can perform in split seconds activities (e.g., conversing in natural language,
carrying out common-sense reasoning, interpreting a visual scene) that would take
a modern supercomputer far longer to carry out.
Essential Features of the Brain
Aspect Description
Fault tolerance The brain is highly fault tolerant. Knowledge is stored within the
brain in a distributed manner. Consequently, if a portion of the
brain is damaged, it can still go on functioning by retrieving the
lost knowledge from the remaining neurons.

Storage mechanism The brain stores information as the strengths of the
interconnections among the neurons. New information can be added by adjusting
the weights without disturbing the already stored information.

Control There is no global control in the brain. A neuron acts on the local
information available to it, and passes the results of its processing only to
the neurons adjacent to it.
