Function of Single Biological Neuron and Modelling of Artificial Neuron From It

The document discusses the functions of biological neurons and how they are modeled in artificial neurons. It provides details on the key components of biological neurons like synapses, dendrites, soma, axon and myelin sheath. It then discusses three models of artificial neurons: the McCulloch-Pitts model, Rosenblatt's perceptron model and the Adaline model. The models are characterized by weights, thresholds and an activation function. Hidden layers, which help process data between the input and output layers, are also discussed. Feed-forward and feedback neural networks are compared in terms of connections and learning rules.


FUNCTION OF SINGLE BIOLOGICAL NEURON AND

MODELLING OF ARTIFICIAL NEURON FROM IT


The human brain consists of a large number (more than a billion) of neural cells that process information. Each cell works like a simple processor. Only the massive interconnection of all these cells and their parallel processing makes the brain's abilities possible.

A biological neuron (or nerve cell) consists of synapses, dendrites, the axon hillock, the soma and the axon.

Synapses are elementary signal-processing devices: each is a biochemical device that converts a pre-synaptic electrical signal into a chemical signal and then back into a post-synaptic electrical signal. A synapse is the point of connection between two neurons, or between a neuron and a muscle or gland; it provides electrochemical communication between neurons.

Dendrites are branching fibres that extend from the cell body (or soma); they receive signals from other neurons.
The soma (or cell body) of the neuron contains the nucleus; it sums all incoming signals and converts them into output signals.

The axon is a single fibre that carries information away from the soma to the synaptic sites of other neurons.

The axon hillock is the site of summation for all incoming information.

The myelin sheath consists of fat-containing cells that insulate the axon from electrical activity. This insulation increases the rate of transmission of signals.

Nodes of Ranvier are the gaps (about 1 μm wide) between myelin-sheath cells along the axon.

Terminal buttons are the small knobs at the end of an axon that release chemicals called neurotransmitters.
CHARACTERISTICS (OR) PROPERTIES OF ANN
1. ANNs exhibit mapping capabilities, i.e., they can map input patterns to their associated output patterns.

2. ANNs learn by example.

3. ANNs possess the capability to generalize: they can predict new outcomes from past trends.

4. ANNs are robust, fault-tolerant systems. Hence they can recall full patterns from incomplete, partial or noisy patterns.

5. ANNs process information in parallel, at high speed and in a distributed manner.
The artificial neuron can be modeled from the biological neuron as a mathematical function.

(i) The axon and dendrites of the biological neuron are modeled as wires (inputs).
(ii) The synapses of the biological neuron are modeled as weights.
(iii) The activity in the soma of the biological neuron is modeled as a threshold function.

The artificial neuron consists of three basic components: weights, a threshold, and a single activation function.

The values w1, w2, …, wm are the weights that determine the strength of the input vector X = [x1, x2, …, xm]. Each input is associated with a weight of the neuron connection, i.e., the product XW. A positive weight excites and a negative weight inhibits the node output.

Yin = X1W1 + X2W2 + … + XmWm = Σ XiWi (sum over i = 1 to m)

The node's internal threshold θ is the magnitude of the offset. It affects the activation of the node output S as

S = f(Yin) = f( Σ XiWi − θ ) (sum over i = 1 to m)

To generate the final output S, the sum is passed on to a non-linear filter f called the activation function (or transfer function, or squash function), which releases the output.
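The weighted-sum-and-threshold computation above can be sketched directly. This is a minimal illustration, not any particular library's API; the names `neuron` and `step` and the example values are assumptions made here:

```python
def step(v):
    """Hard-limiter activation: 1 if v >= 0, else 0."""
    return 1 if v >= 0 else 0

def neuron(x, w, theta, f=step):
    """Output S = f(sum_i x_i * w_i - theta), as in the equations above."""
    y_in = sum(xi * wi for xi, wi in zip(x, w))
    return f(y_in - theta)

# Example: two inputs with unit weights and threshold 1.5 fires
# only when both inputs are active (an AND-like unit).
print(neuron([1, 1], [1, 1], 1.5))  # -> 1
print(neuron([1, 0], [1, 1], 1.5))  # -> 0
```

Any of the activation functions discussed later can be passed in place of `step`.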
MODELS OF A NEURON:
1. McCulloch-Pitts (MP) model

2. Rosenblatt's perceptron model

3. Adaline model


1. McCulloch-Pitts (MP) model

The equations that describe the operation of the MP model are:

Activation: Yin = Σ WiXi (sum over i = 1 to n)

Output signal: S = f(Yin)

where Yin is the weighted sum of the n input values (Xi) and θ = bias (threshold).
Rosenblatt's Model:

The equations that describe the operation of Rosenblatt's model are:

Activation: Yin = Σ WiXi

Output signal: S = f(Yin)

Error: δ = t − S
Weight change: ΔWi = η δ Xi

where η = learning rate
t = target output
S = actual binary output
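The update equations above can be run as a training loop. This is a sketch under stated assumptions: the threshold value 0.5, the learning rate, the epoch count and the OR-function dataset are all illustrative choices made here, not part of the original model statement:

```python
def f(y_in, theta=0.5):
    """Binary threshold output S = f(Yin), assumed threshold 0.5."""
    return 1 if y_in >= theta else 0

def train_perceptron(samples, n_inputs, eta=0.2, epochs=20):
    """Rosenblatt's rule: delta = t - S, weight change = eta*delta*x_i."""
    w = [0.0] * n_inputs
    for _ in range(epochs):
        for x, t in samples:
            s = f(sum(xi * wi for xi, wi in zip(x, w)))  # S = f(Yin)
            delta = t - s                                # error
            w = [wi + eta * delta * xi for xi, wi in zip(x, w)]
    return w

# Learn the (linearly separable) OR function.
or_data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w = train_perceptron(or_data, 2)
print([f(sum(xi * wi for xi, wi in zip(x, w))) for x, _ in or_data])  # -> [0, 1, 1, 1]
```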
Adaline Model:

The equations that describe the operation of the Adaline model are:

Activation: Yin = Σ WiXi

Output signal: S = f(Yin) = Yin

Error: δ = t − S = t − Yin
Weight change: ΔWi = η δ Xi

The output is a linear function of the activation value.
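Because Adaline's error is computed on the linear output Yin itself, its update is the least-mean-squares (delta) rule. The sketch below assumes an extra constant-1 input acting as a bias, plus an illustrative learning rate, epoch count and dataset, none of which come from the original statement:

```python
def adaline_fit(samples, n_inputs, eta=0.1, epochs=200):
    """Adaline: S = Yin (linear), delta = t - Yin, dW_i = eta*delta*x_i."""
    w = [0.0] * (n_inputs + 1)          # last weight acts as a bias
    for _ in range(epochs):
        for x, t in samples:
            xb = list(x) + [1.0]        # append constant bias input
            y_in = sum(xi * wi for xi, wi in zip(xb, w))
            delta = t - y_in            # error on the *linear* output
            w = [wi + eta * delta * xi for xi, wi in zip(xb, w)]
    return w

# Fit the line t = 2*x; the learned weight approaches 2.
data = [([0.0], 0.0), ([1.0], 2.0), ([2.0], 4.0)]
w = adaline_fit(data, 1)
print(round(w[0], 1))  # -> 2.0
```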


COMPARISON BETWEEN ANN AND BNN:

Characteristic: ANN | BNN
1. Speed: fast in processing information | slow in processing information
2. Processing: performs operations in sequential mode | performs operations in a massively parallel mode
3. Size and complexity: less complex | highly complex and non-linear
4. Storage: information is stored in memory | information is stored in the strength of the interconnections
5. Fault tolerance: no inbuilt fault tolerance | inbuilt fault tolerance
6. Control mechanism: a central control unit is present and monitors all computing activities | no central control unit is present
Properties: BNN | ANN

Associated terms:
Cell body | Neuron
Dendrites (signals from other neurons) | Weights (or interconnections)
Soma | Net input
Axon (firing frequency) | Output
Synaptic strength | Connection strength

Control: no control; it will not work under assumptions | it can be controlled at any stage
Models: feed-forward, feedback | McCulloch-Pitts
VARIOUS ACTIVATION FUNCTIONS AND THEIR IMPORTANCE
The activation function is used to calculate the output response of a neuron, i.e., it performs a mathematical operation on the signal output. In other words, it converts the input into an output.

Activation functions are designed to produce values in a bounded range, typically between 0 and 1.

In general, the activation function is non-linear.

The horizontal axis represents the sum of the inputs; the vertical axis represents the value the function produces, i.e., the output. The activation function is also called the transfer function (or squash function).

The most commonly used activation functions are:

(i) Threshold function (hard limiter)
(ii) Piecewise linear function
(iii) Sigmoidal function (S-shaped function)
Threshold Function:
A threshold function is either of binary type or of bipolar type.

A binary threshold function can be defined as
Y = 1, if I ≥ 0
Y = 0, if I < 0
i.e., Y = 1 if the weighted sum of the inputs is positive (or zero), and Y = 0 if it is negative.

A bipolar threshold function can be defined as
Y = 1, if I ≥ 0
Y = −1, if I < 0
i.e., Y = 1 if the weighted sum of the inputs is positive (or zero), and Y = −1 if it is negative.

The threshold function is also called the Heaviside function.
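Both threshold variants above can be written out directly; the function names here are illustrative:

```python
def binary_threshold(i):
    """Binary hard limiter: 1 if I >= 0, else 0."""
    return 1 if i >= 0 else 0

def bipolar_threshold(i):
    """Bipolar hard limiter: 1 if I >= 0, else -1."""
    return 1 if i >= 0 else -1

print(binary_threshold(0.7), binary_threshold(-0.3))    # -> 1 0
print(bipolar_threshold(0.7), bipolar_threshold(-0.3))  # -> 1 -1
```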


Piecewise linear function:
It is also called the saturating linear function and can have either a binary or a bipolar range for the saturation limits of the output.

This is a sloping function that produces
−1 for a strongly negative weighted sum of inputs,
+1 for a strongly positive weighted sum of inputs, and
I, proportional to the input, for weighted sums between −1 and +1:

Y = 1, if I ≥ 1
Y = I, if −1 < I < 1
Y = −1, if I ≤ −1
Sigmoidal Function:
The non-linear, curved, S-shaped function is called the sigmoid function.

It is mathematically well behaved: a differentiable and strictly increasing function.

This function produces
0 for large negative input values, and
1 for large positive input values, with a
smooth transition between the two asymptotic values.

Y = f(I) = 1 / (1 + e^(−λI)), with 0 ≤ f(I) ≤ 1

where λ = slope parameter (or shape parameter).

It is also called the unipolar sigmoidal function.
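The unipolar sigmoid above is straightforward to evaluate; `lam` stands in for the slope parameter λ:

```python
import math

def sigmoid(i, lam=1.0):
    """Unipolar sigmoid f(I) = 1 / (1 + e^(-lam*I)), bounded in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-lam * i))

print(sigmoid(0))                # -> 0.5 at the origin
print(round(sigmoid(10), 4))     # approaches 1 for large positive inputs
print(round(sigmoid(-10), 4))    # approaches 0 for large negative inputs
```

A larger λ makes the transition steeper; as λ grows, the sigmoid approaches the binary threshold function.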
Signum Function:
It is a non-linear function.

It is an odd function.

If the weighted sum of inputs is positive, the output Y = 1;
if the weighted sum of inputs is zero, the output Y = 0;
if the weighted sum of inputs is negative, the output Y = −1.
e.g.: tanh(Y)

Y = f(I) = 1, if I > 0
Y = f(I) = 0, if I = 0
Y = f(I) = −1, if I < 0

It is also known as the bipolar sigmoidal function.
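The signum function above, alongside tanh, the smooth function the text cites as an example:

```python
import math

def signum(i):
    """Signum: 1 for positive I, 0 at zero, -1 for negative I."""
    return 1 if i > 0 else (0 if i == 0 else -1)

print(signum(2.5), signum(0), signum(-2.5))             # -> 1 0 -1
print(round(math.tanh(5), 3), round(math.tanh(-5), 3))  # -> 1.0 -1.0
```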
HIDDEN LAYERS
Hidden layers link the input and the output and help in the processing of data.

The maximum number of separable regions M obtainable with H hidden units on n input units can be calculated as

M(H, n) = Σ C(H, i) (sum over i = 0 to n), where C(H, i) = 0 if H < i,
n = number of input units, H = number of hidden units.

Since M ≤ 2^H, this gives H ≥ log2 M; this bound holds only for hard-limiting activation functions.

The number of neurons also depends on the learning algorithm that trains the network.
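The counting formula above can be evaluated directly. This sketch assumes the combinatorial reading M(H, n) = Σ C(H, i) for i = 0 to n, with terms where H < i equal to zero:

```python
from math import comb  # comb(H, i) is 0 when H < i, matching the rule above

def max_regions(H, n):
    """Maximum number of regions M(H, n) that H hard-limiting hidden
    units can carve out of an n-dimensional input space."""
    return sum(comb(H, i) for i in range(n + 1))

print(max_regions(3, 2))  # C(3,0)+C(3,1)+C(3,2) = 1+3+3 = 7
print(max_regions(2, 3))  # 1+2+1+0 = 4 = 2^2, the H >= log2(M) bound is tight
```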
Feed-forward networks are classified as (i) single-layer feed-forward and (ii) multi-layer feed-forward.

Single-layer feed-forward | Multi-layer feed-forward

No hidden layers | Hidden layers are present

Connections within the network are direct | Connections within the network are not direct, but pass through the hidden layers

A hard-limiting function is used as the activation function | A sigmoid function is used as the activation function

It cannot solve linearly non-separable problems | It can solve linearly non-separable problems

It makes use of the Delta rule as its learning rule | It makes use of the back-propagation learning scheme as its learning rule
FEED-FORWARD AND FEEDBACK NEURAL NETWORKS

Feed-forward neural network | Feedback neural network
1) Information flow is unidirectional, i.e., input to output | Information flow is bidirectional (or multidirectional), i.e., input to output and output to input
2) No internal states | Internal states are present
3) No feedback | Feedback is present
4) No sense of time (or memory) of the previous state | A sense of time (memory of the previous state) is possible
5) Also known as a bottom-up (or top-down) network | Also known as an interactive (or recurrent) network
6) Not dynamic | Feedback networks are dynamic
7) Applications: e.g., pattern recognition | Applications: e.g., image processing
8) The structure is acyclic in nature | The structure is cyclic in nature
9) Block diagram: input → instantaneous network → output | Block diagram: input → instantaneous network → output, with a delay element feeding the output back to the input
McCulloch-Pitts (MP) Model of a neuron:
A neuron with a threshold activation function is called the MP model.

The McCulloch-Pitts (MP) model of a neuron is characterized by its formalism and its elegant, precise mathematical definition.

The MP neuron allows binary states 0 or 1 only, i.e., it is binary activated. These neurons are connected by directed weighted paths. A connection path can be excitatory or inhibitory: excitatory connections have positive weights and inhibitory connections have negative weights. The neuron is associated with a threshold value.

The neuron fires if the net input to the neuron is greater than the threshold. The threshold is set in such a way that the inhibition is absolute, because a non-zero inhibitory input will then prevent the neuron from firing.

It takes only one time step for a signal to pass over one connection link.

Let Y be the MP neuron; it can receive signals from any number of other neurons.

The connection weights from X1, X2, …, Xn are excitatory, denoted by w, and the connection weights from Xn+1, …, Xn+m are inhibitory, denoted by −p.

The MP neuron Y then has the activation function f with threshold θ, where Yin = the weighted sum of its inputs = the net input signal received by neuron Y.

Firing rule: f(Yin) = 1, if Yin ≥ θ
f(Yin) = 0, if Yin < θ

The threshold should satisfy the relation θ > nw − p.

This is the condition for absolute inhibition.

The MP neuron will fire if it receives k or more excitatory inputs and no inhibitory inputs, where kw ≥ θ > (k − 1)w.
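The firing rule above can be sketched as follows. The specific values w = 1, p = 1 and θ = 2 are illustrative assumptions chosen so that k = 2 excitatory inputs are needed to fire and the absolute-inhibition condition θ > nw − p holds (2 > 2·1 − 1):

```python
def mp_neuron(exc, inh, w=1, p=1, theta=2):
    """MP neuron: fire (1) iff w*sum(exc) - p*sum(inh) >= theta.
    exc and inh are lists of binary excitatory/inhibitory inputs."""
    y_in = w * sum(exc) - p * sum(inh)
    return 1 if y_in >= theta else 0

print(mp_neuron([1, 1], [0]))  # -> 1: two excitatory inputs, no inhibition
print(mp_neuron([1, 0], [0]))  # -> 0: fewer than k = 2 excitatory inputs
print(mp_neuron([1, 1], [1]))  # -> 0: a single inhibitory input vetoes firing
```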
In the MP neuron model, the missing features are

i) non-binary input and output
ii) non-linear summation
iii) smooth thresholding
iv) stochastic behaviour
v) temporal information processing.

Synapses are modeled as weights.
Dendrites are modeled as inputs.
Neurons are modeled as a summing junction followed by an activation function.

The output signal S is typically a non-linear function f(Yin) of the activation value Yin.

The activation value Yin = Σ WiXi (sum over i = 1 to n) − Σ PiXi (sum over i = n+1 to n+m)

Drawbacks:
1) The weights are fixed.
2) The model has no capability of learning.
1) AND function using the MP neuron model:
The truth table for the AND function is

X1 X2 Y
1 1 1
1 0 0
0 1 0
0 0 0

The MP neuron to implement the AND function has inputs X1 and X2, each connected to Y with weight 1. The threshold on unit Y is 2.

The output is Y = f(Yin), where
Yin = Σ (weights × inputs) = 1·X1 + 1·X2 = X1 + X2

The activation of the output neuron is
Y = f(Yin) = 1, if Yin ≥ 2
Y = f(Yin) = 0, if Yin < 2

On presenting the inputs:
(i) X1 = X2 = 1: Yin = X1 + X2 = 1 + 1 = 2, so Y = f(Yin) = 1 (since Yin = 2 ≥ 2)

(ii) X1 = 1, X2 = 0: Yin = X1 + X2 = 1 + 0 = 1, so Y = f(Yin) = 0 (since Yin = 1 < 2); by symmetry, X1 = 0, X2 = 1 also gives Y = 0

(iii) X1 = 0, X2 = 0: Yin = X1 + X2 = 0 + 0 = 0, so Y = f(Yin) = 0 (since Yin = 0 < 2)
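The AND computation above (weights 1 and 1, threshold 2) can be verified mechanically:

```python
def mp_and(x1, x2, theta=2):
    """MP neuron for AND: Yin = 1*X1 + 1*X2, fire iff Yin >= 2."""
    y_in = 1 * x1 + 1 * x2
    return 1 if y_in >= theta else 0

for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(x1, x2, mp_and(x1, x2))  # matches the AND truth table above
```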
2) OR function using the MP neuron model:
The truth table for the OR function is

X1 X2 Y
1 1 1
1 0 1
0 1 1
0 0 0

The MP neuron to implement the OR function has inputs X1 and X2, each connected to Y with weight 3. The threshold for the unit is 3.

Yin = 3X1 + 3X2
Y = f(Yin) = 1, if Yin ≥ 3
Y = f(Yin) = 0, if Yin < 3

On presenting the inputs:

(i) X1 = X2 = 1: Yin = 3(1) + 3(1) = 6 ≥ threshold 3, i.e., Y = 1

(ii) X1 = 1, X2 = 0: Yin = 3(1) + 3(0) = 3 = threshold, i.e., Y = 1; by symmetry, X1 = 0, X2 = 1 also gives Y = 1

(iii) X1 = X2 = 0: Yin = 3(0) + 3(0) = 0 < threshold, i.e., Y = 0
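The OR computation above (weights 3 and 3, threshold 3) can be verified the same way:

```python
def mp_or(x1, x2, theta=3):
    """MP neuron for OR: Yin = 3*X1 + 3*X2, fire iff Yin >= 3."""
    y_in = 3 * x1 + 3 * x2
    return 1 if y_in >= theta else 0

for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(x1, x2, mp_or(x1, x2))  # matches the OR truth table above
```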
3) NOT function using the MP neuron model:
The truth table for the NOT function is

X Y

1 0

0 1

The MP neuron to implement the NOT function has the single input X connected to Y with weight 1. The threshold for unit Y is 1.

The net input is Yin = w·X = X (since w = 1).

The output activation is given by
Y = f(Yin) = 1, if Yin < 1
Y = f(Yin) = 0, if Yin ≥ 1

On presenting the inputs:

(i) X = 1: Yin = 1; applying the activation, Y = f(Yin) = 0
(ii) X = 0: Yin = 0; applying the activation, Y = f(Yin) = 1
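The NOT computation above, with its inverted firing rule (Y = 1 only when Yin < 1), checks out directly:

```python
def mp_not(x):
    """MP neuron for NOT: Yin = 1*X, fire iff Yin < 1 (inverted rule)."""
    y_in = 1 * x
    return 1 if y_in < 1 else 0

print(mp_not(1), mp_not(0))  # -> 0 1
```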
4) AND-NOT function using the MP neuron model:
The truth table for the AND-NOT function (X1 AND NOT X2) is

X1 X2 Y
1 1 0
1 0 1
0 1 0
0 0 0

The MP neuron to implement the AND-NOT function has input X1 connected to Y with weight 1 and input X2 connected with weight −1. The threshold of unit Y is 1.

The net input is Yin = X1·(1) + X2·(−1) = X1 − X2.

The output activation is given by
Y = f(Yin) = 1, if Yin ≥ 1
Y = f(Yin) = 0, if Yin < 1

On presenting the inputs:

(i) X1 = X2 = 1: Yin = X1 − X2 = 1 − 1 = 0 < 1, i.e., Y = f(Yin) = 0

(ii) X1 = 1, X2 = 0: Yin = X1 − X2 = 1 − 0 = 1 = 1, i.e., Y = f(Yin) = 1

(iii) X1 = 0, X2 = 1: Yin = X1 − X2 = 0 − 1 = −1 < 1, i.e., Y = f(Yin) = 0

(iv) X1 = X2 = 0: Yin = X1 − X2 = 0 − 0 = 0 < 1, i.e., Y = f(Yin) = 0
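The AND-NOT computation above (weights 1 and −1, threshold 1) can likewise be verified over the whole truth table:

```python
def mp_andnot(x1, x2, theta=1):
    """MP neuron for AND-NOT: Yin = X1 - X2, fire iff Yin >= 1."""
    y_in = 1 * x1 + (-1) * x2
    return 1 if y_in >= theta else 0

for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]:
    print(x1, x2, mp_andnot(x1, x2))  # 1 only for (1, 0), as in the table
```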
