Artificial Neural Networks: Perceptron
INTRODUCTION
• How do animals learn, forget, recognize and perform various types of tasks?
• Psycho-physical experiments help us understand how individual neurons and groups of neurons work.
• McCulloch and Pitts introduced the first mathematical model of a single neuron, which has been widely applied in subsequent work.
BACKGROUND
An Artificial Neural Network (ANN) is an information-processing paradigm inspired by the way biological nervous systems, such as the human brain, process information.
The key element of this paradigm is the novel structure of the information processing system.
It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve
specific problems.
ANNs, like people, learn by example.
An ANN is configured for a specific application, such as pattern recognition or data classification,
through a learning process.
Learning in biological systems involves adjustments to the synaptic connections that exist between the
neurons.
This is true of ANNs as well.
In the human brain, a typical neuron collects signals from others through a host of fine structures called
dendrites.
The neuron sends out spikes of electrical activity through a long, thin strand known as an axon, which splits into thousands of branches.
At the end of each branch, a structure called a synapse converts the activity from the axon into electrical
effects that inhibit or excite activity in the connected neurons.
In the training mode, the neuron can be trained to fire (or not) for particular input patterns.
In the using mode, when a taught input pattern is detected at the input, its associated output
becomes the current output.
If the input pattern does not belong in the taught list of input patterns, the firing rule is used to
determine whether to fire or not.
The firing rule is an important concept in neural networks and accounts for their high
flexibility.
A firing rule determines how one calculates whether a neuron should fire for any input pattern.
It relates to all the input patterns, not only the ones on which the node was previously trained.
The weighted inputs are then added together and if they exceed a pre-set threshold
value, the perceptron fires.
Otherwise it will not fire and the inputs tied to that perceptron will not have any
effect on the decision making.
In mathematical terms, the neuron fires if and only if:
x₁w₁ + x₂w₂ + ⋯ + xₙwₙ > T

where T is the pre-set threshold (denoted θ below). The perceptron sums its weighted inputs x₁, …, xₙ through the summing unit Σ and applies a step activation function φ to the result:

net = Σᵢ₌₁ⁿ xᵢwᵢ

y = φ(net) = 0 if net ≤ θ, and 1 if net > θ
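As an illustrative sketch (not part of the original slides), this firing rule can be written as a short Python function; the name perceptron_fire and its argument names are assumptions made for this example:

```python
def perceptron_fire(inputs, weights, theta):
    """Step-activation perceptron: return 1 (fire) only if the
    weighted sum of the inputs exceeds the threshold theta."""
    net = sum(x * w for x, w in zip(inputs, weights))
    return 1 if net > theta else 0
```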
Example: a two-input perceptron with weights w₁ = 0.5, w₂ = 0.5 and threshold θ = 0.5.

• x₁ = 1, x₂ = 1: net = 1 × 0.5 + 1 × 0.5 = 1 > 0.5, so y = φ(net) = 1
• x₁ = 0, x₂ = 1: net = 0 × 0.5 + 1 × 0.5 = 0.5 ≤ 0.5, so y = φ(net) = 0
• x₁ = 1, x₂ = 0: net = 1 × 0.5 + 0 × 0.5 = 0.5 ≤ 0.5, so y = φ(net) = 0
• x₁ = 0, x₂ = 0: net = 0 × 0.5 + 0 × 0.5 = 0 ≤ 0.5, so y = φ(net) = 0
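The four cases above can be reproduced with a few lines of Python; this sketch reuses the hypothetical perceptron_fire helper defined earlier, with the example's weights and threshold:

```python
weights = [0.5, 0.5]   # w1 = w2 = 0.5
theta = 0.5            # firing threshold

# Evaluate every input combination from the worked example.
for x1, x2 in [(1, 1), (0, 1), (1, 0), (0, 0)]:
    net = x1 * weights[0] + x2 * weights[1]
    y = perceptron_fire([x1, x2], weights, theta)
    print(f"x1={x1}, x2={x2}: net={net}, y={y}")
# Only the (1, 1) case exceeds the threshold, so only it fires.
```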
x₁  x₂  y
1   1   1
0   1   0
1   0   0
0   0   0

w₁ = 0.5, w₂ = 0.5, θ = 0.5
The concept realized here is the AND gate: the output is 1 only when both inputs are 1.