Lecture On Pattern Classification and Pattern Association
• For example, let us consider the set S of all three-bit patterns. We
may divide the patterns of S into two classes, A and B, where A is the
class of all patterns having more 0's than 1's and B is its converse.
Therefore,
Class A (more 0's than 1's): 000, 001, 010, 100
Class B (more 1's than 0's): 011, 101, 110, 111
Classification of 3-bit Patterns with a single output unit
Fig.: A single output unit Y receiving the inputs X1, X2, X3, each through a connection of weight +1
y_in = Σ(i=1 to 3) xi wi = x1w1 + x2w2 + x3w3 = x1 + x2 + x3

y_out = f(y_in) = { 1, if y_in ≥ 2
                    0, otherwise
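As a rough illustration of these equations, the short Python sketch below enumerates all 3-bit patterns and applies the single output unit with weights +1 and firing threshold 2; the function name classify is ours, not part of the lecture.

```python
# Minimal sketch of the single-output-unit classifier described above.
# All weights are +1 and the firing threshold is 2, as in the equations.
from itertools import product

def classify(pattern, weights=(1, 1, 1), threshold=2):
    """Return 1 (class B: more 1's than 0's) if y_in >= threshold, else 0 (class A)."""
    y_in = sum(x * w for x, w in zip(pattern, weights))
    return 1 if y_in >= threshold else 0

for p in product((0, 1), repeat=3):
    label = "B" if classify(p) else "A"
    print(p, "->", label)
```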
Classification of 3-bit Patterns with two output units
Fig.: Classification network with two output units Y1 and Y2; each input Xi is connected to Y1 with weight wi1 = +1 and to Y2 with weight wi2 = -1

y_in1 = Σ(i=1 to 3) xi wi1 = x1w11 + x2w21 + x3w31 = x1 + x2 + x3

y_out1 = f(y_in1) = { 1, if y_in1 ≥ 2
                      0, otherwise

y_in2 = Σ(i=1 to 3) xi wi2 = x1w12 + x2w22 + x3w32 = -x1 - x2 - x3

y_out2 = f(y_in2) = { 1, if y_in2 ≥ -1
                      0, otherwise
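A similar sketch for the two-output version is given below, with the weights arranged as a 3x2 matrix (column 1 feeding Y1, column 2 feeding Y2) and thresholds 2 and -1 taken from the equations above; the names W, THETA and classify2 are our own.

```python
# Sketch of the two-output-unit classifier: column 1 feeds Y1 (all +1),
# column 2 feeds Y2 (all -1); thresholds are 2 and -1 as defined above.
W = [[1, -1],
     [1, -1],
     [1, -1]]
THETA = [2, -1]

def classify2(pattern):
    """Return (y_out1, y_out2) for a 3-bit input pattern."""
    outs = []
    for j in range(2):
        y_in = sum(pattern[i] * W[i][j] for i in range(3))
        outs.append(1 if y_in >= THETA[j] else 0)
    return tuple(outs)

print(classify2((0, 1, 1)))   # class B pattern -> (1, 0): Y1 fires
print(classify2((0, 0, 1)))   # class A pattern -> (0, 1): Y2 fires
```

With this arrangement Y1 responds to class B patterns (more 1's) and Y2 to class A patterns (more 0's).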
Pattern Association
Given an input pattern and a set of patterns already stored in memory,
finding the closest match of the input pattern among the stored
patterns and returning it as the output is known as pattern
association.
Pattern Association
The basic concept of pattern association is explained below with the
help of a simple illustrative example. The example is inspired by the
Hopfield network [1982].
Fig.: A six-unit network; connections of weight +3 link the units within each of the groups {P1, P2, P3} and {P4, P5, P6}, and connections of weight -1 link P2 and P3 to P4 and P5. Active units are drawn as filled circles, inactive units as hollow circles.
The essential features of the network
The essential features of the network are described below.
i) PE states: At any instant, a unit may be either in an active or an
inactive state. Moreover, depending on the circumstances, the state
of a unit may change from active to inactive and vice versa. In the
figure, an active unit is shown as a black circle and an inactive unit is
indicated by a hollow circle.
ii) Interconnections: All interconnections are bidirectional. The
magnitude of the weight associated with an interconnection gives
the strength of the influence the connected units exert on each other.
The essential features of the network
iii) Signed weights: A negative weight implies that the corresponding units tend to
inhibit, or deactivate, each other; a positive weight implies that they tend to activate each other.
iv) Initialization: The network is initialized by making certain units active and keeping
the others inactive. The initial combination of active and inactive units is considered as the
input pattern. After initialization, the network passes through a number of
transformations. The transformations take place according to the rules described below.
v) Transformations: At each stage during the sequence of transformations, the next
state of every unit pi, i = 1, ..., 6, is determined. The next state of a unit pi
is obtained by considering all active neighbours of pi and taking the algebraic sum of
the weights of the paths between pi and the neighbouring active units. If the sum is
greater than 0, then pi becomes active in the next phase; otherwise it becomes
inactive. The state of a unit without any active unit in its neighbourhood remains
unaltered. This process is known as parallel relaxation. A code sketch of this rule is given below.
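The following Python sketch illustrates parallel relaxation on the six-unit network of the figure. The weight matrix is our reconstruction of the figure (+3 within the groups {P1, P2, P3} and {P4, P5, P6}, -1 between P2, P3 and P4, P5), and the helper name relax_step is ours.

```python
# Sketch of the parallel-relaxation rule on the six-unit network of the figure.
# W[i][j] is the weight of the link between units P(i+1) and P(j+1); 0 = no link.
# Assumed topology (read off the figure): +3 inside the groups {P1,P2,P3}
# and {P4,P5,P6}, -1 between {P2,P3} and {P4,P5}.
W = [
    # P1  P2  P3  P4  P5  P6
    [  0,  3,  3,  0,  0,  0],  # P1
    [  3,  0,  3, -1, -1,  0],  # P2
    [  3,  3,  0, -1, -1,  0],  # P3
    [  0, -1, -1,  0,  3,  3],  # P4
    [  0, -1, -1,  3,  0,  3],  # P5
    [  0,  0,  0,  3,  3,  0],  # P6
]

def relax_step(state):
    """One parallel-relaxation step; state[i] is 1 (active) or 0 (inactive)."""
    nxt = list(state)
    for i in range(6):
        links = [W[i][j] for j in range(6) if W[i][j] != 0 and state[j] == 1]
        if links:                 # a unit with no active neighbour keeps its state
            nxt[i] = 1 if sum(links) > 0 else 0
    return nxt

# Start from the input pattern of the example (only P2 and P5 active)
# and relax until a stable state is reached, printing every state.
state = [0, 1, 0, 0, 1, 0]
while True:
    print(state)
    nxt = relax_step(state)
    if nxt == state:
        break
    state = nxt
```

Run on this initial pattern, the loop prints the successive states of the network, matching the transformation discussed in the example that follows.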
Example with Hopfield Network
• For example, let the network be initialized with the pattern shown in
the Fig.
• Initially, all units except p2 and p5 are inactive.
• To find the state of p1 in the next instant, we look for the active
neighbours of p1 and find that p2 is the only active unit connected to p1,
through an interconnection link of weight +3.
• Hence p1 becomes active in the next instant. Similarly, for p3, both p2
and p5 are active units in its neighbourhood.
• The sum of the corresponding weights is w23 + w35 = +3 - 1 = +2. Hence p3
also becomes active.
Example with Hopfield Network
Fig.: Pattern association through parallel relaxation
(Panels (a), (b), and (c) show the successive states of the six-unit network during parallel relaxation.)
Example with Hopfield Network
• However, p2 itself becomes inactive because the only active unit in its
vicinity, p5, is connected to it through a negatively weighted link.
• The Table shows the details of the computations for the transformation of the
network from Fig.(a) to Fig.(b).
• The configuration of Fig.(b) is not stable.
• The network further transforms itself from Fig.(b) to Fig.(c), which
is a stable state.
• Therefore, we can say that the given network associates the pattern
shown in Fig.(a) with that shown in Fig.(c).
Table: Computation of parallel relaxation on Fig.(a)

Unit | Present State | Active Neighbouring Unit(s) | Sum         | Next State
P1   | Inactive      | P2                          | +3          | Active
P2   | Active        | P5                          | -1          | Inactive
P3   | Inactive      | P2, P5                      | +3 - 1 = +2 | Active
P5   | Active        | P2                          | -1          | Inactive
P6   | Inactive      | P5                          | +3          | Active
Example with Hopfield Network
Fig.: Non-trivial patterns stored in a Hopfield Network
(Panels (a), (b), and (c) show the three non-trivial stable patterns of the six-unit network.)
Example with Hopfield Network
• A little investigation reveals that the given network has three non-
trivial stable states, as shown in the above Fig.(a) to (c).
• The trivial state is the one in which all units are inactive.
• It can easily be verified that if one or more of the units P1, P2, P3 are
active initially while the rest, P4, P5, P6, are inactive, the network
converges to the pattern shown in the above Fig.(a).
• Similarly, the pattern of Fig.(b) is associated with any input pattern in
which one or more units of the group {P4, P5, P6} are active while the rest are inactive. Finally, an
input pattern having active units from both the groups {P1, P2, P3} and
{P4, P5, P6} would associate with the pattern depicted in Fig.(c).
• Hence the given network may be thought of as storing three non-
trivial patterns, as discussed above. Such networks are also referred to
as associative memories, or content-addressable memories.
Activation Function
• The output from a processing unit is termed its activation.
• Activation of a processing unit is a function of the net input to the
processing unit.
• The function that maps the net input value to the output signal value,
i.e. the activation, is known as the activation function of the unit.
• Some common activation functions are presented below.
Identity Function
• The simplest activation function is the identity function, which
passes on the incoming signal as the outgoing signal without any
change.
• Therefore, the identity activation function g(x) is defined as
  g(x) = x
Fig.: Identity function
Threshold activation function
• In the basic step function, occasionally, instead of 0, a non-zero threshold
value θ is used. This is known as the threshold step function
and is defined as
  g(x) = { 1, if x > θ
           0, otherwise
• The shape of the threshold function is shown in fig. below.
Threshold activation function
Fig.: Threshold step function
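As a minimal illustration (the function name and the sample θ value are ours, not from the lecture), the threshold step function can be coded as:

```python
def threshold_step(x, theta=0.0):
    """Threshold step activation: 1 if x > theta, else 0 (theta = 0 gives the basic step)."""
    return 1 if x > theta else 0

print([threshold_step(x, theta=1.5) for x in (-1, 0, 1, 2, 3)])  # [0, 0, 0, 1, 1]
```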
Fig.: Step functions with bipolar outputs +1 and -1, without and with a non-zero threshold θ
Binary Sigmoid Function
• The binary sigmoid function with steepness parameter δ is defined as
  g(x) = 1 / (1 + e^(-δx))
• Its derivative can be expressed in terms of g(x) itself:
  g'(x) = δ g(x) (1 - g(x))
Fig.: Binary sigmoid function for two steepness parameters δ1 and δ2, with δ1 < δ2
Bipolar Sigmoid Function
• The bipolar sigmoid function with steepness parameter δ is defined as
  h(x) = (1 - e^(-δx)) / (1 + e^(-δx))
• Its range is (-1, +1). The following Fig. presents its form graphically.
Fig.: Bipolar sigmoid function
• Its derivative is
  h'(x) = (δ/2) (1 + h(x)) (1 - h(x))
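The sketch below, our own illustration rather than lecture code, implements the binary and bipolar sigmoids with steepness parameter δ and checks the two derivative identities numerically.

```python
import math

def binary_sigmoid(x, delta=1.0):
    """Binary sigmoid g(x) = 1 / (1 + e^(-delta*x)), range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-delta * x))

def bipolar_sigmoid(x, delta=1.0):
    """Bipolar sigmoid h(x) = (1 - e^(-delta*x)) / (1 + e^(-delta*x)), range (-1, 1)."""
    return (1.0 - math.exp(-delta * x)) / (1.0 + math.exp(-delta * x))

def numerical_derivative(f, x, eps=1e-6):
    """Central-difference estimate of f'(x)."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

x, delta = 0.7, 2.0
g = binary_sigmoid(x, delta)
h = bipolar_sigmoid(x, delta)

# g'(x) = delta * g(x) * (1 - g(x))
print(numerical_derivative(lambda t: binary_sigmoid(t, delta), x), delta * g * (1 - g))
# h'(x) = (delta / 2) * (1 + h(x)) * (1 - h(x))
print(numerical_derivative(lambda t: bipolar_sigmoid(t, delta), x), (delta / 2) * (1 + h) * (1 - h))
```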
Hyperbolic Tangent Function