22-ANN Lecture TwentyTwo LVQ

Vector Quantization represents input vectors by codewords, one per Voronoi region, where each region corresponds to a cluster of vectors. Learning Vector Quantization (LVQ) is an unsupervised learning mechanism that can be adapted to supervised tasks, updating weights based on the proximity of input patterns to codewords. However, LVQ faces challenges such as sensitivity to initialization and instability with overlapping classes, motivating additional mechanisms to improve its performance.


Vector Quantization

• The figure shows vectors in two-dimensional space.
• Associated with each cluster of vectors is a
representative codeword.
• Each codeword resides in its own Voronoi region.
• These regions are separated by imaginary lines
in the figure for illustration.
• Given an input vector, the codeword chosen
to represent it is the one in the same Voronoi
region, i.e., the nearest codeword.
Two Dimensional Voronoi Diagram

Codewords in 2-dimensional space. Input vectors are marked with an x, codewords are
marked with red circles, and the Voronoi regions are separated with boundary lines.
Definition of Voronoi Diagram

• Let P be a set of n distinct points (sites) in the plane.

• The Voronoi diagram of P is the subdivision of the
plane into n cells, one for each site.

• A point q lies in the cell corresponding to a site pi ∈ P iff
Euclidean_distance(q, pi) < Euclidean_distance(q, pj),
for each pj ∈ P, j ≠ i.
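The definition above can be sketched directly in code: a query point belongs to the cell of whichever site minimizes the Euclidean distance. This is a minimal illustration (not from the lecture); the site coordinates and query points are made-up values.

```python
import numpy as np

def voronoi_cell(q, sites):
    """Return the index i of the site pi whose Voronoi cell contains q,
    i.e., the site minimizing the Euclidean distance to q."""
    dists = np.linalg.norm(sites - q, axis=1)  # distance from q to every site
    return int(np.argmin(dists))

# Three sites in the plane (illustrative values)
sites = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
print(voronoi_cell(np.array([0.9, 0.1]), sites))  # → 1 (closest to site p1)
```

The same nearest-codeword lookup is what a vector quantizer performs when it maps an input vector to the codeword of its Voronoi region.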
Learning Vector Quantizers (LVQ)
• An unsupervised learning mechanism which can be
adapted to solve supervised learning tasks in which
class membership is known for every training pattern
• Each node in an LVQ is associated with an arbitrarily
chosen class label.
• The number of nodes chosen for each class is roughly
proportional to the number of training patterns that
belong to that class.
• The weight update rule uses a learning rate η(t) which
is a function of time.
Algorithm
Initialize all weights to random values in the range [0, 1]
Repeat
  Adjust the learning rate η(t)
  For each input pattern ik in the training set, do
    Find the node j* whose weight vector wj* is closest to ik
    For l = 1, …, n, do
      Update the weight wj*,l as follows:
        if the class label of node j* equals the desired class of ik
        then  Δwj*,l = η(t) (ik,l − wj*,l)
        else  Δwj*,l = −η(t) (ik,l − wj*,l)
    end
  end
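One iteration of the update rule above can be sketched as follows. This is a minimal illustration, not the lecture's implementation; the codeword positions, class labels, and learning rate are made-up values.

```python
import numpy as np

def lvq1_step(weights, labels, x, x_class, eta):
    """One LVQ1 update: move the winning codeword toward x if its
    class label matches x_class, away from x otherwise."""
    j = int(np.argmin(np.linalg.norm(weights - x, axis=1)))  # winner j*
    sign = 1.0 if labels[j] == x_class else -1.0
    weights[j] += sign * eta * (x - weights[j])
    return j

# Two codewords with assumed class labels (illustrative values)
weights = np.array([[0.2, 0.9], [0.8, 0.1]])
labels = [0, 1]
j = lvq1_step(weights, labels, np.array([0.9, 0.2]), 1, 0.5)
# winner j = 1, same class, so w1 moves toward x: (0.85, 0.15)
```

In a full training loop this step would be repeated over all patterns while η(t) is gradually decreased, as in the algorithm above.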
Example:

The training set T contains 6 examples:

T = {(i1,1), (i2,0), (i3,0), (i4,0), (i5,0), (i6,1)}

Node A is associated with class 1, and the other two nodes with
class 2.

The learning rate is 0.5 until t = 6, then 0.25 until t = 12, and 0.1
thereafter.
1. Sample i1: winner w3; w3 changed to (0.95, 0.65, 0.60) (i1 is in class
1 and w3 in class 2, so the weight does not move towards i1)
2. Sample i2: winner w1; w1 changed to (0.30, 1.05, 0.45)
.
.
.

The association between input samples and weight vectors stabilizes by
the second cycle of pattern presentation, although the weight
vectors continue to change, converging approximately to the
centroids of the associated samples in 150 iterations.
net = newlvq(minmax(p), w, [.6 .4])   % MATLAB Neural Network Toolbox
LVQ and its problems
• LVQ crucially depends on the Euclidean metric and is thus
inappropriate for high-dimensional, heterogeneous, complex
data.
• LVQ is not stable for overlapping classes.
• LVQ is very sensitive to initialization.
• LVQ does not have a proper mathematical foundation.
Adding Conscience Mechanism
