
Object recognition

Digital Image Processing (EEE F435)


Ashish Chittora
Pattern and pattern classes
Pattern examples

• Heartbeat (ECG) signal
• Fingerprint pattern
• Speech signal
• Face pattern


Pattern arrangements
• Vectors (quantitative descriptors)
• Strings (structural descriptors)
• Trees (structural descriptors)
Pattern vectors
Decision-theoretic methods
Minimum distance classifier

For two classes with decision functions
d1(x) = 4.3x1 + 1.3x2 - 10.1
d2(x) = 1.5x1 + 0.3x2 - 1.17
the decision boundary is
d12(x) = d1(x) - d2(x) = 2.8x1 + 1.0x2 - 8.93 = 0
with d12(x) > 0 assigning x to class 1 and d12(x) < 0 assigning x to class 2.
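Below is a minimal NumPy sketch of this classifier (the function name and the test point are illustrative, not from the slides). The decision functions d1 and d2 above have exactly the form dj(x) = mj·x - 0.5·mj·mj used here, with class means m1 = (4.3, 1.3) and m2 = (1.5, 0.3).

```python
import numpy as np

# Class mean vectors, normally estimated from training samples; these are
# the two means implied by the decision functions d1 and d2 above.
means = [np.array([4.3, 1.3]), np.array([1.5, 0.3])]

def min_distance_classify(x, means):
    """Assign x to the class j maximizing d_j(x) = m_j . x - 0.5 * m_j . m_j,
    which is equivalent to picking the class whose mean is closest to x."""
    scores = [m @ x - 0.5 * (m @ m) for m in means]
    return int(np.argmax(scores))

print(min_distance_classify(np.array([4.0, 1.5]), means))  # -> 0 (class 1)
```

Maximizing mj·x - 0.5·mj·mj is the same as minimizing the Euclidean distance ||x - mj||, which is why this linear decision function implements the minimum distance rule.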
Matching by Correlation
• A template w is slid over the image f, and the correlation between w and the underlying image patch measures the match at each position; the best match occurs where the normalized correlation coefficient is maximum.
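The correlation figures and formulas on these slides did not survive extraction, so the following is a brute-force sketch of template matching with the normalized correlation coefficient, assuming grayscale NumPy arrays (names are illustrative; practical implementations use FFT-based correlation for speed):

```python
import numpy as np

def match_template(f, tmpl):
    """Slide template tmpl over grayscale image f and return the top-left
    (row, col) of the best match by the normalized correlation coefficient."""
    H, W = f.shape
    h, w = tmpl.shape
    tz = tmpl - tmpl.mean()                      # zero-mean template
    best, best_pos = -2.0, (0, 0)                # coefficient lies in [-1, 1]
    for i in range(H - h + 1):
        for j in range(W - w + 1):
            patch = f[i:i+h, j:j+w].astype(float)
            pz = patch - patch.mean()            # zero-mean image patch
            denom = np.sqrt((pz * pz).sum() * (tz * tz).sum())
            if denom > 0:
                gamma = (pz * tz).sum() / denom  # correlation coefficient
                if gamma > best:
                    best, best_pos = gamma, (i, j)
    return best_pos, best
```

Subtracting the means and dividing by the norms makes the score insensitive to uniform changes in brightness and contrast, which plain correlation is not.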
Other classification algorithms
• K-nearest neighbor
• K-means clustering
• Multi-layer perceptron (MLP)
• Support vector machine (SVM)
• Self-organizing maps (SOM)
• Neural networks
K-nearest neighbor (KNN)
• Step-1: Choose the number of neighbors, K.
• Step-2: Compute the Euclidean distance from the new data point to every training point.
• Step-3: Take the K training points nearest to the new point according to those distances.
• Step-4: Among these K neighbors, count the number of data points in each category.
• Step-5: Assign the new data point to the category with the largest count among the neighbors.
• Step-6: The model is ready (a sketch follows below).
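A minimal NumPy sketch of Steps 2-5 (function and variable names are illustrative, not from the slides):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    """Classify x_new by majority vote among its k nearest training points."""
    dists = np.linalg.norm(X_train - x_new, axis=1)   # Step-2: distance to each point
    nearest = np.argsort(dists)[:k]                   # Step-3: k closest indices
    votes = Counter(y_train[i] for i in nearest)      # Step-4: count labels per category
    return votes.most_common(1)[0][0]                 # Step-5: majority category
```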
K-means clustering
• Unsupervised learning algorithm
• Requires the number of clusters, K, as input.
• Step-1: Choose the number of clusters, K.
• Step-2: Select K random points as the initial centroids (they need not come from the input dataset).
• Step-3: Assign each data point to its closest centroid, forming the K clusters.
• Step-4: Recompute the centroid of each cluster as the mean of its assigned points.
• Step-5: Repeat Step-3, i.e., reassign each data point to the new closest centroid.
• The two stages correspond to the E (expectation) and M (maximization) steps of the EM algorithm
  – Expectation: which cluster does each point belong to?
  – Maximization: what is the maximum-likelihood estimate of each mean?
• Step-6: If any reassignment occurred, go back to Step-4; otherwise FINISH.
• Step-7: The model is ready (a sketch follows below).
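A minimal NumPy sketch of the steps above (names are illustrative; it also assumes no cluster ever becomes empty):

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain K-means: returns final centroids and per-point cluster labels."""
    rng = np.random.default_rng(seed)
    # Step-2: pick k random data points as the initial centroids.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Steps 3/5 (E-step): assign each point to its closest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Step-4 (M-step): move each centroid to the mean of its cluster.
        new_centroids = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        # Step-6: stop when the centroids no longer move.
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels
```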
K-means steps
Limitations of K-means clustering
• Every data point is assigned to one and only one cluster
• A point may be equidistant from two cluster centers
• A probabilistic approach would instead make a 'soft' assignment of data points, reflecting the level of uncertainty
K-means in image segmentation
• Goal: partition the image into regions
  – each of which has a homogeneous visual appearance
  – or corresponds to objects, or parts of objects
• Each pixel is a point in RGB space
• K-means clustering is used with a palette of K colors
• The method does not take into account the proximity of different pixels (see the sketch below)
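A sketch of this color-quantization segmentation, reusing the kmeans() function from the earlier sketch (the palette size is illustrative):

```python
import numpy as np

def segment_image(img, k=8):
    """Quantize an H x W x 3 image to a palette of k colors with K-means.
    Each pixel is treated as an independent point in RGB space, so pixel
    proximity is ignored, exactly as the slide notes."""
    pixels = img.reshape(-1, 3).astype(float)
    centroids, labels = kmeans(pixels, k)   # kmeans() from the sketch above
    return centroids[labels].reshape(img.shape).astype(img.dtype)
```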
Support vector classification
• An SVM picks the best separating hyperplane according to some criterion
  – e.g., maximum margin
• Training is an optimization process
• The training set is effectively reduced to a relatively small number of support vectors
Feature spaces
• We may separate data by mapping it to a higher-dimensional feature space
  – The feature space may even have an infinite number of dimensions!
• We need not explicitly construct the new feature space
Kernels
• We may use kernel functions to implicitly map to a new feature space
• Kernel function: K(x, y) = φ(x) · φ(y), where φ is the (implicit) feature map
• The kernel must be equivalent to an inner product in some feature space (an example follows below)
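The slides do not name a particular kernel, so as one common example, the Gaussian (RBF) kernel below is equivalent to an inner product in an infinite-dimensional feature space, yet it never constructs that space:

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel K(x, y) = exp(-gamma * ||x - y||^2).
    Evaluates an inner product in an infinite-dimensional feature space
    using only the two original input vectors."""
    return np.exp(-gamma * np.sum((x - y) ** 2))
```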
Linear separator
Selecting best separator
Support vector classification
• Find the closest points between the convex hulls of the two classes
• The separating plane bisects the segment joining those closest points
Classification margin
Support vector classification
• Maximum margin
Support vector classification
• The misclassification error and the complexity of the function together bound the generalization error
• Maximizing the margin minimizes the complexity term
• This "eliminates" over-fitting
• The solution depends only on the support vectors, not on the number of attributes
Linear SVM
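The equations on the Linear SVM slides did not survive extraction. As an illustration only, here is a scikit-learn sketch of a maximum-margin linear classifier (the data and parameters are invented for the example; the slides do not specify an implementation):

```python
import numpy as np
from sklearn.svm import SVC

# Illustrative 2-D pattern vectors: two linearly separable classes.
X = np.array([[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],
              [6.0, 5.0], [7.0, 8.0], [8.0, 6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1.0)    # maximum-margin linear classifier
clf.fit(X, y)                        # training is an optimization

print(clf.support_vectors_)          # the training set reduces to these points
print(clf.predict([[4.0, 4.0]]))     # classify a new pattern vector
```

Note that only the support vectors appear in the fitted model, matching the earlier point that the solution depends on them alone.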
Self study
• Introduction to Neural networks
Thank you
