Ahsanullah University of Science and Technology (AUST)
Department of Computer Science and Engineering (CSE)
List of Experiments
Experiment 1: Designing a minimum distance to class mean classifier.
Experiment 2: Implementing the Perceptron algorithm for finding the weights of
a Linear Discriminant function.
Experiment 3: Designing a Minimum Error Rate classifier.
Experiment 4: Density Estimation using Parzen Window.
Experiment 5: Implementing a feature selection system using PCA.
Experiment 6: Implementing a feature transformation system using LDA.
Text Book:
1. Pattern Classification (2nd Edition) by R. O. Duda, P. E. Hart and D. G. Stork, Wiley, 2002.
Course Overview
1. Goals:
2. Course Content:
The course will cover both basic and advanced topics in pattern classification, including
discriminant functions, linear/non-linear classifiers, parametric/non-parametric techniques, and
feature selection and transformation techniques. It will cover the theory behind the taught
algorithms/techniques as well as their behaviour on different scenarios and data sets.
Students will therefore not only implement the different techniques but also learn their
motivation, working principles, and impact, and know when to use each. During the course they
will also learn to code in MATLAB. At the end of the semester, students will have an
understanding of the different components of a general pattern recognition system.
3. Course Organization:
Students will carry out their experiments individually. Each week a task will be given,
and every student will be asked to implement and discuss their own assignment.
There are therefore two sorts of jobs to complete.
i. Lab Implementation.
ii. Report Writing.
In the implementation part, students have to implement the given algorithm on suitable data.
In report writing, they have to write a formal report on the given task, with findings
and discussion sections that must highlight the working principles and the effects of changing
parameters or data sets. Finally, all students will attend a final test and face a viva board
during the last week of the semester.
Experiment No 1
A. Sessional Task:
Given the following two-class set of prototypes
1. Plot all sample points from both classes; samples from the same class should have
the same color and marker.
2. Using a minimum distance classifier with respect to 'class mean', classify the following
points by plotting them with the designated class color but a different marker.
X1 = (-1,-1)
X2 = ( 3, 2)
X3 = (-2, 1)
X4 = ( 8, 2)
Linear Discriminant Function: $g_i(X) = X^T \bar{Y}_i - \frac{1}{2}\,\bar{Y}_i^T \bar{Y}_i$, where $\bar{Y}_i$ is the mean of class i.
3. Draw the decision boundary between the two classes.
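A minimal MATLAB sketch of tasks 1–3 follows. Since the prototype table did not survive in this copy, the w1 and w2 matrices below are placeholders and should be replaced with the given data.

% Minimum distance to class mean classifier (sketch).
% w1 and w2 are placeholder prototypes; substitute the given data.
w1 = [2 2; 3 1; 4 3];            % class 1 prototypes (one point per row)
w2 = [-2 -2; -1 -3; -3 -1];      % class 2 prototypes

plot(w1(:,1), w1(:,2), 'bo'); hold on;
plot(w2(:,1), w2(:,2), 'rs');

m1 = mean(w1); m2 = mean(w2);    % class means

X = [-1 -1; 3 2; -2 1; 8 2];     % test points X1..X4
for k = 1:size(X,1)
    x = X(k,:)';
    % g_i(x) = x'*m_i - 0.5*m_i*m_i'
    g1 = x'*m1' - 0.5*(m1*m1');
    g2 = x'*m2' - 0.5*(m2*m2');
    if g1 > g2
        plot(x(1), x(2), 'b*');  % class 1 color, different marker
    else
        plot(x(1), x(2), 'r*');  % class 2 color, different marker
    end
end

% Decision boundary g1(x) = g2(x): the perpendicular bisector of the
% segment joining the two class means, written as w'*x = b.
w = (m1 - m2)';                        % boundary normal
b = 0.5*(m1*m1' - m2*m2');             % offset
xr = -4:0.1:9;
plot(xr, (b - w(1)*xr)/w(2), 'k-');    % solve w'*x = b for x2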
Experiment No 2
A. Sessional Task:
You are given the following sample points in a 2-class problem:
ω1 = {(1,1), (1,-1), (4,5)}
ω2 = {(2,2.5), (0,2), (2,3)}
1. Plot all sample points from both classes; samples from the same class should have
the same color and marker. Observe whether these two classes can be separated with a
linear boundary.
2. Consider the case of a second-order polynomial discriminant function. Recast the
problem of finding such a nonlinear discriminant function as the problem of finding a
linear discriminant function for a set of sample points of higher dimension. Generate
the high-dimensional sample points. [Hint: Φ-machine.]
3. Use the Perceptron algorithm (one sample at a time) to find the weight coefficients of the
discriminant function boundary for your linear classifier in question 2, as sketched below.
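A minimal sketch of the fixed-increment single-sample perceptron on Φ-transformed points. The quadratic mapping (x1, x2) → (x1², x2², x1·x2, x1, x2, 1) and the unit learning rate are assumptions; any equivalent Φ-machine works.

% Single-sample perceptron on phi-transformed samples (sketch).
w1 = [1 1; 1 -1; 4 5];           % class 1 samples
w2 = [2 2.5; 0 2; 2 3];          % class 2 samples

% Phi-machine: quadratic mapping to 6 dimensions.
phi = @(p) [p(1)^2, p(2)^2, p(1)*p(2), p(1), p(2), 1];

Y = zeros(6,6);
for k = 1:3, Y(k,:)   =  phi(w1(k,:)); end   % class 1 kept as-is
for k = 1:3, Y(k+3,:) = -phi(w2(k,:)); end   % class 2 negated ("normalized")

a = zeros(1,6);                  % weight vector
eta = 1;                         % learning rate (assumed)
for epoch = 1:1000               % cap in case of non-separability
    errs = 0;
    for k = 1:size(Y,1)
        if a*Y(k,:)' <= 0        % misclassified sample
            a = a + eta*Y(k,:);  % fixed-increment update
            errs = errs + 1;
        end
    end
    if errs == 0, break; end     % converged: every sample on the correct side
end
disp(a)                          % coefficients of the quadratic discriminant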
B. Exercise:
You are given the following sample points in a 3-class problem:
ω1 = {(0,2), (1,3), (1,1)}
ω2 = {(1.5,0), (0,-1), (2,-1)}
ω3 = {(4,0), (4,3), (3,1)}
Use the Perceptron algorithm (one sample at a time) to find the weight coefficients of the
discriminant function boundaries for your linear classifier. Draw the decision boundaries
between the classes.
Experiment No 3
A. Sessional Task:
Design a Minimum Error Rate Classifier for a two-class problem with the data given
below:
% Class 1: evaluate the class-conditional density p(x|w1) on a grid and plot it.
x1 = -7:.2:7; x2 = -7:.2:7;
[X1,X2] = meshgrid(x1,x2);
mu1 = [0 0];                % mean vector of class 1
Sigma1 = [.25 .3; .3 1];    % covariance matrix of class 1
F1 = mvnpdf([X1(:) X2(:)],mu1,Sigma1);
F1 = reshape(F1,length(x2),length(x1));
%surfc(x1,x2,F1);
meshc(X1,X2,F1);            % mesh surface with contours projected below
axis([-7 7 -7 7 -1.0 .6])
xlabel('x1'); ylabel('x2'); zlabel('Probability Density');
hold on;
% Class 2: evaluate p(x|w2) on the same grid and add it to the figure.
mu2 = [2 2];                % mean vector of class 2
Sigma2 = [.5 .0; 0 .5];     % covariance matrix of class 2
F2 = mvnpdf([X1(:) X2(:)],mu2,Sigma2);
F2 = reshape(F2,length(x2),length(x1));
%surfc(x1,x2,F2);
meshc(X1,X2,F2);
axis([-7 7 -7 7 -1.0 .6])
xlabel('x1'); ylabel('x2'); zlabel('Probability Density');
caxis([min(F2(:))-.5*range(F2(:)),max(F2(:))]);
plot3(1,1,-1.0,'rx');       % mark the query point (1,1) on the base plane
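The script above only plots the two class-conditional densities; the classifier itself still has to compare the posteriors. A minimal continuation sketch, assuming equal priors P(ω1) = P(ω2) = 0.5 (the priors are an assumption; adjust if they are given):

% Minimum error rate decision: pick the class with the larger P(wi)*p(x|wi).
P1 = 0.5; P2 = 0.5;                        % assumed priors
x = [1 1];                                 % the query point marked above
g1 = log(mvnpdf(x,mu1,Sigma1)) + log(P1);  % discriminant g1(x)
g2 = log(mvnpdf(x,mu2,Sigma2)) + log(P2);  % discriminant g2(x)
if g1 > g2
    disp('x belongs to class 1');
else
    disp('x belongs to class 2');
end

% Decision boundary over the grid: zero level set of P1*p(x|w1) - P2*p(x|w2).
contour(X1, X2, P1*F1 - P2*F2, [0 0], 'k', 'LineWidth', 2);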
Experiment No 4
A. Sessional Task:
For the following samples in a one-dimensional problem:
x1, x2, …, x16 = 0, 1, 3, 4.5, 5.5, 6.0, 6.5, 7.0, 7.2, 7.5, 8.0, 8.8, 9.2, 9.3, 11, 13
Give the values of the k-nearest neighbor estimate p_j(x), for j = 16 and k_j = √j, for the
range x = 0 to 13.
For example, when centered at x = 2, the fourth nearest neighbor lies at x = 4.5, so V_j = 2 × 2.5 = 5, and
$p_j(x) = (k_j / j) / V_j = (4/16)/5 = 0.05$
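A minimal MATLAB sketch of this estimator (here k_j = √16 = 4, and the window is the interval spanned by the k_j-th nearest neighbor, as in the worked example):

% k-nearest-neighbor density estimate (sketch).
xs = [0 1 3 4.5 5.5 6.0 6.5 7.0 7.2 7.5 8.0 8.8 9.2 9.3 11 13];
j  = numel(xs);                 % 16 samples
kj = round(sqrt(j));            % k_j = sqrt(j) = 4

x = 0:0.1:13;                   % evaluation range
p = zeros(size(x));
for m = 1:numel(x)
    d  = sort(abs(xs - x(m))); % distances to all samples, ascending
    Vj = 2*d(kj);              % window length: twice the distance to the k-th neighbor
    p(m) = (kj/j) / Vj;        % p_j(x) = (k_j/j) / V_j
end
plot(x, p); xlabel('x'); ylabel('p_{16}(x)');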
B. Exercise
For the following samples in a one-dimensional problem:
{xi = -7, -5, -4, -3, -2, -2.1, 0, 1.1, 2, 3, 4, 4.5, 4.6, 4.8, 5, 6}
Plot the Parzen Window estimates p_n(x), for n = 16 and h_n = h_j/√n, for the range x = -10 to 10
(the step interval should be 0.1 and h_j = 1).
When centered at x, the estimated density value is
$p_n(x) = \frac{1}{n} \sum_{i=1}^{n} \frac{1}{h_n}\,\varphi\!\left(\frac{x - x_i}{h_n}\right)$
Your window function φ is a zero-mean, unit-variance normal density function.
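A minimal MATLAB sketch of the estimator with the Gaussian window (taking n = 16 to match the sample count above):

% Parzen window density estimate with a Gaussian window (sketch).
xs = [-7 -5 -4 -3 -2 -2.1 0 1.1 2 3 4 4.5 4.6 4.8 5 6];
n  = numel(xs);                 % 16 samples
hj = 1;
hn = hj/sqrt(n);                % window width h_n = h_j/sqrt(n)

x = -10:0.1:10;
p = zeros(size(x));
for m = 1:numel(x)
    u = (x(m) - xs)/hn;                           % scaled distances to all samples
    p(m) = sum(exp(-u.^2/2)/sqrt(2*pi))/(n*hn);   % average of the kernel values
end
plot(x, p); xlabel('x'); ylabel('p_n(x)');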
Experiment No 5
A. Sessional Task:
Today's task is to carry out Principal Component Analysis (PCA) on a set of face
images. The basic task is to learn the use of the MATLAB function princomp() to obtain
the eigenvectors and eigenvalues. Reducing the original feature dimension is the main
objective.
1. Read a set of face images and find the newly transformed face vectors based on PCA.
Reconstruct the face images from the transformed face vectors.
i. Read a set of input face images and generate the face feature (column) vectors. Each
image is a 2D array of intensity values, i.e., a 2D matrix. To prepare the
column vector representation, all row values should be stored as a column,
appended one after another.
For example: if face1 = [a b; c d], then its column vector is [a; b; c; d].
ii. Prepare the matrix containing all the face vectors, where each row contains a face
vector (but transposed).
For example: X = [face1'; face2'] = [a b c d; p q r s]
iii. Compute the eigenvectors (evec) and eigenvalues (eval). Discard unnecessary
eigenvectors based on the eigenvalues. [Show the eigenfaces: each evec vector should
be rearranged as a 2D matrix like the input image.]
Use function: [COEFF,SCORE,latent] = princomp(X); See help for details.
2. The computation of the eigenvectors and eigenvalues should also be done with the cov()
and eig() functions.
filepre='imgset/img_';
s=num2str(i); % i is the image number.
impath=strcat(filepre,s,'.bmp');
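A minimal sketch of step 2 using cov() and eig(), assuming N grayscale images named as in the snippet above (N, the number of kept components m, and the image size are placeholders):

% PCA via cov() and eig() on face vectors (sketch).
N = 10;                              % assumed number of images
filepre = 'imgset/img_';
X = [];
for i = 1:N
    impath = strcat(filepre, num2str(i), '.bmp');
    I = double(imread(impath));      % assumes grayscale (2D) images
    I = I';                          % transpose so row values append as a column, as in the example
    X = [X; I(:)'];                  % each row of X is one (transposed) face vector
end

C = cov(X);                          % covariance of the face vectors (d-by-d)
[evec, eval] = eig(C);
[eval, idx] = sort(diag(eval), 'descend');  % order by decreasing eigenvalue
evec = evec(:, idx);

m  = 5;                              % number of components kept (assumed)
W  = evec(:, 1:m);                   % retained eigenvectors (eigenfaces)
mu = mean(X);
S    = (X - repmat(mu, N, 1)) * W;   % transformed (reduced) face vectors
Xrec = S * W' + repmat(mu, N, 1);    % reconstruction from the reduced vectors

Note that for full-size images the d-by-d covariance matrix becomes very large; small images (or the princomp() route from step 1) keep this sketch tractable.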
Experiment No 6
A. Sessional Task:
Today's task is to carry out Linear Discriminant Analysis (LDA) on a set of input data
and to learn the basic equations associated with LDA.
Compute the Linear Discriminant projection for the following two-dimensional, two-class
dataset:
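The two class matrices did not survive in this copy, so the sketch below uses placeholder data; the method is Fisher's criterion, $w = S_W^{-1}(m_1 - m_2)$:

% Fisher linear discriminant projection (sketch).
% X1 and X2 are placeholders; substitute the given classes (one sample per row).
X1 = [4 1; 2 4; 2 3; 3 6; 4 4];
X2 = [9 10; 6 8; 9 5; 8 7; 10 8];

m1 = mean(X1)'; m2 = mean(X2)';        % class mean (column) vectors

% Within-class scatter S_W = S1 + S2.
A1 = X1 - repmat(m1', size(X1,1), 1);  % centered class 1
A2 = X2 - repmat(m2', size(X2,1), 1);  % centered class 2
Sw = A1'*A1 + A2'*A2;

w = Sw \ (m1 - m2);                    % optimal projection direction
w = w / norm(w);                       % normalize for plotting/comparison

y1 = X1 * w;                           % 1-D projections of class 1
y2 = X2 * w;                           % 1-D projections of class 2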