Homework 1: 1. Solve the following problems from Chapter 2 of the textbook: 7, 12, 13, 31, 38
and the two matrices

    [ 1  0 ]         [ 1  1 ]
    [ 0  1 ]  ,      [ 1  2 ]  .
4. Consider the following class-conditional density function for feature vector X = (x1, x2)^T:

    p(X) ~ N(μ, Σ),

where

    μ = ( μ1 )        Σ = ( σ1^2   σ12  )
        ( μ2 ) ;          ( σ21    σ2^2 ) ;      σ12 = σ21.
(a) Write down the expression for the Euclidean distance between point X and mean vector μ.
(b) Write down the expression for the Mahalanobis distance between point X and mean vector
μ. Simplify this expression by expanding the quadratic term.
(c) How does the expression for the Mahalanobis distance simplify when Σ is a diagonal matrix?
(d) Compare the expressions in (4a) and (4b). How does the Mahalanobis distance differ from
the Euclidean distance? When are the two distances equal? When is it more appropriate to
use the Mahalanobis distance? Support your answers with illustrations.
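As a quick numerical illustration of parts (a), (b), and (d), the sketch below compares the two distances; the mean, covariance, and test point are arbitrary values chosen for illustration, not taken from the problem:

```python
import numpy as np

# Arbitrary illustration values (not from the problem statement).
mu = np.array([0.0, 0.0])
sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])  # assumed covariance matrix Σ
x = np.array([1.0, 2.0])

# Euclidean distance: sqrt((x - μ)^T (x - μ)).
d_euclid = np.sqrt((x - mu) @ (x - mu))

# Mahalanobis distance: sqrt((x - μ)^T Σ^{-1} (x - μ)).
d_mahal = np.sqrt((x - mu) @ np.linalg.inv(sigma) @ (x - mu))

# When Σ is the identity matrix the two distances coincide.
d_mahal_id = np.sqrt((x - mu) @ np.linalg.inv(np.eye(2)) @ (x - mu))

print(d_euclid, d_mahal, d_mahal_id)
```

Here the Mahalanobis distance rescales each direction by the spread of the data in that direction, so a point far out along a high-variance axis counts as "closer" than the same Euclidean offset along a low-variance axis.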
5. The class-conditional density functions of a binary random variable X for four pattern classes
are shown below:
The loss function is as follows, where action i means “decide pattern class i”:

                 class 1   class 2   class 3   class 4
    action 1        0         2         3         4
    action 2        1         0         1         8
    action 3        3         2         0         2
    action 4        5         3         1         0
(a) A general decision rule d(x) tells us which action to take for every possible observation x.
Construct a table defining all possible decision rules for the above problem. As an example,
one of the possible decision rules is:
x = 1, take action 1
x = 2, take action 2
(b) Compute the risk function Rd(ω) for all the decision rules, where

    Rd(ω) = Σx L(ω, d(x)) p(x | ω),

and the overall risk

    Rd = Σω P(ω) Rd(ω).
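The enumeration in parts (a) and (b) can be sketched as follows. The loss matrix is taken from the problem, but the class-conditional probabilities p(x | ω) are placeholder values, since the density table for problem 5 is not reproduced here:

```python
import itertools

# Loss matrix from the problem statement: L[action][class].
L = [[0, 2, 3, 4],
     [1, 0, 1, 8],
     [3, 2, 0, 2],
     [5, 3, 1, 0]]

# PLACEHOLDER class-conditional probabilities p(x | omega); the actual
# values come from the density table in problem 5. p[x][w], with
# x in {0, 1} standing for the observations x = 1, 2.
p = [[0.6, 0.3, 0.5, 0.2],
     [0.4, 0.7, 0.5, 0.8]]

# A decision rule assigns one of the 4 actions to each of the 2
# possible observations, so there are 4^2 = 16 rules in total.
rules = list(itertools.product(range(4), repeat=2))
print(len(rules))  # 16

# Conditional risk Rd(omega) = sum over x of L(omega, d(x)) p(x | omega).
risks = {d: [sum(L[d[x]][w] * p[x][w] for x in range(2)) for w in range(4)]
         for d in rules}
```

For instance, the constant rule d = (0, 0) always takes action 1, so its conditional risk for each class is just that action's row of the loss matrix (the probabilities sum to one over x).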
* Note: You may use the MATLAB package to generate multivariate Gaussian patterns. Use
the “mvnrnd” (multivariate normal random point generator) and “plot” commands in MATLAB
to generate and plot the data. Type “help mvnrnd” and “help plot” to learn more about these
commands.
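If Python is preferred over MATLAB, a rough equivalent of mvnrnd is NumPy's multivariate normal generator; the mean, covariance, and sample count below are arbitrary illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustration values for the mean and covariance.
mu = np.array([1.0, 0.0])
sigma = np.array([[1.0, 0.0],
                  [0.0, 1.0]])

# Analogue of MATLAB's mvnrnd(mu, sigma, 500).
samples = rng.multivariate_normal(mu, sigma, size=500)
print(samples.shape)  # (500, 2)

# Analogue of MATLAB's plot command (uncomment to view):
# import matplotlib.pyplot as plt
# plt.plot(samples[:, 0], samples[:, 1], '.')
# plt.show()
```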