PRML Test 2
IIT Hyderabad
EE5610/AI5000 - PRML
1
The least square loss is sensitive to outliers, and hence robust regression methods are of interest. The problem with the least square loss in the presence of outliers (i.e. when the noise term $e_n$ can be arbitrarily large) is that it weighs each observation equally when estimating the parameters. Robust methods, on the other hand, allow the observations to be weighted unequally. More specifically, observations that produce large residuals are down-weighted by a robust estimation method.
In this problem, you will assume that the $e_n$ are independent and identically distributed according to a Laplacian distribution rather than a Gaussian. That is, each $e_n \sim \mathrm{Laplace}(0, b)$, with density
$$p(e_n) = \frac{1}{2b}\exp\!\left(-\frac{|e_n|}{b}\right).$$
(a) Provide the loss function $J_L(\mathbf{w})$ whose minimization is equivalent to finding the ML estimate under the Laplacian noise model.
(b) Suggest a method for optimizing the objective function in (a), and write the update equations.
(c) Why do you think the above model provides a more robust fit to the data than the standard model, which assumes a Gaussian distribution for the noise terms?
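As a concrete illustration of the ideas in parts (a) and (b), the following minimal sketch fits a linear model under the Laplacian noise assumption by minimizing the sum of absolute residuals with iteratively reweighted least squares (IRLS); IRLS is only one possible optimizer (subgradient descent on the absolute-value loss would be another). The basis expansion `phi`, the damping constant `eps`, and the synthetic data are assumptions introduced here for illustration and are not part of the problem statement.

```python
import numpy as np

def phi(x):
    # Assumed basis expansion [1, x] (a hypothetical choice for illustration).
    return np.stack([np.ones_like(x), x], axis=1)

def fit_laplace_irls(x, t, n_iters=50, eps=1e-6):
    """Minimize J_L(w) = sum_n |t_n - w^T phi(x_n)| via IRLS.

    Each iteration solves a weighted least-squares problem in which
    observations with large residuals receive small weights, which is
    what makes the fit robust to outliers.
    """
    Phi = phi(x)
    w = np.linalg.lstsq(Phi, t, rcond=None)[0]  # least-squares initialisation
    for _ in range(n_iters):
        r = np.abs(t - Phi @ w)
        weights = 1.0 / np.maximum(r, eps)      # down-weight large residuals
        W = np.diag(weights)
        w = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ t)
    return w

# Synthetic example with one gross outlier (values are illustrative only).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 20)
t = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(20)
t[-1] += 10.0                                   # outlier
print("robust fit   :", fit_laplace_irls(x, t))
print("least squares:", np.linalg.lstsq(phi(x), t, rcond=None)[0])
```

On this kind of data the IRLS fit stays close to the true line, while the least-squares fit is pulled toward the outlier, which is the behaviour part (c) asks you to explain.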
2
Consider the model
$$p(\mathbf{t}\mid \mathbf{x}, \mathbf{W}) = \prod_{k=1}^{K} \mathcal{N}\!\left(t_k \mid \mathbf{w}_k^{T}\boldsymbol{\phi}_k(\mathbf{x}),\ \beta_k^{-1}\right),$$
where $\mathbf{w}_k$ can be estimated, using the method discussed in class, from the $k$-th dimension of the data. Suppose we have the following training data:
x    t
0    [−1 −1]^T
0    [−1 −2]^T
0    [−2 −1]^T
1    [1 1]^T
1    [1 2]^T
1    [2 1]^T
Let each input $x_n$ be embedded into a 2-D space using the following basis functions:
$$\hat{\mathbf{t}}_n = \mathbf{W}^{T}\boldsymbol{\phi}(x_n)$$
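Because the likelihood above factorizes over the $K$ output dimensions, $\mathbf{W}$ can be found by solving an independent least-squares problem for each column $\mathbf{w}_k$. The sketch below illustrates this on the training data in the table; the basis expansion $\boldsymbol{\phi}(x) = [1, x]^T$ and the ML-style estimate of the precisions $\beta_k$ are assumed placeholders, not the basis functions specified in the problem.

```python
import numpy as np

# Training data from the table above: scalar inputs x_n and 2-D targets t_n.
x = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
T = np.array([[-1, -1],
              [-1, -2],
              [-2, -1],
              [ 1,  1],
              [ 1,  2],
              [ 2,  1]], dtype=float)

def phi(x):
    # Placeholder basis expansion [1, x]; substitute the problem's own
    # basis functions here.
    return np.stack([np.ones_like(x), x], axis=1)

Phi = phi(x)                                # design matrix, shape (N, M)

# The likelihood factorizes over output dimensions, so each column w_k is
# the ordinary least-squares solution for the k-th target dimension.
W = np.linalg.lstsq(Phi, T, rcond=None)[0]  # shape (M, K)

# Per-dimension noise precisions beta_k from the mean squared residuals
# (assumed ML-style estimate, shown only for illustration).
residuals = T - Phi @ W
beta = 1.0 / np.mean(residuals**2, axis=0)

print("W =\n", W)
print("beta =", beta)
print("prediction at x = 0.5:", phi(np.array([0.5])) @ W)
```

The prediction line in the sketch corresponds to the equation above, $\hat{\mathbf{t}}_n = \mathbf{W}^{T}\boldsymbol{\phi}(x_n)$, evaluated at a new input.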