Assignment 4
All questions carry equal marks but are not equally difficult, so you may want to strategize accordingly.
Problem 1 Let the conditional distribution of observation Y given a real random parameter Θ = θ > 0 be:
Problem 2 Let the conditional distribution of observation Y given a real random parameter Λ = λ ≥ 0 be
f(y | λ) ∼ Poisson(λ). The distribution of Λ is given by

f(λ) = (e^{−λ} λ^{α−1} / Γ(α)) · 1{λ ≥ 0},

where α > 0 is a known constant, and Γ(·) is the Gamma function defined as

Γ(x) = ∫_0^∞ z^{x−1} e^{−z} dz.
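As a sanity check on this setup, the sketch below (illustrative only; the value of α and all names are my own choices) verifies the Gamma-function integral numerically and simulates the hierarchy f(λ) f(y | λ); by iterated expectation, E[Y] = E[Λ] = α.

```python
# Illustrative sanity check (not part of the assignment): alpha and all
# names here are assumed values for the sketch.
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as Gamma

alpha = 2.5  # assumed value for the known constant alpha > 0

# Verify Gamma(alpha) = integral_0^infty z^(alpha-1) e^(-z) dz numerically.
val, _ = quad(lambda z: z**(alpha - 1) * np.exp(-z), 0, np.inf)
print(val, Gamma(alpha))  # the two values should agree to quad's tolerance

# Simulate the hierarchy: Lambda ~ Gamma(alpha, 1), then Y | Lambda ~ Poisson(Lambda).
rng = np.random.default_rng(0)
lam = rng.gamma(shape=alpha, scale=1.0, size=100_000)
y = rng.poisson(lam)
print(y.mean(), alpha)  # by iterated expectation, E[Y] = E[Lambda] = alpha
```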
Problem 3 Let Y_i ∼ Unif[1, θ], i = 1, 2, · · · , N, be i.i.d. observations, and let the parameter θ ∼ Unif[1, 10].
(a) Find θ̂_MAP(y).
(b) Show that θ̂_MAP(y) converges in distribution to the true value of θ as N → ∞.
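To verify a derived answer for part (a), the MAP estimate can be located numerically. The sketch below (θ, N, and the grid are my own illustrative choices) evaluates the unnormalized log posterior on a grid over the feasible range of θ and reports its maximizer.

```python
# Illustrative numerical check (theta_true, N, and the grid are my own choices):
# locate the MAP estimate by grid search so a closed-form answer can be verified.
import numpy as np

rng = np.random.default_rng(1)
theta_true, N = 7.0, 50
y = rng.uniform(1.0, theta_true, size=N)  # Y_i ~ Unif[1, theta]

# The posterior is supported on [max(y_i), 10]; the prior is flat there,
# so the log posterior is -N*log(theta - 1) up to a constant.
grid = np.linspace(y.max(), 10.0, 100_000)
log_post = -N * np.log(grid - 1.0)
print("grid-search MAP:", grid[np.argmax(log_post)])
```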
Problem 4 Let y_i = a + w_i for i = 1, 2, · · · , N, where the w_i are i.i.d. and independent of a, with w_i ∼ N(0, σ_w²). Further, a ∼ N(0, σ_a²).
(a) Compute â_MAP(y), â_MMSE(y) and their associated error covariances.
(b) Consider an alternate sequential procedure to compute the MMSE estimate.
• Suppose you have obtained j measurements y_1, y_2, · · · , y_j. Compute the MMSE estimate of a based on these j measurements, denote it by â_j = â_MMSE(y_1, · · · , y_j), and denote the corresponding error covariance by σ_j².
• Express â_j as a function of â_{j−1}, σ_{j−1}², and y_j.
• Show that 1/σ_j² = 1/σ_a² + j/σ_w². (A numerical sanity check is sketched after this list.)
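The sketch below is one way to sanity-check the claimed variance identity without the requested sequential derivation: it computes the batch error covariance σ_j² by direct joint-Gaussian conditioning (the variance values are assumed) and compares 1/σ_j² against 1/σ_a² + j/σ_w².

```python
# Illustrative check of the claimed identity (variances are assumed values).
# The batch error covariance sigma_j^2 is computed by direct joint-Gaussian
# conditioning, which sidesteps the sequential derivation asked for above.
import numpy as np

sigma_a2, sigma_w2 = 2.0, 0.5
for j in range(1, 6):
    ones = np.ones((j, 1))
    Ry = sigma_a2 * ones @ ones.T + sigma_w2 * np.eye(j)  # Cov(y_1..y_j)
    Ray = sigma_a2 * ones.T                                # Cov(a, y)
    sigma_j2 = sigma_a2 - (Ray @ np.linalg.solve(Ry, Ray.T)).item()
    print(j, 1 / sigma_j2, 1 / sigma_a2 + j / sigma_w2)    # last two columns match
```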
Problem 5 Let x = [x_1, x_2]^T ∼ N(0, R_x).
(a) Compute the MMSE estimate x̂_2(x_1) of x_2 based on x_1.
(b) Compute the associated MMSE cost in part (a) and prove that it is zero if and only if R_x is singular.
(c) Extend the result to show that if x ∼ N(0, R_x) is an N-dimensional random vector and R_x is not positive definite, then some component of x can be perfectly estimated by a linear combination of the other components.
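To see the singularity claim of part (b) concretely, the sketch below (the correlation values are my own) evaluates the Gaussian conditional variance of x_2 given x_1 for a full-rank and a rank-one R_x:

```python
# Illustrative check of part (b) (the correlation values are my own): the
# conditional variance of x2 given x1 under joint Gaussianity is
# Rx[1,1] - Rx[1,0]^2 / Rx[0,0], which hits zero exactly when Rx is singular.
import numpy as np

for rho in (0.5, 1.0):  # rho = 1.0 makes Rx rank-one, hence singular
    Rx = np.array([[1.0, rho],
                   [rho, 1.0]])
    cost = Rx[1, 1] - Rx[1, 0] ** 2 / Rx[0, 0]
    print(f"rho={rho}: det(Rx)={np.linalg.det(Rx):.3f}, MMSE cost={cost:.3f}")
```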
Problem 6
(a) Let vector parameters β, θ_1 and θ_2 be related as β = A_1 θ_1 + A_2 θ_2 + c, where A_1, A_2 and c are known. Show that β̂_LLS = A_1 θ̂_{1,LLS} + A_2 θ̂_{2,LLS} + c.
(b) Show that if y_1 and y_2 are orthogonal, then x̂_LLS(y_1, y_2) = x̂_LLS(y_1) + x̂_LLS(y_2).
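A quick Monte Carlo check of part (b) is sketched below (the coefficients, sample size, and the helper lls are my own): for zero-mean, uncorrelated y_1 and y_2, the joint LLS estimate should coincide, up to sampling error, with the sum of the single-observation estimates.

```python
# Illustrative Monte Carlo check of part (b) (coefficients, sample size, and
# the helper lls are my own): for zero-mean, uncorrelated y1 and y2, the joint
# LLS estimate should split into the sum of the single-observation estimates.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
y1 = rng.standard_normal(n)
y2 = rng.standard_normal(n)                    # uncorrelated with y1
x = 0.7 * y1 - 0.3 * y2 + 0.2 * rng.standard_normal(n)

def lls(x, Y):
    """LLS estimate of zero-mean x from the rows of Y: Cov(x,Y) Cov(Y)^-1 Y."""
    Ryy = np.atleast_2d(np.cov(Y))
    Rxy = np.array([np.cov(x, Yi)[0, 1] for Yi in Y])
    return np.linalg.solve(Ryy, Rxy) @ Y

joint = lls(x, np.vstack([y1, y2]))
split = lls(x, y1[None, :]) + lls(x, y2[None, :])
print(np.max(np.abs(joint - split)))  # small; shrinks as the empirical
                                      # cross-correlation of y1, y2 vanishes
```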