ME (Math) 577 HW02
Problem 02.01
Let the probability space consist of five equiprobable sample points $\zeta_1, \cdots, \zeta_5$, where the probability measure $P$ represents the uniform distribution, i.e., $P[\{\zeta_i\}] = 1/5 \ \forall i$. Let us define two random variables on this probability space as follows.
$$X(\zeta) \triangleq \zeta \quad \text{and} \quad Y(\zeta) \triangleq \zeta^2$$
(i) Show that the random variables X and Y are not statistically independent.
(ii) Show that the random variables X and Y are statistically uncorrelated.
Hint: Independence requires $P[X = x, Y = y] = P[X = x] \, P[Y = y]$ for all $x, y$; uncorrelatedness requires only $E[XY] = E[X] \, E[Y]$.
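The claim can be checked by direct enumeration. Below is a minimal Python sketch; since the opening of the problem statement is truncated above, the five-point sample space $\{-2, -1, 0, 1, 2\}$ is an assumption (any set symmetric about zero behaves the same way).

```python
# Direct check on an ASSUMED five-point sample space {-2, -1, 0, 1, 2}
# with the uniform measure P[{zeta_i}] = 1/5.
from fractions import Fraction

omega = [-2, -1, 0, 1, 2]
p = Fraction(1, 5)

E = lambda g: sum(p * g(z) for z in omega)        # expectation under uniform P
P = lambda event: sum(p for z in omega if event(z))

X = lambda z: z          # X(zeta) = zeta
Y = lambda z: z ** 2     # Y(zeta) = zeta^2

# Uncorrelated: E[XY] = E[X] E[Y] (both sides are 0 by symmetry)
print(E(lambda z: X(z) * Y(z)), E(X) * E(Y))

# Not independent: P[X = 1, Y = 1] = 1/5, but P[X = 1] P[Y = 1] = 2/25
print(P(lambda z: X(z) == 1 and Y(z) == 1),
      P(lambda z: X(z) == 1) * P(lambda z: Y(z) == 1))
```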
Problem 02.02
Let $X$ be a (continuous) random variable whose probability density function (pdf) $f(\cdot)$ is unknown, but whose expected value $E[X] = \mu$ and variance $\mathrm{Var}[X] = \sigma^2$ have been computed from physical measurements. Find the best estimate $\hat{f}(\cdot)$ of the unknown $f(\cdot)$ by maximizing the entropy:
$$H(X) \triangleq -\int_{\mathbb{R}} f(x) \log f(x) \, dx$$
Hint: You may use Lagrange multipliers.
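For intuition about the expected maximizer (a Gaussian density), the sketch below compares closed-form differential entropies of three pdfs sharing the same variance $\sigma^2$; the Gaussian value is the largest. This is a numerical illustration only, not the requested Lagrange-multiplier derivation.

```python
# Compare closed-form differential entropies (in nats) of three pdfs with
# identical variance sigma^2; the Gaussian entropy should be the largest.
import math

sigma = 1.3  # arbitrary illustrative value

h_gauss = 0.5 * math.log(2 * math.pi * math.e * sigma**2)  # N(mu, sigma^2)
h_uniform = math.log(2 * math.sqrt(3) * sigma)             # uniform of width 2*sqrt(3)*sigma
b = sigma / math.sqrt(2)                                   # Laplace scale giving Var = sigma^2
h_laplace = 1 + math.log(2 * b)

print(h_gauss, h_uniform, h_laplace)  # ~1.68, ~1.51, ~1.61: Gaussian is largest
```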
Problem 02.03
Let $X_1, \cdots, X_n$ be independent and identically distributed (i.i.d.) Cauchy random variables with the common probability density function (pdf):
$$f_{X_i}(x) = \frac{1}{\pi \left[ 1 + (x - \mu)^2 \right]}, \qquad i = 1, \cdots, n$$
Show that the pdf of $Y \triangleq \frac{1}{n} \sum_{i=1}^{n} X_i$ is identical to the pdf of the $X_i$'s and is independent of $n$, i.e.,
$$f_Y(x) = \frac{1}{\pi \left[ 1 + (x - \mu)^2 \right]}$$
Hint: You may use the characteristic function.
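As a numerical companion to the hinted characteristic-function route (for a Cauchy variable, $\phi_{X_i}(t) = e^{i\mu t - |t|}$, so $\phi_Y(t) = [\phi_{X_i}(t/n)]^n = \phi_{X_i}(t)$), the Monte Carlo sketch below (with $\mu = 0$) compares empirical quantiles of $Y$ against those of a single Cauchy variable.

```python
# Monte Carlo check (mu = 0): the sample mean of n i.i.d. standard Cauchy
# variables should match a single standard Cauchy, for any n.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 50, 200_000

Y = rng.standard_cauchy(size=(trials, n)).mean(axis=1)  # Y = (1/n) sum X_i
X1 = rng.standard_cauchy(trials)                        # one Cauchy sample

qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(Y, qs))   # empirical quantiles of the sample mean
print(np.quantile(X1, qs))  # theory: tan(pi * (q - 1/2))
```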
Problem 02.04
Let $X$ be a random variable defined on a given probability space $(\mathbb{R}, \mathcal{B}, P)$, where the map $X : \mathbb{R} \to \mathbb{R}$ is onto. Let $g : \mathbb{R} \to \mathbb{R}$ be a Borel-measurable function. Let us
define Y = g(X) almost surely.
(a) Is Y a random variable? If so, is Y defined on the same probability space as
X? Justify your answer.
(b) If $g$ is a monotonically increasing function (i.e., $\theta_1 < \theta_2$ implies $g(\theta_1) \le g(\theta_2)$), then show that
$$F_{X,Y}(\theta, \varphi) \triangleq \begin{cases} F_X(\theta) & \text{if } \varphi \ge g(\theta) \\ F_Y(\varphi) & \text{if } \varphi < g(\theta) \end{cases}$$
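The two cases in part (b) can be spot-checked by Monte Carlo. A minimal sketch, with $X \sim N(0,1)$ and $g = \exp$ chosen purely for illustration (neither is specified in the problem):

```python
# Check F_{X,Y}(theta, phi) = F_X(theta) when phi >= g(theta), and
# F_{X,Y}(theta, phi) = F_Y(phi) when phi < g(theta), for monotone g.
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(500_000)  # illustrative choice: X ~ N(0, 1)
y = np.exp(x)                     # illustrative monotone g: Y = exp(X)

def F_joint(theta, phi):
    return np.mean((x <= theta) & (y <= phi))

theta = 0.5                                      # g(theta) = e^0.5 ~ 1.65
print(F_joint(theta, 3.0), np.mean(x <= theta))  # phi >= g(theta): F_X(theta)
print(F_joint(theta, 1.0), np.mean(y <= 1.0))    # phi <  g(theta): F_Y(phi)
```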
Problem 02.05
Let the random vectors $X \in \mathbb{R}^n$ and $Y \in \mathbb{R}^m$ be jointly Gaussian. Let $\Psi = \begin{bmatrix} X \\ Y \end{bmatrix}$ have the jointly Gaussian probability density function
$$f_\Psi(\psi) = \frac{1}{\sqrt{(2\pi)^{n+m} \, |P_{\Psi\Psi}|}} \exp\left[ -\frac{1}{2} (\psi - \mu_\Psi)^T P_{\Psi\Psi}^{-1} (\psi - \mu_\Psi) \right]$$
where
$$E[\Psi] = \mu_\Psi = \begin{bmatrix} \mu_X \\ \mu_Y \end{bmatrix}; \qquad \mathrm{Covar}[\Psi] = P_{\Psi\Psi} = \begin{bmatrix} P_{XX} & P_{XY} \\ P_{YX} & P_{YY} \end{bmatrix};$$
$|P_{\Psi\Psi}|$ is the determinant of $P_{\Psi\Psi}$;
$E[X] = \mu_X \in \mathbb{R}^n$ and $\mathrm{Covar}[X] = P_{XX} \in \mathbb{R}^{n \times n}$;
$E[Y] = \mu_Y \in \mathbb{R}^m$ and $\mathrm{Covar}[Y] = P_{YY} \in \mathbb{R}^{m \times m}$.
Then solve the following parts of the jointly Gaussian random vector problem [Hint: See the section on random vectors in Chapter 2 of A.H. Jazwinski, Stochastic Processes and Filtering Theory (Academic Press, New York, 1970)]:
(a) Let $\begin{bmatrix} P_{XX} & P_{XY} \\ P_{YX} & P_{YY} \end{bmatrix}^{-1} = \begin{bmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{bmatrix}$. Show that
• $A_{11}^{-1} = P_{XX} - P_{XY} P_{YY}^{-1} P_{XY}^T$ and $A_{22}^{-1} = P_{YY} - P_{XY}^T P_{XX}^{-1} P_{XY}$
• $A_{21}^T = A_{12} = -A_{11} P_{XY} P_{YY}^{-1} = -P_{XX}^{-1} P_{XY} A_{22}$
• $A_{11} = P_{XX}^{-1} + P_{XX}^{-1} P_{XY} A_{22} P_{XY}^T P_{XX}^{-1}$ and $A_{22} = P_{YY}^{-1} + P_{YY}^{-1} P_{XY}^T A_{11} P_{XY} P_{YY}^{-1}$
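All three bullet identities above can be verified numerically on a random symmetric positive-definite joint covariance; the numpy sketch below (arbitrary dimensions $n = 3$, $m = 2$) is an illustration, not a proof.

```python
# Verify the block-inverse identities of part (a) on a random SPD matrix.
import numpy as np

rng = np.random.default_rng(2)
n, m = 3, 2
M = rng.standard_normal((n + m, n + m))
P = M @ M.T + (n + m) * np.eye(n + m)   # SPD joint covariance P_PsiPsi

Pxx, Pxy, Pyy = P[:n, :n], P[:n, n:], P[n:, n:]
inv = np.linalg.inv

A = inv(P)
A11, A12, A21, A22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]

print(np.allclose(inv(A11), Pxx - Pxy @ inv(Pyy) @ Pxy.T))   # bullet 1, first
print(np.allclose(inv(A22), Pyy - Pxy.T @ inv(Pxx) @ Pxy))   # bullet 1, second
print(np.allclose(A12, A21.T),                               # bullet 2
      np.allclose(A12, -A11 @ Pxy @ inv(Pyy)),
      np.allclose(A12, -inv(Pxx) @ Pxy @ A22))
print(np.allclose(A11, inv(Pxx) + inv(Pxx) @ Pxy @ A22 @ Pxy.T @ inv(Pxx)),
      np.allclose(A22, inv(Pyy) + inv(Pyy) @ Pxy.T @ A11 @ Pxy @ inv(Pyy)))
```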
(b) Prove that the random vectors $X$ and $Y$ are individually (i.e., marginally) Gaussian random vectors.
(c) Show that the random vectors $X$ and $Y$ are uncorrelated, i.e., $E[XY^T] = E[X] \, E[Y]^T$, if and only if $X$ and $Y$ are statistically independent, i.e., $f_{XY}(\theta, \varphi) = f_X(\theta) \, f_Y(\varphi)$.
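The "uncorrelated implies independent" direction can be illustrated pointwise: with $P_{XY} = 0$, the joint Gaussian pdf equals the product of the marginal pdfs. A sketch with arbitrary illustrative numbers ($n = 2$, $m = 1$):

```python
# With P_XY = 0, the joint Gaussian density factors into the marginals.
import numpy as np
from scipy.stats import multivariate_normal as mvn

mu_x, mu_y = np.array([1.0, -1.0]), np.array([0.5])
Pxx = np.array([[2.0, 0.3], [0.3, 1.0]])
Pyy = np.array([[0.7]])
P = np.block([[Pxx, np.zeros((2, 1))],
              [np.zeros((1, 2)), Pyy]])   # joint covariance with P_XY = 0

theta, phi = np.array([0.2, 0.1]), np.array([1.2])   # arbitrary test point
joint = mvn(np.concatenate([mu_x, mu_y]), P).pdf(np.concatenate([theta, phi]))
product = mvn(mu_x, Pxx).pdf(theta) * mvn(mu_y, Pyy).pdf(phi)
print(joint, product)  # the two values should agree
```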
(d) Let us construct a $p$-dimensional random vector $Z = QX + SY + C$ almost surely, where $Q \in \mathbb{R}^{p \times n}$ and $S \in \mathbb{R}^{p \times m}$ are constant matrices and $C \in \mathbb{R}^p$ is a constant vector. Show that $Z$ is Gaussian. Find the mean and covariance matrix of $Z$. [Hint: You may use the characteristic function.]
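The moments to be found in part (d) should come out as $E[Z] = Q\mu_X + S\mu_Y + C$ and $P_{ZZ} = [Q \; S] \, P_{\Psi\Psi} \, [Q \; S]^T$ (standard facts for affine maps of Gaussians); the Monte Carlo sketch below checks them with arbitrary random $Q$, $S$, $C$.

```python
# Monte Carlo check of the mean and covariance of Z = QX + SY + C.
import numpy as np

rng = np.random.default_rng(3)
n, m, p = 3, 2, 2
M = rng.standard_normal((n + m, n + m))
P = M @ M.T                       # joint covariance P_PsiPsi
mu = rng.standard_normal(n + m)   # mu_Psi

Q = rng.standard_normal((p, n))
S = rng.standard_normal((p, m))
C = rng.standard_normal(p)

psi = rng.multivariate_normal(mu, P, size=500_000)  # samples of Psi = [X; Y]
Z = psi[:, :n] @ Q.T + psi[:, n:] @ S.T + C

QS = np.hstack([Q, S])
print(Z.mean(axis=0), QS @ mu + C)  # sample mean vs. Q mu_X + S mu_Y + C
print(np.allclose(np.cov(Z.T), QS @ P @ QS.T, atol=0.1))  # covariance check
```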
(e) Let $Z = \begin{bmatrix} V \\ W \end{bmatrix}$ be a linear transformation of $\Psi = \begin{bmatrix} X \\ Y \end{bmatrix}$ via a bijective mapping from $\mathbb{R}^{n+m}$ onto $\mathbb{R}^{n+m}$ such that $V$ and $W$ are mutually uncorrelated and $W = Y$ almost surely. Find an expression for $P_{ZZ}$ in terms of $P_{XX}$, $P_{XY}$ and $P_{YY}$.
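One standard construction that meets the constraints of part (e) is $V = X - P_{XY} P_{YY}^{-1} Y$ with $W = Y$ (an assumption here; the problem only asks that such a map exist). The sketch below verifies that this choice decorrelates $V$ and $W$ and reads off the blocks of $P_{ZZ}$.

```python
# Check that V = X - P_XY inv(P_YY) Y, W = Y decorrelates V and W.
import numpy as np

rng = np.random.default_rng(4)
n, m = 3, 2
M = rng.standard_normal((n + m, n + m))
P = M @ M.T                                    # joint covariance P_PsiPsi
Pxx, Pxy, Pyy = P[:n, :n], P[:n, n:], P[n:, n:]

K = Pxy @ np.linalg.inv(Pyy)
T = np.block([[np.eye(n), -K],
              [np.zeros((m, n)), np.eye(m)]])  # bijective (unit determinant)

Pzz = T @ P @ T.T
print(np.allclose(Pzz[:n, n:], 0))                # P_VW = 0
print(np.allclose(Pzz[:n, :n], Pxx - K @ Pxy.T))  # P_VV = Pxx - Pxy inv(Pyy) Pyx
print(np.allclose(Pzz[n:, n:], Pyy))              # P_WW = P_YY
```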
(f) Given the conditional density $f_{X|Y}(\theta \mid \varphi) = \frac{f_{XY}(\theta, \varphi)}{f_Y(\varphi)}$ and making use of the above results, show that
$$f_{X|Y}(\theta \mid \varphi) = \frac{1}{\sqrt{(2\pi)^n \, |P_{X|Y}|}} \exp\left[ -\frac{1}{2} \left( \theta - \mu_{X|Y=\varphi} \right)^T P_{X|Y}^{-1} \left( \theta - \mu_{X|Y=\varphi} \right) \right]$$
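The section ends before $\mu_{X|Y=\varphi}$ and $P_{X|Y}$ are defined; assuming the standard conditional-Gaussian expressions $\mu_{X|Y=\varphi} = \mu_X + P_{XY} P_{YY}^{-1} (\varphi - \mu_Y)$ and $P_{X|Y} = P_{XX} - P_{XY} P_{YY}^{-1} P_{YX}$, the identity can be checked pointwise with scipy:

```python
# Pointwise check: f_XY(theta, phi) / f_Y(phi) equals the Gaussian density
# with the (assumed) conditional mean and covariance.
import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(5)
n, m = 2, 2
M = rng.standard_normal((n + m, n + m))
P = M @ M.T + (n + m) * np.eye(n + m)   # SPD joint covariance
mu = rng.standard_normal(n + m)
Pxx, Pxy, Pyy = P[:n, :n], P[:n, n:], P[n:, n:]
mu_x, mu_y = mu[:n], mu[n:]

theta, phi = rng.standard_normal(n), rng.standard_normal(m)  # test point

lhs = mvn(mu, P).pdf(np.concatenate([theta, phi])) / mvn(mu_y, Pyy).pdf(phi)

K = Pxy @ np.linalg.inv(Pyy)
rhs = mvn(mu_x + K @ (phi - mu_y), Pxx - K @ Pxy.T).pdf(theta)

print(lhs, rhs)  # the two values should agree
```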