
MSO205A PRACTICE PROBLEMS SET 12 SOLUTIONS

Question 1. Let X1 , X2 , X3 be a random sample from Bernoulli(p) distribution, for some p ∈ (0, 1).
Find the p.m.f. of X(2) .

Answer: X(2) is a discrete RV supported on {0, 1}. Now,

P(X(2) = 0) = P(the second smallest of X1, X2, X3 is 0)
            = P(at least two of X1, X2, X3 are 0)
            = P(exactly two of X1, X2, X3 are 0) + P(all three of X1, X2, X3 are 0)
            = \binom{3}{2} (1 - p)^2 p + (1 - p)^3,  (using independence of X1, X2, X3)
            = (1 - p)^2 (2p + 1).

Similarly, P(X(2) = 1) = \binom{3}{2} p^2 (1 - p) + p^3 = p^2 (3 - 2p). Therefore, X(2) ∼ Bernoulli(p^2 (3 - 2p)).
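As a quick numerical sanity check (not part of the original solution), a short Monte Carlo simulation can confirm that the median of three Bernoulli(p) draws behaves like Bernoulli(p^2(3 − 2p)); the function names below are illustrative:

```python
import random

def sample_second_order_stat(p, rng):
    """Draw X1, X2, X3 ~ Bernoulli(p) and return the second order statistic X_(2)."""
    xs = sorted(int(rng.random() < p) for _ in range(3))
    return xs[1]  # middle value of the sorted sample

def estimate_success_prob(p, trials=200_000, seed=0):
    """Monte Carlo estimate of P(X_(2) = 1)."""
    rng = random.Random(seed)
    return sum(sample_second_order_stat(p, rng) for _ in range(trials)) / trials

p = 0.3
theory = p**2 * (3 - 2 * p)      # Bernoulli parameter derived above
estimate = estimate_success_prob(p)
print(f"theory={theory:.4f}  estimate={estimate:.4f}")
```

With 200,000 trials the estimate should agree with p^2(3 − 2p) to about two decimal places.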

Question 2. Let X1 , · · · , Xn be a random sample from U nif orm(0, 1) distribution. Identify the
distribution of X(r) for r = 1, · · · , n.

Answer: We have FX(r) (x) = P(X(r) ≤ x) = 0, ∀x ≤ 0 and FX(r) (x) = P(X(r) ≤ x) = 1, ∀x ≥ 1.


For x ∈ (0, 1), P(Xi ≤ x) = x and P(Xi > x) = 1 − x, ∀i = 1, · · · , n. If Y is the number of
Xi , i = 1, · · · , n which fall in (0, x], then Y ∼ Binomial(n, x). Then,
F_{X(r)}(x) = P(X(r) ≤ x) = P(Y ≥ r) = \sum_{k=r}^{n} \binom{n}{k} x^k (1 - x)^{n-k}.

In this case, for x ∈ (0, 1),

\frac{d}{dx} F_{X(r)}(x) = \frac{n!}{(r-1)!(n-r)!} x^{r-1} (1 - x)^{n-r}.
Since FX(r) is differentiable everywhere except possibly at the points 0, 1, we conclude that X(r) is
a continuous RV with the p.d.f.

f_{X(r)}(x) = \frac{n!}{(r-1)!(n-r)!} x^{r-1} (1 - x)^{n-r}, if 0 < x < 1, and 0 otherwise,

and hence X(r) ∼ Beta(r, n − r + 1).
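A Beta(r, n − r + 1) RV has mean r/(n + 1), so the result above can be spot-checked by simulating order statistics of a Uniform(0, 1) sample. This sketch (with arbitrary choices n = 5, r = 2) is only a sanity check, not part of the solution:

```python
import random

def order_stat_sample(n, r, rng):
    """The r-th smallest of n i.i.d. Uniform(0,1) draws (1-indexed)."""
    return sorted(rng.random() for _ in range(n))[r - 1]

def mean_order_stat(n, r, trials=100_000, seed=1):
    """Monte Carlo estimate of E[X_(r)]."""
    rng = random.Random(seed)
    return sum(order_stat_sample(n, r, rng) for _ in range(trials)) / trials

n, r = 5, 2
theory = r / (n + 1)             # mean of Beta(r, n - r + 1)
estimate = mean_order_stat(n, r)
print(f"E[X_(r)] theory={theory:.4f}  simulated={estimate:.4f}")
```

The simulated mean should match r/(n + 1) = 1/3 closely for this sample size.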

Question 3. Let X1 , · · · , Xn be a random sample from a distribution given by a p.d.f. f . Find the
joint p.d.f. of (X(r), X(s)) for 1 ≤ r < s ≤ n.

Answer: In the lecture notes, we have already discussed that the joint p.d.f. of (X(1) , · · · , X(n) ) is
given by

g(y1, · · · , yn) = n! \prod_{i=1}^{n} f(y_i), if y1 < · · · < yn, and 0 otherwise.

Now, integrating out the coordinate variables y_i for i ∈ {1, 2, · · · , n}, i ≠ r, i ≠ s, we have

g_{X(r), X(s)}(y_r, y_s) = \frac{n!}{(r-1)!(s-r-1)!(n-s)!} (F(y_r))^{r-1} [F(y_s) - F(y_r)]^{s-r-1} (1 - F(y_s))^{n-s} f(y_r) f(y_s), if y_r < y_s, and 0 otherwise,

where F denotes the common DF for X1 , · · · , Xn .
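As an informal check (not in the original solution), one can verify numerically that this joint p.d.f. integrates to 1 in a concrete case: take F(y) = y and f(y) = 1 (the Uniform(0, 1) sample of Question 2) with arbitrary choices n = 4, r = 2, s = 3, and approximate the integral by a midpoint Riemann sum:

```python
from math import factorial

def joint_pdf_uniform(yr, ys, n, r, s):
    """Joint p.d.f. of (X_(r), X_(s)) for a Uniform(0,1) sample: F(y) = y, f(y) = 1."""
    if not (0 < yr < ys < 1):
        return 0.0
    coeff = factorial(n) / (factorial(r - 1) * factorial(s - r - 1) * factorial(n - s))
    return coeff * yr**(r - 1) * (ys - yr)**(s - r - 1) * (1 - ys)**(n - s)

def total_mass(n, r, s, steps=400):
    """Midpoint-rule double integral of the joint p.d.f. over the unit square."""
    h = 1.0 / steps
    total = 0.0
    for i in range(steps):
        for j in range(steps):
            total += joint_pdf_uniform((i + 0.5) * h, (j + 0.5) * h, n, r, s) * h * h
    return total

mass = total_mass(4, 2, 3)
print(f"total mass ~ {mass:.4f}")  # should be close to 1
```

The small discrepancy from 1 comes from the discretization near the boundary y_r = y_s and shrinks as `steps` grows.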

Question 4. Let Y ∼ Np(b, K). Then for any c ∈ R^n and any n × p real matrix B, consider the n-dimensional random vector Z = c + BY. Show that Z ∼ Nn(c + Bb, BKB^t).

Answer: The joint MGF of Y = (Y1, Y2, · · · , Yp) is given by

M_Y(u) = \exp\left( u^t b + \frac{1}{2} u^t K u \right), ∀ u ∈ R^p.

We compute the MGF of Z = (Z1 , · · · , Zn ). For any v ∈ Rn , we have

M_Z(v) = E \exp(v^t Z) = E \exp[v^t (c + BY)]
       = \exp(v^t c) E \exp[(v^t B) Y]
       = \exp(v^t c) M_Y(B^t v)
       = \exp(v^t c) \exp\left( v^t Bb + \frac{1}{2} v^t BKB^t v \right)
       = \exp\left( v^t (c + Bb) + \frac{1}{2} v^t BKB^t v \right).

Identifying the distribution of Z through the joint MGF, we conclude Z ∼ Nn (c + Bb, BKB t ).
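The affine-transformation result can be illustrated by simulation (this is an illustration, not part of the proof). The 2-dimensional b, K, B, c below are arbitrary choices; Y is generated as b + LW with L the Cholesky factor of K and W standard normal, and the sample mean and variance of Z = c + BY are compared with c + Bb and BKB^t:

```python
import random

# Hypothetical small example: Y ~ N_2(b, K), Z = c + B Y (values are illustrative).
b = [1.0, -2.0]
K = [[2.0, 0.6], [0.6, 1.0]]          # symmetric positive definite
B = [[1.0, 2.0], [0.0, -1.0]]
c = [3.0, 0.5]

def chol2(K):
    """Lower-triangular Cholesky factor of a 2x2 positive definite matrix."""
    l11 = K[0][0] ** 0.5
    l21 = K[1][0] / l11
    l22 = (K[1][1] - l21 * l21) ** 0.5
    return [[l11, 0.0], [l21, l22]]

L = chol2(K)
rng = random.Random(42)

def draw_z():
    """One draw of Z = c + B Y, where Y = b + L W and W is standard normal."""
    w = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
    y = [b[i] + L[i][0] * w[0] + L[i][1] * w[1] for i in range(2)]
    return [c[i] + B[i][0] * y[0] + B[i][1] * y[1] for i in range(2)]

samples = [draw_z() for _ in range(200_000)]
mean = [sum(z[i] for z in samples) / len(samples) for i in range(2)]
var0 = sum((z[0] - mean[0]) ** 2 for z in samples) / len(samples)

# Theoretical mean c + B b and the (0,0) entry of B K B^t.
theory_mean = [c[i] + B[i][0] * b[0] + B[i][1] * b[1] for i in range(2)]
BK = [[sum(B[i][k] * K[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
theory_var0 = sum(BK[0][k] * B[0][k] for k in range(2))

print("sample mean:", [round(m, 3) for m in mean], " theory:", theory_mean)
print("sample Var(Z1):", round(var0, 3), " theory:", round(theory_var0, 3))
```

The sample mean and variance should agree with c + Bb and (BKB^t)_{11} up to Monte Carlo error.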

Question 5. Let Y ∼ Np(b, K) with K being invertible. Then show that \sum_{j=1}^{p} λ_j (Y_j − b_j) = 0 for some scalars λ1, λ2, · · · , λp if and only if λ1 = λ2 = · · · = λp = 0.

Answer: Any variance-covariance matrix is positive semi-definite. In our case, since K is invertible, we conclude that K is positive definite. Then there exists an orthogonal matrix A such that D = A^t K A is a diagonal matrix, with the eigenvalues of K as the diagonal entries.
If possible, let λ^t (Y − b) = 0 for some λ ∈ R^p. Then λ^t (Y − b)(Y − b)^t λ = 0 and in particular,

λ^t K λ = E[λ^t (Y − b)(Y − b)^t λ] = 0.

Since K is positive definite, we conclude that λ = 0 ∈ R^p.


Conversely, if λ = 0 ∈ R^p, we have λ^t (Y − b) = 0. This concludes the proof.

Question 6. Let c := \sum_{m=1}^{∞} m^{-3} < ∞. Then the function f : R → [0, 1] given by

f(x) = \frac{1}{c} x^{-3}, if x ∈ {1, 2, · · · }, and 0 otherwise

is a p.m.f.. Let X be a discrete RV with this p.m.f. and consider the following sequence of RVs {Xn}n defined by

Xn = X, if X ≤ n, and Xn = 0 otherwise, ∀n.

Show that the sequence of RVs {Xn }n converges in first mean to X, but not in the second mean.

Answer: We have, for any positive integer n,

E X_n = \frac{1}{c} \sum_{m=1}^{n} \frac{m}{m^3} = \frac{1}{c} \sum_{m=1}^{n} \frac{1}{m^2} < ∞,

and

E X_n^2 = \frac{1}{c} \sum_{m=1}^{n} \frac{m^2}{m^3} = \frac{1}{c} \sum_{m=1}^{n} \frac{1}{m} < ∞.
Therefore, E X_n and E X_n^2 both exist, for all n. Moreover,

E|X_n − X| = \frac{1}{c} \sum_{m=n+1}^{∞} \frac{m}{m^3} = \frac{1}{c} \sum_{m=n+1}^{∞} \frac{1}{m^2} \xrightarrow{n→∞} 0,

but

E|X_n − X|^2 = \frac{1}{c} \sum_{m=n+1}^{∞} \frac{m^2}{m^3} = \frac{1}{c} \sum_{m=n+1}^{∞} \frac{1}{m} = ∞, ∀n.
Hence, {Xn }n converges to X in first mean, but not in the second mean.
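The two tail sums above can be examined numerically (a supplementary check, not part of the solution): up to the constant 1/c, the first-mean tail is \sum_{m>n} 1/m^2, which shrinks with n, while the second-moment tail is the harmonic tail \sum_{m>n} 1/m, which grows without bound as more terms are added. The truncation levels below are arbitrary:

```python
def tail_first_mean(n, upto=1_000_000):
    """Truncation of sum_{m>n} 1/m^2, i.e. c * E|Xn - X| (a convergent tail)."""
    return sum(1.0 / (m * m) for m in range(n + 1, upto))

def tail_second_mean(n, upto):
    """Truncation of sum_{m>n} 1/m, i.e. c * E|Xn - X|^2 (a divergent harmonic tail)."""
    return sum(1.0 / m for m in range(n + 1, upto))

# The first-mean tail shrinks as n grows (convergence in first mean) ...
print(tail_first_mean(10), tail_first_mean(100), tail_first_mean(1000))
# ... while the second-moment sum keeps growing with the truncation level (divergence).
print(tail_second_mean(10, 10**4), tail_second_mean(10, 10**5), tail_second_mean(10, 10**6))
```

The first row of outputs decreases toward 0 (roughly like 1/n), while the second row increases by about ln 10 each time the truncation level is multiplied by 10.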
