SC222: Tutorial Sheet 4

Problems based on Jointly Distributed Random Variables, Expectation, Variance, Covariance and the Weak Law of Large Numbers.

Pb 1) The joint PMF of a discrete random vector (X1 , X2 ) is given by the following table

x2 \ x1 −1 0 1
0 1/9 2/9 1/9
1 1/9 2/9 1/9
2 0 1/9 0

a) Determine the covariance of X1 and X2 .


b) Calculate the correlation coefficient ρ_{X1,X2} = Cov(X1, X2) / √(Var(X1) Var(X2)) of X1 and X2.

c) Are X1 and X2 independent random variables? Justify your answer.
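
The quantities asked for in Pb 1 can be sanity-checked mechanically from the table. A minimal sketch (not part of the sheet) using exact fractions:

```python
# Compute Cov(X1, X2) and test independence directly from the PMF table of Pb 1.
from fractions import Fraction as F

# pmf[(x1, x2)] = P(X1 = x1, X2 = x2), copied from the table
pmf = {(-1, 0): F(1, 9), (0, 0): F(2, 9), (1, 0): F(1, 9),
       (-1, 1): F(1, 9), (0, 1): F(2, 9), (1, 1): F(1, 9),
       (-1, 2): F(0),    (0, 2): F(1, 9), (1, 2): F(0)}

E = lambda g: sum(p * g(x1, x2) for (x1, x2), p in pmf.items())
mu1, mu2 = E(lambda a, b: a), E(lambda a, b: b)
cov = E(lambda a, b: a * b) - mu1 * mu2

# marginals, for the independence check
p1 = {x: sum(p for (a, b), p in pmf.items() if a == x) for x in (-1, 0, 1)}
p2 = {y: sum(p for (a, b), p in pmf.items() if b == y) for y in (0, 1, 2)}
independent = all(pmf[(a, b)] == p1[a] * p2[b] for (a, b) in pmf)

print(cov)          # 0 -> X1 and X2 are uncorrelated, so rho = 0 as well
print(independent)  # False: P(X1=-1, X2=2)=0 but P(X1=-1)P(X2=2)=2/81
```

Note the punchline this produces: zero covariance does not force independence, which is exactly what part (c) probes.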

Pb 2) The moment generating function of a random variable X is a function M_X(t) of a free parameter t, defined by M_X(t) = E[e^{tX}] (if it exists).

(i) Compute the moment generating functions for the following distributions.
a) Geometric distribution with parameter p.
b) Uniform distribution over the interval [a, b].
(ii) If the moment generating function exists for a random variable X, then show
that the nth moment about the origin (i.e., E[X^n]) can be found by evaluating the
nth derivative of the moment generating function at t = 0.
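
Part (ii) can be checked numerically with finite differences. A sketch, assuming the convention that the geometric variable counts trials up to and including the first success, so that M(t) = p·e^t / (1 − (1−p)e^t), E[X] = 1/p and E[X^2] = (2 − p)/p^2:

```python
# Approximate derivatives of the geometric MGF at t = 0 and compare
# with the known moments (convention: support {1, 2, 3, ...}).
import math

p = 0.3
M = lambda t: p * math.exp(t) / (1 - (1 - p) * math.exp(t))

h = 1e-5
first_moment = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0)
second_moment = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0)

print(round(first_moment, 4))   # ~ 1/p = 3.3333
print(round(second_moment, 4))  # ~ (2 - p)/p^2 = 18.8889
```

If you use the other convention (support {0, 1, 2, ...}), the MGF and the moments change accordingly, but the derivative-at-zero property is the same.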

Pb 3) Suppose that we have a resistance R. We know that the value of R follows a uniform
law between 900 and 1100 Ω. What is the density of the corresponding conductance
G = 1/R ?
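
A Monte Carlo check is a useful habit for change-of-variables problems like Pb 3. The sketch below compares the empirical CDF of G against the candidate answer F_G(g) = (1100 − 1/g)/200 for 1/1100 ≤ g ≤ 1/900 (stated here as an assumption to verify by hand, not as the official solution):

```python
# Compare the empirical CDF of G = 1/R, with R ~ U(900, 1100),
# against the candidate closed-form CDF at a few interior points.
import random

random.seed(0)
N = 200_000
samples = [1 / random.uniform(900, 1100) for _ in range(N)]

errs = []
for g in (1 / 1050, 1 / 1000, 1 / 950):
    empirical = sum(s <= g for s in samples) / N
    analytic = (1100 - 1 / g) / 200
    errs.append(abs(empirical - analytic))
    print(f"g={g:.6f}  empirical={empirical:.4f}  analytic={analytic:.4f}")
```

Differentiating that CDF gives the density asked for in the problem.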

Pb 4) (Universality of the uniform distribution) Let X be a real-valued random variable and let U ∼ U([0, 1]). Since F_X : R → [0, 1] is not always one-to-one, we define F_X^{-1} as
F_X^{-1}(u) = sup{x ∈ R : F_X(x) ≤ u}.
Show that F_X^{-1}(U) and X have the same distribution.
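
Pb 4 is the basis of inverse-transform sampling: feeding a uniform variate through F_X^{-1} reproduces the law of X. A sketch for an Exponential(λ) target, where F(x) = 1 − e^{−λx} and F^{-1}(u) = −ln(1 − u)/λ:

```python
# Inverse-transform sampling: generate Exponential(lam) samples from U(0,1)
# and compare the sample mean and empirical CDF with the true values.
import math
import random

random.seed(1)
lam = 2.0
N = 200_000
xs = [-math.log(1 - random.random()) / lam for _ in range(N)]

mean = sum(xs) / N
print(round(mean, 2))  # should be close to 1/lam = 0.5

x0 = 0.7
emp = sum(x <= x0 for x in xs) / N
true = 1 - math.exp(-lam * x0)
print(abs(emp - true) < 0.01)  # True: the empirical CDF matches F
```

For continuous strictly increasing F the sup-definition above coincides with the ordinary inverse used here.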

Pb 5) Show that if X and Y are two independent random variables, then so are g(X) and h(Y ).

Pb 6) Two random variables X and Y are said to be uncorrelated if their covariance is 0.


Suppose X and Y are independent uniformly distributed random variables over the
common interval [0, 1]. Define Z = X + Y and W = X − Y . Show that Z and W are
not independent, but uncorrelated random variables.
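
A simulation makes Pb 6 concrete. Cov(Z, W) = Var(X) − Var(Y) = 0, yet Z and W are dependent; one numerical witness is Cov(Z², W²) = −1/60 ≠ 0 (a value stated here as an assumption you can verify by hand), which would be impossible for independent variables:

```python
# Z = X + Y and W = X - Y for independent X, Y ~ U(0,1):
# estimate Cov(Z, W) (should be ~0) and Cov(Z^2, W^2) (should be ~ -1/60).
import random

random.seed(42)
N = 300_000
pairs = [(random.random(), random.random()) for _ in range(N)]
zs = [x + y for x, y in pairs]
ws = [x - y for x, y in pairs]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a)

cov_zw = cov(zs, ws)
cov_z2w2 = cov([z * z for z in zs], [w * w for w in ws])
print(round(cov_zw, 3))    # ~ 0.000
print(round(cov_z2w2, 3))  # ~ -0.017
```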
Pb 7) Let the joint PDF of random variables X and Y be defined as
fX,Y (x, y) = k cos(x + y) for 0 ≤ x ≤ π/4, 0 ≤ y ≤ π/4.
Determine the constant k and the marginal probability density functions (fX (x) and fY (y))
of X and Y . Are the random variables X and Y orthogonal? Justify. (The ran-
dom variables X and Y are said to be orthogonal if the mathematical expectation
E[XY ] = 0.)
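
The normalising constant in Pb 7 can be found (or checked) numerically before doing the integral by hand. A sketch using a midpoint Riemann sum, compared against the candidate closed form k = 1/(√2 − 1) = √2 + 1 (an assumption to verify):

```python
# Integrate cos(x + y) over [0, pi/4]^2 with a midpoint rule;
# k is the reciprocal of that integral.
import math

n = 400
h = (math.pi / 4) / n
total = sum(math.cos((i + 0.5) * h + (j + 0.5) * h) * h * h
            for i in range(n) for j in range(n))

k = 1 / total
print(round(k, 4))  # ~ 2.4142 = sqrt(2) + 1
```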

Pb 8) Let X and Y be linearly dependent real-valued random variables. Show that X and
Y are not independent (in the probability sense).

Pb 9) (Multinomial Distribution) Let Ω be a sample space associated with a random
experiment E, and let B1, B2, ..., Bn be a partition of Ω. Assume that we perform m
independent repetitions of the experiment E and that the probability pk = P[Bk] is
constant from one repetition to another. If Xk denotes the number of times that the
event Bk has occurred among the m repetitions, for k = 1, 2, ..., n, then determine the
joint PMF of the random vector (X1, X2, ..., Xn) and Cov(Xi, Xj). Also, calculate the
expectation and variance of the random variable X = (1/n) ∑_{k=1}^{n} X_k.
Pb 10) Show that if X ≥ 0 and E(X) = µ, then P(X ≥ √µ) ≤ √µ.
Pb 11) Let X have variance σ_X^2 and Y have variance σ_Y^2. Show that −1 ≤ ρ_{X,Y} ≤ 1. Further,
argue that, if ρ_{X,Y} = 1 or −1, then X and Y are related by Y = a + bX, where b > 0
if ρ_{X,Y} = 1 and b < 0 if ρ_{X,Y} = −1.

Pb 12) Consider n independent trials, each of which results in any of the outcomes i, i =
1, 2, 3, with respective probabilities p1, p2, p3, where ∑_{i=1}^{3} p_i = 1. Let Ni denote the number of
trials that result in outcome i, and show that Cov(N1, N2) = −np1p2. Also explain
why it is intuitive that this covariance is negative.
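
Before proving Cov(N1, N2) = −np1p2, it is easy to see it empirically. A sketch with n = 10 and p = (0.2, 0.3, 0.5), so the claimed covariance is −10 · 0.2 · 0.3 = −0.6:

```python
# Simulate many multinomial experiments of n trials each and estimate
# Cov(N1, N2); compare with the claimed value -n*p1*p2 = -0.6.
import random

random.seed(3)
n, p1, p2 = 10, 0.2, 0.3
R = 50_000

n1s, n2s = [], []
for _ in range(R):
    c1 = c2 = 0
    for _ in range(n):
        u = random.random()
        if u < p1:
            c1 += 1
        elif u < p1 + p2:
            c2 += 1
    n1s.append(c1)
    n2s.append(c2)

m1, m2 = sum(n1s) / R, sum(n2s) / R
cov = sum((a - m1) * (b - m2) for a, b in zip(n1s, n2s)) / R
print(round(cov, 2))  # ~ -0.6
```

The negative sign matches the intuition asked for: every trial that lands in outcome 1 is a trial that cannot land in outcome 2.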

Pb 13) Suppose that X is a random variable with mean and variance both equal to 20. What
can be said about P [0 ≤ X ≤ 40]?
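
For Pb 13, Chebyshev's inequality gives P(0 ≤ X ≤ 40) = P(|X − 20| ≤ 20) ≥ 1 − 20/20² = 0.95. A sketch checking the bound against one concrete distribution with mean = variance = 20, Gamma(shape=20, scale=1), chosen here purely for illustration:

```python
# Empirically verify the Chebyshev lower bound P(0 <= X <= 40) >= 0.95
# for a Gamma(20, 1) variable (mean 20, variance 20).
import random

random.seed(5)
N = 100_000
xs = [random.gammavariate(20, 1) for _ in range(N)]

frac = sum(0 <= x <= 40 for x in xs) / N
print(frac >= 0.95)  # the Chebyshev lower bound holds: True
```

For this particular distribution the true probability is far higher than 0.95, which is typical: Chebyshev is a worst-case bound over all distributions with the given mean and variance.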

Pb 14) From past experience, a professor knows that the test score of a student taking her
final examination is a random variable with mean 75.

(a) Give an upper bound to the probability that a student’s test score will exceed
85.
(b) Suppose in addition the professor knows that the variance of a student’s test
score is equal to 25. What can be said about the probability that a student will
score between 65 and 85?
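
Both parts of Pb 14 follow the same pattern: (a) Markov gives P(score ≥ 85) ≤ 75/85; (b) Chebyshev gives P(65 < score < 85) ≥ 1 − 25/10² = 3/4. A sketch checking both bounds on an illustrative score distribution with mean 75 and variance 25 (Normal(75, 5), chosen purely as an example):

```python
# Check the Markov and Chebyshev bounds of Pb 14 against simulated scores.
import random

random.seed(11)
N = 100_000
scores = [random.gauss(75, 5) for _ in range(N)]

p_over_85 = sum(s >= 85 for s in scores) / N
p_65_85 = sum(65 < s < 85 for s in scores) / N

print(p_over_85 <= 75 / 85)  # Markov bound holds: True
print(p_65_85 >= 0.75)       # Chebyshev bound holds: True
```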
