Joint Random Variables: Assignment 2
2. Let X and Y be two non-negative continuous random variables with respective CDFs F_X and F_Y. Suppose that for some constants a and b > 0, F_X(x) = F_Y((x − a)/b). Determine E(X) in terms of E(Y).
3. Let X be a random variable having an exponential distribution with parameter 1/2. Let Z be a random variable having a normal distribution with mean 0 and variance 1. Assume that X and Z are independent random variables. (a) Find the pdf of T = Z/√X. (b) Compute E(T) and Var(T).
4. Let X and Y be two identically distributed random variables whose variances exist. Prove or disprove that Var((X + Y)/2) ≤ Var(X).
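A quick Monte Carlo sanity check of problem 4 (a sketch, not a proof): build an identically distributed but correlated pair — here both marginals are N(0, 1), with an arbitrarily chosen correlation — and compare the two variances.

```python
import numpy as np

# Monte Carlo sanity check (not a proof).  X and Y are both N(0, 1),
# hence identically distributed, with Corr(X, Y) = rho (arbitrary choice).
rng = np.random.default_rng(0)
n, rho = 200_000, 0.6
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
x = z1
y = rho * z1 + np.sqrt(1 - rho**2) * z2   # also N(0, 1)

var_avg = np.var((x + y) / 2)   # equals (1 + rho)/2 = 0.8 in theory
var_x = np.var(x)               # close to 1
print(var_avg, var_x)
```

The inequality holds with equality only when Corr(X, Y) = 1, which matches the Cauchy–Schwarz step in the proof.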
5. Let X and Y be i.i.d. random variables, each having a N(0, 1) distribution. Calculate E[(X + Y) | (X − Y)].
6. Let X_1, ..., X_5 be a random sample from N(0, σ²). Find a constant c such that Y = c(X_1 − X_2)/√(X_3² + X_4² + X_5²) has a t-distribution. Also, find E(Y).
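For problem 6, working through the algebra suggests the candidate value c = √(3/2), under which Y should be t-distributed with 3 degrees of freedom. A numpy simulation (my own sketch, not part of the assignment) can sanity-check that candidate against the familiar t_3 table value t_{3, 0.95} ≈ 2.353; note the answer does not depend on σ.

```python
import numpy as np

# Monte Carlo sanity check (not a proof) for c = sqrt(3/2).
# sigma and the seed are arbitrary choices; sigma cancels in Y.
rng = np.random.default_rng(1)
n, sigma = 400_000, 2.0
x = sigma * rng.standard_normal((n, 5))
c = np.sqrt(1.5)
y = c * (x[:, 0] - x[:, 1]) / np.sqrt((x[:, 2:] ** 2).sum(axis=1))

q95 = np.quantile(y, 0.95)   # t_3 tables give roughly 2.353
mean_y = y.mean()            # E(Y) = 0 by symmetry
print(q95, mean_y)
```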
7. A metro train arrives at the station near your home every quarter hour starting at 5:00 AM. You walk into the station every morning at a time uniformly distributed between 7:10 and 7:30 AM, i.e., U([7:10, 7:30]).
(a) Find the distribution of the time you have to wait for the first train to arrive.
(b) Also, find the mean waiting time.
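A quick simulation of problem 7 (a sketch, not the expected derivation): measuring time in minutes after 7:00, the relevant trains pass at minutes 15 and 30, and the mean wait works out to (5/20)(2.5) + (15/20)(7.5) = 6.25 minutes.

```python
import numpy as np

# Monte Carlo sanity check (not a proof) of the mean waiting time.
rng = np.random.default_rng(2)
arrival = rng.uniform(10, 30, size=500_000)   # minutes after 7:00
wait = np.where(arrival <= 15, 15 - arrival, 30 - arrival)

mean_wait = wait.mean()   # theory: 6.25 minutes
print(mean_wait)
```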
8. Let X and Y be iid random variables, each uniformly distributed on (2, 3). Find E(X/Y).
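For problem 8, independence gives E(X/Y) = E(X)·E(1/Y) = 2.5·ln(3/2) ≈ 1.0137, which a short simulation (my own sketch) can confirm numerically.

```python
import numpy as np

# Monte Carlo sanity check (not a proof): E(X/Y) = E(X) * E(1/Y) by independence.
rng = np.random.default_rng(3)
x = rng.uniform(2, 3, size=500_000)
y = rng.uniform(2, 3, size=500_000)

ratio_mean = (x / y).mean()
exact = 2.5 * np.log(1.5)   # 2.5 * ln(3/2)
print(ratio_mean, exact)
```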
9. Let X and Y be two random variables such that ρ(X, Y) = 1/2, Var(X) = 1 and Var(Y) = 4. Compute Var(X − 3Y).
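A simulation check for problem 9 (a sketch, not a proof), reading the correlation as ρ = 1/2 (a correlation coefficient of 2 is impossible, so 1/2 is the intended value): then Var(X − 3Y) = 1 + 9·4 − 6·Cov(X, Y) = 37 − 6 = 31. The normal marginals below are an arbitrary modelling choice; only the moments matter.

```python
import numpy as np

# Monte Carlo sanity check (not a proof): Var(X) = 1, Var(Y) = 4, rho = 1/2.
rng = np.random.default_rng(4)
n = 500_000
z1, z2 = rng.standard_normal(n), rng.standard_normal(n)
x = z1
y = 2 * (0.5 * z1 + np.sqrt(0.75) * z2)   # sd 2, Corr(X, Y) = 1/2

var_diff = np.var(x - 3 * y)   # theory: 37 - 6*Cov(X, Y) = 31
print(var_diff)
```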
10. Let X_1, X_2, ..., X_n be iid random variables with E(X_1) = µ and Var(X_1) = σ². Define X̄ = (1/n) ∑_{i=1}^{n} X_i and S² = (1/(n − 1)) ∑_{i=1}^{n} (X_i − X̄)². Find (a) Var(X̄) (b) E[S²].
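Problem 10's answers, Var(X̄) = σ²/n and E[S²] = σ², can be sanity-checked by simulation (a sketch; µ, σ, n and the seed below are arbitrary choices).

```python
import numpy as np

# Monte Carlo sanity check (not a proof) with mu = 3, sigma = 2, n = 10.
rng = np.random.default_rng(5)
mu, sigma, n, reps = 3.0, 2.0, 10, 200_000
samples = rng.normal(mu, sigma, size=(reps, n))

xbar = samples.mean(axis=1)
s2 = samples.var(axis=1, ddof=1)   # the (n-1)-denominator estimator

var_xbar = np.var(xbar)   # theory: sigma^2 / n = 0.4
mean_s2 = s2.mean()       # theory: sigma^2 = 4
print(var_xbar, mean_s2)
```

The ddof=1 argument selects the (n − 1) denominator, matching the definition of S² in the problem.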
11. Pick the point (X, Y) uniformly in the triangle {(x, y) | 0 ≤ x ≤ 1 and 0 ≤ y ≤ x}. Calculate E[(X − Y)² | X].
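For problem 11, integrating over the conditional (uniform) distribution of Y given X suggests E[(X − Y)² | X] = X²/3. A simulation (my own sketch, not a proof) can check the implied unconditional identity E[(X − Y)²] = E[X²/3] = 1/6, using the fact that (max(U, V), min(U, V)) with U, V iid U(0, 1) is uniform on the triangle.

```python
import numpy as np

# Monte Carlo sanity check (not a proof) of E[(X-Y)^2 | X] = X^2 / 3.
rng = np.random.default_rng(6)
u = rng.uniform(size=(500_000, 2))
x, y = u.max(axis=1), u.min(axis=1)   # uniform on {0 <= y <= x <= 1}

lhs = ((x - y) ** 2).mean()   # E[(X - Y)^2]
rhs = (x ** 2 / 3).mean()     # E[X^2 / 3]; both should be near 1/6
print(lhs, rhs)
```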
12. Find E(Y | X = x), where (X, Y) is jointly distributed with joint pdf f(x, y) = (y/(1 + x)⁴) e^{−y/(1+x)} for x, y ≥ 0, and f(x, y) = 0 otherwise.
13. Let X have a beta distribution, i.e., its pdf is f_X(x) = (1/B(a, b)) x^{a−1}(1 − x)^{b−1}, 0 < x < 1, and let Y given X = x have a binomial distribution with parameters (n, x). Find the regression of X on Y. Is the regression linear?
14. Let X ∼ Exp(λ). Find E[X | X > y] and E[X − y | X > y].
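Problem 14 is the memorylessness property: the expected answers are E[X | X > y] = y + 1/λ and E[X − y | X > y] = 1/λ. A quick simulation (a sketch; λ = 0.5 and y = 3 are arbitrary choices) illustrates both.

```python
import numpy as np

# Monte Carlo sanity check (not a proof) of memorylessness, lambda = 0.5, y = 3.
rng = np.random.default_rng(7)
lam, y0 = 0.5, 3.0
x = rng.exponential(1 / lam, size=2_000_000)   # numpy takes the scale 1/lambda
tail = x[x > y0]

cond_mean = tail.mean()            # theory: y + 1/lambda = 5
excess_mean = (tail - y0).mean()   # theory: 1/lambda = 2
print(cond_mean, excess_mean)
```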
15. Consider n independent trials, where each trial results in outcome i with probability p_i = 1/3, i = 1, 2, 3. Let X_i denote the number of trials that result in outcome i among these n trials. Find the distribution of X_2. Find the conditional expectation of X_1 given X_2 > 0. Also determine Cov(X_1, X_2 | X_2 ≤ 1).
16. (a) Show that Cov(X, Y) = Cov(X, E(Y | X)).
(b) Suppose that, for constants a and b, E(Y | X) = a + bX. Show that b = Cov(X, Y)/Var(X).
17. Let X be a random variable which is uniformly distributed over the interval (0, 1). Given X = x, let Y be chosen from the interval (0, x] according to the pdf f(y | x) = 1/x for 0 < y ≤ x, and 0 otherwise. Find E(Y^k | X) and E(Y^k) for any fixed positive integer k.
18. Suppose that a signal X with a standard normal distribution is transmitted over a noisy channel, so that the received measurement is Y = X + W, where W is normally distributed with mean 0 and variance σ² and is independent of X. Find f_{X|Y}(x | y) and E(X | Y = y).
19. Suppose X ∼ Exp(1). Given X = x, Y is uniformly distributed on the interval [0, x]. Find E(Y).
20. Consider bacteria reproducing by cell division. In each time step, a bacterium either dies (with probability 0.25), stays the same (with probability 0.25), or splits into two (with probability 0.5). Assume the bacteria act independently and identically at all times. Write down an expression for the generating function of the distribution of the population size at time t = n. Given that there are 1000 bacteria in the population at time t = 50, what is the expected number of bacteria at time t = 51?
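For problem 20, the offspring pgf is P(s) = 0.25 + 0.25s + 0.5s², so the mean offspring number is P′(1) = 1.25 and the branching-process mean recursion gives E[Z_51 | Z_50 = 1000] = 1000 × 1.25 = 1250. A one-generation simulation (my own sketch) checks this.

```python
import numpy as np

# Monte Carlo sanity check (not a proof): one generation step from 1000 bacteria.
rng = np.random.default_rng(8)
reps, pop = 200_000, 1000
# per replicate: counts of (die, stay, split) among the 1000 bacteria
counts = rng.multinomial(pop, [0.25, 0.25, 0.5], size=reps)
next_gen = counts[:, 1] + 2 * counts[:, 2]   # stayers + 2 * splitters

mean_next = next_gen.mean()   # theory: 1000 * 1.25 = 1250
print(mean_next)
```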
21. Let N be a positive integer-valued random variable and X_1, X_2, ... a sequence of iid random variables, with N independent of the X_i's. Find the moment generating function (MGF) of the random sum S_N = X_1 + X_2 + ... + X_N in terms of the MGFs of the X_i's and N. Also show that:
(a) E[S_N] = E[N]E[X] (b) Var[S_N] = E[N]Var[X] + (E[X])² Var[N].
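The two identities in problem 21 can be illustrated numerically (a sketch, not a proof). The distributions below are arbitrary choices: N ~ 1 + Poisson(3), a positive integer variable with E[N] = 4 and Var[N] = 3, and X_i ~ Exp(1) with E[X] = Var[X] = 1, so the identities predict E[S_N] = 4 and Var[S_N] = 4 + 3 = 7.

```python
import numpy as np

# Monte Carlo sanity check (not a proof) of Wald-type identities.
rng = np.random.default_rng(9)
reps = 300_000
n_vals = 1 + rng.poisson(3, size=reps)       # positive integer N per replicate
# sum of N iid Exp(1) variables is a Gamma(N, 1) draw, done vectorized:
s = rng.gamma(shape=n_vals, scale=1.0)

mean_s = s.mean()    # theory: E[N]E[X] = 4
var_s = np.var(s)    # theory: E[N]Var[X] + E[X]^2 Var[N] = 7
print(mean_s, var_s)
```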
22. If E[Y | X] = 1, show that Var[XY] ≥ Var[X].
23. Suppose you participate in a chess tournament in which you play until you lose a game. Suppose you are a very average player: each game is equally likely to be a win, a loss, or a tie. You collect 2 points for each win, 1 point for each tie and 0 points for each loss. The outcome of each game is independent of the outcome of every other game. Let X_i be the number of points you earn for game i, and let Y equal the total number of points earned in the tournament. Find the moment generating function M_Y(t) and hence compute E(Y).
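For problem 23, the MGF route gives E(Y) = 3: the number of non-loss games is geometric with mean 2, and each such game contributes 1 or 2 points with equal probability (mean 1.5). A direct simulation of the tournament (my own sketch) agrees.

```python
import numpy as np

# Monte Carlo sanity check (not a proof): play until the first loss.
rng = np.random.default_rng(10)
reps = 100_000
total = np.zeros(reps)
for i in range(reps):
    while True:
        g = rng.integers(3)    # 0 = loss, 1 = tie, 2 = win, each w.p. 1/3
        if g == 0:
            break              # tournament ends at the first loss
        total[i] += g          # tie -> 1 point, win -> 2 points

ey = total.mean()              # theory: E(Y) = 3
print(ey)
```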
24. Let (X, Y) be a two-dimensional random variable with joint pdf f(x, y) = e^{−y} for 0 < x < y < ∞, and f(x, y) = 0 otherwise.
25. … W = X_1^α X_2^{2α} · · · X_n^{nα}, α > 0, where α is any constant. Determine E(W), Var(W) and the pdf of W.
26. Let (X, Y) be a two-dimensional random variable of continuous type. Assume that E(X), E(Y) and E(XY) exist. Suppose that E(X | Y = y) does not depend on y. Find E(XY).
27. Let X be a Poisson distributed random variable with parameter λ, where λ is itself a random variable following an exponential distribution with parameter 1. Find the (unconditional) probability mass function of X.
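Problem 27 is a classic mixture: integrating the Poisson pmf against the Exp(1) density gives the geometric pmf P(X = k) = (1/2)^{k+1}, k = 0, 1, 2, .... A simulation (a sketch, not the expected derivation) of the two-stage experiment matches the first few probabilities.

```python
import numpy as np

# Monte Carlo sanity check (not a proof): Poisson(lambda) mixed over
# lambda ~ Exp(1) should give P(X = k) = (1/2)^(k+1).
rng = np.random.default_rng(11)
lam = rng.exponential(1.0, size=1_000_000)
x = rng.poisson(lam)   # one Poisson draw per sampled lambda

p0 = (x == 0).mean()   # theory: 1/2
p1 = (x == 1).mean()   # theory: 1/4
p2 = (x == 2).mean()   # theory: 1/8
print(p0, p1, p2)
```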
28. Let X and Y be two discrete random variables with
P (X = x1 ) = p1 , P (X = x2 ) = 1 − p1 , 0 < p1 < 1;
and
P (Y = y1 ) = p2 , P (Y = y2 ) = 1 − p2 , 0 < p2 < 1.
If the correlation coefficient between X and Y is zero, check whether X and Y are independent random
variables.
29. Suppose the length of a telephone conversation between two persons is a random variable X with cumulative
distribution function
P(X ≤ t) = 0 for −∞ < t < 0, and P(X ≤ t) = 1 − e^{−0.04t} for 0 ≤ t < ∞,
where the time is measured in minutes.
(a) Given that the conversation has been going on for 20 minutes, compute the probability that it continues
for at least another 10 minutes.
(b) Show that, for any t > 0, E(X | X > t) = t + 25.
30. A real function g(x) is non-negative and satisfies g(x) ≥ b > 0 for all x ≥ a. Prove that, for a random variable X, if E(g(X)) exists then P(X ≥ a) ≤ E(g(X))/b.
31. Let X have a Poisson distribution with mean λ, where λ ≥ 0 is an integer. Show that P(0 < X < 2(λ + 1)) ≥ λ/(λ + 1).
32. Does there exist a random variable X for which P(µ − 2σ ≤ X ≤ µ + 2σ) = 0.6? Justify your answer.