Chapter 5 Solutions
Table 1: Joint PMF of X and Y

        Y = 1   Y = 2
X = 1    1/3    1/12
X = 2    1/6     0
X = 4   1/12    1/3
Solution:
(a)
P(X ≤ 2, Y > 1) = P(X = 1, Y = 2) + P(X = 2, Y = 2)
               = 1/12 + 0 = 1/12.
(b)
PX(x) = Σ_{y∈RY} P(X = x, Y = y).

PX(1) = 1/3 + 1/12 = 5/12,
PX(2) = 1/6 + 0 = 1/6,
PX(4) = 1/12 + 1/3 = 5/12.

So:
        5/12  for x = 1
PX(x) = 1/6   for x = 2
        5/12  for x = 4
Similarly,

PY(y) = Σ_{x∈RX} P(X = x, Y = y).

PY(1) = 1/3 + 1/6 + 1/12 = 7/12,
PY(2) = 1/12 + 0 + 1/3 = 5/12.

So:
PY(y) = 7/12  for y = 1
        5/12  for y = 2
(c)
P(Y = 2 | X = 1) = P(Y = 2, X = 1) / P(X = 1) = (1/12)/(5/12) = 1/5.
(d)
Using the result of the previous part, we observe that

P(Y = 2 | X = 1) = 1/5 ≠ P(Y = 2) = 5/12,

so we conclude that the two random variables are not independent.
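These table manipulations are easy to double-check by enumeration. A minimal Python sketch (exact arithmetic via fractions; the only inputs are the Table 1 values):

```python
from fractions import Fraction as F

# Joint PMF from Table 1: pmf[(x, y)] = P(X = x, Y = y)
pmf = {(1, 1): F(1, 3), (1, 2): F(1, 12),
       (2, 1): F(1, 6), (2, 2): F(0),
       (4, 1): F(1, 12), (4, 2): F(1, 3)}

# (a) P(X <= 2, Y > 1)
print(sum(p for (x, y), p in pmf.items() if x <= 2 and y > 1))   # 1/12

# (b) marginal PMFs of X and Y
px = {x: sum(p for (x2, _), p in pmf.items() if x2 == x) for x in (1, 2, 4)}
py = {y: sum(p for (_, y2), p in pmf.items() if y2 == y) for y in (1, 2)}
print(px)   # {1: 5/12, 2: 1/6, 4: 5/12}
print(py)   # {1: 7/12, 2: 5/12}

# (c) P(Y = 2 | X = 1) and (d) an independence check
print(pmf[(1, 2)] / px[1])            # 1/5
print(pmf[(1, 2)] == px[1] * py[2])   # False, so X and Y are not independent
```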
Solution:
(a)
Table 2: Values of Z = X − 2Y

        Y = 1   Y = 2
X = 1    −1      −3
X = 2     0      −2
X = 4     2       0
        P(X = 1, Y = 2) = 1/12                               for z = −3
        P(X = 2, Y = 2) = 0                                  for z = −2
PZ(z) = P(X = 1, Y = 1) = 1/3                                for z = −1
        P(X = 2, Y = 1) + P(X = 4, Y = 2) = 1/6 + 1/3 = 1/2  for z = 0
        P(X = 4, Y = 1) = 1/12                               for z = 2
So:
        1/12  for z = −3
        0     for z = −2
PZ(z) = 1/3   for z = −1
        1/2   for z = 0
        1/12  for z = 2
(b)
P(X = 2 | Z = 0) = P(X = 2, Y = 1) / P(Z = 0) = (1/6)/(1/2) = 1/3.
3. A box contains two coins: a regular coin and a biased coin with P(H) = 2/3. I choose
a coin at random and toss it once. I define the random variable X as a Bernoulli
random variable associated with this coin toss, i.e., X = 1 if the result of the coin
toss is heads and X = 0 otherwise. Then I take the remaining coin in the box
and toss it once. I define the random variable Y as a Bernoulli random variable
associated with the second coin toss. Find the joint PMF of X and Y . Are X and
Y independent?
Solution:
We choose each coin with probability 0.5. Call the regular coin "coin 1" and the biased coin "coin 2"; the first coin picked is coin 1 or coin 2 with equal probability 0.5.
Let X be the Bernoulli random variable associated with the toss of the first chosen coin, and let Y be the Bernoulli random variable associated with the toss of the remaining coin.
P(X = 0, Y = 0) = P(first coin = coin 1) P(T | coin 1) P(T | coin 2)
                + P(first coin = coin 2) P(T | coin 1) P(T | coin 2)
                = P(T | coin 1) P(T | coin 2)
                = 1/2 × 1/3 = 1/6.

P(X = 0, Y = 1) = P(first coin = coin 1) P(T | coin 1) P(H | coin 2)
                + P(first coin = coin 2) P(T | coin 2) P(H | coin 1)
                = 1/2 × 1/2 × 2/3 + 1/2 × 1/3 × 1/2 = 1/4.

P(X = 1, Y = 0) = P(first coin = coin 1) P(H | coin 1) P(T | coin 2)
                + P(first coin = coin 2) P(H | coin 2) P(T | coin 1)
                = 1/2 × 1/2 × 1/3 + 1/2 × 2/3 × 1/2 = 1/4.

P(X = 1, Y = 1) = P(first coin = coin 1) P(H | coin 1) P(H | coin 2)
                + P(first coin = coin 2) P(H | coin 1) P(H | coin 2)
                = P(H | coin 1) P(H | coin 2)
                = 1/2 × 2/3 = 1/3.
        Y = 0   Y = 1
X = 0    1/6     1/4
X = 1    1/4     1/3
By comparing the joint PMF with the marginal PMFs, we conclude that the two random variables are not independent. For example,

P(X = 0) = 5/12,
P(Y = 1) = 7/12,
P(X = 0, Y = 1) = 1/4 ≠ P(X = 0) × P(Y = 1).
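The joint PMF can also be sanity-checked by simulating the experiment directly. A Monte Carlo sketch (the sample size and the inline coin representation are illustrative choices, not part of the problem):

```python
import random

def trial():
    coins = [1/2, 2/3]        # P(H) of the regular and the biased coin
    random.shuffle(coins)     # pick the first coin at random
    x = 1 if random.random() < coins[0] else 0   # toss the chosen coin
    y = 1 if random.random() < coins[1] else 0   # toss the remaining coin
    return (x, y)

n = 10**6
counts = {(x, y): 0 for x in (0, 1) for y in (0, 1)}
for _ in range(n):
    counts[trial()] += 1

for xy in sorted(counts):
    print(xy, counts[xy] / n)   # should approach 1/6, 1/4, 1/4, 1/3
```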
4. Consider two random variables X and Y with joint PMF given by
PXY(k, l) = 1/2^(k+l),   for k, l = 1, 2, 3, ...
(a) Show that X and Y are independent and find the marginal PMFs of X and
Y.
(b) Find P(X² + Y² ≤ 10).
Solution:
(a)
RX = {1, 2, 3, · · · }
RY = {1, 2, 3, · · · }
PXY(k, l) = 1/2^(k+l).

PX(k) = Σ_{l∈RY} P(X = k, Y = l) = Σ_{l=1}^∞ 1/2^(k+l)
      = (1/2^k) Σ_{l=1}^∞ 1/2^l = 1/2^k.

PY(l) = Σ_{k∈RX} P(X = k, Y = l) = Σ_{k=1}^∞ 1/2^(k+l)
      = (1/2^l) Σ_{k=1}^∞ 1/2^k = 1/2^l.
By calculating the marginal PMFs we observe that PXY (k, l) = PX (k) · PY (l) for all
k ∈ RX and l ∈ RY (k, l = 1, 2, 3, · · · ). So, these two variables are independent.
(b)
There are only finitely many cases in which X² + Y² ≤ 10:

P(X² + Y² ≤ 10) = P(X = 1, Y = 1) + P(X = 1, Y = 2) + P(X = 2, Y = 1)
                + P(X = 2, Y = 2) + P(X = 1, Y = 3) + P(X = 3, Y = 1)
                = 1/2² + 1/2³ + 1/2³ + 1/2⁴ + 1/2⁴ + 1/2⁴ = 11/16.
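Both parts can be checked numerically by truncating the infinite sums; a short sketch (the truncation point 60 is an arbitrary but more than sufficient choice for these geometric tails):

```python
from fractions import Fraction as F

def pxy(k, l):
    return F(1, 2**(k + l))

# (a) truncated marginal sums should sit just below 1/2**k
for k in (1, 2, 3):
    print(sum(pxy(k, l) for l in range(1, 60)), F(1, 2**k))

# (b) only finitely many (k, l) satisfy k^2 + l^2 <= 10
print(sum(pxy(k, l) for k in range(1, 4) for l in range(1, 4)
          if k*k + l*l <= 10))   # 11/16
```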
5. Let X and Y be as defined in Problem 1. Also, suppose that we are given that
Y = 1.
(a) Find the conditional PMF of X given Y = 1. That is, find PX|Y (x|1).
(b) Find E[X|Y = 1].
(c) Find Var(X|Y = 1).
Solution:
(a)
PX|Y(x|1) = P(X = x, Y = 1) / P(Y = 1) = P(X = x, Y = 1) / (7/12) = (12/7) P(X = x, Y = 1).

PX|Y(1|1) = (12/7) × 1/3 = 4/7,
PX|Y(2|1) = (12/7) × 1/6 = 2/7,
PX|Y(4|1) = (12/7) × 1/12 = 1/7.

So:
            4/7  for x = 1
PX|Y(x|1) = 2/7  for x = 2
            1/7  for x = 4
(b)
E[X|Y = 1] = Σ_x x PX|Y(x|1) = 1 × 4/7 + 2 × 2/7 + 4 × 1/7 = 12/7.
(c)
E[X²|Y = 1] = Σ_x x² PX|Y(x|1) = 1 × 4/7 + 4 × 2/7 + 16 × 1/7 = 28/7 = 4.

Therefore, Var(X|Y = 1) = E[X²|Y = 1] − (E[X|Y = 1])² = 4 − (12/7)² = 52/49.
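A quick exact check of parts (a)-(c) from the conditional PMF (a sketch):

```python
from fractions import Fraction as F

cond = {1: F(4, 7), 2: F(2, 7), 4: F(1, 7)}   # P_{X|Y}(x|1) from part (a)

mean = sum(x * p for x, p in cond.items())         # 12/7
second = sum(x * x * p for x, p in cond.items())   # 28/7 = 4
print(mean, second, second - mean**2)              # 12/7, 4, 52/49
```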
6. The number of customers visiting a store in one hour has a Poisson distribution
with mean λ = 10. Each customer is a female with probability p = 3/4, independent
of other customers. Let X be the total number of customers in a one-hour interval
and Y be the total number of female customers in the same interval. Find the joint
PMF of X and Y .
Solution:
PXY(i, j) = P(X = i, Y = j) = 0   if i < j.

To find PXY(i, j) for i ≥ j, we can use the law of total probability:

PXY(i, j) = P(X = i, Y = j) = Σ_{k=0}^∞ P(X = i, Y = j | X = k) PX(k)   (law of total probability).

Only the k = i term is nonzero, and given X = i the number of female customers Y is Binomial(i, 3/4), so

PXY(i, j) = P(Y = j | X = i) PX(i) = C(i, j) (3/4)^j (1/4)^(i−j) · e^(−10) 10^i / i!,   for i ≥ j.
Solution: We can use LOTUS to find E[X/Y]. Let us first recall the following useful identities:
Σ_{k=1}^∞ k x^(k−1) = 1/(1 − x)²   for |x| < 1,
Σ_{k=1}^∞ x^k / k = −ln(1 − x)     for |x| < 1.
The first is obtained by taking the derivative of the geometric sum formula, and the second is the Taylor series of −ln(1 − x). Now, let's apply LOTUS.
E[(X² + Y²)/(XY)] = E[X/Y] + E[Y/X].

E[X/Y] = Σ_{n=1}^∞ Σ_{m=1}^∞ (m/n) P(X = m, Y = n)
       = Σ_{n=1}^∞ Σ_{m=1}^∞ (m/n) p² q^(m−1) q^(n−1)
       = Σ_{n=1}^∞ (1/n) p² q^(n−1) Σ_{m=1}^∞ m q^(m−1)
       = Σ_{n=1}^∞ (1/n) p² q^(n−1) · 1/(1 − q)²
       = Σ_{n=1}^∞ (1/n) q^(n−1)
       = (1/q) Σ_{n=1}^∞ q^n / n
       = (1/q) ln(1/(1 − q)) = (1/(1 − p)) ln(1/p),

where q = 1 − p.
By symmetry, we obtain the same answer for E[Y/X]. Therefore:

E[(X² + Y²)/(XY)] = (2/(1 − p)) ln(1/p).
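The closed form can be tested by simulation, assuming (as the PMF p²q^(m−1)q^(n−1) above indicates) that X and Y are i.i.d. Geometric(p) on {1, 2, ...}; p = 0.3 below is an arbitrary test value:

```python
import numpy as np

p = 0.3                            # arbitrary test value
rng = np.random.default_rng(0)
x = rng.geometric(p, size=10**6)   # support {1, 2, ...}
y = rng.geometric(p, size=10**6)

print(np.mean((x**2 + y**2) / (x * y)))   # Monte Carlo estimate
print(2 / (1 - p) * np.log(1 / p))        # closed form: ~3.44 for p = 0.3
```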
8. Consider the set of points in the set C:
Suppose that we pick a point (X, Y) from this set completely at random. Thus, each point has a probability of 1/11 of being chosen.
Solution:
(b) For i = −1, 0, 1, we can write
PX|Y(i|1) = PXY(i, 1) / PY(1) = (1/11)/(3/11) = 1/3,   for i = −1, 0, 1.

Thus, we conclude

PX|Y(i|1) = 1/3  for i = −1, 0, 1,
            0    otherwise.
9. Let X and Y be as in Problem 8: we pick a point (X, Y) from the set completely at random, so each point has a probability of 1/11 of being chosen.
Solution:
(a) To find E[X|Y = 1], we have

E[X|Y = 1] = Σ_{xi∈RX} xi PX|Y(xi|1) = (−1)·(1/3) + 0·(1/3) + 1·(1/3) = 0.

(b) Then

Var(X|Y = 1) = E[X²|Y = 1] − (E[X|Y = 1])² = 2/3 − 0² = 2/3.
(c) To find E[X| −1 ≤ Y ≤ 1], let A be the event that −1 ≤ Y ≤ 1. To find E[X|A], we need to find the conditional PMF PX|A(k) for k = −1, 0, 1. First, note that

P(A) = PY(0) + PY(1) + PY(−1) = 3/11 + 3/11 + 3/11 = 9/11.

Thus, for k = −1, 0, 1, we have

PX|A(k) = (11/9) P(X = k, A).
So, we can write

PX|A(−1) = (11/9) P(X = −1, A)
         = (11/9) [PXY(−1, 0) + PXY(−1, 1) + PXY(−1, −1)] = (11/9)(3/11) = 1/3,
PX|A(0)  = (11/9) P(X = 0, A)
         = (11/9) [PXY(0, 0) + PXY(0, 1) + PXY(0, −1)] = (11/9)(3/11) = 1/3,
PX|A(1)  = (11/9) P(X = 1, A)
         = (11/9) [PXY(1, 0) + PXY(1, 1) + PXY(1, −1)] = (11/9)(3/11) = 1/3.
Thus, we have

E[X|A] = Σ_{xi∈RX} xi PX|A(xi) = (−1)·(1/3) + 0·(1/3) + 1·(1/3) = 0.
(d) To find E[X² | −1 ≤ Y ≤ 1], we use the conditional PMF and LOTUS. We have

E[X²|A] = Σ_{xi∈RX} xi² PX|A(xi) = (−1)²·(1/3) + 0²·(1/3) + 1²·(1/3) = 2/3.
10. The number of cars being repaired at a small repair shop has the following PMF:

        1/8  for x = 0
        1/8  for x = 1
PX(x) = 1/4  for x = 2
        1/2  for x = 3
        0    otherwise
Each car that is being repaired is a four-door car with probability 3/4 and a two-door car with probability 1/4, independently from other cars and independently from the
number of cars being repaired. Let X be the number of four-door cars and Y be
the number of two-door cars currently being repaired.
Solution:
(a) Suppose that the number of cars being repaired is N. Then note that RX = RY = {0, 1, 2, 3} and X + Y = N. Also, given N = n, X is a sum of n independent Bernoulli(3/4) random variables. Thus, given N = n, X has a binomial distribution with parameters n and 3/4, so

X | N = n ∼ Binomial(n, p = 3/4),
Y | N = n ∼ Binomial(n, q = 1 − p = 1/4).
We have

PX(k) = Σ_{n=0}^3 P(X = k | N = n) PN(n)   (law of total probability)
      = Σ_{n=k}^3 C(n, k) p^k q^(n−k) PN(n),

that is,

        Σ_{n=0}^3 C(n, 0) (3/4)^0 (1/4)^n · PN(n)      for k = 0
        Σ_{n=1}^3 C(n, 1) (3/4)^1 (1/4)^(n−1) · PN(n)  for k = 1
PX(k) = Σ_{n=2}^3 C(n, 2) (3/4)^2 (1/4)^(n−2) · PN(n)  for k = 2
        Σ_{n=3}^3 C(n, 3) (3/4)^3 (1/4)^(n−3) · PN(n)  for k = 3
        0                                              otherwise
Carrying out the sums:

        23/128  for k = 0
        33/128  for k = 1
PX(k) = 45/128  for k = 2
        27/128  for k = 3
        0       otherwise
Similarly, for the marginal PMF of Y, p = 1/4 and q = 3/4:

        73/128  for k = 0
        43/128  for k = 1
PY(k) = 11/128  for k = 2
        1/128   for k = 3
        0       otherwise
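These marginals follow mechanically from the law of total probability; a sketch with exact arithmetic:

```python
from fractions import Fraction as F
from math import comb

pn = {0: F(1, 8), 1: F(1, 8), 2: F(1, 4), 3: F(1, 2)}   # PMF of N

def marginal(p):
    # P(k) = sum_{n=k}^{3} C(n, k) p^k (1 - p)^(n - k) PN(n)
    return {k: sum(comb(n, k) * p**k * (1 - p)**(n - k) * pn[n]
                   for n in range(k, 4)) for k in range(4)}

print(marginal(F(3, 4)))   # X: 23/128, 33/128, 45/128, 27/128
print(marginal(F(1, 4)))   # Y: 73/128, 43/128, 11/128, 1/128
```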
(b) To find the joint PMF of X and Y, we can again use the law of total probability:

PXY(i, j) = Σ_{n=0}^∞ P(X = i, Y = j | N = n) PN(n)   (law of total probability).

Given N = n, the event {X = i, Y = j} requires n = i + j, so only that term survives:

PXY(i, j) = C(i + j, i) (3/4)^i (1/4)^j PN(i + j),   for i + j ≤ 3.
11. Solution: the PMF of Z works out to

        1/25  for z = −4
        2/25  for z = −3
        3/25  for z = −2
        4/25  for z = −1
PZ(z) = 5/25  for z = 0
        4/25  for z = 1
        3/25  for z = 2
        2/25  for z = 3
        1/25  for z = 4
        0     otherwise
12. Consider two random variables X and Y with joint PMF given in Table 4.
Table 4: Joint PMF of X and Y

        Y = 0   Y = 1   Y = 2
X = 0    1/6     1/6     1/8
X = 1    1/8     1/6     1/4
Solution:
(a) Using the table, we find

PX(0) = 1/6 + 1/6 + 1/8 = 11/24,
PX(1) = 1/8 + 1/6 + 1/4 = 13/24,
PY(0) = 1/6 + 1/8 = 7/24,
PY(1) = 1/6 + 1/6 = 1/3,
PY(2) = 1/8 + 1/4 = 3/8.
Note that X and Y are not independent; for example, PXY(0, 0) = 1/6 ≠ PX(0) PY(0) = (11/24)(7/24).
(b) We have
PX|Y(0|0) = PXY(0, 0) / PY(0) = (1/6)/(7/24) = 4/7.

Thus,
PX|Y(1|0) = 1 − 4/7 = 3/7.

We conclude
X | Y = 0 ∼ Bernoulli(3/7).

Similarly, we find
PX|Y(0|1) = 1/2,
PX|Y(1|1) = 1/2.
(c) We note that the random variable Y can take three values: 0, 1, and 2. Thus, the random variable Z = E[X|Y] can take three values, as it is a function of Y. Specifically,

             E[X|Y = 0]  if Y = 0
Z = E[X|Y] = E[X|Y = 1]  if Y = 1
             E[X|Y = 2]  if Y = 2

Now, using the previous part, we have

E[X|Y = 0] = 3/7,   E[X|Y = 1] = 1/2,   E[X|Y = 2] = 2/3,

and since P(Y = 0) = 7/24, P(Y = 1) = 1/3, and P(Y = 2) = 3/8, we conclude that

             3/7  with probability 7/24
Z = E[X|Y] = 1/2  with probability 1/3
             2/3  with probability 3/8
So we can write

        7/24  if z = 3/7
        1/3   if z = 1/2
PZ(z) = 3/8   if z = 2/3
        0     otherwise
(d) Now that we have found the PMF of Z, we can find its mean and variance. Specifically,

E[Z] = (3/7)·(7/24) + (1/2)·(1/3) + (2/3)·(3/8) = 13/24.

We also note that EX = 13/24. Thus, here we have E[Z] = E[E[X|Y]] = EX, as guaranteed by the law of iterated expectations.
13. Let X, Y , and Z = E[X|Y ] be as in problem 12. Define the random variable V as
V = Var(X|Y ).
Solution:
Therefore,

               Var(X|Y = 0)  with probability 7/24
V = Var(X|Y) = Var(X|Y = 1)  with probability 1/3
               Var(X|Y = 2)  with probability 3/8
Since X|Y = y is Bernoulli (part (b)), each conditional variance has the form p(1 − p):

Var(X|Y = 0) = (3/7)·(4/7) = 12/49,
Var(X|Y = 1) = (1/2)·(1/2) = 1/4,
Var(X|Y = 2) = (2/3)·(1/3) = 2/9.
Thus,

               12/49  with probability 7/24
V = Var(X|Y) = 1/4    with probability 1/3
               2/9    with probability 3/8

So we can write

        7/24  if v = 12/49
        1/3   if v = 1/4
PV(v) = 3/8   if v = 2/9
        0     otherwise
Finally,

Var(X) = (13/24)·(11/24) = 143/576,
EV = 5/21,
Var(Z) = 41/4032.

Note that EV + Var(Z) = 960/4032 + 41/4032 = 1001/4032 = 143/576 = Var(X), as guaranteed by the law of total variance.
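The three pieces fit together through the law of total variance, Var(X) = E[V] + Var(Z); a quick exact check:

```python
from fractions import Fraction as F

py = {0: F(7, 24), 1: F(1, 3), 2: F(3, 8)}    # PMF of Y
z  = {0: F(3, 7), 1: F(1, 2), 2: F(2, 3)}     # Z = E[X|Y]
v  = {0: F(12, 49), 1: F(1, 4), 2: F(2, 9)}   # V = Var(X|Y)

ez    = sum(py[y] * z[y] for y in py)              # 13/24 = EX
ev    = sum(py[y] * v[y] for y in py)              # 5/21
var_z = sum(py[y] * z[y]**2 for y in py) - ez**2   # 41/4032
print(ev, var_z, ev + var_z)   # 5/21, 41/4032, 143/576 = Var(X)
```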
14. Let N be the number of phone calls made by the customers of a phone company in
a given hour. Suppose that N ∼ Poisson(β), where β > 0 is known. Let Xi be the
length of the i’th phone call, for i = 1, 2, ..., N. We assume Xi ’s are independent
of each other and also independent of N. We further assume that they are identically distributed, with
Xi ∼ Exponential(λ),
where λ > 0 is known. Let Y be the sum of the lengths of the phone calls, i.e.,
Y = Σ_{i=1}^N Xi.
Solution: Given N, Y is a sum of a known number of i.i.d. terms, so we use the law of iterated expectations:

E[Y] = E[E[Y|N]] = E[N E[X]] = E[N]/λ = β/λ.

For the variance, first condition on N:

Var(Y|N) = Σ_{i=1}^N Var(Xi|N)
         = Σ_{i=1}^N Var(Xi)   (since Xi's are independent of N)
         = N Var(X).

Thus, by the law of total variance, we have

Var(Y) = E[Var(Y|N)] + Var(E[Y|N])
       = E[N Var(X)] + Var(N E[X])
       = E[N]/λ² + Var(N)/λ²
       = β/λ² + β/λ² = 2β/λ².

We obtain E[Y] = β/λ and Var(Y) = 2β/λ².
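A simulation sketch of the compound sum (β and λ below are arbitrary test values; the Gamma trick encodes the fact that a sum of N i.i.d. Exponential(λ) variables is Gamma(N, 1/λ)):

```python
import numpy as np

beta, lam = 10.0, 2.0   # arbitrary test values
rng = np.random.default_rng(0)

n = rng.poisson(beta, size=10**6)
# Sum of N i.i.d. Exponential(lam) variables is Gamma(N, 1/lam); treat N = 0 as 0.
y = np.where(n > 0, rng.gamma(shape=np.maximum(n, 1), scale=1/lam), 0.0)

print(y.mean(), beta / lam)          # both ~5.0
print(y.var(), 2 * beta / lam**2)    # both ~5.0
```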
15. Let X and Y be two jointly continuous random variables with joint PDF
fXY(x, y) = (1/2)e^(−x) + cy/(1 + x)²   for 0 ≤ x, 0 ≤ y ≤ 1,
            0                           otherwise.
Solution:
(a) We have:
(The range RXY is the strip 0 ≤ x < ∞, 0 ≤ y ≤ 1.) The PDF must integrate to 1:

∫_{y=0}^1 ∫_{x=0}^∞ [(1/2)e^(−x) + cy/(1 + x)²] dx dy
   = ∫_0^1 [−(1/2)e^(−x) − cy/(1 + x)]_{x=0}^∞ dy
   = ∫_0^1 (1/2 + cy) dy
   = [y/2 + cy²/2]_0^1
   = 1/2 + c/2.
Thus, c = 1.
(b)
P(0 ≤ X ≤ 1, 0 ≤ Y ≤ 1/2)
   = ∫_{y=0}^{1/2} ∫_{x=0}^1 [(1/2)e^(−x) + y/(1 + x)²] dx dy
   = ∫_0^{1/2} [−(1/2)e^(−x) − y/(1 + x)]_{x=0}^1 dy
   = ∫_0^{1/2} [(1/2 + y) − (1/(2e) + y/2)] dy
   = 5/16 − 1/(4e).
(c)
P(0 ≤ X ≤ 1) = ∫_{y=0}^1 ∫_{x=0}^1 [(1/2)e^(−x) + y/(1 + x)²] dx dy
             = 3/4 − 1/(2e).
16. Let X and Y be two jointly continuous random variables with joint PDF
fXY(x, y) = e^(−xy)  for 1 ≤ x ≤ e, y > 0,
            0        otherwise.
Solution:
(a) (The range RXY is the strip 1 ≤ x ≤ e, y > 0.) For y > 0,

fY(y) = ∫_1^e e^(−xy) dx = (1/y)(e^(−y) − e^(−ey)).

Thus,
fY(y) = (1/y)(e^(−y) − e^(−ey))  for y > 0,
        0                        otherwise.
(b)
P(0 ≤ Y ≤ 1, 1 ≤ X ≤ √e) = ∫_{x=1}^{√e} ∫_{y=0}^1 e^(−xy) dy dx
                         = ∫_1^{√e} (1/x)(1 − e^(−x)) dx
                         = 1/2 − ∫_1^{√e} (1/x) e^(−x) dx.
17. Let X and Y be two jointly continuous random variables with joint PDF
fXY(x, y) = (1/4)x² + (1/6)y  for −1 ≤ x ≤ 1, 0 ≤ y ≤ 2,
            0                 otherwise.
Solution:
(a) (The range RXY is the rectangle −1 ≤ x ≤ 1, 0 ≤ y ≤ 2.) For −1 ≤ x ≤ 1,

fX(x) = ∫_0^2 [(1/4)x² + (1/6)y] dy = (1/2)x² + 1/3.

Thus,
fX(x) = (1/2)x² + 1/3  for −1 ≤ x ≤ 1,
        0              otherwise.
For 0 ≤ y ≤ 2,

fY(y) = ∫_{−1}^1 [(1/4)x² + (1/6)y] dx = 1/6 + (1/3)y.

Thus,
fY(y) = 1/6 + (1/3)y  for 0 ≤ y ≤ 2,
        0             otherwise.
(b) We have

P(X > 0, Y < 1) = ∫_{x=0}^1 ∫_{y=0}^1 [(1/4)x² + (1/6)y] dy dx = 1/6.
(c) We have

P(Y < 1) = ∫_0^1 [(1/3)y + 1/6] dy = 1/3.

Therefore, P(X > 0 | Y < 1) = P(X > 0, Y < 1)/P(Y < 1) = (1/6)/(1/3) = 1/2.
(e) We have (with A the part of the rectangle where x + y > 0, and B its complement):

P(X + Y > 0) = P((X, Y) ∈ A)
             = 1 − P(B)
             = 1 − ∫_{x=−1}^0 ∫_{y=0}^{−x} [(1/4)x² + (1/6)y] dy dx
             = 1 − 13/144 = 131/144.
18. Let X and Y be two jointly continuous random variables with joint CDF
FXY(x, y) = 1 − e^(−x) − e^(−2y) + e^(−(x+2y))  for x, y > 0,
            0                                   otherwise.

Solution:
(a) For x, y > 0 we can factor the CDF:

FXY(x, y) = (1 − e^(−x))(1 − e^(−2y))
          = (a function of x) · (a function of y)
          = FX(x) · FY(y),

with FX(x) = 1 − e^(−x) and FY(y) = 1 − e^(−2y).

(b) Differentiating, fXY(x, y) = ∂²FXY/∂x∂y = 2e^(−(x+2y)) for x, y > 0, so

P(X < 2Y) = ∫_{y=0}^∞ ∫_{x=0}^{2y} 2e^(−(x+2y)) dx dy = 1/2.

(c) Yes, as we saw above: the joint CDF factors into FX(x) · FY(y), so X and Y are independent.
Solution:
Here X ∼ N(0, 1) and A is the event X > 0, so P(A) = 1/2. Thus,

fX(x) = (1/√(2π)) e^(−x²/2),

and

fX|A(x) = √(2/π) e^(−x²/2)  for x > 0,
          0                 else.

FX|A(x) = (FX(x) − FX(0)) / P(A)
        = 2(FX(x) − 1/2)
        = 2FX(x) − 1.
(b)
E[X|A] = ∫_{−∞}^∞ x fX|A(x) dx = ∫_0^∞ √(2/π) x e^(−x²/2) dx.

Substituting u = e^(−x²/2), du = −x e^(−x²/2) dx:

E[X|A] = √(2/π) ∫_1^0 (−du) = √(2/π).
(c)
E[X²|A] = ∫_0^∞ x² fX|A(x) dx
        = 2 ∫_0^∞ x² fX(x) dx
        = ∫_{−∞}^∞ x² fX(x) dx   (since x² fX(x) is an even function)
        = E[X²] = 1.

Thus,
Var(X|A) = E[X²|A] − (E[X|A])² = 1 − 2/π.
20. Let X and Y be two jointly continuous random variables with joint PDF

fXY(x, y) = x² + (1/3)y  for −1 ≤ x ≤ 1, 0 ≤ y ≤ 1,
            0            otherwise.

For 0 ≤ y ≤ 1, find fX|Y(x|y) and P(X > 0 | Y = y).
Solution:
(a)
fY(y) = ∫_{−1}^{+1} [x² + (1/3)y] dx = [(1/3)x³ + (1/3)yx]_{−1}^{+1}
      = (2/3)y + 2/3   for 0 ≤ y ≤ 1.

Thus, for 0 ≤ y ≤ 1, we obtain

fX|Y(x|y) = fXY(x, y)/fY(y) = (3x² + y)/(2y + 2)  for −1 ≤ x ≤ 1,
            0                                     else.
(b)
P(X > 0 | Y = y) = ∫_0^1 fX|Y(x|y) dx = ∫_0^1 (3x² + y)/(2y + 2) dx
                 = (1/(2y + 2)) ∫_0^1 (3x² + y) dx
                 = (1/(2y + 2)) [x³ + yx]_0^1 = (y + 1)/(2(y + 1)) = 1/2.
21. Let X and Y be two jointly continuous random variables with joint PDF
fXY(x, y) = (1/2)x² + (2/3)y  for −1 ≤ x ≤ 1, 0 ≤ y ≤ 1,
            0                 otherwise.
Solution:
Let us first find fY|X(y|x). To do so, we need fX(x):

fX(x) = ∫_0^1 [(1/2)x² + (2/3)y] dy = [(1/2)x²y + (1/3)y²]_0^1
      = (1/2)x² + 1/3   for −1 ≤ x ≤ +1.

Thus:
fY|X(y|x) = fXY(x, y)/fX(x) = [(1/2)x² + (2/3)y] / [(1/2)x² + 1/3].
Therefore:
fY|X(y|0) = (2/3)y / (1/3) = 2y   for 0 ≤ y ≤ 1.

Thus:
E[Y|X = 0] = ∫_0^1 y fY|X(y|0) dy = ∫_0^1 2y² dy = 2/3,
E[Y²|X = 0] = ∫_0^1 y² fY|X(y|0) dy = ∫_0^1 2y³ dy = 1/2.

Therefore:
Var(Y|X = 0) = 1/2 − (2/3)² = 1/2 − 4/9 = 1/18.
22. Consider the set
E = {(x, y) : |x| + |y| ≤ 1}.
Suppose that we choose a point (X, Y ) uniformly at random in E. That is, the joint
PDF of X and Y is given by
fXY(x, y) = c  for (x, y) ∈ E,
            0  otherwise.

(Figure: E is the square with vertices (1, 0), (0, 1), (−1, 0), (0, −1).)
Solution:
(a)
We have:

1 = ∫∫_E c dx dy = c · (area of E) = c(√2 · √2) = 2c
→ c = 1/2.
(b)
For 0 ≤ x ≤ 1, we have:

fX(x) = ∫_{x−1}^{1−x} (1/2) dy = 1 − x.

(Figure: the boundary lines of E are x + y = 1, −x + y = 1, x − y = 1, and −x − y = 1.)
For −1 ≤ x ≤ 0, we have:

fX(x) = ∫_{−x−1}^{1+x} (1/2) dy = 1 + x.

Thus,
fX(x) = 1 − |x|  for −1 ≤ x ≤ 1,
        0        else.

Similarly, we find:
fY(y) = 1 − |y|  for −1 ≤ y ≤ 1,
        0        else.
(c)
fX|Y(x|y) = fXY(x, y)/fY(y) = (1/2)/(1 − |y|)
          = 1/(2(1 − |y|))   for |x| ≤ 1 − |y|.

Thus:
fX|Y(x|y) = 1/(2(1 − |y|))  for −1 + |y| ≤ x ≤ 1 − |y|,
            0               else.
23. Let X and Y be two independent Uniform(0, 2) random variables. Find P(XY < 1).
Solution:
We have:

P(XY < 1) = ∫_0^2 P(XY < 1 | Y = y) fY(y) dy   (law of total probability)
          = ∫_0^2 P(XY < 1 | Y = y) (1/2) dy   (since Y ∼ Uniform(0, 2))
          = (1/2) ∫_0^2 P(X < 1/y) dy          (since X and Y are independent).

Since X ∼ Uniform(0, 2), we have P(X < 1/y) = 1 for y ≤ 1/2 and P(X < 1/y) = 1/(2y) for y ≥ 1/2, so

P(XY < 1) = (1/2) [∫_0^{1/2} 1 dy + ∫_{1/2}^2 1/(2y) dy]
          = (1/2) [1/2 + ln 2] = 1/4 + (ln 2)/2 ≈ 0.597.
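The corrected value is easy to confirm by simulation (a sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=10**7)
y = rng.uniform(0, 2, size=10**7)

print(np.mean(x * y < 1))        # ~0.597
print(0.25 + 0.5 * np.log(2))    # exact: 1/4 + (ln 2)/2 ~ 0.5966
```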
24. Let X ∼ Exponential(1), and given X = x, let Y ∼ Uniform(0, x).
(a) Find EY.
(b) Find Var(Y).
Solution:
Remember that if Y ∼ Uniform(a, b), then EY = (a + b)/2 and Var(Y) = (b − a)²/12.

(a) Using the law of total expectation:

E[Y] = ∫_0^∞ E[Y|X = x] fX(x) dx
     = ∫_0^∞ E[Y|X = x] e^(−x) dx   (since fX(x) = e^(−x))
     = ∫_0^∞ (x/2) e^(−x) dx        (since Y|X = x ∼ Uniform(0, x))
     = (1/2) ∫_0^∞ x e^(−x) dx = (1/2) · 1 = 1/2.
(b)
EY² = ∫_0^∞ E[Y²|X = x] fX(x) dx = ∫_0^∞ E[Y²|X = x] e^(−x) dx   (law of total expectation).

Since Y|X = x ∼ Uniform(0, x), E[Y²|X = x] = x²/3, so

EY² = (1/3) ∫_0^∞ x² e^(−x) dx = 2/3.

Therefore:
Var(Y) = EY² − (EY)² = 2/3 − 1/4 = 5/12.
25. Let X and Y be two independent Uniform(0, 1) random variables. Find
(a) E[XY].
(b) E[e^(X+Y)].
(c) E[X² + Y² + XY].
(d) E[Y e^(XY)].
Solution:
(a)
E[XY] = E[X] · E[Y]   (since X and Y are independent)
      = (1/2) · (1/2) = 1/4.
(b)
By independence, E[e^(X+Y)] = E[e^X] · E[e^Y], and E[e^X] = ∫_0^1 e^x dx = e − 1.
Therefore:
E[e^(X+Y)] = (e − 1)(e − 1) = (e − 1)².
(c)
EX² = ∫_0^1 x² · 1 dx = 1/3.
Therefore:
E[X² + Y² + XY] = 1/3 + 1/3 + 1/4 = 11/12.
(d)
E[Y e^(XY)] = ∫_0^1 ∫_0^1 y e^(xy) dx dy   (LOTUS)
            = ∫_0^1 [e^(xy)]_{x=0}^1 dy = ∫_0^1 (e^y − 1) dy = [e^y − y]_0^1 = e − 2.
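All four answers can be verified at once with a short simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(size=10**7)
y = rng.uniform(size=10**7)

print(np.mean(x * y), 1/4)                       # (a)
print(np.mean(np.exp(x + y)), (np.e - 1)**2)     # (b) ~2.95
print(np.mean(x**2 + y**2 + x*y), 11/12)         # (c)
print(np.mean(y * np.exp(x * y)), np.e - 2)      # (d) ~0.718
```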
26. Let X and Y be two independent Uniform(0, 1) random variables, and Z = X/Y. Find the CDF and PDF of Z.
Solution:
First note that since RX = RY = [0, 1], we conclude RZ = [0, ∞). We first find the
CDF of Z.
FZ(z) = P(Z ≤ z) = P(X/Y ≤ z)
      = P(X ≤ zY)                           (since Y ≥ 0)
      = ∫_0^1 P(X ≤ zY | Y = y) fY(y) dy    (law of total probability)
      = ∫_0^1 P(X ≤ zy) dy                  (since X and Y are independent).

Note:
P(X ≤ zy) = 1   if y > 1/z,
            zy  if y ≤ 1/z.
(a) If 0 ≤ z ≤ 1, then zy ≤ 1 for every y ∈ [0, 1], so

FZ(z) = ∫_0^1 zy dy = [(1/2)zy²]_0^1 = z/2.
(b)
If z > 1, then

FZ(z) = ∫_0^{1/z} zy dy + ∫_{1/z}^1 1 dy
      = [(1/2)zy²]_0^{1/z} + [y]_{1/z}^1
      = 1/(2z) + 1 − 1/z = 1 − 1/(2z).
So

FZ(z) = z/2          for 0 ≤ z ≤ 1,
        1 − 1/(2z)   for z ≥ 1,
        0            for z < 0.

fZ(z) = (d/dz) FZ(z) = 1/2       for 0 ≤ z ≤ 1,
                       1/(2z²)   for z ≥ 1,
                       0         else.
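A quick empirical check of the derived CDF (a sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.uniform(size=10**7) / rng.uniform(size=10**7)

for t in (0.25, 0.5, 1.0, 2.0, 4.0):
    fz = t / 2 if t <= 1 else 1 - 1 / (2 * t)   # CDF derived above
    print(t, np.mean(z <= t), fz)               # empirical vs exact
```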
fU|X(u|x) = (1/√(2π)) e^(−(u−x)²/2),

fU(u) = (1/(2√π)) e^(−u²/4).
(c)
fX|U(x|u) = fXU(x, u) / fU(u)
          = fU|X(u|x) fX(x) / fU(u)
          = [(1/√(2π)) e^(−(u−x)²/2) · (1/√(2π)) e^(−x²/2)] / [(1/(2√π)) e^(−u²/4)]
          = (1/√π) e^(−(x − u/2)²).

Therefore, X | U = u ∼ N(u/2, 1/2).
(d)
By part (c),
E[X|U = u] = u/2,
Var(X|U = u) = 1/2.
28. Let X and Y be two independent standard normal random variables. Consider the
point (X, Y ) in the x − y plane. Let (R, Θ) be the corresponding polar coordinates
as shown in Figure 1. The inverse transformation is given by
X = R cos Θ
Y = R sin Θ
where, R ≥ 0 and −π < Θ ≤ π. Find the joint PDF of R and Θ. Show that R and
Θ are independent.
(Figure 1: the point (X, Y) in the plane, with X = R cos Θ and Y = R sin Θ.)
Solution: Here (X, Y) are jointly continuous and are related to (R, Θ) by a one-to-one relationship. We use the method of transformations. The function h(r, θ) is given by
x = h1 (r, θ) = r cos θ
y = h2 (r, θ) = r sin θ
Thus, we have

f_RΘ(r, θ) = f_XY(h1(r, θ), h2(r, θ)) |J|,

where

J = det [ ∂h1/∂r  ∂h1/∂θ ] = det [ cos θ   −r sin θ ] = r cos²θ + r sin²θ = r.
        [ ∂h2/∂r  ∂h2/∂θ ]       [ sin θ    r cos θ ]

Since X and Y are independent standard normals, f_XY(x, y) = (1/2π) e^(−(x²+y²)/2), and we conclude that

f_RΘ(r, θ) = r f_XY(r cos θ, r sin θ) = (1/2π) r e^(−r²/2) = f_R(r) f_Θ(θ),

where

f_R(r) = r e^(−r²/2)  for r ∈ [0, ∞),
         0            otherwise,

f_Θ(θ) = 1/(2π)  for θ ∈ (−π, π],
         0       otherwise.

Since the joint PDF factors into a function of r times a function of θ, R and Θ are independent.
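A simulation sketch: transform standard normal pairs to polar coordinates and compare against the marginals derived above (the correlation check at the end is only a crude necessary condition for independence):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10**6)
y = rng.standard_normal(10**6)
r = np.hypot(x, y)
theta = np.arctan2(y, x)

# F_R(r) = 1 - exp(-r^2/2); Theta should be Uniform(-pi, pi]
for t in (0.5, 1.0, 2.0):
    print(np.mean(r <= t), 1 - np.exp(-t**2 / 2))
print(np.mean(theta <= 0))          # ~1/2
print(np.corrcoef(r, theta)[0, 1])  # ~0 (necessary, not sufficient, for independence)
```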
Solution: Note that here 0 ≤ X, Y ≤ 1, so we can write the range of (R, Θ) as
follows
0 ≤ r,
0 ≤ r cos θ ≤ 1,
0 ≤ r sin θ ≤ 1.
30. Consider two random variables X and Y with joint PMF given in Table 5.
Table 5: Joint PMF of X and Y

        Y = 0   Y = 1   Y = 2
X = 0    1/6     1/4     1/8
X = 1    1/8     1/6     1/6
Solution:
First, we find PMFs of X and Y :
RX = {0, 1}:
PX(0) = 1/6 + 1/4 + 1/8 = (4 + 6 + 3)/24 = 13/24,
PX(1) = 1/8 + 1/6 + 1/6 = 11/24.

RY = {0, 1, 2}:
PY(0) = 1/6 + 1/8 = 7/24,
PY(1) = 1/4 + 1/6 = 5/12,
PY(2) = 1/8 + 1/6 = 7/24.
EX = 0 · (13/24) + 1 · (11/24) = 11/24,
EY = 0 · (7/24) + 1 · (5/12) + 2 · (7/24) = 1,
EXY = Σ i j PXY(i, j) = 1 · 1 · (1/6) + 1 · 2 · (1/6) = 1/6 + 1/3 = 1/2.

Therefore:
Cov(X, Y) = EXY − EX · EY = 1/2 − (11/24) · 1 = 1/24.
Var(X) = EX² − (EX)², with EX² = 11/24, so

Var(X) = (11/24) · (13/24) = 143/576
→ σX = √(11 × 13)/24 ≈ 0.498.

EY² = 0 · (7/24) + 1 · (5/12) + 4 · (7/24) = 19/12,
Var(Y) = 19/12 − 1 = 7/12
→ σY = √(7/12) ≈ 0.76.

→ ρ(X, Y) = Cov(X, Y)/(σX σY) = (1/24) / (√(143)/24 · √(7/12)) ≈ 0.11.
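A short exact computation from Table 5 (a sketch):

```python
from fractions import Fraction as F
import math

pmf = {(0, 0): F(1, 6), (0, 1): F(1, 4), (0, 2): F(1, 8),
       (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 6)}

ex  = sum(x * p for (x, y), p in pmf.items())
ey  = sum(y * p for (x, y), p in pmf.items())
exy = sum(x * y * p for (x, y), p in pmf.items())
vx  = sum(x * x * p for (x, y), p in pmf.items()) - ex**2
vy  = sum(y * y * p for (x, y), p in pmf.items()) - ey**2

cov = exy - ex * ey
print(cov)                               # 1/24
print(float(cov) / math.sqrt(vx * vy))   # ~0.11
```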
31. Define
Z = 11 − X + X²Y,
W = 3 − Y.
Find Cov(Z, W).
Solution:
Cov(Z, W) = Cov(11 − X + X²Y, 3 − Y)
          = Cov(−X + X²Y, −Y) = Cov(X, Y) − Cov(X²Y, Y)
          = −Cov(X²Y, Y)                   (since X and Y are independent, Cov(X, Y) = 0)
          = −E[X²Y · Y] + E[X²Y] · E[Y]
          = −E[X²Y²]                       (EY = 0)
          = −EX² · EY² = −1.
32. Let X and Y be two random variables. Suppose that σX² = 4 and σY² = 9. If we know that the two random variables Z = 2X − Y and W = X + Y are independent, find Cov(X, Y) and ρ(X, Y).
Solution:
Z and W are independent, thus Cov(Z, W) = 0. Therefore:

0 = Cov(Z, W) = Cov(2X − Y, X + Y)
  = 2 Var(X) + 2 Cov(X, Y) − Cov(Y, X) − Var(Y)
  = 2 × 4 + Cov(X, Y) − 9.

Therefore:
Cov(X, Y) = 1,
ρ(X, Y) = Cov(X, Y)/(σX σY) = 1/(2 × 3) = 1/6.
Solution:
X ∼ Uniform(1, 3). Therefore:
EX = 2,
Var(X) = (3 − 1)²/12 = 1/3.
Y|X ∼ Exponential(X). Therefore:

E[XY] = E[E[XY|X]]                   (law of iterated expectations)
      = E[X E[Y|X]] = E[X · (1/X)]   (since Y|X ∼ Exponential(X), E[Y|X] = 1/X)
      = E[1] = 1.

Similarly:
E[Y] = E[E[Y|X]] = E[1/X]
     = ∫_1^3 (1/2)(1/x) dx = (1/2) ln 3.

Cov(X, Y) = EXY − EX · EY = 1 − 2 · (1/2) ln 3 = 1 − ln 3.
34. Define
Z = 7 + X + Y,
W = 1 + Y.
Find ρ(Z, W).
Solution:
Cov(Z, W) = Cov(7 + X + Y, 1 + Y)
          = Cov(X + Y, Y)
          = Cov(X, Y) + Var(Y)   (since X and Y are independent, Cov(X, Y) = 0)
          = Var(Y) = 1.

Var(Z) = Var(X + Y)              (since X and Y are independent)
       = Var(X) + Var(Y) = 2,
Var(W) = Var(Y) = 1.

Therefore:
ρ(Z, W) = Cov(Z, W)/(σZ σW) = 1/√(2 × 1) = 1/√2.
35. Let X and Y be jointly normal random variables with parameters µX = −1, σX² = 4, µY = 1, σY² = 1, and ρ = −1/2.

(b) Note Cov(X, Y) = ρ σX σY = (−1/2) · 2 · 1 = −1.
36. Let X and Y be jointly normal random variables with parameters µX = 1, σX² = 4, µY = 1, σY² = 1, and ρ = 0.

Solution: µX = 2; µY = 1; σX = 2; σY = 3; ρ = −1/2.
(a)
3 − µX
E[Y |X = 3] = µY + ρσY ·
σX
−1 3−2 3 1
=1+ · (3) · = 1− =
2 2 4 4
(b)
Var(Y|X = 2) = (1 − ρ²) σY²
             = (1 − 1/4) · 9 = 27/4.
(c) Let W = X + Y. Then

P(X + 2Y ≤ 5 | X + Y = 3) = P(Y ≤ 2 | W = 3).

Here µW = µX + µY = 3, Var(W) = σX² + σY² + 2ρσXσY = 4 + 9 − 6 = 7, and Cov(W, Y) = Cov(X, Y) + Var(Y) = −3 + 9 = 6, so ρ(W, Y) = 6/(3√7) = 2/√7. Therefore,

E[Y|W = 3] = µY + ρ(W, Y) · σY · (3 − µW)/σW = 1,
Var(Y|W = 3) = (1 − ρ(W, Y)²) · Var(Y) = (1 − 4/7) · 9 = 27/7.

Therefore, Y | W = 3 ∼ N(1, 27/7), and

P(Y ≤ 2 | W = 3) = Φ((2 − 1)/√(27/7)) = Φ(√(7/27)) ≈ 0.69.
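Since the 27/7 above corrects the draft's 45/7, here is a simulation sketch of part (c): draw (X, Y) jointly normal with the stated parameters and condition on X + Y falling in a small window around 3 (the window width 0.05 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([2.0, 1.0])
cov = np.array([[4.0, -3.0],    # Cov(X, Y) = rho * sigX * sigY = -3
                [-3.0, 9.0]])
xy = rng.multivariate_normal(mu, cov, size=10**6)
w = xy[:, 0] + xy[:, 1]

sel = np.abs(w - 3.0) < 0.05    # condition on X + Y ~ 3
ycond = xy[sel, 1]
print(ycond.mean(), ycond.var())   # ~1 and ~27/7 ~ 3.86
print(np.mean(ycond <= 2.0))       # ~Phi(sqrt(7/27)) ~ 0.695
```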