Chapter 5 Sol

1. Consider two random variables X and Y with joint PMF given in Table 1.

Table 1: Joint PMF of X and Y in Problem 1

        Y = 1   Y = 2
X = 1    1/3    1/12
X = 2    1/6     0
X = 4   1/12    1/3

(a) Find P (X ≤ 2, Y > 1).


(b) Find the marginal PMFs of X and Y .
(c) Find P (Y = 2|X = 1).
(d) Are X and Y independent?

Solution:
(a)

P (X ≤ 2, Y > 1) = P (X = 1, Y = 2) + P (X = 2, Y = 2)
                 = 1/12 + 0 = 1/12.
(b)

PX (x) = Σ_{y∈RY} P (X = x, Y = y).

PX (1) = 1/3 + 1/12 = 5/12,
PX (2) = 1/6 + 0 = 1/6,
PX (4) = 1/12 + 1/3 = 5/12.

So:

PX (x) = 5/12 for x = 1; 1/6 for x = 2; 5/12 for x = 4; 0 otherwise.

PY (y) = Σ_{x∈RX} P (X = x, Y = y).

PY (1) = 1/3 + 1/6 + 1/12 = 7/12,
PY (2) = 1/12 + 0 + 1/3 = 5/12.

So:

PY (y) = 7/12 for y = 1; 5/12 for y = 2; 0 otherwise.

(c)

P (Y = 2|X = 1) = P (Y = 2, X = 1) / P (X = 1) = (1/12) / (5/12) = 1/5.

(d)
Using the results of the previous part, we observe that

P (Y = 2|X = 1) = 1/5 ≠ P (Y = 2) = 5/12.

So we conclude that the two variables are not independent.
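The marginal and conditional computations above are mechanical enough to check with a short script (a sketch using Python's fractions module; the dictionary encodes Table 1):

```python
from fractions import Fraction as F

# Joint PMF from Table 1 as a dict {(x, y): P(X=x, Y=y)}
pmf = {(1, 1): F(1, 3), (1, 2): F(1, 12),
       (2, 1): F(1, 6), (2, 2): F(0),
       (4, 1): F(1, 12), (4, 2): F(1, 3)}

# Marginals: sum the joint PMF over the other variable
p_x = {x: sum(p for (i, j), p in pmf.items() if i == x) for x in (1, 2, 4)}
p_y = {y: sum(p for (i, j), p in pmf.items() if j == y) for y in (1, 2)}

# Conditional P(Y = 2 | X = 1) = P(X = 1, Y = 2) / P(X = 1)
cond = pmf[(1, 2)] / p_x[1]

print(p_x, p_y, cond)
```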

2. Let X and Y be as defined in Problem 1. I define a new random variable Z = X−2Y .

(a) Find PMF of Z.


(b) Find P (X = 2|Z = 0).

Solution:
(a)

Table 2: Values of Z

Y =1 Y =2

X=1 −1 −3

X=2 0 −2

X=4 2 0

Z = X − 2Y , so reading off Table 2:

PZ (−3) = P (X = 1, Y = 2) = 1/12
PZ (−2) = P (X = 2, Y = 2) = 0
PZ (−1) = P (X = 1, Y = 1) = 1/3
PZ (0)  = P (X = 2, Y = 1) + P (X = 4, Y = 2) = 1/6 + 1/3 = 1/2
PZ (2)  = P (X = 4, Y = 1) = 1/12

So:

PZ (z) = 1/12 for z = −3; 0 for z = −2; 1/3 for z = −1; 1/2 for z = 0; 1/12 for z = 2; 0 otherwise.

(b)

P (X = 2|Z = 0) = P (X = 2, Y = 1) / P (Z = 0) = (1/6) / (1/2) = 1/3.

3. A box contains two coins: a regular coin and a biased coin with P (H) = 2/3. I choose
a coin at random and toss it once. I define the random variable X as a Bernoulli
random variable associated with this coin toss, i.e., X = 1 if the result of the coin
toss is heads and X = 0 otherwise. Then I take the remaining coin in the box
and toss it once. I define the random variable Y as a Bernoulli random variable
associated with the second coin toss. Find the joint PMF of X and Y . Are X and
Y independent?

Solution:
We choose each coin first with probability 0.5. We call the regular coin "coin 1" and the biased coin "coin 2".
Let X be the Bernoulli random variable associated with the toss of the first chosen coin.

P (X = 1) = P (coin1)P (H|coin 1) + P (coin2)P (H|coin 2)
          = 1/2 · 1/2 + 1/2 · 2/3 = 7/12.
P (X = 0) = P (coin1)P (T |coin 1) + P (coin2)P (T |coin 2)
          = 1/2 · 1/2 + 1/2 · 1/3 = 5/12.

The same argument applies to the second toss, since the remaining coin is equally likely to be either coin:

P (Y = 1) = 7/12,   P (Y = 0) = 5/12.

For the joint PMF, condition on which coin is tossed first:

P (X = 0, Y = 0) = P (first = coin1)P (T |coin 1)P (T |coin 2) + P (first = coin2)P (T |coin 2)P (T |coin 1)
                 = P (T |coin 1)P (T |coin 2) = 1/2 · 1/3 = 1/6.
P (X = 0, Y = 1) = P (first = coin1)P (T |coin 1)P (H|coin 2) + P (first = coin2)P (T |coin 2)P (H|coin 1)
                 = 1/2 · 1/2 · 2/3 + 1/2 · 1/3 · 1/2 = 1/4.
P (X = 1, Y = 0) = P (first = coin1)P (H|coin 1)P (T |coin 2) + P (first = coin2)P (H|coin 2)P (T |coin 1)
                 = 1/2 · 1/2 · 1/3 + 1/2 · 2/3 · 1/2 = 1/4.
P (X = 1, Y = 1) = P (H|coin 1)P (H|coin 2) = 1/2 · 2/3 = 1/3.

Table 3: Joint PMF of X and Y

        Y = 0   Y = 1
X = 0    1/6     1/4
X = 1    1/4     1/3

By comparing the joint PMF with the marginal PMFs, we conclude that the two variables are not independent. For example:

P (X = 0) = 5/12,  P (Y = 1) = 7/12,
P (X = 0, Y = 1) = 1/4 ≠ P (X = 0) × P (Y = 1).
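The joint PMF can also be obtained by exact enumeration of the experiment (a sketch; the coin order is equally likely, then each toss outcome is weighted by its coin's head probability):

```python
from fractions import Fraction as F

# Enumerate the two-coin experiment exactly: which coin is tossed first
# (probability 1/2 each), then the outcomes of the two tosses.
p_heads = {"coin1": F(1, 2), "coin2": F(2, 3)}  # regular and biased coin

joint = {(0, 0): F(0), (0, 1): F(0), (1, 0): F(0), (1, 1): F(0)}
for first, second in [("coin1", "coin2"), ("coin2", "coin1")]:
    for x in (0, 1):          # result of first toss (X)
        for y in (0, 1):      # result of second toss (Y)
            px = p_heads[first] if x == 1 else 1 - p_heads[first]
            py = p_heads[second] if y == 1 else 1 - p_heads[second]
            joint[(x, y)] += F(1, 2) * px * py

print(joint)
```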

5
4. Consider two random variables X and Y with joint PMF given by

PXY (k, l) = 1/2^(k+l),  for k, l = 1, 2, 3, ...

(a) Show that X and Y are independent and find the marginal PMFs of X and Y .
(b) Find P (X^2 + Y^2 ≤ 10).

Solution:
(a)

RX = RY = {1, 2, 3, · · · }.

PX (k) = Σ_{l∈RY} P (X = k, Y = l) = Σ_{l=1}^∞ 1/2^(k+l)
       = (1/2^k) Σ_{l=1}^∞ 1/2^l = 1/2^k.

PY (l) = Σ_{k∈RX} P (X = k, Y = l) = Σ_{k=1}^∞ 1/2^(k+l)
       = (1/2^l) Σ_{k=1}^∞ 1/2^k = 1/2^l.

From the marginal PMFs we observe that PXY (k, l) = PX (k) · PY (l) for all k, l = 1, 2, 3, · · · . So these two variables are independent.

(b)
The pairs for which X^2 + Y^2 ≤ 10 are (1, 1), (1, 2), (2, 1), (2, 2), (1, 3), (3, 1):

P (X^2 + Y^2 ≤ 10) = 1/2^2 + 1/2^3 + 1/2^3 + 1/2^4 + 1/2^4 + 1/2^4 = 11/16.
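Since the event {X^2 + Y^2 ≤ 10} involves only finitely many pairs, the answer can be checked by brute-force enumeration (a sketch):

```python
from fractions import Fraction as F

# P(X^2 + Y^2 <= 10) for PXY(k, l) = 1/2^(k+l): only pairs with
# k, l <= 3 can satisfy k^2 + l^2 <= 10, so a small grid suffices.
prob = sum(F(1, 2 ** (k + l))
           for k in range(1, 4)
           for l in range(1, 4)
           if k * k + l * l <= 10)

print(prob)
```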

5. Let X and Y be as defined in Problem 1. Also, suppose that we are given that
Y = 1.

(a) Find the conditional PMF of X given Y = 1. That is, find PX|Y (x|1).
(b) Find E[X|Y = 1].
(c) Find Var(X|Y = 1).

Solution:
(a)

PX|Y (x|1) = P (X = x, Y = 1) / P (Y = 1) = P (X = x, Y = 1) / (7/12) = (12/7) P (X = x, Y = 1).

PX|Y (1|1) = (12/7) × 1/3 = 4/7,
PX|Y (2|1) = (12/7) × 1/6 = 2/7,
PX|Y (4|1) = (12/7) × 1/12 = 1/7.

So:

PX|Y (x|1) = 4/7 for x = 1; 2/7 for x = 2; 1/7 for x = 4; 0 otherwise.

(b)

E[X|Y = 1] = Σ_x x PX|Y (x|1) = 1 × 4/7 + 2 × 2/7 + 4 × 1/7 = 12/7.

(c)

E[X^2 |Y = 1] = Σ_x x^2 PX|Y (x|1) = 1 × 4/7 + 4 × 2/7 + 16 × 1/7 = 28/7 = 4.

Var(X|Y = 1) = E[X^2 |Y = 1] − (E[X|Y = 1])^2
             = 4 − (12/7)^2
             = 52/49.

6. The number of customers visiting a store in one hour has a Poisson distribution
with mean λ = 10. Each customer is a female with probability p = 3/4, independent
of other customers. Let X be the total number of customers in a one-hour interval
and Y be the total number of female customers in the same interval. Find the joint
PMF of X and Y .

Solution: To find the joint PMF of X and Y , first note that

PXY (i, j) = P (X = i, Y = j) = 0  if i < j.

To find PXY (i, j) for i ≥ j, we can use the law of total probability:

PXY (i, j) = Σ_{k=0}^∞ P (X = i, Y = j|X = k)PX (k)   (law of total probability).

But note that P (X = i, Y = j|X = k) = 0 if i ≠ k, thus

PXY (i, j) = P (X = i, Y = j|X = i)PX (i)
           = P (Y = j|X = i)PX (i)
           = C(i, j) (3/4)^j (1/4)^(i−j) · e^(−10) 10^i / i!
           = e^(−10) (10/4)^i 3^j / ((i − j)! j!),   for i ≥ j.
7. Let X and Y be two independent Geometric(p) random variables. Find E[(X^2 + Y^2)/(XY )].

Solution: We can use LOTUS to find E[X/Y ] and E[Y /X]. Let q = 1 − p, and first recall the following useful identities:

Σ_{k=1}^∞ k x^(k−1) = 1/(1 − x)^2   for |x| < 1,
− ln(1 − x) = Σ_{k=1}^∞ x^k / k     for |x| < 1.

The first is obtained by differentiating the geometric sum formula, and the second is a Taylor series. Now, let's apply LOTUS:

E[(X^2 + Y^2)/(XY )] = E[X/Y ] + E[Y /X].

E[X/Y ] = Σ_{n=1}^∞ Σ_{m=1}^∞ (m/n) P (X = m, Y = n)
        = Σ_{n=1}^∞ Σ_{m=1}^∞ (m/n) p^2 q^(m−1) q^(n−1)
        = Σ_{n=1}^∞ (1/n) p^2 q^(n−1) Σ_{m=1}^∞ m q^(m−1)
        = Σ_{n=1}^∞ (1/n) p^2 q^(n−1) · 1/(1 − q)^2
        = Σ_{n=1}^∞ (1/n) q^(n−1)
        = (1/q) Σ_{n=1}^∞ q^n / n
        = (1/(1 − p)) ln(1/p).

By symmetry, we obtain the same answer for E[Y /X]. Therefore:

E[(X^2 + Y^2)/(XY )] = (2/(1 − p)) ln(1/p).
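The closed form can be checked numerically by truncating the double sum; the terms decay geometrically, so a modest cutoff is plenty (a sketch with p = 0.5):

```python
import math

# Numerical check of E[(X^2+Y^2)/(XY)] = (2/(1-p)) ln(1/p) for
# independent Geometric(p) X, Y (support {1, 2, ...}).
p = 0.5
q = 1 - p
N = 400  # truncation point; the geometric tail beyond this is negligible
approx = sum((m * m + n * n) / (m * n) * p * q ** (m - 1) * p * q ** (n - 1)
             for m in range(1, N + 1) for n in range(1, N + 1))
exact = 2 / (1 - p) * math.log(1 / p)

print(approx, exact)
```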

8. Consider the set of points in the set C:

C = {(x, y)|x, y ∈ Z, x^2 + |y| ≤ 2}.

Suppose that we pick a point (X, Y ) from this set completely at random. Thus, each point has a probability of 1/11 of being chosen.

(a) Find the joint and marginal PMFs of X and Y .
(b) Find the conditional PMF of X given Y = 1.
(c) Are X and Y independent?
(d) Find E[XY^2].

Solution:

(a) Note that here

RXY = G = {(x, y)|x, y ∈ Z, x^2 + |y| ≤ 2}.

Thus, the joint PMF is given by

PXY (x, y) = 1/11 for (x, y) ∈ G; 0 otherwise.

To find the marginal PMF of Y , PY (j), we sum the joint PMF over x. Thus,

PY (−2) = PXY (0, −2) = 1/11,
PY (−1) = PXY (0, −1) + PXY (−1, −1) + PXY (1, −1) = 3/11,
PY (0)  = PXY (0, 0) + PXY (1, 0) + PXY (−1, 0) = 3/11,
PY (1)  = PXY (0, 1) + PXY (−1, 1) + PXY (1, 1) = 3/11,
PY (2)  = PXY (0, 2) = 1/11.

Similarly, we can find

PX (i) = 3/11 for i = −1, 1; 5/11 for i = 0; 0 otherwise.

(b) For i = −1, 0, 1, we can write

PX|Y (i|1) = PXY (i, 1) / PY (1) = (1/11) / (3/11) = 1/3.

Thus, we conclude

PX|Y (i|1) = 1/3 for i = −1, 0, 1; 0 otherwise.

By looking at the above conditional PMF, we conclude that given Y = 1, X is uniformly distributed over the set {−1, 0, 1}.

(c) X and Y are not independent. We can see this as the conditional PMF of X given Y = 1 (calculated above) is not the same as the marginal PMF of X, PX (x).

(d) We have

E[XY^2] = Σ_{(i,j)∈RXY} i j^2 PXY (i, j) = (1/11) Σ_{(i,j)∈RXY} i j^2 = 0,

since for every point (i, j) in G, the point (−i, j) is also in G, so the terms cancel in pairs.

9. Consider again the set

G = {(x, y)|x, y ∈ Z, x^2 + |y| ≤ 2}.

Suppose that we pick a point (X, Y ) from this set completely at random. Thus, each point has a probability of 1/11 of being chosen.

(a) Find E[X|Y = 1].
(b) Find Var(X|Y = 1).
(c) Find E[X | |Y | ≤ 1].
(d) Find E[X^2 | |Y | ≤ 1].

Solution:

(a) To find E[X|Y = 1], we have

E[X|Y = 1] = Σ_{xi∈RX} xi PX|Y (xi |1).

In the previous problem, we found that given Y = 1, X is uniformly distributed over the set {−1, 0, 1}. Thus we conclude that

E[X|Y = 1] = (1/3)(−1 + 0 + 1) = 0.

(b)

E[X^2 |Y = 1] = Σ_{xi∈RX} xi^2 PX|Y (xi |1)
             = (−1)^2 · 1/3 + 0^2 · 1/3 + 1^2 · 1/3 = 2/3.

Var(X|Y = 1) = E[X^2 |Y = 1] − (E[X|Y = 1])^2 = 2/3 − 0^2 = 2/3.

(c) To find E[X| − 1 ≤ Y ≤ 1], let A be the event that −1 ≤ Y ≤ 1. To find E[X|A], we need the conditional PMF, PX|A (k), for k = −1, 0, 1. First, note that

P (A) = PY (0) + PY (1) + PY (−1) = 3/11 + 3/11 + 3/11 = 9/11.

Thus, for k = −1, 0, 1, we have

PX|A (k) = (11/9) P (X = k, A).

So, we can write

PX|A (−1) = (11/9)[PXY (−1, 0) + PXY (−1, 1) + PXY (−1, −1)] = (11/9)(3/11) = 1/3,
PX|A (0)  = (11/9)[PXY (0, 0) + PXY (0, 1) + PXY (0, −1)] = (11/9)(3/11) = 1/3,
PX|A (1)  = (11/9)[PXY (1, 0) + PXY (1, 1) + PXY (1, −1)] = (11/9)(3/11) = 1/3.

Thus, we have

E[X|A] = Σ_{xi∈RX} xi PX|A (xi) = (−1) · 1/3 + 0 · 1/3 + 1 · 1/3 = 0.

(d) To find E[X^2 | − 1 ≤ Y ≤ 1], we use the conditional PMF and LOTUS. We have

E[X^2 |A] = Σ_{xi∈RX} xi^2 PX|A (xi) = (−1)^2 · 1/3 + 0^2 · 1/3 + 1^2 · 1/3 = 2/3.
10. The number of cars being repaired at a small repair shop has the following PMF:

PN (n) = 1/8 for n = 0; 1/8 for n = 1; 1/4 for n = 2; 1/2 for n = 3; 0 otherwise.

Each car that is being repaired is a four-door car with probability 3/4 and a two-door car with probability 1/4, independently of the other cars and independently of the number of cars being repaired. Let X be the number of four-door cars and Y be the number of two-door cars currently being repaired.

(a) Find the marginal PMFs of X and Y .
(b) Find the joint PMF of X and Y .
(c) Are X and Y independent?
(d) Find E[XY^2].

Solution:

(a) Suppose that the number of cars being repaired is N . Then note that RX = RY = {0, 1, 2, 3} and X + Y = N . Also, given N = n, X is a sum of n independent Bernoulli(3/4) random variables. Thus, given N = n, X has a binomial distribution with parameters n and 3/4, so

X|N = n ∼ Binomial(n, p = 3/4);
Y |N = n ∼ Binomial(n, q = 1 − p = 1/4).

By the law of total probability,

PX (k) = Σ_{n=0}^3 P (X = k|N = n)PN (n) = Σ_{n=k}^3 C(n, k) p^k q^(n−k) PN (n).

Carrying out the sums gives

PX (k) = 23/128 for k = 0; 33/128 for k = 1; 45/128 for k = 2; 27/128 for k = 3; 0 otherwise.

Similarly, for the marginal PMF of Y , with p = 1/4 and q = 3/4:

PY (k) = 73/128 for k = 0; 43/128 for k = 1; 11/128 for k = 2; 1/128 for k = 3; 0 otherwise.

(b) To find the joint PMF of X and Y , we can also use the law of total probability:

PXY (i, j) = Σ_{n=0}^3 P (X = i, Y = j|N = n)PN (n)   (law of total probability).

But note that P (X = i, Y = j|N = n) = 0 if n ≠ i + j, thus

PXY (i, j) = P (X = i, Y = j|N = i + j)PN (i + j)
           = P (X = i|N = i + j)PN (i + j)
           = C(i + j, i) (3/4)^i (1/4)^j PN (i + j),

for i + j ∈ {0, 1, 2, 3}, and PXY (i, j) = 0 otherwise.

(c) X and Y are not independent, since as we saw above

PXY (i, j) ≠ PX (i)PY (j).
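The law-of-total-probability sums for the marginals are easy to verify exactly (a sketch using Python's fractions and math.comb):

```python
from fractions import Fraction as F
from math import comb

# Law-of-total-probability check for the repair-shop problem:
# N has the PMF below; given N = n, X ~ Binomial(n, 3/4).
p_n = {0: F(1, 8), 1: F(1, 8), 2: F(1, 4), 3: F(1, 2)}
p = F(3, 4)

def marginal(p_success):
    # P(X = k) = sum_n C(n, k) p^k (1-p)^(n-k) P(N = n)
    return {k: sum(comb(n, k) * p_success ** k * (1 - p_success) ** (n - k) * pn
                   for n, pn in p_n.items() if n >= k)
            for k in range(4)}

p_x = marginal(p)        # four-door cars
p_y = marginal(1 - p)    # two-door cars
print(p_x, p_y)
```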

11. Let X and Y be two independent random variables with PMFs

PX (k) = PY (k) = 1/5 for k = 1, 2, 3, 4, 5; 0 otherwise.
Define Z = X − Y . Find the PMF of Z.

Solution: To find the PMF of Z, note that RZ = {−4, −3, · · · , 4} and, for each z,

PZ (z) = Σ_{i−j=z} P (X = i, Y = j)
       = Σ_{i−j=z} P (X = i)P (Y = j)   (since X and Y are independent)
       = (number of pairs (i, j) ∈ {1, · · · , 5}^2 with i − j = z) / 25
       = (5 − |z|)/25.

So:

PZ (z) = 1/25 for z = ±4; 2/25 for z = ±3; 3/25 for z = ±2; 4/25 for z = ±1; 5/25 for z = 0; 0 otherwise.
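The triangular shape of this PMF falls out of a direct convolution over the 25 equally likely pairs (a sketch):

```python
from fractions import Fraction as F
from collections import defaultdict

# PMF of Z = X - Y for independent X, Y uniform on {1,...,5}:
# convolve the two PMFs by enumerating all 25 equally likely pairs.
p_z = defaultdict(F)
for i in range(1, 6):
    for j in range(1, 6):
        p_z[i - j] += F(1, 25)

print(dict(p_z))
```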

12. Consider two random variables X and Y with joint PMF given in Table 4.

Table 4: Joint PMF of X and Y in Problem 12

        Y = 0   Y = 1   Y = 2
X = 0    1/6     1/6     1/8
X = 1    1/8     1/6     1/4

Define the random variable Z as Z = E[X|Y ].

(a) Find the marginal PMFs of X and Y .
(b) Find the conditional PMF of X given Y = 0 and Y = 1, i.e., find PX|Y (x|0) and PX|Y (x|1).
(c) Find the PMF of Z.
(d) Find E[Z], and check that E[Z] = E[X].
(e) Find Var(Z).

Solution:

(a) Using the table we find

PX (0) = 1/6 + 1/6 + 1/8 = 11/24,
PX (1) = 1/8 + 1/6 + 1/4 = 13/24,
PY (0) = 1/6 + 1/8 = 7/24,
PY (1) = 1/6 + 1/6 = 1/3,
PY (2) = 1/8 + 1/4 = 3/8.

Note that X and Y are not independent.
(b) We have

PX|Y (0|0) = PXY (0, 0) / PY (0) = (1/6) / (7/24) = 4/7.

Thus,

PX|Y (1|0) = 1 − 4/7 = 3/7.

We conclude

X|Y = 0 ∼ Bernoulli(3/7).

Similarly, we find

PX|Y (0|1) = 1/2,  PX|Y (1|1) = 1/2.
(c) We note that the random variable Y can take three values: 0, 1 and 2. Thus, the random variable Z = E[X|Y ] can take three values, as it is a function of Y . Specifically,

Z = E[X|Y = 0] if Y = 0; E[X|Y = 1] if Y = 1; E[X|Y = 2] if Y = 2.

Now, using the previous part,

E[X|Y = 0] = 3/7,  E[X|Y = 1] = 1/2,  E[X|Y = 2] = 2/3,

and since P (Y = 0) = 7/24, P (Y = 1) = 1/3, and P (Y = 2) = 3/8, we conclude that

Z = 3/7 with probability 7/24; 1/2 with probability 1/3; 2/3 with probability 3/8.

So we can write

PZ (z) = 7/24 if z = 3/7; 1/3 if z = 1/2; 3/8 if z = 2/3; 0 otherwise.

(d) Now that we have found the PMF of Z, we can find its mean and variance. Specifically,

E[Z] = (3/7) · 7/24 + (1/2) · 1/3 + (2/3) · 3/8 = 13/24.

We also note that E[X] = 13/24. Thus, here we have

E[X] = E[Z] = E[E[X|Y ]].

(e) To find Var(Z), we write

Var(Z) = E[Z^2] − (E[Z])^2 = E[Z^2] − (13/24)^2,

where

E[Z^2] = (3/7)^2 · 7/24 + (1/2)^2 · 1/3 + (2/3)^2 · 3/8 = 17/56.

Thus,

Var(Z) = 17/56 − (13/24)^2 = 41/4032.
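The random variable Z = E[X|Y ] and its moments can be computed mechanically from the table (a sketch; since X is 0/1, E[X|Y = y] is just P (X = 1|Y = y)):

```python
from fractions import Fraction as F

# Z = E[X|Y] for the joint PMF of Table 4: compute E[X|Y=y] for each y,
# weight by P(Y=y), and verify E[Z] = E[X] (law of iterated expectations).
pmf = {(0, 0): F(1, 6), (0, 1): F(1, 6), (0, 2): F(1, 8),
       (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 4)}

p_y = {y: pmf[(0, y)] + pmf[(1, y)] for y in (0, 1, 2)}
cond_mean = {y: pmf[(1, y)] / p_y[y] for y in (0, 1, 2)}  # E[X|Y=y], X is 0/1

e_z = sum(cond_mean[y] * p_y[y] for y in (0, 1, 2))
e_x = sum(pmf[(1, y)] for y in (0, 1, 2))
var_z = sum(cond_mean[y] ** 2 * p_y[y] for y in (0, 1, 2)) - e_z ** 2
print(e_z, e_x, var_z)
```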

13. Let X, Y , and Z = E[X|Y ] be as in problem 12. Define the random variable V as
V = Var(X|Y ).

(a) Find the PMF of V .


(b) Find EV .
(c) Check that Var(X) = EV + Var(Z).

Solution:

(a) To find the PMF of V , we note that V is a function of Y . Specifically,

V = Var(X|Y = 0) if Y = 0; Var(X|Y = 1) if Y = 1; Var(X|Y = 2) if Y = 2.

Now, since X|Y = 0 ∼ Bernoulli(3/7), X|Y = 1 ∼ Bernoulli(1/2), and X|Y = 2 ∼ Bernoulli(2/3), we have

Var(X|Y = 0) = (3/7)(4/7) = 12/49,
Var(X|Y = 1) = (1/2)(1/2) = 1/4,
Var(X|Y = 2) = (2/3)(1/3) = 2/9.

Thus,

V = 12/49 with probability 7/24; 1/4 with probability 1/3; 2/9 with probability 3/8.

So we can write

PV (v) = 7/24 if v = 12/49; 1/3 if v = 1/4; 3/8 if v = 2/9; 0 otherwise.

(b) To find E[V ], we write

E[V ] = (12/49) · 7/24 + (1/4) · 1/3 + (2/9) · 3/8 = 1/14 + 1/12 + 1/12 = 5/21.

(c) To check that Var(X) = E[V ] + Var(Z), note that X ∼ Bernoulli(13/24), so

Var(X) = (13/24)(11/24) = 143/576.

On the other hand,

E[V ] + Var(Z) = 5/21 + 41/4032 = 960/4032 + 41/4032 = 1001/4032 = 143/576,

as expected (law of total variance).
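The identity Var(X) = E[V ] + Var(Z) can be verified end-to-end with exact arithmetic (a sketch, reusing the Table 4 joint PMF):

```python
from fractions import Fraction as F

# Law of total variance for Table 4: Var(X) = E[Var(X|Y)] + Var(E[X|Y]).
pmf = {(0, 0): F(1, 6), (0, 1): F(1, 6), (0, 2): F(1, 8),
       (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 4)}
p_y = {y: pmf[(0, y)] + pmf[(1, y)] for y in (0, 1, 2)}
m = {y: pmf[(1, y)] / p_y[y] for y in (0, 1, 2)}   # E[X|Y=y]
v = {y: m[y] * (1 - m[y]) for y in (0, 1, 2)}      # Var(X|Y=y), Bernoulli

e_v = sum(v[y] * p_y[y] for y in (0, 1, 2))        # E[Var(X|Y)]
e_z = sum(m[y] * p_y[y] for y in (0, 1, 2))
var_z = sum(m[y] ** 2 * p_y[y] for y in (0, 1, 2)) - e_z ** 2
p_x1 = sum(pmf[(1, y)] for y in (0, 1, 2))
var_x = p_x1 * (1 - p_x1)                          # X is Bernoulli(13/24)
print(e_v, var_z, var_x)
```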
14. Let N be the number of phone calls made by the customers of a phone company in
a given hour. Suppose that N ∼ Poisson(β), where β > 0 is known. Let Xi be the length of the i'th phone call, for i = 1, 2, ..., N . We assume the Xi 's are independent of each other and also independent of N . We further assume that they all have the same distribution,

Xi ∼ Exponential(λ),

where λ > 0 is known. Let Y be the sum of the lengths of the phone calls, i.e.,

Y = Σ_{i=1}^N Xi .

Find EY and Var(Y ).

Solution: To find E[Y ], we cannot directly use linearity of expectation because N is random. But, conditioned on N = n, we can use linearity and find E[Y |N = n]; so, we use the law of iterated expectations:

E[Y ] = E[E[Y |N ]]                     (law of iterated expectations)
      = E[ E[ Σ_{i=1}^N Xi | N ] ]
      = E[ Σ_{i=1}^N E[Xi |N ] ]        (linearity of expectation)
      = E[ Σ_{i=1}^N E[Xi ] ]           (Xi 's and N are independent)
      = E[N E[X]]                       (since the E[Xi ] are all equal to E[X])
      = E[X]E[N ]                       (since E[X] is not random)
      = (1/λ) · β = β/λ.

To find Var(Y ), we use the law of total variance:

Var(Y ) = E(Var(Y |N)) + Var(E[Y |N])


= E(Var(Y |N)) + Var(NEX) (as above)
= E(Var(Y |N)) + (EX)2 Var(N).

To find E(Var(Y |N )), note that, given N = n, Y is a sum of n independent random variables. As we discussed before, for independent random variables the variance of the sum is equal to the sum of the variances. We can write

Var(Y |N ) = Σ_{i=1}^N Var(Xi |N )
           = Σ_{i=1}^N Var(Xi )    (since Xi 's are independent of N )
           = N Var(X).

Thus, we have

E(Var(Y |N)) = ENVar(X).

We obtain

Var(Y ) = E[N ]Var(X) + (E[X])^2 Var(N )
        = β(1/λ)^2 + (1/λ)^2 β
        = 2β/λ^2.
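The two formulas E[Y ] = β/λ and Var(Y ) = 2β/λ^2 can be sanity-checked by simulating the compound sum (a Monte Carlo sketch with β = 10, λ = 1; the Poisson sampler counts rate-1 exponential arrivals before time β):

```python
import random

# Monte Carlo sketch of Y = X_1 + ... + X_N with N ~ Poisson(beta) and
# X_i ~ Exponential(lam); the derivation above gives E[Y] = beta/lam
# and Var(Y) = 2*beta/lam^2, i.e. 10 and 20 for these parameters.
random.seed(0)
beta, lam, trials = 10.0, 1.0, 200_000

def poisson(mu):
    # Count arrivals of a rate-1 Poisson process before time mu.
    k, s = 0, random.expovariate(1.0)
    while s < mu:
        k += 1
        s += random.expovariate(1.0)
    return k

ys = [sum(random.expovariate(lam) for _ in range(poisson(beta)))
      for _ in range(trials)]
mean = sum(ys) / trials
var = sum((y - mean) ** 2 for y in ys) / (trials - 1)
print(mean, var)
```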

15. Let X and Y be two jointly continuous random variables with joint PDF

fXY (x, y) = (1/2)e^(−x) + c y/(1 + x)^2 for 0 ≤ x, 0 ≤ y ≤ 1; 0 otherwise.

(a) Find the constant c.
(b) Find P (0 ≤ X ≤ 1, 0 ≤ Y ≤ 1/2).
(c) Find P (0 ≤ X ≤ 1).

Solution:

(a) We must have ∫∫ fXY (x, y) dx dy = 1 over the region RXY = [0, ∞) × [0, 1]:

∫_{y=0}^1 ∫_{x=0}^∞ [ (1/2)e^(−x) + c y/(1 + x)^2 ] dx dy
= ∫_0^1 [ −(1/2)e^(−x) − c y/(1 + x) ]_{x=0}^∞ dy
= ∫_0^1 (1/2 + c y) dy
= [ (1/2)y + (1/2)c y^2 ]_0^1
= 1/2 + (1/2)c.

Thus, c = 1.

(b)

P (0 ≤ X ≤ 1, 0 ≤ Y ≤ 1/2)
= ∫_{y=0}^{1/2} ∫_{x=0}^1 [ (1/2)e^(−x) + y/(1 + x)^2 ] dx dy
= ∫_0^{1/2} [ (1/2)(1 − e^(−1)) + y/2 ] dy
= 5/16 − 1/(4e).

(c)

P (0 ≤ X ≤ 1) = ∫_{y=0}^1 ∫_{x=0}^1 [ (1/2)e^(−x) + y/(1 + x)^2 ] dx dy
              = 3/4 − 1/(2e).
16. Let X and Y be two jointly continuous random variables with joint PDF

fXY (x, y) = e^(−xy) for 1 ≤ x ≤ e, y > 0; 0 otherwise.

(a) Find the marginal PDFs, fX (x) and fY (y).



(b) Write an integral to compute P (0 ≤ Y ≤ 1, 1 ≤ X ≤ e).

Solution:
(a) For 1 ≤ x ≤ e:

fX (x) = ∫_0^∞ e^(−xy) dy = [ −(1/x)e^(−xy) ]_{y=0}^∞ = 1/x.

So

fX (x) = 1/x for 1 ≤ x ≤ e; 0 otherwise.

For y > 0:

fY (y) = ∫_1^e e^(−xy) dx = (1/y)(e^(−y) − e^(−ey)).

Thus,

fY (y) = (1/y)(e^(−y) − e^(−ey)) for y > 0; 0 otherwise.

(b)

P (0 ≤ Y ≤ 1, 1 ≤ X ≤ e) = ∫_{x=1}^e ∫_{y=0}^1 e^(−xy) dy dx
                          = ∫_1^e (1 − e^(−x))/x dx.
17. Let X and Y be two jointly continuous random variables with joint PDF

fXY (x, y) = (1/4)x^2 + (1/6)y for −1 ≤ x ≤ 1, 0 ≤ y ≤ 2; 0 otherwise.

(a) Find the marginal PDFs, fX (x) and fY (y).
(b) Find P (X > 0, Y < 1).
(c) Find P (X > 0 or Y < 1).
(d) Find P (X > 0|Y < 1).
(e) Find P (X + Y > 0).

Solution:

(a) For −1 ≤ x ≤ 1:

fX (x) = ∫_0^2 [ (1/4)x^2 + (1/6)y ] dy = (1/2)x^2 + 1/3.

Thus,

fX (x) = (1/2)x^2 + 1/3 for −1 ≤ x ≤ 1; 0 otherwise.

For 0 ≤ y ≤ 2:

fY (y) = ∫_{−1}^1 [ (1/4)x^2 + (1/6)y ] dx = 1/6 + (1/3)y.

Thus,

fY (y) = 1/6 + (1/3)y for 0 ≤ y ≤ 2; 0 otherwise.

(b)

P (X > 0, Y < 1) = ∫_{x=0}^1 ∫_{y=0}^1 [ (1/4)x^2 + (1/6)y ] dy dx = 1/6.

(c)

P (X > 0 or Y < 1) = 1 − P (X ≤ 0 and Y ≥ 1)
                   = 1 − ∫_{x=−1}^0 ∫_{y=1}^2 [ (1/4)x^2 + (1/6)y ] dy dx
                   = 1 − 1/3 = 2/3.

(d)

P (X > 0|Y < 1) = P (X > 0 and Y < 1) / P (Y < 1),
P (Y < 1) = ∫_0^1 [ 1/6 + (1/3)y ] dy = 1/3.

Therefore, P (X > 0|Y < 1) = (1/6)/(1/3) = 1/2.

(e) Let B be the region where x + y ≤ 0 (the triangle −1 ≤ x ≤ 0, 0 ≤ y ≤ −x). Then

P (X + Y > 0) = 1 − P (B)
              = 1 − ∫_{x=−1}^0 ∫_{y=0}^{−x} [ (1/4)x^2 + (1/6)y ] dy dx
              = 1 − 13/144 = 131/144.
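The last probability involves an integration over a triangular region, which is the easiest place to slip; a crude midpoint-rule check over the rectangle (a numerical sketch) confirms it:

```python
# Midpoint-rule check of P(X + Y > 0) = 131/144 for the density
# f(x, y) = x^2/4 + y/6 on [-1, 1] x [0, 2].
n = 400
dx, dy = 2.0 / n, 2.0 / n
total = 0.0
for i in range(n):
    x = -1.0 + (i + 0.5) * dx
    for j in range(n):
        y = (j + 0.5) * dy
        if x + y > 0:
            total += (x * x / 4 + y / 6) * dx * dy

print(total)  # close to 131/144
```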
18. Let X and Y be two jointly continuous random variables with joint CDF

FXY (x, y) = 1 − e^(−x) − e^(−2y) + e^(−(x+2y)) for x, y > 0; 0 otherwise.

(a) Find the joint PDF, fXY (x, y).


(b) Find P (X < 2Y ).
(c) Are X and Y independent?

Solution: Note that we can write FXY (x, y) as

FXY (x, y) = (1 − e^(−x))u(x) · (1 − e^(−2y))u(y)
           = (a function of x) · (a function of y)
           = FX (x) · FY (y),

i.e., X and Y are independent.

(a)

FX (x) = (1 − e^(−x))u(x),

thus X ∼ Exponential(1) and fX (x) = e^(−x) u(x). Similarly, fY (y) = 2e^(−2y) u(y), which results in:

fXY (x, y) = fX (x)fY (y) = 2e^(−(x+2y)) u(x)u(y).

(b)

P (X < 2Y ) = ∫_{y=0}^∞ ∫_{x=0}^{2y} 2e^(−(x+2y)) dx dy = 1/2.

(c) Yes, as we saw above.

19. Let X ∼ N(0, 1).

(a) Find the conditional PDF and CDF of X given X > 0.


(b) Find E[X|X > 0].
(c) Find Var(X|X > 0).

Solution:
X ∼ N (0, 1). Thus,

fX (x) = (1/√(2π)) e^(−x^2/2).

(a) By symmetry, we have P (X > 0) = 1/2. Let A be the event that X > 0:

fX|A (x) = fX (x)/P (A) for x > 0; 0 otherwise.

Thus,

fX|A (x) = √(2/π) e^(−x^2/2) for x > 0; 0 otherwise.

For the CDF, for x > 0:

FX|A (x) = (FX (x) − FX (0)) / P (A) = 2(FX (x) − 1/2) = 2FX (x) − 1.

Note FX (x) = Φ(x). Therefore, FX|A (x) = 2Φ(x) − 1 for x > 0 (and 0 for x ≤ 0).
(b)

E[X|A] = ∫_{−∞}^∞ x fX|A (x) dx = ∫_0^∞ √(2/π) x e^(−x^2/2) dx.

With the substitution u = e^(−x^2/2), du = −x e^(−x^2/2) dx:

E[X|A] = √(2/π) ∫_0^1 du = √(2/π).

(c)

E[X^2 |A] = ∫_0^∞ x^2 fX|A (x) dx
          = 2 ∫_0^∞ x^2 fX (x) dx
          = ∫_{−∞}^∞ x^2 fX (x) dx   (since x^2 fX (x) is an even function)
          = E[X^2] = 1.

Thus

Var(X|A) = E[X^2 |A] − (E[X|A])^2 = 1 − 2/π.
20. Let X and Y be two jointly continuous random variables with joint PDF

fXY (x, y) = x^2 + (1/3)y for −1 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.

For 0 ≤ y ≤ 1, find

(a) the conditional PDF of X given Y = y;
(b) P (X > 0|Y = y). Does this value depend on y?
(c) Are X and Y independent?

Solution:

(a) Let us first find fY (y):

fY (y) = ∫_{−1}^{+1} (x^2 + (1/3)y) dx = [ (1/3)x^3 + (1/3)yx ]_{−1}^{+1}
       = (2/3)y + 2/3  for 0 ≤ y ≤ 1.

Thus, for 0 ≤ y ≤ 1, we obtain:

fX|Y (x|y) = fXY (x, y)/fY (y) = (x^2 + (1/3)y) / ((2/3)y + 2/3) = (3x^2 + y)/(2y + 2) for −1 ≤ x ≤ 1,

and fX|Y (x|y) = 0 otherwise.

(b)

P (X > 0|Y = y) = ∫_0^1 fX|Y (x|y) dx = (1/(2y + 2)) ∫_0^1 (3x^2 + y) dx
                = (1/(2y + 2)) [ x^3 + yx ]_0^1 = (y + 1)/(2(y + 1)) = 1/2.

Thus it does not depend on y.

(c) X and Y are not independent, since fX|Y (x|y) depends on y.

21. Let X and Y be two jointly continuous random variables with joint PDF

fXY (x, y) = (1/2)x^2 + (2/3)y for −1 ≤ x ≤ 1, 0 ≤ y ≤ 1; 0 otherwise.

Find E[Y |X = 0] and Var(Y |X = 0).

Solution:
Let us first find fY |X (y|x). To do so, we need fX (x):

fX (x) = ∫_0^1 ( (1/2)x^2 + (2/3)y ) dy = [ (1/2)x^2 y + (1/3)y^2 ]_0^1
       = (1/2)x^2 + 1/3  for −1 ≤ x ≤ +1.

Thus:

fY |X (y|x) = fXY (x, y)/fX (x) = ( (1/2)x^2 + (2/3)y ) / ( (1/2)x^2 + 1/3 ).

Therefore:

fY |X (y|0) = ( (2/3)y ) / (1/3) = 2y  for 0 ≤ y ≤ 1.

Thus:

E[Y |X = 0] = ∫_0^1 y fY |X (y|0) dy = ∫_0^1 2y^2 dy = 2/3.

E[Y^2 |X = 0] = ∫_0^1 y^2 fY |X (y|0) dy = ∫_0^1 2y^3 dy = 1/2.

Therefore:

Var(Y |X = 0) = 1/2 − (2/3)^2 = 1/2 − 4/9 = 1/18.
22. Consider the set

E = {(x, y) : |x| + |y| ≤ 1},

i.e., the square with vertices (1, 0), (0, 1), (−1, 0), (0, −1). Suppose that we choose a point (X, Y ) uniformly at random in E. That is, the joint PDF of X and Y is given by

fXY (x, y) = c for (x, y) ∈ E; 0 otherwise.

(a) Find the constant c.
(b) Find the marginal PDFs fX (x) and fY (y).
(c) Find the conditional PDF of X given Y = y, where −1 ≤ y ≤ 1.
(d) Are X and Y independent?

Solution:
(a)
We have:

1 = ∫∫_E c dx dy = c (area of E) = c · √2 · √2 = 2c,

so c = 1/2.

(b)
For 0 ≤ x ≤ 1 (the boundaries are the lines x + y = 1 and x − y = 1):

fX (x) = ∫_{x−1}^{1−x} (1/2) dy = 1 − x.

For −1 ≤ x ≤ 0 (the boundaries are the lines −x + y = 1 and −x − y = 1):

fX (x) = ∫_{−x−1}^{1+x} (1/2) dy = 1 + x.

So

fX (x) = 1 − |x| for −1 ≤ x ≤ 1; 0 otherwise.

Similarly, we find:

fY (y) = 1 − |y| for −1 ≤ y ≤ 1; 0 otherwise.

(c)

fX|Y (x|y) = fXY (x, y)/fY (y) = (1/2)/(1 − |y|) = 1/(2(1 − |y|)) for |x| ≤ 1 − |y|,

and 0 otherwise. So, we conclude that given Y = y, X is uniformly distributed on [−1 + |y|, 1 − |y|], i.e.:

X|Y = y ∼ Uniform(−1 + |y|, 1 − |y|).

(d)
No, because fXY (x, y) ≠ fX (x) · fY (y).

23. Let X and Y be two independent Uniform(0, 2) random variables. Find P (XY < 1).

Solution:
We have:

P (XY < 1) = ∫_0^2 P (XY < 1|Y = y)fY (y) dy      (law of total probability)
           = (1/2) ∫_0^2 P (X < 1/y) dy            (since Y ∼ Uniform(0, 2) and X, Y are independent).

Since X ∼ Uniform(0, 2), we have P (X < 1/y) = 1 for y ≤ 1/2 and P (X < 1/y) = 1/(2y) for y > 1/2. Thus:

P (XY < 1) = (1/2) [ ∫_0^{1/2} 1 dy + ∫_{1/2}^2 1/(2y) dy ]
           = (1/2) [ 1/2 + (1/2) ln 4 ]
           = 1/4 + (1/2) ln 2 ≈ 0.597.
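This value is easy to sanity-check by simulation; a Monte Carlo sketch converges to 1/4 + (ln 2)/2 ≈ 0.597:

```python
import math
import random

# Monte Carlo estimate of P(XY < 1) for independent X, Y ~ Uniform(0, 2).
random.seed(1)
trials = 1_000_000
hits = sum(random.uniform(0, 2) * random.uniform(0, 2) < 1
           for _ in range(trials))
estimate = hits / trials
exact = 0.25 + math.log(2) / 2

print(estimate, exact)
```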

24. Suppose X ∼ Exponential(1) and, given X = x, Y is a uniform random variable on [0, x], i.e.,

Y |X = x ∼ Uniform(0, x),

or equivalently

Y |X ∼ Uniform(0, X).

(a) Find E[Y ].
(b) Find Var(Y ).

Solution:
Recall that if Y ∼ Uniform(a, b), then E[Y ] = (a + b)/2 and Var(Y ) = (b − a)^2/12.

(a)
Using the law of total expectation:

E[Y ] = ∫_0^∞ E[Y |X = x]fX (x) dx
      = ∫_0^∞ (x/2) e^(−x) dx         (since Y |X = x ∼ Uniform(0, x))
      = (1/2) ∫_0^∞ x e^(−x) dx = (1/2) · 1 = 1/2.

(b)

E[Y^2] = ∫_0^∞ E[Y^2 |X = x] e^(−x) dx   (law of total expectation),

where, since Y |X = x ∼ Uniform(0, x),

E[Y^2 |X = x] = Var(Y |X = x) + (E[Y |X = x])^2 = x^2/12 + x^2/4 = x^2/3.

So

E[Y^2] = (1/3) ∫_0^∞ x^2 e^(−x) dx = (1/3) E[W^2] = (1/3)[Var(W ) + (E[W ])^2] = (1/3)(1 + 1) = 2/3,

where W ∼ Exponential(1). Therefore:

Var(Y ) = E[Y^2] − (E[Y ])^2 = 2/3 − 1/4 = 5/12.

25. Let X and Y be two independent Uniform(0, 1) random variables. Find

(a) E[XY ].
(b) E[e^(X+Y )].
(c) E[X^2 + Y^2 + XY ].
(d) E[Y e^(XY )].

Solution:
(a)

E[XY ] = E[X] · E[Y ]   (since X and Y are independent)
       = (1/2)(1/2) = 1/4.

(b)

E[e^(X+Y )] = E[e^X · e^Y ] = E[e^X ]E[e^Y ],
E[e^X ] = E[e^Y ] = ∫_0^1 e^x dx = e − 1.

Therefore:

E[e^(X+Y )] = (e − 1)^2.

(c)

E[X^2 + Y^2 + XY ] = E[X^2] + E[Y^2] + E[XY ]   (linearity of expectation)
                   = 2E[X^2] + E[X]E[Y ],
E[X^2] = ∫_0^1 x^2 dx = 1/3.

Therefore:

E[X^2 + Y^2 + XY ] = 2/3 + 1/4 = 11/12.

(d)

E[Y e^(XY )] = ∫_0^1 ∫_0^1 y e^(xy) dx dy   (LOTUS)
             = ∫_0^1 [ e^(xy) ]_{x=0}^1 dy = ∫_0^1 (e^y − 1) dy = e − 2.

26. Let X and Y be two independent Uniform(0, 1) random variables, and Z = X/Y . Find the CDF and PDF of Z.

Solution:
First note that since RX = RY = [0, 1], we conclude RZ = [0, ∞). We first find the CDF of Z:

FZ (z) = P (Z ≤ z) = P (X/Y ≤ z)
       = P (X ≤ zY )                                (since Y ≥ 0)
       = ∫_0^1 P (X ≤ zY |Y = y)fY (y) dy           (law of total probability)
       = ∫_0^1 P (X ≤ zy) dy                        (since X and Y are independent).

Note:

P (X ≤ zy) = 1 if y > 1/z; zy if y ≤ 1/z.

Consider two cases:

(a) If 0 ≤ z ≤ 1, then P (X ≤ zy) = zy for all 0 ≤ y ≤ 1. Thus:

FZ (z) = ∫_0^1 zy dy = [ (1/2)zy^2 ]_0^1 = z/2.

(b) If z > 1, then

FZ (z) = ∫_0^{1/z} zy dy + ∫_{1/z}^1 1 dy
       = [ (1/2)zy^2 ]_0^{1/z} + [ y ]_{1/z}^1
       = 1/(2z) + 1 − 1/z = 1 − 1/(2z).

So:

FZ (z) = z/2 for 0 ≤ z ≤ 1; 1 − 1/(2z) for z ≥ 1; 0 for z < 0.

Note that FZ (z) is a continuous function.

fZ (z) = d/dz FZ (z) = 1/2 for 0 ≤ z ≤ 1; 1/(2z^2) for z ≥ 1; 0 otherwise.

27. Let X and Y be two independent N (0, 1) random variables and U = X + Y .

(a) Find fU |X (u|x).
(b) Find fU (u).
(c) Find fX|U (x|u).
(d) Find E[X|U = u] and Var(X|U = u).

Solution:
(a) U = X + Y , thus given X = x, U = x + Y . Since X and Y are independent, we conclude U |X = x ∼ N (x, 1), therefore:

fU |X (u|x) = (1/√(2π)) e^(−(u−x)^2/2).

(b) U = X + Y with X, Y ∼ N (0, 1) independent, therefore U ∼ N (0, 2) (µU = 0, σU^2 = 2):

fU (u) = (1/√(4π)) e^(−u^2/4).

(c)

fX|U (x|u) = fXU (x, u)/fU (u) = fU |X (u|x)fX (x)/fU (u)
           = [ (1/√(2π)) e^(−(u−x)^2/2) · (1/√(2π)) e^(−x^2/2) ] / [ (1/√(4π)) e^(−u^2/4) ]
           = (1/√π) e^(−(x − u/2)^2).

Therefore, X|U = u ∼ N (u/2, 1/2).

(d) By part (c),

E[X|U = u] = u/2,
Var(X|U = u) = 1/2.
28. Let X and Y be two independent standard normal random variables. Consider the
point (X, Y ) in the x − y plane. Let (R, Θ) be the corresponding polar coordinates
as shown in Figure 1. The inverse transformation is given by

X = R cos Θ
Y = R sin Θ
where, R ≥ 0 and −π < Θ ≤ π. Find the joint PDF of R and Θ. Show that R and
Θ are independent.

(Figure 1: Polar coordinates — the point (X, Y ) at radius R and angle Θ, with X = R cos Θ and Y = R sin Θ.)

Solution: Here (X, Y ) are jointly continuous and are related to (R, Θ) by a one-to-one relationship. We use the method of transformations. The function h(r, θ) is given by

x = h1 (r, θ) = r cos θ,
y = h2 (r, θ) = r sin θ.

Thus, we have

fRΘ (r, θ) = fXY (h1 (r, θ), h2 (r, θ))|J| = fXY (r cos θ, r sin θ)|J|,

where

J = det [ ∂h1/∂r  ∂h1/∂θ ; ∂h2/∂r  ∂h2/∂θ ]
  = det [ cos θ  −r sin θ ; sin θ  r cos θ ]
  = r cos^2 θ + r sin^2 θ = r.

We conclude that

fRΘ (r, θ) = (r/(2π)) e^(−r^2/2) for r ∈ [0, ∞), θ ∈ (−π, π]; 0 otherwise.

Note that from the above we can write

fRΘ (r, θ) = fR (r)fΘ (θ),

where

fR (r) = r e^(−r^2/2) for r ∈ [0, ∞); 0 otherwise,
fΘ (θ) = 1/(2π) for θ ∈ (−π, π]; 0 otherwise.

Thus, we conclude that R and Θ are independent.


29. In Problem 28, suppose that X and Y are independent Uniform(0, 1) random variables. Find the joint PDF of R and Θ. Are R and Θ independent?

Solution: Note that here 0 ≤ X, Y ≤ 1, so we can write the range of (R, Θ) as follows:

0 ≤ r,  0 ≤ r cos θ ≤ 1,  0 ≤ r sin θ ≤ 1.

Therefore, the range of (R, Θ) is the set

RRΘ = { (r, θ) ∈ R^2 | 0 ≤ θ ≤ π/2, 0 ≤ r ≤ min(1/cos θ, 1/sin θ) }.

Since fXY (r cos θ, r sin θ) = 1 on this set:

fRΘ (r, θ) = fXY (r cos θ, r sin θ)|J| = r for (r, θ) ∈ RRΘ; 0 otherwise.

Note that the conditional range of R given Θ = θ depends on the value of θ, so we cannot write

fRΘ (r, θ) = fR (r)fΘ (θ).

Thus, we conclude that R and Θ are not independent.

30. Consider two random variables X and Y with joint PMF given in Table 5.

Table 5: Joint PMF of X and Y in Problem 30

        Y = 0   Y = 1   Y = 2
X = 0    1/6     1/4     1/8
X = 1    1/8     1/6     1/6

Find Cov(X, Y ) and ρ(X, Y ).

Solution:
First, we find the marginal PMFs of X and Y :

RX = {0, 1}:    PX (0) = 1/6 + 1/4 + 1/8 = 13/24,  PX (1) = 1/8 + 1/6 + 1/6 = 11/24.
RY = {0, 1, 2}: PY (0) = 1/6 + 1/8 = 7/24,  PY (1) = 1/4 + 1/6 = 5/12,  PY (2) = 1/8 + 1/6 = 7/24.

E[X] = 0 · 13/24 + 1 · 11/24 = 11/24.
E[Y ] = 0 · 7/24 + 1 · 5/12 + 2 · 7/24 = 1.
E[XY ] = Σ i j PXY (i, j) = 1 · 1 · 1/6 + 1 · 2 · 1/6 = 1/6 + 1/3 = 1/2.

Therefore:

Cov(X, Y ) = E[XY ] − E[X] · E[Y ] = 1/2 − (11/24) · 1 = 1/24.

Var(X) = E[X^2] − (E[X])^2, with E[X^2] = 11/24, so

Var(X) = 11/24 − (11/24)^2 = (11/24)(13/24)  →  σX = √(11 × 13)/24 ≈ 0.498.

E[Y^2] = 0 · 7/24 + 1 · 5/12 + 4 · 7/24 = 19/12, so

Var(Y ) = 19/12 − 1 = 7/12  →  σY = √(7/12) ≈ 0.764.

ρ(X, Y ) = Cov(X, Y )/(σX σY ) = (1/24) / ( √(11 × 13)/24 · √(7/12) ) ≈ 0.11.
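All of these moments follow mechanically from the joint PMF, so they are easy to verify (a sketch; exact arithmetic for the covariance, floating point only for ρ):

```python
from fractions import Fraction as F
import math

# Cov(X, Y) and rho(X, Y) computed directly from the joint PMF of Table 5.
pmf = {(0, 0): F(1, 6), (0, 1): F(1, 4), (0, 2): F(1, 8),
       (1, 0): F(1, 8), (1, 1): F(1, 6), (1, 2): F(1, 6)}

ex = sum(x * p for (x, y), p in pmf.items())
ey = sum(y * p for (x, y), p in pmf.items())
exy = sum(x * y * p for (x, y), p in pmf.items())
cov = exy - ex * ey

var_x = sum(x * x * p for (x, y), p in pmf.items()) - ex ** 2
var_y = sum(y * y * p for (x, y), p in pmf.items()) - ey ** 2
rho = cov / math.sqrt(var_x * var_y)
print(cov, rho)
```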

31. Let X and Y be two independent N (0, 1) random variables and

Z = 11 − X + X^2 Y
W = 3 − Y.

Find Cov(Z, W ).

Solution:

Cov(Z, W ) = Cov(11 − X + X^2 Y, 3 − Y )
           = Cov(−X + X^2 Y, −Y ) = Cov(X, Y ) − Cov(X^2 Y, Y )
           = −Cov(X^2 Y, Y )                   (since X and Y are independent, Cov(X, Y ) = 0)
           = −E[X^2 Y · Y ] + E[X^2 Y ] · E[Y ]
           = −E[X^2 Y^2]                       (since E[Y ] = 0)
           = −E[X^2] · E[Y^2] = −1.

32. Let X and Y be two random variables. Suppose that σX^2 = 4 and σY^2 = 9. If we know that the two random variables Z = 2X − Y and W = X + Y are independent, find Cov(X, Y ) and ρ(X, Y ).

Solution:
Z and W are independent, thus Cov(Z, W ) = 0. Therefore:

0 = Cov(Z, W ) = Cov(2X − Y, X + Y )
= 2 · Var(X) + 2 · Cov(X, Y ) − Cov(Y, X) − Var(Y )
= 2 × 4 + Cov(X, Y ) − 9

Therefore:

Cov(X, Y ) = 1
ρ(X, Y ) = Cov(X, Y )/(σX σY ) = 1/(2 × 3) = 1/6.

33. Let X ∼ Uniform(1, 3) and Y |X ∼ Exponential(X). Find Cov(X, Y ).

Solution:
X ∼ Uniform(1, 3), therefore:

E[X] = 2,  Var(X) = (3 − 1)^2/12 = 1/3.

Y |X ∼ Exponential(X), therefore:

E[XY ] = E[E[XY |X]]                      (law of iterated expectations)
       = E[X E[Y |X]] = E[X · (1/X)]      (since Y |X ∼ Exponential(X), E[Y |X] = 1/X)
       = E[1] = 1.

Similarly:

E[Y ] = E[E[Y |X]] = E[1/X] = ∫_1^3 (1/2)(1/x) dx = (1/2) ln 3.

Cov(X, Y ) = E[XY ] − E[X] · E[Y ] = 1 − 2 · (1/2) ln 3 = 1 − ln 3.

34. Let X and Y be two independent N (0, 1) random variables and

Z =7+X +Y
W = 1 + Y.

Find ρ(Z, W ).

Solution:

Cov(Z, W ) = Cov(7 + X + Y, 1 + Y )
           = Cov(X + Y, Y )
           = Cov(X, Y ) + Var(Y )     (since X and Y are independent, Cov(X, Y ) = 0)
           = Var(Y ) = 1.

Var(Z) = Var(X + Y ) = Var(X) + Var(Y ) = 2   (since X and Y are independent),
Var(W ) = Var(Y ) = 1.

Therefore:

ρ(Z, W ) = Cov(Z, W )/(σZ σW ) = 1/√2.

35. Let X and Y be jointly normal random variables with parameters µX = −1, σX^2 = 4, µY = 1, σY^2 = 1, and ρ = −1/2.

(a) Find P (X + 2Y ≤ 3).
(b) Find Cov(X − Y, X + 2Y ).

Solution: First note that Cov(X, Y ) = ρ σX σY = (−1/2) · 2 · 1 = −1.

(a) Let Z = X + 2Y . Then Z is normal with

E[Z] = E[X] + 2E[Y ] = (−1) + 2 · 1 = 1,
Var(Z) = Var(X) + 4Var(Y ) + 4Cov(X, Y ) = 4 + 4 − 4 = 4.

Therefore Z ∼ N (µZ = 1, σZ^2 = 4), and

P (Z ≤ 3) = Φ((3 − 1)/2) = Φ(1).

(b)

Cov(X − Y, X + 2Y ) = Var(X) + 2Cov(X, Y ) − Cov(Y, X) − 2Var(Y )
                    = Var(X) + Cov(X, Y ) − 2Var(Y )
                    = 4 + (−1) − 2 = 1.

36. Let X and Y be jointly normal random variables with parameters µX = 1, σX^2 = 4, µY = 1, σY^2 = 1, and ρ = 0.

(a) Find P (X + 2Y > 4).
(b) Find E[X^2 Y^2].

Solution: X ∼ N (1, 4); Y ∼ N (1, 1). Since ρ(X, Y ) = 0 and X, Y are jointly normal, X and Y are independent.

(a) Let W = X + 2Y . Then

W ∼ N (3, 4 + 4) = N (3, 8),
P (W > 4) = 1 − Φ((4 − 3)/√8) = 1 − Φ(1/√8).

(b)

E[X^2 Y^2] = E[X^2] · E[Y^2] = (4 + 1)(1 + 1) = 10   (since X and Y are independent).

37. Let X and Y be jointly normal random variables with parameters µX = 2, σX^2 = 4, µY = 1, σY^2 = 9, and ρ = −1/2.

(a) Find E[Y |X = 3].
(b) Find Var(Y |X = 2).
(c) Find P (X + 2Y ≤ 5|X + Y = 3).

Solution: µX = 2, µY = 1, σX = 2, σY = 3, ρ = −1/2.

(a)

E[Y |X = 3] = µY + ρ σY · (3 − µX )/σX
            = 1 + (−1/2)(3)(3 − 2)/2 = 1 − 3/4 = 1/4.

(b)

Var(Y |X = 2) = (1 − ρ^2)σY^2 = (1 − 1/4)(9) = 27/4.

(c) Given X + Y = 3, we have X + 2Y = 3 + Y , so

P (X + 2Y ≤ 5|X + Y = 3) = P (Y ≤ 2|X + Y = 3).

Note that Y and W = X + Y are jointly normal, with

Y ∼ N (1, 9),  E[W ] = 2 + 1 = 3,
Var(W ) = Var(X) + Var(Y ) + 2ρ σX σY = 4 + 9 + 2 · (−1/2) · 2 · 3 = 7,
Cov(W, Y ) = Cov(X + Y, Y ) = Cov(X, Y ) + Var(Y ) = (−1/2) · 2 · 3 + 9 = 6,
ρ(W, Y ) = 6/(√7 · 3) = 2/√7.

Thus:

E[Y |W = 3] = µY + ρ(W, Y ) · σY · (3 − µW )/σW = 1,
Var(Y |W = 3) = (1 − ρ(W, Y )^2) · Var(Y ) = (1 − 4/7) · 9 = 27/7.

Therefore, Y |W = 3 ∼ N (1, 27/7), and

P (Y ≤ 2|W = 3) = Φ( (2 − 1)/√(27/7) ) = Φ(√(7/27)).
