

ISyE 6644 — HW3 — Fall 2019


(covers Module 2.10-2.16)

1. (Lesson 2.10: Conditional Expectation.) BONUS: Suppose that

f (x, y) = 6x for 0 ≤ x ≤ y ≤ 1.

Hint (you may already have seen this someplace): The marginal p.d.f. of X turns
out to be
fX (x) = 6x(1 − x) for 0 ≤ x ≤ 1.

Find the conditional p.d.f. of Y given that X = x.


(a) f(y|x) = 1/(1 − x), 0 ≤ x ≤ y ≤ 1
(b) f(y|x) = 1/(1 − x), 0 ≤ x ≤ 1
(c) f(y|x) = 1/(1 − y), 0 ≤ y ≤ 1
(d) f(x|y) = 1/(1 − x), 0 ≤ x ≤ y ≤ 1

Solution:
f(y|x) = f(x, y)/fX(x) = 6x/[6x(1 − x)] = 1/(1 − x), 0 ≤ x ≤ y ≤ 1.

So the answer is (a). 
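
As an optional sanity check (not part of the original solution), here is a small SymPy sketch that recovers the marginal and the conditional p.d.f. directly from the joint:

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f_xy = 6*x                                   # joint p.d.f. on 0 <= x <= y <= 1
    f_x = sp.integrate(f_xy, (y, x, 1))          # marginal of X: 6x(1 - x)
    f_y_given_x = sp.simplify(f_xy / f_x)        # conditional p.d.f. of Y given X = x
    print(f_x, f_y_given_x)                      # 6x(1 - x) and 1/(1 - x), up to rearrangement
    print(sp.simplify(sp.integrate(f_y_given_x, (y, x, 1))))   # integrates to 1 over x <= y <= 1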

2. (Lesson 2.10: Conditional Expectation.) BONUS: Again suppose that

f (x, y) = 6x for 0 ≤ x ≤ y ≤ 1.

Hint (you may already have seen this someplace): The marginal p.d.f. of X turns
out to be
fX (x) = 6x(1 − x) for 0 ≤ x ≤ 1.

Find E[Y |X = x].

(a) E[Y|X = x] = 1/2, 0 ≤ x ≤ 1
(b) E[Y|X = x] = (1 + x)/2, 0 ≤ x ≤ 1
(c) E[Y|X = x] = (1 + y)/2, 0 ≤ y ≤ 1
(d) E[X|Y = y] = (1 + y)/2, 0 ≤ y ≤ 1

Solution: By the definition of conditional expectation, we have


E[Y|X = x] = ∫_{−∞}^{∞} y f(y|x) dy = ∫_x^1 y/(1 − x) dy = (1 + x)/2, 0 ≤ x ≤ 1.

So the answer is (b). 
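
Again, just as a check (not in the original solution), the same integral in SymPy:

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    cond_mean = sp.integrate(y / (1 - x), (y, x, 1))   # E[Y | X = x] = integral over [x, 1] of y f(y|x)
    print(sp.simplify(cond_mean))                      # x/2 + 1/2, i.e., (1 + x)/2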

3. (Lesson 2.10: Conditional Expectation.) BONUS: Yet again suppose that

f (x, y) = 6x for 0 ≤ x ≤ y ≤ 1.

Hint (you may already have seen this someplace): The marginal p.d.f. of X turns
out to be
fX (x) = 6x(1 − x) for 0 ≤ x ≤ 1.
Find E[E[Y|X]].

(a) 1/2
(b) 2/3
(c) 3/4
(d) 1

Solution: By the Law of the Unconscious Statistician and part (b),


E[E[Y|X]] = ∫_0^1 E[Y|X = x] fX(x) dx = ∫_0^1 [(1 + x)/2] · 6x(1 − x) dx = 3/4.

So the answer is (c). 

Let’s check this answer. First of all, the marginal p.d.f. of Y is


fY(y) = ∫_0^y f(x, y) dx = ∫_0^y 6x dx = 3y², for 0 ≤ y ≤ 1.

Then E[Y] = ∫_0^1 y fY(y) dy = ∫_0^1 3y³ dy = 3/4.

Finally, by double expectation, E[E[Y|X]] = E[Y] = 3/4, so the check works!
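
The same check can also be done in code; here is a SymPy sketch (not part of the original solution) that computes both sides of the double-expectation identity:

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    lhs = sp.integrate((1 + x)/2 * 6*x*(1 - x), (x, 0, 1))   # E[E[Y|X]]
    f_y = sp.integrate(6*x, (x, 0, y))                        # marginal of Y: 3y**2
    rhs = sp.integrate(y * f_y, (y, 0, 1))                    # E[Y]
    print(lhs, rhs)                                           # 3/4 and 3/4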

4. (Lesson 2.11: Covariance and Correlation.) Suppose that the correlation between
December snowfall and temperature in Siberacuse, NY is −0.5. Further suppose
that Var(S) = 100 in² and Var(T) = 25 (degrees F)². Find Cov(S, T) (in units of
degree inches, whatever those are).
(a) −25
(b) −5
(c) 5
(d) 25

Solution: The answer is (a) since


Cov(S, T) = Corr(S, T) √(Var(S) Var(T)) = −0.5(10)(5) = −25.
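
In code, the same one-liner (purely a sanity check, not part of the original solution):

    import math

    # Cov(S, T) = Corr(S, T) * sqrt(Var(S) * Var(T))
    print(-0.5 * math.sqrt(100 * 25))   # -25.0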

5. (Lesson 2.11: Covariance and Correlation.) If X and Y both have mean −7 and
variance 4, and Cov(X, Y ) = 1, find Var(3X − Y ).
(a) 34
(b) 36
(c) 40
(d) 41

Solution:
Var(3X − Y) = 3² Var(X) + (−1)² Var(Y) + 2(3)(−1) Cov(X, Y) = 36 + 4 − 6 = 34.
So the answer is (a). 
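
A quick check of the variance formula in Python (not part of the original solution):

    # Var(aX + bY) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y)
    a, b, var_x, var_y, cov_xy = 3, -1, 4, 4, 1
    print(a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy)   # 34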

6. (Lesson 2.12: Probability Distributions.) You may recall that the p.m.f. of the
Geometric(p) distribution is f(x) = (1 − p)^(x−1) p, x = 1, 2, . . . . If the number of
orders at a production center this month is a Geom(0.7) random variable, find the
probability that we’ll have at most 3 orders.

(a) 0.027
(b) 0.140
(c) 0.860
(d) 0.973

Solution: Denote X ∼ Geom(0.7). Then

P(X ≤ 3) = Σ_{x=1}^3 P(X = x) = Σ_{x=1}^3 (0.3)^(x−1) (0.7) = 0.973.

So the answer is (d). 
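
As a sanity check (not in the original solution), SciPy's geometric distribution uses the same "number of trials" convention as the p.m.f. above, with support starting at x = 1:

    from scipy.stats import geom

    print(geom.cdf(3, 0.7))                                  # 0.973
    print(sum(0.3**(x - 1) * 0.7 for x in range(1, 4)))      # same sum done by hand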

7. (Lesson 2.12: Probability Distributions.) Suppose the SAT math score of a Univer-
sity of Georgia student can be approximated by a normal distribution with mean
400 and variance 225. Find the probability that the UGA Einstein will score at
least a 415.

(a) 0.5
(b) 0.1587
(c) 0.975
(d) 0.8413

Solution: This answer is (b). To see why, let X denote his score and let Z denote
a standard normal random variable. Then
 
P(X ≥ 415) = P((X − 400)/√225 ≥ (415 − 400)/15) = P(Z ≥ 1) = 0.1587,

where you could’ve used the back of the book or the NORMSDIST function in
Excel to look up that last probability. (Actually, this is a famous one, so you may
have memorized it.) 
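
Instead of the table or Excel, the same lookup in Python (a sanity check only, not part of the original solution):

    from scipy.stats import norm

    # X ~ Normal with mean 400 and standard deviation sqrt(225) = 15
    print(norm.sf(415, loc=400, scale=15))   # sf = 1 - cdf = P(X >= 415), about 0.1587
    print(norm.sf(1))                        # equivalently, P(Z >= 1) for standard normal Z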

8. (Lesson 2.13: Limit Theorems.) What is the most-important theorem in the uni-
verse?

(a) Eastern Limit Theorem
(b) Central Limit Theorem
(c) Central Limit Serum
(d) Central Simit Theorem (simit is a tasty Turkish bagel)

Solution: (b). 

9. (Lesson 2.13: Limit Theorems.) If X1 , . . . , X400 are i.i.d. from some distribution
with mean 1 and variance 400, find the approximate probability that the sample
mean X̄ is between 0 and 2.
(a) 0.1587
(b) 0.3174
(c) 0.6826
(d) 0.8413

Solution: First of all, note that E[X̄] = E[Xi ] = 1 and Var(X̄) = Var(Xi )/n = 1.
Then by the CLT, we have X̄ ≈ Nor(1, 1). Thus,
P(0 ≤ X̄ ≤ 2) ≈ P(−1 ≤ Z ≤ 1) = 2Φ(1) − 1 = 2(0.8413) − 1 = 0.6826.
So the answer is (c). 
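
As a quick check (not in the original solution), the normal probability can be looked up in Python:

    from scipy.stats import norm

    # Xbar is approximately Normal(1, 1) by the CLT, so P(0 <= Xbar <= 2) = Phi(1) - Phi(-1)
    print(norm.cdf(1) - norm.cdf(-1))   # about 0.6827 (0.6826 with the rounded table value)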

10. (Lesson 2.14: Estimation.) Suppose we collect the following observations: 7, −2,
1, 6. What is the sample variance?
(a) 13

(b) 13
(c) 18
(d) 28

Solution: First of all, the sample mean is X̄ = (1/n) Σ_{i=1}^n Xi = 3. Then the sample
variance is

S² = [1/(n − 1)] Σ_{i=1}^n (Xi − X̄)² = 18,

so that the answer is (c).
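
The same computation in NumPy (just a check, not part of the original solution); note that ddof=1 gives the (n − 1)-denominator sample variance:

    import numpy as np

    data = np.array([7, -2, 1, 6])
    print(data.mean())          # sample mean = 3.0
    print(data.var(ddof=1))     # sample variance with n - 1 in the denominator = 18.0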

11. (Lesson 2.14: Estimation.) BONUS: Consider two estimators, T1 and T2 , for an
unknown parameter θ. Suppose that Bias(T1) = 0, Bias(T2) = θ, Var(T1) = 4θ²,
and Var(T2) = θ². Which estimator might you decide to use and why?
(a) T1 — it has lower expected value.
(b) T1 — it has lower MSE.
(c) T2 — it has lower variance.
(d) T2 — it has lower MSE.

Solution: Although low bias and variance are just great individually, we are
usually interested in the estimator with the lower MSE, which balances bias and
variance.

Of course, MSE = Bias² + Variance. Thus, MSE(T1) = 0 + 4θ² = 4θ² and MSE(T2) = θ² + θ² = 2θ².
So the answer is (d).

Note that (a) is incorrect because lower expected value doesn’t necessarily tell us
about bias. (b) is just plain wrong, wrong, wrong. (c) is close, but variance isn’t
as important as MSE. 
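
For completeness, the MSE bookkeeping in SymPy (a sketch, not part of the original solution):

    import sympy as sp

    theta = sp.symbols('theta', positive=True)
    mse_t1 = 0**2 + 4*theta**2        # Bias(T1)^2 + Var(T1)
    mse_t2 = theta**2 + theta**2      # Bias(T2)^2 + Var(T2)
    print(mse_t1, mse_t2)             # 4*theta**2 and 2*theta**2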

12. (Lesson 2.15: Maximum Likelihood Estimation.) BONUS: Suppose X1 , X2 , . . . , Xn


are i.i.d. Pois(λ). Find λ̂, the MLE of λ. (Don’t panic — it’s not that difficult.)
(a) X̄
(b) 1/X̄
(c) n/Σ_{i=1}^n Xi
(d) S²

Solution: The likelihood function is


L(λ) = Π_{i=1}^n f(xi) = Π_{i=1}^n e^(−λ) λ^(xi) / xi! = e^(−nλ) λ^(Σ_{i=1}^n xi) / Π_{i=1}^n (xi!).

To simplify things (as per the suggestion in the lesson), take logs:
ℓn(L(λ)) = −nλ + (Σ_{i=1}^n xi) ℓn(λ) + C,

where C = −ℓn(Π_{i=1}^n xi!) is a constant with respect to λ.

The recipe says that you now set the derivative = 0, and solve for λ:
(d/dλ) ℓn(L(λ)) = −n + (Σ_{i=1}^n xi)/λ = 0.

Solving, we get λ̂ = Σ_{i=1}^n Xi/n = X̄. (I won’t do a second derivative test because
I’m lazy.) Thus, the correct answer is (a).
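
As an optional numerical illustration (not the lesson's derivation), one can maximize the Poisson log-likelihood for some made-up data and confirm that the maximizer lands on the sample mean; the data below are purely hypothetical:

    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.special import gammaln

    data = np.array([2, 0, 3, 1, 4, 2, 2])   # hypothetical Poisson counts

    def neg_log_lik(lam):
        # negative Poisson log-likelihood: -(sum_i [x_i ln(lam) - lam - ln(x_i!)])
        return -np.sum(data * np.log(lam) - lam - gammaln(data + 1))

    res = minimize_scalar(neg_log_lik, bounds=(1e-6, 20), method='bounded')
    print(res.x, data.mean())   # both are (numerically) 2.0, the sample mean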

13. (Lesson 2.15: Maximum Likelihood Estimation.) BONUS: Suppose that we are
looking at i.i.d. Exp(λ) customer service times. We observe times of 2, 4, and 9
minutes. What’s the maximum likelihood estimator of λ²?

(a) 5
(b) 1/5
(c) 25
(d) 1/25

Solution: From the lesson, we know that the MLE of λ is λ̂ = 1/X̄ = 1/5.

Therefore, the Invariance Property states that the MLE of λ² is λ̂² = 1/25. This
is choice (d). 
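
A quick arithmetic check in Python (not part of the original solution):

    # Exponential MLE: lambda_hat = 1 / sample mean; invariance then gives the MLE of lambda^2.
    times = [2, 4, 9]
    lam_hat = 1 / (sum(times) / len(times))   # 1/5
    print(lam_hat**2)                         # 0.04 = 1/25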

14. (Lesson 2.16: Confidence Intervals.) BONUS: Suppose we collect the following
observations: 7, −2, 1, 6 (as in a previous question in this homework). Let’s
assume that these guys are i.i.d. from a normal distribution with unknown variance
σ². Give me a two-sided 95% confidence interval for the mean µ.

(a) [−2, 7]
(b) [−3.75, 9.75]
(c) [−6.75, 6.75]
(d) [3.75, 9.75]

Solution: The confidence interval will be of the form


µ ∈ X̄ ± t_{α/2, n−1} √(S²/n),

where n = 4, the sample mean X̄ = 3, the sample variance S² = 18, and α = 0.05, so
that the t-distribution quantile (which you have to look up) is t_{0.025, 3} = 3.182. All
of this stuff yields

µ ∈ 3 ± 3.182 √(18/4) = 3 ± 6.75 = [−3.75, 9.75].

This is choice (b). 
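
Finally, the same interval computed in Python (a sanity check only; scipy's t.ppf replaces the table lookup):

    import numpy as np
    from scipy.stats import t

    data = np.array([7, -2, 1, 6])
    n, xbar, s2 = len(data), data.mean(), data.var(ddof=1)
    half_width = t.ppf(0.975, df=n - 1) * np.sqrt(s2 / n)
    print(xbar - half_width, xbar + half_width)   # roughly -3.75 and 9.75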
