
McGuffey  Fall 2021 Stat 310 Problem Set

UNIT 5 PROBLEM SET (PS5)


Due: 11 pm CT on Monday, Nov. 15, 2021.
Submission instructions are on Canvas.

All textbook problems refer to problems from An Introduction to Mathematical Statistics and Its
Applications, 6th Edition, by Larsen and Marx. All problems will be graded for effort and
completeness. Work must be shown thoroughly for a problem to be considered complete. See Canvas
for answers and complete solutions.

Lesson 5.1: Inference Framework & Method of Moments


1. Assume the following observations are drawn from a Geometric(p) distribution: 9, 5, 2, 12, 20, 1.
Use the method of moments to estimate p.

2. Suppose that y1 = 0.42, y2 = 0.10, y3 = 0.65, and y4 = 0.23 is a random sample of size 4
from the PDF
f_Y(y; θ) = θ y^{θ−1} for 0 ≤ y ≤ 1.
Find the method of moments estimate for θ.

3. (Textbook) Problem 5.2.17 on page 293, but only the MOM parts. That is, find a formula for
the MOM estimate and calculate the MOM estimate for the given data.

4. The gamma distribution is often used to model the waiting time until the kth success occurs.
It has parameters k and λ. Its mean is µ = k/λ, and its variance is σ² = k/λ². Assume the
following data are drawn from a Gamma(k, λ) distribution:

2.4, 1.1, 1.9, 2.5, 4.5.

Use MOM to estimate k and λ.
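
Note (optional, for Problem 4): the written derivation is what is graded, but the resulting system can be sanity-checked numerically. The following minimal Python sketch is only an illustration and assumes NumPy is available; it matches the sample mean and the sample variance (computed with ddof=0, an assumed convention for the second central moment) to k/λ and k/λ².

import numpy as np

# Data from Problem 4.
x = np.array([2.4, 1.1, 1.9, 2.5, 4.5])

# Method of moments: set sample mean = k/lambda and sample variance = k/lambda^2.
m = x.mean()            # first sample moment
v = x.var(ddof=0)       # second central sample moment (ddof=0 by assumption)

lam_hat = m / v         # dividing the two moment equations eliminates k
k_hat = m * lam_hat     # back-substitute to recover k

print(f"MOM estimates: k = {k_hat:.3f}, lambda = {lam_hat:.3f}")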

Lesson 5.2: Maximum Likelihood Estimation


Note that Problems 5-8 correspond to Problems 1-4 above, in that they are based on the same
underlying distributions.

5. Assume the following observations are drawn from a Geometric(p) distribution: 9, 5, 2, 12, 20, 1.
Use maximum likelihood to estimate p.

6. (Textbook) Problem 5.2.7 on page 288. It doesn’t explicitly say, but you should use maximum
likelihood estimation.

7. (Textbook) Problem 5.2.12 on page 289. Hint: What happens when the support of the r.v.
depends on the parameter?


8. The PDF for the gamma distribution is

f_X(x; k, λ) = λ^k x^{k−1} e^{−λx} / Γ(k) for x > 0

where Γ is an extension of the factorial function. Assume the following data are drawn from
a Gamma(k, λ) distribution:
2.4, 1.1, 1.9, 2.5, 4.5.
Write out a set of equations that, when solved, give the maximum likelihood estimates for
k and λ. Your answer should be two equations in two unknowns; both equations should have a
right-hand side of “= 0”. The term Γ′(k) [the derivative of Γ(k)] will appear in one of these
equations; leave this term as is, without simplification. (An optional numerical check is
sketched after Problem 9.)

9. Let X_1 = k_1, X_2 = k_2, ..., X_n = k_n be a random sample of size n from the Poisson
distribution,

p_X(k; λ) = e^{−λ} λ^k / k! for k = 0, 1, 2, ...,

where λ is unknown. Show that k̄, the observed sample mean, is the maximum likelihood
estimate for λ.
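
Note (optional numerical check for Problem 8, referenced there): the two equations have no closed-form solution, but their root can be approximated numerically. Here is a minimal Python sketch, assuming SciPy is available; scipy.stats.gamma.fit performs maximum likelihood with shape parameter a = k and scale = 1/λ, and floc=0 fixes the location at zero so the fitted form matches the PDF given in Problem 8.

import numpy as np
from scipy import stats

# Data from Problem 8.
x = np.array([2.4, 1.1, 1.9, 2.5, 4.5])

# Two-parameter gamma MLE: shape a corresponds to k, scale corresponds to 1/lambda.
# floc=0 pins the location parameter at zero (the PDF in Problem 8 has no shift).
k_hat, loc, scale = stats.gamma.fit(x, floc=0)
lam_hat = 1.0 / scale

print(f"MLE estimates: k = {k_hat:.3f}, lambda = {lam_hat:.3f}")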

Lesson 5.3: Estimator Properties


10. In Problem 3 above (textbook problem 5.2.17) you found an estimator of θ. Is it unbiased? If
unbiased, show it. If biased, calculate the bias.

11. Let X_1, X_2, ..., X_n be a random sample of size n from the PDF

f_X(x; θ) = (1/θ) e^{−x/θ}, x > 0.

(a) Find the mean square error (MSE) of θ̂_1 = X_1.
(b) Find the MSE of θ̂_2 = X̄_n.
(c) Which estimator (θ̂_1 or θ̂_2) is more efficient? Briefly justify your answer.
(An optional simulation check is sketched after Problem 12.)

12. Consider Y ∼ Binom(n, p), where n is known and p is the unknown parameter of interest.

(a) Define p̂ = Y/n. Show that p̂ is unbiased for p.
(b) Recall that we may express a binomial r.v. as the sum of n i.i.d. Bernoulli r.v.s; that is,
Y = X_1 + X_2 + · · · + X_n where the X_i are i.i.d. Bern(p). We may write the Bernoulli PMF in the form

p_{X_i}(k; p) = p^k (1 − p)^{1−k}, k = 0, 1; 0 < p < 1.

Find the Cramér-Rao lower bound for p_{X_i}(k; p). Still considering p̂ = Y/n, how does Var(p̂)
compare with this bound?
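
Note (optional simulation check for Problem 11, referenced there): a minimal Monte Carlo sketch, assuming NumPy is available and using an illustrative true value θ = 2 and sample size n = 10 (both assumptions, not part of the problem). It estimates the MSE of θ̂_1 = X_1 and θ̂_2 = X̄_n from repeated samples of the exponential PDF in Problem 11; it checks, rather than replaces, the analytical calculations.

import numpy as np

rng = np.random.default_rng(0)

theta = 2.0       # illustrative true value (assumption)
n = 10            # illustrative sample size (assumption)
reps = 100_000    # number of simulated samples

# Each row is one sample of size n from the exponential density with mean theta.
samples = rng.exponential(scale=theta, size=(reps, n))

theta1 = samples[:, 0]         # estimator theta_hat_1 = X_1
theta2 = samples.mean(axis=1)  # estimator theta_hat_2 = X_bar_n

mse1 = np.mean((theta1 - theta) ** 2)
mse2 = np.mean((theta2 - theta) ** 2)

print(f"Simulated MSE of X_1:     {mse1:.3f}")
print(f"Simulated MSE of X_bar_n: {mse2:.3f}")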
