Unit 5 Problem Set
All textbook problems refer to problems from An Introduction to Mathematical Statistics and Its
Applications, 6th Edition, by Larsen and Marx. All problems will be graded for effort and completeness.
Work must be shown thoroughly for a problem to be considered complete. See Canvas
for answers and complete solutions.
2. Suppose that y1 = 0.42, y2 = 0.10, y3 = 0.65, and y4 = 0.23 is a random sample of size 4
from the PDF
$$f_Y(y; \theta) = \theta y^{\theta - 1} \quad \text{for } 0 \le y \le 1.$$
Find the method of moments estimate for θ.
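If you want to sanity-check your hand-derived estimate, the short Python sketch below (optional; it assumes NumPy and SciPy are available) solves the first-moment equation numerically, using the fact that $E[Y] = \theta/(\theta + 1)$ for this PDF.
```python
# Optional numerical check of the Problem 2 MOM estimate.
# Uses the fact that E[Y] = theta / (theta + 1) for f_Y(y; theta) = theta * y^(theta - 1) on [0, 1].
import numpy as np
from scipy.optimize import brentq

y = np.array([0.42, 0.10, 0.65, 0.23])
ybar = y.mean()  # sample first moment

# Method of moments: set the theoretical mean equal to the sample mean and solve for theta.
moment_eq = lambda theta: theta / (theta + 1) - ybar
theta_mom = brentq(moment_eq, 1e-6, 1e6)  # root-find over a wide positive bracket
print(theta_mom)  # about 0.54 for these data
```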
3. (Textbook) Problem 5.2.17 on page 293, but only the MOM parts. That is, find a formula for
the MOM estimate and calculate the MOM estimate for the given data.
4. The gamma distribution is often used to model the waiting time until the $k$th success occurs.
It has parameters $k$ and $\lambda$. Its mean is $\mu = k/\lambda$, and its variance is $\sigma^2 = k/\lambda^2$. Assume the
following data are drawn from a Gamma(k, λ) distribution:
5. Assume the following observations are drawn from a Geometric(p) distribution: 9, 5, 2, 12, 20, 1.
Use maximum likelihood to estimate p.
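A numerical sanity check (optional, Python/SciPy) is sketched below; it assumes the "number of trials until the first success" form of the geometric PMF, $p_Y(y; p) = (1-p)^{y-1} p$ for $y = 1, 2, \ldots$, and maximizes the log-likelihood directly.
```python
# Optional numerical check of the Problem 5 MLE, assuming the geometric PMF
# P(Y = y) = (1 - p)^(y - 1) * p for y = 1, 2, ...
import numpy as np
from scipy.optimize import minimize_scalar

obs = np.array([9, 5, 2, 12, 20, 1])

def neg_log_lik(p):
    # Negative log-likelihood of the sample under Geometric(p).
    return -np.sum((obs - 1) * np.log(1 - p) + np.log(p))

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print(res.x)  # should match your closed-form answer (about 0.12 here)
```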
6. (Textbook) Problem 5.2.7 on page 288. It doesn’t explicitly say, but you should use maximum
likelihood estimation.
7. (Textbook) Problem 5.2.12 on page 289. Hint: What happens when the support of the r.v.
depends on the parameter?
$$f_X(x; k, \lambda) = \frac{\lambda^k x^{k-1} e^{-\lambda x}}{\Gamma(k)} \quad \text{for } x > 0$$
where Γ is an extension of the factorial function. Assume the following data are drawn from
a Gamma(k, λ) distribution:
2.4, 1.1, 1.9, 2.5, 4.5.
Write out a set of equations that, when solved, give the maximum likelihood estimates for
$k$ and $\lambda$. Your answer should be two equations in two unknowns; both equations
should have a right-hand side of "$= 0$". The term $\Gamma'(k)$ [the derivative of $\Gamma(k)$] will appear in one of
these equations; leave this term as is, without simplification.
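After you have derived the two score equations by hand, you can (optionally) solve them numerically. The Python/SciPy sketch below uses scipy.special.digamma(k), which equals $\Gamma'(k)/\Gamma(k)$, in place of the unsimplified $\Gamma'(k)$ term, and root-finds on the log scale so that $k$ and $\lambda$ stay positive.
```python
# Optional numerical solution of the gamma MLE score equations.
# digamma(k) = Gamma'(k) / Gamma(k), standing in for the unsimplified Gamma'(k) term.
import numpy as np
from scipy.special import digamma
from scipy.optimize import fsolve

x = np.array([2.4, 1.1, 1.9, 2.5, 4.5])
n = len(x)

def score(log_params):
    k, lam = np.exp(log_params)  # work on the log scale so k and lambda stay positive
    dl_dk = n * np.log(lam) + np.sum(np.log(x)) - n * digamma(k)   # d(log-likelihood)/dk
    dl_dlam = n * k / lam - np.sum(x)                              # d(log-likelihood)/dlambda
    return [dl_dk, dl_dlam]

k_hat, lam_hat = np.exp(fsolve(score, x0=[0.0, 0.0]))
print(k_hat, lam_hat)  # roughly 5.1 and 2.0 for these data
```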
$$f_X(x; \theta) = \frac{1}{\theta} e^{-x/\theta}, \quad x > 0.$$
12. Consider Y ∼ Binom(n, p), where n is known and p is the unknown parameter of interest.
(a) Define $\hat{p} = Y/n$. Show that $\hat{p}$ is unbiased for $p$.
(b) Recall that we may express a binomial r.v. as the sum of n i.i.d. Bernoulli r.v.s; that is,
$Y = X_1 + X_2 + \cdots + X_n$ where $X_i \overset{\text{iid}}{\sim} \text{Bern}(p)$. We may write the Bernoulli PMF in the form