Stat100b hw6 w25

The document is a homework assignment for Statistics 100B at UCLA, focusing on various statistical estimators and their properties. It includes questions about Poisson and normal distributions, unbiased estimators, bias reduction techniques, and efficiency of estimators. Each question requires theoretical proofs and derivations related to statistical estimation methods.


University of California, Los Angeles

Department of Statistics
Statistics 100B Instructor: Nicolas Christou
Homework 6
Answer the following questions:
a. Let X1, X2, ..., Xn be i.i.d. Poisson(λ) and let X̄ and S² be the sample mean and sample variance respectively. Each of these two estimators has expected value equal to λ (why?). Which estimator is better? Show that X̄ is an efficient estimator of λ and therefore X̄ is at least as good as S².
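(Not part of the assignment.) A minimal NumPy sketch of the claim in (a): both estimators center at λ, while X̄ has the smaller variance, close to the Cramér-Rao bound λ/n. The values of λ, n, and the seed are arbitrary choices for the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n, reps = 4.0, 30, 20_000

samples = rng.poisson(lam, size=(reps, n))
xbar = samples.mean(axis=1)          # sample mean of each replicate
s2 = samples.var(axis=1, ddof=1)     # sample variance (ddof=1 makes it unbiased)

print(xbar.mean(), s2.mean())        # both close to lam = 4 (both unbiased)
print(xbar.var() < s2.var())         # True: X̄ has the smaller variance
print(lam / n, xbar.var())           # X̄'s variance is close to the bound λ/n
```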
b. Let X1, ..., Xn be i.i.d. N(θ, θ²), θ > 0. For this model both X̄ and cS are unbiased estimators of θ, where c = √(n−1) Γ((n−1)/2) / (√2 Γ(n/2)). Show that for any α the estimator αX̄ + (1−α)cS is also an unbiased estimator of θ. For what value of α does this estimator have the minimum variance?
c. Let X1, ..., Xn be i.i.d. random variables with Xi ∼ Γ(α, β) with α known. Find an unbiased estimator of 1/β. (Find E(1/X̄) and then adjust it to be unbiased for 1/β.)
d. A general technique for reducing bias in an estimator is the following. Let X1, X2, ..., Xn be i.i.d. random variables, and let θ̂ be some estimator of a parameter θ. To reduce the bias the method works as follows: we calculate θ̂(i), i = 1, 2, ..., n, just as θ̂ is calculated but using the n − 1 observations with Xi removed from the sample. The new estimator is given by θ̂∗ = nθ̂ − ((n−1)/n) Σ_{i=1}^n θ̂(i). To apply this concept we will use the Bernoulli distribution. Let X1, X2, ..., Xn be i.i.d. Bernoulli(p). It is given that the MLE of p² is θ̂ = ( (1/n) Σ_{i=1}^n Xi )² = X̄². Show that θ̂ is not unbiased for p².

e. Refer to question (d). Use the technique described above to reduce the bias in θ̂. Does
the method remove the bias entirely in this example?
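(Not part of the assignment.) A minimal sketch of the bias-reduction recipe in (d)-(e), with the leave-one-out estimator θ̂∗ = nθ̂ − ((n−1)/n) Σ θ̂(i) implemented generically. The values of p, n, and the seed are arbitrary simulation choices; the printed means let you compare the raw MLE of p² against the corrected version.

```python
import numpy as np

def bias_reduced(theta_hat, x):
    """theta_hat: estimator function of a 1-D sample; x: 1-D sample array."""
    n = len(x)
    full = theta_hat(x)
    # theta-hat computed n times, each time leaving one observation out
    loo = np.array([theta_hat(np.delete(x, i)) for i in range(n)])
    return n * full - (n - 1) / n * loo.sum()

rng = np.random.default_rng(1)
p, n, reps = 0.3, 10, 20_000
mle = lambda x: x.mean() ** 2          # MLE of p², biased by p(1-p)/n

raw, corrected = [], []
for _ in range(reps):
    x = rng.binomial(1, p, n)
    raw.append(mle(x))
    corrected.append(bias_reduced(mle, x))

print(np.mean(raw))        # close to p**2 + p*(1-p)/n, i.e. visibly biased
print(np.mean(corrected))  # close to p**2 = 0.09
```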
f. Find the Rao-Cramér lower bound of an estimator of θ, but do not assume that θ̂ is an unbiased estimator of θ. Please show the entire derivation.

g. Let X1, ..., Xn be i.i.d. random variables with Xi ∼ Γ(α, β) with α known. Is β̂ = X̄/α an efficient estimator of β?
h. Let X1, X2, ..., Xn denote a random sample from a normal distribution with known µ = 0 and unknown variance σ². Let σ̂² = (1/n) Σ_{i=1}^n Xi² be an estimator of σ². Is it unbiased? Find the variance of this estimator. Is it an efficient estimator of σ²?
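(Not part of the assignment.) A minimal simulation sketch for (h): with µ = 0 known, the estimator (1/n) Σ Xi² centers at σ², and its variance sits near the Cramér-Rao bound 2σ⁴/n. The values of σ², n, and the seed are arbitrary simulation choices.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2, n, reps = 2.0, 25, 40_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
est = (x ** 2).mean(axis=1)        # sigma²-hat = (1/n) * sum(X_i²), mu = 0 known

print(est.mean())                  # close to sigma2 = 2.0 (unbiased)
crlb = 2 * sigma2 ** 2 / n         # Cramér-Rao lower bound for sigma²
print(crlb, est.var())             # the simulated variance is close to the bound
```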
i. Let θ̂1 and θ̂2 be two independent unbiased estimators of a parameter θ. Suppose var(θ̂1) = 2var(θ̂2). Find the constants c1 and c2 so that c1θ̂1 + c2θ̂2 is unbiased with the smallest variance.
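(Not part of the assignment.) One way to sanity-check a candidate answer to (i) numerically: unbiasedness forces c1 + c2 = 1, and since the estimators are independent the variances add, so the combined variance can be scanned over c1. The grid and the choice var(θ̂2) = 1 are arbitrary.

```python
import numpy as np

v2 = 1.0
v1 = 2 * v2                                      # var(θ̂1) = 2 var(θ̂2)
c1 = np.linspace(0, 1, 1001)                     # c2 = 1 - c1 keeps it unbiased
combo_var = c1 ** 2 * v1 + (1 - c1) ** 2 * v2    # independence: variances add
best = c1[combo_var.argmin()]
print(best)    # the minimizer of the variance over the unbiased combinations
```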
j. Let X1, ..., Xn be i.i.d. random variables with Xi ∼ N(µ, σ). Find c so that the MSE E[cS² − σ²]² is minimized. How does this new estimator cS² compare to S²?
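(Not part of the assignment.) A minimal simulation sketch for (j), comparing the mean squared error of a shrunken cS² against plain S² (c = 1) in the normal model. The parameter values and the candidate c = (n−1)/(n+1) used here are illustration choices for the check, not a substitute for the derivation.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma2, n, reps = 0.0, 1.0, 10, 200_000

x = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
s2 = x.var(axis=1, ddof=1)                  # sample variance S²

mse = lambda c: np.mean((c * s2 - sigma2) ** 2)
# Compare the MSE of the shrunken estimator against plain S² (c = 1)
print(mse((n - 1) / (n + 1)) < mse(1.0))    # True
```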
