Lecture Note 15
Unbiased Estimation
Let {Fθ : θ ∈ Θ}, where Θ ⊂ R^k, be a nonempty family of probability distributions. Let
X = (X1, X2, . . . , Xn) be a random vector with distribution function Fθ and
sample space X. Let ψ : Θ → R be a real-valued parametric function. A (Borel-
measurable) function T : X → R is said to be unbiased for ψ if

Eθ [T(X)] = ψ(θ) for all θ ∈ Θ,     (1)

provided that Eθ |T(X)| < ∞ for all θ ∈ Θ. Any parametric function ψ(·) is said to
be estimable if we can find an unbiased estimator T that satisfies (1). An estimator that
is not unbiased is called biased, and the function b(T, ψ) defined by

b(T, ψ; θ) = Eθ [T(X)] − ψ(θ)

is called the bias of T.
(a) If T is unbiased for θ, g(T ) is not, in general, an unbiased estimator of g(θ) unless
g is a linear function.
(b) Unbiased estimators do not always exist. For example, suppose X is a single observation from
b(1, p), and we wish to estimate ψ(p) = p^2. For an estimator T to be unbiased for
p^2, we must have

Ep [T(X)] = (1 − p) T(0) + p T(1) = p^2,

that is,

p^2 = p{T(1) − T(0)} + T(0)

must hold for all p in the interval [0, 1], which is impossible: the left-hand side is
quadratic in p while the right-hand side is linear.
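A small numeric illustration of this failure (plain Python, no libraries needed): unbiasedness at p = 0 and p = 1 pins down T(0) and T(1), yet the resulting estimator is then biased at every interior p.

```python
# For a single Bernoulli(1, p) observation, E_p[T(X)] = (1 - p)*T(0) + p*T(1),
# which is linear in p, while the target p^2 is quadratic.
# Unbiasedness at p = 0 forces T(0) = 0; at p = 1 it forces T(1) = 1.
T0, T1 = 0.0, 1.0
for p in (0.25, 0.5, 0.75):
    expectation = (1 - p) * T0 + p * T1   # equals p, not p^2
    bias = expectation - p**2
    print(f"p={p}: E[T]={expectation}, bias={bias:.4f}")
```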
(c) Sometimes an unbiased estimator may be absurd. Let X ∼ Poisson(λ) and define
ψ(λ) = e^{−3λ}. We show that T(X) = (−2)^X is an unbiased estimator for ψ(λ). We
have

Eλ [T(X)] = Σ_{x=0}^∞ (−2)^x e^{−λ} λ^x / x! = e^{−λ} Σ_{x=0}^∞ (−2λ)^x / x! = e^{−λ} e^{−2λ} = e^{−3λ} = ψ(λ).

However, T(x) = (−2)^x is positive if x is even and negative if x is odd, which is
absurd since ψ(λ) > 0.
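This behavior can be checked by simulation; a sketch assuming NumPy is available (λ is an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5
x = rng.poisson(lam, size=2_000_000)
t = (-2.0) ** x                       # the "absurd" unbiased estimator
print(t.mean(), np.exp(-3 * lam))     # both close to e^{-1.5} ~ 0.2231
print((t < 0).mean())                 # ~ 0.316 = P(X odd): negative estimates of a positive quantity
```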
Let θ0 ∈ Θ and let U(θ0 ) be the class of all unbiased estimators T of ψ(θ0 ) such that
Eθ0 [T 2 ] < ∞. Then, T0 ∈ U(θ0 ) is called a locally minimum variance unbiased
estimator (LMVUE) of ψ(θ0 ) if
Eθ0 [(T0 − ψ(θ0 ))2 ] ≤ Eθ0 [(T − ψ(θ0 ))2 ]
holds for all T ∈ U(θ0 ).
Let U be the set of all unbiased estimators T of ψ(θ) such that Eθ [T 2 ] < ∞ for all
θ ∈ Θ. An estimator T0 ∈ U is called a uniformly minimum variance unbiased
estimator (UMVUE) of ψ(θ) if
Eθ [(T0 − ψ(θ))2 ] ≤ Eθ [(T − ψ(θ))2 ]
for all θ ∈ Θ and every T ∈ U.
Theorem 1. Let a1, a2, . . . , an be any set of real numbers such that Σ_{i=1}^n a_i = 1. Let X1, X2, . . . , Xn
be independent random variables (RVs) with common mean µ and variances σ_i^2 for i =
1, 2, . . . , n. Then, the estimator

T = Σ_{i=1}^n a_i X_i

is an unbiased estimator of µ with variance

Var(T) = Σ_{i=1}^n a_i^2 σ_i^2  (Exercise!).
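A simulation sketch of this result (assuming NumPy; the weights and standard deviations below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 3.0
sigmas = np.array([1.0, 2.0, 0.5])       # heteroscedastic standard deviations
a = np.array([0.5, 0.2, 0.3])            # weights summing to 1
reps = 500_000
X = mu + sigmas * rng.standard_normal((reps, 3))
T = X @ a                                # T = sum_i a_i X_i, one value per replication
print(T.mean())                          # ~ mu  (unbiased)
print(T.var(), np.sum(a**2 * sigmas**2)) # both ~ 0.4325 = 0.25*1 + 0.04*4 + 0.09*0.25
```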
Corollary 1.1. Let X1, X2, . . . , Xn be iid random variables (RVs) with common mean µ
and variance σ^2. Then the Best Linear Unbiased Estimator (BLUE) of µ is

X̄ = (1/n) Σ_{i=1}^n X_i.
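The corollary follows from Theorem 1 by minimizing the variance over the weights; a short sketch in the iid case (σ_i^2 = σ^2):

```latex
% Minimize Var(T) = \sigma^2 \sum_i a_i^2 subject to \sum_i a_i = 1.
% By the Cauchy--Schwarz inequality,
\[
1 = \Big(\sum_{i=1}^n a_i\Big)^{2} \le n \sum_{i=1}^n a_i^{2},
\qquad\text{so}\qquad
\operatorname{Var}(T) = \sigma^{2} \sum_{i=1}^n a_i^{2} \ge \frac{\sigma^{2}}{n},
\]
% with equality if and only if a_i = 1/n for all i, i.e., T = \bar{X}.
```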
Let X and Y be random variables (RVs) defined on a probability space (Ω, S, P ), and
let h be a Borel-measurable function. Then the conditional expectation of h(X), given
Y , written as E[h(X) | Y ], is an RV that takes the value E[h(X) | y], defined by
E[h(X) | y] = Σ_x h(x) P(X = x | Y = y),  if (X, Y) is of the discrete type with P(Y = y) > 0,
E[h(X) | y] = ∫_{−∞}^{∞} h(x) f_{X|Y}(x | y) dx,  if (X, Y) is of the continuous type with f_Y(y) > 0.
The following theorems provide the most useful methods for finding UMVUEs.
Theorem 2 (Rao–Blackwell Theorem). Let {Fθ : θ ∈ Θ} be a family of probability
distribution functions, and let h be any statistic in U, where U is the (nonempty) class of
all unbiased estimators of ψ(θ) with Eθ h^2 < ∞. Let T be a sufficient statistic for the family
{Fθ : θ ∈ Θ}. Then the conditional expectation ϕ(T) = E[h | T] does not depend on θ and is an unbiased
estimator of ψ(θ). Moreover,

Eθ [(ϕ(T) − ψ(θ))^2] ≤ Eθ [(h − ψ(θ))^2], ∀θ,

with equality if and only if Pθ {h = ϕ(T)} = 1 for all θ.
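The variance reduction in Theorem 2 can be illustrated numerically; a Monte Carlo sketch assuming NumPy. For a U(0, θ) sample, the crude unbiased estimator h = 2X1 can be improved by conditioning on the sufficient statistic X(n); a direct calculation gives E[2X1 | X(n)] = (n + 1) X(n) / n, which is used below in closed form.

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 5.0, 10, 200_000
X = rng.uniform(0, theta, size=(reps, n))
h = 2 * X[:, 0]                      # crude unbiased estimator: E[2*X1] = theta
T = X.max(axis=1)                    # sufficient statistic X_(n)
phi = (n + 1) / n * T                # E[h | T] = (n+1)/n * X_(n) for this model
print(h.mean(), phi.mean())          # both ~ theta = 5
print(h.var(), phi.var())            # conditioning sharply reduces the variance
```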
Theorem 3. Let U be the nonempty class of unbiased estimators as defined in Theorem
2. Then there exists at most one UMVUE for ψ(θ).
Theorem 4. Let U be the class of all unbiased estimators T of ψ(θ), where θ ∈ Θ, with
Eθ [T^2] < ∞ for all θ, and suppose that U is nonempty. Let U0 be the class of all unbiased
estimators v of 0, that is,

U0 = {v : Eθ [v] = 0 and Eθ [v^2] < ∞ for all θ ∈ Θ}.

Then T0 ∈ U is the UMVUE of ψ(θ) if and only if T0 is uncorrelated with all unbiased
estimators of 0, for all θ, i.e.,

Eθ [T0 v] = 0, ∀θ ∈ Θ, ∀v ∈ U0.
Example 1. Let X1, X2, . . . , Xn be iid Poisson(λ) RVs, λ > 0, and let T = Σ_{i=1}^n X_i. Then

Eλ [T/n] = Eλ (X̄) = λ.

Since X̄ = T/n is unbiased for λ and is a function of T, which is a complete sufficient
statistic, by Theorem 5, X̄ is the UMVUE of λ.
Example 2. Suppose X1, X2, . . . , Xn are iid U(0, θ), where θ > 0. Find the
UMVUE of θ.
Solution. T = T(X) = X(n) is a complete sufficient statistic. We know

Eθ (T) = Eθ (X(n)) = n/(n + 1) θ, for all θ > 0,

which implies

Eθ [(n + 1)/n · X(n)] = θ.

Since (n + 1)/n · X(n) is an unbiased estimator of θ and is a function of X(n), which is a
complete sufficient statistic, therefore

(n + 1)/n · X(n)

is the UMVUE of θ.
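A quick simulation check of the bias correction (a sketch assuming NumPy; θ and n are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 5, 400_000
X = rng.uniform(0, theta, size=(reps, n))
raw = X.max(axis=1)                      # X_(n): biased low, E = n/(n+1) * theta
umvue = (n + 1) / n * raw                # bias-corrected UMVUE
print(raw.mean(), n / (n + 1) * theta)   # both ~ 5/3
print(umvue.mean(), theta)               # both ~ 2.0
```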
Example 3 Suppose X1 , X2 , . . . , Xn are iid Poisson(θ), where θ > 0. Find the
UMVUE for
ψ(θ) = Pθ (X = 0) = e−θ .
Solution. T = T(X) = Σ_{i=1}^n X_i is a complete sufficient statistic for θ, since the
Poisson is a member of the exponential family.
Any unbiased estimator W will "work," so let us keep our choice simple, say
W = I(X1 = 0), so that Eθ [W] = Pθ(X1 = 0) = e^{−θ} and ϕ(t) = E[W | T = t] = P(X1 = 0 | T = t). Thus,
P(X1 = 0 | T = t) = Pθ(X1 = 0) Pθ(Σ_{i=2}^n X_i = t) / Pθ(T = t).

Since X1 ∼ Poisson(θ), Σ_{i=2}^n X_i ∼ Poisson((n − 1)θ), and T = Σ_{i=1}^n X_i ∼ Poisson(nθ), therefore
ϕ(t) = ((n − 1)/n)^t.
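A simulation check that ϕ(T) = ((n − 1)/n)^T is indeed unbiased for e^{−θ} (a sketch assuming NumPy; θ and n are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)
theta, n, reps = 1.5, 8, 300_000
X = rng.poisson(theta, size=(reps, n))
T = X.sum(axis=1)                    # complete sufficient statistic, ~ Poisson(n*theta)
umvue = ((n - 1) / n) ** T           # phi(T) from the example
print(umvue.mean(), np.exp(-theta))  # both ~ e^{-1.5} ~ 0.2231
```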
Theorem 7. If UMVUEs Ti exist for the real functions ψi, i = 1, 2, of θ, then UMVUEs also exist for
λψi (λ real) and for ψ1 + ψ2, and they are given by λTi and T1 + T2, respectively.