
Lehmann-Scheffé Theorem: Proof

The Lehmann-Scheffé theorem states that if T(X) is a complete sufficient statistic and W(X) is an unbiased estimator of τ(θ), then φ(T) = E(W|T) is the unique uniformly minimum variance unbiased estimator (UMVUE) of τ(θ). The theorem can be applied by first finding a complete sufficient statistic T, then finding an unbiased estimator W(X) and taking the conditional expectation E(W|T) to obtain the UMVUE φ(T). For an exponential family with k parameters, if the range of the coefficients contains an open k-rectangle, then the sufficient statistic is also complete.
Copyright © Attribution Non-Commercial (BY-NC)
Lehmann-Scheffé Theorem. If T(X) is a complete sufficient statistic and W(X) is an unbiased estimator of τ(θ), then φ(T) = E(W|T) is an UMVUE of τ(θ). Furthermore, φ(T) is the unique UMVUE in the sense that if T′ is any other UMVUE, then P_θ(φ(T) = T′) = 1 for all θ.

Proof. Let W be any unbiased estimator of τ(θ). Then by the Rao-Blackwell Theorem, φ(T) = E(W|T) is such that

    Var_θ(φ(T)) ≤ Var_θ(W).

Let W̃ be any other unbiased estimator and φ̃(T) = E(W̃|T). Then

    E_θ[φ(T) − φ̃(T)] = 0 for all θ,

and by completeness of T it follows that

    P_θ(φ(T) = φ̃(T)) = 1.

Since φ(T) and φ̃(T) agree with probability one, Var_θ(φ(T)) = Var_θ(φ̃(T)) ≤ Var_θ(W̃) for every unbiased estimator W̃. Hence, φ(T) is the unique UMVUE.
Note: When T is complete and sufficient, the Lehmann-Scheffé Theorem implies that there is at most one function of T that's unbiased for τ(θ).
Application of Lehmann-Scheffé
1. Find a complete sufficient statistic T.
2. If we can find an unbiased estimator V(T), we've found the UMVUE.
3. Otherwise, find any unbiased estimator W(X) and then compute φ(T) = E(W|T).
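The recipe above can be sketched with a Monte Carlo check. The Bernoulli setting here is an assumed illustration, not from the notes: for X_1, …, X_n i.i.d. Bernoulli(p), W = X_1 is a crude unbiased estimator of p, T = Σ X_i is complete and sufficient, and E(X_1 | T) = T/n, so the sample mean is the unique UMVUE.

```python
import numpy as np

# Sketch of the Lehmann-Scheffe recipe (assumed Bernoulli example):
# X_1, ..., X_n i.i.d. Bernoulli(p).  W = X_1 is a crude unbiased estimator
# of p, T = sum(X_i) is complete and sufficient, and E(X_1 | T) = T/n.
rng = np.random.default_rng(0)
n, p, reps = 10, 0.3, 100_000

X = rng.binomial(1, p, size=(reps, n))
W = X[:, 0].astype(float)     # step 3: any unbiased estimator W(X)
phi = X.mean(axis=1)          # phi(T) = E(W | T) = T/n, the UMVUE

print(W.mean(), phi.mean())   # both close to p = 0.3
print(phi.var() < W.var())    # conditioning on T reduced the variance
```

Both estimators are unbiased, but conditioning on the complete sufficient statistic collapses the variance from p(1−p) to p(1−p)/n, exactly the Rao-Blackwell improvement used in the proof.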
Completeness of the Exponential Family

k-parameter exponential family:

    f(x|θ) = exp( Σ_{i=1}^k c_i(θ) T_i(x) + d(θ) + S(x) ) I_A(x),

where A doesn't depend on θ.

Theorem 4. Let {f(x|θ) : θ ∈ Θ} be a k-parameter exponential family. Suppose the range of (c_1(θ), …, c_k(θ)) contains an open k-rectangle. Then (T_1(X), …, T_k(X)) is complete as well as sufficient. (Bickel and Doksum, p. 123)

Open rectangle:

    {(u_1, …, u_k) : a_i < u_i < b_i, 1 ≤ i ≤ k}
Example 22 (cont.)

    f(x|μ, σ) = exp( −(1/(2σ²)) Σ_{i=1}^n x_i² + (μ/σ²) Σ_{i=1}^n x_i − nμ²/(2σ²) − n log σ − (n/2) log(2π) )

    c_1(μ, σ) = −(2σ²)^{−1}
    c_2(μ, σ) = μ/σ²

The range of (c_1, c_2) is

    {(a, b) : a < 0, −∞ < b < ∞},

which obviously contains an open rectangle. So (Σ_{i=1}^n X_i², Σ_{i=1}^n X_i) is a complete sufficient statistic.
For known constants a and b, recall that

    σ̂_a = a [ (1/n) Σ_{i=1}^n X_i² − X̄² ]^{1/2}

and

    σ̂_b = b (1/n) Σ_{i=1}^n |X_i − X̄|

are UEs of σ.
Since σ̂_a is unbiased and a function of the complete sufficient statistic, it is the unique UMVUE of σ.

Interestingly, note that it must be true that

    E(σ̂_b | X̄, Σ_{i=1}^n X_i²) = σ̂_a.
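The notes do not give the constants a and b. A minimal Monte Carlo sketch, assuming the standard normal-theory choices a = √(n/2) Γ((n−1)/2)/Γ(n/2) and b = √(nπ/(2(n−1))) (which make the two estimators unbiased for a normal sample), checks the unbiasedness claim and the variance ordering:

```python
import numpy as np
from math import gamma, sqrt, pi

# Monte Carlo check that sigma_hat_a and sigma_hat_b are unbiased for sigma,
# and that sigma_hat_a (the UMVUE) has the smaller variance.
# The constants a and b below are assumed; they are not given in the notes.
rng = np.random.default_rng(1)
n, mu, sigma, reps = 10, 2.0, 3.0, 200_000

a = sqrt(n / 2) * gamma((n - 1) / 2) / gamma(n / 2)  # assumed constant
b = sqrt(n * pi / (2 * (n - 1)))                     # assumed constant

X = rng.normal(mu, sigma, size=(reps, n))
xbar = X.mean(axis=1)
sig_a = a * np.sqrt((X**2).mean(axis=1) - xbar**2)   # a [ (1/n) sum X_i^2 - Xbar^2 ]^(1/2)
sig_b = b * np.abs(X - xbar[:, None]).mean(axis=1)   # b (1/n) sum |X_i - Xbar|

print(sig_a.mean(), sig_b.mean())  # both close to sigma = 3
print(sig_a.var() < sig_b.var())   # the UMVUE has the smaller variance
```

Since σ̂_a is the UMVUE, its sampling variance comes out below that of σ̂_b, even though both center on σ.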
Example 26. Let X_1, …, X_n be i.i.d. Poisson(λ), and define

    τ(λ) = P(X_1 = 3) = e^{−λ} λ³ / 3!.

Find an UMVUE for τ(λ). Let

    U(X_1) = 1 if X_1 = 3, and 0 otherwise.

We have

    E_λ[U(X_1)] = P(X_1 = 3) = τ(λ),

and so U(X_1) is a UE of τ(λ).
We know that T = Σ_{i=1}^n X_i is a complete, sufficient statistic. Why?

We'll now use Lehmann-Scheffé to find the UMVUE. We need to compute E(U(X_1)|T), and so we need the conditional distribution of X_1 given T = t.

    P(U(X_1) = 1 | T = t) = P(U(X_1) = 1, T = t) / P(T = t)

We know that the sum of k i.i.d. Poisson(λ) random variables has a Poisson(kλ) distribution.

    P(U(X_1) = 1, T = t) = P(X_1 = 3, X_2 + ⋯ + X_n = t − 3)
        = 0 for t ≤ 2,
        = τ(λ) · e^{−(n−1)λ} [(n−1)λ]^{t−3} / (t−3)! for t ≥ 3.
This leads to

    P(U(X_1) = 1 | T = t) = 0 for t = 0, 1, 2,
                          = C(t, 3) (1/n)³ (1 − 1/n)^{t−3} for t = 3, 4, …,

where C(t, 3) = t!/(3!(t−3)!) is the binomial coefficient. Obviously, E(U(X_1)|T = t) = P(U(X_1) = 1 | T = t), and so we've found the UMVUE. Define

    φ(T) = C(T, 3) (1/n)³ (1 − 1/n)^{T−3},

where C(T, 3) = 0 for T < 3. Then φ(T) is a UE of τ(λ), and by the Lehmann-Scheffé Theorem, φ(T) is the unique UMVUE.
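A Monte Carlo sketch of this example (the values of n and λ are assumptions for illustration) checks that φ(T) is unbiased for τ(λ) and improves on the crude estimator U(X_1):

```python
import numpy as np
from math import comb, exp, factorial

# Monte Carlo check of Example 26: phi(T) = C(T,3) (1/n)^3 (1 - 1/n)^(T-3)
# should be unbiased for tau(lambda) = P(X_1 = 3), with smaller variance
# than the crude unbiased estimator U(X_1) = 1{X_1 = 3}.
rng = np.random.default_rng(2)
n, lam, reps = 10, 2.0, 200_000          # assumed values for illustration
tau = exp(-lam) * lam**3 / factorial(3)  # tau(lambda) = e^(-lam) lam^3 / 3!

X = rng.poisson(lam, size=(reps, n))
T = X.sum(axis=1)
U = (X[:, 0] == 3).astype(float)
phi = np.array([comb(int(t), 3) * (1 / n)**3 * (1 - 1 / n)**(t - 3)
                if t >= 3 else 0.0
                for t in T])

print(U.mean(), phi.mean())  # both close to tau
print(phi.var() < U.var())   # Rao-Blackwellization reduced the variance
```

Both averages land near τ(λ), and φ(T) shows the strictly smaller variance guaranteed by the Rao-Blackwell step in the proof above.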