Properties of Maximum Likelihood Estimators

The following regularity conditions are assumed:

(i) The first and second order derivatives, viz., ∂ log L/∂θ and ∂² log L/∂θ², exist and are continuous functions of θ in a range R (including the true value θ₀ of the parameter) for almost all x. For every θ in R,

    |∂ log L/∂θ| < F₁(x)  and  |∂² log L/∂θ²| < F₂(x),

where F₁(x) and F₂(x) are integrable functions over (−∞, ∞).

(ii) The third order derivative ∂³ log L/∂θ³ exists such that

    |∂³ log L/∂θ³| < M(x),

where E[M(x)] < K, a positive quantity.
(iii) For every θ in R,

    E[−∂² log L/∂θ²] = ∫ (−∂² log L/∂θ²) L dx

is finite and non-zero.
(iv) The range of integration is independent of θ. But if the range of integration depends on θ, then f(x, θ) vanishes at the extremes depending on θ. This assumption is made so that differentiation under the integral sign is valid.

Theorem (Cramér-Rao Theorem). "With probability approaching unity as n → ∞, the likelihood equation ∂ log L/∂θ = 0 has a solution which converges in probability to the true value θ₀." In other words, M.L.E.'s are consistent.
Remark. M.L.E.'s are always consistent estimators but need not be unbiased. For example, in sampling from a N(μ, σ²) population [c.f. Example 15-31],

    MLE(μ) = x̄ (the sample mean), which is both an unbiased and consistent estimator of μ;

    MLE(σ²) = s² (the sample variance), which is a consistent but not unbiased estimator of σ².
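The bias claimed for s² can be made explicit with a short worked check (ours, not part of the original text; here s² denotes the divisor-n variance (1/n) Σ (xᵢ − x̄)²):

% Bias of the M.L.E. of sigma^2 in sampling from N(mu, sigma^2)
\begin{align*}
E(s^2) &= E\Big[\tfrac{1}{n}\sum_{i=1}^{n}(x_i-\bar{x})^2\Big]
        = E\Big[\tfrac{1}{n}\sum_{i=1}^{n}(x_i-\mu)^2\Big] - E\big[(\bar{x}-\mu)^2\big] \\
       &= \sigma^2 - \frac{\sigma^2}{n} = \frac{n-1}{n}\,\sigma^2 \;\neq\; \sigma^2 ,
\end{align*}
% so s^2 is biased for every finite n; since E(s^2) -> sigma^2 and
% Var(s^2) -> 0 as n -> infinity, s^2 is nevertheless consistent.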
Theorem (Huzurbazar's Theorem). Any consistent solution of the likelihood equation provides a maximum of the likelihood with probability tending to unity as the sample size (n) tends to infinity.
Theorem (Asymptotic Normality of M.L.E.'s). A consistent solution of the likelihood equation is asymptotically normally distributed about the true value θ₀. Thus, θ̂ is asymptotically N(θ₀, 1/I(θ₀)) as n → ∞.
Remark. The asymptotic variance of θ̂ is given by

    V(θ̂) = 1/I(θ₀) = 1 / E[−∂² log L/∂θ²].
Theorem. If M.L.E. exists, it is the most efficient in the class of such estimators.
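As an illustrative sketch (ours, not from the original text), the asymptotic variance formula can be evaluated for the M.L.E. of μ in sampling from N(μ, σ²) with σ² known:

% Asymptotic variance of the M.L.E. of mu when sigma^2 is known
\begin{align*}
\log L &= -\frac{n}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i-\mu)^2,
\qquad
\frac{\partial^2 \log L}{\partial \mu^2} = -\frac{n}{\sigma^2}, \\
I(\mu) &= E\Big[-\frac{\partial^2 \log L}{\partial \mu^2}\Big] = \frac{n}{\sigma^2}
\;\Longrightarrow\;
V(\hat{\mu}) = \frac{1}{I(\mu)} = \frac{\sigma^2}{n},
\end{align*}
% which coincides with the minimum variance bound, so the M.L.E. x-bar is efficient.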
Theorem. If a sufficient estimator exists, it is a function of the Maximum Likelihood Estimator.

Proof. If t = t(x₁, x₂, ..., xₙ) is a sufficient estimator of θ, then the Likelihood Function can be written as (c.f. Theorem 15-7)

    L = g(t, θ) · h(x₁, x₂, ..., xₙ | t),

where g(t, θ) is the density function of t and h(x₁, x₂, ..., xₙ | t) is the density function of the sample, given t, which is independent of θ. Hence

    log L = log g(t, θ) + log h(x₁, x₂, ..., xₙ | t)

Differentiating w.r.t. θ,

    ∂ log L/∂θ = ∂/∂θ log g(t, θ) = ψ(t, θ),

which is a function of t and θ only. The M.L.E. is given by

    ∂ log L/∂θ = 0  ⟹  ψ(t, θ) = 0
    ⟹  θ̂ = η(t) = some function of the sufficient statistic
    ⟹  t = some function of the M.L.E.
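For a concrete instance (our own illustration, not from the original text), take a random sample from a Bernoulli(θ) population, where t = Σ xᵢ is sufficient for θ:

% Factorisation and M.L.E. for a Bernoulli(theta) sample
\begin{align*}
L &= \theta^{t}(1-\theta)^{n-t}, \qquad t = \sum_{i=1}^{n} x_i \quad (\text{here } h \equiv 1),\\
\frac{\partial \log L}{\partial \theta} &= \frac{t}{\theta} - \frac{n-t}{1-\theta} = \psi(t,\theta)
\;\Longrightarrow\; \hat{\theta} = \frac{t}{n},
\end{align*}
% so the M.L.E. depends on the sample only through the sufficient statistic t,
% and conversely t = n * theta-hat is a function of the M.L.E.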
Theorem. If for a given population with p.d.f. f(x, θ), an MVB estimator T exists for θ, then the likelihood equation will have a solution equal to the estimator T.

Proof. Since T is an MVB estimator of θ, we have [c.f. (15-40)]

    ∂ log L/∂θ = (T − θ)/λ(θ)

The M.L.E. for θ is the solution of the likelihood equation

    ∂ log L/∂θ = 0  ⟹  θ̂ = T,

as required.
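As an illustration (ours, not from the original text), for a sample from N(μ, σ²) with σ² known,

% The score for mu is already in the MVB form (T - theta)/lambda(theta)
\[
\frac{\partial \log L}{\partial \mu}
 = \frac{1}{\sigma^2}\sum_{i=1}^{n}(x_i-\mu)
 = \frac{\bar{x}-\mu}{\sigma^2/n},
\]
% with T = x-bar and lambda(mu) = sigma^2/n; equating the score to zero
% gives the M.L.E. mu-hat = x-bar = T, as the theorem asserts.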
Theorem (Invariance Property of M.L.E.). If T is the M.L.E. of θ and ψ(θ) is a one-to-one function of θ, then ψ(T) is the M.L.E. of ψ(θ).
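For example (our own illustration, not from the original text), anticipating the Poisson example worked out next, where the M.L.E. of λ turns out to be x̄:

% Invariance applied to psi(lambda) = P(X = 0) for a Poisson(lambda) sample
\[
\psi(\lambda) = e^{-\lambda} = P(X = 0)
\quad\text{is one-to-one for } \lambda > 0,
\quad\text{so the M.L.E. of } P(X = 0) \text{ is } \widehat{\psi} = e^{-\bar{x}} .
\]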
Example. Find the maximum likelihood estimate for the parameter λ of a Poisson distribution on the basis of a sample of size n.
Solution. Let x₁, x₂, ..., xₙ be a random sample of size n from a Poisson distribution with parameter λ, so that

    P(X = x) = e^(−λ) λ^x / x!,   x = 0, 1, 2, ...

The likelihood function is

    L(λ) = ∏ᵢ e^(−λ) λ^(xᵢ) / xᵢ! = e^(−nλ) λ^(x₁ + x₂ + ... + xₙ) / (x₁! x₂! ... xₙ!)

Taking logarithms,

    log L = −nλ + (Σ xᵢ) log λ − Σ log(xᵢ!)

Differentiating w.r.t. λ and equating to zero,

    d log L/dλ = −n + (Σ xᵢ)/λ = 0   ⟹   λ̂ = (1/n) Σ xᵢ = (x₁ + x₂ + ... + xₙ)/n = x̄

Since d² log L/dλ² = −(Σ xᵢ)/λ² < 0, this solution gives a maximum. Hence the maximum likelihood estimate of λ is the sample mean x̄.
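As a quick numerical check (our own sketch, not part of the original solution; all names and values are illustrative), the closed-form estimate λ̂ = x̄ can be compared with a direct grid search of the Poisson log-likelihood:

import numpy as np

rng = np.random.default_rng(0)
x = rng.poisson(lam=3.0, size=500)            # simulated Poisson sample, n = 500

# Poisson log-likelihood up to an additive constant: the -sum(log x_i!) term
# does not involve lambda, so it is dropped for the maximisation.
def log_likelihood(lam, x):
    return -len(x) * lam + x.sum() * np.log(lam)

mle_closed_form = x.mean()                    # lambda-hat = x-bar from the derivation

grid = np.linspace(0.1, 10.0, 10_000)         # crude grid search over lambda > 0
mle_grid = grid[np.argmax(log_likelihood(grid, x))]

print(mle_closed_form, mle_grid)              # the two values agree to grid precision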