Unit 4 1lec 5

An important topic in Probability Distribution.

Uploaded by

Arun Kochar
Properties of Maximum Likelihood Estimators. We make the following assumptions, known as the Regularity Conditions:

(i) The first and second order derivatives $\partial \log L/\partial \theta$ and $\partial^2 \log L/\partial \theta^2$ exist and are continuous functions of $\theta$ in a range $R$ (including the true value $\theta_0$ of the parameter) for almost all $x$. For every $\theta$ in $R$,

$$\left|\frac{\partial \log L}{\partial \theta}\right| < F_1(x) \quad \text{and} \quad \left|\frac{\partial^2 \log L}{\partial \theta^2}\right| < F_2(x).$$

For a single observation from the density $f(x, a) = 2(a - x)/a^2$, $0 < x < a$, the likelihood equation gives

$$\frac{\partial \log L}{\partial a} = \frac{1}{a - x} - \frac{2}{a} = 0 \;\Rightarrow\; 2(a - x) - a = 0 \;\Rightarrow\; a = 2x.$$

Hence the MLE of $a$ is given by $\hat{a} = 2x$. Now

$$E(\hat{a}) = E(2X) = 2\int_0^a x\, f(x, a)\, dx = \frac{4}{a^2}\int_0^a x(a - x)\, dx = \frac{2a}{3}.$$

Since $E(\hat{a}) \neq a$, $\hat{a} = 2x$ is not an unbiased estimate of $a$.

Example 17-31. In random sampling from a normal population $N(\mu, \sigma^2)$, find the maximum likelihood estimators of:
(i) $\mu$ when $\sigma^2$ is known,
(ii) $\sigma^2$ when $\mu$ is known, and
(iii) the simultaneous estimation of $\mu$ and $\sigma^2$.

Solution. If $X \sim N(\mu, \sigma^2)$, then

$$L = \prod_{i=1}^{n} \frac{1}{\sigma\sqrt{2\pi}} \exp\left[-\frac{(x_i - \mu)^2}{2\sigma^2}\right]$$

$$\log L = -\frac{n}{2}\log(2\pi) - \frac{n}{2}\log \sigma^2 - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(x_i - \mu)^2$$

Case (i). When $\sigma^2$ is known, the likelihood equation for estimating $\mu$ is:

$$\frac{\partial \log L}{\partial \mu} = 0 \;\Rightarrow\; -\frac{1}{2\sigma^2}\sum_{i=1}^{n} 2(x_i - \mu)(-1) = 0 \;\Rightarrow\; \sum_{i=1}^{n}(x_i - \mu) = 0 \;\Rightarrow\; \hat{\mu} = \frac{1}{n}\sum_{i=1}^{n} x_i = \bar{x}.$$

Hence the M.L.E. for $\mu$ is the sample mean $\bar{x}$.

Case (ii). When $\mu$ is known, the likelihood equation for estimating $\sigma^2$ is:

$$\frac{\partial \log L}{\partial \sigma^2} = 0 \;\Rightarrow\; -\frac{n}{2\sigma^2} + \frac{1}{2\sigma^4}\sum_{i=1}^{n}(x_i - \mu)^2 = 0 \;\Rightarrow\; \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \mu)^2.$$

Case (iii). The likelihood equations for the simultaneous estimation of $\mu$ and $\sigma^2$ are $\partial \log L/\partial \mu = 0$ and $\partial \log L/\partial \sigma^2 = 0$, giving, as above,

$$\hat{\mu} = \bar{x} \quad \text{and} \quad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \bar{x})^2 = s^2,$$

the sample variance.

Note. It may be pointed out here that though $E(\hat{\mu}) = E(\bar{x}) = \mu$, we have $E(\hat{\sigma}^2) = E(s^2) \neq \sigma^2$; the maximum likelihood estimators (M.L.E.s) need not necessarily be unbiased. Another illustration is given in Example 17-32.

Remark. Since the M.L.E. is the most efficient estimator, we conclude that in sampling from a normal population, the sample mean $\bar{x}$ is the most efficient estimator of the population mean $\mu$.
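The two bias results above can be checked numerically. The sketch below (parameter values $a = 3$, $\mu = 5$, $\sigma = 2$, $n = 5$ are illustrative assumptions, not from the text) simulates many samples and compares the average of each MLE with its theoretical expectation: $E(\hat{a}) = 2a/3$ for the density $f(x, a) = 2(a - x)/a^2$, and $E(\hat{\sigma}^2) = \frac{n-1}{n}\sigma^2$ for the normal model.

```python
import random

def mle_triangular(x):
    """MLE of a from one observation of f(x, a) = 2(a - x)/a^2, 0 < x < a."""
    return 2.0 * x

def sample_triangular(a, rng):
    """Draw from f(x, a) = 2(a - x)/a^2 by inverse transform:
    F(x) = (2ax - x^2)/a^2 = u  =>  x = a * (1 - sqrt(1 - u))."""
    u = rng.random()
    return a * (1.0 - (1.0 - u) ** 0.5)

def mle_normal(sample):
    """MLEs (mu_hat, sigma2_hat) for N(mu, sigma^2): sample mean and
    the 1/n (not 1/(n-1)) sample variance."""
    n = len(sample)
    mu_hat = sum(sample) / n
    sigma2_hat = sum((x - mu_hat) ** 2 for x in sample) / n
    return mu_hat, sigma2_hat

rng = random.Random(0)

# E(a_hat) = 2a/3, so a_hat = 2X is biased: with a = 3 the mean of a_hat
# settles near 2, not 3.
a = 3.0
a_hats = [mle_triangular(sample_triangular(a, rng)) for _ in range(50000)]
print(sum(a_hats) / len(a_hats))  # close to 2a/3 = 2.0

# E(sigma2_hat) = ((n - 1)/n) * sigma^2, so the normal MLE of sigma^2 is
# also biased: with sigma^2 = 4 and n = 5 its mean settles near 3.2.
mu, sigma, n = 5.0, 2.0, 5
s2_hats = []
for _ in range(20000):
    sample = [rng.gauss(mu, sigma) for _ in range(n)]
    s2_hats.append(mle_normal(sample)[1])
print(sum(s2_hats) / len(s2_hats))  # close to 0.8 * 4 = 3.2
```

Both averages fall noticeably below the true parameter values, illustrating the Note above: MLEs need not be unbiased, although the bias of $\hat{\sigma}^2$ vanishes as $n \to \infty$.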
