
STAT 211

Handout 6 (Chapter 6): Point Estimation


A point estimate of a parameter θ is a single number that can be regarded as the most
plausible value of θ.

Unbiased Estimator: A point estimator θ̂ = θ + error of estimation is an unbiased estimator
of θ if E(θ̂) = θ for every possible value of θ. Otherwise, it is biased and Bias = E(θ̂) − θ.

Read Example 6.2 in your textbook.
Example 1: When X is a binomial r.v. with parameters n and p, the sample proportion X/n is an
unbiased estimator of p.
To prove this, you need to show E(X/n) = p, where p̂ = X/n.
E(X/n) = E(X)/n   (using the rules of expected value)
       = np/n = p   (if X ~ Binomial(n, p), then E(X) = np; Chapter 3)
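A quick way to see this unbiasedness empirically is to simulate many binomial samples and
average the sample proportions. The following is a minimal Python sketch (NumPy assumed; the
values of n, p, and the number of replications are illustrative choices, not from the handout):

    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 20, 0.3            # illustrative parameter values (not from the handout)
    reps = 100_000            # number of simulated samples

    x = rng.binomial(n, p, size=reps)   # X ~ Binomial(n, p)
    p_hat = x / n                       # sample proportion for each replication

    # The average of p_hat over many replications should be close to p,
    # which is what E(X/n) = p says.
    print(p_hat.mean())                 # approximately 0.3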
Example 2: A sample of 15 students who had taken a calculus class yielded the following
information on the brand of calculator owned: T H C T H H C T T C C H S S S
(T: Texas Instruments, H: Hewlett Packard, C: Casio, S: Sharp).
(a) Estimate the true proportion of all such students who own a Texas Instruments calculator.
Answer = 0.2667
(b) Three out of the four Hewlett Packard calculators utilize reverse Polish logic, and only
Hewlett Packard calculators use it. Estimate the true proportion of all such students who own a
calculator that does not use reverse Polish logic.
Answer = 0.80
Example 3 (Exercise 6.8): In a random sample of 80 components of a certain type, 12 are found
to be defective.
(a) Give a point estimate of the proportion of all such components that are not defective.
Answer = 0.85
(b) Five of these components are randomly selected and connected in series to form a system.
Estimate the proportion of all such systems that work properly.
Answer = 0.4437
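Both answers follow directly from the counts quoted above; a short check in plain Python:

    # Example 3: 12 defective out of 80 components
    p_not_defective = (80 - 12) / 80          # point estimate of P(not defective)
    p_system_works = p_not_defective ** 5     # 5 independent components in series
    print(round(p_not_defective, 4))          # 0.85
    print(round(p_system_works, 4))           # 0.4437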
Example 4 (Exercise 6.12):
X: yield of the 1st type of fertilizer, with E(X) = μ₁, Var(X) = σ², and sample variance
S₁² = Σᵢ(xᵢ − x̄)² / (n₁ − 1).
Y: yield of the 2nd type of fertilizer, with E(Y) = μ₂, Var(Y) = σ², and sample variance
S₂² = Σᵢ(yᵢ − ȳ)² / (n₂ − 1).
Show that

    σ̂² = [(n₁ − 1)S₁² + (n₂ − 1)S₂²] / (n₁ + n₂ − 2)

is an unbiased estimator for σ². It means that you need to show

    E{[(n₁ − 1)S₁² + (n₂ − 1)S₂²] / (n₁ + n₂ − 2)} = σ².

E{[(n₁ − 1)S₁² + (n₂ − 1)S₂²] / (n₁ + n₂ − 2)}
    = [(n₁ − 1)/(n₁ + n₂ − 2)] E(S₁²) + [(n₂ − 1)/(n₁ + n₂ − 2)] E(S₂²)
    = [(n₁ − 1)/(n₁ + n₂ − 2)] σ² + [(n₂ − 1)/(n₁ + n₂ − 2)] σ²
    = σ².
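The algebra above can also be checked by simulation: generate many pairs of samples with a
common variance and average the pooled estimator. A minimal sketch (NumPy; the sample sizes,
means, and σ are illustrative assumptions, not from the exercise):

    import numpy as np

    rng = np.random.default_rng(1)
    n1, n2, sigma = 8, 12, 2.0     # illustrative choices, not from the handout
    reps = 50_000

    pooled = np.empty(reps)
    for r in range(reps):
        x = rng.normal(5.0, sigma, n1)   # 1st-type yields (the means do not matter here)
        y = rng.normal(7.0, sigma, n2)   # 2nd-type yields, same sigma
        s1, s2 = x.var(ddof=1), y.var(ddof=1)            # S1^2, S2^2
        pooled[r] = ((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)

    # The average of the pooled estimator should be close to sigma^2 = 4.
    print(pooled.mean())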
Example 5 (Exercise 6.13): Let X₁, X₂, …, Xₙ be a random sample from the pdf
f(x) = 0.5(1 + θx), −1 ≤ x ≤ 1, −1 ≤ θ ≤ 1. Show that θ̂ = 3X̄ is an unbiased estimator for θ.
It means that you need to show E(θ̂) = θ.

    E(θ̂) = E(3X̄) = 3E(X̄) = 3E(X) = 3(θ/3) = θ   (Chapter 5),

where

    E(X) = ∫₋₁¹ x · 0.5(1 + θx) dx = 0.5(x²/2 + θx³/3) |₋₁¹ = 0.5(2θ/3) = θ/3.
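If you want to check the integral numerically rather than by hand, a simple midpoint-rule
sketch works (NumPy; θ = 0.4 is an arbitrary illustrative value):

    import numpy as np

    theta = 0.4                         # any illustrative value in [-1, 1]
    dx = 1e-5
    x = np.arange(-1, 1, dx) + dx / 2   # midpoints of small subintervals of [-1, 1]
    fx = 0.5 * (1 + theta * x)          # pdf f(x) = 0.5(1 + theta*x)

    EX = np.sum(x * fx) * dx            # midpoint-rule approximation of E(X)
    print(EX, theta / 3)                # both approximately 0.13333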
The standard error: The standard error of an estimator θ̂ is its standard deviation, σ_θ̂.
The estimated standard error: The estimated standard error of an estimator is its estimated
standard deviation, σ̂_θ̂ = s_θ̂.
The minimum variance unbiased estimator (MVUE): The best point estimator. Among all estimators
of θ that are unbiased, choose the one that has minimum variance. The resulting θ̂ is the MVUE.
Example 6: If we go back to Example 1, the standard error of p̂ is

    σ_p̂ = √Var(p̂) = √[p(1 − p)/n],

where Var(p̂) = Var(X/n) = Var(X)/n²  (rules of variance, Chapter 3). Since X ~ Binomial(n, p),
Var(X) = np(1 − p), so Var(p̂) = np(1 − p)/n² = p(1 − p)/n.
Example 7: If we go back to Example 5, the standard error of θ̂ is

    σ_θ̂ = √Var(θ̂) = √Var(3X̄) = √[9·Var(X̄)] = √[9·Var(X)/n] = √[(3 − θ²)/n]   (Chapter 5),

where Var(X) = E(X²) − [E(X)]² = 1/3 − θ²/9 and

    E(X²) = ∫₋₁¹ x² · 0.5(1 + θx) dx = 0.5(x³/3 + θx⁴/4) |₋₁¹ = 0.5(2/3) = 1/3.
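Both standard-error formulas from Examples 6 and 7 are easy to evaluate once p (or θ) and n are
plugged in; a small numerical check with illustrative values (not taken from the handout):

    import math

    # Example 6: standard error of p_hat = X/n
    n_p, p = 15, 0.27                          # illustrative values
    se_p_hat = math.sqrt(p * (1 - p) / n_p)
    print(se_p_hat)

    # Example 7: standard error of theta_hat = 3 * X-bar
    n_t, theta = 25, 0.4                       # illustrative values
    se_theta_hat = math.sqrt((3 - theta ** 2) / n_t)
    print(se_theta_hat)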
Example 8: For the normal distribution, μ̂ = X̄ is the MVUE for μ. The proof is as follows.
The following graphs were generated by creating 500 samples of size 5 from N(0, 1) and
calculating the sample mean and the sample median for each sample.
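The same comparison can be reproduced in a few lines of Python: simulate 500 samples of size 5
from N(0, 1) and compare the spread of the two estimators. The point is that the sample mean
has the smaller variance, consistent with X̄ being the MVUE (NumPy sketch):

    import numpy as np

    rng = np.random.default_rng(2)
    samples = rng.normal(0.0, 1.0, size=(500, 5))   # 500 samples of size 5 from N(0, 1)

    means = samples.mean(axis=1)                    # sample mean of each sample
    medians = np.median(samples, axis=1)            # sample median of each sample

    # Both estimators are centered near 0, but the mean is less variable.
    print(means.var(), medians.var())               # the first number should be the smaller one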
Example 9 (Exercise 6.3): Normally distributed data yield the following summary statistics.

Variable     n      Mean    Median   TrMean   StDev    SE Mean
thickness    16     1.3481  1.3950   1.3507   0.3385   0.0846

Variable     Minimum   Maximum    Q1        Q3
thickness    0.8300    1.8300     1.0525    1.6425

(a) Give a point estimate of the mean value of coating thickness.
(b) Give a point estimate of the median value of coating thickness.
(c) Give a point estimate of the value that separates the largest 10% of all values in the
coating thickness distribution from the remaining 90%.
Answer = 1.78138
(d) Estimate P(X < 1.5), the proportion of all thickness values that are less than 1.5.
Answer = 0.6736
(e) Give the estimated standard error of the estimator used in (a).
Answer = 0.084625
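Parts (c)-(e) use only the reported mean, standard deviation, and n together with the normal
model; a short check with SciPy (the handout uses the rounded z-value 1.28 and a rounded
probability, so the script reproduces the answers only approximately):

    from scipy.stats import norm

    n, xbar, s = 16, 1.3481, 0.3385           # summary statistics from the output above

    # (c) value cutting off the largest 10% (90th percentile of the fitted normal)
    print(xbar + norm.ppf(0.90) * s)          # about 1.78 (handout uses z = 1.28, giving 1.78138)

    # (d) estimated P(X < 1.5)
    print(norm.cdf((1.5 - xbar) / s))         # about 0.673

    # (e) estimated standard error of the sample mean, s / sqrt(n)
    print(s / n ** 0.5)                       # 0.084625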
[Figures omitted: boxplot of thickness, and a normal probability plot for thickness with ML
estimates and 95% CI (Mean = 1.34812, StDev = 0.327781, Anderson-Darling goodness of fit = 1.074).]
METHODS OF OBTAINING POINT ESTIMATORS

1. The Method of Moments (MME)
Let X₁, X₂, …, Xₙ be a random sample from a pmf or pdf. For k = 1, 2, …, the kth population
moment of the distribution is E(X^k). The kth sample moment is (1/n)Σᵢ xᵢ^k.
Steps to follow: If you have only one unknown parameter,
(i) calculate E(X);
(ii) equate it to the first sample moment, (1/n)Σᵢ xᵢ = x̄;
(iii) solve for the unknown parameter (such as θ₁).
If you have two unknown parameters, you also need to compute the following to solve for the two
unknown parameters with two equations:
(iv) calculate E(X²);
(v) equate it to the second sample moment, (1/n)Σᵢ xᵢ²;
(vi) solve for the second unknown parameter (such as θ₂).
If you have more than two unknown parameters, repeat the same steps for k = 3, … until you can
solve for all of them.
Example 10: Show that the MME of the parameter λ in the Poisson distribution is x̄.
There is one unknown parameter.
The 1st population moment of the distribution is E(X) = λ.
The 1st sample moment is x̄.
Then x̄ is the MME for λ.
Example 11: Find the MME's for the parameters α and β in the gamma distribution.
There are two unknown parameters.
The 1st population moment of the distribution is E(X) = αβ.
The 1st sample moment is x̄. Then αβ = x̄, but this does not by itself solve for either unknown
parameter, so we need to continue the steps.
The 2nd population moment of the distribution is E(X²) = α(α + 1)β².
The 2nd sample moment is (1/n)Σᵢ xᵢ². Then α(α + 1)β² = (1/n)Σᵢ xᵢ².
Since we have two unknown parameters and two equations, we can solve for the unknown parameters.
The MME's for α and β are

    α̂ = x̄² / [(1/n)Σᵢ(xᵢ − x̄)²]   and   β̂ = [(1/n)Σᵢ(xᵢ − x̄)²] / x̄,

respectively.
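As a sketch of how these formulas are used in practice, the following computes the moment
estimates from a data vector (NumPy; the data here are simulated from a gamma distribution
purely for illustration, with true α = 2 and β = 3):

    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.gamma(shape=2.0, scale=3.0, size=500)   # illustrative data, true alpha=2, beta=3

    xbar = x.mean()
    m2_centered = np.mean((x - xbar) ** 2)          # (1/n) * sum of (x_i - xbar)^2

    alpha_hat = xbar ** 2 / m2_centered             # MME for alpha
    beta_hat = m2_centered / xbar                   # MME for beta
    print(alpha_hat, beta_hat)                      # should be near 2 and 3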
Example 12: Find the MME's for the parameters μ and σ² in the normal distribution.
There are two unknown parameters.
The 1st population moment of the distribution is E(X) = μ.
The 1st sample moment is x̄. Then μ = x̄, but we still need to solve for the second unknown
parameter, so we need to continue the steps.
The 2nd population moment of the distribution is E(X²) = μ² + σ².
The 2nd sample moment is (1/n)Σᵢ xᵢ². Then μ² + σ² = (1/n)Σᵢ xᵢ², which can be solved for the
second unknown parameter.
The MME's for μ and σ² are x̄ and (1/n)Σᵢ(xᵢ − x̄)², respectively.
2. The Method of Maximum Likelihood (MLE)
The likelihood function is the joint pmf or pdf of the X's, regarded as a function of the
unknown parameter values once the x's are observed. The maximum likelihood estimates are the
values that maximize the likelihood function.
Steps to follow:
(i) Determine the likelihood function.
(ii) Take the natural logarithm of the likelihood function.
(iii) Take the first derivative with respect to each unknown parameter and equate it to zero
(if you have m unknown parameters, you will have m equations as a result of the derivatives).
(iv) Solve for the unknown parameters.
(v) Check that the solution really maximizes your function by looking at the second derivative.
A small numerical sketch of these steps is given below.
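The sketch below applies the steps numerically: it maximizes a log-likelihood by minimizing its
negative with SciPy. It uses exponential data, whose MLE has the closed form 1/x̄ (see Example
14(b)), so the numerical answer can be checked. The data, the true parameter, and the bounds are
all illustrative assumptions:

    import numpy as np
    from scipy.optimize import minimize_scalar

    rng = np.random.default_rng(4)
    x = rng.exponential(scale=2.0, size=200)   # illustrative data; true lambda = 1/2

    def neg_log_lik(lam):
        # Exponential log-likelihood: n*ln(lambda) - lambda * sum(x_i); return its negative
        return -(len(x) * np.log(lam) - lam * x.sum())

    res = minimize_scalar(neg_log_lik, bounds=(1e-6, 10.0), method="bounded")
    print(res.x, 1 / x.mean())   # numerical MLE vs. closed-form MLE 1/x-bar; they agree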
Example 13: Show that the MLE of the parameter λ in the Poisson distribution is x̄.
There is one unknown parameter.

L = likelihood = p(x₁, x₂, …, xₙ) = p(x₁)p(x₂)…p(xₙ)   (by independence)
  = [e^(−λ) λ^x₁ / x₁!] [e^(−λ) λ^x₂ / x₂!] … [e^(−λ) λ^xₙ / xₙ!]
  = e^(−nλ) λ^(Σᵢ xᵢ) / Πᵢ xᵢ!

ln(L) = −nλ + (Σᵢ xᵢ) ln(λ) − Σᵢ ln(xᵢ!)

d ln(L)/dλ = −n + (Σᵢ xᵢ)/λ = 0, which gives λ = x̄.

d² ln(L)/dλ² = −(Σᵢ xᵢ)/λ² < 0, so this is a maximum and the MLE of λ is λ̂ = x̄.
The Invariance Principle: Let θ̂₁, θ̂₂, …, θ̂ₘ be the MLE's of the parameters θ₁, θ₂, …, θₘ. Then
the MLE of any function h(θ₁, θ₂, …, θₘ) of these parameters is the function h(θ̂₁, θ̂₂, …, θ̂ₘ)
of the MLE's.
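For instance, for Poisson data the MLE of λ is x̄ (Example 13), so by the invariance principle
the MLE of P(X = 0) = e^(−λ) is e^(−x̄). A short check on simulated data (NumPy; the λ value is
an illustrative assumption):

    import numpy as np

    rng = np.random.default_rng(5)
    x = rng.poisson(lam=2.5, size=1000)     # illustrative Poisson data

    lam_hat = x.mean()                      # MLE of lambda
    p0_hat = np.exp(-lam_hat)               # MLE of P(X = 0) by the invariance principle
    print(lam_hat, p0_hat, np.mean(x == 0)) # compare with the observed proportion of zeros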
Example 14:
(a) Let X₁, …, Xₙ be a random sample of normally distributed random variables with mean μ and
standard deviation σ.
The method of moments estimates of μ and σ² are x̄ and Σᵢ(xᵢ − x̄)²/n = (n − 1)s²/n, respectively.
The maximum likelihood estimates of μ and σ² are also x̄ and (n − 1)s²/n, respectively.
(b) Let X₁, …, Xₙ be a random sample of exponentially distributed random variables with
parameter λ.
The method of moments estimate and the maximum likelihood estimate of λ are both 1/x̄.
(c) Let X₁, …, Xₙ be a random sample of binomially distributed random variables with
parameter p.
The method of moments estimate and the maximum likelihood estimate of p are both X/n.
(d) Let X₁, …, Xₙ be a random sample of Poisson distributed random variables with parameter λ.
The method of moments estimate and the maximum likelihood estimate of λ are both x̄.
Are all the estimates above unbiased? Some yes, but others no (this will be discussed in class).
Example 15 (Exercise 6.20): A random sample of n bike helmets is selected.
X: the number among the n that are flawed; x = 0, 1, 2, …, n
p = P(flawed)
(a) What is the maximum likelihood estimate (MLE) of p if n = 20 and x = 3?
(b) Is the estimator in (a) unbiased?
(c) What is the maximum likelihood estimate of (1 − p)⁵, the probability that none of the next
five helmets examined is flawed?
(d) Instead of selecting 20 helmets to examine, suppose helmets are examined in succession
until 3 flawed ones are found. What would be different about X and p?
Example 16 (Exercise 6.22):
X: the proportion of allotted time that a randomly selected student spends working on a certain
aptitude test.
The pdf of X is f(x; θ) = (θ + 1)x^θ, 0 ≤ x ≤ 1, θ > −1.
A random sample of 10 students yields the data:
0.92, 0.79, 0.90, 0.65, 0.86, 0.47, 0.73, 0.97, 0.94, 0.77.
(a) Obtain the MME of θ and compute the estimate using the data.

    E(X) = ∫₀¹ x(θ + 1)x^θ dx = (θ + 1)/(θ + 2)

Set E(X) = x̄ and then solve for θ.
The given data yield x̄ = 0.80, so the method of moments estimate of θ is

    θ̂ = (2x̄ − 1)/(1 − x̄) = [2(0.80) − 1]/(1 − 0.80) = 3.

(b) Obtain the MLE of θ and compute the estimate using the data.
L = likelihood = Πᵢ f(xᵢ) = Πᵢ (θ + 1)xᵢ^θ = (θ + 1)ⁿ (Πᵢ xᵢ)^θ

ln(L) = n ln(θ + 1) + θ Σᵢ ln(xᵢ)

d ln(L)/dθ = n/(θ + 1) + Σᵢ ln(xᵢ) = 0, then solve for θ.

The given data yield Σᵢ ln(xᵢ) = −2.4295, so the maximum likelihood estimate of θ is

    θ̂ = −n/Σᵢ ln(xᵢ) − 1 = −10/(−2.4295) − 1 = 3.1161.
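Both estimates can be reproduced directly from the ten observations; a short numerical check
(NumPy):

    import numpy as np

    x = np.array([0.92, 0.79, 0.90, 0.65, 0.86, 0.47, 0.73, 0.97, 0.94, 0.77])
    n, xbar = len(x), x.mean()

    theta_mme = (2 * xbar - 1) / (1 - xbar)      # method of moments estimate
    theta_mle = -n / np.log(x).sum() - 1         # maximum likelihood estimate
    print(theta_mme, theta_mle)                  # 3.0 (up to rounding) and about 3.116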
Proposition: Under very general conditions on the joint distribution of the sample, when the
sample size n is large, the MLE of any parameter θ is approximately unbiased and has a variance
that is nearly as small as can be achieved by any estimator.