Statistics - Lec08 - Point Estimation - Comparing

The document outlines key concepts in point estimation, including methods for deriving estimators such as the Method of Moments and Maximum Likelihood. It discusses properties of estimators like bias, variance, mean squared error, and consistency, providing examples to illustrate these concepts. The document serves as a lecture note for a Probability and Statistics II course, focusing on the evaluation of estimators' efficiency and their statistical properties.


Probability and Statistics II

Dr. Ahmed Yahia


Engineering Mathematics and Physics Department, Faculty of Engineering, Alexandria
University
[email protected]
Point estimation - Introduction

General methods to derive the estimators:
⁻ The Method of Moments
⁻ The Method of Maximum Likelihood

Compare: Is it a good estimation?
Outline (Lec. 8)
CH3: Point estimation
Properties of estimators
⁻ Bias
⁻ Variance
⁻ Mean Squared Error
⁻ Consistency
Properties of estimators - Bias

Bias of the estimator θ̂:

b(θ̂) = E[θ̂] − θ

An estimator θ̂ is said to be an unbiased estimator of the parameter θ if b(θ̂) = 0, that is, if

E[θ̂] = θ

[Figure: probability density functions of two estimators, A and B, plotted against the true value θ; the bias is the gap between E[θ̂] and θ. Which is better?]
Properties of estimators - Bias

Example 1
If X has a binomial distribution with the parameters n and p, show that p̂ = X/n is an unbiased estimator of p.
Note that for a binomial distribution, E[X] = np.
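Example 1 can also be checked by simulation. The sketch below (not part of the lecture; n = 50, p = 0.3, and the replication count are illustrative choices) draws many binomial samples and averages p̂ = X/n, which should land close to p:

```python
import random

random.seed(0)
n, p, reps = 50, 0.3, 50_000

# Draw X ~ Binomial(n, p) by summing Bernoulli trials, then average p_hat = X/n.
total = 0.0
for _ in range(reps):
    x = sum(1 for _ in range(n) if random.random() < p)
    total += x / n

avg_p_hat = total / reps
print(avg_p_hat)  # close to p = 0.3, consistent with E[p_hat] = p
```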
Properties of estimators - Variance

Minimum Variance Unbiased Estimator (MVUE)
We would like to find the unbiased estimator that has the smallest variance.

[Figure: probability density functions of two unbiased estimators, A and B, centered at the true value θ; estimator A has the smaller variance.]

Note:
• For an unknown parameter θ, it may be acceptable that the variance of the estimator does not exceed a certain value k.
• If an unbiased estimator θ̂ has Var(θ̂) ≤ k, then θ̂ is taken as the MVUE.
Properties of estimators - MVUE

Example 3
If X₁, X₂, …, Xₙ are a random sample from a population where μ = 3β and σ² = 3β², and β̂ = X̄/3, prove that β̂ is the minimum variance unbiased estimator for β, taking the reference upper bound on the variance to be β².

E[β̂] = E[X̄/3] = (1/3) E[X̄] = (1/3) μ = (1/3)(3β) = β   “Unbiased”

Var(β̂) = Var(X̄/3) = (1/9) Var(X̄) = (1/9)(σ²/n) = (1/9)(3β²/n) = β²/(3n)

Now assume another estimator β̂₂ = X₁. Which estimator can you consider as the MVUE?
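The calculation in Example 3 can be checked numerically. As an illustration (not from the lecture), a Gamma(shape = 3, scale = β) population happens to have μ = 3β and σ² = 3β², matching the example's assumptions; β = 2 and the sample size n = 30 are arbitrary choices:

```python
import random

random.seed(1)
beta, n, reps = 2.0, 30, 20_000

# Each replication: draw a sample of size n and compute beta_hat = X_bar / 3.
estimates = []
for _ in range(reps):
    sample = [random.gammavariate(3, beta) for _ in range(n)]
    estimates.append(sum(sample) / n / 3)

mean_hat = sum(estimates) / reps
var_hat = sum((e - mean_hat) ** 2 for e in estimates) / reps
print(mean_hat)  # close to beta = 2        (unbiasedness)
print(var_hat)   # close to beta^2 / (3n) = 4/90 ≈ 0.044
```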
Properties of estimators - Mean Squared Error (MSE)

[Figure: probability density functions of two estimators, A and B, against the true value θ; A is unbiased with larger variance, B is biased with smaller variance. Which is better?]

Mean squared error:

MSE(θ̂) = E[(θ̂ − θ)²] = Var(θ̂) + bias²

For an unbiased estimator, MSE(θ̂) = Var(θ̂).

Proof: write θ̂ − θ = (θ̂ − E[θ̂]) + (E[θ̂] − θ) and expand the square; the cross term vanishes because E[θ̂ − E[θ̂]] = 0, leaving MSE(θ̂) = Var(θ̂) + b(θ̂)².
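The decomposition MSE = Var + bias² can be verified empirically. In this sketch (not from the lecture), the deliberately biased estimator θ̂ = 0.9·X̄ of a normal mean, and all constants, are illustrative assumptions:

```python
import random

random.seed(2)
mu, sigma, n, reps = 5.0, 2.0, 25, 50_000

# Each replication: a normal sample and the biased estimate 0.9 * X_bar.
ests = []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    ests.append(0.9 * sum(sample) / n)

mean_e = sum(ests) / reps
var_e = sum((e - mean_e) ** 2 for e in ests) / reps   # empirical variance
mse = sum((e - mu) ** 2 for e in ests) / reps          # empirical MSE
bias = mean_e - mu
print(mse, var_e + bias ** 2)  # the two numbers agree
```

The agreement here is exact up to rounding, because the identity holds term-by-term for the empirical moments, not just in expectation.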
Properties of estimators - Efficiency (Based on the MSE)

Two estimators θ̂₁ and θ̂₂ of the unknown parameter θ have mean squared errors MSE(θ̂₁) and MSE(θ̂₂).

The relative efficiency of θ̂₂ to θ̂₁ is

eff(θ̂₂/θ̂₁) = MSE(θ̂₁) / MSE(θ̂₂)

If eff(θ̂₂/θ̂₁) < 1, then θ̂₁ is more efficient than θ̂₂.
Properties of estimators - Efficiency (Based on the MSE)

Example 4
Suppose that θ̂₁ and θ̂₂ are estimators of the parameter θ. We know that
E[θ̂₁] = θ, Var(θ̂₁) = θ²/3 and E[θ̂₂] = θ/3, Var(θ̂₂) = θ²/9.
Which estimator is more efficient? Find the relative efficiency.

MSE₁ = Var(θ̂₁) = θ²/3   (since θ̂₁ is unbiased)

MSE₂ = θ²/9 + (θ/3 − θ)² = θ²/9 + 4θ²/9 = (5/9) θ²

eff(θ̂₂/θ̂₁) = MSE₁/MSE₂ = (θ²/3) / (5θ²/9) = 3/5 < 1, so θ̂₁ is more efficient.
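The arithmetic of Example 4 can be replayed for a concrete parameter value (θ = 3.0 here is an arbitrary choice; the efficiency ratio does not depend on it):

```python
theta = 3.0

mse1 = theta ** 2 / 3                 # unbiased, so MSE1 = Var(theta_hat1)
bias2 = theta / 3 - theta             # E[theta_hat2] - theta = -2*theta/3
mse2 = theta ** 2 / 9 + bias2 ** 2    # Var + bias^2 = (5/9) * theta^2
eff = mse1 / mse2                     # relative efficiency of theta_hat2 to theta_hat1

print(mse1, mse2, eff)  # eff = 3/5 < 1, so theta_hat1 is more efficient
```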
Properties of estimators - Consistency

The estimator θ̂ is called a consistent estimator of the parameter θ if and only if

E[θ̂] → θ as n → ∞, and
Var(θ̂) → 0 as n → ∞

[Figure: the probability density function of θ̂ concentrates around the true value θ as the sample size n grows.]
Properties of estimators - Consistency

Example 6
If X has a binomial distribution with the parameters n and p, show that p̂ = X/n is a consistent estimator of p.
Note that for a binomial distribution, E[X] = np and Var(X) = np(1 − p).
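Consistency in Example 6 can be seen empirically: the spread of p̂ = X/n shrinks as n grows, in line with Var(p̂) = p(1 − p)/n. The values p = 0.4 and the two sample sizes below are illustrative choices, not from the lecture:

```python
import random

random.seed(3)
p, reps = 0.4, 5_000

def var_of_phat(n):
    # Empirical variance of p_hat = X/n over many binomial draws.
    ests = []
    for _ in range(reps):
        x = sum(1 for _ in range(n) if random.random() < p)
        ests.append(x / n)
    m = sum(ests) / reps
    return sum((e - m) ** 2 for e in ests) / reps

v_small, v_large = var_of_phat(20), var_of_phat(500)
print(v_small, v_large)  # roughly p(1-p)/n: about 0.012 vs 0.00048
```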
Practice 1
Suppose that θ̂₁ and θ̂₂ are estimators of the parameter θ. We know that
E[θ̂₁] = θ, Var(θ̂₁) = 15/n and E[θ̂₂] = θ, Var(θ̂₂) = 7.
Which estimator is more efficient? Check the consistency of both estimators.
Practice 2
Suppose X has a normal distribution with the parameters μ and σ², and let X₁, X₂, …, Xₙ be a random sample from X. Show that the sample variance

s² = Σᵢ₌₁ⁿ (Xᵢ − X̄)² / (n − 1)

is an unbiased estimator of σ², while Σᵢ₌₁ⁿ (Xᵢ − X̄)² / n is a biased estimator for it.
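The claim in Practice 2 can be previewed by simulation (a sketch, not a proof; the standard normal population and n = 5 are illustrative assumptions). Averaging both estimators over many samples shows the (n − 1) version centering on σ² = 1 while the n version centers on (n − 1)/n · σ² = 0.8:

```python
import random

random.seed(4)
mu, sigma, n, reps = 0.0, 1.0, 5, 100_000

s2_sum = biased_sum = 0.0
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)   # sum of squared deviations
    s2_sum += ss / (n - 1)                  # unbiased estimator
    biased_sum += ss / n                    # biased estimator

s2_mean, biased_mean = s2_sum / reps, biased_sum / reps
print(s2_mean, biased_mean)  # ≈ 1.0 vs ≈ 0.8 = (n-1)/n * sigma^2
```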
