
POINT ESTIMATION

Dr. Romesh Thanuja

(UOM) CM2111 1 / 13
OUTLINE

1 INTRODUCTION

2 METHOD OF FINDING ESTIMATORS
    Maximum Likelihood Estimator
    Method of Moments

3 METHODS OF EVALUATING ESTIMATORS
    Mean Squared Error
    Best Unbiased Estimators
    Sufficiency and Unbiasedness


POINT ESTIMATION - INTRODUCTION

In one common estimation scenario, we take a random sample from the distribution to elicit some
information about the unknown parameter θ. That is, we repeat the experiment n independent
times, observe the sample X1, X2, ..., Xn, and try to estimate the value of θ using the
observations x1, x2, ..., xn. The function of X1, X2, ..., Xn used to estimate θ—say, the statistic
u(X1, X2, ..., Xn)—is called an estimator of θ. We want the computed estimate
u(x1, x2, ..., xn) to usually be close to θ. Since we are estimating a single member θ of the
parameter space Ω, such an estimator is often called a point estimator.
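As a minimal sketch of this setup (Python, with a normal population whose mean plays the role of θ; both choices are illustrative, not from the slides), the sample mean is perhaps the most familiar point estimator:

```python
import random

def sample_mean(xs):
    """The statistic u(x1, ..., xn): here, the sample mean as a point estimator."""
    return sum(xs) / len(xs)

# Illustrative setup: theta is the mean of a normal population;
# we observe n = 1000 independent draws.
theta = 5.0
rng = random.Random(0)
observations = [rng.gauss(theta, 2.0) for _ in range(1000)]

estimate = sample_mean(observations)  # the computed estimate u(x1, ..., xn)
```

Repeating the experiment with fresh samples would give slightly different estimates, each typically close to θ.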


MAXIMUM LIKELIHOOD ESTIMATION (MLE)

Let X1, X2, ..., Xn be a random sample from a distribution; their joint pdf or pmf is denoted by
f(x; θ) or f(x1, x2, ..., xn; θ). If X1, X2, ..., Xn are independent, then the joint pdf or pmf
can be written as

f(x; θ) = f(x1; θ) f(x2; θ) · · · f(xn; θ).

Maximum likelihood estimation is a method of estimating parameters: the parameter values are
chosen to maximize the likelihood that the process produced the data that was actually observed.

The joint pdf above, when considered as a function of the unknown parameter θ instead of the
sample x, is known as the likelihood function:

L(θ) = f(x; θ) = f(x1; θ) f(x2; θ) · · · f(xn; θ).

Using the methods of calculus, we find the values of θ that maximize L(θ).
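As a worked illustration (assuming Python and an Exponential(λ) model, neither of which appears in the slides), calculus gives the rate MLE in closed form: setting d/dλ log L = n/λ − Σxi = 0 yields λ̂ = n/Σxi. The sketch below checks numerically that this value maximizes the log-likelihood:

```python
import math
import random

def log_likelihood(lam, xs):
    """log L(lambda) = n*log(lambda) - lambda*sum(x_i) for an Exponential(lambda) sample."""
    return len(xs) * math.log(lam) - lam * sum(xs)

rng = random.Random(1)
xs = [rng.expovariate(2.0) for _ in range(500)]  # data from true rate lambda = 2

# Closed form from d/dlambda log L = n/lambda - sum(x_i) = 0:
lambda_mle = len(xs) / sum(xs)

# The log-likelihood at the MLE beats nearby candidate values.
candidates = [0.5, 1.0, 1.5, 2.5, 3.0]
assert all(log_likelihood(lambda_mle, xs) >= log_likelihood(c, xs) for c in candidates)
```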


METHOD OF MOMENTS (MoM)


Let X1, X2, ..., Xn be a sample from a population with pdf or pmf f(x|θ1, θ2, ..., θk). Method of
moments estimators are found by equating the first k sample moments to the corresponding k
population moments, and solving the resulting system of simultaneous equations. More precisely,
define

m1 = (1/n) Σ Xi,    μ1′ = E(X),
m2 = (1/n) Σ Xi²,   μ2′ = E(X²),
...
mk = (1/n) Σ Xi^k,  μk′ = E(X^k).

The population moment μj′ will typically be a function of θ1, θ2, ..., θk, say μj′(θ1, ..., θk). The
method of moments estimator (θ̃1, ..., θ̃k) of (θ1, ..., θk) is obtained by solving the following
system of equations for (θ1, ..., θk) in terms of (m1, ..., mk):

m1 = μ1′(θ1, ..., θk),
...
mk = μk′(θ1, ..., θk).
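A standard worked case (Python and the Normal(μ, σ²) population with k = 2 are illustrative choices): the first two population moments are μ1′ = μ and μ2′ = σ² + μ², so solving m1 = μ and m2 = σ² + μ² gives μ̃ = m1 and σ̃² = m2 − m1²:

```python
import random

def mom_normal(xs):
    """Solve m1 = mu and m2 = sigma^2 + mu^2 for (mu, sigma^2)."""
    n = len(xs)
    m1 = sum(xs) / n                  # first sample moment
    m2 = sum(x * x for x in xs) / n   # second sample moment
    return m1, m2 - m1 ** 2

rng = random.Random(2)
xs = [rng.gauss(3.0, 1.5) for _ in range(2000)]  # true mu = 3, sigma^2 = 2.25

mu_tilde, sigma2_tilde = mom_normal(xs)
```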

METHODS OF EVALUATING ESTIMATORS

In this section, we will consider some basic criteria for evaluating estimators.

MEAN SQUARED ERROR


The mean squared error (MSE) of an estimator W of a parameter θ is the function of θ defined by
Eθ(W − θ)².

The MSE can be expressed as follows:

Eθ(W − θ)² = Varθ W + (Eθ W − θ)² = Varθ W + (Biasθ W)².

BIAS OF AN ESTIMATOR
The bias of a point estimator W of a parameter θ is the difference between the expected value of
W and θ; that is, Biasθ W = Eθ W − θ. An estimator whose bias is identically equal to 0 is called
unbiased and satisfies Eθ W = θ for all θ.
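The decomposition Eθ(W − θ)² = Varθ W + (Biasθ W)² can be verified by simulation. The sketch below (Python; the biased variance estimator W = (1/n)Σ(xi − x̄)² for a normal population is an illustrative choice) estimates all three quantities by Monte Carlo:

```python
import random

def biased_var(xs):
    """W = (1/n) * sum (x_i - xbar)^2, a biased estimator of sigma^2."""
    xbar = sum(xs) / len(xs)
    return sum((x - xbar) ** 2 for x in xs) / len(xs)

rng = random.Random(3)
theta = 4.0  # true sigma^2 for a Normal(0, 2^2) population
n, reps = 10, 20000

ws = []
for _ in range(reps):
    xs = [rng.gauss(0.0, 2.0) for _ in range(n)]
    ws.append(biased_var(xs))

mean_w = sum(ws) / reps
mse = sum((w - theta) ** 2 for w in ws) / reps    # Monte Carlo E(W - theta)^2
var = sum((w - mean_w) ** 2 for w in ws) / reps   # Monte Carlo Var W
bias_sq = (mean_w - theta) ** 2                   # Monte Carlo (Bias W)^2
# For these empirical moments the identity MSE = Var + Bias^2 holds exactly.
```

Here E(W) = σ²(n − 1)/n, so the estimator's bias of −σ²/n shows up as a strictly positive bias² term, and the Monte Carlo MSE exceeds the Monte Carlo variance by exactly that amount.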


BEST UNBIASED ESTIMATORS

UNIFORM MINIMUM VARIANCE UNBIASED ESTIMATOR (UMVUE)


An estimator W* is a best unbiased estimator of τ(θ) if it satisfies Eθ W* = τ(θ) for all θ and, for
any other estimator W with Eθ W = τ(θ), we have Varθ W* ≤ Varθ W for all θ. W* is also called a
uniform minimum variance unbiased estimator (UMVUE) of τ(θ).

CRAMÉR-RAO INEQUALITY
Let X1, X2, ..., Xn be a sample with pdf f(x|θ), and let W(X) = W(X1, ..., Xn) be any estimator
satisfying

d/dθ Eθ W(X) = ∫_χ (∂/∂θ)[W(x) f(x|θ)] dx

and

Varθ W(X) < ∞.

Then

Varθ(W(X)) ≥ (d/dθ Eθ W(X))² / Eθ[(∂/∂θ log f(X|θ))²].
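A quick numerical check (Python; the Normal(θ, σ²) example is an illustrative choice): for an i.i.d. normal sample the score is Σ(xi − θ)/σ², so the information term is Eθ[(∂/∂θ log f(X|θ))²] = n/σ² and the bound for an unbiased estimator of θ is σ²/n, which the sample mean X̄ attains:

```python
import random

# For X1,...,Xn i.i.d. Normal(theta, sigma^2), the Cramer-Rao bound for an
# unbiased estimator of theta is sigma^2 / n.
theta, sigma, n = 1.0, 2.0, 25
bound = sigma ** 2 / n  # = 0.16

rng = random.Random(4)
reps = 20000
xbars = [sum(rng.gauss(theta, sigma) for _ in range(n)) / n for _ in range(reps)]

avg = sum(xbars) / reps
var_xbar = sum((x - avg) ** 2 for x in xbars) / reps
# var_xbar should sit close to the bound: X_bar attains the CRLB here.
```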


SUFFICIENCY AND UNBIASEDNESS

RECALL
If X and Y are any two random variables, then provided the expectations exist, we have

E(X) = E[E(X|Y)],

Var(X) = Var[E(X|Y)] + E[Var(X|Y)].

RAO-BLACKWELL THEOREM
Let W be any unbiased estimator of τ(θ), and let T be a sufficient statistic for θ. Define
ϕ(T) = E(W|T). Then Eθ ϕ(T) = τ(θ) and Varθ ϕ(T) ≤ Varθ W for all θ; that is, ϕ(T) is a
uniformly better unbiased estimator of τ(θ).
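The theorem can be seen in a simulation (Python; the Bernoulli(p) setup is an illustrative choice): with W = X1 unbiased for p and T = ΣXi sufficient, conditioning gives ϕ(T) = E(X1|T) = T/n, whose variance p(1 − p)/n is far below Var W = p(1 − p):

```python
import random

p, n, reps = 0.3, 20, 50000
rng = random.Random(5)

w_vals, phi_vals = [], []
for _ in range(reps):
    xs = [1 if rng.random() < p else 0 for _ in range(n)]
    w_vals.append(xs[0])            # W = X1: crude unbiased estimator of p
    phi_vals.append(sum(xs) / n)    # phi(T) = E(X1 | T) = T / n

def mc_var(vs):
    m = sum(vs) / len(vs)
    return sum((v - m) ** 2 for v in vs) / len(vs)

var_w, var_phi = mc_var(w_vals), mc_var(phi_vals)
# Theory: Var W = p(1-p) = 0.21 and Var phi(T) = p(1-p)/n = 0.0105.
```

Both estimators have expectation p, but Rao-Blackwellisation via the sufficient statistic cuts the variance by a factor of n.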

