Lectures Series 8a - Point Estimators

This document introduces statistical inference, focusing on methods for estimating population parameters such as mean, variance, and proportion using point estimators like Method of Moments (MM) and Maximum Likelihood (ML) estimators. It explains the importance of sample statistics in making inferences about populations and outlines the process of deriving estimators based on sample moments. Additionally, it provides examples to illustrate the application of these methods in real-world scenarios.


Introduction - Statistical Inference | Appropriate Statistics for Estimation of Population Mean, Variance and Proportion etc. | Methods of Finding Point Estimators

Outline

1 Introduction - Statistical Inference
2 Appropriate Statistics for Estimation of Population Mean, Variance and Proportion etc.
3 Methods of Finding Point Estimators
  Method of Moments (MM) Estimators

Some Point Estimators (Chapter 7 of Montgomery and Runger's book)

What this lecture series is all about

In this lecture series, you will be introduced to the two most common point estimators of population parameters: MM estimators and ML estimators.

Unit Objectives

At the end of this lecture series, you should be able to:
appreciate fully that sample statistics are used to estimate population parameters;
state suitable point estimators for such population parameters as the mean, variance, proportion, etc.;

derive method-of-moments estimators of population parameters for a population whose probability distribution is given;
derive maximum likelihood estimators of population parameters for a population whose probability distribution is given.

Introduction - What is Statistical Inference?

The importance of obtaining a "good" sample is that statistical inference about a population is usually made on the basis of a sample drawn from the population.

Making conclusions on the basis of a sample is the subject area called Statistical Inference.

Statistical Inference comprises two branches: Estimation and Hypothesis Testing. Estimation, in turn, has two subsections: Point Estimation and Interval Estimation.

Preamble to the Issues Surrounding Statistical Inference

Statistical Inference arises as follows:

Let X be a random variable representing some characteristic of a population. Let the probability density or mass function of X, f_X(x; θ), have an unknown parametric vector θ. The form of the density f_X(x; θ) is usually assumed to be known.

Let x1, ..., xn be realizations of the random sample X1, ..., Xn, so that X1, ..., Xn are independent and identically distributed (iid) random variables.

It is on the basis of the sample realizations that we make statistical inference (in particular, statistical estimation) about the parametric vector θ = (θ1, ..., θp)^T.

Example 1
Suppose X1, ..., Xn is a random sample of heights of people to be drawn. Let X be a random variable representing the height of a randomly selected individual, following a normal distribution with unknown mean µ and unknown variance σ², i.e.

f_X(x; µ, σ²) = (1/(σ√(2π))) e^(−(1/2)((x − µ)/σ)²).

Then the form of the density is obviously known to be a normal distribution. The population parameters for which statistical inference has to be made are θ1 = µ and θ2 = σ², so that θ = (θ1, θ2)^T.

If the heights of n = 10 randomly selected people gave the following realisations (in cm):
160, 157, 188, 201, 188, 173, 170, 187, 165, 169
then one can calculate estimates of the parameters on the basis of the sample data.
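The estimates for Example 1 can be computed with Python's standard library; the following is a sketch (variable names are illustrative):

```python
import statistics

# Heights (in cm) of the n = 10 sampled people from Example 1
heights = [160, 157, 188, 201, 188, 173, 170, 187, 165, 169]

# Point estimate of the population mean mu: the sample mean
mu_hat = statistics.mean(heights)

# Unbiased point estimate of the population variance sigma^2 (divides by n - 1)
var_hat = statistics.variance(heights)

print(mu_hat)             # 175.8
print(round(var_hat, 2))  # 207.29
```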

Example 2
Let X1, X2, ..., Xn be a random sample representing the number of telephone calls received by a secretary per eight-hour working day. If X is believed to be modeled by a Poisson distribution with unknown parameter λ, i.e.

f_X(x) = P(X = x) = e^(−λ) λ^x / x!,

then, obviously, the form of the distribution of X is known. However, inference has to be drawn about λ, which is unknown.

If the numbers of telephone calls in a sample of 12 one-hour intervals gave the following realisations:
16, 15, 18, 20, 8, 3, 25, 6, 8, 45, 7, 6
then one can calculate an estimate of the parameter λ on the basis of the sample data.
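Since E[X] = λ for a Poisson random variable, the sample mean is a natural estimate of λ. A sketch for the data in Example 2:

```python
import statistics

# Call counts from the 12 observed intervals in Example 2
calls = [16, 15, 18, 20, 8, 3, 25, 6, 8, 45, 7, 6]

# For a Poisson model, E[X] = lambda, so the sample mean estimates lambda
lambda_hat = statistics.mean(calls)
print(lambda_hat)  # 14.75
```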

Appropriate Statistics for Estimation of Population Mean, Variance and Proportion etc.

Statistical inference about a population parameter is made on the basis of appropriate statistics which have certain desirable properties.
Statistical Inference on the Population Mean

Statistical inference about the mean is usually done on the basis of the sample mean,

X̄ = (1/n) Σ_{i=1}^n Xi.    (1)

Remark: The sample mean is an unbiased estimator of the population mean.

Statistical Inference on the Population Variance

Statistical inference on the population variance, σ², is usually done on the basis of the unbiased sample variance,

S² = t(X1, ..., Xn) = (1/(n−1)) Σ_{i=1}^n (Xi − X̄)².    (2)
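A quick simulation can illustrate the remark that S², with its n − 1 divisor, is unbiased for σ²; the settings below (normal population, sample size 5, number of repetitions) are illustrative, not from the lecture:

```python
import random
import statistics

# Average S^2 over many repeated samples; it should land near sigma^2,
# illustrating unbiasedness of the n - 1 divisor. Settings are illustrative.
random.seed(0)
mu, sigma, n, reps = 0.0, 2.0, 5, 20000

avg_s2 = statistics.mean(
    statistics.variance([random.gauss(mu, sigma) for _ in range(n)])
    for _ in range(reps)
)
print(avg_s2)  # close to sigma^2 = 4
```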

Inference on Population Proportion

Statistical inference about the population proportion θ of elements with a certain characteristic in a population is based on a sample proportion.

A sample of size n (usually large) is drawn from the population. The n elements are then inspected one after another, and the number (X) of elements with the characteristic out of the n sample elements is determined. The sample proportion is

θ̂ = X / n.    (3)

The sample X1, X2, ..., Xn are Bernoulli rv's such that

Xi = 0 if the i-th element does not have the characteristic,
Xi = 1 if the i-th element has the characteristic,

i.e. the random sample X1, X2, ..., Xn are Bernoulli rv's whose sum is a binomially distributed rv.
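A minimal sketch of the sample proportion in (3); the indicator data below are made up for illustration:

```python
# Bernoulli indicators X_i: 1 if the i-th inspected element has the
# characteristic, 0 otherwise (hypothetical data, not from the lecture)
sample = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]

x = sum(sample)    # number of elements with the characteristic
n = len(sample)    # sample size
theta_hat = x / n  # sample proportion, equation (3)
print(theta_hat)   # 0.6
```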

Methods of Finding Point Estimators

For any population parameter θ, one can find numerous point estimators for it. There are various methodologies for coming up with point estimators of population parameters.

In this course, we discuss two methods of finding point estimators, namely:

1 Method of Moments; and
2 Maximum Likelihood Method.

Other estimators include Bayesian estimators and least squares estimators (popularly used in Regression Analysis).

Estimator and Estimate - what is the difference?

When we have a random sample X1, ..., Xn, then the function t(X1, ..., Xn) is an estimator.

If x1, ..., xn are the actual sample realizations of a drawn sample, then t(x1, ..., xn) is an estimate of τ(θ).

An estimator is itself a random variable whilst an estimate is a fixed value.

Example 3
Let x1, ..., x5 be the realized sample values of random variables X1, ..., X5, where the random variables are drawn from a population with mean µ. Then

X̄ = (1/5) Σ_{i=1}^5 Xi

is an estimator of µ, while

µ̂ = x̄ = (1/5) Σ_{i=1}^5 xi

is an estimate of µ.
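The distinction can be made concrete in code: the estimator is a rule (a function of the sample), while the estimate is the fixed number that rule returns for one realized sample. The data below are hypothetical:

```python
import statistics

def t(sample):
    """Estimator of the population mean: the rule 'take the sample mean'."""
    return statistics.mean(sample)

# One realized sample x_1, ..., x_5 (made-up values)
x_realized = [3, 2, 4, 3, 3]

# Applying the estimator to the realization yields an estimate: a fixed value
mu_hat = t(x_realized)
print(mu_hat)
```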

Method of Moments (MM) Estimators

The first and perhaps the most straightforward method that we discuss is the Method of Moments (MM).

Definition 1
Let f_X(x; θ) be the density of a random variable X. The r-th moment of a random variable X, denoted µ'_r, is given by

µ'_r = E[X^r],    (4)

and the k-th moment of a random variable X about its mean µ, denoted µ_k, is given by

µ_k = E[(X − µ)^k].    (5)

Remark: The k-th moment of a random variable X with density f_X(x; θ) is a function of the parameter θ.

Example 4
Consider a random sample X1, ..., Xn obtained from an exponential distribution with unknown parameter θ, i.e. Xi ∼ Expo(θ). Then the first moment (r = 1) is µ'_1 = 1/θ, and the second moment (r = 2) is

µ'_2 = var(X) + [E(X)]² = 1/θ² + 1/θ² = 2/θ².

Example 5
Consider a random variable X ∼ N(θ1, θ2). Then the first moment (r = 1) is µ'_1 = θ1, and the second moment (r = 2) is

µ'_2 = var(X) + [E(X)]² = θ2 + θ1², etc.

Definition 2 (Sample Moments)
Let X1, ..., Xn be a sample of size n and

M'_j = (1/n) Σ_{i=1}^n Xi^j.    (6)

Then M'_j is called the j-th sample moment of the random sample (about the origin).

X̄ is the first sample moment about the origin, (1/n) Σ_{i=1}^n Xi² is the second sample moment about the origin, etc.

Remark
Method-of-moments estimators of a p-dimensional parameter θ can be found using the first p sample moments of X; using a random sample X1, ..., Xn, the method-of-moments estimators are given by

µ̂'_r = M'_r = (1/n) Σ_{i=1}^n Xi^r,  r = 1, ..., p,    (7)

i.e. the first p sample moments are taken as the estimators of the population moments.

Example 6
Let X1, ..., Xn be a random sample obtained from a uniform distribution over the interval [θ − 2, θ + 3]. Derive a method-of-moments estimator of the parameter θ. Hence, estimate the parameter θ using the method-of-moments estimator for the data:
25.66, 27.33, 25.89, 26.35, 26.82, 23.65, 25.05, 25.98, 24.27

Solution:
The population has one parameter, θ. It can be easily shown that E[X] = θ + 1/2. Now, using the first sample moment about the origin as the estimator for the first population moment about the origin, we have

θ̂ + 1/2 = X̄  ⇒  θ̂ = X̄ − 1/2.

Thus, the method-of-moments estimate for θ is θ̂ = x̄ − 1/2 = 25.67 − 1/2 = 25.17.
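The estimate in Example 6 can be checked numerically; a sketch:

```python
import statistics

# Data from Example 6; for Uniform[theta - 2, theta + 3], E[X] = theta + 1/2,
# so the MM estimator is theta_hat = X_bar - 1/2
data = [25.66, 27.33, 25.89, 26.35, 26.82, 23.65, 25.05, 25.98, 24.27]

theta_hat = statistics.mean(data) - 0.5
print(round(theta_hat, 2))  # 25.17
```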

Remark

In the last example, since E[θ̂] = θ, we say that the method-of-moments estimator for θ in Example 6 is an unbiased estimator of θ.

Example 7
Let X1, ..., Xn be a random sample obtained from the binomial distribution with parameters (m, θ), where m is known and θ is unknown. Find a method-of-moments (MM) estimator of θ.

Solution:
Here, p = 1 and E[X] = mθ. Thus,

mθ̂ = (1/n) Σ_{i=1}^n Xi = X̄  ⇒  θ̂ = (1/(mn)) Σ_{i=1}^n Xi = X̄ / m.
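A sketch of the Example 7 estimator θ̂ = X̄/m; the counts and the value m = 20 below are made up for illustration:

```python
import statistics

# Hypothetical data: numbers of successes in m = 20 trials per observation
m = 20
xs = [7, 9, 8, 11, 6, 10, 9, 12]

# MM estimator of theta for Binomial(m, theta): X_bar / m
theta_hat = statistics.mean(xs) / m
print(theta_hat)  # 0.45
```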

Example 8
Let X1, ..., Xn be a random sample from N(µ, σ²). Find method-of-moments estimators of µ and σ².

Solution to the last example:

Here p = 2 and θ = (µ, σ²). E[X] = µ and var(X) = σ² = E[X²] − µ², so E[X²] = σ² + µ².

Thus, µ̂ = X̄ = (1/n) Σ_{i=1}^n Xi.

Since σ² + µ² = E[X²],

σ̂² + µ̂² = (1/n) Σ_{i=1}^n Xi²

⇒ σ̂² = (1/n) Σ_{i=1}^n Xi² − X̄² = (1/n) Σ_{i=1}^n (Xi − X̄)² = ((n−1)/n) · (1/(n−1)) Σ_{i=1}^n (Xi − X̄)² = ((n−1)/n) S².
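In Python's standard library, the two divisors correspond to statistics.pvariance (divide by n, the MM estimator σ̂²) and statistics.variance (divide by n − 1, the unbiased S²); a sketch with made-up data:

```python
import math
import statistics

# Hypothetical sample; the MM variance estimator divides by n (pvariance)
# and equals (n - 1)/n times the unbiased S^2 (variance)
data = [2.0, 4.0, 6.0, 8.0]
n = len(data)

mu_hat = statistics.mean(data)          # MM estimator of mu: the sample mean
sigma2_mm = statistics.pvariance(data)  # divides by n
s2 = statistics.variance(data)          # divides by n - 1

# Verify the identity sigma_hat^2 = ((n - 1)/n) * S^2 from Example 8
assert math.isclose(sigma2_mm, (n - 1) / n * s2)
print(mu_hat, sigma2_mm)  # 5.0 5.0
```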

Example 9
Let X1, X2, ..., Xn be a random sample from a gamma distribution with parameters λ and r (where r is known), that is,

f_X(x) = (λ^r / Γ(r)) x^(r−1) e^(−λx) I_(0,∞)(x).    (8)

Find a method-of-moments estimator of λ.

Solution:

E[X] = ∫_{−∞}^{∞} x f_X(x) dx = (λ^r / Γ(r)) ∫_0^∞ x^r e^(−λx) dx.

Substituting v = λx gives

E[X] = (1/(Γ(r) λ)) ∫_0^∞ v^r e^(−v) dv = Γ(r + 1) / (Γ(r) λ) = r/λ.

Thus, the MM estimator of λ solves r/λ̂ = X̄, i.e. λ̂ = r/X̄.
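A sketch of the Example 9 estimator λ̂ = r/X̄; the sample values and the known shape r = 2 below are hypothetical:

```python
import statistics

# Hypothetical gamma sample with known shape r; since E[X] = r / lambda,
# the MM estimator is lambda_hat = r / X_bar
r = 2
xs = [1.2, 0.8, 2.5, 1.5, 2.0]

lambda_hat = r / statistics.mean(xs)
print(round(lambda_hat, 2))  # 1.25
```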
