Chapter 1.1 MLE
CHAPTER 1: Estimation
We may have several different choices for the point estimator of a parameter. For example,
if we wish to estimate the mean of a population, we might consider the sample mean, the
sample median, or even the average of the smallest and largest observations in the sample
as a point estimator. Therefore, we need to examine their statistical properties and develop
criteria for comparing estimators.
The methods of estimation to be discussed here are maximum likelihood estimation and
estimation by the method of moments.
Some might ask: why do we need to estimate parameters? A brief answer is as follows:
The most widely accepted principle is the principle of maximum likelihood. The idea is to
choose the estimate of the parameter $\theta$ that maximizes the likelihood function

$$L(\theta \mid \mathbf{x}) = \prod_{i=1}^{n} f(x_i; \theta),$$

which depends on the sample values.
Note:
The method of maximum likelihood cannot be applied without knowledge of the
underlying distribution.
Joint pdf's and likelihood functions look the same, but the two are interpreted
differently. A joint pdf defined for a set of n random variables is a multivariate
function of those random variables. In contrast, $L$ is a function of $\theta$; it should not be
considered a function of the $x_i$'s.
There are a few situations where the equations

$$\frac{dL(\theta)}{d\theta} = 0 \quad \text{or} \quad \frac{d \ln L(\theta)}{d\theta} = 0$$

are not meaningful and do not yield a solution for $\hat{\theta}$.
In those cases, the MLE often turns out to be an order statistic, for reasons having to do
with the range of the random variable.
Definition 1.1
Let $X_1, X_2, \ldots, X_n$ be a random sample from $f(x; \theta)$, where $\theta$ is an unknown parameter.
The likelihood function, $L(\theta \mid \mathbf{x})$, is the product of the pdf $f(x; \theta)$ evaluated at the n data
points. That is,

$$L(\theta \mid \mathbf{x}) = \prod_{i=1}^{n} f(x_i; \theta)$$
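The product in Definition 1.1 can be computed directly, though in practice it is usually maximized on the log scale. A minimal sketch in Python (the Bernoulli pmf and the sample below are illustrative assumptions, not part of these notes):

```python
import math

def log_likelihood(pdf, data, theta):
    """Sum of log f(x_i; theta): the log of the product in Definition 1.1.
    Working on the log scale avoids numerical underflow for large n."""
    return sum(math.log(pdf(x, theta)) for x in data)

# Hypothetical Bernoulli(p) pmf: f(x; p) = p^x (1 - p)^(1 - x), x in {0, 1}
bern = lambda x, p: p ** x * (1 - p) ** (1 - x)
data = [1, 0, 1, 1, 0]
# The log-likelihood is larger near the sample mean 0.6 than far from it
print(log_likelihood(bern, data, 0.6) > log_likelihood(bern, data, 0.3))  # True
```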
Example 1.1
A random sample of size n, $X_1, X_2, \ldots, X_n$, is taken from a B(1, p) distribution with observed
values $x_1, x_2, \ldots, x_n$. Find the maximum likelihood estimator of p.
Solution
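As a numeric sanity check (not the formal derivation), one can maximize the Bernoulli log-likelihood over a grid and confirm that the maximizer agrees with the sample mean; the success count below is hypothetical:

```python
import math

# Bernoulli log-likelihood for s successes in n trials:
# ln L(p) = s ln p + (n - s) ln(1 - p), maximized at p = s/n.
def bern_loglik(p, s, n):
    return s * math.log(p) + (n - s) * math.log(1 - p)

s, n = 7, 10  # hypothetical sample: 7 successes in 10 trials
grid = [i / 1000 for i in range(1, 1000)]  # p in (0, 1)
p_hat = max(grid, key=lambda p: bern_loglik(p, s, n))
print(p_hat)  # 0.7, the sample mean s/n
```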
Example 1.2

$$f(x; \lambda) = \frac{e^{-\lambda} \lambda^x}{x!}, \quad x = 0, 1, 2, \ldots$$

Suppose that a random sample $x_1, x_2, \ldots, x_n$ is taken from this distribution. What is the
maximum likelihood estimate of $\lambda$?
Solution
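For the Poisson model, differentiating the log-likelihood and setting it to zero gives $\hat{\lambda} = \bar{x}$. A quick grid-based check in Python, with a hypothetical sample of counts:

```python
import math

# Poisson log-likelihood up to a constant: sum(x_i) ln(lam) - n*lam
# (the -sum ln(x_i!) term does not involve lam, so it can be dropped).
def pois_loglik(lam, data):
    return sum(data) * math.log(lam) - len(data) * lam

data = [2, 3, 0, 5, 1, 4]  # hypothetical counts
grid = [i / 1000 for i in range(1, 20001)]
lam_hat = max(grid, key=lambda l: pois_loglik(l, data))
print(lam_hat, sum(data) / len(data))  # both equal 2.5, the sample mean
```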
Example 1.3
It is known that a sample of 12, 11.2, 13.5, 12.3, 13.8, 11.9 comes from a population with
probability function

$$f(x; \theta) = \begin{cases} \dfrac{\theta}{x^{\theta+1}}, & x \ge 1 \\[4pt] 0, & \text{otherwise} \end{cases}$$

Use the method of maximum likelihood to estimate $\theta$.
Solution
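Assuming the density is $f(x; \theta) = \theta / x^{\theta+1}$ for $x \ge 1$, setting $d \ln L / d\theta = 0$ yields the closed form $\hat{\theta} = n / \sum \ln x_i$. Evaluating it on the sample from Example 1.3:

```python
import math

# With f(x; theta) = theta / x^(theta + 1) for x >= 1, the log-likelihood is
# n ln(theta) - (theta + 1) sum(ln x_i); setting its derivative to zero gives
# theta_hat = n / sum(ln x_i).
data = [12, 11.2, 13.5, 12.3, 13.8, 11.9]  # the sample from Example 1.3
theta_hat = len(data) / sum(math.log(x) for x in data)
print(round(theta_hat, 3))
```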
Example 1.4
Based on the random sample $Y_1 = 6.3$, $Y_2 = 1.8$, $Y_3 = 14.2$, and $Y_4 = 7.6$, use the method of
maximum likelihood to estimate the parameter $\theta$ in the uniform probability density function

$$f(y; \theta) = \frac{1}{\theta}, \quad 0 \le y \le \theta$$
Solution
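This is one of the order-statistic cases noted earlier: the derivative equations give no solution, because $L(\theta) = \theta^{-n}$ is strictly decreasing for $\theta \ge \max y_i$ and zero otherwise, so the maximizer is the largest observation. A one-line check:

```python
# L(theta) = theta^(-n) whenever theta >= max(y_i), and 0 otherwise,
# so L is strictly decreasing on its support: the MLE is the sample maximum.
data = [6.3, 1.8, 14.2, 7.6]  # the sample from Example 1.4
theta_hat = max(data)
print(theta_hat)  # 14.2
```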
Example 1.5
$$f(x; \theta) = \frac{2x}{\theta^2}, \quad 0 \le x \le \theta$$

Find an expression for $\hat{\theta}$, the maximum likelihood estimator for $\theta$.
Solution
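Assuming the density is $f(x; \theta) = 2x/\theta^2$ on $0 \le x \le \theta$, the likelihood is $(2^n \prod x_i) / \theta^{2n}$ for $\theta \ge \max x_i$ and zero otherwise, so the MLE is again the sample maximum. A grid search with a hypothetical sample illustrates this:

```python
# For f(x; theta) = 2x/theta^2 on 0 <= x <= theta, the likelihood
# L(theta) = (2^n prod x_i) / theta^(2n) holds when theta >= max(x_i) and is
# 0 otherwise; L decreases in theta, so the MLE is the sample maximum.
def likelihood(theta, data):
    if theta < max(data):
        return 0.0
    prod = 1.0
    for x in data:
        prod *= 2 * x / theta ** 2
    return prod

data = [0.4, 0.9, 0.7]  # hypothetical sample
grid = [i / 1000 for i in range(1, 3000)]
theta_hat = max(grid, key=lambda t: likelihood(t, data))
print(theta_hat)  # 0.9, the sample maximum
```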
Note: Finding MLEs When More Than One Parameter is Unknown
Finding MLEs for the $\theta_i$'s requires the solution of a set of k simultaneous equations. If k = 2, for
example, we would need to solve the system

$$\frac{\partial \ln L(\theta_1, \theta_2)}{\partial \theta_1} = 0, \qquad \frac{\partial \ln L(\theta_1, \theta_2)}{\partial \theta_2} = 0$$
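For the normal model $N(\mu, \sigma^2)$, solving the two simultaneous equations gives the familiar closed forms $\hat{\mu} = \bar{x}$ and $\hat{\sigma}^2 = \frac{1}{n} \sum (x_i - \bar{x})^2$. A short sketch with a hypothetical sample:

```python
# Solving the two simultaneous equations for N(mu, sigma^2) gives closed forms:
# mu_hat = sample mean, sigma2_hat = (1/n) sum (x_i - mu_hat)^2.
data = [4.1, 5.3, 6.0, 5.5, 4.6]  # hypothetical sample
n = len(data)
mu_hat = sum(data) / n
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n
print(mu_hat, sigma2_hat)  # 5.1 and 0.452
```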
Example 1.6
Solution
Theorem 1.1
Let $\hat{\theta} = \hat{\theta}(x)$ be the MLE of $\theta$ on the basis of the observed values $x_1, x_2, \ldots, x_n$ of a random
sample. Then for a function g, the MLE of $g(\theta)$ is $\hat{\theta}^*(x) = g(\hat{\theta}(x))$.
Hence, from Example 1.1, since the MLE of p is $\bar{x}$, according to Theorem 1.1 the MLE of
$p(1-p)$ is $\bar{x}(1-\bar{x})$.
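A small numeric illustration of this invariance property, with a hypothetical 0/1 sample:

```python
# Invariance in action: with the Bernoulli sample mean as the MLE of p,
# Theorem 1.1 says the MLE of g(p) = p(1 - p) is g applied to x-bar.
data = [1, 0, 1, 1, 0, 1, 0, 1]  # hypothetical 0/1 sample
x_bar = sum(data) / len(data)    # MLE of p
mle_var = x_bar * (1 - x_bar)    # MLE of p(1 - p), by invariance
print(x_bar, mle_var)  # 0.625 and 0.234375
```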