
Statistical Inference

Test Set 3

1. Let X₁, X₂, …, Xₙ be a random sample from a population with density function
   f(x) = (x/θ) exp(−x²/(2θ)), x > 0, θ > 0. Find the FRC (Fréchet-Cramér-Rao) lower
   bound for the variance of an unbiased estimator of θ. Hence derive a UMVUE for θ.

2. Let X₁, X₂, …, Xₙ be a random sample from a discrete population with mass function
   P(X = −1) = (1−θ)/2, P(X = 0) = 1/2, P(X = 1) = θ/2, 0 < θ < 1.
   Find the FRC lower bound for the variance of an unbiased estimator of θ. Show that
   the variance of the unbiased estimator X̄ + 1/2 is greater than or equal to this bound.
3. Let X₁, X₂, …, Xₙ be a random sample from a population with density function
   f(x) = θ(1 + x)^(−(1+θ)), x > 0, θ > 0. Find the FRC lower bound for the variance of
   an unbiased estimator of 1/θ. Hence derive a UMVUE for 1/θ.

4. Let X₁, X₂, …, Xₙ be a random sample from a Pareto population with density
   f_X(x) = βα^β / x^(β+1), x > α, α > 0, β > 2. Find a sufficient statistic when
   (i) α is known, (ii) when β is known, and (iii) when both α, β are unknown.

5. Let X₁, X₂, …, Xₙ be a random sample from a Gamma(p, λ) population. Find a sufficient
   statistic when (i) p is known, (ii) when λ is known, and (iii) when both p, λ are
   unknown.

6. Let X₁, X₂, …, Xₙ be a random sample from a Beta(λ, µ) population. Find a sufficient
   statistic when (i) µ is known, (ii) when λ is known, and (iii) when both λ, µ are
   unknown.

7. Let X₁, X₂, …, Xₙ be a random sample from a continuous population with density
   function f(x) = θ/(1 + x)^(1+θ), x > 0, θ > 0. Find a minimal sufficient statistic.

8. Let X₁, X₂, …, Xₙ be a random sample from a double exponential population with the
   density f(x) = (1/2) e^(−|x−θ|), x ∈ ℝ, θ ∈ ℝ. Find a minimal sufficient statistic.
9. Let X₁, X₂, …, Xₙ be a random sample from a discrete uniform population with pmf
   p(x) = 1/θ, x = 1, …, θ, where θ is a positive integer. Find a minimal sufficient
   statistic.
10. Let X₁, X₂, …, Xₙ be a random sample from a geometric population with pmf
    f(x) = (1 − p)^(x−1) p, x = 1, 2, …, 0 < p < 1. Find a minimal sufficient statistic.

11. Let X have a N(0, σ²) distribution. Show that X is not complete, but X² is complete.

12. Let X₁, X₂, …, Xₙ be a random sample from an exponential population with the density
    f(x) = λe^(−λx), x > 0, λ > 0. Show that Y = ∑ᵢ₌₁ⁿ Xᵢ is complete.

13. Let X₁, X₂, …, Xₙ be a random sample from an exponential population with the density
    f(x) = e^(µ−x), x > µ, µ ∈ ℝ. Show that Y = X₍₁₎ is complete.

14. Let X₁, X₂, …, Xₙ be a random sample from a N(θ, θ²) population. Show that (X̄, S²)
    is minimal sufficient but not complete.
Hints and Solutions
1. log f(x | θ) = log x − log θ − x²/(2θ), so ∂ log f/∂θ = X²/(2θ²) − 1/θ. Since X² has
   an exponential distribution with mean 2θ, E(X²) = 2θ and E(X⁴) = 8θ². Hence
   E[(∂ log f/∂θ)²] = E(X⁴)/(4θ⁴) − E(X²)/θ³ + 1/θ² = 8θ²/(4θ⁴) − 2θ/θ³ + 1/θ² = 1/θ².
   So the FRC lower bound for the variance of an unbiased estimator of θ is θ²/n.
   Further, T = (1/(2n)) ∑ᵢ₌₁ⁿ Xᵢ² is unbiased for θ and Var(T) = θ²/n. Hence T is the
   UMVUE for θ.
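A quick simulation sketch of this result (the values θ = 2, n = 50, and the seed are arbitrary illustrative choices, not part of the assignment): since X² is exponential with mean 2θ, we can draw X as the square root of an exponential variate and check both the unbiasedness of T and that its variance sits at the FRC bound θ²/n.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 2.0, 50, 100_000   # arbitrary illustrative values

# X^2 ~ Exponential(mean = 2*theta), so draw X as sqrt of an exponential variate
x = np.sqrt(rng.exponential(scale=2 * theta, size=(reps, n)))
t = (x ** 2).sum(axis=1) / (2 * n)  # T = (1/(2n)) * sum of X_i^2

print(t.mean(), theta)              # sample mean of T ~ theta (unbiasedness)
print(t.var(), theta**2 / n)        # sample variance of T ~ FRC bound theta^2/n
```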

2. E(X) = θ − 1/2, so T = X̄ + 1/2 is unbiased for θ, with Var(T) = (1 + 4θ − 4θ²)/(4n).
   Next, ∂ log f(x | θ)/∂θ equals (θ − 1)⁻¹ for x = −1, 0 for x = 0, and θ⁻¹ for x = 1,
   so we get E[(∂ log f/∂θ)²] = 1/(2θ(1 − θ)).
   The FRC lower bound for the variance of an unbiased estimator of θ is 2θ(1 − θ)/n.
   It can be seen that Var(T) − 2θ(1 − θ)/n = (1 − 2θ)²/(4n) ≥ 0.
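A numerical sketch of this comparison (θ = 0.3, n = 40, and the seed are arbitrary illustrative choices): simulate the three-point distribution, form T = X̄ + 1/2, and compare its variance with the bound.

```python
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.3, 40, 200_000   # arbitrary illustrative values

# three-point distribution on {-1, 0, 1} with the given probabilities
x = rng.choice([-1, 0, 1], p=[(1 - theta) / 2, 0.5, theta / 2], size=(reps, n))
t = x.mean(axis=1) + 0.5            # T = Xbar + 1/2

bound = 2 * theta * (1 - theta) / n
print(t.mean(), theta)                              # T is unbiased for theta
print(t.var(), (1 + 4*theta - 4*theta**2) / (4*n))  # Var(T), simulated vs exact
print(t.var() - bound, (1 - 2*theta)**2 / (4*n))    # gap = (1-2θ)²/(4n) >= 0
```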
3. We have ∂ log f(x | θ)/∂θ = 1/θ − log(1 + x). Further, E{log(1 + X)} = θ⁻¹ and
   E[{log(1 + X)}²] = 2θ⁻², so E[(∂ log f/∂θ)²] = 1/θ². For estimating ψ(θ) = θ⁻¹ the
   FRC lower bound is [ψ′(θ)]² / (n E[(∂ log f/∂θ)²]) = θ⁻⁴/(nθ⁻²) = 1/(nθ²).
   Now T = (1/n) ∑ᵢ₌₁ⁿ log(1 + Xᵢ) is unbiased for θ⁻¹ and Var(T) = 1/(nθ²). Hence T is
   the UMVUE for 1/θ.
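A simulation sketch (θ = 1.5, n = 60, and the seed are arbitrary illustrative choices): since F(x) = 1 − (1 + x)^(−θ), inverse-CDF sampling gives X = U^(−1/θ) − 1 for U uniform on (0, 1).

```python
import numpy as np

rng = np.random.default_rng(2)
theta, n, reps = 1.5, 60, 100_000     # arbitrary illustrative values

# inverse-CDF sampling: F(x) = 1 - (1+x)^(-theta)  =>  X = U^(-1/theta) - 1
u = rng.uniform(size=(reps, n))
x = u ** (-1 / theta) - 1
t = np.log1p(x).mean(axis=1)          # T = (1/n) * sum of log(1 + X_i)

print(t.mean(), 1 / theta)            # T is unbiased for 1/theta
print(t.var(), 1 / (n * theta**2))    # Var(T) attains the FRC bound 1/(n*theta^2)
```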
4. Using the Factorization Theorem, we get sufficient statistics in each case as below
   (case (iii) is worked out after this list):
   (i) ∏ᵢ₌₁ⁿ Xᵢ, (ii) X₍₁₎, (iii) (X₍₁₎, ∏ᵢ₌₁ⁿ Xᵢ).
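As an illustration of case (iii), the joint Pareto density separates into a factor that depends on the data only through (X₍₁₎, ∏ Xᵢ) (a worked step added here for clarity, not part of the original hints):

```latex
\prod_{i=1}^{n} f_X(x_i)
  = \frac{\beta^{\,n}\,\alpha^{\,n\beta}}{\bigl(\prod_{i=1}^{n} x_i\bigr)^{\beta+1}}\,
    \mathbf{1}\{x_{(1)} > \alpha\}
```

Fixing α (so the indicator becomes part of the data-only factor) or fixing β reduces this to cases (i) and (ii) respectively.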

5. Using the Factorization Theorem, we get sufficient statistics in each case as below:
   (i) ∑ᵢ₌₁ⁿ Xᵢ, (ii) ∏ᵢ₌₁ⁿ Xᵢ, (iii) (∑ᵢ₌₁ⁿ Xᵢ, ∏ᵢ₌₁ⁿ Xᵢ).

6. Using the Factorization Theorem, we get sufficient statistics in each case as below:
   (i) ∏ᵢ₌₁ⁿ Xᵢ, (ii) ∏ᵢ₌₁ⁿ (1 − Xᵢ), (iii) (∏ᵢ₌₁ⁿ Xᵢ, ∏ᵢ₌₁ⁿ (1 − Xᵢ)).
7. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is ∏ᵢ₌₁ⁿ (1 + Xᵢ).

8. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is the vector of
   order statistics (X₍₁₎, …, X₍ₙ₎).
9. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is the largest
   order statistic X₍ₙ₎.
10. Using the Lehmann-Scheffé Theorem, a minimal sufficient statistic is ∑ᵢ₌₁ⁿ Xᵢ.

11. E(X) = 0 for all σ > 0, but P(X = 0) = 0 ≠ 1, so X is not complete. Let W = X². The
    pdf of W is f_W(w) = (1/(σ√(2πw))) e^(−w/(2σ²)), w > 0.
    Now E_σ g(W) = 0 for all σ > 0 ⇒ ∫₀^∞ g(w) w^(−1/2) e^(−w/(2σ²)) dw = 0 for all
    σ > 0. Uniqueness of the Laplace transform implies g(w) = 0 a.e. Hence X² is
    complete.
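The Laplace-transform step can be made explicit with the substitution s = 1/(2σ²), which sweeps out all of (0, ∞) as σ ranges over (0, ∞) (a worked step added here for clarity):

```latex
\int_0^\infty g(w)\, w^{-1/2} e^{-sw}\, dw = 0 \ \ \text{for all } s > 0
\;\Longrightarrow\; g(w)\, w^{-1/2} = 0 \ \text{a.e.}
\;\Longrightarrow\; g(w) = 0 \ \text{a.e.}
```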


12. Note that Y = ∑ᵢ₌₁ⁿ Xᵢ has a Gamma(n, λ) distribution. Proceeding as in Problem 11,
    it can be proved that Y is complete.
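Concretely (a step added here for clarity), for Y ~ Gamma(n, λ) the completeness condition again reduces to a vanishing Laplace transform:

```latex
E_\lambda\, g(Y)
  = \frac{\lambda^{n}}{\Gamma(n)} \int_0^\infty g(y)\, y^{\,n-1} e^{-\lambda y}\, dy = 0
  \ \ \text{for all } \lambda > 0
\;\Longrightarrow\; g(y)\, y^{\,n-1} = 0 \ \text{a.e.}
\;\Longrightarrow\; g(y) = 0 \ \text{a.e.}
```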

13. Note that the density of Y = X₍₁₎ is given by f_Y(y) = n e^(n(µ−y)), y > µ, µ ∈ ℝ.
    Then
    E g(Y) = 0 for all µ ∈ ℝ
    ⇒ ∫_µ^∞ n g(y) e^(n(µ−y)) dy = 0 for all µ ∈ ℝ
    ⇒ ∫_µ^∞ g(y) e^(−ny) dy = 0 for all µ ∈ ℝ.
    Using Lebesgue integration theory we conclude that g(y) = 0 a.e. Hence Y is
    complete.
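The Lebesgue step can be seen by differentiating the last identity with respect to µ (a one-line justification added here; the derivative exists for a.e. µ since the integral is an absolutely continuous function of its lower limit):

```latex
0 = \frac{d}{d\mu} \int_\mu^\infty g(y)\, e^{-ny}\, dy
  = -\,g(\mu)\, e^{-n\mu} \quad \text{for a.e. } \mu \in \mathbb{R}
\;\Longrightarrow\; g(\mu) = 0 \ \text{a.e.}
```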

14. Minimal sufficiency can be proved using the Lehmann-Scheffé theorem. To see that
    (X̄, S²) is not complete, note that E[(n/(n+1)) X̄² − S²] = 0 for all θ > 0, since
    E(X̄²) = θ² + θ²/n = ((n+1)/n)θ² and E(S²) = θ².
    However, P((n/(n+1)) X̄² = S²) = 0, so (n/(n+1)) X̄² − S² is a function of (X̄, S²)
    with identically zero expectation that is not almost surely zero.
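A simulation sketch of this zero-expectation function (θ = 2, n = 10, and the seed are arbitrary illustrative choices; S² is taken with divisor n − 1 so that E(S²) = θ²):

```python
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 10, 200_000      # arbitrary illustrative values

x = rng.normal(loc=theta, scale=theta, size=(reps, n))  # N(theta, theta^2)
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)             # unbiased sample variance (divisor n-1)
g = n / (n + 1) * xbar**2 - s2         # E g = 0 for every theta

print(g.mean())                        # ~ 0: zero expectation for all theta
print(np.mean(g == 0))                 # 0.0: yet g is not almost surely zero
```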
