inference assignment 3
Test Set 3
2. Let $X_1, X_2, \ldots, X_n$ be a random sample from a discrete population with mass function
$$P(X=-1)=\frac{1-\theta}{2}, \quad P(X=0)=\frac{1}{2}, \quad P(X=1)=\frac{\theta}{2}, \qquad 0<\theta<1.$$
Find the FRC lower bound for the variance of an unbiased estimator of $\theta$. Show that the variance of the unbiased estimator $\bar{X}+\frac{1}{2}$ is greater than or equal to this bound.
3. Let $X_1, X_2, \ldots, X_n$ be a random sample from a population with density function $f(x)=\theta(1+x)^{-(1+\theta)}$, $x>0$, $\theta>0$. Find the FRC lower bound for the variance of an unbiased estimator of $1/\theta$. Hence derive a UMVUE for $1/\theta$.
8. Let $X_1, X_2, \ldots, X_n$ be a random sample from a double exponential population with the density $f(x)=\frac{1}{2}e^{-|x-\theta|}$, $x\in\mathbb{R}$, $\theta\in\mathbb{R}$. Find a minimal sufficient statistic.
9. Let $X_1, X_2, \ldots, X_n$ be a random sample from a discrete uniform population with pmf $p(x)=\frac{1}{\theta}$, $x=1,\ldots,\theta$, where $\theta$ is a positive integer. Find a minimal sufficient statistic.
10. Let $X_1, X_2, \ldots, X_n$ be a random sample from a geometric population with pmf $f(x)=(1-p)^{x-1}p$, $x=1,2,\ldots$, $0<p<1$. Find a minimal sufficient statistic.
11. Let $X$ have a $N(0,\sigma^2)$ distribution. Show that $X$ is not complete, but $X^2$ is complete.
12. Let $X_1, X_2, \ldots, X_n$ be a random sample from an exponential population with the density $f(x)=\lambda e^{-\lambda x}$, $x>0$, $\lambda>0$. Show that $Y=\sum_{i=1}^{n}X_i$ is complete.
13. Let $X_1, X_2, \ldots, X_n$ be a random sample from an exponential population with the density $f(x)=e^{\mu-x}$, $x>\mu$, $\mu\in\mathbb{R}$. Show that $Y=X_{(1)}$ is complete.
So the FRC lower bound for the variance of an unbiased estimator of $\theta$ is $\frac{\theta^2}{n}$. Further, $T=\frac{1}{2n}\sum_{i=1}^{n}X_i^2$ is unbiased for $\theta$ and $\mathrm{Var}(T)=\frac{\theta^2}{n}$. Hence $T$ is UMVUE for $\theta$.
2. $E(X)=\theta-\frac{1}{2}$. So $T=\bar{X}+\frac{1}{2}$ is unbiased for $\theta$, and $\mathrm{Var}(T)=\frac{1+4\theta-4\theta^2}{4n}$. Next,
$$\frac{\partial}{\partial\theta}\log f(x\mid\theta)=\begin{cases}(\theta-1)^{-1}, & x=-1,\\ 0, & x=0,\\ \theta^{-1}, & x=1,\end{cases}
\qquad\text{so we get}\qquad E\left(\frac{\partial\log f}{\partial\theta}\right)^2=\frac{1}{2\theta(1-\theta)}.$$
The FRC lower bound for the variance of an unbiased estimator of $\theta$ is therefore $\frac{2\theta(1-\theta)}{n}$. It can be seen that
$$\mathrm{Var}(T)-\frac{2\theta(1-\theta)}{n}=\frac{(1-2\theta)^2}{4n}\ge 0.$$
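As a quick numerical check of this solution (not part of the original exercise), the Python sketch below simulates $T=\bar{X}+\frac{1}{2}$ and compares its Monte Carlo variance with the exact variance and the FRC bound $2\theta(1-\theta)/n$. The function names, seed, and the specific values of $\theta$ and $n$ are illustrative choices, not from the source.

```python
import random
import statistics

def sample_x(theta, rng):
    """Draw one X with P(X=-1)=(1-theta)/2, P(X=0)=1/2, P(X=1)=theta/2."""
    u = rng.random()
    if u < (1 - theta) / 2:
        return -1
    elif u < (1 - theta) / 2 + 0.5:
        return 0
    return 1

def estimate_var_T(theta, n, reps, rng):
    """Monte Carlo variance of T = Xbar + 1/2 over `reps` samples of size n."""
    vals = [sum(sample_x(theta, rng) for _ in range(n)) / n + 0.5
            for _ in range(reps)]
    return statistics.pvariance(vals)

rng = random.Random(42)
theta, n = 0.3, 20
frc_bound = 2 * theta * (1 - theta) / n               # 2θ(1-θ)/n
exact_var = (1 + 4 * theta - 4 * theta**2) / (4 * n)  # (1+4θ-4θ²)/(4n)
mc_var = estimate_var_T(theta, n, reps=20000, rng=rng)

print(f"FRC bound: {frc_bound:.5f}")
print(f"Exact Var(T): {exact_var:.5f}, Monte Carlo: {mc_var:.5f}")
print(f"Gap, i.e. (1-2θ)²/(4n): {exact_var - frc_bound:.5f}")
```

The gap between $\mathrm{Var}(T)$ and the bound is exactly $(1-2\theta)^2/(4n)$, vanishing only at $\theta=\frac{1}{2}$.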
3. We have $\frac{\partial}{\partial\theta}\log f(x\mid\theta)=\frac{1}{\theta}-\log(1+x)$. Further, $E\{\log(1+X)\}=\theta^{-1}$ and $E\{\log(1+X)\}^2=2\theta^{-2}$. So $E\left(\frac{\partial\log f}{\partial\theta}\right)^2=\theta^{-2}$, and the FRC lower bound for the variance of an unbiased estimator of $\theta^{-1}$ is $\frac{1}{n\theta^2}$. Now $T=\frac{1}{n}\sum_{i=1}^{n}\log(1+X_i)$ is unbiased for $\theta^{-1}$ and $\mathrm{Var}(T)=\frac{1}{n\theta^2}$, which attains the bound. Hence $T$ is UMVUE for $\theta^{-1}$.
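This solution can likewise be checked by simulation (again, not part of the original exercise). The sketch below draws from $f(x)=\theta(1+x)^{-(1+\theta)}$ by inverse-CDF sampling, using $F(x)=1-(1+x)^{-\theta}$, so $X=U^{-1/\theta}-1$ for uniform $U$; the chosen $\theta$, $n$, and seed are illustrative.

```python
import math
import random
import statistics

def sample_x(theta, rng):
    """Inverse-CDF draw from f(x) = theta*(1+x)^(-(1+theta)), x > 0:
    F(x) = 1 - (1+x)^(-theta), so X = U^(-1/theta) - 1, U ~ Uniform(0,1]."""
    u = 1.0 - rng.random()          # in (0, 1], avoids u = 0
    return u ** (-1.0 / theta) - 1.0

def T(xs):
    """Unbiased estimator of 1/theta: sample mean of log(1 + X_i)."""
    return sum(math.log1p(x) for x in xs) / len(xs)

rng = random.Random(7)
theta, n, reps = 2.0, 25, 20000
ts = [T([sample_x(theta, rng) for _ in range(n)]) for _ in range(reps)]

mc_mean, mc_var = statistics.fmean(ts), statistics.pvariance(ts)
frc_bound = 1.0 / (n * theta**2)    # FRC lower bound for estimating 1/theta
print(f"E(T) ~ {mc_mean:.4f} (target 1/theta = {1/theta:.4f})")
print(f"Var(T) ~ {mc_var:.6f} (FRC bound {frc_bound:.6f})")
```

Note that $\log(1+X)$ is exactly $\mathrm{Exp}(\theta)$, which is why $\mathrm{Var}(T)$ matches the bound rather than merely exceeding it.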
4. Using the Factorization Theorem, we get sufficient statistics in each case as below:
(i) $\prod_{i=1}^{n}X_i$, (ii) $X_{(1)}$, (iii) $\left(X_{(1)}, \prod_{i=1}^{n}X_i\right)$.
11. Since $E(X)=0$ for all $\sigma>0$ but $P(X=0)\ne 1$, the function $g(x)=x$ has zero expectation without being zero almost surely. Hence $X$ is not complete. Let $W=X^2$. The pdf of $W$ is $f_W(w)=\frac{1}{\sigma\sqrt{2\pi w}}\,e^{-w/2\sigma^2}$, $w>0$. Now
$$E_\sigma\, g(W)=0 \ \text{for all}\ \sigma>0 \;\Rightarrow\; \int_0^\infty g(w)\,w^{-1/2}e^{-w/2\sigma^2}\,dw=0 \ \text{for all}\ \sigma>0.$$
Uniqueness of the Laplace transform then gives $g(w)w^{-1/2}=0$ almost everywhere, so $g(W)=0$ a.s. and $W=X^2$ is complete.
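The derived density of $W$ can be sanity-checked numerically. The sketch below (illustrative $\sigma$, cutoff, and step counts of my own choosing) integrates the claimed $f_W$ and compares the result with the empirical distribution of $X^2$:

```python
import math
import random

def f_W(w, sigma):
    """Claimed pdf of W = X^2 for X ~ N(0, sigma^2)."""
    return math.exp(-w / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi * w))

def cdf_W(t, sigma, steps=20000):
    """Numerically integrate f_W over (0, t); the midpoint rule sidesteps
    the integrable singularity at w = 0."""
    h = t / steps
    return sum(f_W((i + 0.5) * h, sigma) for i in range(steps)) * h

rng = random.Random(0)
sigma, t, reps = 1.5, 2.0, 100000
empirical = sum(rng.gauss(0, sigma) ** 2 <= t for _ in range(reps)) / reps
numeric = cdf_W(t, sigma)
print(f"P(W <= {t}): numeric {numeric:.4f} vs empirical {empirical:.4f}")
```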
13. Note that the density of $Y=X_{(1)}$ is given by $f_Y(y)=n\,e^{n(\mu-y)}$, $y>\mu$, $\mu\in\mathbb{R}$. Then
$$E\,g(Y)=0 \ \text{for all}\ \mu\in\mathbb{R} \;\Rightarrow\; \int_\mu^\infty n\,g(y)\,e^{n(\mu-y)}\,dy=0 \ \text{for all}\ \mu\in\mathbb{R} \;\Rightarrow\; \int_\mu^\infty g(y)\,e^{-ny}\,dy=0 \ \text{for all}\ \mu\in\mathbb{R}.$$
Differentiating with respect to $\mu$ gives $g(\mu)e^{-n\mu}=0$ for almost every $\mu$, so $g=0$ a.e. and $Y=X_{(1)}$ is complete.
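The stated density of $Y=X_{(1)}$ implies $P(Y>t)=e^{-n(t-\mu)}$ for $t>\mu$. A brief simulation (with illustrative parameters, not from the source) agrees:

```python
import math
import random

def sample_y(n, mu, rng):
    """Y = min of n i.i.d. draws from f(x) = e^(mu - x), x > mu
    (a standard exponential shifted by mu)."""
    return mu + min(rng.expovariate(1.0) for _ in range(n))

rng = random.Random(1)
n, mu, t, reps = 5, 2.0, 2.3, 100000
empirical = sum(sample_y(n, mu, rng) > t for _ in range(reps)) / reps
exact = math.exp(-n * (t - mu))  # survival function implied by f_Y(y) = n e^{n(mu-y)}
print(f"P(Y > {t}): exact {exact:.4f} vs empirical {empirical:.4f}")
```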
14. Minimal sufficiency can be proved using the Lehmann–Scheffé theorem. To see that $(\bar{X}, S^2)$ is not complete, note that $E\left(\frac{n}{n+1}\bar{X}^2-S^2\right)=0$ for all $\theta>0$. However, $P\left(\frac{n}{n+1}\bar{X}^2=S^2\right)=0$, so this function of $(\bar{X},S^2)$ is not zero almost surely.
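The population for this problem is not restated in this excerpt; the identity above holds, for example, when sampling from $N(\theta,\theta^2)$, since then $E\bar{X}^2=\theta^2(n+1)/n$ and $E\,S^2=\theta^2$. Under that assumed setting (and with illustrative parameter values), a simulation confirms the zero expectation:

```python
import random
import statistics

def g_stat(xs):
    """g(Xbar, S^2) = n/(n+1) * Xbar^2 - S^2, with S^2 the usual
    unbiased sample variance (divisor n-1)."""
    n = len(xs)
    xbar = sum(xs) / n
    s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)
    return n / (n + 1) * xbar ** 2 - s2

rng = random.Random(3)
n, reps = 10, 100000
means = {}
for theta in (1.0, 2.5):
    # Assumed N(theta, theta^2) population, so E g = 0 for every theta.
    vals = [g_stat([rng.gauss(theta, theta) for _ in range(n)])
            for _ in range(reps)]
    means[theta] = statistics.fmean(vals)
    print(f"theta = {theta}: mean of g over {reps} samples = {means[theta]:+.4f}")
```

Since $g(\bar{X},S^2)$ is a continuous random variable, it is almost never exactly zero, which is the non-completeness argument in the solution.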