Ex 1
1. * Show that the following distributions belong to the 1-parameter exponential family and
determine the functions corresponding to A(θ), B(x), C(θ) and D(x), in the notation
of the lecture notes. Which of the families (a), (b), (c) and (d) is complete?
(a) X ∼ Poisson(θ). Here,
P (X = x|θ) = θ^x e^{−θ}/x!,   x = 0, 1, . . . ,   θ > 0.
(b) X ∼ Binomial(n, θ). Here,
P (X = x|θ) = [n!/(x!(n − x)!)] θ^x (1 − θ)^{n−x},   x = 0, 1, . . . , n,   θ ∈ [0, 1].
(c) fθ(x) = (θ/x^2) exp(−θ/x),   x > 0,   θ > 0.
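[Illustration, not part of the exercise: assuming the lecture-notes form p(x|θ) = exp{A(θ)B(x) + C(θ) + D(x)} (your notes may order or name the four functions differently), case (a) can be rewritten as
P (X = x|θ) = exp{x log θ − θ − log x!},
so one possible choice is A(θ) = log θ, B(x) = x, C(θ) = −θ and D(x) = − log x!.]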
2. * For each of the four distributions (a), (b), (c), (d) stated in Question 1, calculate the
Fisher information for a single observation.
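A rough way to check a Fisher information calculation is to compare it with the Monte Carlo variance of the score; the sketch below (illustration only, not part of the exercise) does this for case (a), where the analytic value 1/θ is a standard result. The seed and the value θ = 2.5 are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)
    theta = 2.5
    x = rng.poisson(theta, size=1_000_000)

    # Score of a single Poisson observation:
    # d/dθ [x log θ − θ − log x!] = x/θ − 1.
    score = x / theta - 1.0

    print("Monte Carlo Var of score:", score.var())  # ≈ 0.4
    print("Analytic value 1/θ:      ", 1.0 / theta)  # 0.4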
3. * Let X1 , . . . , Xn denote an IID sample. For each of the distributions (a)-(d) stated in
Question 1, use the Neyman Factorisation Theorem to determine a sufficient statistic.
In which of these cases (if any) is the sufficient statistic also minimal sufficient?
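[Illustration, not part of the exercise: for case (a) the joint pmf factorises as
∏_{i=1}^{n} θ^{x_i} e^{−θ}/x_i! = {θ^{Σ x_i} e^{−nθ}} × {1/∏_{i=1}^{n} x_i!} = g(Σ x_i; θ) h(x),
so T = Σ_{i=1}^{n} X_i is sufficient for θ by the Factorisation Theorem.]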
Xi ∼ N (ωi µ, 1), i = 1, . . . , n,
pX (x|θ) = x^{θ−1} (1 − x)^3 / B(θ, 4),   0 < x < 1   (θ > 0),
where B(·, ·) is the Beta function. Write down the likelihood for the sample, and use
the Factorisation Theorem to find a sufficient statistic for θ.
7. Suppose that X1 , X2 , . . . , Xn are i.i.d. random variables each with probability density
function
pX (x|θ) = 3θ^3/(θ + x)^4,   x > 0;   pX (x|θ) = 0 otherwise.
(a) Show that the above distribution is not a member of the exponential family.
(b) Find the Fisher Information about θ contained in a sample of size n.
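For part (b), a Monte Carlo check of a candidate answer is sketched below (illustration only; the seed and the value θ = 1.7 are arbitrary): the per-observation information should agree with both the variance of the score and minus the mean second derivative of the log-density. Samples are drawn by inverting the distribution function F(x) = 1 − θ^3/(θ + x)^3.

    import numpy as np

    rng = np.random.default_rng(1)
    theta = 1.7
    u = rng.uniform(size=1_000_000)

    # Inverse-CDF sampling: solve u = 1 − θ^3/(θ + x)^3 for x.
    x = theta * ((1.0 - u) ** (-1.0 / 3.0) - 1.0)

    # log p(x|θ) = log 3 + 3 log θ − 4 log(θ + x)
    score = 3.0 / theta - 4.0 / (theta + x)               # ∂ log p / ∂θ
    second = -3.0 / theta**2 + 4.0 / (theta + x) ** 2     # ∂² log p / ∂θ²

    print("Var of score:          ", score.var())
    print("−E[second derivative]: ", -second.mean())      # the two should agree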
8. Let X = (X1 , . . . , Xn ) denote a random sample from the discrete distribution pX (x|θ)
where x ranges over the integers. Prove that, under regularity conditions,
E[U (X)] = 0
and
Var[U (X)] = −E[∂²ℓ/∂θ²],
where ℓ(θ, X) is the log-likelihood based on the sample X1 , . . . , Xn and U (X) = ∂ℓ(θ)/∂θ is the score statistic.
Hint: a similar theorem was proved in the continuous case in the lectures. In the discrete
case, try replacing the integrals by suitable sums.
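[Sketch of the first identity for a single observation, assuming differentiation and summation over x can be interchanged (this is the regularity condition being used):
E[∂ log p(X|θ)/∂θ] = Σ_x {∂ log p(x|θ)/∂θ} p(x|θ) = Σ_x ∂p(x|θ)/∂θ = (∂/∂θ) Σ_x p(x|θ) = (∂/∂θ) 1 = 0,
since {∂ log p/∂θ} p = ∂p/∂θ; the statement for the whole sample follows because the score of an IID sample is the sum of the individual scores.]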
9. Consider an IID sample of size n from the distribution with probability mass function
Calculate the Fisher information in a sample of size n. Find a sufficient statistic for the
sample. Is this statistic minimal sufficient?
Hint: for the final part, you may appeal to a theorem stated in the lecture notes.
However, you should establish that the assumptions of the theorem are satisfied.
10. Let X1 , · · · , Xn be i.i.d. random variables with common probability mass function
pX (x|N ) = 1/N,   x = 1, 2, . . . , N.
Show that T = max{X1 , · · · , Xn } is sufficient for the family of joint probability mass
functions {pX (x|N ) : N ≥ 1} of (X1 , · · · , Xn ).
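[Illustration with a different family, not part of the exercise: for X1 , . . . , Xn IID Uniform(0, θ) the joint density can be written θ^{−n} 1{max_i x_i ≤ θ} 1{min_i x_i ≥ 0}, so the Factorisation Theorem with the indicator absorbed into g(t, θ) gives max_i X_i as a sufficient statistic; the same indicator device applies in the discrete setting of Question 10.]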
(2 Σ_{i=1}^{n} Xi) / (n(n + 1)).
12. * Let X1 , . . . , Xn denote a random sample from each of the distributions with the
following probability density functions,
In each case find (i) a suitable moment estimator of θ and (ii) the maximum likelihood
estimator of θ.