
G13INF Exercises 1 2014/15

Problems marked by an asterisk * should be handed in to the module lecturer immediately after the 10.00-11.00 lecture on Friday 17 October 2014.

1. * Show that the following distributions belong to the 1-parameter exponential family and
determine the functions corresponding to A(θ), B(x), C(θ) and D(x), in the notation
of the lecture notes. Which of the families (a), (b), (c) and (d) is complete? (A worked illustration of the exponential-family rewriting, using a different density, is sketched after part (d).)
(a) X ∼ Poisson(θ). Here,

P (X = x | θ) = θ^x e^(−θ) / x!,   x = 0, 1, . . . ,   θ > 0.

(b) X ∼ Binomial(n, θ). Here,

P (X = x | θ) = [n! / (x!(n − x)!)] θ^x (1 − θ)^(n−x),   x = 0, 1, . . . , n,   θ ∈ [0, 1].

(c) X ∼ Gamma(α, β) where α is known. Here, the pdf is given by


fα,β (x) = [1/Γ(α)] β^α x^(α−1) e^(−βx),   x > 0,   α, β > 0,

where α should be treated as a known constant.


(d) In this case, X is a random variable whose distribution has pdf

fθ (x) = (θ/x²) exp(−θ/x),   x > 0,   θ > 0.
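As a worked illustration of the technique (a sketch using the Exponential(θ) density, which is deliberately not one of (a)-(d), and assuming the lecture-note convention is f(x | θ) = exp{A(θ)B(x) + C(θ) + D(x)}):

\[
f(x \mid \theta) = \theta e^{-\theta x} = \exp\{(-\theta)\,x + \log\theta\},
\qquad x > 0,\ \theta > 0,
\]

so that A(θ) = −θ, B(x) = x, C(θ) = log θ and D(x) = 0.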

2. * For each of the four distributions (a), (b), (c), (d) stated in Question 1, calculate the
Fisher information for a single observation.
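A minimal symbolic sketch of such a calculation, again using the Exponential(θ) density rather than any of (a)-(d), and the identity I(θ) = −E[∂² log f(X | θ)/∂θ²] (it assumes SymPy is available):

# A sketch, not a model answer: Fisher information for Exponential(theta),
# computed via I(theta) = -E[d^2/dtheta^2 log f(X|theta)].
import sympy as sp

x, theta = sp.symbols('x theta', positive=True)
pdf = theta * sp.exp(-theta * x)          # Exponential(theta) density
second = sp.diff(sp.log(pdf), theta, 2)   # second derivative of the log-density
fisher = -sp.integrate(second * pdf, (x, 0, sp.oo))
print(sp.simplify(fisher))                # prints 1/theta**2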

3. * Let X1 , . . . , Xn denote an IID sample. For each of the distributions (a)-(d) stated in
Question 1, use the Neyman Factorisation Theorem to determine a sufficient statistic.
In which of these cases (if any) is the sufficient statistic also minimal sufficient?
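As a sketch of how the factorisation argument typically runs (again on the Exponential(θ) density, not one of (a)-(d)): for an IID sample the joint density is

\[
\prod_{i=1}^{n} \theta e^{-\theta x_i}
  = \theta^{n} e^{-\theta \sum_{i=1}^{n} x_i} \times 1
  = g\Big(\sum_{i=1}^{n} x_i,\ \theta\Big)\, h(x_1,\dots,x_n),
\]

so T = Σ_{i=1}^n Xi is sufficient by the Factorisation Theorem.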

4. Suppose that X1 , . . . , Xn are independent random variables with

Xi ∼ N (ωi µ, 1), i = 1, . . . , n,

where ω1 , . . . , ωn are known constants which are not all zero.


(i) Using the Neyman Factorisation Theorem, find a sufficient statistic T (X̄), where
X̄ = (X1 , . . . , Xn ).
(ii) Is T minimal sufficient?
(iii) Show that T has a 1-parameter exponential family distribution and write down the
functions A, B, C and D.
5. A random sample of size n is taken from the distribution with density function

pX (x | θ) = x^(θ−1) (1 − x)³ / B(θ, 4),   0 < x < 1   (θ > 0),

where B(·, ·) is the Beta function. Write down the likelihood for the sample, and use
the Factorisation Theorem to find a sufficient statistic for θ.

6. Suppose that X1 , X2 , . . . , Xn are i.i.d. N(5, σ²).


Show that T = Σ_{i=1}^n (Xi − 5)² is sufficient for σ²,

(a) by using the Factorisation Theorem;


(b) by showing that the distribution of X1 , X2 , . . . , Xn conditional on T = t is independent of σ².

Is the sufficient statistic T complete? Give justification.
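A quick numerical sanity check for intuition (a simulation sketch only, not part of the required argument; the values of n, σ and the seed are arbitrary choices): if the Xi are N(5, σ²), then T/σ² should follow a χ² distribution with n degrees of freedom.

# Sanity-check sketch: T/sigma^2 should be chi-squared with n degrees of freedom.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, sigma = 8, 2.0
samples = rng.normal(5.0, sigma, size=(100_000, n))
T = ((samples - 5.0) ** 2).sum(axis=1)
# Kolmogorov-Smirnov test against the chi-squared(n) cdf; a large p-value
# is consistent with T/sigma^2 ~ chi-squared(n).
print(stats.kstest(T / sigma**2, stats.chi2(df=n).cdf))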

7. Suppose that X1 , X2 , . . . , Xn are i.i.d. random variables each with probability density
function
pX (x | θ) = 3θ³/(θ + x)⁴ for x > 0, and pX (x | θ) = 0 otherwise.

(a) Show that the above distribution is not a member of the exponential family.
(b) Find the Fisher Information about θ contained in a sample of size n.

8. Let X = (X1 , . . . , Xn ) denote a random sample from the discrete distribution pX (x|θ)
where x ranges over the integers. Prove that, under regularity conditions,

E[U (X)] = 0

and
Var[U (X)] = −E[∂²ℓ/∂θ²],

where ℓ(θ, X) is the log-likelihood based on the sample X1 , . . . , Xn and U (X) = ∂ℓ(θ, X)/∂θ is the score statistic.
Hint: a similar theorem was proved in the continuous case in the lectures. In the discrete
case, try replacing the integrals by suitable sums.
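As a sketch of the first identity (assuming the regularity conditions permit interchanging differentiation and summation, with the sum running over the support of the sample X):

\[
E[U(X)] = \sum_{x} \frac{\partial \log p_X(x \mid \theta)}{\partial \theta}\, p_X(x \mid \theta)
= \sum_{x} \frac{\partial p_X(x \mid \theta)}{\partial \theta}
= \frac{\partial}{\partial \theta} \sum_{x} p_X(x \mid \theta)
= \frac{\partial}{\partial \theta}\, 1 = 0.
\]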

9. Consider an IID sample of size n from the distribution with probability mass function

P (X = x | θ) = (1 − θ)θ^x,   x = 0, 1, . . . ,   0 < θ < 1.

Calculate the Fisher information in a sample of size n. Find a sufficient statistic for the
sample. Is this statistic minimal sufficient?
Hint: for the final part, you may appeal to a theorem stated in the lecture notes.
However, you should establish that the assumptions of the theorem are satisfied.

10. Let X1 , · · · , Xn be i.i.d. random variables with common probability mass function
pX (x | N ) = 1/N,   x = 1, 2, . . . , N.
Show that T = max{X1 , · · · , Xn } is sufficient for the family of joint probability mass
functions {pX (x|N ) : N ≥ 1} of (X1 , · · · , Xn ).
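One standard route (a sketch of the key step, not a full argument) is to write the joint pmf with indicator functions so that the Factorisation Theorem applies:

\[
\prod_{i=1}^{n} p_X(x_i \mid N)
  = N^{-n}\, \mathbf{1}\{\max_i x_i \le N\}\, \mathbf{1}\{\min_i x_i \ge 1\},
\]

in which the only factor depending on both the data and N does so through max_i x_i alone.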

11. Let X1 , X2 , . . . , Xn be independent with Xi ∼ Poisson(iθ), i = 1, 2, . . . , n.


(i) Write down the likelihood for θ given a sample X1 = x1 , . . . , Xn = xn .
(ii) Show that Σ_{i=1}^n Xi is a sufficient statistic for θ.
(iii) Show that an unbiased estimator of θ is given by

2 Σ_{i=1}^n Xi / (n(n + 1))   (a simulation illustrating this estimator is sketched after part (iv)).

(iv) Calculate the Fisher Information about θ contained in the sample.
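A quick empirical illustration of part (iii) (a simulation sketch, not a proof; θ, n, the replication count and the seed are arbitrary choices):

# Simulation sketch: the estimator 2*sum(X_i)/(n*(n+1)) should average
# close to theta when X_i ~ Poisson(i*theta) independently.
import numpy as np

rng = np.random.default_rng(1)
theta, n, reps = 0.7, 6, 200_000
i = np.arange(1, n + 1)
X = rng.poisson(i * theta, size=(reps, n))   # row r holds one sample X_1, ..., X_n
est = 2 * X.sum(axis=1) / (n * (n + 1))
print(est.mean())                            # should be close to theta = 0.7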

12. * Let X1 , . . . , Xn denote a random sample from each of the distributions with the
following probability density functions,

(a) pX (x | θ) = θx^(θ−1),   0 < x < 1,   θ > 0.


(b) pX (x | θ) = exp(θ − x),   θ ≤ x < ∞,   θ ≥ 0.

In each case find (i) a suitable moment estimator of θ and (ii) the maximum likelihood
estimator of θ.
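For case (a), a numerical cross-check of the maximum likelihood calculation can be useful (a sketch only; it assumes SciPy is available and uses the fact that the density in (a) is the Beta(θ, 1) density, so samples can be drawn directly):

# Sketch: maximise the log-likelihood for p(x|theta) = theta * x**(theta-1)
# numerically and compare with the closed-form answer derived by hand.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
theta_true = 3.0
x = rng.beta(theta_true, 1.0, size=5_000)   # Beta(theta, 1) has pdf theta*x**(theta-1)

def neg_loglik(theta):
    return -(np.log(theta) + (theta - 1) * np.log(x)).sum()

res = minimize_scalar(neg_loglik, bounds=(1e-6, 50), method='bounded')
print(res.x)                                 # should be close to theta_true = 3.0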
