
LSTAT2050 - Analyse statistique II - Practice session 3

Exercise 1 - composite test - problem 2

Let X1, . . . , Xn be iid with density pθ(x) = exp[θ − x] 1[θ,∞)(x), where θ ∈ R. We want to test H0 : θ ≥ θ0 vs. H1 : θ < θ0. Does a UMP test exist? If so, what is it?

Solution
By Lehmann's theorem (theorem 4.2), there exists a UMP test for one-sided problems provided that pθ has the monotone likelihood ratio property (MLRP). For θ' < θ'', the likelihood ratio is given by

\[
\begin{aligned}
\frac{p_{\theta''}(x)}{p_{\theta'}(x)}
&= \frac{\prod_{i=1}^{n} \exp[\theta'' - X_i]\, \mathbf{1}[X_i \ge \theta'']}{\prod_{i=1}^{n} \exp[\theta' - X_i]\, \mathbf{1}[X_i \ge \theta']} \\
&= \frac{\exp\left[ n\theta'' - \sum_{i=1}^{n} X_i \right] \mathbf{1}[X_{(1)} \ge \theta'']}{\exp\left[ n\theta' - \sum_{i=1}^{n} X_i \right] \mathbf{1}[X_{(1)} \ge \theta']} \\
&= \exp[n(\theta'' - \theta')]\; \frac{\mathbf{1}[X_{(1)} \ge \theta'']}{\mathbf{1}[X_{(1)} \ge \theta']}
\end{aligned}
\]

Now, note that exp[n(θ'' − θ')] > 0. The figure below helps us conclude that pθ has the MLRP (non-decreasing) with respect to X(1).

[Figure: the ratio pθ''(x)/pθ'(x) plotted against X(1); it equals 0 for θ' ≤ X(1) < θ'' and jumps to exp[n(θ'' − θ')] for X(1) ≥ θ''.]

Now, given the direction of our test (H1 : θ < θ0), Lehmann's theorem tells us that the UMP test is of the form W = {X(1) < k0}, where k0 is such that Pθ0(X(1) < k0) = α. Using the distribution of the minimum, we have that Pθ0(X(1) < k0) = 1 − exp[n(θ0 − k0)] for k0 ≥ θ0. Hence, k0 is obtained as

\[
k_0 = \theta_0 - \frac{\log(1 - \alpha)}{n}
\]
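As a quick numerical sanity check (not part of the original exercise), here is a minimal Python sketch, assuming NumPy is available; the values of θ0, n and α are arbitrary illustrations. It computes k0 and verifies by simulation that the test has size α at the boundary θ = θ0.

```python
import numpy as np

rng = np.random.default_rng(0)
theta0, n, alpha = 2.0, 10, 0.05          # illustrative values, not from the exercise

# Critical value derived above: P_{theta0}(X_(1) < k0) = alpha
k0 = theta0 - np.log(1 - alpha) / n

# Monte Carlo check of the size at theta = theta0 (the boundary of H0).
# X = theta0 + Exp(1) has density exp(theta0 - x) on [theta0, infinity).
reps = 200_000
samples = theta0 + rng.exponential(scale=1.0, size=(reps, n))
reject = samples.min(axis=1) < k0
print("nominal size:", alpha, "empirical size:", reject.mean())
```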



Exercise 2 - composite test - problem 1,2,5
Let X1, . . . , Xn be iid Exp(λ), with pλ(x) = (1/λ) exp(−x/λ).

a) Find a UMP test at level α for testing H0 : λ ≤ λ0 vs. H1 : λ > λ0.

b) Find a UMP test at level α for testing H0 : λ ≥ λ0 vs. H1 : λ < λ0.

c) Find a UMP test at level α for testing H0 : λ = λ0 vs. H1 : λ ≠ λ0. (Hint: in this scenario, you are not supposed to derive the precise critical values of the test. However, try to simplify as much as possible the conditions that the critical values must satisfy.)

Solution

a) UMP for H0 : λ ≤ λ0 vs. H1 : λ > λ0

By Lehmann's theorem (theorem 4.2), there exists a UMP test for one-sided problems provided that pλ has the monotone likelihood ratio property (MLRP). In an exponential family, where pθ(x) = C(θ)h(x) exp(Q(θ)T(x)), this boils down to checking whether Q(θ) is strictly monotonic in θ. If so, pθ has the MLRP with respect to T(X). Here, Q(λ) = −1/λ is strictly increasing in λ and T(X) = ∑_{i=1}^n Xi.

Using Lehmann's theorem, W = {T(X) > c} with c such that Pλ0(T(X) > c) = α is an α-level UMP test. Let us work out the probability we just defined. Noting that T(X) ∼ Ga(n, λ0) when λ = λ0 (the boundary of H0), we take c = q1−α, where q1−α is the 1 − α quantile of a Ga(n, λ0). In conclusion, our α-level UMP test is given by

\[
W = \left\{ \sum_{i=1}^{n} X_i > q_{1-\alpha} \right\}
\]
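As an illustration (not required by the exercise), here is a minimal Python sketch, assuming SciPy is available, that computes the critical value q1−α under the scale parametrisation used here (pλ(x) = (1/λ) exp(−x/λ)) and applies the test to simulated data; λ0, n and α are arbitrary.

```python
import numpy as np
from scipy import stats

lambda0, n, alpha = 3.0, 8, 0.05           # illustrative values
# Under lambda = lambda0, T(X) = sum(X_i) ~ Gamma(shape=n, scale=lambda0).
q_upper = stats.gamma.ppf(1 - alpha, a=n, scale=lambda0)

x = np.random.default_rng(1).exponential(scale=lambda0, size=n)   # data drawn at lambda = lambda0
print("T(X) =", round(x.sum(), 2),
      "critical value =", round(q_upper, 2),
      "reject H0:", x.sum() > q_upper)
```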

b) UMP for H0 : λ ≥ λ0 vs. H1 : λ < λ0

We can follow the same reasoning as in subquestion a), except that we now reject H0 when T(X) is sufficiently small. Formally, the rejection region is W = {T(X) < c}, with c such that Pλ0(T(X) < c) = α. Hence, c = qα, where qα is the α-quantile of a Ga(n, λ0). In conclusion, our α-level UMP test is given by

\[
W = \left\{ \sum_{i=1}^{n} X_i < q_{\alpha} \right\}
\]

c) UMPU test for H0 : λ = λ0 vs. H1 : λ ≠ λ0

Using theorem 4.5 and remark 4.2, our UMPU test is of the form W(X) = {T(X) < c1 or T(X) > c2} with Pλ0(W(X)) = α and Eλ0[1W(X) T(X)] = α Eλ0[T(X)]. In our case, since Eλ0[T(X)] = nλ0, we need

\[
P\left( \sum_{i=1}^{n} X_i < c_1 \right) + P\left( \sum_{i=1}^{n} X_i > c_2 \right) = \alpha
\]

\[
E\left[ \sum_{i=1}^{n} X_i \, \mathbf{1}\left[ \sum_{i=1}^{n} X_i < c_1 \right] \right]
+ E\left[ \sum_{i=1}^{n} X_i \, \mathbf{1}\left[ \sum_{i=1}^{n} X_i > c_2 \right] \right] = n\lambda_0 \alpha
\]



Our aim now is to rewrite these two expressions in simpler terms. We first point out that a Ga(n, λ0) distribution with n ∈ N is an Erlang(n, λ0) distribution, whose density is given by

\[
p(x) = \frac{x^{n-1} \exp\left[ -\frac{x}{\lambda_0} \right]}{(n-1)!\,\lambda_0^{n}}
\]

We can use this to work out the second condition:

\[
\begin{aligned}
n\lambda_0 \alpha
&= \int_0^{c_1} \frac{t^{n} \exp\left[ -\frac{t}{\lambda_0} \right]}{(n-1)!\,\lambda_0^{n}}\, dt
+ \int_{c_2}^{\infty} \frac{t^{n} \exp\left[ -\frac{t}{\lambda_0} \right]}{(n-1)!\,\lambda_0^{n}}\, dt \\
&= n\lambda_0 \int_0^{c_1} \frac{t^{n} \exp\left[ -\frac{t}{\lambda_0} \right]}{n!\,\lambda_0^{n+1}}\, dt
+ n\lambda_0 \int_{c_2}^{\infty} \frac{t^{n} \exp\left[ -\frac{t}{\lambda_0} \right]}{n!\,\lambda_0^{n+1}}\, dt \\
&= n\lambda_0\, P(T^* < c_1) + n\lambda_0 \left[ 1 - P(T^* < c_2) \right]
\end{aligned}
\]

where T* ∼ Erlang(n + 1, λ0). Consequently, we can rewrite both conditions as

\[
P(T < c_1) + 1 - P(T < c_2) = \alpha
\]

\[
P(T^* < c_1) + 1 - P(T^* < c_2) = \alpha
\]

where T ∼ Erlang(n, λ0), the CDF of which is given by

\[
P(T \le t) = 1 - \sum_{k=0}^{n-1} \frac{1}{k!} \left( \frac{t}{\lambda_0} \right)^{k} \exp\left[ -\frac{t}{\lambda_0} \right]
\]

Subtracting condition 2 from condition 1, we obtain

\[
\frac{1}{n!} \left( \frac{c_1}{\lambda_0} \right)^{n} \exp\left[ -\frac{c_1}{\lambda_0} \right]
- \frac{1}{n!} \left( \frac{c_2}{\lambda_0} \right)^{n} \exp\left[ -\frac{c_2}{\lambda_0} \right] = 0
\]

which, after some computation, can be expressed as

\[
\frac{1}{\lambda_0} (c_2 - c_1) - n \left[ \log(c_2) - \log(c_1) \right] = 0
\]
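The exercise does not ask for explicit critical values, but a short numerical sketch, assuming SciPy is available, shows how c1 and c2 could be obtained from the two conditions above; λ0, n and α are arbitrary illustrations, and the equal-tailed quantiles serve only as a starting point for the solver.

```python
import numpy as np
from scipy import stats, optimize

lambda0, n, alpha = 3.0, 8, 0.05                   # illustrative values

F_T     = stats.gamma(a=n,     scale=lambda0).cdf  # CDF of T  ~ Erlang(n,   lambda0)
F_Tstar = stats.gamma(a=n + 1, scale=lambda0).cdf  # CDF of T* ~ Erlang(n+1, lambda0)

def conditions(c):
    c1, c2 = c
    return [F_T(c1)     + 1 - F_T(c2)     - alpha,   # size condition
            F_Tstar(c1) + 1 - F_Tstar(c2) - alpha]   # unbiasedness condition

start = [stats.gamma.ppf(alpha / 2, a=n, scale=lambda0),
         stats.gamma.ppf(1 - alpha / 2, a=n, scale=lambda0)]
c1, c2 = optimize.fsolve(conditions, start)
print("c1 =", round(c1, 3), "c2 =", round(c2, 3))
# The simplified relation derived above should hold up to numerical error:
print("check:", (c2 - c1) / lambda0 - n * (np.log(c2) - np.log(c1)))
```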

Exercise 3 - UMA upper confidence bound

Consider a Poisson process {N(t), t ≥ 0}, meaning that N(0) = 0 with probability 1 and N(t) ∼ Poi(λt) for any t > 0. Find a uniformly most accurate upper confidence bound at level α for the intensity rate λ when a certain number m of events have occurred, i.e. N(t) = m.

Hint: The waiting time between two events of this Poisson process is Exp(λ)-distributed. Hence, you can rephrase N(t) = m as a sample of waiting times.

Solution
We know that the waiting time between two events of a Poisson process behaves as an Exp(λ) variable. Since we know that m events have occurred, we are facing a sample of m iid waiting times T1, . . . , Tm ∼ Exp(λ).


In this context, we want to find a UMA upper confidence bound for λ, i.e. we need to find b(T) so that P(λ ∈ [0, b(T)]) = 1 − α. Given the equivalence between tests and confidence regions, this boils down to finding, for each λ0, a UMP test for H0 : λ ≥ λ0 vs. H1 : λ < λ0. Such a UMP test is obtained as in exercise 4 of the previous practice set: here λ is a rate, pλ(t) = λ exp(−λt), so the likelihood ratio is monotone in −∑ Ti and H0 is rejected for large values of ∑ Ti. The rejection region is given by

\[
W(T) = \left\{ \sum_{i=1}^{m} T_i > q_{1-\alpha} \right\}
\]

where q1−α is the 1 − α quantile of a Ga(m, λ0) distribution (shape m, rate λ0).


The only challenge we are left with lies in transforming this test into a confidence region. The acceptance region of the test of λ0 is given by

\[
A(\lambda_0) = \left\{ T : 0 < \sum_{i=1}^{m} T_i < q_{1-\alpha}(\lambda_0) \right\}
\]

and the confidence region collects all the values λ for which the data are accepted, i.e. {λ : T ∈ A(λ)}.

In order to find a pivot, we can exploit the fact that if X ∼ Ga(k, θ) with shape k and rate θ, then 2θX ∼ χ²_{2k}. As such, for any λ we have

\[
\begin{aligned}
1 - \alpha &= P\left( 0 < \sum_{i=1}^{m} T_i < q_{1-\alpha}(\lambda) \right) \\
&= P\left( 0 < 2\lambda \sum_{i=1}^{m} T_i < 2\lambda\, q_{1-\alpha}(\lambda) \right) \\
&= P\left( 0 < 2\lambda \sum_{i=1}^{m} T_i < \chi^2_{2m;1-\alpha} \right) \\
&= P\left( 0 < \lambda < \frac{\chi^2_{2m;1-\alpha}}{2 \sum_{i=1}^{m} T_i} \right)
\end{aligned}
\]

where χ²_{2m;1−α} is the 1 − α quantile of a χ²_{2m} distribution. Hence, a 1 − α UMA confidence region for λ is given by

\[
\left[ 0,\; \frac{\chi^2_{2m;1-\alpha}}{2 \sum_{i=1}^{m} T_i} \right]
\]
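As a closing illustration (not part of the original solution), here is a minimal Python sketch, assuming SciPy is available, that computes the upper bound χ²_{2m;1−α}/(2 ∑ Ti) and checks its coverage by simulation; the true rate, m and α are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
true_rate, m, alpha = 1.5, 12, 0.05                 # illustrative values

def uma_upper_bound(waiting_times, alpha):
    """UMA 1-alpha upper confidence bound for the intensity lambda."""
    m = len(waiting_times)
    return stats.chi2.ppf(1 - alpha, df=2 * m) / (2 * np.sum(waiting_times))

# Simulated coverage check: waiting times are Exp(rate), i.e. mean 1/rate.
reps = 100_000
T = rng.exponential(scale=1 / true_rate, size=(reps, m))
bounds = stats.chi2.ppf(1 - alpha, df=2 * m) / (2 * T.sum(axis=1))
print("bound from one sample:", round(uma_upper_bound(T[0], alpha), 3))
print("nominal coverage:", 1 - alpha, "empirical:", np.mean(true_rate <= bounds))
```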

Exercise 4 - simple test - problem 0 - supplementary exercise

This exercise draws a link between UMP tests in problem 0 and the search for an optimal binary classifier.

Let Y ∈ {0, 1} be a binary outcome (e.g. healthy or ill). Based on a set of continuous decision variables X1, . . . , Xp, we want to develop a rule for predicting whether Y = 0 or Y = 1.

Calling d0 the decision to predict Y = 0 and d1 the decision to predict Y = 1, we define the probability of a false positive as

FPP(d1) = P(d1 | Y = 0).

Similarly, we define the probability of a true positive as

TPP(d1) = P(d1 | Y = 1).

We define an optimal decision rule as the output of the following procedure.


1. We fix FPP(d1) to α.
2. We maximize TPP(d1).

Assume that P(Y = 1|X1, . . . , Xp) = H(β1X1 + . . . + βpXp), where H is an increasing function. Show that the optimal rule is of the form: d1 if β1X1 + . . . + βpXp > k.

Solution
We can view our problem as a test where H0 = {Y = 0} and H1 = {Y = 1}. By the Neyman-Pearson theorem (theorem 4.1), the optimal test rejects H0 (i.e. we predict Y = 1) when

\[
\frac{p_{H_1}(x_1, \ldots, x_p)}{p_{H_0}(x_1, \ldots, x_p)} > c
\]

Besides, we have

\[
\frac{p_{H_1}(x_1, \ldots, x_p)}{p_{H_0}(x_1, \ldots, x_p)}
= \frac{p(x_1, \ldots, x_p \mid Y = 1)}{p(x_1, \ldots, x_p \mid Y = 0)}
= \frac{P(Y = 1 \mid x_1, \ldots, x_p)\, P(Y = 0)}{P(Y = 0 \mid x_1, \ldots, x_p)\, P(Y = 1)}
= \frac{H(x^{T}\beta)\, P(Y = 0)}{\left[ 1 - H(x^{T}\beta) \right] P(Y = 1)}
\]

Since H is increasing, the map u ↦ H(u)/[1 − H(u)] is increasing as well, and P(Y = 0)/P(Y = 1) is a constant. Therefore

\[
\frac{p_{H_1}(x_1, \ldots, x_p)}{p_{H_0}(x_1, \ldots, x_p)} > c
\iff
\beta_1 x_1 + \ldots + \beta_p x_p > k
\]

which leads us to the desired result.
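To make the equivalence concrete, here is a small Python sketch (an illustration, not part of the original solution) that takes H to be the logistic function; β, P(Y = 1) and the threshold c are arbitrary choices. It checks that thresholding the likelihood ratio at c coincides with thresholding the linear score β1x1 + . . . + βpxp at k = log(c P(Y = 1)/P(Y = 0)), which is the closed form of k for this particular H.

```python
import numpy as np

rng = np.random.default_rng(3)
p = 3
beta = np.array([1.0, -2.0, 0.5])           # illustrative coefficients
prior1 = 0.3                                # assumed P(Y = 1)
c = 2.0                                     # arbitrary threshold on the likelihood ratio

def H(u):                                   # logistic choice for the increasing function H
    return 1.0 / (1.0 + np.exp(-u))

X = rng.normal(size=(1000, p))
score = X @ beta                            # beta_1 x_1 + ... + beta_p x_p

# Likelihood ratio written via the posterior, as in the solution.
lr = H(score) * (1 - prior1) / ((1 - H(score)) * prior1)

# Equivalent threshold on the linear score (valid for the logistic H).
k = np.log(c * prior1 / (1 - prior1))
print("rules agree on", np.mean((lr > c) == (score > k)) * 100, "% of the points")
```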
