
STAT2006 Assignment 3 Solution

Due Date: 15:00, 24th March, 2015


1. (a) Note that the pdf of $Y$ is $f_Y(y) = \frac{1}{\theta}e^{-y/\theta}$, $y \ge 0$, $\theta > 0$, and $Y = \frac{X - \theta_1}{\theta_2}$, so that $\frac{\partial y}{\partial x} = \frac{1}{\theta_2}$. Therefore the pdf of $X$ is
$$f_X(x) = f_Y\left(\frac{x - \theta_1}{\theta_2}\right)\frac{1}{\theta_2} = \frac{1}{\theta_2\theta}\exp\left(-\frac{x - \theta_1}{\theta_2\theta}\right), \quad x \ge \theta_1.$$

(b) Note that $E[Y] = \theta$ and $\mathrm{Var}[Y] = \theta^2$. Therefore,
$$E[X] = E[\theta_1 + \theta_2 Y] = \theta_1 + \theta_2 E[Y] = \theta_1 + \theta_2\theta, \qquad \mathrm{Var}[X] = \mathrm{Var}[\theta_1 + \theta_2 Y] = \theta_2^2\,\mathrm{Var}[Y] = \theta^2\theta_2^2.$$
Denote $\bar{X} = \frac{1}{n}\sum_{i=1}^n X_i$ and $V = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^2$, and equate them with the theoretical moments:
$$\theta_1 + \theta_2\theta = \bar{X}, \qquad \theta^2\theta_2^2 = V \quad\Longrightarrow\quad \tilde{\theta}_1 = \bar{X} - \sqrt{V}, \quad \tilde{\theta}_2 = \sqrt{V}/\theta,$$
i.e. the method-of-moments estimators for $\theta_1, \theta_2$ are $\tilde{\theta}_1 = \bar{X} - \sqrt{V}$ and $\tilde{\theta}_2 = \sqrt{V}/\theta$ respectively.

(c) As usual we can rewrite the likelihood function as
$$L(\theta_1, \theta_2; x_1, x_2, \dots, x_n) = \frac{1}{\theta^n\theta_2^n}\exp\left\{-\frac{1}{\theta\theta_2}\left(\sum_{i=1}^n x_i - n\theta_1\right)\right\}\mathbf{1}\{\theta_1 \le x_{(1)}\}.$$
Therefore, when $\theta_1 > x_{(1)}$, the indicator $\mathbf{1}\{\theta_1 \le x_{(1)}\} = 0$ and hence $L(\theta_1, \theta_2) = 0$; when $\theta_1 \le x_{(1)}$,
$$\frac{\partial L}{\partial \theta_1} = \frac{n}{\theta^{n+1}\theta_2^{n+1}}\exp\left\{-\frac{1}{\theta\theta_2}\left(\sum_{i=1}^n x_i - n\theta_1\right)\right\} > 0$$
for any $\theta_2 > 0$, i.e. $L$ is strictly increasing in $\theta_1$. Hence for any fixed $\theta_2 > 0$, $L$ is maximized at $\theta_1 = x_{(1)}$, and thus the MLE of $\theta_1$ is $\hat{\theta}_1 = X_{(1)}$. On the other hand, note that
$$\ln L(\theta_1, \theta_2) = -n\ln\theta - n\ln\theta_2 - \frac{1}{\theta\theta_2}\left(\sum_{i=1}^n x_i - n\theta_1\right)$$
for $\theta_1 \le x_{(1)}$. Differentiating the log-likelihood with respect to $\theta_2$ and evaluating at $\theta_1 = x_{(1)}$, we have
$$\left.\frac{\partial \ln L}{\partial \theta_2}\right|_{\theta_1 = x_{(1)}} = -\frac{n}{\theta_2} + \frac{1}{\theta\theta_2^2}\left(\sum_{i=1}^n x_i - n x_{(1)}\right) = \frac{n}{\theta_2^2}\left(\frac{1}{\theta n}\sum_{i=1}^n (x_i - x_{(1)}) - \theta_2\right) \begin{cases} > 0 & \text{if } 0 < \theta_2 < (\bar{x} - x_{(1)})/\theta, \\ = 0 & \text{if } \theta_2 = (\bar{x} - x_{(1)})/\theta, \\ < 0 & \text{if } \theta_2 > (\bar{x} - x_{(1)})/\theta. \end{cases}$$
Therefore the MLE of $\theta_2$ is $\hat{\theta}_2 = (\bar{X} - X_{(1)})/\theta$.
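For your interest, the estimators above can be sanity-checked by simulation. The Python sketch below uses assumed values of $\theta$, $\theta_1$ and $\theta_2$ (not taken from the assignment); for a large sample, both the method-of-moments and the maximum-likelihood estimates should be close to the true parameters.

import numpy as np

# Assumed illustrative values (not taken from the assignment)
theta, theta1, theta2 = 2.0, 5.0, 3.0
n = 100_000
rng = np.random.default_rng(0)

# X = theta1 + theta2 * Y with Y ~ Exponential(mean theta)
y = rng.exponential(scale=theta, size=n)
x = theta1 + theta2 * y

xbar, V = x.mean(), x.var()   # V uses the 1/n convention, as in part (b)

# Method-of-moments estimators from part (b)
theta1_mom = xbar - np.sqrt(V)
theta2_mom = np.sqrt(V) / theta

# Maximum-likelihood estimators from part (c)
theta1_mle = x.min()
theta2_mle = (xbar - x.min()) / theta

print("MoM:", theta1_mom, theta2_mom)   # both should be close to (theta1, theta2) = (5, 3)
print("MLE:", theta1_mle, theta2_mle)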
2. (a) The density of $X$ is
$$f(x) = \begin{cases} \dfrac{\alpha}{\theta^\alpha}\, x^{\alpha - 1} & \text{if } 0 \le x \le \theta, \\ 0 & \text{otherwise.} \end{cases}$$
The likelihood function is
$$L(\alpha, \theta; x_1, x_2, \dots, x_n) = \left(\frac{\alpha}{\theta^\alpha}\right)^n \left(\prod_{i=1}^n x_i\right)^{\alpha - 1} \mathbf{1}\{X_{(n)} \le \theta\}\,\mathbf{1}\{X_{(1)} \ge 0\}.$$
Because the likelihood function is decreasing with respect to $\theta$ (on $\theta \ge X_{(n)}$), $X_{(n)}$ is the MLE of $\theta$. Setting
$$\left.\frac{\partial \ln L}{\partial \alpha}\right|_{\theta = x_{(n)}} = \frac{n}{\alpha} - n\ln x_{(n)} + \sum_{i=1}^n \ln x_i = 0 \quad\Longrightarrow\quad \hat{\alpha} = \frac{n}{n\ln x_{(n)} - \sum_{i=1}^n \ln x_i}.$$
Since
$$\frac{\partial^2 \ln L}{\partial \alpha^2} = -\frac{n}{\alpha^2} < 0,$$
$\hat{\alpha}$ is the MLE of $\alpha$.
(b) $x_{(n)} = 25.0$ and $\sum_{i=1}^n \ln x_i = 43.95$, so from (a), $\hat{\alpha}_{MLE} = 12.59$ and $\hat{\theta}_{MLE} = 25.0$.
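As a quick numerical check (assuming $n = 14$, as used in part (d)), $\hat{\alpha}$ can be recomputed from the reported summary statistics; the small discrepancy comes from the rounding of $\sum \ln x_i$:

import math

n = 14
x_max = 25.0        # x_(n)
sum_log_x = 43.95   # the given (rounded) value of the sum of ln(x_i)

alpha_hat = n / (n * math.log(x_max) - sum_log_x)
print(round(alpha_hat, 2))   # about 12.6, matching the reported 12.59 up to rounding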
(c)
$$0.05 = P\left(X_{(n)}/\theta \le c\right) = P(\text{all } X_i \le c\theta) = \left(\frac{(c\theta)^{\alpha_0}}{\theta^{\alpha_0}}\right)^n = c^{\alpha_0 n},$$
which implies that $c = 0.05^{1/(\alpha_0 n)}$. Thus
$$0.95 = P\left(X_{(n)}/\theta > c\right) = P\left(X_{(n)}/c > \theta\right),$$
so $\{\theta : \theta < X_{(n)}/0.05^{1/(\alpha_0 n)}\}$ is a 95% upper confidence limit for $\theta$.
(d) From (b), $\hat{\alpha}_{MLE} = 12.59$ and $X_{(n)} = 25.0$, so the confidence interval is $\left[25,\ 25/0.05^{1/(12.59 \times 14)}\right) = [25,\ 25.43)$.
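The upper limit in part (d) can be verified the same way, plugging the MLE from (b) into the expression from (c):

n, alpha_hat, x_max = 14, 12.59, 25.0

c = 0.05 ** (1 / (alpha_hat * n))   # c = 0.05^{1/(alpha_0 n)} with alpha_0 replaced by the MLE
upper = x_max / c
print(round(upper, 2))              # 25.43, so the interval is [25, 25.43)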
3. (a) $M_X(t) = E(e^{tX}) = (1 - \theta t)^{-1}$, so for $W = \frac{2}{\theta}\sum_{i=1}^n X_i$,
$$M_W(t) = E(e^{tW}) = E\left(e^{(2t/\theta)\sum_{i=1}^n X_i}\right) = E\left(e^{(2t/\theta)X_1}\right)\cdots E\left(e^{(2t/\theta)X_n}\right) = (1 - 2t)^{-n},$$
which is the moment generating function of $\chi^2(2n)$, so $W \sim \chi^2(2n)$.
(b) $\left[\dfrac{2\sum_{i=1}^n x_i}{\chi^2_{\alpha/2}(2n)},\ \dfrac{2\sum_{i=1}^n x_i}{\chi^2_{1-\alpha/2}(2n)}\right]$.
(c) $[46.049,\ 185.304]$.
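Since the data for part (c) are not reproduced here, the following Python sketch only illustrates how the interval in (b) would be computed; the sample and the 95% level are hypothetical placeholders, not the assignment's values.

import numpy as np
from scipy import stats

def exp_mean_ci(x, alpha=0.05):
    """CI for theta based on W = (2/theta) * sum(X_i) ~ chi2(2n), as in part (b)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    s = 2 * x.sum()
    lower = s / stats.chi2.ppf(1 - alpha / 2, df=2 * n)  # divide by the upper alpha/2 point
    upper = s / stats.chi2.ppf(alpha / 2, df=2 * n)      # divide by the lower alpha/2 point
    return lower, upper

# Hypothetical exponential data, for illustration only
rng = np.random.default_rng(1)
sample = rng.exponential(scale=100.0, size=10)
print(exp_mean_ci(sample))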


4. (a) Note that $0 \le a < b$ as they are quantiles of $\chi^2(n-1)$. From the given constraint,
$$1 - \alpha = \Pr\left(a \le \frac{(n-1)S^2}{\sigma^2} \le b\right) = \Pr\left(\frac{1}{b} \le \frac{\sigma^2}{(n-1)S^2} \le \frac{1}{a}\right) = \Pr\left(\frac{(n-1)S^2}{b} \le \sigma^2 \le \frac{(n-1)S^2}{a}\right) = \Pr\left(\sqrt{\frac{(n-1)S^2}{b}} \le \sigma \le \sqrt{\frac{(n-1)S^2}{a}}\right).$$
So a confidence interval for $\sigma^2$ is of the form $\left[\dfrac{(n-1)S^2}{b},\ \dfrac{(n-1)S^2}{a}\right]$, and a confidence interval for $\sigma$ is of the form $\left[\sqrt{\dfrac{(n-1)S^2}{b}},\ \sqrt{\dfrac{(n-1)S^2}{a}}\right]$.
The length of the latter interval is
$$k = \sqrt{\frac{(n-1)s^2}{a}} - \sqrt{\frac{(n-1)s^2}{b}} = \sqrt{(n-1)s^2}\left(\frac{1}{\sqrt{a}} - \frac{1}{\sqrt{b}}\right).$$
(b) Method 1: Note that the pdf of $\chi^2(n-1)$ is
$$g(u) = \frac{1}{\Gamma\left(\frac{n-1}{2}\right)2^{\frac{n-1}{2}}}\,u^{\frac{n-1}{2}-1}e^{-\frac{u}{2}}, \quad u > 0.$$
Differentiating both sides of the constraint with respect to $a$ (treating $b$ as a function of $a$), we have
$$g(b)\frac{\partial b}{\partial a} - g(a) = 0 \quad\Longrightarrow\quad \frac{\partial b}{\partial a} = \frac{g(a)}{g(b)} = \left(\frac{a}{b}\right)^{\frac{n-3}{2}} e^{\frac{b-a}{2}}.$$
Using the results from part (a),
$$\frac{\partial k}{\partial a} = \sqrt{(n-1)s^2}\left(-\frac{1}{2a^{3/2}} + \frac{1}{2b^{3/2}}\frac{\partial b}{\partial a}\right) = \frac{\sqrt{(n-1)s^2}\,e^{\frac{b}{2}}}{2 b^{\frac{n}{2}} a^{\frac{3}{2}}}\left(a^{\frac{n}{2}}e^{-\frac{a}{2}} - b^{\frac{n}{2}}e^{-\frac{b}{2}}\right).$$
Therefore any local minimum must satisfy the critical-point condition
$$\frac{\partial k}{\partial a} = 0 \quad\Longleftrightarrow\quad a^{\frac{n}{2}}e^{-\frac{a}{2}} - b^{\frac{n}{2}}e^{-\frac{b}{2}} = 0.$$
Similarly, for the confidence interval of $\sigma^2$ we have $k = (n-1)s^2\left(\frac{1}{a} - \frac{1}{b}\right)$, and thus
$$\frac{\partial k}{\partial a} = (n-1)s^2\left(-\frac{1}{a^2} + \frac{1}{b^2}\frac{\partial b}{\partial a}\right) = \frac{(n-1)s^2\, e^{\frac{b}{2}}}{b^{\frac{n+1}{2}} a^2}\left(a^{\frac{n+1}{2}}e^{-\frac{a}{2}} - b^{\frac{n+1}{2}}e^{-\frac{b}{2}}\right).$$
As a result, any local minimum $(a, b)$ must satisfy
$$\frac{\partial k}{\partial a} = 0 \quad\Longleftrightarrow\quad a^{\frac{n+1}{2}}e^{-\frac{a}{2}} - b^{\frac{n+1}{2}}e^{-\frac{b}{2}} = 0.$$
a

Method 2: The Lagrange function is
$$\mathcal{L} = \sqrt{(n-1)s^2}\left(\frac{1}{\sqrt{a}} - \frac{1}{\sqrt{b}}\right) - \lambda\bigl(G(b) - G(a) - (1-\alpha)\bigr),$$
where $G$ denotes the cdf of $\chi^2(n-1)$. Differentiating the Lagrange function with respect to $a$, $b$ and $\lambda$ and setting all of them to zero:
$$\frac{\partial \mathcal{L}}{\partial a} = -\frac{1}{2}\sqrt{(n-1)s^2}\,a^{-\frac{3}{2}} + \lambda g(a) = 0,$$
$$\frac{\partial \mathcal{L}}{\partial b} = \frac{1}{2}\sqrt{(n-1)s^2}\,b^{-\frac{3}{2}} - \lambda g(b) = 0,$$
$$\frac{\partial \mathcal{L}}{\partial \lambda} = -\bigl(G(b) - G(a) - (1-\alpha)\bigr) = 0.$$
Therefore, by eliminating $\lambda$ between the first two equations and substituting the pdf of $\chi^2(n-1)$ for $g(a)$ and $g(b)$, $a$ and $b$ should satisfy
$$a^{\frac{n}{2}}e^{-\frac{a}{2}} - b^{\frac{n}{2}}e^{-\frac{b}{2}} = 0.$$

For your interest: now we try to argue that there is a unique solution $(a, b)$ to the system of equations
$$\begin{cases} G(b) - G(a) = 1 - \alpha, \\ a^{\frac{n}{2}}e^{-\frac{a}{2}} - b^{\frac{n}{2}}e^{-\frac{b}{2}} = 0, \end{cases}$$
and that this solution minimizes the length $k$ subject to the constraint. First, ignore the degenerate cases $\alpha = 0, 1$, which correspond to $k \to 0$ or $k \to +\infty$. Note that when $a \in [0, \chi^2_{1-\alpha}(n-1)]$, there exists a $b$ satisfying the first equation. So we first show that there exists some $a \in [0, \chi^2_{1-\alpha}(n-1)]$ satisfying the second equation. When $a \to 0$, $b \to \chi^2_{\alpha}(n-1)$, which is finite, so $a^{\frac{n}{2}}e^{-\frac{a}{2}} - b^{\frac{n}{2}}e^{-\frac{b}{2}} < 0$ while $\frac{\sqrt{(n-1)s^2}\,e^{b/2}}{2b^{n/2}a^{3/2}} \to +\infty$, and thus $\frac{\partial k}{\partial a} \to -\infty$, i.e. $k$ is decreasing when $a$ is close to the left end point. When $b \to +\infty$, $a \to \chi^2_{1-\alpha}(n-1) > 0$ is finite, $\frac{\sqrt{(n-1)s^2}\,e^{b/2}}{2b^{n/2}a^{3/2}} \to +\infty$ and $b^{\frac{n}{2}}e^{-\frac{b}{2}} \to 0$, so $a^{\frac{n}{2}}e^{-\frac{a}{2}} - b^{\frac{n}{2}}e^{-\frac{b}{2}} > 0$, and thus $\frac{\partial k}{\partial a} \to +\infty$, i.e. $k$ is increasing when $a$ is close to the right end point. Therefore the global minimum is not located at the two end points but in the interior of the interval, i.e. the global minimum is a local minimum. In other words, since $\left.\frac{\partial k}{\partial a}\right|_{a = 0} < 0 < \left.\frac{\partial k}{\partial a}\right|_{a = \chi^2_{1-\alpha}(n-1)}$, by the intermediate value theorem there exists a solution to the equation $\frac{\partial k}{\partial a} = 0$ in $(0, \chi^2_{1-\alpha}(n-1))$.

For the uniqueness of the solution, consider the function $f(x) = x^{\frac{n}{2}}e^{-\frac{x}{2}}$, $n = 2, 3, 4, \dots$. Then
$$f'(x) = \frac{n}{2}x^{\frac{n}{2}-1}e^{-\frac{x}{2}} - \frac{1}{2}x^{\frac{n}{2}}e^{-\frac{x}{2}} = \frac{1}{2}x^{\frac{n}{2}-1}e^{-\frac{x}{2}}(n - x) \begin{cases} > 0 & \text{if } 0 < x < n, \\ = 0 & \text{if } x = n, \\ < 0 & \text{if } x > n, \end{cases}$$
i.e. $f(a) < f(b)$ when $a < b \le n$ and $f(a) > f(b)$ when $n \le a < b$. From the above result, there exists $a_0 < b_0$ such that $f(a_0) = f(b_0)$, and it must satisfy $a_0 < n < b_0$. Suppose there is another solution $(a, b)$ with $a > a_0$. Since $\frac{\partial b}{\partial a} > 0$, we also have $b > b_0$. If $a \in (a_0, n]$, then $f(a) > f(a_0) = f(b_0) > f(b)$; if $a \in (n, +\infty)$, then $a < b \Rightarrow f(a) > f(b)$. So it is impossible to have such a solution. Similarly it is also impossible to have another solution with $a < a_0$. As a result, the solution $(a_0, b_0)$ is unique.
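For concreteness, the system above can be solved numerically. The sketch below uses an assumed $n$ and confidence level (not values from the assignment), starts from the equal-tailed quantiles, and checks that the resulting $\sigma$-interval is no longer than the equal-tailed one.

import numpy as np
from scipy import stats, optimize

n, alpha = 20, 0.05          # assumed illustrative values, not from the assignment
df = n - 1

def system(v):
    a, b = v
    eq1 = stats.chi2.cdf(b, df=df) - stats.chi2.cdf(a, df=df) - (1 - alpha)
    # log form of a^{n/2} e^{-a/2} = b^{n/2} e^{-b/2}, for numerical stability
    eq2 = (n / 2) * np.log(a) - a / 2 - (n / 2) * np.log(b) + b / 2
    return [eq1, eq2]

# Start from the equal-tailed quantiles
a_eq = stats.chi2.ppf(alpha / 2, df=df)
b_eq = stats.chi2.ppf(1 - alpha / 2, df=df)
a0, b0 = optimize.fsolve(system, x0=[a_eq, b_eq])

def length(a, b):            # k up to the common factor sqrt((n-1) s^2)
    return 1 / np.sqrt(a) - 1 / np.sqrt(b)

print(a0, b0)
print(length(a0, b0), length(a_eq, b_eq))   # the first length should not exceed the second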
5. Note that $\left[\bar{x} - z_{0.05}\frac{s}{\sqrt{n}},\ \bar{x} + z_{0.05}\frac{s}{\sqrt{n}}\right]$ is an approximate 90% confidence interval for $\mu$. Therefore, $[\bar{x} - \epsilon,\ \bar{x} + \epsilon]$ has approximate confidence level at least 90% for $\mu$ if and only if $\epsilon \ge z_{0.05}\frac{s}{\sqrt{n}}$, i.e. $n \ge \left(\frac{z_{0.05}\,s}{\epsilon}\right)^2$. Since $z_{0.05} \approx 1.645$, $s = 36$ and $\epsilon = 6$, the required sample size satisfies $n \ge \left(\frac{1.645 \times 36}{6}\right)^2 \approx 97.417$. As $n$ is an integer, the minimal required sample size is 98.
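A quick numerical check of this calculation (using scipy only to fetch $z_{0.05}$):

import math
from scipy import stats

z = stats.norm.ppf(0.95)         # z_{0.05}, about 1.645
s, eps = 36, 6
n_min = (z * s / eps) ** 2
print(n_min, math.ceil(n_min))   # about 97.4, so the minimal sample size is 98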
6. (a) A two-sided 95% confidence interval for $p_1 - p_2$ is
$$\left[\hat{p}_1 - \hat{p}_2 - z_{0.025}\sqrt{\frac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \frac{\hat{p}_2(1-\hat{p}_2)}{n_2}},\ \ \hat{p}_1 - \hat{p}_2 + z_{0.025}\sqrt{\frac{\hat{p}_1(1-\hat{p}_1)}{n_1} + \frac{\hat{p}_2(1-\hat{p}_2)}{n_2}}\right] \approx [0.14069,\ 0.22637].$$
(b) Since $n = 2000$, $y = y_1 + y_2 = 1100$ and $\hat{p} = y/n = 0.55$, a two-sided confidence interval for $p$ is
$$\left[\hat{p} - z_{0.025}\sqrt{\frac{\hat{p}(1-\hat{p})}{n}},\ \ \hat{p} + z_{0.025}\sqrt{\frac{\hat{p}(1-\hat{p})}{n}}\right] \approx [0.5282,\ 0.5718].$$
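The interval in (b) can be reproduced from the numbers given; the two-proportion interval in (a) follows the same pattern, but since its counts are not listed here the helper below takes them as hypothetical inputs.

import math
from scipy import stats

z = stats.norm.ppf(0.975)    # z_{0.025}, about 1.96

def one_prop_ci(y, n):
    p = y / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p - half, p + half

def two_prop_ci(y1, n1, y2, n2):
    # Wald interval for p1 - p2, as in part (a); supply the actual counts to use it
    p1, p2 = y1 / n1, y2 / n2
    half = z * math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return p1 - p2 - half, p1 - p2 + half

# Part (b): n = 2000, y = 1100
print(one_prop_ci(1100, 2000))   # approximately (0.5282, 0.5718)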
