STATS 200 (Stanford University, Summer 2015)

Midterm Exam Sample Questions


This document contains some questions that are fairly representative of the content, style,
and difficulty of the questions that will appear on the midterm exam. Most of these questions
come from actual exams that I gave in previous editions of the course. Please keep the
following things in mind:
- This document is much longer than the actual midterm exam will be.
- All material covered in the lecture notes (up through page 1 of Lecture 6) is eligible for inclusion on the midterm exam, regardless of whether it is covered by any of the sample questions below.
- Unlike this document, the actual exam paper will be printed with blank space between the questions so that you can write your solution directly on the question paper itself.
1. Let $X_1, \ldots, X_n$ be iid discrete random variables, each with pmf
\[
  f_\theta(x) =
  \begin{cases}
    \dfrac{(x+1)\,\theta^{x}}{(1+\theta)^{x+2}} & \text{if } x \in \{0, 1, 2, \ldots\}, \\[4pt]
    0 & \text{if } x \notin \{0, 1, 2, \ldots\},
  \end{cases}
\]
where $\theta \ge 0$ is unknown, and where we take $0^0 = 1$ so that $f_0(0) = 1$.


Note: It can be shown that $E_\theta(X_1) = 2\theta$ and $\operatorname{Var}_\theta(X_1) = 2\theta(1+\theta)$. You may use these facts without proof.
(a) Find the maximum likelihood estimator of $\theta$.
(b) Let $\hat\theta_n$ denote the maximum likelihood estimator of $\theta$ from part (a). Show that $\sqrt{n}\,(\hat\theta_n - \theta)$ converges in distribution as $n \to \infty$, and find the limiting distribution.
(c) Let $\sigma^2 = \operatorname{Var}_\theta(X_1) = 2\theta(1+\theta)$. Find the maximum likelihood estimator of $\sigma^2$.
Note: Assume the parameter space for $\sigma^2$ is $[0, \infty)$, i.e., the parameter space for $\sigma^2$ is exactly what it logically should be.
(d) Let $\hat\sigma^2_n$ denote the maximum likelihood estimator of $\sigma^2$ from part (c). Show that $\sqrt{n}\,(\hat\sigma^2_n - \sigma^2)$ converges in distribution as $n \to \infty$, and find the limiting distribution.
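
Study aid (not part of the question): a quick simulation is a good way to check the moment facts quoted in the note, and the same draws can later be reused to sanity-check whatever estimator you derive in part (a). The sketch below is Python with NumPy; it relies on the observation, which you should verify for yourself, that the pmf above is the negative binomial pmf for the number of failures before the 2nd success with success probability $1/(1+\theta)$.

    # Sketch: simulate from the pmf in Question 1 and check the quoted moments,
    # E_theta(X1) = 2*theta and Var_theta(X1) = 2*theta*(1 + theta).
    # Assumption to verify: the pmf equals a negative binomial pmf (number of
    # failures before the 2nd success) with success probability 1/(1 + theta).
    import numpy as np

    theta = 1.7
    rng = np.random.default_rng(0)
    x = rng.negative_binomial(n=2, p=1.0 / (1.0 + theta), size=1_000_000)

    print("sample mean    :", x.mean(), " vs 2*theta          :", 2 * theta)
    print("sample variance:", x.var(), " vs 2*theta*(1+theta):", 2 * theta * (1 + theta))
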
2. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\theta)$, where $\theta > 0$ is unknown. Let $\nu = \theta^2$, and let
\[
  \hat\nu = \Bigl( \frac{1}{n} \sum_{i=1}^{n} X_i \Bigr)^{2}
\]
be an estimator of $\nu$. Find the bias of $\hat\nu$ as an estimator of $\nu$.

Note 1: You may express your answer either in terms of $\theta$ or in terms of $\nu$.
Note 2: The mean and variance of the Poisson($\theta$) distribution are both equal to $\theta$. Also, $\sum_{i=1}^{n} X_i \sim \text{Poisson}(n\theta)$. You may use these facts without proof. You do not need to know the formula for the Poisson($\theta$) or Poisson($n\theta$) pmf to solve the problem.
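
Study aid: a Monte Carlo check is an easy way to validate whatever bias formula you derive here. The sketch below (Python with NumPy; the values of $\theta$ and $n$ are arbitrary) repeatedly computes $\hat\nu$ on fresh Poisson samples and compares its average to $\nu = \theta^2$.

    # Sketch: Monte Carlo estimate of the bias of (sample mean)^2 as an
    # estimator of theta^2 for Poisson data. Compare the printed value with
    # the closed-form bias you derive analytically.
    import numpy as np

    theta, n, reps = 3.0, 25, 200_000
    rng = np.random.default_rng(1)
    xbar = rng.poisson(theta, size=(reps, n)).mean(axis=1)   # reps independent sample means
    nu_hat = xbar ** 2

    print("Monte Carlo bias estimate:", nu_hat.mean() - theta ** 2)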


3. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} \text{Pareto}(k, \theta)$ conditional on $\theta$, where $\theta > 0$ is unknown but $k > 0$ is known. Let the prior on $\theta$ be Gamma($a, b$), where $a > 0$ and $b > 0$ are known.
Note: The Pareto($k, \theta$) distribution has pdf
\[
  f_\theta(x) =
  \begin{cases}
    \dfrac{\theta k^{\theta}}{x^{\theta+1}} & \text{if } x \ge k, \\[4pt]
    0 & \text{if } x < k,
  \end{cases}
\]
and its mean is $\theta k/(\theta - 1)$ if $\theta > 1$ (and $\infty$ if $\theta \le 1$). The Gamma($a, b$) distribution has pdf
\[
  f(x) =
  \begin{cases}
    \dfrac{b^{a}}{\Gamma(a)}\, x^{a-1} \exp(-bx) & \text{if } x > 0, \\[4pt]
    0 & \text{if } x \le 0,
  \end{cases}
\]
and its mean is $a/b$.

(a) Find the posterior distribution of $\theta$.
Hint: It may help to remember that $t^{c} = \exp(c \log t)$ for any $t > 0$ and any $c \in \mathbb{R}$.
(b) Find (or simply state) the posterior mean of $\theta$.
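
Study aid: this question, along with Questions 13, 17, and 21, uses the same basic machinery: write the posterior as proportional to the likelihood times the prior, then recognize the kernel of a known family. As a reminder, for iid data the starting point is
\[
  \pi(\theta \mid x_1, \ldots, x_n)
  \;\propto\;
  \pi(\theta) \prod_{i=1}^{n} f_\theta(x_i),
\]
where the omitted constant of proportionality is the marginal $m(x_1, \ldots, x_n)$, which does not depend on $\theta$.
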
4. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} N(\mu, 1)$, where $\mu \in \mathbb{R}$ is unknown. Let $a_1, \ldots, a_n$ be constants such that $\sum_{i=1}^{n} a_i = 0$, and let $Y = \sum_{i=1}^{n} a_i X_i$. Find a number $b > 0$ such that $bY^2$ has a chi-squared distribution, and also state the degrees of freedom of this chi-squared distribution.
5. A sequence of random variables $Y_n$ is said to converge in probability to $\infty$, written as $Y_n \overset{P}{\to} \infty$, if $P(Y_n \le M) \to 0$ for every $M \in \mathbb{R}$. Let $\{X_n : n \ge 1\}$ be a sequence of iid continuous random variables with pdf $f(x)$, where $f(x) > 0$ for all $x \in \mathbb{R}$. Show that
\[
  \max_{1 \le i \le n} X_i \overset{P}{\to} \infty.
\]

6. Let $X$ be a discrete random variable with pmf $f_\theta(x)$, where $\theta \in \mathbb{R}$ is unknown. Let $\mathcal{X} = \{x \in \mathbb{R} : f_\theta(x) > 0\}$ denote the support of the pmf $f_\theta(x)$, and suppose that $\mathcal{X}$ does not depend on $\theta$. Now suppose that we have a prior $\pi(\theta)$ such that the prior mean exists and is finite, i.e.,
\[
  -\infty < \int \theta\, \pi(\theta)\, d\theta < \infty.
\]
Show that the posterior mean $E(\theta \mid X = x)$ exists and is finite for all data values $x \in \mathcal{X}$.
Hints: For a sum to be finite, it is necessary (though not sufficient) for every term in the sum to be finite. Also, since $\mathcal{X}$ does not depend on $\theta$, the marginal distribution of $X$ is strictly positive for all $x \in \mathcal{X}$, i.e., $m(x) > 0$ for all $x \in \mathcal{X}$.
7. Construct an example of a sequence of random variables $\{X_n : n \ge 1\}$, a limiting random variable $X$, and a set $A \subseteq \mathbb{R}$ such that $X_n \overset{D}{\to} X$, but $P(X \in A) = 1$ while $P(X_n \in A) = 0$ for all $n \ge 1$.
Hint: If your example takes more than one or two lines to explain, then it is more complicated than it needs to be.


8. Let $X_1, \ldots, X_n$ be iid random variables such that $E_{\mu,\sigma^2}(X_1) = \mu$ and $\operatorname{Var}_{\mu,\sigma^2}(X_1) = \sigma^2$ are both finite. However, suppose that $X_1, \ldots, X_n$ are not normally distributed. Define
\[
  \overline{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i,
  \qquad
  S_n^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \overline{X}_n)^2
        = \frac{1}{n-1} \Bigl[ \sum_{i=1}^{n} X_i^2 - n\,(\overline{X}_n)^2 \Bigr].
\]
(a) Do we know for certain that $\overline{X}_n$ and $S_n^2$ are independent?
(b) Do we know for certain that $(n-1)S_n^2/\sigma^2$ has a $\chi^2_{n-1}$ distribution?
(c) Do we know for certain that $S_n^2$ is an unbiased estimator of $\sigma^2$?
(d) Do we know for certain that $S_n^2$ is the maximum likelihood estimator of $\sigma^2$?
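
Study aid: these four parts ask only for yes/no answers with brief justification, but it can help to test your intuition on one concrete non-normal example before committing. The sketch below (Python with NumPy; the Exponential(1) distribution is an arbitrary choice, with $\mu = 1$ and $\sigma^2 = 1$) looks at the joint behavior of $\overline{X}_n$ and $S_n^2$ and at the first two moments of $(n-1)S_n^2/\sigma^2$, which for a $\chi^2_{n-1}$ distribution would be $n-1$ and $2(n-1)$.

    # Sketch: probe Question 8 empirically for one non-normal case
    # (iid Exponential(1) data, so mu = 1 and sigma^2 = 1). This is only an
    # intuition aid; the exam answers need arguments, not simulations.
    import numpy as np

    n, reps = 10, 200_000
    rng = np.random.default_rng(2)
    x = rng.exponential(scale=1.0, size=(reps, n))

    xbar = x.mean(axis=1)
    s2 = x.var(axis=1, ddof=1)          # sample variance with the 1/(n-1) factor
    q = (n - 1) * s2 / 1.0              # (n-1) S_n^2 / sigma^2 with sigma^2 = 1

    print("corr(Xbar, S^2):", np.corrcoef(xbar, s2)[0, 1])
    print("mean of S^2    :", s2.mean(), " (true sigma^2 = 1)")
    print("mean, variance of (n-1)S^2/sigma^2:", q.mean(), q.var(),
          "| a chi^2_{n-1} law has mean", n - 1, "and variance", 2 * (n - 1))
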
9. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} \text{Beta}(1, \theta)$, where $\theta > 0$ is unknown.
Note: The Beta($1, \theta$) distribution has pdf
\[
  f_\theta(x) =
  \begin{cases}
    \theta (1-x)^{\theta-1} & \text{if } 0 < x < 1, \\
    0 & \text{otherwise.}
  \end{cases}
\]
Also,
\[
  E_\theta(X_1) = \frac{1}{1+\theta},
  \qquad
  \operatorname{Var}_\theta(X_1) = \frac{\theta}{(1+\theta)^2 (2+\theta)}.
\]
You may use these facts without proof.

(a) Find the maximum likelihood estimator $\hat\theta_n^{\text{MLE}}$ of $\theta$.
Note: Recall that any logarithm of a number between 0 and 1 is negative.
(b) Do we know for certain that $\hat\theta_n^{\text{MLE}}$ is an unbiased estimator of $\theta$?
(c) Let $\mu = 1/(1+\theta) = E_\theta(X_1)$, and define
\[
  \overline{X}_n = n^{-1} \sum_{i=1}^{n} X_i.
\]
Do we know for certain that $\overline{X}_n$ is an unbiased estimator of $\mu$?
(d) Define the estimator
\[
  \hat\theta_n = \frac{1}{\overline{X}_n} - 1.
\]
Do we know for certain that $\hat\theta_n$ is an unbiased estimator of $\theta$?
Note: A simple answer of Yes or No is good enough.
(e) Find the asymptotic distribution of $\hat\theta_n$.
Note: Your answer should be a formal probabilistic result involving convergence in distribution.
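
Study aid: the pdf and moment facts above are easy to check numerically, and the same simulation can be reused to examine the sampling behavior of the estimators in parts (a) through (e) once you have derived them. A sketch in Python with NumPy (the value of $\theta$ is arbitrary):

    # Sketch: check the stated Beta(1, theta) mean and variance by simulation.
    import numpy as np

    theta = 2.5
    rng = np.random.default_rng(3)
    x = rng.beta(1.0, theta, size=1_000_000)

    print("sample mean:", x.mean(),
          " vs 1/(1+theta):", 1.0 / (1.0 + theta))
    print("sample var :", x.var(),
          " vs theta/((1+theta)^2 (2+theta)):",
          theta / ((1.0 + theta) ** 2 * (2.0 + theta)))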


10. Let $X_1, X_2, \ldots$ be a sequence of Unif(0, 1) random variables. For each $n \ge 1$, let $Y_n$ have a Bin($m, x_n$) distribution conditional on $X_n = x_n$, where $m \ge 1$ is an integer.
(a) Find $E(Y_1)$ and $\operatorname{Var}(Y_1)$ (not conditional on $X_1$).
Note: The Unif(0, 1) distribution has mean $1/2$ and variance $1/12$, and the Bin($m, p$) distribution has mean $mp$ and variance $mp(1-p)$. You may use any of these facts without proof.
(b) For each $n \ge 1$, let $Z_n = \sum_{i=1}^{n} Y_i$. Find sequences of constants $b_n$ and $c_n$ such that $b_n(Z_n - c_n) \overset{D}{\to} N(0, 1)$.
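
Study aid: a quick two-stage simulation gives numerical values of $E(Y_1)$ and $\operatorname{Var}(Y_1)$ to compare against whatever you obtain in part (a), for instance via iterated expectation and variance. Sketch in Python with NumPy; the value $m = 7$ is arbitrary.

    # Sketch: Monte Carlo values of E(Y1) and Var(Y1) for Question 10(a).
    import numpy as np

    m, reps = 7, 1_000_000
    rng = np.random.default_rng(4)
    x = rng.uniform(0.0, 1.0, size=reps)   # X1 ~ Unif(0, 1)
    y = rng.binomial(m, x)                 # Y1 | X1 = x1  ~  Bin(m, x1)

    print("E(Y1)   estimate:", y.mean())
    print("Var(Y1) estimate:", y.var())
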
11. Let $X_1, X_2, \ldots$ be a sequence of random variables, where each $X_n$ has pdf
\[
  f^{(X_n)}(x) =
  \begin{cases}
    n \exp(-nx) & \text{if } x \ge 0, \\
    0 & \text{if } x < 0.
  \end{cases}
\]
Prove that $X_n \overset{P}{\to} 0$.
12. Let $X_1, X_2, \ldots$ be iid $N(\mu, \sigma^2)$ random variables, and let $\overline{X}_n$ and $S_n^2$ be the usual sample mean and sample variance (respectively) of the first $n$ observations, i.e.,
\[
  \overline{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i,
  \qquad
  S_n^2 = \frac{1}{n-1} \sum_{i=1}^{n} (X_i - \overline{X}_n)^2
        = \frac{1}{n-1} \sum_{i=1}^{n} X_i^2 - \frac{n}{n-1}\, (\overline{X}_n)^2.
\]
(a) Show that $S_n^2 \overset{P}{\to} \sigma^2$ as $n \to \infty$.
(b) Now suppose that $X_1, X_2, \ldots$ are iid with mean $\mu$ and variance $\sigma^2$ (both finite), but their distribution is not normal. What additional conditions (if any) are needed on this distribution for the result of part (a) to hold?
13. Let $X_1, \ldots, X_n$ be iid continuous random variables with pdf
\[
  f_\theta(x) =
  \begin{cases}
    2\theta x \exp(-\theta x^2) & \text{if } x \ge 0, \\
    0 & \text{if } x < 0,
  \end{cases}
\]
where $\theta > 0$ is unknown. Suppose we assign a Gamma($a, b$) prior to $\theta$, where $a > 0$ and $b > 0$ are known.
Note: The Gamma($a, b$) distribution has pdf
\[
  f(x) =
  \begin{cases}
    \dfrac{b^{a}}{\Gamma(a)}\, x^{a-1} \exp(-bx) & \text{if } x > 0, \\[4pt]
    0 & \text{if } x \le 0,
  \end{cases}
\]
and its mean is $a/b$. You may use these facts without proof.
(a) Find the posterior distribution of $\theta$.
(b) Find (or simply state) the posterior mean of $\theta$.


14. Let $X$ and $Y$ be discrete random variables with the following joint pmf:
\[
  f^{(X,Y)}(0, 0) = 0.1,
  \qquad
  f^{(X,Y)}(0, 1) = 0.4,
  \qquad
  f^{(X,Y)}(1, 0) = 0.3,
  \qquad
  f^{(X,Y)}(1, 1) = 0.2,
\]
with $f^{(X,Y)}(x, y) = 0$ for all other values of $x$ and $y$. Find $E(Y \mid X = 0)$.
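
Study aid: only the definitions of a marginal pmf, a conditional pmf, and a conditional expectation are needed here; in general form (with the specific table values left for you to plug in) they are
\[
  f^{(X)}(x) = \sum_{y} f^{(X,Y)}(x, y),
  \qquad
  f^{(Y \mid X)}(y \mid x) = \frac{f^{(X,Y)}(x, y)}{f^{(X)}(x)},
  \qquad
  E(Y \mid X = x) = \sum_{y} y\, f^{(Y \mid X)}(y \mid x).
\]
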
15. Let $X_1, \ldots, X_n$ be iid random variables with pdf
\[
  f_\theta(x) =
  \begin{cases}
    \sqrt{\dfrac{\theta}{2\pi x^3}}\; \exp\!\Bigl[ -\dfrac{\theta (x-1)^2}{2x} \Bigr] & \text{if } x > 0, \\[6pt]
    0 & \text{if } x \le 0,
  \end{cases}
\]
where $\theta > 0$ is unknown. Find the maximum likelihood estimator of $\theta$.


16. Let $X_1, \ldots, X_n$ be iid random variables with pdf
\[
  f_\theta(x) =
  \begin{cases}
    1 & \text{if } \theta < x < \theta + 1, \\
    0 & \text{otherwise,}
  \end{cases}
\]
where $\theta \in \mathbb{R}$ is unknown. Show that a maximum likelihood estimator of $\theta$ exists but is not unique.
17. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} \text{Poisson}(\theta)$ conditional on $\theta$, and let the prior on $\theta$ be Gamma($a, b$).
Note: The Poisson($\theta$) distribution has pmf
\[
  f_\theta(x) = \frac{\theta^{x} \exp(-\theta)}{x!}
  \quad \text{for } x \in \{0, 1, 2, \ldots\} \quad (\text{zero for all other } x),
\]
with mean $\theta$ and variance $\theta$. The Gamma($a, b$) distribution has pdf
\[
  f(x) = \frac{b^{a}}{\Gamma(a)}\, x^{a-1} \exp(-bx)
  \quad \text{for } x > 0 \quad (\text{zero for } x \le 0),
\]
with mean $a/b$ and variance $a/b^2$.

(a) Find the posterior distribution of $\theta$.
(b) Find (or simply state) the posterior mean of $\theta$.
18. Let $X_1, \ldots, X_n$ be iid random variables with pdf
\[
  f_\theta(x) =
  \begin{cases}
    2\theta x \exp(-\theta x^2) & \text{if } x \ge 0, \\
    0 & \text{if } x < 0,
  \end{cases}
\]
where $\theta > 0$ is unknown.
(a) Find the maximum likelihood estimator of $\theta$.
(b) Now suppose that instead of $\theta > 0$, we take the parameter space to be $\{1, 2\}$, i.e., it is known with certainty that either $\theta = 1$ or $\theta = 2$. Find the maximum likelihood estimator of $\theta$ under this new restriction.
19. An incorrect result and its incorrect proof are shown below.
(Incorrect) Result: Student's t distribution with one degree of freedom is a discrete distribution that takes values $+1$ and $-1$ with probability $1/2$ each.
(Incorrect) Proof: Let $Z \sim N(0, 1)$. Then $Z^2$ has a chi-squared distribution with one degree of freedom, and hence $T = Z/\sqrt{Z^2}$ has a Student's t distribution with one degree of freedom. However, $T = Z/\sqrt{Z^2} = Z/|Z|$, which is either $+1$ or $-1$ according to whether $Z > 0$ or $Z < 0$, each of which occurs with probability $1/2$.
State (in one or two sentences) why this proof of this result is incorrect.
20. Let $X$ be a single discrete random variable with pmf
\[
  f_\theta(0) = \frac{1-\theta}{2},
  \qquad
  f_\theta(1) = \frac{1}{2},
  \qquad
  f_\theta(2) = \frac{\theta}{2},
  \qquad
  f_\theta(x) = 0 \ \text{ for all } x \notin \{0, 1, 2\},
\]
where $\theta$ is unknown and $0 \le \theta \le 1$. A maximum likelihood estimator of $\theta$ is
\[
  \hat\theta(X) =
  \begin{cases}
    0 & \text{if } X = 0, \\
    1 & \text{if } X > 0.
  \end{cases}
\]
(You do not need to show this.)
(a) Find the bias of $\hat\theta$ (as an estimator of $\theta$).
(b) Let $\Theta = [0, 1] = \{\theta \in \mathbb{R} : 0 \le \theta \le 1\}$ denote the parameter space. Show that for every unbiased estimator $\delta$ of $\theta$, there exists $x \in \{0, 1, 2\}$ such that $\delta(x) \notin \Theta$. (The value of $x$ need not be the same for all unbiased estimators.)
21. Let $X_1, \ldots, X_n \overset{\text{iid}}{\sim} N(0, \sigma^2)$, where $\sigma^2 > 0$ is unknown. Suppose our prior pdf for $\sigma^2$ is
\[
  \pi(\sigma^2) =
  \begin{cases}
    \dfrac{b^{a}}{\Gamma(a)} \Bigl( \dfrac{1}{\sigma^2} \Bigr)^{a+1} \exp\!\Bigl( -\dfrac{b}{\sigma^2} \Bigr) & \text{if } \sigma^2 > 0, \\[6pt]
    0 & \text{if } \sigma^2 \le 0,
  \end{cases}
\]
where $a > 0$ and $b > 0$.
Note: This is called the Inverse-Gamma($a, b$) distribution. Its mean is $b/(a-1)$ if $a > 1$ (and $\infty$ if $a \le 1$). Its mode is $b/(a+1)$ (regardless of the value of $a$). You may use these facts without proof.
(a) Find the posterior distribution of $\sigma^2$.
(b) Find the posterior mean of $\sigma^2$. (Be sure that your answer is correct for all possible values of $a > 0$, $b > 0$, and $n \ge 1$.)
(c) Find the posterior mode of $\sigma^2$.
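
Study aid: if you want a numerical check of the Inverse-Gamma facts quoted in the note, one convenient route uses the change-of-variables relationship (worth verifying yourself) that the reciprocal of a Gamma($a, b$) random variable has the Inverse-Gamma($a, b$) density above. A sketch in Python with NumPy:

    # Sketch: check the Inverse-Gamma(a, b) mean quoted in the note.
    # Assumes (verify via a change of variables) that 1/Y ~ InverseGamma(a, b)
    # when Y ~ Gamma(a, rate b), i.e., Gamma with shape a and scale 1/b.
    import numpy as np

    a, b = 3.0, 2.0                                        # a > 1, so the mean exists
    rng = np.random.default_rng(5)
    y = rng.gamma(shape=a, scale=1.0 / b, size=1_000_000)  # Gamma(a, rate b) draws
    x = 1.0 / y                                            # Inverse-Gamma(a, b) draws

    print("sample mean:", x.mean(), " vs b/(a-1):", b / (a - 1))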
