Bilkent University

EEE-539 DETECTION AND ESTIMATION THEORY

Fall 2013 - Final Exam

January 6, 2013

Duration: 3 hours

Surname
Name
ID #
Signature

Question-1 (25 pts)
Question-2 (25 pts)
Question-3 (25 pts)
Question-4 (25 pts)
TOTAL (100 pts)

1) Let Y = Θ + N , where N is a uniform random variable between −0.5 and 0.5, and Θ is an unknown
amplitude. Given Y = y, consider the following hypothesis-testing problem:

H0 : Θ = 0 (1)
H1 : Θ ∈ (0, 1] (2)

where the aim is to design an α-level test.


Does a uniformly most powerful (UMP) test exist? If so, express its threshold as a function of α, and
express and plot its power as a function of Θ.
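A quick numerical sanity check for this model: the Python sketch below assumes, for illustration only, a one-sided threshold test that decides H1 when y exceeds a threshold τ chosen so that the false-alarm probability under H0 equals α (under H0, Y = N is uniform on [−0.5, 0.5]), and it estimates the resulting power by Monte Carlo as a function of Θ. The particular test and the value of α are assumptions, not the answer the question asks for.

import numpy as np

# Illustrative Monte Carlo sketch for Question 1 (assumed threshold test, not the derived answer).
alpha = 0.1
tau = 0.5 - alpha          # threshold so that P(Y > tau | H0) = alpha when Y ~ Uniform(-0.5, 0.5)
rng = np.random.default_rng(0)
n_trials = 100_000

for theta in np.linspace(0.0, 1.0, 21):
    y = theta + rng.uniform(-0.5, 0.5, size=n_trials)   # Y = Theta + N
    power = np.mean(y > tau)                             # estimated detection probability
    print(f"Theta = {theta:.2f}  estimated power = {power:.3f}")

At Θ = 0 the printed value should be close to α, which serves as a check of the false-alarm constraint.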

2) Consider n independent and identically distributed (i.i.d.) observations Y1 , . . . , Yn , each with uniform
distribution over the interval (0, θ). In addition, the prior distribution of the unknown parameter Θ is
specified by the following probability density function:
w(θ) = (β/α) (α/θ)^(β+1)
for θ > α, where α and β are known positive parameters. Obtain the MMSE, MMAE, and MAP estimators
for θ based on Y1 , . . . , Yn .
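For a numerical cross-check of candidate estimators, the posterior density of Θ can be evaluated on a grid and its mean (MMSE), median (MMAE), and mode (MAP) read off directly. The Python sketch below does this with made-up values of α, β, and n and with simulated data; the grid-based computation is an illustration under those assumptions, not the analytical derivation the question asks for.

import numpy as np

# Illustrative grid-based posterior for Question 2 (assumed parameter values).
alpha, beta, n = 1.0, 2.0, 10
rng = np.random.default_rng(1)
theta_true = 3.0
y = rng.uniform(0.0, theta_true, size=n)                 # Y_i ~ Uniform(0, theta)

theta_grid = np.linspace(0.01, 20.0, 200_000)
dtheta = theta_grid[1] - theta_grid[0]
likelihood = np.where(theta_grid > y.max(), theta_grid ** (-n), 0.0)   # uniform(0, theta) likelihood
prior = np.where(theta_grid > alpha,
                 (beta / alpha) * (alpha / theta_grid) ** (beta + 1), 0.0)
post = likelihood * prior
post /= post.sum() * dtheta                              # numerical normalization

mmse = np.sum(theta_grid * post) * dtheta                # posterior mean
cdf = np.cumsum(post) * dtheta
mmae = theta_grid[np.searchsorted(cdf, 0.5)]             # posterior median
map_est = theta_grid[np.argmax(post)]                    # posterior mode
print("MMSE:", mmse, " MMAE:", mmae, " MAP:", map_est)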

3) This question has two independent parts.


a) Consider n independent and identically distributed (i.i.d.) observations Y1 , . . . , Yn , each with the
following probability density function:
p_θ(y) = (θ/µ) (µ/y)^(θ+1)
for y > µ, where µ is a known positive parameter, and θ is a positive unknown parameter.
i) Obtain the maximum likelihood estimator (MLE) for θ based on Y1 , . . . , Yn .
ii) Obtain the Cramer-Rao lower bound for estimating θ based on Y1 , . . . , Yn .
iii) Find a minimal sufficient statistic for estimating θ. (Please provide arguments to justify that it is in
fact a minimal sufficient statistic.)
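A grid search over the log-likelihood gives a quick numerical cross-check for part (i) without asserting a closed-form answer. The parameter values, sample size, and inverse-CDF sampling step in the Python sketch below are assumptions chosen for illustration.

import numpy as np

# Illustrative grid-search MLE for Question 3(a), part (i).
mu, theta_true, n = 2.0, 1.5, 200
rng = np.random.default_rng(2)
# Inverse-CDF sampling: F(y) = 1 - (mu/y)**theta for y > mu, so Y = mu * U**(-1/theta).
y = mu * rng.uniform(size=n) ** (-1.0 / theta_true)

def log_likelihood(theta):
    # sum_i log p_theta(y_i) with p_theta(y) = (theta/mu) * (mu/y)**(theta+1), y > mu
    return np.sum(np.log(theta / mu) + (theta + 1.0) * np.log(mu / y))

theta_grid = np.linspace(0.05, 5.0, 10_000)
ll = np.array([log_likelihood(t) for t in theta_grid])
print("grid-search MLE:", theta_grid[np.argmax(ll)])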

b) Consider n independent and identically distributed (i.i.d.) observations Y1 , . . . , Yn , each with the
following probability density function:
p_θ(y) = λ e^(−λ(y−θ)) for y > θ ≥ 0, and p_θ(y) = 0 otherwise,
where λ > 0 is a known parameter.
i) Find a minimal sufficient statistic for estimating θ based on Y1 , . . . , Yn . (Please provide arguments
to justify that it is in fact a minimal sufficient statistic.)
ii) For this part, assume that n = 2. Find a function of the minimal sufficient statistic in Part (i) that
is an unbiased estimator for θ.
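A Monte Carlo bias check is one way to test a candidate answer to part (ii). The estimator used in the Python sketch below, min(Y1, Y2) − 1/(2λ), is a hypothetical candidate built from the sample minimum and included only for illustration; the simulation estimates its bias empirically and does not by itself prove unbiasedness.

import numpy as np

# Illustrative bias check for Question 3(b), part (ii), with assumed lambda and theta.
lam, theta = 1.5, 0.7
rng = np.random.default_rng(3)
n_trials = 200_000

# Y_i = theta + Exponential(lam), so p_theta(y) = lam * exp(-lam*(y - theta)) for y > theta.
y = theta + rng.exponential(scale=1.0 / lam, size=(n_trials, 2))
estimates = y.min(axis=1) - 1.0 / (2.0 * lam)            # hypothetical candidate estimator
print("empirical mean of estimator:", estimates.mean(), " true theta:", theta)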

4) Consider a scalar observation that is modeled as Y = 2Θ + N, where Θ is uniformly distributed over the closed interval [0, 1], and N is a discrete random variable with the following probability mass function:

p_N(1) = 0.25,   p_N(−1) = 0.5,   p_N(−3) = 0.25
Assume that Θ and N are independent.
The aim is to obtain the optimal quadratic MMSE estimator for estimating Θ based on Y . In other
words, among all the estimators of the form θ̂(y) = a y^2 + b y + c, obtain the one that achieves the
minimum MSE for the model specified above.
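Since Θ is uniform over [0, 1] and N takes only three values, the moments that determine the quadratic MMSE coefficients can be estimated by simulation and the normal equations solved numerically. The Python sketch below is a cross-check of that kind, under simulated data, and is not a substitute for the analytical solution requested.

import numpy as np

# Illustrative Monte Carlo fit of theta_hat(y) = a*y**2 + b*y + c in the MSE sense (Question 4).
rng = np.random.default_rng(4)
n_samples = 1_000_000

theta = rng.uniform(0.0, 1.0, size=n_samples)                    # Theta ~ Uniform[0, 1]
noise = rng.choice([1.0, -1.0, -3.0], size=n_samples, p=[0.25, 0.5, 0.25])  # N with the given pmf
y = 2.0 * theta + noise                                          # Y = 2*Theta + N

# Normal equations for the quadratic estimator, using features [y^2, y, 1].
features = np.stack([y ** 2, y, np.ones_like(y)], axis=1)
gram = features.T @ features / n_samples                         # estimates E[f f^T]
cross = features.T @ theta / n_samples                           # estimates E[f * Theta]
a, b, c = np.linalg.solve(gram, cross)
print("estimated (a, b, c):", a, b, c)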
