
EE5112: Detection Theory

Problem Set 4

1. (Poor II.F.13) Consider the following Bayes decision problem. The conditional density of
the real observation Y given the real parameter Θ = θ is given by
p_\theta(y) =
\begin{cases}
\theta e^{-\theta y}, & y \ge 0 \\
0, & y < 0.
\end{cases}

Θ is a random variable with density


w(\theta) =
\begin{cases}
\alpha e^{-\alpha\theta}, & \theta \ge 0 \\
0, & \theta < 0
\end{cases}

where α > 0. Find the Bayes rule and minimum Bayes risk for the hypotheses

H_0 : \Theta \in (0, \beta) \triangleq \Lambda_0
versus
H_1 : \Theta \in (\beta, \infty) \triangleq \Lambda_1

where β > 0 is fixed. Assume the cost structure


C[i, \theta] =
\begin{cases}
1, & \theta \notin \Lambda_i \\
0, & \theta \in \Lambda_i.
\end{cases}
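As an optional numerical aid, the short Python sketch below simulates the prior-likelihood pair above and estimates the posterior tail probability P(\Theta > \beta \mid Y \approx y_0) by Monte Carlo, comparing it against the Gamma(2, \alpha + y) posterior that the stated model implies; the values \alpha = 1, \beta = 1, and y_0 = 0.5 are illustrative assumptions, not part of the problem.

    import numpy as np

    rng = np.random.default_rng(0)
    alpha, beta = 1.0, 1.0        # assumed prior rate and threshold (illustrative only)
    n_draws = 2_000_000

    theta = rng.exponential(scale=1.0 / alpha, size=n_draws)  # Theta ~ w(theta)
    y = rng.exponential(scale=1.0 / theta)                    # Y | Theta ~ theta * exp(-theta * y)

    # Monte Carlo estimate of P(Theta > beta | Y close to y0)
    y0, half_width = 0.5, 0.02
    in_bin = np.abs(y - y0) < half_width
    print("Monte Carlo posterior tail :", np.mean(theta[in_bin] > beta))

    # Closed-form check: Theta | Y = y0 is Gamma(2, alpha + y0), whose tail at beta
    # equals (1 + (alpha + y0) * beta) * exp(-(alpha + y0) * beta).
    lam = alpha + y0
    print("Gamma(2, alpha + y0) tail  :", (1 + lam * beta) * np.exp(-lam * beta))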

2. (Poor II.F.15) Consider the hypothesis testing problem:


H_0 : Y has density p_0(y) = \tfrac{1}{2} e^{-|y|}, \quad y \in \mathbb{R},
versus
H_1 : Y has density p_\theta(y) = \tfrac{1}{2} e^{-|y-\theta|}, \quad y \in \mathbb{R}, \ \theta > 0.
(a) Describe the locally most powerful α-level test and derive its power function.
(b) Does a uniformly most powerful test exist? If so, find it and derive its power function.
If not, find the generalized likelihood ratio test for H0 versus H1 .
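A small optional sketch for building intuition: it tabulates the likelihood ratio p_\theta(y)/p_0(y) = \exp(|y| - |y - \theta|) implied by the two densities for a couple of assumed values of \theta, which can help when thinking through parts (a) and (b).

    import numpy as np

    def likelihood_ratio(y, theta):
        # p_theta(y) / p_0(y) for the double-exponential (Laplacian) shift model
        return np.exp(np.abs(y) - np.abs(y - theta))

    y_grid = np.linspace(-3.0, 5.0, 9)
    for theta in (1.0, 2.0):                     # illustrative (assumed) values of theta
        print(f"theta = {theta}:", np.round(likelihood_ratio(y_grid, theta), 3))
    # The printed values are constant for y <= 0, increase on (0, theta),
    # and are constant again for y >= theta.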

3. (Poor III.F.2) Suppose the random observation vector Y is given by

Y_k = N_k + \theta s_k, \quad k = 1, 2, \ldots, n,

where N is a zero-mean Gaussian random vector with E(N_k N_l) = \sigma^2 \rho^{|k-l|} for all 1 \le k, l \le n, |\rho| < 1, and s is a known signal vector.
(a) Show that the test
\delta(y) =
\begin{cases}
1, & \text{if } \sum_{k=1}^{n} b_k z_k \ge \tau' \\
0, & \text{if } \sum_{k=1}^{n} b_k z_k < \tau'
\end{cases}

is equivalent to the likelihood ratio test for θ = 0 versus θ = 1, where


b_1 = s_1 / \sigma
b_k = (s_k - \rho s_{k-1}) / \left(\sigma \sqrt{1 - \rho^2}\right), \quad k = 2, 3, \ldots, n
z_1 = y_1 / \sigma
z_k = (y_k - \rho y_{k-1}) / \left(\sigma \sqrt{1 - \rho^2}\right), \quad k = 2, 3, \ldots, n.
(b) Find the ROCs of the detector from (a) as a function of θ/σ, ρ, n, and the false alarm
probability α.
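An optional simulation sketch of part (a): it generates noise with the stated covariance \sigma^2 \rho^{|k-l|} via an AR(1) recursion, computes b_k and z_k exactly as defined above, and compares the statistic \sum_k b_k z_k under \theta = 0 and \theta = 1. The signal s and the values of n, \sigma, and \rho are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    n, sigma, rho = 16, 1.0, 0.6          # assumed illustrative values
    s = np.ones(n)                        # assumed known signal vector

    def ar1_noise():
        # Zero-mean Gaussian noise with covariance sigma^2 * rho^{|k-l|}
        w = rng.standard_normal(n)
        noise = np.empty(n)
        noise[0] = sigma * w[0]
        for k in range(1, n):
            noise[k] = rho * noise[k - 1] + sigma * np.sqrt(1 - rho ** 2) * w[k]
        return noise

    def statistic(y):
        # The whitened correlator sum_k b_k z_k from part (a)
        b = np.empty(n); z = np.empty(n)
        b[0], z[0] = s[0] / sigma, y[0] / sigma
        b[1:] = (s[1:] - rho * s[:-1]) / (sigma * np.sqrt(1 - rho ** 2))
        z[1:] = (y[1:] - rho * y[:-1]) / (sigma * np.sqrt(1 - rho ** 2))
        return float(b @ z)

    trials = 5000
    t0 = [statistic(ar1_noise()) for _ in range(trials)]       # theta = 0
    t1 = [statistic(ar1_noise() + s) for _ in range(trials)]   # theta = 1
    print("mean statistic, theta = 0:", np.mean(t0))
    print("mean statistic, theta = 1:", np.mean(t1))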
4. (Poor III.F.3) Consider the M-ary decision problem (\Gamma = \mathbb{R}^n)

H_0 : Y = N + s_0
H_1 : Y = N + s_1
\vdots
H_{M-1} : Y = N + s_{M-1},

where s_0, s_1, \ldots, s_{M-1} are known signals with equal energies, \|s_0\|^2 = \|s_1\|^2 = \cdots = \|s_{M-1}\|^2.
(a) Assuming N \sim \mathcal{N}(0, \sigma^2 I), find the decision rule achieving minimum error probability when all hypotheses are equally likely.
(b) Assuming further that the signals are orthogonal, show that the minimum error
probability is given by
P_e = 1 - \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} [\Phi(x)]^{M-1} e^{-\frac{(x-d)^2}{2}} \, dx

where d^2 = \|s_0\|^2 / \sigma^2.
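An optional numerical sketch that evaluates the integral in part (b) for a few assumed values of M and d = \|s_0\|/\sigma; it is handy as a sanity check once the expression has been derived (requires NumPy and SciPy).

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    def pe_orthogonal(M, d):
        # Evaluates 1 - (1/sqrt(2*pi)) * integral of Phi(x)^(M-1) * exp(-(x-d)^2 / 2) dx
        integrand = lambda x: norm.cdf(x) ** (M - 1) * np.exp(-(x - d) ** 2 / 2) / np.sqrt(2 * np.pi)
        value, _ = quad(integrand, -np.inf, np.inf)
        return 1.0 - value

    for M in (2, 4, 8):              # assumed alphabet sizes
        for d in (1.0, 2.0, 3.0):    # assumed values of d = ||s_0|| / sigma
            print(f"M = {M}, d = {d}: Pe ~ {pe_orthogonal(M, d):.4f}")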
5. (Poor III.F.4) Consider the following three hypotheses about a sequence Y_1, Y_2, \ldots, Y_n of real observations.
H_0 : Y_k = N_k - s_k, \quad k = 1, 2, \ldots, n,
H_1 : Y_k = N_k, \quad k = 1, 2, \ldots, n,
and
H_2 : Y_k = N_k + s_k, \quad k = 1, 2, \ldots, n;
where s_1, s_2, \ldots, s_n is a known signal sequence, and N_1, N_2, \ldots, N_n is a sequence of i.i.d. N(0, 1) random variables.
(a) Assuming that these three hypotheses are equally likely, find the decision rule minimizing the average probability of error in deciding among the three hypotheses.
(b) Again assuming equally likely hypotheses, calculate the minimum average error probability for deciding among these hypotheses.
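An optional Monte Carlo sketch for part (b): assuming the rule found in (a) turns out to be a nearest-mean (minimum-distance) rule, as is standard for equally likely Gaussian hypotheses, the code below estimates its average error probability. The signal sequence and the number of trials are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 8
    s = 0.5 * np.ones(n)                       # assumed known signal sequence
    means = np.stack([-s, np.zeros(n), s])     # mean of Y under H0, H1, H2

    trials = 100_000
    errors = 0
    for _ in range(trials):
        j = rng.integers(3)                            # equally likely hypotheses
        y = means[j] + rng.standard_normal(n)          # i.i.d. N(0, 1) noise
        dists = np.sum((y - means) ** 2, axis=1)       # squared distance to each mean
        errors += int(np.argmin(dists) != j)           # nearest-mean decision
    print("Monte Carlo average error probability ~", errors / trials)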
