
Detection and Estimation Theory

University of Tehran
Instructor: Dr. Ali Olfat
Spring 2020
Homework 4
Due: 99/2/1

Problem 1
Consider the model
Yk = θ^(1/2) sk Rk + Nk ,   k = 1, 2, ..., n
where s1 , s2 , . . . , sn is a known signal sequence, θ ≥ 0 is a constant, and R1 , R2 , . . . , Rn , N1 , N2 , . . . , Nn
are i.i.d. N (0, 1) random variables.
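A minimal simulation sketch of this observation model may help in checking detector behavior numerically; the particular signal sequence, the value A = 2, and the sample size below are illustrative assumptions, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def observe(theta, s, rng):
    """One realization of Y_k = theta^(1/2) * s_k * R_k + N_k, k = 1..n."""
    n = len(s)
    R = rng.standard_normal(n)  # i.i.d. N(0,1) multiplicative noise
    N = rng.standard_normal(n)  # i.i.d. N(0,1) additive noise
    return np.sqrt(theta) * s * R + N

s = np.linspace(1.0, 2.0, 10)   # assumed known signal sequence (illustrative)
y_h0 = observe(0.0, s, rng)     # H0: theta = 0
y_h1 = observe(2.0, s, rng)     # H1: theta = A, with A = 2 for illustration
```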

(a) Consider the hypothesis pair

H0 :θ = 0
H1 :θ = A

where A is a known positive constant. Describe the structure of the Neyman-Pearson detector.

(b) Consider now the hypothesis pair

H0 :θ = 0
H1 :θ > 0

Under what conditions on s1 , s2 , . . . , sn does a UMP test exist?

(c) For the hypothesis pair of part (b), with s1 , s2 , . . . , sn general, find the locally most powerful detector.

Problem 2
Suppose we have observations Yk = Nk + θSk , k = 1, 2, ..., n, where N ∼ N (0, I) and where S1 , S2 , ..., Sn are i.i.d. random variables, independent of N and each taking on the values +1 and −1 with equal probabilities of 1/2.
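For numerical experimentation with this model, a short simulation sketch follows; the values of A and n are illustrative assumptions, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

def observe(theta, n, rng):
    """One realization of Y_k = N_k + theta * S_k with S_k = +/-1 equiprobable."""
    S = rng.choice([-1.0, 1.0], size=n)
    N = rng.standard_normal(n)  # N ~ N(0, I)
    return N + theta * S

y_h0 = observe(0.0, 5, rng)   # H0: theta = 0
y_h1 = observe(1.5, 5, rng)   # H1: theta = A, with A = 1.5 for illustration
```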

(a) Find the likelihood ratio for testing H0 : θ = 0 versus H1 : θ = A, where A is a known constant.

(b) For the case n = 1, find the Neyman-Pearson rule and corresponding detection
probability for false alarm probability α ∈ (0, 1), for hypotheses of part a.

(c) Is there a UMP test of H0 : θ = 0 versus H1 : θ ≠ 0 in this model? If so, why and what is it? If not, why not? Consider the cases n = 1 and n > 1 separately.

Problem 3
The distribution of ri on the two hypotheses is (the ri are independent under both hypotheses)

ri | Hk ∼ N (mk , σk^2 ),   i = 1, 2, ..., N and k = 0, 1

(a) Find the LRT. Express the test in terms of the following quantities:

Iα = Σ_{i=1}^{N} ri ,   Iβ = Σ_{i=1}^{N} ri^2
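Computing the two statistics from a sample is straightforward; the distribution parameters and sample size in the sketch below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative draw under H1 with m1 = 1, sigma1 = 2, and N = 20 samples (assumed values)
r = rng.normal(loc=1.0, scale=2.0, size=20)

I_alpha = r.sum()         # I_alpha: sum of the observations
I_beta = np.sum(r ** 2)   # I_beta: sum of the squared observations
```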

(b) Draw the decision regions in the (Iα , Iβ)-plane for the case in which

2m0 = m1 > 0 ,   2σ1 = σ0

(c) For the special case m0 = 0 and σ1 = σ0 , compute the ROC.


Problem 4
Consider the following ternary hypothesis testing problem with two-dimensional observation Y = (Y1 , Y2 )^T :

H0 : Y = N, H1 : Y = s + N, H2 : Y = −s + N

where s = (1/√2)(1, 1)^T and the noise vector N is Gaussian N (0, Σ) with covariance matrix

Σ = [  1    1/4 ]
    [ 1/4    1  ]

(a) Assuming that all hypotheses are equally probable, show that the minimum error probability rule can be written as

δ∗(y) =  1 ,  s^T Σ^{-1} y ≥ η
         0 ,  −η < s^T Σ^{-1} y < η
         2 ,  s^T Σ^{-1} y ≤ −η
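A minimal numerical sketch of this threshold rule is given below; the function name, the test point, and the threshold value are illustrative, and η is left as a parameter.

```python
import numpy as np

s = np.array([1.0, 1.0]) / np.sqrt(2)
Sigma = np.array([[1.0, 0.25],
                  [0.25, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def delta_star(y, eta):
    """Decide among H0, H1, H2 by thresholding the statistic t = s^T Sigma^{-1} y."""
    t = s @ Sigma_inv @ y
    if t >= eta:
        return 1      # decide H1 (signal +s)
    if t <= -eta:
        return 2      # decide H2 (signal -s)
    return 0          # decide H0 (noise only)

print(delta_star(np.array([0.9, 0.7]), eta=0.5))  # illustrative test point and threshold
```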

(b) Specify the value of η that minimizes the error probability and find the minimum
error probability.

(c) Assuming now that we are free to choose the signal s subject to the constraint
||s||^2 ≤ 1, comment on whether the preceding error probability can be improved
upon.

Problem 5
Consider the M-ary decision problem (Γ = R^n):

H0 : Y = N + s0
H1 : Y = N + s1
...
HM−1 : Y = N + sM−1

where s0 , s1 , ..., sM−1 are known signals with equal energies, ||s0||^2 = ||s1||^2 = ... = ||sM−1||^2 .

(a) Assuming N ∼ N (0, σ^2 I), find the decision rule achieving minimum error probability when all hypotheses are equally likely.


(b) Assuming further that the signals are orthogonal, show that the minimum error
probability is given by:
pe = 1 − (1/√(2π)) ∫_{−∞}^{∞} [Φ(x)]^(M−1) e^(−(x−d)^2/2) dx

where d^2 = ||s0||^2 / σ^2 .
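A numerical sanity check of this expression can be done by direct quadrature; the values of M and d in the sketch below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

def p_error(M, d):
    """Evaluate P_e = 1 - integral of Phi(x)^(M-1) * phi(x - d) over the real line."""
    integrand = lambda x: norm.cdf(x) ** (M - 1) * norm.pdf(x - d)
    val, _ = quad(integrand, -np.inf, np.inf)
    return 1.0 - val

print(p_error(M=4, d=2.0))  # illustrative values of M and d
```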
