
LECTURE 3-2: DECISION THEORY
DR. TRINH VAN CHIEN
CONTENT
▪ Signal space representation
▪ AWGN channel
▪ Receiver roles
▪ Orthonormal basis formulation
▪ Gram-Schmidt algorithm
▪ Signal space based on an orthonormal basis (vector representation)
▪ ML and MAP criteria
▪ Received signals and noise at the receiver side
▪ Decision based on the vector formulation
▪ Detection with the MAP criterion
▪ Detection with the ML criterion
▪ Voronoi region
CHANNEL TRANSMISSION

White Gaussian noise n(t):
• ergodic random process
• each random variable is a Gaussian random variable with zero mean
• constant power spectral density $G_n(f) = N_0/2$
AWGN (1)

$$G_n(f) = \frac{N_0}{2} \quad\Longleftrightarrow\quad R_n(\tau) = \frac{N_0}{2}\,\delta(\tau)$$
AWGN (2)

$$R_n(\tau) = \frac{N_0}{2}\,\delta(\tau) \quad\Longrightarrow\quad E\big[n(t_1)\,n(t_1+\tau)\big] = \frac{N_0}{2}\,\delta(\tau)$$

n(t) is an ergodic process (temporal properties = statistical properties)
AWGN (3)

Fix two different time instants $t_1$ and $t_2$. The random variables

$$t_1 \longrightarrow n(t_1), \qquad t_2 \longrightarrow n(t_2)$$

are Gaussian random variables with

$$E[n(t_1)\,n(t_2)] = \frac{N_0}{2}\,\delta(t_1 - t_2)$$

so they are statistically independent.
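These statistics are easy to reproduce numerically. Below is a minimal Python sketch (not part of the lecture; the names N0 and Ts are illustrative): in discrete time the Dirac delta in $R_n(\tau)$ becomes a per-sample variance of $N_0/(2T_s)$, and the empirical autocorrelation vanishes at nonzero lags, matching the independence of $n(t_1)$ and $n(t_2)$.

```python
# Minimal sketch (assumed parameters): a discrete-time stand-in for white
# Gaussian noise with two-sided PSD N0/2. With sampling interval Ts, the
# continuous-time delta in Rn(tau) becomes a Kronecker delta of height
# N0/(2*Ts), so the samples are i.i.d. Gaussian with that variance.
import numpy as np

rng = np.random.default_rng(0)
N0, Ts, n_samples = 2.0, 1e-3, 200_000

sigma2 = N0 / (2 * Ts)                  # per-sample variance
n = rng.normal(0.0, np.sqrt(sigma2), n_samples)

# Empirical autocorrelation at a few lags: ~sigma2 at lag 0, ~0 elsewhere,
# matching Rn(tau) = (N0/2) * delta(tau).
for lag in range(4):
    r_hat = np.mean(n[:n_samples - lag] * n[lag:])
    expected = sigma2 if lag == 0 else 0.0
    print(f"lag {lag}: {r_hat:.3f}   (expected {expected:.3f})")
```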
PROBLEM AT THE RECEIVER SIDE (1)

$$u_T \longrightarrow s(t) \longrightarrow r(t) = s(t) + n(t)$$

PROBLEM: given r(t), recover s(t).

Divide r(t) into segments of duration T:

$$r(t) = \big(r[0](t)\;\big|\;r[1](t)\;\big|\;\ldots\;\big|\;r[n](t)\;\big|\;\ldots\big)$$
PROBLEM AT THE RECEIVER SIDE (2)

Is it possible to analyze any single interval independently?

Since $r(t) = s(t) + n(t)$, we have

$$r(t) = \big(r[0](t)\;\big|\;r[1](t)\;\big|\;\ldots\;\big|\;r[n](t)\;\big|\;\ldots\big)$$
$$s(t) = \big(s[0](t)\;\big|\;s[1](t)\;\big|\;\ldots\;\big|\;s[n](t)\;\big|\;\ldots\big)$$
$$n(t) = \big(n[0](t)\;\big|\;n[1](t)\;\big|\;\ldots\;\big|\;n[n](t)\;\big|\;\ldots\big)$$
PROBLEM AT THE RECEIVER SIDE (3)

Consider the n-th interval $nT \le t < (n+1)T$:

$$r[n](t) = s[n](t) + n[n](t)$$

Each r[n](t) certainly depends on
• the corresponding transmitted signal s[n](t)
• the noise random variables extracted at $nT \le t < (n+1)T$
PROBLEM AT THE RECEIVER SIDE (4)

$$s(t) = \big(s[0](t)\;\big|\;s[1](t)\;\big|\;\ldots\;\big|\;s[m](t)\;\big|\;\ldots\;\big|\;s[n](t)\;\big|\;\ldots\big)$$

Each transmitted signal s[n](t)
• has finite duration T
• is statistically independent of any other transmitted signal s[m](t), m ≠ n

→ r[n](t) is independent of s[m](t), m ≠ n
PROBLEM AT THE RECEIVER SIDE (5)

$$n(t) = \big(n[0](t)\;\big|\;n[1](t)\;\big|\;\ldots\;\big|\;n[m](t)\;\big|\;\ldots\;\big|\;n[n](t)\;\big|\;\ldots\big)$$

Each random variable $n(t_i)$ is statistically independent of the others

→ r[n](t) is independent of n[m](t), m ≠ n
PROBLEM AT THE RECEIVER SIDE (6)

Each r[n](t) only depends on
• the corresponding transmitted signal s[n](t)
• the noise random variables extracted at $nT \le t < (n+1)T$

Each interval can be independently analyzed:

NO INTERSYMBOL INTERFERENCE (ISI)

$$r(t) = \big(r[0](t)\;\big|\;r[1](t)\;\big|\;\ldots\;\big|\;r[n](t)\;\big|\;\ldots\big)$$
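As a small illustration (assuming a sampled receiver with hypothetical parameters Ts and T, not given in the slides), the segmentation of r(t) into independent intervals is just a reshape of the sample stream:

```python
# Minimal sketch: splitting a sampled received signal into symbol
# intervals of duration T, as in r(t) = (r[0](t) | r[1](t) | ...).
# Parameter values are illustrative.
import numpy as np

Ts = 1e-4                      # sampling interval (assumed)
T = 1e-2                       # symbol duration (assumed)
samples_per_symbol = int(T / Ts)

r_samples = np.random.default_rng(1).normal(size=10 * samples_per_symbol)

# Each row is one interval r[n](t); with no ISI, every row can be
# processed on its own.
r_intervals = r_samples.reshape(-1, samples_per_symbol)
print(r_intervals.shape)       # (10, 100)
```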
PROBLEM AT THE RECEIVER SIDE (7)

Each interval can be independently analyzed.

Let us focus on the first one, for $0 \le t \le T$:

$$r(t) = \big(r[0](t)\;\big|\;r[1](t)\;\big|\;\ldots\;\big|\;r[n](t)\;\big|\;\ldots\big)$$
PROBLEM AT THE RECEIVER SIDE (8)

Let us consider the first interval $0 \le t \le T$:

$$s[0](t) \longrightarrow r[0](t) = s[0](t) + n[0](t)$$

For simplicity, omit the index [0]:

$$s(t) \longrightarrow r(t) = s(t) + n(t)$$

PROBLEM: given r(t), recover s(t).
PROBLEM AT THE RECEIVER SIDE (9)

The transmitted signal s(t) certainly belongs to the signal space S.

Does the received signal $r(t) = s(t) + n(t)$ belong to S? This depends on n(t).

In general, n(t) will be a generic signal not belonging to S: $n(t) \notin S$.

Therefore, in general, $r(t) \notin S$.
RANDOM VARIABLES (1)

We know that $n(t) \notin S$.

Let us try to project the noise on the basis signals $B = \big(b_j(t)\big)_{j=1}^{d}$.

The j-th projection is:

$$n_j = \int_0^T n(t)\,b_j(t)\,dt$$
RANDOM VARIABLES (2)

$$n_j = \int_0^T n(t)\,b_j(t)\,dt$$

It is easy to show that these components $n_j$ are Gaussian random variables:
• mean $E[n_j] = 0$
• variance $\sigma^2 = N_0/2$
• statistically independent
RANDOM VARIABLES (3)

$$n_j = \int_0^T n(t)\,b_j(t)\,dt$$

• Gaussian random variables: each $n_j$ is obtained from a linear transformation of a Gaussian process.
RANDOM VARIABLES (4)

$$n_j = \int_0^T n(t)\,b_j(t)\,dt$$

• Mean $E[n_j] = 0$:

$$E[n_j] = E\left[\int_0^T n(t)\,b_j(t)\,dt\right] = \int_0^T E[n(t)]\,b_j(t)\,dt = 0$$
RANDOM VARIABLES (5)

$$n_j = \int_0^T n(t)\,b_j(t)\,dt$$

• variance $\sigma^2 = N_0/2$
• statistically independent

$$E[n_j n_i] = E\left[\int_0^T n(t)\,b_j(t)\,dt \int_0^T n(x)\,b_i(x)\,dx\right] = E\left[\int_0^T\!\!\int_0^T n(t)\,n(x)\,b_j(t)\,b_i(x)\,dt\,dx\right] =$$

$$= \int_0^T\!\!\int_0^T E[n(t)\,n(x)]\,b_j(t)\,b_i(x)\,dt\,dx = \int_0^T\!\!\int_0^T \frac{N_0}{2}\,\delta(t-x)\,b_j(t)\,b_i(x)\,dt\,dx =$$

$$= \frac{N_0}{2}\int_0^T b_j(t)\,b_i(t)\,dt = \begin{cases} N_0/2 & \text{if } j = i \\ 0 & \text{if } j \neq i \end{cases}$$
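A quick Monte Carlo check of these three properties, sketched in Python under an illustrative choice of orthonormal basis (a cosine/sine pair on [0, T]) and the discrete-time noise model from the earlier sketch; none of the names below come from the slides:

```python
# Minimal sketch: project discretized white noise onto an orthonormal
# basis over [0, T] and check that the projections n_j are zero-mean
# with variance N0/2 and uncorrelated. The cosine/sine pair is just an
# illustrative choice of b_j(t).
import numpy as np

rng = np.random.default_rng(2)
N0, T, K = 2.0, 1.0, 200            # K samples per interval (assumed)
Ts = T / K
t = (np.arange(K) + 0.5) * Ts

b1 = np.sqrt(2 / T) * np.cos(2 * np.pi * t / T)   # orthonormal pair
b2 = np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)

trials = 20_000
n = rng.normal(0.0, np.sqrt(N0 / (2 * Ts)), (trials, K))

n1 = n @ b1 * Ts                    # n_j = integral of n(t) b_j(t) dt
n2 = n @ b2 * Ts

print("mean n1 :", n1.mean(), " (expected 0)")
print("var  n1 :", n1.var(), " (expected", N0 / 2, ")")
print("var  n2 :", n2.var(), " (expected", N0 / 2, ")")
print("E[n1 n2]:", np.mean(n1 * n2), " (expected 0)")
```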
RANDOM NOISE IN THE SIGNAL SPACE

Given n(t), we have computed the projections on the basis signals:

$$n_j = \int_0^T n(t)\,b_j(t)\,dt$$

Let us introduce

$$n_S(t) = \sum_j n_j\,b_j(t)$$

Clearly, $n_S(t) \in S$: it is the portion of n(t) belonging to S.

In general, $n(t) \neq n_S(t)$.
RANDOM NOISE EXTERNAL TO THE SIGNAL SPACE

We have $n(t) = n_S(t) + e(t)$, where e(t) is the portion of n(t) external to S.

For a fixed time instant $t = t^*$:

$n_S(t^*)$ and $e(t^*)$ are statistically independent.
(Proof left as an exercise)
RANDOM NOISE EXTERNAL TO THE SIGNAL SPACE

Proof:

$$E[n_S(t^*)\,e(t^*)] = 0 = E[n_S(t^*)]\,E[e(t^*)]$$

→ $n_S(t^*)$ and $e(t^*)$ are uncorrelated and, being jointly Gaussian, statistically independent.

The noise added outside the signal space is statistically independent of the noise inside it.
RECEIVED SIGNAL IN THE SIGNAL SPACE

We know that $r(t) \notin S$.

Let us project r(t) on the basis signals $B = \big(b_j(t)\big)_{j=1}^{d}$.

The j-th projection is:

$$r_j = \int_0^T r(t)\,b_j(t)\,dt$$
RECEIVED SIGNAL IN THE SIGNAL SPACE

Define $r_S(t) = \sum_j r_j\,b_j(t)$. Obviously $r_S(t) \in S$.

In general, $r(t) \neq r_S(t)$. But

$$r(t) = s(t) + n(t) = s(t) + n_S(t) + e(t)$$

Then $r(t) = r_S(t) + e(t)$, with $r_S(t) = s(t) + n_S(t)$.
DECISION PROBLEM IN THE SIGNAL SPACE

ORIGINAL PROBLEM:
P1: given r(t) = s(t) + n(t) → recover s(t)

EQUIVALENT PROBLEM:
P2: given r_S(t) = s(t) + n_S(t) → recover s(t)

The only difference is e(t): noise (external to S) which is statistically independent of both s(t) and n_S(t).
DECISION PROBLEM IN THE SIGNAL SPACE

➢ r_S(t) is a sufficient statistic for solving the problem

➢ It is sufficient to work in the signal space S

➢ All the other (infinitely many) dimensions carry no useful information, only noise
DECISION PROBLEM: VECTORIAL FORMULATION

PROBLEM:
P2: given r_S(t) = s(t) + n_S(t) → recover s(t)

The three signals belong to S → vector representation.
DECISION PROBLEM: VECTORIAL FORMULATION

$$r_S(t) = s(t) + n_S(t) \quad\Longleftrightarrow\quad \mathbf{r} = \mathbf{s}_T + \mathbf{n}$$

$$\mathbf{r} = (r_1, \ldots, r_j, \ldots, r_d), \qquad r_j = \int_0^T r(t)\,b_j(t)\,dt$$

$$\mathbf{s}_T = (s_1, \ldots, s_j, \ldots, s_d), \qquad s_j = \int_0^T s(t)\,b_j(t)\,dt$$

$$\mathbf{n} = (n_1, \ldots, n_j, \ldots, n_d), \qquad n_j = \int_0^T n(t)\,b_j(t)\,dt$$
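In practice the vector r is obtained by a bank of correlators. A minimal sketch, reusing the discrete-time basis from the earlier noise example (parameters and names are illustrative, not the lecture's):

```python
# Minimal sketch: the correlator front end. Given a sampled r(t), compute
# r_j = integral of r(t) b_j(t) dt for each basis signal; the result is
# the vector r = s_T + n used in the slides.
import numpy as np

rng = np.random.default_rng(3)
N0, T, K = 2.0, 1.0, 1000
Ts = T / K
t = (np.arange(K) + 0.5) * Ts
B = np.stack([np.sqrt(2 / T) * np.cos(2 * np.pi * t / T),
              np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)])   # d x K

s_T = np.array([1.0, -1.0])             # transmitted vector (example)
s_t = s_T @ B                           # s(t) = sum_j s_j b_j(t)
n_t = rng.normal(0.0, np.sqrt(N0 / (2 * Ts)), K)

r_t = s_t + n_t                         # received waveform
r = B @ r_t * Ts                        # r_j = integral of r(t) b_j(t) dt
print("r =", r, " (s_T plus N(0, N0/2) noise per component)")
```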
RECEIVED VECTOR

The received vector r (in the signal space) is given by

$$\mathbf{r} = \mathbf{s}_T + \mathbf{n}$$

where $\mathbf{s}_T = (s_1, \ldots, s_j, \ldots, s_d) \in M$ is the transmitted signal and $\mathbf{n} = (n_1, \ldots, n_j, \ldots, n_d)$ is the noise vector (added in the signal space).

For each component we have $r_j = s_j + n_j$.
RECEIVED VECTOR

$$r_j = s_j + n_j$$

The components $r_j$ are Gaussian random variables with
• mean $E[r_j] = s_j$
• variance $\sigma^2[r_j] = N_0/2$
• statistically independent: $E[r_i r_j] = s_i s_j = E[r_i]\,E[r_j]$ for $i \neq j$
DECISION PROBLEM: VECTORIAL FORMULATION

PROBLEM:
P2: given r_S(t) = s(t) + n_S(t) → recover s(t)

PROBLEM:
P3: given $\mathbf{r} = \mathbf{s}_T + \mathbf{n}$ → recover $\mathbf{s}_T$

IMPORTANT: given r(t), the vector r is easy to compute (the basis signals are known).
DECISION CRITERION

PROBLEM:
P3: given $\mathbf{r} = \mathbf{s}_T + \mathbf{n}$ → recover $\mathbf{s}_T$

At the receiver side, given r, we want to choose a received signal $\mathbf{s}_R \in M$.

Goal: make the right choice, $\mathbf{s}_R = \mathbf{s}_T$.

Unfortunately, this is not always possible, due to noise.
DECISION CRITERION

Given r, we have to establish a decision criterion which determines the choice of $\mathbf{s}_R$:

Minimization of the symbol (signal) error probability

$$P_S(e) = P(\mathbf{s}_R \neq \mathbf{s}_T)$$
DECISION CRITERION

PROBLEM:
P3: given $\mathbf{r} = \mathbf{s}_T + \mathbf{n}$ → recover $\mathbf{s}_T$

Let us suppose we receive a given $\mathbf{r} = \boldsymbol{\rho} \in \mathbb{R}^d$.

→ we choose $\mathbf{s}_R \in M$ such that $P_S(e)$ is minimum:

decision criterion

$$C1: \quad \mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} P(\mathbf{s}_i \neq \mathbf{s}_T \mid \mathbf{r} = \boldsymbol{\rho})$$
DETECTION

Problem of deciding which one, among a set of mutually exclusive alternatives, is correct.

➢ A random variable X with m possible sample values and known a priori probabilities P(X = x)

➢ We observe another random variable Y, which is connected to X by known probabilities P(Y = y | X = x), called likelihoods

When an experiment is performed, two samples, x of X and y of Y, are extracted. The decision maker observes y but not x.
DETECTION

Given y, the decision maker makes a decision d(y) = x'.

The decision is correct if x' = x.

Decision criterion adopted for choosing d(y):

Maximization of the correct-decision probability P(x' = x)
=
Minimization of the wrong-decision probability P(x' ≠ x)
MAP CRITERION

It is easy to show that the decision criterion must be a MAXIMUM A POSTERIORI (MAP) criterion:

$$d(y) = \arg\max_x P(X = x \mid Y = y)$$
MAP CRITERION

Proof:

$$P(X' \neq X) = \sum_x \sum_y P(X' \neq X,\, X = x,\, Y = y) =$$

$$= \sum_x \sum_y P(X' \neq X \mid X = x, Y = y)\,P(X = x, Y = y) =$$

$$= \sum_x \sum_y P(X'(y) \neq x \mid X = x, Y = y)\,P(X = x \mid Y = y)\,P(Y = y) =$$

$$= \sum_y \left[\sum_x \big(1 - \delta_{X'(y),x}\big)\,P(X = x \mid Y = y)\right] P(Y = y)$$

$$X'(y) = \arg\min_z \sum_x \big(1 - \delta_{z,x}\big)\,P(X = x \mid Y = y) = \arg\max_x P(X = x \mid Y = y)$$
ML CRITERION

Bayes' theorem:

$$P(X = x \mid Y = y) = \frac{P(Y = y \mid X = x)\,P(X = x)}{P(Y = y)}$$

$$d(y) = \arg\max_x P(X = x \mid Y = y)$$

$$d(y) = \arg\max_x P(Y = y \mid X = x)\,P(X = x)$$

For equiprobable hypotheses, $P(X = x) = \frac{1}{m}$:

$$d(y) = \arg\max_x P(Y = y \mid X = x)$$
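A toy numerical example of the two rules (all probabilities below are invented for illustration): with skewed priors the MAP and ML decisions can differ, while with equal priors they coincide.

```python
# Minimal sketch: MAP vs ML on a toy discrete problem with known priors
# P(X=x) and likelihoods P(Y=y|X=x). All numbers are illustrative.
import numpy as np

prior = np.array([0.8, 0.2])                 # P(X=0), P(X=1)
lik = np.array([[0.6, 0.4],                  # P(Y=y | X=0) for y = 0, 1
                [0.3, 0.7]])                 # P(Y=y | X=1)

y = 1                                        # observed sample
ml_decision = np.argmax(lik[:, y])
map_decision = np.argmax(lik[:, y] * prior)  # posterior up to 1/P(Y=y)

print("ML :", ml_decision)    # 1 (largest likelihood)
print("MAP:", map_decision)   # 0 (the prior outweighs the likelihood)
```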
ML CRITERION

MAXIMUM LIKELIHOOD (ML) criterion:

$$d(y) = \arg\max_x P(Y = y \mid X = x)$$
DETECTION PROBLEM AT THE RECEIVER SIDE

Random variable X ↔ transmitted signal $\mathbf{s}_T \in M$

Observed variable Y ↔ received signal $\mathbf{r} = \mathbf{s}_T + \mathbf{n} \in S$
DETECTION PROBLEM AT THE RECEIVER SIDE

$$\mathbf{r} = \mathbf{s}_T + \mathbf{n}$$

Connection between r and $\mathbf{s}_T$: $f_{\mathbf{r}}(\boldsymbol{\rho} \mid \mathbf{s}_T = \mathbf{s}_i)$

This is a Gaussian density function centered around $\mathbf{s}_i$ with variance $N_0/2$ in each dimension.
GAUSSIAN DENSITY FUNCTION

Example: single Gaussian random variable r
• mean $\mu$
• variance $\sigma^2$
• density function:

$$f_r(\rho) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(\rho - \mu)^2}{2\sigma^2}\right)$$
GAUSSIAN DENSITY FUNCTION

Example: pair of Gaussian random variables $r_1$, $r_2$
• mean $\mu$
• variance $\sigma^2$
• statistically independent
• density function:

$$f_{r_1 r_2}(\rho_1, \rho_2) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(\rho_1 - \mu)^2}{2\sigma^2}\right) \cdot \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\!\left(-\frac{(\rho_2 - \mu)^2}{2\sigma^2}\right)$$

$$f_{r_1 r_2}(\rho_1, \rho_2) = \frac{1}{2\pi\sigma^2}\,\exp\!\left(-\frac{(\rho_1 - \mu)^2 + (\rho_2 - \mu)^2}{2\sigma^2}\right)$$
GAUSSIAN DENSITY FUNCTION

$$f_{\mathbf{r}}(\boldsymbol{\rho} \mid \mathbf{s}_T = \mathbf{s}_i)$$

r = array of d Gaussian random variables
• mean $\mu = s_{ij}$
• variance $\sigma^2 = N_0/2$
• statistically independent
• density function:

$$f_{\mathbf{r}}(\boldsymbol{\rho} \mid \mathbf{s}_T = \mathbf{s}_i) = \frac{1}{(\pi N_0)^{d/2}}\,\exp\!\left(-\frac{\sum_{j=1}^{d} (\rho_j - s_{ij})^2}{N_0}\right)$$
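For reference, a direct transcription of this density into Python (the function name awgn_likelihood and the numeric values are ours, not the lecture's):

```python
# Minimal sketch: evaluating the conditional density f_r(rho | s_T = s_i)
# for a candidate constellation point, using the product form above
# (independent components, mean s_ij, variance N0/2).
import numpy as np

def awgn_likelihood(rho, s_i, N0):
    """f_r(rho | s_T = s_i) for the AWGN channel in signal space."""
    d = len(rho)
    dist2 = np.sum((np.asarray(rho) - np.asarray(s_i)) ** 2)
    return np.exp(-dist2 / N0) / (np.pi * N0) ** (d / 2)

rho = np.array([0.9, -1.1])
print(awgn_likelihood(rho, [1.0, -1.0], N0=2.0))   # nearby point: large
print(awgn_likelihood(rho, [-1.0, 1.0], N0=2.0))   # far point: much smaller
```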
ML CRITERION

$$d(y) = \arg\max_x P(Y = y \mid X = x)$$

For our problem this becomes:

$$C2: \quad \text{given } \mathbf{r} = \boldsymbol{\rho}, \text{ choose } \mathbf{s}_R = d(\boldsymbol{\rho}) = \arg\max_{\mathbf{s}_i \in M} f_{\mathbf{r}}(\boldsymbol{\rho} \mid \mathbf{s}_T = \mathbf{s}_i)$$
ML CRITERION

Using the expression of $f_{\mathbf{r}}(\boldsymbol{\rho} \mid \mathbf{s}_T = \mathbf{s}_i)$:

$$\mathbf{s}_R = \arg\max_{\mathbf{s}_i \in M} \left\{\frac{1}{(\pi N_0)^{d/2}}\,\exp\!\left(-\frac{\sum_{j=1}^{d} (\rho_j - s_{ij})^2}{N_0}\right)\right\}$$

$$\mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} \sum_{j=1}^{d} (\rho_j - s_{ij})^2$$
MINIMUM DISTANCE CRITERION

$$\mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} \sum_{j=1}^{d} (\rho_j - s_{ij})^2$$

By introducing the Euclidean distance between vectors in $\mathbb{R}^d$:

$$d_E^2(\boldsymbol{\rho} - \mathbf{s}_i) = \sum_{j=1}^{d} (\rho_j - s_{ij})^2$$

we have:

$$\mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} d_E^2(\boldsymbol{\rho} - \mathbf{s}_i)$$
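A minimal sketch of the resulting detector, with a sanity check that minimizing the squared distance picks the same signal as maximizing the conditional density (the exponential is monotone, so the argmax/argmin must agree). The 4-point constellation M is an illustrative example, not taken from the slides:

```python
# Minimal sketch: the minimum-distance detector, plus a check that it
# agrees with maximizing f_r(rho | s_i). Constellation is illustrative.
import numpy as np

M = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)

def detect_min_distance(rho, M):
    """s_R = argmin_i d_E^2(rho - s_i)."""
    d2 = np.sum((M - rho) ** 2, axis=1)
    return np.argmin(d2)

def detect_max_likelihood(rho, M, N0):
    d2 = np.sum((M - rho) ** 2, axis=1)
    return np.argmax(np.exp(-d2 / N0))   # normalization is common to all i

rho = np.array([0.3, -0.8])
assert detect_min_distance(rho, M) == detect_max_likelihood(rho, M, N0=2.0)
print("decision:", M[detect_min_distance(rho, M)])   # [ 1. -1.]
```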
MINIMUM DISTANCE CRITERION

The ML criterion is equivalent to a minimum distance criterion:

$$C3: \quad \text{given } \mathbf{r} = \boldsymbol{\rho}, \text{ choose } \mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} d_E^2(\boldsymbol{\rho} - \mathbf{s}_i)$$
VORONOI REGION

$$\text{given } \mathbf{r} = \boldsymbol{\rho}, \text{ choose } \mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} d_E^2(\boldsymbol{\rho} - \mathbf{s}_i)$$

This decision criterion associates to any vector $\boldsymbol{\rho} \in \mathbb{R}^d$ a received signal $\mathbf{s}_R \in M$.

We can introduce the Voronoi (decision) region $V(\mathbf{s}_i)$ = set of all received vectors which determine the choice $\mathbf{s}_R = \mathbf{s}_i$:

$$V(\mathbf{s}_i) = \left\{\boldsymbol{\rho} \in \mathbb{R}^d : \mathbf{s}_R = \mathbf{s}_i\right\}$$
VORONOI REGION

$V(\mathbf{s}_i)$ is the set of all received vectors which determine the choice $\mathbf{s}_R = \mathbf{s}_i$.

When do we have $\mathbf{s}_R = \mathbf{s}_i$? When $\boldsymbol{\rho} \in \mathbb{R}^d$ is nearer to $\mathbf{s}_i$ than to any other constellation signal:

$$V(\mathbf{s}_i) = \left\{\boldsymbol{\rho} \in \mathbb{R}^d : d_E^2(\boldsymbol{\rho}, \mathbf{s}_i) \le d_E^2(\boldsymbol{\rho}, \mathbf{s}) \;\; \forall\, \mathbf{s} \in M\right\}$$
VORONOI REGION CRITERION

NOTE: if we receive $\boldsymbol{\rho} \in V(\mathbf{s}_i)$, we certainly choose $\mathbf{s}_R = \mathbf{s}_i$.

The minimum distance criterion

$$\text{given } \mathbf{r} = \boldsymbol{\rho}, \text{ choose } \mathbf{s}_R = \arg\min_{\mathbf{s}_i \in M} d_E^2(\boldsymbol{\rho} - \mathbf{s}_i)$$

can be expressed as a Voronoi region criterion:

$$C4: \quad \text{given } \mathbf{r} = \boldsymbol{\rho}, \text{ if } \boldsymbol{\rho} \in V(\mathbf{s}) \text{ select } \mathbf{s}_R = \mathbf{s}$$
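To close the loop, a short Monte Carlo sketch (QPSK-like constellation and noise level chosen for illustration, not from the slides): for this constellation the Voronoi regions are the four quadrants, so a simple quadrant test and the minimum distance rule must return identical decisions.

```python
# Minimal sketch: the Voronoi-region criterion in action, with a Monte
# Carlo symbol error rate as a sanity check. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(4)
M = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]], dtype=float)
N0, trials = 0.5, 100_000

tx = rng.integers(0, 4, trials)                         # transmitted indices
rho = M[tx] + rng.normal(0.0, np.sqrt(N0 / 2), (trials, 2))

# Quadrant test = membership in the Voronoi region V(s_i) for this M
voronoi = (rho[:, 0] < 0) * 2 + (rho[:, 1] < 0) * 1

# It must coincide with the minimum-distance rule C3
min_dist = np.argmin(
    np.sum((rho[:, None, :] - M[None, :, :]) ** 2, axis=2), axis=1)
assert np.array_equal(voronoi, min_dist)

print(f"symbol error rate: {np.mean(voronoi != tx):.4f}")
```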
