Lecture 3
DETECTION THEORY
DR. TRINH VAN CHIEN
CONTENT
▪ Signal space representation
▪ AWGN channel
▪ Receiver roles
▪ Orthonormal basis formulation
▪ Gram-Schmidt algorithm
▪ Signal space based on an orthonormal basis (vector representation)
▪ ML and MAP criteria
▪ Received signals and noise at the receiver side
▪ Decision based on vector formulation
▪ Detection with MAP criterion
▪ Detection with ML criterion
▪ Voronoi region
CHANNEL TRANSMISSION
[Figure: channel transmission model; the channel adds noise with two-sided power spectral density $N_0/2$]
AWGN (1)
[Figure: flat power spectral density $G_n(f) = N_0/2$ over all frequencies]

$$G_n(f) = \frac{N_0}{2} \qquad\Longleftrightarrow\qquad R_n(\tau) = \frac{N_0}{2}\,\delta(\tau)$$
AWGN (2)
[Figure: flat power spectral density $G_n(f) = N_0/2$]

$$R_n(\tau) = \frac{N_0}{2}\,\delta(\tau), \qquad E\!\left[\,n(t_1)\,n(t_1+\tau)\,\right] = \frac{N_0}{2}\,\delta(\tau)$$
n(t) is an ergodic process
(temporal properties = statistical properties)
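As a quick numerical illustration of the ergodicity statement, the sketch below (Python, illustrative names and parameters, not from the slides) uses i.i.d. Gaussian samples of variance $N_0/2$ as a discrete-time stand-in for AWGN and compares the time-averaged autocorrelation of one long realization with the model $R_n(\tau) = (N_0/2)\,\delta(\tau)$.

```python
import numpy as np

# Discrete-time stand-in for AWGN: i.i.d. zero-mean Gaussian samples whose
# per-sample variance plays the role of N0/2 (illustrative parameters).
rng = np.random.default_rng(0)
N0 = 2.0                       # assumed noise level, so N0/2 = 1.0
n_samples = 200_000
n = rng.normal(0.0, np.sqrt(N0 / 2), n_samples)

# Time-averaged autocorrelation of a single realization at a few lags;
# ergodicity says it should match the ensemble model (N0/2 at lag 0, else 0).
for lag in range(4):
    r_hat = np.mean(n[:n_samples - lag] * n[lag:])
    model = N0 / 2 if lag == 0 else 0.0
    print(f"lag {lag}: time average = {r_hat:+.4f}, model R_n = {model:+.4f}")
```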
AWGN (3)
PROBLEM AT THE RECEIVER SIDE (1)
$$u_T \;\longrightarrow\; s(t) \;\longrightarrow\; r(t) = s(t) + n(t)$$

[Figure: waveforms over a symbol interval of duration $T$, with $r(t) = s(t) + n(t)$]
PROBLEM AT THE RECEIVER SIDE (3)
PROBLEM AT THE RECEIVER SIDE (4)
Each transmitted signal $s[n](t)$
• has finite duration $T$
PROBLEM AT THE RECEIVER SIDE (5)
PROBLEM AT THE RECEIVER SIDE (7)
$$s[0](t) \;\longrightarrow\; r[0](t) = s[0](t) + n[0](t)$$

For simplicity, omit the index [0]:

$$s(t) \;\longrightarrow\; r(t) = s(t) + n(t)$$
PROBLEM AT THE RECEIVER SIDE (8)
In general, $r(t) \notin S$.
RANDOM VARIABLES (1)
$$B = \left\{\, b_j(t) \,\right\}_{j=1}^{d} \quad \text{(orthonormal basis of the signal space } S\text{)}$$
RANDOM VARIABLES (2)
$$n_j = \int_0^T n(t)\, b_j(t)\, dt$$

• mean: $E[n_j] = 0$
• variance: $\sigma^2 = N_0/2$
• statistically independent
RANDOM VARIABLES (3)
$$n_j = \int_0^T n(t)\, b_j(t)\, dt$$
RANDOM VARIABLES (4)
$$n_j = \int_0^T n(t)\, b_j(t)\, dt$$

$$E[n_j] = E\!\left[\int_0^T n(t)\, b_j(t)\, dt\right] = \int_0^T E[n(t)]\, b_j(t)\, dt = 0$$
RANDOM VARIABLES (5)
$$n_j = \int_0^T n(t)\, b_j(t)\, dt$$

• variance $\sigma^2 = N_0/2$
• statistically independent

$$E[n_j n_i] = E\!\left[\int_0^T n(t)\, b_j(t)\, dt \int_0^T n(x)\, b_i(x)\, dx\right] = \int_0^T\!\!\int_0^T E[n(t)\,n(x)]\, b_j(t)\, b_i(x)\, dt\, dx$$

$$= \int_0^T\!\!\int_0^T \frac{N_0}{2}\,\delta(t - x)\, b_j(t)\, b_i(x)\, dt\, dx = \frac{N_0}{2}\int_0^T b_j(t)\, b_i(t)\, dt = \begin{cases} N_0/2 & \text{if } j = i \\ 0 & \text{if } j \neq i \end{cases}$$
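The derivation above can be checked numerically. The sketch below (assumed example basis and parameters, not from the slides) approximates the integrals by Riemann sums: white noise with two-sided PSD $N_0/2$ is simulated on a grid with step $dt$ as i.i.d. samples of variance $(N_0/2)/dt$, so the projections $n_j$ come out with mean $0$, variance $N_0/2$, and zero cross-correlation.

```python
import numpy as np

# Monte Carlo check of the noise-projection statistics (assumed example basis).
rng = np.random.default_rng(1)
T, n_steps, N0 = 1.0, 500, 2.0
dt = T / n_steps
t = np.arange(n_steps) * dt

# Two orthonormal basis functions on [0, T].
b1 = np.sqrt(2 / T) * np.sin(2 * np.pi * t / T)
b2 = np.sqrt(2 / T) * np.cos(2 * np.pi * t / T)

# Grid approximation of white noise with two-sided PSD N0/2:
# per-sample variance (N0/2)/dt reproduces the delta correlation under integration.
trials = 5000
n = rng.normal(0.0, np.sqrt((N0 / 2) / dt), size=(trials, n_steps))

n1 = n @ b1 * dt            # n_1 = ∫ n(t) b_1(t) dt, one value per trial
n2 = n @ b2 * dt            # n_2 = ∫ n(t) b_2(t) dt

print("E[n_1]     ≈", n1.mean())           # ≈ 0
print("var[n_1]   ≈", n1.var())            # ≈ N0/2 = 1.0
print("E[n_1 n_2] ≈", (n1 * n2).mean())    # ≈ 0 (uncorrelated)
```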
RANDOM NOISE IN THE SIGNAL SPACE
Let us introduce $n_S(t) = \sum_j n_j\, b_j(t)$.

In general, $n(t) \neq n_S(t)$.
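To make the statement $n(t) \neq n_S(t)$ concrete, the sketch below (assumed three-function basis, illustrative names) projects one noise realization onto the basis, rebuilds $n_S(t)$, and checks that the residual $e(t) = n(t) - n_S(t)$ carries nonzero energy yet is orthogonal to every basis function.

```python
import numpy as np

# One noise realization decomposed into an in-space part n_S(t) and a residual e(t).
rng = np.random.default_rng(2)
T, n_steps, N0 = 1.0, 1000, 2.0
dt = T / n_steps
t = np.arange(n_steps) * dt

# Assumed orthonormal basis: three sine functions on [0, T].
basis = np.stack([np.sqrt(2 / T) * np.sin(2 * np.pi * (k + 1) * t / T) for k in range(3)])

n = rng.normal(0.0, np.sqrt((N0 / 2) / dt), n_steps)   # white-noise realization
n_coeff = basis @ n * dt                               # n_j = ∫ n(t) b_j(t) dt
n_S = n_coeff @ basis                                  # n_S(t) = Σ_j n_j b_j(t)
e = n - n_S                                            # part of the noise outside S

print("energy of e(t)    :", np.sum(e ** 2) * dt)      # clearly nonzero: n(t) ≠ n_S(t)
print("<e, b_j> for all j:", basis @ e * dt)           # ≈ 0: e(t) is orthogonal to S
```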
RANDOM NOISE EXTERNAL TO THE SIGNAL SPACE
RECEIVED SIGNAL IN THE SIGNAL SPACE
We know that $r(t) \notin S$.

$$B = \left\{\, b_j(t) \,\right\}_{j=1}^{d}$$
RECEIVED SIGNAL IN THE SIGNAL SPACE
Define $r_S(t) = \sum_j r_j\, b_j(t)$. Obviously $r_S(t) \in S$.

In general, $r(t) \neq r_S(t)$.

But $r(t) = s(t) + n(t) = s(t) + n_S(t) + e(t)$, where $s(t) \in S$ and $n_S(t) \in S$.
DECISION PROBLEM IN THE SIGNAL SPACE
ORIGINAL PROBLEM:
P1: given $r(t) = s(t) + n(t)$ → recover $s(t)$
EQUIVALENT PROBLEM:
P2: given $r_S(t) = s(t) + n_S(t)$ → recover $s(t)$
DECISION PROBLEM IN THE SIGNAL SPACE
DECISION PROBLEM: VECTORIAL FORMULATION
PROBLEM:
P2: given $r_S(t) = s(t) + n_S(t)$ → recover $s(t)$
Vector representation
DECISION PROBLEM: VECTORIAL FORMULATION

$$r_S(t) = s(t) + n_S(t) \quad\Longleftrightarrow\quad r = s_T + n$$

$$r = (r_1, \ldots, r_j, \ldots, r_d), \qquad r_j = \int_0^T r(t)\, b_j(t)\, dt$$

$$s_T = (s_1, \ldots, s_j, \ldots, s_d), \qquad s_j = \int_0^T s(t)\, b_j(t)\, dt$$
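As a sketch of how the vector $r$ is obtained in practice (a bank of correlators; the basis, constellation point, and parameter names below are assumptions for illustration), the received waveform is correlated with each basis function:

```python
import numpy as np

# Receiver front-end sketch: r_j = ∫_0^T r(t) b_j(t) dt for each basis function.
rng = np.random.default_rng(3)
T, n_steps, N0 = 1.0, 1000, 0.2
dt = T / n_steps
t = np.arange(n_steps) * dt

# Assumed 2-D orthonormal basis and an example transmitted vector s_T.
basis = np.stack([np.sqrt(2 / T) * np.sin(2 * np.pi * t / T),
                  np.sqrt(2 / T) * np.cos(2 * np.pi * t / T)])
s_T = np.array([1.0, -1.0])
s_t = s_T @ basis                                      # s(t) = Σ_j s_j b_j(t)

n_t = rng.normal(0.0, np.sqrt((N0 / 2) / dt), n_steps) # white Gaussian noise
r_t = s_t + n_t                                        # received waveform r(t)

r = basis @ r_t * dt                                   # correlator outputs r_j
print("transmitted s_T:", s_T)
print("received    r  :", r)   # ≈ s_T plus Gaussian noise of variance N0/2 per component
```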
RECEIVED VECTOR
$$r_j = s_j + n_j$$

• Mean: $E[r_j] = s_j$
• Variance: $\sigma^2[r_j] = N_0/2$
• Statistically independent: $E[r_i r_j] = s_i s_j = E[r_i]\, E[r_j]$ for $i \neq j$
DECISION PROBLEM: VECTORIAL FORMULATION
PROBLEM:
P2: given $r_S(t) = s(t) + n_S(t)$ → recover $s(t)$
PROBLEM:
P3: given $r = s_T + n$ → recover $s_T$
IMPORTANT:
Given $r(t)$, the vector $r$ is easy to compute (the basis signals are known).
DECISION CRITERION
PROBLEM:
P3: given $r = s_T + n$ → recover $s_T$
DECISION CRITERION
DECISION CRITERION
PROBLEM:
P3: given $r = s_T + n$ → recover $s_T$

Decision criterion:

$$\text{C1:}\qquad s_R = \arg\min_{s_i \in M} P(s_i \neq s_T \mid r = \rho)$$
DETECTION
MAP CRITERION
$$d(y) = \arg\max_{x} P(X = x \mid Y = y)$$
MAP CRITERION
Proof
$$P(X' \neq X) = \sum_x \sum_y P(X' \neq X,\; X = x,\; Y = y)$$

$$= \sum_x \sum_y P(X' \neq X \mid X = x, Y = y)\, P(X = x, Y = y)$$

$$= \sum_x \sum_y P(X'(y) \neq x \mid X = x, Y = y)\, P(X = x \mid Y = y)\, P(Y = y)$$

$$= \sum_y \sum_x \left(1 - \delta_{X'(y),\,x}\right) P(X = x \mid Y = y)\, P(Y = y)$$

For every $y$ this is minimized by choosing $X'(y) = \arg\max_x P(X = x \mid Y = y)$, which is the MAP rule.
ML CRITERION
Bayes theorem:
$$P(X = x \mid Y = y) = \frac{P(Y = y \mid X = x)\, P(X = x)}{P(Y = y)}$$

$$d(y) = \arg\max_{x} P(X = x \mid Y = y)$$

For equiprobable hypotheses, $P(X = x) = \dfrac{1}{m}$, so

$$d(y) = \arg\max_{x} P(Y = y \mid X = x)$$
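A minimal sketch of the two rules on a small discrete channel (the likelihood table and priors below are hypothetical numbers): MAP weights each likelihood by its prior, ML does not, and the two decisions coincide when the hypotheses are equiprobable.

```python
import numpy as np

# Hypothetical likelihoods P(Y = y | X = x); row x, column y (each row sums to 1).
likelihood = np.array([[0.6, 0.3, 0.1],
                       [0.2, 0.4, 0.4],
                       [0.1, 0.3, 0.6]])
prior = np.array([0.7, 0.2, 0.1])        # non-uniform P(X = x) for illustration

def map_decision(y):
    # d(y) = argmax_x P(X = x | Y = y) ∝ P(Y = y | X = x) P(X = x)
    return int(np.argmax(likelihood[:, y] * prior))

def ml_decision(y):
    # d(y) = argmax_x P(Y = y | X = x)
    return int(np.argmax(likelihood[:, y]))

for y in range(3):
    print(f"y = {y}: MAP -> x = {map_decision(y)}, ML -> x = {ml_decision(y)}")
# With the non-uniform prior the two decisions differ for some y; replacing
# `prior` by np.full(3, 1/3) makes MAP and ML identical, as derived above.
```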
ML CRITERION
DETECTION PROBLEM AT THE RECEIVER SIDE
DETECTION PROBLEM AT THE RECEIVER SIDE

$$r = s_T + n$$
GAUSSIAN DENSITY FUNCTION
• Mean $\mu$
• Variance $\sigma^2$
• density function:

$$f_r(\rho) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(\rho - \mu)^2}{2\sigma^2}\right)$$
GAUSSIAN DENSITY FUNCTION
• Mean $\mu$
• Variance $\sigma^2$
• statistically independent
• density function:

$$f_{r_1 r_2}(\rho_1, \rho_2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(\rho_1 - \mu)^2}{2\sigma^2}\right) \cdot \frac{1}{\sqrt{2\pi\sigma^2}} \exp\!\left(-\frac{(\rho_2 - \mu)^2}{2\sigma^2}\right)$$

$$f_{r_1 r_2}(\rho_1, \rho_2) = \frac{1}{2\pi\sigma^2} \exp\!\left(-\frac{(\rho_1 - \mu)^2 + (\rho_2 - \mu)^2}{2\sigma^2}\right)$$
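A one-line numerical check (arbitrary numbers, illustrative names) that the product of the two marginal densities equals the joint expression above:

```python
import numpy as np

# Arbitrary mean, variance, and observation points for the check.
mu, sigma2 = 0.5, 1.5
rho1, rho2 = 0.2, -1.1

def pdf(rho):
    # Marginal Gaussian density with mean mu and variance sigma2.
    return np.exp(-(rho - mu) ** 2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)

product = pdf(rho1) * pdf(rho2)
joint = np.exp(-((rho1 - mu) ** 2 + (rho2 - mu) ** 2) / (2 * sigma2)) / (2 * np.pi * sigma2)
print(product, joint)   # identical up to floating-point rounding
```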
GAUSSIAN DENSITY FUNCTION
$$f_r(\rho \mid s_T = s_i)$$

• Mean: $\mu = s_{ij}$
• Variance: $\sigma^2 = N_0/2$
• Statistically independent
• density function:

$$f_r(\rho \mid s_T = s_i) = \frac{1}{\left(\sqrt{\pi N_0}\right)^d} \exp\!\left(-\frac{\sum_{j=1}^{d} (\rho_j - s_{ij})^2}{N_0}\right)$$
ML CRITERION
$$\text{C2:}\qquad s_R = \arg\max_{s_i \in M} f_r(\rho \mid s_T = s_i)$$
ML CRITERION
$$s_R = \arg\max_{s_i \in M} \frac{1}{\left(\sqrt{\pi N_0}\right)^d} \exp\!\left(-\frac{\sum_{j=1}^{d} (\rho_j - s_{ij})^2}{N_0}\right)$$

$$s_R = \arg\min_{s_i \in M} \sum_{j=1}^{d} (\rho_j - s_{ij})^2$$
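The equivalence is easy to see numerically. In the sketch below (hypothetical 2-D constellation and noise level), the index maximizing the Gaussian log-likelihood is the same as the index minimizing the squared Euclidean distance.

```python
import numpy as np

# Hypothetical 2-D constellation and noise level.
N0 = 0.5
constellation = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])

def log_likelihood(rho, s_i):
    # log f_r(rho | s_T = s_i) for d i.i.d. Gaussian components of variance N0/2.
    d = len(rho)
    return -d / 2 * np.log(np.pi * N0) - np.sum((rho - s_i) ** 2) / N0

rho = np.array([0.3, -0.8])                         # example received vector
ll = [log_likelihood(rho, s) for s in constellation]
d2 = [np.sum((rho - s) ** 2) for s in constellation]

print("argmax likelihood:", int(np.argmax(ll)))     # same index as ...
print("argmin distance  :", int(np.argmin(d2)))     # ... the minimum-distance rule
```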
MINIMUM DISTANCE CRITERION
$$s_R = \arg\min_{s_i \in M} \sum_{j=1}^{d} (\rho_j - s_{ij})^2$$
MINIMUM DISTANCE CRITERION
VORONOI REGION
The decision rule associates to every observation $\rho$ a received signal $s_R \in M$.

$$V(s_i) = \left\{\, \rho \in \mathbb{R}^d : s_R(\rho) = s_i \,\right\}$$
VORONOI REGION
VORONOI REGION CRITERION
NOTE:
C4: given $r = \rho$, if $\rho \in V(s)$ select $s_R = s$
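As a closing sketch (hypothetical constellation), criterion C4 amounts to a nearest-neighbor lookup: the Voronoi region of $s_i$ is the set of observations that the minimum-distance rule maps to $s_i$. Printing the region index over a coarse grid gives a rough picture of the four regions.

```python
import numpy as np

# Hypothetical 2-D constellation; its Voronoi regions are the four quadrants.
constellation = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])

def decide(rho):
    # C4: select s_R = s_i when rho lies in V(s_i), i.e. the nearest point.
    return int(np.argmin(np.sum((constellation - rho) ** 2, axis=1)))

# Label a coarse grid of observations by the index of their Voronoi region.
axis = np.linspace(-2.0, 2.0, 9)
for y in axis[::-1]:
    print(" ".join(str(decide(np.array([x, y]))) for x in axis))
```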