Lecture 2
Hard Decision
The receiver detects the data before decoding.
Soft Decision
The receiver quantizes the received data and the decoder uses
likelihood information to decode.
The magnitude of the decision variable usually indicates
the likelihood that the detection is correct.
Maximum Likelihood Decoding (MLD).
Hard Decision vs Soft Decision
[Figure: (a) hard-decision channel with binary outputs {0, 1}; (b) soft-decision channel with quantized outputs {0', 0, 1, 1'}.]
P(E) = Σ_r P(c' ≠ c | r) P(r)
P(r | c) = Π_i P(ri | ci) = p^d(r,c) (1 − p)^(n − d(r,c))    (1)
log P(r | c) = d(r,c) log p + (n − d(r,c)) log(1 − p)    (2)
[Figure: conditional pdfs fri(ri | ci = −1) and fri(ri | ci = +1) of the received sample ri over −1 ≤ ri ≤ +1.]
For equiprobable symbols, P(ci = 1 | ri) / P(ci = −1 | ri) = fri(ri | ci = 1) / fri(ri | ci = −1).
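Equation (2) implies that for p < 0.5 (where log p < log(1 − p)), maximizing log P(r|c) is the same as minimizing the Hamming distance d(r, c). A minimal sketch of this equivalence (the small codebook here is illustrative, not a full Hamming code):

```python
import math

def hamming_distance(r, c):
    """Number of positions in which r and c differ."""
    return sum(ri != ci for ri, ci in zip(r, c))

def log_likelihood(r, c, p):
    """Eq. (2): log P(r|c) for a BSC with crossover probability p."""
    d = hamming_distance(r, c)
    return d * math.log(p) + (len(r) - d) * math.log(1 - p)

# Illustrative codebook; MLD over a BSC with p < 0.5
codebook = ["0000000", "1101000", "1111111"]
r = "1000000"
best = max(codebook, key=lambda c: log_likelihood(r, c, p=0.1))
# Maximizing the log-likelihood picks the minimum-distance codeword
assert best == min(codebook, key=lambda c: hamming_distance(r, c))
print(best)  # 0000000
```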
Example: HD vs SD
Consider the Hamming (7,4) code used with the following two channels:
(a) HD channel (binary output): P(0|0) = P(1|1) = 0.9, P(1|0) = P(0|1) = 0.1.
(b) SD channel (quantized output alphabet {0', 0, 1, 1'}):
P(0'|0) = P(1'|1) = 0.6
P(0|0) = P(1|1) = 0.3
P(1|0) = P(0|1) = 0.099
P(1'|0) = P(0'|1) = 0.001
Example: HD vs SD (2)
Suppose we receive r = 1’ 0 0’ 0 0’ 0’ 0’
For HD, there is no quantization, so r = 1000000, which is
decoded as 0000000 using the minimum Hamming distance rule.
In the SD case, using the product form in (1) with the SD
transition probabilities and c = 0000000, we get
P(r|c) = 0.001 × 0.3 × 0.6 × 0.3 × 0.6 × 0.6 × 0.6 ≈ 0.0000117
However, for c = 1101000, we get P(r|c) = 0.6⁵ × 0.099² ≈ 0.000762
This means that for the given r, it is about 65 times more
probable that 1101000 was transmitted than 0000000, so the SD
decoder decides differently from the HD decoder.
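The two likelihoods in this example can be checked directly from the SD transition probabilities given above:

```python
# SD channel transition probabilities P(output | input bit), from the example
trans = {
    0: {"0'": 0.6, "0": 0.3, "1": 0.099, "1'": 0.001},
    1: {"1'": 0.6, "1": 0.3, "0": 0.099, "0'": 0.001},
}

def likelihood(r, c):
    """P(r|c) as a product of per-symbol transition probabilities (eq. (1) form)."""
    p = 1.0
    for out, bit in zip(r, c):
        p *= trans[bit][out]
    return p

r = ["1'", "0", "0'", "0", "0'", "0'", "0'"]
p0 = likelihood(r, [0, 0, 0, 0, 0, 0, 0])  # ≈ 1.17e-5
p1 = likelihood(r, [1, 1, 0, 1, 0, 0, 0])  # ≈ 7.62e-4
print(p0, p1, p1 / p0)  # ratio ≈ 65
```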
Errors and Channel Models
Memoryless channels: Noise affects each transmitted
symbol independently.
Each transmitted symbol has probability p of being received
incorrectly and probability 1 − p of being received correctly.
Transmission errors occur randomly in the received
sequence.
Memoryless channels are often referred to as random-error
channels.
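Such a random-error channel (a binary symmetric channel) can be simulated in a few lines; the crossover probability and seed here are illustrative:

```python
import random

def bsc(bits, p, rng=random.Random(42)):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

tx = [0] * 10000
rx = bsc(tx, p=0.1)
print(sum(rx) / len(rx))  # empirical error rate, close to 0.1
```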
Errors and Channel Models (2)
Examples
AWGN: ri = si + ni, with E[ni] = 0, E[ni²] = σn², and E[ni nj] = 0 for i ≠ j.
DMC (discrete memoryless channel), defined by its transition probabilities:
[Figure: binary DMC transition diagram with P[0|0], P[1|0], P[0|1], P[1|1].]
Errors and Channel Models (3)
Channels with memory
Errors do not occur randomly:
either the noise is not independent from transmission
to transmission (coloured noise),
or a slowly time-varying signal-to-noise ratio causes
time-dependent error rates (fading channels).
Errors and Channel Models (4)
Gilbert and Fritchman model
[Figure: two-state Markov model. Transition Good→Bad with probability q1 and Bad→Good with probability q2; self-loop probabilities 1 − q1 (Good state) and 1 − q2 (Bad state).]
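A minimal simulation of such a two-state burst channel; the transition probabilities and per-state error rates below are illustrative values, not from the slides:

```python
import random

def gilbert_elliott(n, q1=0.02, q2=0.3, e_good=0.0, e_bad=0.5,
                    rng=random.Random(1)):
    """Error pattern from a two-state Markov (Gilbert-type) channel model.

    q1: P(Good -> Bad), q2: P(Bad -> Good);
    e_good, e_bad: bit-error probability while in each state.
    """
    errors, state = [], "good"
    for _ in range(n):
        e = e_good if state == "good" else e_bad
        errors.append(int(rng.random() < e))
        if state == "good":
            state = "bad" if rng.random() < q1 else "good"
        else:
            state = "good" if rng.random() < q2 else "bad"
    return errors

pattern = gilbert_elliott(50)
print("".join(map(str, pattern)))  # errors cluster into bursts
```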
Errors and Channel Models (5)
Channels with memory lead to error bursts.
Two common countermeasures:
Burst-error correcting codes.
Random-error correcting codes with
interleaving/deinterleaving to randomize
the errors.
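A block interleaver is one common way to do this: write the coded bits into a matrix by rows and read them out by columns, so that a channel burst is spread across many codewords after deinterleaving. A minimal sketch (dimensions are illustrative):

```python
def interleave(bits, rows, cols):
    """Write row-wise, read column-wise."""
    assert len(bits) == rows * cols
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    """Inverse: write column-wise, read row-wise."""
    assert len(bits) == rows * cols
    out = [0] * (rows * cols)
    i = 0
    for c in range(cols):
        for r in range(rows):
            out[r * cols + c] = bits[i]
            i += 1
    return out

data = list(range(12))        # stand-in for coded bits
tx = interleave(data, 3, 4)
# A burst hitting 3 consecutive positions of tx lands, after
# deinterleaving, in 3 different rows (i.e., 3 different codewords).
assert deinterleave(tx, 3, 4) == data
```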
Performance Measures
Probability of decoding error P(E).
Probability that codeword at output of decoder is not the
transmitted one.
Also referred to as word error rate (WER) or block error
rate (BLER).
Bit error rate (BER) Pb
Probability that message bit at output of decoder is
incorrect.
Coding Gain (measured in dB)
The savings in transmitted power needed to achieve a specific
BER with coding, compared to the uncoded case.
Performance Measures 2
Asymptotic coding gain
Coding gain when Eb/No → ∞
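As a hedged aside (standard approximations, not stated on the slides): for hard-decision decoding of a rate-R code correcting t errors, the asymptotic coding gain is about 10 log10(R(t+1)) dB, and for soft-decision ML decoding about 10 log10(R·dmin) dB. For the Hamming (7,4) code used earlier:

```python
import math

R, dmin, t = 4 / 7, 3, 1  # Hamming (7,4): rate, minimum distance, correctable errors
hd_gain = 10 * math.log10(R * (t + 1))  # hard-decision asymptotic coding gain
sd_gain = 10 * math.log10(R * dmin)     # soft-decision asymptotic coding gain
print(round(hd_gain, 2), round(sd_gain, 2))  # ≈ 0.58 dB and 2.34 dB
```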
Performance Measures 3
[Figure: BER vs Eb/No (dB) for coded and uncoded transmission; BER axis from 1e-2 down to 1e-10, Eb/No from 6 to 13 dB. The coding gain at a given BER is the horizontal distance between the two curves.]
Coded Modulation
Use of ECC creates bandwidth expansion due to
redundant symbols.
Combining ECC and modulation allows the
redundancy to be contained in the modulation.
Example: the bit sequence 1011 transmitted as
s1(t) + s0(t−T) + s1(t−2T) + s1(t−3T)  (uncoded, binary signalling)
s1(t) + s2(t−T) + s1(t−2T) + s3(t−3T)  (coded modulation, larger signal set)
Memory is created without adding redundant bits,
by using a higher-order modulation scheme in which
each bit influences two successive symbols.
Trellis Coded Modulation
State machine adds redundant bits and
creates memory
Each state change is encoded by selecting a
symbol from a larger-than-needed
constellation; thus no bandwidth
expansion occurs, and significant coding
gains are achieved.