Maximum Likelihood Decoding Techniques Notes
Unit 4
For a binary symmetric channel with crossover probability p, the transition probabilities are P(0 | 0) = P(1 | 1) = 1 − p and P(1 | 0) = P(0 | 1) = p
- For a hard-decision channel, the decoded codeword U(m) is the one closest in Hamming distance to the received sequence Z
- Among all candidates U(m), the decoder chooses the one whose distance to Z is minimum
- Since P(Z | U) = p^d (1 − p)^(n−d) decreases with the distance d when p < 1/2, minimizing Hamming distance is equivalent to maximizing likelihood
d(Z, U(m)) = Σ_j Σ_i (z_ji ⊕ u_ji(m)), where z_ji and u_ji(m) are the i-th bits of the j-th branch symbols of Z and U(m)
The basis of Viterbi decoding: whenever any two paths merge into a single state, one of them can always be eliminated in the search for the optimum path
Decoding Principle
- At each time ti, 2^(K−1) states are present in the trellis, and each state can be entered by means of two paths
- The decoder computes the metric of the two paths entering each state and eliminates one of them
- This computation is done for each of the 2^(K−1) states at time ti; the decoder then moves to time ti+1 and repeats the process
- At any time, the winning path metric for each state is termed the state metric for that state at that time
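A minimal sketch of this add-compare-select procedure, assuming the common rate-1/2, K = 3 convolutional code with generator polynomials (7, 5) octal (the code parameters are illustrative, not taken from the notes):

```python
# Illustrative sketch: Viterbi decoding for a rate-1/2, K = 3 code.
# At each step the decoder computes the metrics of the two paths
# entering each of the 2^(K-1) = 4 states and keeps the survivor.

def parity(x):
    """Parity (mod-2 sum) of the bits of x."""
    return bin(x).count("1") & 1

def conv_encode(bits, K=3, polys=(0b111, 0b101)):
    """Rate-1/2 feedforward convolutional encoder."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state      # shift register: [input, state]
        out.extend(parity(reg & g) for g in polys)
        state = reg >> 1                  # new state = top K-1 bits
    return out

def viterbi_decode(received, K=3, polys=(0b111, 0b101)):
    n_states = 1 << (K - 1)
    INF = float("inf")
    metrics = [0] + [INF] * (n_states - 1)   # trellis starts in state 0
    paths = [[] for _ in range(n_states)]
    for t in range(0, len(received), len(polys)):
        z = received[t:t + len(polys)]
        new_metrics = [INF] * n_states
        new_paths = [None] * n_states
        for s in range(n_states):            # predecessor state
            if metrics[s] == INF:
                continue
            for b in (0, 1):                 # hypothesised input bit
                reg = (b << (K - 1)) | s
                branch = [parity(reg & g) for g in polys]
                # Path metric = old state metric + Hamming branch metric.
                m = metrics[s] + sum(zi ^ ui for zi, ui in zip(z, branch))
                ns = reg >> 1
                # Compare the two entering paths; keep the survivor.
                if m < new_metrics[ns]:
                    new_metrics[ns] = m
                    new_paths[ns] = paths[s] + [b]
        metrics, paths = new_metrics, new_paths
    best = min(range(n_states), key=lambda s: metrics[s])
    return paths[best]
```

Encoding a message padded with K − 1 tail zeros, flipping one channel bit, and decoding recovers the original input, since the minimum-distance survivor is still the transmitted path.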
MDCT Unit 4: ML Decoding
Survivors at t2
Survivors at t3
Metric comparisons at t4
Survivors at t4
- Only one surviving path remains between times t1 and t2; it is termed the common stem
- The transition occurred between states 00 and 10, and since it is caused by input bit 1, the decoder output is 1
Metric comparisons at t5
Survivors at t5
Metric comparisons at t6
Survivors at t6
Sequential Decoding
- Proposed by Wozencraft and later modified by Fano
- Generates hypotheses about the transmitted codeword sequence and computes a metric between these hypotheses and the received signal
- Moves forward as long as the metric indicates its choices are likely; otherwise it moves backward, changing hypotheses until a likely one is found
- Works with both hard and soft decisions; however, soft decisions are normally avoided, since they require large storage and add complexity
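The forward/backward search above can be sketched with the closely related stack algorithm, which extends the currently most promising hypothesis first; poor hypotheses sink in the stack, playing the role of Fano's backward moves. The per-bit metric values (+1 on a match, −5 on a mismatch) are made-up integers standing in for the Fano metric, and the code parameters (rate 1/2, K = 3, generators 7 and 5 octal) are assumptions, not taken from the notes:

```python
# Illustrative sketch of sequential decoding via the stack algorithm.
import heapq

def parity(x):
    return bin(x).count("1") & 1

def conv_encode(bits, K=3, polys=(0b111, 0b101)):
    """Rate-1/2 feedforward convolutional encoder (for the example)."""
    state, out = 0, []
    for b in bits:
        reg = (b << (K - 1)) | state
        out.extend(parity(reg & g) for g in polys)
        state = reg >> 1
    return out

def stack_decode(received, n_bits, K=3, polys=(0b111, 0b101),
                 good=1, bad=-5):
    # Stack entries: (-metric, hypothesised input bits, encoder state).
    # heapq is a min-heap, so metrics are negated to pop the best first.
    stack = [(0, [], 0)]
    while stack:
        neg_m, bits, state = heapq.heappop(stack)
        if len(bits) == n_bits:
            return bits                  # best full-length hypothesis
        t = len(bits) * len(polys)
        z = received[t:t + len(polys)]
        for b in (0, 1):                 # extend the hypothesis by one bit
            reg = (b << (K - 1)) | state
            branch = [parity(reg & g) for g in polys]
            dm = sum(good if zi == ui else bad
                     for zi, ui in zip(z, branch))
            heapq.heappush(stack, (neg_m - dm, bits + [b], reg >> 1))
    return None
```

Unlike Viterbi decoding, the work done here depends on the channel noise: on a clean channel the correct hypothesis stays on top of the stack and is extended straight through, while errors force the decoder to pop and extend earlier, shorter hypotheses.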