EC401 Information Theory & Coding
Module 2
Shannon-Fano Algorithm
Huffman Coding
Channel Capacity
Redundancy and Efficiency of a channel
Binary Symmetric Channel (BSC), Binary Erasure Channel (BEC)
Capacity of Band Limited Gaussian Channels
Construction of Basic Source Codes
Huffman Coding
Shannon-Fano Algorithm
The Huffman algorithm proceeds as follows:
1. The source symbols are listed in order of decreasing probability. The two source symbols of lowest probability are assigned a 0 and a 1.
2. These two source symbols are combined into a new source symbol with probability equal to the sum of the two original probabilities. (The list of source symbols, and therefore source statistics, is thereby reduced in size by one.) The probability of the new symbol is placed in the list in accordance with its value.
3. The procedure is repeated until we are left with a final list of source statistics (symbols) of only two, for which a 0 and a 1 are assigned.
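To make the procedure concrete, here is a minimal Python sketch of these steps (an illustration, not the slides' code); the symbol names and probabilities are hypothetical, and it also reports the average length and entropy asked for in the exercise below.

```python
import heapq
from math import log2

def huffman_code(probs):
    """Binary Huffman code for a dict {symbol: probability}."""
    # Heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Steps 1-2: combine the two least probable entries into one,
        # prefixing a 0 to one group's codewords and a 1 to the other's.
        p0, _, c0 = heapq.heappop(heap)
        p1, _, c1 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]   # step 3 ends with a single combined entry

# Hypothetical source statistics, for illustration only.
probs = {"s0": 0.4, "s1": 0.2, "s2": 0.2, "s3": 0.1, "s4": 0.1}
code = huffman_code(probs)
# {'s0': '11', 's1': '00', 's2': '01', 's3': '100', 's4': '101'}
L = sum(p * len(code[s]) for s, p in probs.items())   # average length = 2.2 bits/symbol
H = sum(p * log2(1 / p) for p in probs.values())      # entropy ≈ 2.12 bits/symbol
```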
Using the Huffman algorithm, derive the code words and hence calculate the average length and the entropy of the discrete memoryless source.
Entropy: H(S) = Σ p_k log2(1/p_k) bits/symbol
The Huffman encoding process is not unique. First, at each splitting stage there is arbitrariness in the way a 0 and a 1 are assigned to the last two source symbols. Second, ambiguity arises when the probability of a combined symbol (obtained by adding the last two probabilities pertinent to a particular step) is found to equal another probability in the list.
It is usually found that when a combined symbol is moved as high as possible, the resulting Huffman code has a significantly smaller variance of codeword lengths than when it is moved as low as possible. On this basis, it is reasonable to choose the former Huffman code over the latter.
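A quick numerical check of this claim, using an assumed textbook-style source for which both length sets below correspond to valid Huffman codes:

```python
def code_stats(probs, lengths):
    """Average codeword length and its variance."""
    L = sum(p * l for p, l in zip(probs, lengths))
    var = sum(p * (l - L) ** 2 for p, l in zip(probs, lengths))
    return L, var

probs = [0.4, 0.2, 0.2, 0.1, 0.1]
high = [2, 2, 2, 3, 3]   # combined symbols moved as high as possible
low = [1, 2, 3, 4, 4]    # combined symbols moved as low as possible
print(code_stats(probs, high))   # (2.2, 0.16)
print(code_stats(probs, low))    # (2.2, 1.36): same average, far larger variance
```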
Shannon-Fano Algorithm
1. List the source symbols in order of decreasing probability.
2. Partition this ensemble into two almost equiprobable groups. Assign a '0' to one group and a '1' to the other group. These form the starting code symbols of the code words.
3. Repeat step 2 on each of the subgroups until each subgroup contains only one source symbol, to determine the succeeding code symbols of the code words.
4. Read the code words directly.
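A minimal recursive Python sketch of this procedure (an illustration, not the slides' code); `shannon_fano` is a hypothetical helper name:

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability) sorted in decreasing
    probability (step 1). Returns {symbol: codeword}."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    # Step 2: split where the two groups are as nearly equiprobable as possible.
    split = min(range(1, len(symbols)),
                key=lambda i: abs(total - 2 * sum(p for _, p in symbols[:i])))
    # Step 3: recurse, assigning '0' to one group and '1' to the other.
    code = {s: "0" + w for s, w in shannon_fano(symbols[:split]).items()}
    code.update({s: "1" + w for s, w in shannon_fano(symbols[split:]).items()})
    return code
```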
Example: Consider the message ensemble
S = {S1, S2, S3, S4, S5, S6, S7, S8} with
P = {1/4, 1/4, 1/8, 1/8, 1/16, 1/16, 1/16, 1/16}
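Running the `shannon_fano` sketch above on this ensemble gives one valid code; because every probability here is a power of 1/2, the average length equals the entropy.

```python
from math import log2

ensemble = [("S1", 1/4), ("S2", 1/4), ("S3", 1/8), ("S4", 1/8),
            ("S5", 1/16), ("S6", 1/16), ("S7", 1/16), ("S8", 1/16)]
code = shannon_fano(ensemble)
# S1=00, S2=01, S3=100, S4=101, S5=1100, S6=1101, S7=1110, S8=1111
L = sum(p * len(code[s]) for s, p in ensemble)    # 2.75 bits/symbol
H = sum(p * log2(1 / p) for _, p in ensemble)     # 2.75 bits/symbol
```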
Source Encoding Theorem
Let S be a DMS with S = {s1, s2, …, sq} and P = {p1, p2, …, pq}. It is possible to construct a binary code that satisfies the prefix property and has an average length L that satisfies the inequality H(S) ≤ L < H(S) + 1.
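One way to see the upper bound is to pick codeword lengths l_k = ceil(log2(1/p_k)): these satisfy the Kraft inequality, so a prefix code with those lengths exists, and its average length obeys the theorem. A small check with assumed probabilities:

```python
from math import ceil, log2

probs = [0.5, 0.25, 0.15, 0.10]                  # hypothetical source statistics
lengths = [ceil(log2(1 / p)) for p in probs]     # [1, 2, 3, 4]
assert sum(2 ** -l for l in lengths) <= 1        # Kraft inequality holds
L = sum(p * l for p, l in zip(probs, lengths))   # 1.85 bits/symbol
H = sum(p * log2(1 / p) for p in probs)          # ≈ 1.74 bits/symbol
assert H <= L < H + 1                            # the stated inequality
```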
Discrete Memoryless Channel
The channel accepts an input symbol X selected from an input alphabet and, in response, emits an output symbol Y from an output alphabet. The channel is discrete and memoryless: each output symbol depends only on the current input symbol.
A discrete memoryless channel is therefore described by its input alphabet, its output alphabet, and the set of transition probabilities p(yj | xi) = P(Y = yj | X = xi).
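In matrix form, the transition probabilities can be collected into a channel matrix whose rows sum to 1. A short sketch (values assumed) showing how the output distribution and mutual information follow from the input distribution:

```python
import numpy as np

# Channel matrix P[i, j] = p(y_j | x_i); each row sums to 1.
# Assumed example: a BSC with crossover probability 0.1.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
px = np.array([0.5, 0.5])        # input distribution p(x_i)
py = px @ P                      # p(y_j) = sum_i p(x_i) p(y_j | x_i)
joint = px[:, None] * P          # p(x_i, y_j) = p(x_i) p(y_j | x_i)
I = sum(joint[i, j] * np.log2(joint[i, j] / (px[i] * py[j]))
        for i in range(2) for j in range(2))
print(py, I)                     # [0.5 0.5], I(X;Y) ≈ 0.531 bits
```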
Redundancy and Efficiency of a Channel
Channel efficiency is represented by η:
η = I(X;Y) / C
where I(X;Y) is the mutual information and C is the channel capacity.
Redundancy, E = 1 − η
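For instance, on a BSC with crossover probability p the capacity is C = 1 − H(p), so efficiency and redundancy can be computed directly (a sketch with assumed numbers):

```python
from math import log2

def Hb(p):
    """Binary entropy function, in bits."""
    return -p * log2(p) - (1 - p) * log2(1 - p)

p = 0.1                          # BSC crossover probability (assumed)
q = 0.3                          # input distribution P(X=1) (assumed, non-optimal)
r = q * (1 - p) + (1 - q) * p    # P(Y=1)
I = Hb(r) - Hb(p)                # mutual information I(X;Y) ≈ 0.456 bits
C = 1 - Hb(p)                    # BSC capacity ≈ 0.531 bits, achieved at q = 0.5
eta = I / C                      # efficiency ≈ 0.86
E = 1 - eta                      # redundancy ≈ 0.14
```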
Review Questions
1. Define channel capacity and write its expression.
2. State and prove the source encoding theorem.
3. Find the capacity of the binary symmetric channel (BSC).
4. Derive the capacity of the binary erasure channel (BEC).
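As a study aid (not part of the slides), the closed forms C_BSC = 1 − H(p) and C_BEC = 1 − α can be checked numerically by maximizing I(X;Y) over a grid of binary input distributions:

```python
import numpy as np

def mutual_info(px, P):
    """I(X;Y) in bits for input distribution px and channel matrix P."""
    py = px @ P
    joint = px[:, None] * P
    mask = joint > 0                       # skip zero-probability pairs
    prod = px[:, None] * py[None, :]
    return float(np.sum(joint[mask] * np.log2(joint[mask] / prod[mask])))

def capacity(P):
    """Brute-force capacity of a binary-input channel over a grid of inputs."""
    return max(mutual_info(np.array([1 - q, q]), P)
               for q in np.linspace(0.001, 0.999, 999))

p, a = 0.1, 0.2
BSC = np.array([[1 - p, p], [p, 1 - p]])
BEC = np.array([[1 - a, a, 0.0],           # output order: 0, erasure, 1
                [0.0, a, 1 - a]])
print(capacity(BSC))   # ≈ 0.531 = 1 - H(0.1)
print(capacity(BEC))   # ≈ 0.800 = 1 - 0.2
```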