INFORMATION THEORY AND
CODING
5th SEM E&C
JAYANTHDWIJESH H P, M.Tech (DECS)
Assistant Professor, Dept. of E&C
B.G.S INSTITUTE OF TECHNOLOGY (B.G.S.I.T)
B.G Nagara, Nagamangala Tq, Mandya District- 571448
INFORMATION THEORY AND CODING
B.E., V Semester, Electronics & Communication Engineering /
Telecommunication Engineering
[As per Choice Based Credit System (CBCS) scheme]
Subject Code: 15EC54                                  IA Marks: 20
Number of Lecture Hours/Week: 04                      Exam Marks: 80
Total Number of Lecture Hours: 50 (10 Hours/Module)   Exam Hours: 03
CREDITS: 04
Modules
Module-1
Information Theory: Introduction, Measure of information, Information content of message,
Average Information content of symbols in Long Independent sequences, Average
Information content of symbols in Long dependent sequences, Markov Statistical Model of
Information Sources, Entropy and Information rate of Markoff Sources (Section 4.1, 4.2 of
Text 2).
Module-2
Source Coding: Source coding theorem, Prefix Codes, Kraft-McMillan Inequality
(KMI) (Section 2.2 of Text 3).
Encoding of the Source Output, Shannon's Encoding Algorithm (Sections 4.3, 4.3.1 of Text
2).
Shannon-Fano Encoding Algorithm, Huffman Codes, Extended Huffman Coding, Arithmetic
Coding, Lempel-Ziv Algorithm (Sections 3.6, 3.7, 3.8, 3.10 of Text 1).
Module-3
Information Channels: Communication Channels (Section 4.4 of Text 2).
Channel Models, Channel Matrix, Joint Probability Matrix, Binary Symmetric Channel,
System Entropies, Mutual Information, Channel Capacity, Channel Capacity of: Binary
Symmetric Channel, Binary Erasure Channel, Muroga's Theorem, Continuous Channels
(Sections 4.2, 4.3, 4.4, 4.6, 4.7 of Text 1).
Module-4
Error Control Coding:
Introduction, Examples of Error Control Coding, Methods of Controlling Errors, Types of
Errors, Types of Codes, Linear Block Codes: Matrix Description of Linear Block Codes, Error
Detection and Error Correction Capabilities of Linear Block Codes, Single Error Correcting
Hamming Codes, Table Lookup Decoding using Standard Array.
Binary Cyclic Codes: Algebraic Structure of Cyclic Codes, Encoding using an (n-k) Bit
Shift register, Syndrome Calculation, Error Detection and Correction (Sections 9.1, 9.2, 9.3,
9.3.1, 9.3.2, 9.3.3 of Text 2).
Module-5
Some Important Cyclic Codes: Golay Codes, BCH Codes (Section 8.4 Article 5 of Text
3).
Convolution Codes: Convolution Encoder, Time Domain Approach, Transform Domain
Approach, Code Tree, Trellis and State Diagram, The Viterbi Algorithm (Section 8.5
Articles 1, 2 and 3, 8.6 Article 1 of Text 3).
Question paper pattern:
The question paper will have ten questions.
Each full question consists of 16 marks.
There will be 2 full questions (with a maximum of four sub-questions) from each
module.
Each full question will have sub-questions covering all the topics under a module.
The students will have to answer 5 full questions, selecting one full question from
each module.
Text Books:
1. Information Theory and Coding, Muralidhar Kulkarni, K. S. Shivaprakasha, Wiley India
Pvt. Ltd, 2015, ISBN: 978-81-265-5305-1.
2. Digital and Analog Communication Systems, K. Sam Shanmugam, John Wiley India Pvt.
Ltd, 1996.
3. Digital Communication, Simon Haykin, John Wiley India Pvt. Ltd, 2008.
Reference Books:
1. ITC and Cryptography, Ranjan Bose, TMH, II edition, 2007.
2. Principles of Digital Communication, J. Das, S. K. Mullick, P. K. Chatterjee, Wiley, 1986.
3. Digital Communications - Fundamentals and Applications, Bernard Sklar, Second Edition,
Pearson Education, 2016, ISBN: 9780134724058.
4. Information Theory and Coding, K. N. Hari Bhat, D. Ganesh Rao, Cengage
Learning, 2017.
FORMULAS FOR REFERENCE
MODULE 1(INFORMATION THEORY)
Amount of information or self-information:
I(s_i) = \log_2 (1/p_i) = -\log_2 p_i bits
Entropy of source or average information content of the source:
H(S) = \sum_{i=1}^{q} p_i \log_2 (1/p_i) bits/symbol
Information rate or average information rate:
R = r_s H(S) bits/sec, where r_s is the symbol rate in symbols/sec.
Bits:
I = \log_2 (1/p) bits
Hartleys or decits:
I = \log_{10} (1/p) Hartleys (decits)
Nats or nepers:
I = \ln (1/p) nats
Extremal or upper bound or maximum entropy (q equiprobable symbols):
H_max = \log_2 q bits/message-symbol
Source efficiency:
\eta_s = H(S)/H_max, or \eta_s = [H(S)/H_max] x 100%
Source redundancy:
R_\eta = 1 - \eta_s = [1 - H(S)/H_max] x 100%
The average information content of the symbols emitted from the i-th state:
H_i = \sum_{j} p_{ij} \log_2 (1/p_{ij}) bits/symbol,
where p_{ij} is the probability of a transition from state i to state j.
The average information content of the symbols emitted from the k-th state:
H_k = \sum_{j} p_{kj} \log_2 (1/p_{kj}) bits/symbol
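The source entropy of a Markov source is the average of these per-state entropies weighted by the stationary state probabilities, H = \sum_i P_i H_i. A minimal Python sketch of this calculation (the two-state transition matrix and its stationary distribution below are assumed example values, not taken from the text):

```python
import math

# Two-state Markov source (values assumed for illustration):
# P[i][j] = probability of moving from state i to state j.
P = [[0.75, 0.25],
     [0.50, 0.50]]
pi = [2 / 3, 1 / 3]   # stationary state probabilities: pi = pi * P

# Per-state entropies H_i = sum_j P[i][j] * log2(1/P[i][j])
H_state = [sum(p * math.log2(1 / p) for p in row if p > 0) for row in P]

# Source entropy H = sum_i pi[i] * H_i  (bits/symbol)
H = sum(p_i * H_i for p_i, H_i in zip(pi, H_state))
```

Note that H is at most the entropy of a memoryless source with the same symbol probabilities; the dependence between successive symbols reduces the average information per symbol.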
The average information content per symbol in a message of length N:
G_N = (1/N) \sum_{i} p(m_i) \log_2 (1/p(m_i)) bits/symbol,
summed over all messages m_i of length N, with \lim_{N \to \infty} G_N = H.
The entropy of the second-order symbols:
G_2 = (1/2) \sum_{i} p(m_i) \log_2 (1/p(m_i)), i.e. G_N with N = 2.
The entropy of the third-order symbols:
G_3 = (1/3) \sum_{i} p(m_i) \log_2 (1/p(m_i)), i.e. G_N with N = 3.
Log properties (change of base):
1. \log_2 x = \log_{10} x / \log_{10} 2
2. \log_2 x = \ln x / \ln 2
3. \ln x = \log_{10} x \cdot \ln 10, where \ln 10 = 2.3026
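The Module 1 quantities above chain together directly: entropy, maximum entropy, efficiency, redundancy and information rate. A short Python sketch (the source probabilities and the symbol rate r_s are assumed example values):

```python
import math

def entropy(probs):
    """H(S) = sum_i p_i * log2(1/p_i), in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # example source probabilities (assumed)
H = entropy(probs)                   # entropy, bits/symbol
H_max = math.log2(len(probs))        # maximum entropy log2(q), q = 4
efficiency = H / H_max               # source efficiency
redundancy = 1 - efficiency          # source redundancy
r_s = 1000                           # symbol rate in symbols/sec (assumed)
R = r_s * H                          # information rate, bits/sec
```

For this source H = 1.75 bits/symbol against H_max = 2 bits/symbol, giving an efficiency of 87.5%.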
FORMULAS FOR REFERENCE
MODULE 2 (source coding)
Entropy of source or average information content of the source:
H(S) = \sum_{i=1}^{q} p_i \log_2 (1/p_i) bits/symbol
Average length:
L = \sum_{i=1}^{q} p_i l_i bits/symbol, where l_i is the length of the codeword for s_i.
Source or code efficiency:
\eta_c = [H(S)/L] x 100%
Source or code redundancy:
R_c = 1 - \eta_c = [1 - H(S)/L] x 100%
Compute the number of stages (source reductions) required for the encoding
operation, which is given by
n = (q - r)/(r - 1) for an r-ary code, i.e. n = q - 2 for a binary code.
The probabilities of 0s, 1s and 2s in the code are found using the formulas
P(0) = (1/L) \sum_{i} [number of 0s in the codeword for s_i] p_i,
P(1) = (1/L) \sum_{i} [number of 1s in the codeword for s_i] p_i,
P(2) = (1/L) \sum_{i} [number of 2s in the codeword for s_i] p_i.
The variance of the word length is calculated from
Var(l_i) = E[(l_i - L)^2] = \sum_{i} p_i (l_i - L)^2
The smallest integer value of l_i is found using
2^{l_i} >= 1/p_i, i.e. l_i = \lceil \log_2 (1/p_i) \rceil
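These Shannon code lengths always satisfy the Kraft-McMillan inequality, so a prefix code with these lengths exists. A minimal Python check (the probabilities are assumed example values):

```python
import math

def shannon_lengths(probs):
    """l_i = smallest integer with 2**l_i >= 1/p_i, i.e. ceil(log2(1/p_i))."""
    return [math.ceil(math.log2(1.0 / p)) for p in probs]

def kraft_sum(lengths, r=2):
    """Kraft-McMillan sum; <= 1 means a prefix code with these lengths exists."""
    return sum(r ** (-l) for l in lengths)

probs = [0.4, 0.3, 0.2, 0.1]         # example source (assumed)
lengths = shannon_lengths(probs)     # -> [2, 2, 3, 4]
assert kraft_sum(lengths) <= 1       # KMI satisfied, prefix code exists
```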
The average length of the 2nd extension is given by
L_2 = \sum_{i=1}^{q^2} p_i l_i bits/symbol (sum over the q^2 symbols of S^2)
The average length of the 3rd extension is given by
L_3 = \sum_{i=1}^{q^3} p_i l_i bits/symbol (sum over the q^3 symbols of S^3)
The entropy of the 2nd extended source is calculated as
H(S^2) = 2 H(S)
The entropy of the 3rd extended source is calculated as
H(S^3) = 3 H(S)
Source or code efficiency of the 2nd extended source is
\eta_{c2} = [H(S^2)/L_2] x 100% = [2 H(S)/L_2] x 100%
Source or code redundancy of the 2nd extended source is
R_{c2} = 1 - \eta_{c2} = [1 - H(S^2)/L_2] x 100%
Source or code efficiency of the 3rd extended source is
\eta_{c3} = [H(S^3)/L_3] x 100% = [3 H(S)/L_3] x 100%
Source or code redundancy of the 3rd extended source is
R_{c3} = 1 - \eta_{c3} = [1 - H(S^3)/L_3] x 100%
The average length of the Huffman ternary code is given by
L = \sum_{i} p_i l_i trinits/message-symbol
The average length of the Huffman quaternary code is given by
L = \sum_{i} p_i l_i quaternary digits/message-symbol
The entropy in ternary units/message-symbol is found by using the equation
H_3(S) = H(S)/\log_2 3 = \sum_{i} p_i \log_3 (1/p_i) ternary units/message-symbol
The entropy in quaternary units/message-symbol is found by using the equation
H_4(S) = H(S)/\log_2 4 = \sum_{i} p_i \log_4 (1/p_i) quaternary units/message-symbol
Source or code efficiency of the ternary code is given by
\eta_c = [H_3(S)/L] x 100%
Source or code efficiency of the quaternary code is given by
\eta_c = [H_4(S)/L] x 100%
Source or code redundancy of the ternary code is given by
R_c = 1 - \eta_c = [1 - H_3(S)/L] x 100%
Source or code redundancy of the quaternary code is given by
R_c = 1 - \eta_c = [1 - H_4(S)/L] x 100%
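The binary Huffman procedure behind these formulas (merge the two least-probable symbols, repeat) can be sketched in Python; the snippet computes codeword lengths, the average length L and the code efficiency for an assumed example source:

```python
import heapq
import math

def huffman_lengths(probs):
    """Binary Huffman: return codeword lengths via repeated merging of the
    two least-probable nodes (q - 2 reduction stages for q symbols)."""
    # Heap entries: (probability, unique id, list of original symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to these symbols
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]                 # example source (assumed)
lengths = huffman_lengths(probs)
L = sum(p * l for p, l in zip(probs, lengths))    # average length, bits/symbol
H = sum(p * math.log2(1 / p) for p in probs)      # entropy H(S)
eff = H / L                                       # code efficiency
```

For this source L = 2.2 bits/symbol, and since H(S) <= L always holds, eff <= 1.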
FORMULAS FOR REFERENCE
MODULE 3 ( INFORMATION CHANNELS)
The entropy of the input symbols is given by
H(A) = \sum_{i} p(a_i) \log_2 (1/p(a_i)) bits/message-symbol
The entropy of the output symbols is given by
H(B) = \sum_{j} p(b_j) \log_2 (1/p(b_j)) bits/message-symbol
The joint entropy is given by
H(A, B) = \sum_{i} \sum_{j} p(a_i, b_j) \log_2 (1/p(a_i, b_j)) bits/message-symbol
The equivocation H(A/B) is given by
H(A/B) = \sum_{i} \sum_{j} p(a_i, b_j) \log_2 (1/p(a_i/b_j)) bits/message-symbol,
since p(a_i/b_j) = p(a_i, b_j)/p(b_j).
And another formula:
H(A/B) = H(A, B) - H(B) bits/message-symbol
The equivocation H(B/A) is given by
H(B/A) = \sum_{i} \sum_{j} p(a_i, b_j) \log_2 (1/p(b_j/a_i)) bits/message-symbol,
since p(b_j/a_i) = p(a_i, b_j)/p(a_i).
And another formula:
H(B/A) = H(A, B) - H(A) bits/message-symbol
The mutual information is given by
I(A, B) = H(A) - H(A/B) bits/message-symbol
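All of the system entropies above follow from the joint probability matrix alone. A Python sketch for a binary symmetric channel (the JPM values, equiprobable inputs with error probability 0.1, are assumed for illustration):

```python
import math

def entropy_bits(ps):
    """Entropy of a probability list, in bits."""
    return sum(p * math.log2(1 / p) for p in ps if p > 0)

# Joint probability matrix P(a_i, b_j) for a BSC with equiprobable inputs
# and error probability 0.1 (assumed example values).
P = [[0.45, 0.05],
     [0.05, 0.45]]

pA = [sum(row) for row in P]                        # marginal P(a_i)
pB = [sum(P[i][j] for i in range(2)) for j in range(2)]
H_A = entropy_bits(pA)                              # input entropy H(A)
H_B = entropy_bits(pB)                              # output entropy H(B)
H_AB = entropy_bits([p for row in P for p in row])  # joint entropy H(A, B)
H_A_given_B = H_AB - H_B                            # equivocation H(A/B)
I_AB = H_A - H_A_given_B                            # mutual information I(A, B)
```

With equiprobable inputs, I(A, B) here equals the BSC capacity 1 - H(p) with p = 0.1, about 0.531 bits/message-symbol.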
The channel capacity (of a uniform channel) is given by
C = r_s (\log_2 s - h) bits/sec,
where s is the number of symbols and
h = \sum_{j} p(b_j/a_i) \log_2 (1/p(b_j/a_i)) bits/message-symbol
is the entropy of any one row of the channel matrix.
The capacity of the continuous (band-limited Gaussian) channel is given by
C = B \log_2 (1 + S/N) bits/sec
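The Shannon-Hartley formula above is a one-liner; the bandwidth and SNR below are assumed example values (a 3 kHz telephone-grade channel at 30 dB):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """C = B * log2(1 + S/N) bits/sec (S/N as a linear power ratio)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (30 / 10)            # 30 dB -> 1000 (linear power ratio)
C = shannon_capacity(3000, snr)  # approximately 29,900 bits/sec
```

Note the dB-to-linear conversion: the formula takes S/N as a power ratio, not in dB.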
Source efficiency is given by
\eta_s = [H(A)/H_max] x 100%
Source redundancy is given by
R_s = (1 - \eta_s) x 100%
Channel efficiency is given by
\eta_{ch} = [I(A, B)/C] x 100%
Channel redundancy is given by
R_{ch} = (1 - \eta_{ch}) x 100%
Estimation of channel capacity by Muroga's method.
FORMULAS FOR REFERENCE
MODULE 4 (error control coding and binary cyclic codes)
Chapter -1 error control coding
The code vector is given by
[C] = [D][G]
The message vector is given by
[D] = [d_1 d_2 ... d_k], the k message bits.
The parity check matrix is given by
[H] = [P^T : I_{n-k}]
The generator matrix is given by
[G] = [I_k : P]
The syndrome is given by
S = R H^T (computed for the received vector R or r)
The corrected code vector is given by
C = R + E (modulo-2), where E is the error pattern.
The single error correcting (n, k) Hamming code has the following parameters:
Code length: n = 2^m - 1
Number of message bits: k = n - \log_2 (n + 1) = 2^m - 1 - m
Number of parity check bits: n - k = m
Error correcting capability: t = 1
The error correcting capability t is given by
t = \lfloor (d_min - 1)/2 \rfloor
HAMMING BOUND (perfect code or not):
2^{n-k} >= \sum_{i=0}^{t} \binom{n}{i}, with equality for a perfect code.
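The matrix relations [C] = [D][G], S = R H^T and single-error correction can be sketched end-to-end for a systematic (7, 4) Hamming code in Python. The particular parity submatrix P below is one standard choice, assumed for illustration:

```python
# Systematic (7,4) Hamming code: G = [I_4 : P], H = [P^T : I_3], arithmetic mod 2.
# This P (rows distinct, weight >= 2) is one valid choice, assumed here.
P = [[1, 1, 0],
     [0, 1, 1],
     [1, 1, 1],
     [1, 0, 1]]
I3 = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
Ht = P + I3                        # rows of H^T: P rows, then identity rows

def encode(d):
    """[C] = [D][G] mod 2: the 4 message bits followed by 3 parity bits."""
    parity = [sum(d[i] * P[i][j] for i in range(4)) % 2 for j in range(3)]
    return d + parity

def syndrome(r):
    """S = R H^T mod 2; the all-zero syndrome means no detectable error."""
    return [sum(r[i] * Ht[i][j] for i in range(7)) % 2 for j in range(3)]

def correct(r):
    """Single-error correction: the syndrome equals the H^T row (H column)
    of the errored position, so flip that bit."""
    s = syndrome(r)
    if s == [0, 0, 0]:
        return r
    pos = Ht.index(s)
    return [b ^ (1 if i == pos else 0) for i, b in enumerate(r)]

c = encode([1, 0, 1, 1])
r = c[:]
r[2] ^= 1                          # introduce a single bit error
assert correct(r) == c             # the error is located and corrected
```

Since n = 7, k = 4, t = 1 gives 2^{n-k} = 8 = C(7,0) + C(7,1), this code meets the Hamming bound with equality: it is a perfect code.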
Chapter-2 binary cyclic codes
The Non systematic cyclic code vector is given by
V(X) =D(X) g(X)
The systematic cyclic code:
The remainder polynomial R(X) is given by
R(X) = remainder of [X^{n-k} D(X) / g(X)]
The code vector is given by
[V] = [R : D], i.e. V(X) = X^{n-k} D(X) + R(X)
The parity check polynomial is given by
h(X) = (X^n + 1)/g(X)
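The systematic encoding step, R(X) = remainder of X^{n-k} D(X)/g(X), is GF(2) polynomial division. A Python sketch for a (7, 4) cyclic code with g(X) = 1 + X + X^3, a standard generator for n = 7 (the message bits are assumed example values):

```python
def poly_mod2_remainder(dividend, divisor):
    """Remainder of GF(2) polynomial division; polynomials are bit lists,
    lowest-degree coefficient first."""
    r = dividend[:]
    # Cancel the leading term at position i by XORing in the aligned divisor.
    for i in range(len(r) - 1, len(divisor) - 2, -1):
        if r[i]:
            for j, g_j in enumerate(divisor):
                r[i - len(divisor) + 1 + j] ^= g_j
    return r[:len(divisor) - 1]

g = [1, 1, 0, 1]                        # g(X) = 1 + X + X^3
d = [1, 0, 1, 1]                        # D(X) = 1 + X^2 + X^3 (assumed message)
shifted = [0, 0, 0] + d                 # X^(n-k) D(X), n-k = 3
R = poly_mod2_remainder(shifted, g)     # remainder R(X)
V = R + d                               # systematic code vector [V] = [R : D]

# Every valid codeword polynomial is a multiple of g(X).
assert poly_mod2_remainder(V, g) == [0, 0, 0]
```

This is exactly what the (n - k)-bit shift register encoder computes in hardware: the register contents after the message is clocked in are the remainder bits.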
FORMULAS FOR REFERENCE
MODULE 5 (Some Important Cyclic Codes and Convolution
Codes)
Chapter -1 Some Important Cyclic Codes
Reed Solomon (RS) codes
Chapter-2 Convolution Codes
Time domain approach
The generator matrix G is given by:
Number of rows in [G] = L (one input) or 2L (two inputs)
Number of columns in [G] = n(L + M),
where L is the message length, M is the encoder memory and n is the number of
outputs per input bit.
The output of the encoder is given by
[C] = [d][G]
Transform domain approach
The output polynomials for the j-th (j = 1, 2, 3) adders are given by
C^{(j)}(X) = d(X) g^{(j)}(X), for j = 1, 2, 3.
The output of the encoder (after multiplexing the adder outputs) is given by
C(X) = C^{(1)}(X^n) + X C^{(2)}(X^n) + ... + X^{n-1} C^{(n)}(X^n)
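The time-domain view above (each output stream is the mod-2 convolution of the input with a generator sequence, then the streams are multiplexed) can be sketched in Python. The rate-1/2, constraint-length-3 generators (1,1,1) and (1,0,1) are the standard textbook example, assumed here:

```python
def conv_encode(d, gens):
    """Rate-1/n convolutional encoder, time-domain approach: output stream j
    is the mod-2 convolution of input d with generator gens[j]; the streams
    are interleaved (multiplexed) into one serial output list."""
    M = len(gens[0]) - 1                  # memory (shift-register stages)
    total = len(d) + M                    # flush the register with M zeros
    out = []
    for t in range(total):
        for g in gens:
            bit = 0
            for k, g_k in enumerate(g):
                if g_k and 0 <= t - k < len(d):
                    bit ^= d[t - k]       # mod-2 convolution term
            out.append(bit)
    return out

# Standard rate-1/2, constraint-length-3 encoder: g1 = (1,1,1), g2 = (1,0,1)
code = conv_encode([1, 0, 1, 1], [[1, 1, 1], [1, 0, 1]])
# code is the multiplexed pair sequence 11 10 00 01 01 11
```

The same result follows from the transform-domain approach by computing d(X) g^{(j)}(X) for each adder and interleaving the coefficient sequences.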
ITC PREVIOUS YEARS QUESTION PAPERS (2006-2017)
SHIRDI SAI ENGG COLLEGE