VTU E&C CBCS Scheme 5th Sem Information Theory and Coding Module-2 Notes

INFORMATION THEORY AND CODING
B.E., V Semester, Electronics & Communication Engineering / Telecommunication Engineering
[As per Choice Based Credit System (CBCS) scheme]
Subject Code: 15EC54

Module-2
Source Coding: Source coding theorem, Prefix Codes, Kraft McMillan Inequality Property (KMI) (Section 2.2 of Text 3). Encoding of the Source Output, Shannon's Encoding Algorithm (Sections 4.3, 4.3.1 of Text 2). Shannon-Fano Encoding Algorithm, Huffman Codes, Extended Huffman Coding, Arithmetic Coding, Lempel-Ziv Algorithm (Sections 3.6, 3.7, 3.8, 3.10 of Text 1).

Question paper pattern:
- The question paper will have ten questions.
- Each full question consists of 16 marks.
- There will be 2 full questions (with a maximum of four sub-questions) from each module.
- Each full question will have sub-questions covering all the topics under a module.
- The students will have to answer 5 full questions, selecting one full question from each module.

Text Books:
1. Information Theory and Coding, Muralidhar Kulkarni, K. S. Shivaprakasha, Wiley India Pvt. Ltd, 2015, ISBN: 978-81-265-5305-1.
2. Digital and Analog Communication Systems, K. Sam Shanmugam, John Wiley India Pvt. Ltd, 1996.
3. Digital Communication, Simon Haykin, John Wiley India Pvt. Ltd, 2008.

Reference Books:
1. ITC and Cryptography, Ranjan Bose, TMH, II edition, 2007.
2. Principles of Digital Communication, J. Das, S. K. Mullick, P. K. Chatterjee, Wiley, 1986.
3. Digital Communications - Fundamentals and Applications, Bernard Sklar, Second Edition, Pearson Education, 2016, ISBN: 9780134724058.
4. Information Theory and Coding, K. N. Hari Bhat, D. Ganesh Rao, Cengage Learning, 2017.

INFORMATION THEORY AND CODING
5th SEM E&C

JAYANTHDWIJESH H P, M.Tech (DECS)
Assistant Professor, Dept. of E&C
B.G.S INSTITUTE OF TECHNOLOGY (B.G.S.I.T)
B.G Nagara, Nagamangala Tq, Mandya District - 571448
FORMULAS FOR REFERENCE

MODULE 2 (source coding)

Entropy of the source (average information content of the source):

H(S) = \sum_{i=1}^{q} p_i \log_2(1/p_i) bits/symbol, or equivalently H(S) = -\sum_{i=1}^{q} p_i \log_2 p_i bits/symbol

where p_i is the probability of the i-th source symbol and q is the number of source symbols.

Average codeword length:

L = \sum_{i=1}^{q} p_i l_i bits/symbol

where l_i is the length (in bits) of the codeword assigned to the i-th symbol.
Source or code efficiency:

\eta_c = \frac{H(S)}{L} \times 100 \%

Source or code redundancy:

R_{\eta_c} = 1 - \eta_c = \left(1 - \frac{H(S)}{L}\right) \times 100 \%
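
A minimal Python sketch of these four formulas; the probabilities and codeword lengths below are illustrative values, not taken from the notes:

```python
import math

# Hypothetical source: symbol probabilities and the lengths of their codewords
p = [0.4, 0.3, 0.2, 0.1]    # p_i, must sum to 1
l = [1, 2, 3, 3]            # l_i in bits (e.g. from a Huffman code)

H = sum(pi * math.log2(1 / pi) for pi in p)    # entropy H(S), bits/symbol
L = sum(pi * li for pi, li in zip(p, l))       # average length L, bits/symbol
efficiency = H / L * 100                       # eta_c in %
redundancy = 100 - efficiency                  # R = (1 - H(S)/L) x 100 %

print(f"H(S) = {H:.4f} bits/symbol, L = {L:.2f} bits/symbol")
print(f"efficiency = {efficiency:.2f} %, redundancy = {redundancy:.2f} %")
```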

Compute the number of source-reduction stages required for the encoding operation, which is given by

n = \frac{q - r}{r - 1}

where q is the number of source symbols and r is the size of the code alphabet (for a binary code, r = 2, this reduces to n = q - 2). See the sketch below.
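
A one-line check of this count, assuming q source symbols and an r-ary code alphabet (the values are illustrative):

```python
q, r = 7, 3                # e.g. 7 source symbols, ternary code
n = (q - r) / (r - 1)      # number of source-reduction stages
# If n is not an integer, zero-probability dummy symbols are usually appended
# until (q - r) is a multiple of (r - 1).
print(n)                   # -> 2.0
```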
The probabilities of 0s, 1s and 2s in the code are found using the formulas

P(0) = \frac{1}{L} \sum_{i=1}^{q} p_i \times (\text{number of 0s in the codeword for } s_i)

P(1) = \frac{1}{L} \sum_{i=1}^{q} p_i \times (\text{number of 1s in the codeword for } s_i)

P(2) = \frac{1}{L} \sum_{i=1}^{q} p_i \times (\text{number of 2s in the codeword for } s_i)
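
A sketch of the digit-probability computation, assuming a hypothetical ternary code (the symbols, probabilities and codewords are made up for illustration):

```python
# Hypothetical ternary code: (probability, codeword) for each source symbol
code = {"s1": (0.5, "0"), "s2": (0.3, "1"), "s3": (0.1, "20"), "s4": (0.1, "21")}

L = sum(p * len(cw) for p, cw in code.values())    # average length in trinits

def digit_prob(d):
    """P(d) = (1/L) * sum_i p_i * (number of d's in the codeword for s_i)."""
    return sum(p * cw.count(d) for p, cw in code.values()) / L

print({d: round(digit_prob(d), 3) for d in "012"})   # the three values sum to 1
```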

The variance of the codeword length is calculated from

\mathrm{Var}(l_i) = E[(l_i - L)^2] = \sum_{i=1}^{q} p_i (l_i - L)^2
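
The same kind of sketch for the length variance, reusing the hypothetical p and l from above:

```python
p = [0.4, 0.3, 0.2, 0.1]
l = [1, 2, 3, 3]

L = sum(pi * li for pi, li in zip(p, l))                # average length
var = sum(pi * (li - L) ** 2 for pi, li in zip(p, l))   # Var = E[(l_i - L)^2]
print(round(var, 4))
```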
The smallest integer value of l_i is found using

2^{l_i} \ge \frac{1}{p_i}, i.e. l_i \ge \log_2(1/p_i), so l_i = \lceil \log_2(1/p_i) \rceil
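
A short sketch of this step of Shannon's encoding algorithm; the probabilities are illustrative, and the Kraft sum is printed as a sanity check that a prefix code with these lengths exists:

```python
import math

p = [0.4, 0.25, 0.2, 0.15]                              # hypothetical probabilities
lengths = [math.ceil(math.log2(1 / pi)) for pi in p]    # smallest l_i with 2^l_i >= 1/p_i
kraft = sum(2 ** -li for li in lengths)                 # must be <= 1 (Kraft inequality)
print(lengths, kraft)                                   # -> [2, 2, 3, 3] 0.75
```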

The average codeword length of the 2nd extension is given by

L_2 = \sum_{i=1}^{q^2} p(\sigma_i) l_i bits/symbol

where \sigma_i are the symbols of the second extended source S^2.
The average codeword length of the 3rd extension is given by

L_3 = \sum_{i=1}^{q^3} p(\sigma_i) l_i bits/symbol
The entropy of the 2nd extended source is calculated as

H(S^2) = 2 H(S)
The entropy of the 3rd extended source is calculated as

H(S^3) = 3 H(S)
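
A numeric check of H(S^n) = n H(S), obtained by forming the n-th extension explicitly (the source probabilities are illustrative):

```python
import math
from itertools import product

p = [0.5, 0.3, 0.2]    # hypothetical source S

def entropy(probs):
    return sum(q * math.log2(1 / q) for q in probs if q > 0)

n = 2
ext = [math.prod(combo) for combo in product(p, repeat=n)]   # P(sigma_i) of S^n
print(round(entropy(ext), 6), round(n * entropy(p), 6))      # the two values agree
```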
Source or code efficiency of the 2nd extended source is

\eta_2 = \frac{H(S^2)}{L_2} \times 100 \% = \frac{2 H(S)}{L_2} \times 100 \%

Source or code redundancy of the 2nd extended source is

R_{\eta_2} = 1 - \eta_2 = \left(1 - \frac{H(S^2)}{L_2}\right) \times 100 \%
Source or code efficiency of the 3rd extended source is

\eta_3 = \frac{H(S^3)}{L_3} \times 100 \% = \frac{3 H(S)}{L_3} \times 100 \%

Source or code redundancy of the 3rd extended source is

R_{\eta_3} = 1 - \eta_3 = \left(1 - \frac{H(S^3)}{L_3}\right) \times 100 \%
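
A sketch that puts the extension formulas together; Shannon code lengths are used for the extended symbols here purely as a stand-in for whatever code a given problem assigns, and all values are illustrative:

```python
import math
from itertools import product

p = [0.7, 0.3]    # hypothetical source S
H = sum(q * math.log2(1 / q) for q in p)

n = 2             # second extension S^2
ext = [math.prod(c) for c in product(p, repeat=n)]
lengths = [math.ceil(math.log2(1 / q)) for q in ext]   # Shannon lengths for S^2
L2 = sum(q * l for q, l in zip(ext, lengths))          # bits per extended symbol

eta2 = n * H / L2 * 100                                # H(S^2)/L_2 = 2H(S)/L_2
print(f"eta_2 = {eta2:.2f} %, R_2 = {100 - eta2:.2f} %")
```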

The average length of the Huffman ternary code is given by

L = \sum_{i=1}^{q} p_i l_i trinits/message-symbol

The average length of the Huffman quaternary code is given by

L = \sum_{i=1}^{q} p_i l_i quaternary digits/message-symbol
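
A compact, heap-based r-ary Huffman sketch (with zero-probability dummy symbols so that every reduction merges exactly r entries); this is a generic illustration under assumed probabilities, not the worked tabular procedure used in the notes:

```python
import heapq
from itertools import count

def huffman_lengths(probs, r=3):
    """Codeword lengths of an r-ary Huffman code for the given probabilities."""
    tie = count()    # tie-breaker so the heap never has to compare lists
    # Each heap entry: (probability, tie, indices of the real symbols in that subtree)
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    while (len(heap) - 1) % (r - 1) != 0:          # pad with dummy symbols
        heap.append((0.0, next(tie), []))
    heapq.heapify(heap)

    depth = [0] * len(probs)                        # codeword length per real symbol
    while len(heap) > 1:
        merged_p, merged_syms = 0.0, []
        for _ in range(r):                          # merge the r least probable entries
            prob, _, syms = heapq.heappop(heap)
            merged_p += prob
            merged_syms += syms
        for i in merged_syms:                       # each merged symbol gains one digit
            depth[i] += 1
        heapq.heappush(heap, (merged_p, next(tie), merged_syms))
    return depth

p = [0.4, 0.2, 0.2, 0.1, 0.1]                       # hypothetical probabilities
lengths = huffman_lengths(p, r=3)
L = sum(pi * li for pi, li in zip(p, lengths))
print(lengths, round(L, 2), "trinits/message-symbol")
```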
The entropy in ternary units/message-symbol is found by using

H_3(S) = \frac{H(S)}{\log_2 3} ternary units/message-symbol, or

H_3(S) = \sum_{i=1}^{q} p_i \log_3(1/p_i) ternary units/message-symbol

The entropy in quaternary units/message-symbol is found by using

H_4(S) = \frac{H(S)}{\log_2 4} = \frac{H(S)}{2} quaternary units/message-symbol, or

H_4(S) = \sum_{i=1}^{q} p_i \log_4(1/p_i) quaternary units/message-symbol
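
A short unit-conversion sketch; H(S) in bits is computed from illustrative probabilities and then rescaled:

```python
import math

p = [0.4, 0.3, 0.2, 0.1]
H_bits = sum(pi * math.log2(1 / pi) for pi in p)   # bits/message-symbol
H3 = H_bits / math.log2(3)                         # ternary units/message-symbol
H4 = H_bits / math.log2(4)                         # quaternary units/message-symbol
print(round(H_bits, 4), round(H3, 4), round(H4, 4))
```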

Source or code efficiency of the ternary code is given by

\eta_c = \frac{H_3(S)}{L} \times 100 \%, or equivalently \eta_c = \frac{H(S)}{L \log_2 3} \times 100 \%

Source or code efficiency of the quaternary code is given by

\eta_c = \frac{H_4(S)}{L} \times 100 \%, or equivalently \eta_c = \frac{H(S)}{L \log_2 4} \times 100 \%

Source or code redundancy of the ternary code is given by

R_{\eta_c} = 1 - \eta_c = \left(1 - \frac{H_3(S)}{L}\right) \times 100 \%

Source or code redundancy of the quaternary code is given by

R_{\eta_c} = 1 - \eta_c = \left(1 - \frac{H_4(S)}{L}\right) \times 100 \%
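
Finally, a sketch tying the r-ary efficiency and redundancy together; the code alphabet size r and the codeword lengths are assumed for illustration:

```python
import math

def r_ary_efficiency(p, lengths, r):
    """eta = H_r(S) / L * 100, with H_r(S) = H(S) / log2(r)."""
    H_bits = sum(pi * math.log2(1 / pi) for pi in p)
    L = sum(pi * li for pi, li in zip(p, lengths))   # r-ary digits/message-symbol
    return H_bits / math.log2(r) / L * 100

p = [0.4, 0.2, 0.2, 0.1, 0.1]        # hypothetical source
lengths = [1, 1, 2, 2, 2]            # lengths of one possible ternary Huffman code
eta = r_ary_efficiency(p, lengths, r=3)
print(f"efficiency = {eta:.2f} %, redundancy = {100 - eta:.2f} %")
```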
