Digital Communication
Chapter 3: Information Theory and Coding
Information Source
• Information refers to any new knowledge about something.
• The outcome of an event is picked from a sample space (the set of possible outcomes) of the experiment.
• Each element of the source alphabet is called a symbol and has some probability of occurrence.
Discrete Memoryless Source (DMS)
• A source is discrete if the number of symbols in its alphabet is discrete/finite.
• A source is memoryless if the occurrence of one event is independent of the occurrence of previous events.
Ex 1: Binary symmetric source:
• It is described by the alphabet 𝓜 = {0,1}. In this source, there are two
symbols,
𝑋1 = 0 and 𝑋2 = 1
• The information contained in a symbol s_i of probability P(s_i) is
I(s_i) = log2(1/P(s_i)) = −log2 P(s_i) bits
Properties of Information
• If P(s_i) < P(s_j) then I(s_i) > I(s_j) (the higher the probability, the lower the information).
Ex: A DMS has source alphabet 𝓜 = {𝑠1 , 𝑠2 , 𝑠3 , 𝑠4 } with 𝑃(𝑠1 ) = 0.5,
𝑃(𝑠2 ) = 0.25 and 𝑃(𝑠3 ) = 𝑃(𝑠4 ) = 0.125. Find information contained in
each symbol
Ans:
• P(s1) = 0.5, so I(s1) = −log2 P(s1) = −log2 0.5 = 1 bit
• P(s2) = 0.25, so I(s2) = −log2 0.25 = 2 bits
• P(s3) = P(s4) = 0.125, so I(s3) = I(s4) = −log2 0.125 = 3 bits
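A quick numeric check of these values (a minimal Python sketch; the probabilities are those of the example above):

```python
import math

# Symbol probabilities from the example DMS
P = {"s1": 0.5, "s2": 0.25, "s3": 0.125, "s4": 0.125}

# Information content I(s) = -log2 P(s) of each symbol
for sym, p in P.items():
    print(f"I({sym}) = {-math.log2(p):.0f} bits")   # 1, 2, 3, 3 bits
```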
• The average information transmitted by the source, called the entropy H(S), is often more interesting than the information contained in the individual symbols.
H(S) = E[I(s_i)] = Σ_{i=1}^{m} P(s_i) I(s_i) = −Σ_{i=1}^{m} P(s_i) log2 P(s_i) bits/symbol
Ex: Find the entropy of the DMS of the previous problem.
Ans:
H(S) = −Σ_{i=1}^{m} P(s_i) log2 P(s_i)
     = 0.5×1 + 0.25×2 + 0.125×3 + 0.125×3
     = 1.75 bits/symbol
• The average information contained in one symbol transmitted by the source is 1.75 bits.
• For M = 4, H(S) ≤ log2 M = 2 bits/symbol, with equality when all symbols are equiprobable.
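As a cross-check, a small Python sketch that computes H(S) for the same source and compares it with the log2 M bound:

```python
import math

P = [0.5, 0.25, 0.125, 0.125]             # P(s1) ... P(s4)

# Entropy H(S) = -sum P(si) log2 P(si)
H = -sum(p * math.log2(p) for p in P)
print(f"H(S)   = {H:.2f} bits/symbol")    # 1.75

# Upper bound log2(M), reached only for equiprobable symbols
print(f"log2 M = {math.log2(len(P)):.2f} bits/symbol")  # 2.00
```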
Information Rate (R)
• If the source S emits r symbols per second, then the information rate of the source is given by
R = rH(S) bits/sec
Ex: Find the information rate of the previous problem if the source emits 1000 symbols/sec.
Ans:
R = rH(S) = 1000 × 1.75 = 1750 bits/sec
Ex: A binary source emits two symbols with P(s1) = p and P(s2) = 1 − p. Find the value of p for which the entropy of the source is maximum.
Ans:
P(s1) = p, P(s2) = 1 − p
• H(S) = −p log2 p − (1 − p) log2(1 − p)
Setting dH(S)/dp = 0:
log2((1 − p)/p) = 0
(1 − p)/p = 1  ⟹  p = 1/2
• So the entropy is maximum, H(S) = 1 bit/symbol, when the two symbols are equiprobable.
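A short sketch (not from the slides) that sweeps p numerically and confirms that the binary entropy peaks at p = 1/2:

```python
import math

def binary_entropy(p):
    """H(S) = -p*log2(p) - (1-p)*log2(1-p); defined as 0 at p = 0 or 1."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Sweep p over [0, 1] and locate the maximum
ps = [i / 1000 for i in range(1001)]
p_max = max(ps, key=binary_entropy)
print(p_max, binary_entropy(p_max))     # 0.5  1.0
```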
Discrete Memoryless Channel (DMC)
• The channel is the path through which information flows from the transmitter to the receiver.
• In each signaling interval, the channel accepts an input symbol from the source alphabet (X).
• In response, it generates a symbol from the destination alphabet (Y).
• If the channel output depends only on the present input of the channel, then the channel is memoryless.
Channel Transition Probability
• Each input-output path is represented by its channel transition probability.
• The transition probability between the jth source symbol x_j and the ith destination symbol y_i is the conditional probability P(y_i | x_j).
Ex: If P(y1|x1) = 0.9, P(y2|x2) = 0.8, P(x1) = 0.6 and P(x2) = 0.4, find the channel transition matrix and the output probabilities.
Ans:
P(X) = [P(x1) P(x2)] = [0.6 0.4]
Channel transition matrix (rows indexed by x_j, columns by y_i):
P(Y|X) = [0.9 0.1; 0.2 0.8]
P(Y) = P(X) P(Y|X) = [0.6×0.9 + 0.4×0.2   0.6×0.1 + 0.4×0.8] = [0.62 0.38]
So, P(y1) = 0.62 and P(y2) = 0.38.
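The same calculation with NumPy (a minimal sketch of the matrix product P(Y) = P(X)·P(Y|X), using the numbers of the example):

```python
import numpy as np

P_X = np.array([0.6, 0.4])               # input (prior) probabilities
P_Y_given_X = np.array([[0.9, 0.1],      # row j: [P(y1|xj), P(y2|xj)]
                        [0.2, 0.8]])

P_Y = P_X @ P_Y_given_X                  # P(Y) = P(X) P(Y|X)
print(P_Y)                               # [0.62 0.38]
```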
Conditional Entropy
• Let the symbol y_j be observed at the destination.
• This symbol may have been caused by the transmission of any symbol x_1, x_2, x_3, …, x_m from the input source alphabet.
• The conditional entropy H(X|y_j) is the uncertainty about the transmitted source symbol X after observing the output y_j:
H(X|y_j) = −Σ_{i=1}^{m} P(x_i|y_j) log2 P(x_i|y_j)
• H(X|Y) is the uncertainty about the channel input after the channel output is observed:
H(X|Y) = Σ_{j=1}^{n} P(y_j) H(X|y_j) = −Σ_{j=1}^{n} Σ_{i=1}^{m} P(x_i, y_j) log2 P(x_i|y_j)
• H(X): uncertainty about the channel input before the channel output is observed.
• H(X|Y): uncertainty about the channel input after the channel output is observed.
• Mutual information: I(X;Y) = H(X) − H(X|Y)
• Channel capacity per symbol: Cs = max_{P(X)} I(X;Y) bits/symbol
• Channel capacity per second: C = rCs bits/sec
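As an illustration (not an example from these notes), the mutual information of the channel in the earlier transition-matrix example can be evaluated numerically; Cs would be the maximum of this quantity over all input distributions P(X):

```python
import numpy as np

P_X = np.array([0.6, 0.4])
P_Y_given_X = np.array([[0.9, 0.1],
                        [0.2, 0.8]])

def H(p):
    """Entropy (in bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

P_Y = P_X @ P_Y_given_X
H_Y_given_X = np.sum(P_X * np.array([H(row) for row in P_Y_given_X]))

# I(X;Y) = H(Y) - H(Y|X), which equals H(X) - H(X|Y)
I_XY = H(P_Y) - H_Y_given_X
print(f"I(X;Y) = {I_XY:.3f} bits/symbol")    # about 0.388
```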
• Channel-coding theorem:
Communication over a discrete memoryless channel with arbitrarily small probability of error is possible as long as the transmission rate R satisfies
R ≤ C
Shannon’s formula for channel capacity in an AWGN channel
C = B log2(1 + S/N) bits/sec
where B is the channel bandwidth, S the received signal power and N = ηB the noise power.
• Ex: Consider an AWGN channel with 4 kHz bandwidth and noise power spectral density η/2 = 10^-12 Watt/Hz. If the signal power at the receiver is 0.1 mW, find the maximum rate at which information can be transmitted with arbitrarily small probability of error.
• Ans:
Signal power: S = 0.1 × 10^-3 = 10^-4 W
Channel bandwidth: B = 4000 Hz
Noise power: N = ηB = 2 × 10^-12 × 4000 = 8 × 10^-9 W
SNR: S/N = 10^-4 / (8 × 10^-9) = 1.25 × 10^4
Channel capacity: C = B log2(1 + S/N) = 4000 × log2(1 + 1.25 × 10^4) = 54.44 kbps
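A minimal sketch reproducing this calculation (constants taken from the example):

```python
import math

B = 4000            # bandwidth, Hz
S = 0.1e-3          # received signal power, W
eta = 2e-12         # noise PSD eta/2 = 1e-12 W/Hz  ->  eta = 2e-12 W/Hz
N = eta * B         # noise power, W

C = B * math.log2(1 + S / N)
print(f"C = {C / 1000:.2f} kbps")       # about 54.44 kbps
```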
Bandwidth-SNR Tradeoff
C = B log2(1 + S/N) = B log2(1 + S/(ηB))
  = (S/η)(ηB/S) log2(1 + S/(ηB))
  = (S/η) log2(1 + S/(ηB))^(ηB/S)
C∞ = lim_{B→∞} C = (S/η) lim_{B→∞} log2(1 + S/(ηB))^(ηB/S) = (S/η) log2 e ≈ 1.44 S/η
• Even with infinite bandwidth the capacity remains finite; it is limited by the signal-to-noise-density ratio S/η.
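A small numerical sketch (reusing S and η from the previous example, an assumption for illustration) showing the capacity saturating toward 1.44·S/η as B grows:

```python
import math

S, eta = 1e-4, 2e-12        # signal power and noise PSD from the AWGN example

def capacity(B):
    """Shannon capacity C = B log2(1 + S/(eta*B)) in bits/s."""
    return B * math.log2(1 + S / (eta * B))

for B in (4e3, 4e6, 4e9, 4e12):
    print(f"B = {B:.0e} Hz  ->  C = {capacity(B):.3e} bits/s")

print(f"limit 1.44*S/eta = {1.44 * S / eta:.3e} bits/s")   # about 7.2e7
```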
Forward Error Correction (FEC) Code
• Some redundant data is added to the original information at the transmitting end.
• This redundancy is used at the receiver to detect and correct errors in the received information.
• The receiver need not ask the transmitter for any additional information.
• Codeword: a unit of bits that can be decoded independently.
• The number of bits in a codeword is the code length.
• If k data bits are transmitted as an n-bit codeword, the number of redundant (check) bits is
m = n − k
• Code rate: R = k/n
• Data vector (k-dimensional): d = [d1, d2, d3, …, dk]
• Code vector (n-dimensional): c = [c1, c2, c3, …, cn]
Types of FEC codes
1. Block Codes 2. Convolutional Codes
For a systematic (n, k) block code, the first k code bits are the data bits themselves and the remaining m = n − k bits are parity checks:
c1 = d1
c2 = d2
⋮
ck = dk
c_{k+1} = h11·d1 + h12·d2 + h13·d3 + ⋯ + h1k·dk
c_{k+2} = h21·d1 + h22·d2 + h23·d3 + ⋯ + h2k·dk
⋮
cn = c_{k+m} = hm1·d1 + hm2·d2 + hm3·d3 + ⋯ + hmk·dk
• So, c = dG
• G: generator matrix (k × n)
c = dG = d[I_k  P] = [d  dP]
Example: (6,3) block code
In the calculations, use modulo-2 addition: 1+1 = 0, 1+0 = 1, 0+1 = 1, 0+0 = 0.
• Find all codewords for all possible data words.
• How many bit errors can the code detect?
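The generator matrix of this (6,3) example is not reproduced in the text above, so the sketch below assumes a typical systematic generator matrix G = [I3 | P]; with a different P the codewords change, but the procedure (c = dG with modulo-2 arithmetic) is the same:

```python
import numpy as np
from itertools import product

# Assumed parity submatrix P; the example's actual matrix may differ
P = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]])
G = np.hstack([np.eye(3, dtype=int), P])     # G = [I_k | P], size 3 x 6

codewords = []
for d in product([0, 1], repeat=3):          # all 2^k = 8 data words
    c = (np.array(d) @ G) % 2                # c = dG, modulo-2 arithmetic
    codewords.append(c)
    print(d, "->", c)

# For a linear code, d_min is the minimum weight of the nonzero codewords
d_min = min(int(c.sum()) for c in codewords if c.any())
print("d_min =", d_min, "-> detects up to", d_min - 1, "bit errors")
```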
Cyclic Codes
• For a cyclic code, the code polynomial is the product of the data polynomial and the generator polynomial:
c(x) = d(x) g(x)
• Example: Find the generator polynomial g(x) for a (7,4) cyclic code. Find the codewords for the following data words: 1010, 1111, 0001, 1000.
• Ans:
n = 7, k = 4, so n − k = 3
g(x) must be a degree n − k = 3 factor of x^7 + 1. Since x^7 + 1 = (x + 1)(x^3 + x + 1)(x^3 + x^2 + 1), take g(x) = x^3 + x^2 + 1.
For the data word 1010, d(x) = x^3 + x, so the code polynomial is
c(x) = d(x) g(x) = (x^3 + x)(x^3 + x^2 + 1) = x^6 + x^5 + x^4 + x
i.e. the codeword 1110010. The remaining data words are encoded in the same way.
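A minimal sketch of the same encoding, representing each polynomial by its coefficient list (highest degree first) and multiplying modulo 2 with g(x) = x^3 + x^2 + 1 as found above (the helper function is illustrative, not from the notes):

```python
def poly_mul_mod2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists, MSB first."""
    result = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            result[i + j] ^= ai & bj         # modulo-2 accumulation
    return result

g = [1, 1, 0, 1]                             # g(x) = x^3 + x^2 + 1
for d in ([1, 0, 1, 0], [1, 1, 1, 1], [0, 0, 0, 1], [1, 0, 0, 0]):
    c = poly_mul_mod2(d, g)                  # c(x) = d(x) g(x): a 7-bit codeword
    print(d, "->", c)
# 1010 -> [1, 1, 1, 0, 0, 1, 0], i.e. c(x) = x^6 + x^5 + x^4 + x
```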