Tutorial information theory and error coding with answers
2) a) What does Odd and Even Parity mean (w.r.t. error detection)? Use a diagram to help your explanation.
b) Assume Even Parity. Do the following received codewords look like they have no errors?
i) 00000000, ii) 10000011
c) Assume Odd Parity. Do the following received codewords look like they have no errors?
i) 00000000, ii) 10000011
4) Using a diagram to help you, explain how a Parity check bit added to 7 data bits can produce a situation in
which we have errors that are undetected.
5) Compare the Shannon-Hartley formula in its original form with the simple approximation we derived.
Assume an analogue bandwidth of 1kHz, and an SNR of 10dB. What do you conclude?
6) A telephone network has an effective bandwidth of about 4000Hz, and transmitted signals have an average
SNR around 20dB. Evaluate the theoretical maximum channel capacity of the assumed telephone network
using the Shannon-Hartley formula.
7) Assume the channel bandwidth, B, is very large (B=1MHz). The signal is affected by AWGN, with variance =
10 volts². The signal is a random signal for which logic 0 is represented by –A, and logic 1 is represented by +A.
Assuming the signal is (worst case) a squarewave of amplitude A, find the upper limit of the capacity, C.
8) A repetition code uses 2 replications (i.e. each transmitted bit is replicated twice). What is the probability
that 4 bits transmitted in this way will be free from errors after decoding? Assume BER=0.01.
ANSWERS
1) Consider the Binary Symmetric Channel model:
a) p = 0.01:

\[
\Pr(k \text{ bits in error out of } N) = \binom{N}{k} p^k (1-p)^{N-k} = \frac{N!}{(N-k)!\,k!}\,(0.01)^k (1-0.01)^{N-k}
\]
\[
\Pr(0 \text{ bits in error out of } 100) = \frac{100!}{(100-0)!\,0!}\,(0.01)^0 (1-0.01)^{100-0} = 0.3660
\]
p = 0.1:

\[
\Pr(k \text{ bits in error out of } N) = \binom{N}{k} p^k (1-p)^{N-k} = \frac{N!}{(N-k)!\,k!}\,(0.1)^k (1-0.1)^{N-k}
\]
\[
\Pr(0 \text{ bits in error out of } 100) = \frac{100!}{(100-0)!\,0!}\,(0.1)^0 (1-0.1)^{100-0} \approx 0.0000266
\]
b) p = 0.01:

\[
\Pr(k \text{ bits in error out of } N) = \binom{N}{k} p^k (1-p)^{N-k} = \frac{N!}{(N-k)!\,k!}\,(0.01)^k (1-0.01)^{N-k}
\]
\[
\Pr(2 \text{ bits in error out of } 100) = \frac{100!}{(100-2)!\,2!}\,(0.01)^2 (1-0.01)^{100-2} = 0.1848
\]
p = 0.1:

\[
\Pr(k \text{ bits in error out of } N) = \binom{N}{k} p^k (1-p)^{N-k} = \frac{N!}{(N-k)!\,k!}\,(0.1)^k (1-0.1)^{N-k}
\]
\[
\Pr(2 \text{ bits in error out of } 100) = \frac{100!}{(100-2)!\,2!}\,(0.1)^2 (1-0.1)^{100-2} = 0.0016
\]
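These four values can be checked numerically. A minimal sketch in Python (the function name is illustrative), using `math.comb` for the binomial coefficient:

```python
from math import comb

def pr_k_errors(n: int, k: int, p: float) -> float:
    """Probability of exactly k bit errors in n bits over a BSC with BER p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Part a): no errors in 100 bits
print(pr_k_errors(100, 0, 0.01))  # ~0.3660
print(pr_k_errors(100, 0, 0.1))   # ~0.0000266

# Part b): exactly 2 errors in 100 bits
print(pr_k_errors(100, 2, 0.01))  # ~0.1848
print(pr_k_errors(100, 2, 0.1))   # ~0.0016
```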
2) a) A Parity bit is appended to some data to ensure that the number of 1’s in the whole codeword (inc. the
Parity bit) is EVEN (for EVEN Parity), or ODD (for ODD Parity). See diagram below:
Even parity -- Even parity means the number of 1's in the given word including the parity bit should be even
(0,2,4,6,....).
Odd parity -- Odd parity means the number of 1's in the given word including the parity bit should be odd
(1,3,5,....).
b) Assuming Even Parity, do the following received codewords look like they have no errors:
i) 00000000
Yes, there are an even number of 1’s.
ii) 10000011
No, there are an odd number of 1’s.
c) Assuming Odd Parity, do the following received codewords look like they have no errors:
i) 00000000
No, there are an even number of 1’s (zero), so this codeword fails the odd-parity check.
ii) 10000011
Yes, there are an odd number of 1’s (three).
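The parity checks in (b) and (c) amount to counting 1’s. A minimal sketch, with illustrative function names:

```python
def passes_even_parity(codeword: str) -> bool:
    """A codeword passes even parity if it contains an even number of 1s."""
    return codeword.count("1") % 2 == 0

def passes_odd_parity(codeword: str) -> bool:
    return not passes_even_parity(codeword)

# b) Even parity
print(passes_even_parity("00000000"))  # True  (zero 1s: looks error-free)
print(passes_even_parity("10000011"))  # False (three 1s: error detected)

# c) Odd parity
print(passes_odd_parity("00000000"))   # False (error detected)
print(passes_odd_parity("10000011"))   # True  (looks error-free)
```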
3) The Hamming Distance is 2, meaning that only single-bit errors are guaranteed to be detected.
4) Using a diagram to help you, explain how a Parity check bit added to 7 data bits can produce a situation in
which we have errors that are undetected. Example shown:
0 1 0 1 0 1 1 becomes
0 0 0 0 0 1 1 – the received word still has even parity, so it looks OK, but it isn’t: two bits have been flipped, and an even number of errors leaves the parity unchanged.
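The undetected-error scenario above can be reproduced in code (a minimal sketch; the helper name is illustrative):

```python
def passes_even_parity(codeword: str) -> bool:
    return codeword.count("1") % 2 == 0

sent     = "0101011"  # the example codeword above (four 1s: even parity)
received = "0000011"  # bits 2 and 4 flipped in transit

# Both words contain an even number of 1s, so the two errors cancel out
# and the parity check cannot flag them.
print(passes_even_parity(sent))      # True
print(passes_even_parity(received))  # True -- the errors go undetected
```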
5) Compare the Shannon-Hartley formula in its original form with the simple approximation we derived.
Assume an analogue bandwidth of 1kHz, and an SNR of 10dB. What do you conclude?
They are still close even at this moderate SNR: the exact Shannon–Hartley result is C = 1000 × log₂(1 + 10) ≈ 3459 bps, which the approximation matches closely.
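As a rough numerical check (assuming the "simple approximation" in question is the high-SNR rule of thumb C ≈ B × SNR_dB / 3; this is an assumption, as the derivation is not reproduced here):

```python
import math

B = 1000                      # analogue bandwidth, Hz
snr_db = 10
snr = 10 ** (snr_db / 10)     # 10 dB -> power ratio of 10

exact = B * math.log2(1 + snr)   # Shannon-Hartley, original form
approx = B * snr_db / 3          # assumed rule-of-thumb approximation

print(round(exact))   # 3459
print(round(approx))  # 3333
```

Under that assumption the two agree to within a few percent even at only 10 dB.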
6) A telephone network has an effective bandwidth of about 4000Hz, and transmitted signals have an average
SNR around 20dB. Evaluate the theoretical maximum channel capacity of the assumed telephone network.
An SNR of 20dB corresponds to a power ratio of 100, so:

\[
C = B \log_2(1 + S/N) = 4000 \times \log_2(1 + 100) = 26632.8 \text{ bps}
\]
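The calculation can be verified with a few lines of Python:

```python
import math

B = 4000               # effective bandwidth, Hz
snr = 10 ** (20 / 10)  # 20 dB -> power ratio of 100

C = B * math.log2(1 + snr)  # Shannon-Hartley capacity
print(C)  # ~26632.8 bps
```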
7) Assume the channel bandwidth, B, is very large (B=1MHz). The signal is affected by AWGN, with variance =
10 volts². The signal is a random signal for which logic 0 is represented by –A, and logic 1 is represented by +A.
Assuming the signal is (worst case) a squarewave of amplitude A, find the upper limit of the capacity, C.
For a squarewave of amplitude A, the rms value is A, so the signal power is A². The noise power equals the variance, 10 volts²:

\[
\mathrm{SNR} = \frac{P_{\text{signal}}}{P_{\text{noise}}} = \frac{(\text{signal}_{\text{rms}})^2}{P_{\text{noise}}} = \frac{A^2}{10}
\]
\[
C = B \log_2\!\left(1 + \frac{A^2}{10}\right) \text{bps} = 1{,}000{,}000 \times \log_2\!\left(1 + \frac{A^2}{10}\right) \text{bps} = \log_2\!\left(1 + \frac{A^2}{10}\right) \text{Mbps}
\]
\[
C = \log_2\!\left(\frac{10 + A^2}{10}\right) \text{Mbps} = \log_2(10 + A^2) - \log_2 10 \approx \left(\log_2(10 + A^2) - 3.3219\right) \text{Mbps}
\]
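The resulting capacity limit depends only on A. A small sketch of the final expression (function name illustrative; noise variance of 10 volts² as given):

```python
import math

def capacity_mbps(A: float, noise_var: float = 10.0) -> float:
    """Upper-limit capacity in Mbps for B = 1 MHz, squarewave amplitude A,
    AWGN variance noise_var (signal power = A^2 for a squarewave)."""
    return math.log2(1 + A**2 / noise_var)

# For A = sqrt(10) volts the SNR is exactly 1, giving C = 1 Mbps
print(capacity_mbps(math.sqrt(10)))  # 1.0
for A in (1, 5, 10):
    print(A, round(capacity_mbps(A), 3))
```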
8) A repetition code uses 2 replications (i.e. each transmitted bit is replicated twice). What is the probability
that 4 bits transmitted in this way will be free from errors after decoding? Assume BER=0.01.
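No worked answer is given above. One common reading (assumed here) is that each bit is sent three times in total (the original plus two replicas) and decoded by majority vote; under that assumption the calculation is:

```python
p = 0.01  # BER

# A decoded bit is correct when at most 1 of its 3 copies is corrupted
# (majority vote over 3 copies).
p_bit_ok = (1 - p)**3 + 3 * p * (1 - p)**2

# All 4 independently decoded bits must be correct.
p_word_ok = p_bit_ok ** 4

print(round(p_bit_ok, 6))   # 0.999702
print(round(p_word_ok, 6))  # 0.998809
```

If "replicated twice" instead means only two copies are sent in total, majority voting is not possible and the answer differs, so this interpretation should be checked against the course notes.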