
Enroll. No. _________

MARWADI UNIVERSITY
Faculty of Engineering
[INFORMATION & COMMUNICATION TECHNOLOGY] [B.Tech]
SEM: 7th    MU FINAL EXAM: DECEMBER 2022
________________________________________________________________________________
Subject: Information Theory and Coding (01CT0702)    Date: 20/12/2022    Total Marks: 100
Time: 10:30 am to 1:30 pm

Instructions:
1. All Questions are Compulsory.
2. Make suitable assumptions wherever necessary.
3. Figures to the right indicate full marks.

Question: 1.
(a) Choose the correct or the best alternative in the following [10]
1. For fast communication, which of the following requirements must be met?
A. Large Bandwidth
B. High SNR
C. High Channel Capacity
D. None of the above
2. In a Huffman coding tree, the data symbols always occur at the
A. Roots
B. Leaves
C. Left sub tree
D. Right sub tree
3. Calculate the self-information if it is given that P(x) = 1/4.
A. 16 𝑏𝑖𝑡𝑠
B. 8 𝑏𝑖𝑡𝑠
C. 4 𝑏𝑖𝑡𝑠
D. 2 𝑏𝑖𝑡𝑠
4. The channel capacity of a BSC with error probability P = 0.7 is
A. 𝐶 = 1.88[𝑏𝑖𝑡𝑠/𝑠𝑦𝑚𝑏𝑜𝑙]
B. 𝐶 = 2.89[𝑏𝑖𝑡𝑠/𝑠𝑦𝑚𝑏𝑜𝑙]
C. 𝐶 = 1.44[𝑏𝑖𝑡𝑠/𝑠𝑦𝑚𝑏𝑜𝑙]
D. 𝐶 = 2.88[𝑏𝑖𝑡𝑠/𝑠𝑦𝑚𝑏𝑜𝑙]
5. For a noiseless channel:
A. H(x) = H(y)
B. H(y/x) = 0
C. Both A and B
D. None of the above
6. A source delivers symbols m₁, m₂ with probabilities 1/2, 1/2. The entropy of the source is
A. 2
B. 4
C. 1
D. 0
7. A communication channel with AWGN has a bandwidth of 2000 Hz and an SNR of 15. Its
channel capacity is
A. 16 kbps
B. 8 kbps
C. 1.6 kbps
D. 160 kbps

8. For a (10,4) block code where d_min = 3, how many errors can be corrected by this code?
A. 1
B. 2
C. 3
D. 0
9. The Hamming distance between equal codewords is
A. 1
B. 0
C. 2
D. 3
10. What is the correct run-length encoding for the data AABBBCCDD?
A. 2A3B2C2D
B. 3B2C2D
C. A2B2C2D
D. None of the above
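The numerical MCQs above can be checked with a short Python sketch (not part of the original paper; the formulas used are the standard ones: self-information I(x) = −log₂P(x), BSC capacity C = 1 − H(p), and the Shannon–Hartley capacity C = B·log₂(1 + SNR)):

```python
import math

# Q3: self-information for P(x) = 1/4
assert -math.log2(1 / 4) == 2.0  # 2 bits (option D)

# Q4: BSC capacity C = 1 - H(p) for error probability p = 0.7
def h2(p):
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

c_bsc = 1 - h2(0.7)  # about 0.119 bit/symbol; none of the printed
                     # options match, which suggests a misprint in the paper

# Q7: Shannon-Hartley capacity, B = 2000 Hz, SNR = 15
assert 2000 * math.log2(1 + 15) == 8000  # 8 kbps (option B)

# Q8: a code with d_min = 3 corrects t = floor((d_min - 1) / 2) errors
assert (3 - 1) // 2 == 1  # 1 error (option A)

# Q10: run-length encoding of AABBBCCDD
def rle(s):
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append(f"{j - i}{s[i]}")
        i = j
    return "".join(out)

assert rle("AABBBCCDD") == "2A3B2C2D"  # option A
```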
(b) Answer the following: [10]
1. Write an equation to relate mutual information and conditional entropy.
2. A code scheme has a Hamming distance d = 7. What is the error detection and
correction capability of this scheme?
3. Design an instantaneous binary code with lengths of 2, 2, 2, 3
4. Define channel capacity of BEC.
5. What is source coding?
6. Define prefix code.
7. Define code rate for channel codes
8. What is Hamming distance between (10100) and (11110)?
9. An event has six possible outcomes with entropy H(p) = 17/8 bits/message.
Find the rate of information if there are 8 outcomes per second.
10. What is MPEG compression?
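Several of the short answers above are one-line computations; a sketch (not from the paper) verifying them, using the standard capability formulas and the Kraft inequality:

```python
# Q8: Hamming distance between 10100 and 11110
a, b = "10100", "11110"
d = sum(x != y for x, y in zip(a, b))
assert d == 2

# Q2: capability of a code with minimum distance d_min = 7:
# detects up to d_min - 1 errors, corrects up to floor((d_min - 1)/2)
d_min = 7
assert d_min - 1 == 6          # detects 6 errors
assert (d_min - 1) // 2 == 3   # corrects 3 errors

# Q3: Kraft inequality for binary lengths 2, 2, 2, 3
lengths = [2, 2, 2, 3]
K = sum(2 ** -l for l in lengths)
assert K == 7 / 8 and K <= 1   # an instantaneous code exists, e.g. 00, 01, 10, 110
```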
Question: 2.
(a) Design a Huffman code for the following probability distribution of symbols. [8]

(b) Construct a systematic (7,4) cyclic code using the generator polynomial [8]
g(x) = x³ + x + 1.
1. What are the error correcting capabilities of this code?
2. Construct the decoding table
3. If the received word is 1101101, determine the transmitted data word.
OR
(b) State and prove Kraft-McMillan inequality theorem [8]
Question: 3.
(a) Consider a linear block code (6,2), generated by the matrix [8]


G = [ 1 0 1 1 1 0
      0 1 1 0 1 1 ]
1.Construct the code table for this code and determine the minimum distance between
code words
2.Prepare a suitable decoding table
(b) An ideal power-limited communication channel with additive white Gaussian noise, [4]
a bandwidth of 2 MHz and a signal-to-noise ratio of 63, is transmitting information at the
theoretical maximum rate. If the signal-to-noise ratio is reduced to 7, how much
bandwidth is required to maintain the same rate?
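The bandwidth question reduces to two applications of the Shannon–Hartley formula C = B·log₂(1 + SNR); a quick check (not part of the paper):

```python
import math

# original channel: B = 2 MHz, SNR = 63
B1, snr1 = 2e6, 63
C = B1 * math.log2(1 + snr1)   # 2e6 * log2(64) = 12 Mbps

# with SNR reduced to 7, keep the same rate: C = B2 * log2(1 + 7)
B2 = C / math.log2(1 + 7)      # 12e6 / log2(8) = 4 MHz
assert B2 == 4e6
```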
(c) Write the advantages and disadvantages of Data compression [4]
OR
(a) For the following codeword lengths: [8]
Code Lengths
A 2,4,3,3
B 2,1,3,3
Can an instantaneous ternary code be formed? If so, give an example of such a code.
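The existence test is the ternary Kraft inequality, Σ 3^(−lᵢ) ≤ 1; a sketch (not part of the paper) evaluating it exactly for both length sets:

```python
from fractions import Fraction

# Kraft inequality over a ternary alphabet: an instantaneous code with the
# given lengths exists iff sum(3^-l) <= 1
def kraft_ternary(lengths):
    return sum(Fraction(1, 3 ** l) for l in lengths)

KA = kraft_ternary([2, 4, 3, 3])   # code A
KB = kraft_ternary([2, 1, 3, 3])   # code B
assert KA == Fraction(16, 81) and KA <= 1
assert KB == Fraction(14, 27) and KB <= 1
```

Both sets satisfy the inequality, so instantaneous ternary codes exist; for the lengths of B, one example (using only digits 0 and 1 of the ternary alphabet) is {0, 10, 110, 111}.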
(b) In a communication system, a given rate of information transmission requires a [4]
channel bandwidth of 5 MHz and an SNR of 3. If the channel bandwidth is doubled for the
same rate of information, the new SNR will be
(c) Consider a source S = {s₁, s₂, s₃, s₄, s₅, s₆, s₇} with probabilities [4]
p = {1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/64}.
Find the entropy of the source.
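Since all the probabilities are negative powers of two, the entropy is a sum of exact terms pᵢ·log₂(1/pᵢ); a verification sketch (not part of the paper):

```python
import math

p = [1/2, 1/4, 1/8, 1/16, 1/32, 1/64, 1/64]
H = sum(-pi * math.log2(pi) for pi in p)   # each term pi * log2(1/pi) is exact here
assert abs(H - 63/32) < 1e-12              # H = 1.96875 bits/symbol
```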
Question: 4.
(a) Compare BCH and RS codes [8]
(b) Perform arithmetic encoding for the transmission of the message “BACA”, a string of [8]
characters with probabilities P(A) = 0.5, P(B) = 0.25, P(C) = 0.25.
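A sketch of the interval-narrowing step of arithmetic encoding (not part of the paper; it assumes the symbols are assigned cumulative sub-intervals in the order A, B, C — any fixed order agreed with the decoder works):

```python
# assumed cumulative intervals: A -> [0, 0.5), B -> [0.5, 0.75), C -> [0.75, 1.0)
cum = {"A": (0.0, 0.5), "B": (0.5, 0.75), "C": (0.75, 1.0)}

def arith_encode(msg):
    # narrow [low, high) once per symbol; any number in the final
    # interval identifies the message
    low, high = 0.0, 1.0
    for ch in msg:
        lo, hi = cum[ch]
        rng = high - low
        low, high = low + rng * lo, low + rng * hi
    return low, high

low, high = arith_encode("BACA")
assert (low, high) == (0.59375, 0.609375)  # e.g. 0.6 encodes "BACA"
```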
OR
(a) Perform arithmetic decoding of the message 0.572, comprising a string of characters [8]
with probabilities P(C) = 0.4, P(E) = 0.5, P(!) = 0.1.
(b) Design parity check code C(5,4) [8]
Question: 5.
(a) What is mutual information? Mention its properties and prove that 𝐼(𝑋; 𝑌) = 𝐻(𝑋) − [6]
𝐻(𝑋/𝑌) and 𝐼(𝑋; 𝑌) = 𝐻(𝑌) − 𝐻(𝑌/𝑋)
(b) A discrete source transmits messages x₁, x₂, and x₃ with probabilities 0.3, 0.4 and [6]
0.3. The source is connected to the channel given in the figure. Calculate all the entropies.

(c) Compare BCH and RS codes [4]


(a) Calculate the average codeword length and code efficiency of the given Code A and Code B [6]

(b) A source generates information with probabilities 𝑝 = {0.1,0.2,0.3,0.4}. Find the [6]
source entropy of the system. What percentage of maximum possible is being
generated by this source?
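The entropy and efficiency question is a direct computation: H = −Σ pᵢ·log₂ pᵢ, compared against the maximum H_max = log₂ M for M equiprobable symbols. A check (not part of the paper):

```python
import math

p = [0.1, 0.2, 0.3, 0.4]
H = -sum(pi * math.log2(pi) for pi in p)   # source entropy
H_max = math.log2(len(p))                  # 2 bits for 4 symbols
eff = H / H_max * 100                      # percentage of maximum entropy

assert abs(H - 1.8464) < 1e-3              # about 1.846 bits/symbol
assert abs(eff - 92.32) < 0.05             # about 92.3 % of maximum
```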
(c) Design parity check code C(3,1) [4]

Question: 6.

(a) Discuss the binary symmetric channel (BSC) and also derive channel capacity [8]
equation for BSC.

(b) Explain convolutional codes with suitable example [4]

(c) Derive I(X;Y) = H(X) – H(X/Y) and I(X;Y) = H(Y) – H(Y/X) [4]

OR

(a) Explain Viterbi decoding algorithm with suitable example [8]

(b) Encode given data sequence by LZW coding [4]


000101110010100101
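A sketch of the classic LZW encoder applied to this sequence (not part of the paper; it assumes an initial dictionary containing only the single symbols 0 and 1, with codes 0 and 1):

```python
def lzw_encode(data, alphabet=("0", "1")):
    # dictionary seeded with the single-symbol strings
    dictionary = {s: i for i, s in enumerate(alphabet)}
    w, out = "", []
    for c in data:
        if w + c in dictionary:
            w += c                              # extend the current match
        else:
            out.append(dictionary[w])           # emit code for longest match
            dictionary[w + c] = len(dictionary) # add new phrase
            w = c
    if w:
        out.append(dictionary[w])
    return out

codes = lzw_encode("000101110010100101")
assert codes == [0, 2, 1, 0, 1, 6, 3, 5, 8, 1]
```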
(c) Derive H(X,Y) = H(X) + H(Y/X) and H(X,Y) = H(Y) + H(X/Y) [4]

---Best of Luck---


– Bloom’s Taxonomy Report –

Sub: Information Theory and Coding
Sem.: 7th
Branch: Department of Information and Communication Technology

Question paper weightage as per Bloom’s Taxonomy

LEVEL                             % of weightage   Question No.                     Marks
Remember/Knowledge                13%              Q.1 (a), (b)                     20
Understand                        27%              Q.2 (b), Q.3 (b), (c), Q.4 (a)   24
Apply                             22%              Q.2 (a), Q.4 (b)                 16
Analyze                           23%              Q.3 (a)                          8
Evaluate                          10%              Q.5 (a), Q.5 (b)                 16
Higher Order Thinking/Creative    5%               Q.6 (a), (b)                     16
Chart/Graph of Bloom’s Taxonomy
[Pie chart of the marks distribution by level; data as in the table above.]

