
Information Theory & Coding Formula Sheet for GATE (Communication Systems)

Information Theory Basics


• Information Content: For an event $x_i$ with probability $p_i$:
  $I(x_i) = \log_2\left(\frac{1}{p_i}\right) = -\log_2 p_i$ (in bits)

• Entropy: Average information content of a source with $n$ symbols (evaluated in the Python sketch after this list):
  $H(X) = -\sum_{i=1}^{n} p(x_i)\log_2 p(x_i)$ (bits/symbol)

• Joint Entropy: For two random variables $X$ and $Y$:
  $H(X,Y) = -\sum_{x,y} p(x,y)\log_2 p(x,y)$

• Conditional Entropy:
  $H(Y|X) = -\sum_{x,y} p(x,y)\log_2 p(y|x)$

• Mutual Information:
  $I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)$
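
A minimal Python sketch (not part of the original sheet) that evaluates these definitions numerically; the joint distribution values below are assumed example numbers, used only for illustration:

import numpy as np

def entropy(p):
    """H = -sum p log2 p, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Assumed example joint distribution of X (rows) and Y (columns).
p_xy = np.array([[0.25, 0.25],
                 [0.40, 0.10]])

p_x = p_xy.sum(axis=1)          # marginal of X
p_y = p_xy.sum(axis=0)          # marginal of Y

H_X  = entropy(p_x)
H_Y  = entropy(p_y)
H_XY = entropy(p_xy)            # joint entropy H(X, Y)

# Chain rule and mutual information, consistent with the formulas above:
# H(Y|X) = H(X,Y) - H(X) and I(X;Y) = H(X) - H(X|Y) = H(X) + H(Y) - H(X,Y).
H_Y_given_X = H_XY - H_X
I_XY = H_X + H_Y - H_XY

print(f"H(X) = {H_X:.4f} bits, H(Y) = {H_Y:.4f} bits")
print(f"H(X,Y) = {H_XY:.4f} bits, H(Y|X) = {H_Y_given_X:.4f} bits")
print(f"I(X;Y) = {I_XY:.4f} bits")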

Channel Capacity

• Discrete Memoryless Channel Capacity:
  $C = \max_{p(x)} I(X;Y)$ (bits/channel use)

• Binary Symmetric Channel (BSC) Capacity (see the capacity sketch after this list):
  $C = 1 - H(p)$, where $H(p) = -p\log_2 p - (1-p)\log_2(1-p)$
  and $p$ is the crossover probability.


• Gaussian Channel Capacity (Shannon-Hartley Theorem):
  $C = B\log_2\left(1 + \frac{S}{N}\right)$ (bits/second)
  where $B$ is the bandwidth (Hz), $S$ is the signal power, $N = N_0 B$ is the noise power, and $N_0$ is the noise PSD (W/Hz).
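
A small Python sketch of both capacity formulas; the crossover probability, bandwidth, signal power, and noise PSD are assumed example values, not taken from the sheet:

import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# BSC: C = 1 - H(p) bits per channel use.
p = 0.11                                   # assumed crossover probability
C_bsc = 1 - binary_entropy(p)
print(f"BSC capacity: {C_bsc:.4f} bits/channel use")

# AWGN channel: C = B log2(1 + S/N) with N = N0 * B.
B  = 1e6                                   # bandwidth in Hz (assumed)
S  = 1e-3                                  # signal power in W (assumed)
N0 = 1e-9                                  # noise PSD in W/Hz (assumed)
N  = N0 * B                                # noise power
C_awgn = B * math.log2(1 + S / N)
print(f"AWGN capacity: {C_awgn / 1e6:.3f} Mbit/s")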

Source Coding

• Shannon's Source Coding Theorem: The average codeword length $L$ satisfies
  $H(X) \le L < H(X) + 1$

• Huffman Coding: Assigns shorter codewords to more probable symbols (see the sketch after this list).
  $L = \sum_{i=1}^{n} p(x_i)\, l_i$
  where $l_i$ is the length of the codeword for symbol $x_i$.


• Code Efficiency:
  $\eta = \dfrac{H(X)}{L}$
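
A rough Python sketch of Huffman coding for a small source, then the average length $L$ and efficiency $\eta = H(X)/L$ from the formulas above; the symbol probabilities are assumed example values:

import heapq
import math

probs = {"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1}   # assumed source probabilities

# Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
count = len(heap)
while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)
    p2, _, c2 = heapq.heappop(heap)
    # Merge the two least probable subtrees, prepending one bit to each side.
    merged = {s: "0" + code for s, code in c1.items()}
    merged.update({s: "1" + code for s, code in c2.items()})
    heapq.heappush(heap, (p1 + p2, count, merged))
    count += 1

codes = heap[0][2]
L = sum(probs[s] * len(code) for s, code in codes.items())
H = -sum(p * math.log2(p) for p in probs.values())
print("codes:", codes)
print(f"H(X) = {H:.4f} bits/symbol, L = {L:.2f} bits/symbol, eta = {H / L:.4f}")

For this assumed source the code lengths are 1, 2, 3, 3 bits, so $L = 1.9$ bits/symbol against $H(X) \approx 1.85$ bits/symbol, consistent with $H(X) \le L < H(X) + 1$.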

Error-Correcting Codes

• Hamming Distance: The number of bit positions in which two codewords differ. The minimum distance of a code is (see the sketch after this list)
  $d_{min} = \min\{\, d(c_i, c_j) \mid c_i \ne c_j \,\}$

• Error Detection and Correction:
  Detects up to $d_{min} - 1$ errors
  Corrects up to $\left\lfloor \dfrac{d_{min} - 1}{2} \right\rfloor$ errors

• Block Code Parameters: For an $(n, k)$ block code,
  Code rate $= \dfrac{k}{n}$
  where $n$ is the codeword length and $k$ is the message length.
• Shannon's Channel Coding Theorem: Reliable communication is possible if
  $R < C$
  where $R$ is the information rate (bits/s).
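
A short Python sketch of the distance and rate quantities above, using the (3, 1) repetition code as an assumed example codebook:

from itertools import combinations

def hamming_distance(a, b):
    """Number of bit positions in which two equal-length codewords differ."""
    return sum(x != y for x, y in zip(a, b))

codebook = ["000", "111"]                     # assumed example (n=3, k=1) code

d_min = min(hamming_distance(c1, c2)
            for c1, c2 in combinations(codebook, 2))

detect  = d_min - 1           # errors guaranteed detectable
correct = (d_min - 1) // 2    # errors guaranteed correctable
rate    = 1 / 3               # k/n for this code

print(f"d_min = {d_min}: detects up to {detect}, corrects up to {correct} errors")
print(f"code rate k/n = {rate:.3f}")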

Noise and Error Probability

• Bit Error Rate (BER) for BSC:
  $P_e = p$
  where $p$ is the crossover probability.
• BER for Gaussian Channel (BPSK):
  $P_e = Q\!\left(\sqrt{\dfrac{2E_b}{N_0}}\right)$
  where $E_b$ is the energy per bit, $N_0$ is the noise PSD, and $Q(x) = \dfrac{1}{\sqrt{2\pi}} \displaystyle\int_x^{\infty} e^{-t^2/2}\, dt$.
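
A minimal Python sketch of this BER formula, computing $Q(x)$ from the complementary error function as $Q(x) = \tfrac{1}{2}\,\mathrm{erfc}(x/\sqrt{2})$; the $E_b/N_0$ value is an assumed example:

import math

def Q(x):
    """Gaussian tail probability via the complementary error function."""
    return 0.5 * math.erfc(x / math.sqrt(2))

EbN0_dB = 9.6                                  # assumed Eb/N0 in dB
EbN0 = 10 ** (EbN0_dB / 10)                    # convert to linear scale
Pe = Q(math.sqrt(2 * EbN0))                    # Pe = Q(sqrt(2 Eb/N0)) for BPSK
print(f"Eb/N0 = {EbN0_dB} dB -> Pe = {Pe:.2e}")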

Key Notes

• Entropy $H(X)$ measures source uncertainty; it is maximized by a uniform distribution.
• Channel capacity is the maximum rate at which reliable communication is possible.
• Huffman coding produces an optimal prefix-free (instantaneous) code.
• Use Q-function tables for BER calculations in Gaussian channels.
• Units: bandwidth in Hz, power in W, information in bits.
