Information Theory & Coding
• Conditional Entropy:
H(Y|X) = −∑_{x,y} p(x, y) log2 p(y|x)
• Mutual Information:
I(X; Y) = ∑_{x,y} p(x, y) log2 [p(x, y) / (p(x) p(y))] = H(Y) − H(Y|X)
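As a quick numerical check of the definitions above, here is a minimal Python sketch; it assumes the joint pmf is supplied as a 2-D NumPy array pxy[x, y], and the function names are illustrative, not from any particular library:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero entries skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def conditional_entropy(pxy):
    """H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), with pxy[x, y] = p(x, y)."""
    px = pxy.sum(axis=1)                       # marginal p(x)
    return sum(px[x] * entropy(pxy[x] / px[x])
               for x in range(pxy.shape[0]) if px[x] > 0)

def mutual_information(pxy):
    """I(X; Y) = H(Y) - H(Y|X), in bits."""
    return entropy(pxy.sum(axis=0)) - conditional_entropy(pxy)

# Example: uniform input over a binary symmetric channel with crossover 0.1
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])
print(conditional_entropy(pxy))   # H(Y|X) ≈ 0.469 bits
print(mutual_information(pxy))    # I(X; Y) ≈ 0.531 bits
```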
Channel Capacity
• Discrete Memoryless Channel Capacity:
C = max_{p(x)} I(X; Y)  (bits per channel use)
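For the special case of a binary symmetric channel, the maximization has the closed form C = 1 − H2(ε), achieved by a uniform input distribution. A small sketch (the function name h2 is illustrative):

```python
import numpy as np

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# Capacity of a binary symmetric channel with crossover probability eps
for eps in (0.0, 0.1, 0.5):
    print(f"eps = {eps}: C = {1 - h2(eps):.3f} bits per channel use")
```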
Source Coding
• Shannon’s Source Coding Theorem: the average codeword length L of an optimal prefix-free code satisfies:
H(X) ≤ L < H(X) + 1
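Symbol-by-symbol, this bound is achieved by Huffman coding. Below is a minimal Huffman sketch, assuming the source pmf is given as a dict of symbol probabilities; ties are broken arbitrarily and a single-symbol source is not handled:

```python
import heapq
import math

def huffman_code(probs):
    """Binary Huffman code for {symbol: probability}; returns {symbol: bitstring}."""
    # Heap entries: [total probability, tie-breaker, symbols in this subtree]
    heap = [[p, i, [sym]] for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    codes = {sym: "" for sym in probs}
    tie = len(heap)
    while len(heap) > 1:
        p0, _, syms0 = heapq.heappop(heap)      # two least probable subtrees
        p1, _, syms1 = heapq.heappop(heap)
        for s in syms0:
            codes[s] = "0" + codes[s]           # prepend the branch bit
        for s in syms1:
            codes[s] = "1" + codes[s]
        heapq.heappush(heap, [p0 + p1, tie, syms0 + syms1])
        tie += 1
    return codes

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
codes = huffman_code(probs)
L = sum(probs[s] * len(codes[s]) for s in probs)
H = -sum(p * math.log2(p) for p in probs.values())
print(codes)   # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(H, L)    # here H(X) = L = 1.75 bits, so H(X) <= L < H(X) + 1 holds
```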
Error-Correcting Codes
• Hamming Distance: d(ci, cj) is the number of positions in which codewords ci and cj differ; the minimum distance of a code is
dmin = min{d(ci, cj) | ci ≠ cj}
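A brute-force computation of dmin for a small code, as a sketch (enumerating all codeword pairs is only feasible for short codes):

```python
from itertools import combinations

def hamming_distance(a, b):
    """Number of positions in which two equal-length codewords differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_distance(code):
    """d_min over all pairs of distinct codewords."""
    return min(hamming_distance(c1, c2) for c1, c2 in combinations(code, 2))

# Example: 3-bit repetition code; d_min = 3 corrects t = (d_min - 1) // 2 = 1 error
print(minimum_distance(["000", "111"]))   # 3
```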
Key Notes
• Entropy H(X) measures source uncertainty; it is maximized by a uniform distribution.
• Channel capacity is the maximum rate for reliable communication.
• Huffman coding is optimal among prefix-free codes.
• Use Q-function tables (or the erfc-based sketch after this list) for BER calculations in Gaussian channels.
• Units: bandwidth in Hz, power in W, information in bits.
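The Q-function need not come from a table; here is a small sketch that evaluates it via SciPy’s erfc, assuming BPSK over an AWGN channel as the example modulation, for which Pb = Q(√(2 Eb/N0)):

```python
import numpy as np
from scipy.special import erfc

def q_func(x):
    """Gaussian Q-function: tail probability of a standard normal variable."""
    return 0.5 * erfc(x / np.sqrt(2.0))

# BER of BPSK over AWGN: Pb = Q(sqrt(2 * Eb/N0))
for ebn0_db in (0, 4, 8):
    ebn0 = 10 ** (ebn0_db / 10)
    print(f"Eb/N0 = {ebn0_db} dB: BER = {q_func(np.sqrt(2 * ebn0)):.2e}")
```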