Channel Coding
Channel coding is done to ensure that the transmitted signal is recovered at the destination with a very low probability of error.
[Figure: A discrete memoryless channel. The source X emits symbols x_0, x_1, ..., x_{J-1}; the destination Y receives symbols y_0, y_1, ..., y_{K-1}. Each input-output pair is connected by a transition probability p(y_k | x_j).]

The channel is characterized by the transition probabilities p(y_k | x_j), such that
\[
\sum_{k=0}^{K-1} p(y_k \mid x_j) = 1, \qquad \forall j.
\]
\begin{align*}
p(x_j, y_k) &= P(X = x_j,\; Y = y_k) \\
            &= P(Y = y_k \mid X = x_j)\, P(X = x_j) \\
            &= p(y_k \mid x_j)\, p(x_j)
\end{align*}
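For illustration (not part of the original notes), the following minimal Python sketch represents a discrete memoryless channel by its transition matrix, checks the normalization condition, and forms the joint distribution p(x_j, y_k) = p(y_k | x_j) p(x_j). The prior and transition values are made-up examples.

```python
import numpy as np

# Assumed example: a 2-input, 3-output discrete memoryless channel.
p_x = np.array([0.6, 0.4])                 # prior p(x_j); sums to 1
p_y_given_x = np.array([[0.8, 0.1, 0.1],   # p(y_k | x_0)
                        [0.2, 0.3, 0.5]])  # p(y_k | x_1)

# Normalization: each row of the transition matrix must sum to 1.
assert np.allclose(p_y_given_x.sum(axis=1), 1.0)

# Joint distribution: p(x_j, y_k) = p(y_k | x_j) * p(x_j).
p_xy = p_y_given_x * p_x[:, None]
print(p_xy)          # rows indexed by j, columns by k
print(p_xy.sum())    # 1.0, as required of a joint distribution
```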
[Figure: Binary symmetric channel. Input x_0 = 0 is received as y_0 = 0 and input x_1 = 1 as y_1 = 1, each with probability 1 − p; each input is flipped to the other output with crossover probability p.]
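As a concrete instance, here is a tiny sketch (illustrative only; the function name bsc_matrix is ours) of the binary symmetric channel's transition matrix for a given crossover probability p:

```python
import numpy as np

def bsc_matrix(p):
    """Transition matrix p(y_k | x_j) of a binary symmetric channel
    with crossover probability p (rows: inputs, columns: outputs)."""
    return np.array([[1 - p, p],
                     [p, 1 - p]])

print(bsc_matrix(0.1))   # each row sums to 1, as required
```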
• Mutual Information
The conditional entropy of the channel input given the channel output is
\begin{align}
H(\mathcal{X} \mid \mathcal{Y}) &= \sum_{k=0}^{K-1} H(\mathcal{X} \mid Y = y_k)\, p(y_k) \tag{1}\\
&= \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j \mid y_k)\, p(y_k)\, \log_2\!\left[\frac{1}{p(x_j \mid y_k)}\right] \tag{2}\\
&= \sum_{k=0}^{K-1} \sum_{j=0}^{J-1} p(x_j, y_k)\, \log_2\!\left[\frac{1}{p(x_j \mid y_k)}\right] \tag{3}
\end{align}
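A small Python sketch (with assumed example values) that computes H(X|Y) directly from Eq. 3:

```python
import numpy as np

# Assumed example channel.
p_x = np.array([0.6, 0.4])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])

p_xy = p_y_given_x * p_x[:, None]   # joint p(x_j, y_k)
p_y = p_xy.sum(axis=0)              # marginal p(y_k)
p_x_given_y = p_xy / p_y            # posterior p(x_j | y_k)

# Eq. 3: H(X|Y) = sum_{j,k} p(x_j, y_k) * log2(1 / p(x_j | y_k))
H_X_given_Y = np.sum(p_xy * np.log2(1.0 / p_x_given_y))
print(H_X_given_Y)                  # in bits
```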
The difference H(X) − H(X|Y) is termed the Mutual Information of the channel and is denoted by I(X;Y):
\[
I(\mathcal{X};\mathcal{Y}) = H(\mathcal{X}) - H(\mathcal{X} \mid \mathcal{Y}) \tag{4}
\]
Similarly,
\[
I(\mathcal{Y};\mathcal{X}) = H(\mathcal{Y}) - H(\mathcal{Y} \mid \mathcal{X}) \tag{5}
\]
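To make the definition concrete, here is a short sketch (example numbers assumed) evaluating I(X;Y) = H(X) − H(X|Y):

```python
import numpy as np

p_x = np.array([0.6, 0.4])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
p_xy = p_y_given_x * p_x[:, None]
p_y = p_xy.sum(axis=0)

H_X = np.sum(p_x * np.log2(1.0 / p_x))            # source entropy, Eq. 7
H_X_given_Y = np.sum(p_xy * np.log2(p_y / p_xy))  # Eq. 3, using p(x|y) = p(x,y)/p(y)
I_XY = H_X - H_X_given_Y                          # Eq. 4
print(I_XY)                                       # mutual information in bits
```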
– Property 1:
  The mutual information of a channel is symmetric, that is,
  \[
  I(\mathcal{X};\mathcal{Y}) = I(\mathcal{Y};\mathcal{X}) \tag{6}
  \]
  Proof:
\begin{align}
H(\mathcal{X}) &= \sum_{j=0}^{J-1} p(x_j)\, \log_2\!\left[\frac{1}{p(x_j)}\right] \tag{7}\\
&= \sum_{j=0}^{J-1} p(x_j)\, \log_2\!\left[\frac{1}{p(x_j)}\right] \sum_{k=0}^{K-1} p(y_k \mid x_j) \tag{8}\\
&= \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(y_k \mid x_j)\, p(x_j)\, \log_2\!\left[\frac{1}{p(x_j)}\right] \tag{9}\\
&= \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{1}{p(x_j)}\right] \tag{10}
\end{align}
Since I(X;Y) = H(X) − H(X|Y), subtracting Eq. 3 from Eq. 10 we obtain
\[
I(\mathcal{X};\mathcal{Y}) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{p(x_j \mid y_k)}{p(x_j)}\right] \tag{11}
\]
By Bayes' rule,
\[
\frac{p(x_j \mid y_k)}{p(x_j)} = \frac{p(y_k \mid x_j)}{p(y_k)} \tag{12}
\]
so that
\[
I(\mathcal{X};\mathcal{Y}) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{p(y_k \mid x_j)}{p(y_k)}\right] = I(\mathcal{Y};\mathcal{X}) \tag{13}
\]
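A quick numerical check of the symmetry property (an illustrative sketch; the channel values are assumed):

```python
import numpy as np

p_x = np.array([0.5, 0.3, 0.2])
p_y_given_x = np.array([[0.70, 0.20, 0.10],
                        [0.10, 0.80, 0.10],
                        [0.25, 0.25, 0.50]])
p_xy = p_y_given_x * p_x[:, None]
p_y = p_xy.sum(axis=0)

# Eq. 11: I(X;Y) using p(x_j | y_k) / p(x_j)
I_xy = np.sum(p_xy * np.log2((p_xy / p_y) / p_x[:, None]))
# Eq. 13: I(Y;X) using p(y_k | x_j) / p(y_k)
I_yx = np.sum(p_xy * np.log2(p_y_given_x / p_y))
print(I_xy, I_yx)   # the two values agree
```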
– Property 2:
  The mutual information is always non-negative, that is,
  \[
  I(\mathcal{X};\mathcal{Y}) \geq 0
  \]
Proof:
We know,
\[
p(x_j \mid y_k) = \frac{p(x_j, y_k)}{p(y_k)} \tag{14}
\]
Substituting Eq. 14 in Eq. 11, we get
\[
I(\mathcal{X};\mathcal{Y}) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{p(x_j, y_k)}{p(x_j)\, p(y_k)}\right] = I(\mathcal{Y};\mathcal{X}) \tag{15}
\]
Using the following fundamental inequality, which we derived while discussing the properties of entropy,
\[
\sum_{k=0}^{K-1} p_k\, \log_2\!\left(\frac{q_k}{p_k}\right) \leq 0,
\]
with p_k = p(x_j, y_k) and q_k = p(x_j) p(y_k), Eq. 15 gives
\[
I(\mathcal{X};\mathcal{Y}) \geq 0
\]
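As a sanity check (purely illustrative; random priors and channels drawn with NumPy), the sketch below verifies that the mutual information of Eq. 15 never comes out negative:

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(p_x, p_y_given_x):
    """I(X;Y) in bits, per Eq. 15 (assumes strictly positive probabilities)."""
    p_xy = p_y_given_x * p_x[:, None]
    p_y = p_xy.sum(axis=0)
    return np.sum(p_xy * np.log2(p_xy / (p_x[:, None] * p_y)))

for _ in range(1000):
    p_x = rng.dirichlet(np.ones(4))                  # random prior over 4 inputs
    p_y_given_x = rng.dirichlet(np.ones(3), size=4)  # random 4x3 transition matrix
    assert mutual_information(p_x, p_y_given_x) >= -1e-12
print("I(X;Y) >= 0 held in every trial")
```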
– Property 3:
  \[
  I(\mathcal{X};\mathcal{Y}) = H(\mathcal{X}) + H(\mathcal{Y}) - H(\mathcal{X},\mathcal{Y})
  \]
  where the joint entropy H(X,Y) is defined as
\[
H(\mathcal{X},\mathcal{Y}) = \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{1}{p(x_j, y_k)}\right]
\]
Proof:
\begin{align}
H(\mathcal{X},\mathcal{Y}) &= \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{p(x_j)\, p(y_k)}{p(x_j, y_k)}\right] \tag{16}\\
&\quad + \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{1}{p(x_j)\, p(y_k)}\right] \tag{17}\\
&= -I(\mathcal{X};\mathcal{Y}) + \sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{1}{p(x_j)\, p(y_k)}\right] \tag{18}
\end{align}
where the step to Eq. 18 uses the fact that the first sum is, by Eq. 15, equal to −I(X;Y).
But,
\begin{align}
&\sum_{j=0}^{J-1} \sum_{k=0}^{K-1} p(x_j, y_k)\, \log_2\!\left[\frac{1}{p(x_j)\, p(y_k)}\right] \tag{19}\\
&= \sum_{j=0}^{J-1} \log_2\!\left[\frac{1}{p(x_j)}\right] \sum_{k=0}^{K-1} p(x_j, y_k) \; + \tag{20}\\
&\qquad \sum_{k=0}^{K-1} \log_2\!\left[\frac{1}{p(y_k)}\right] \sum_{j=0}^{J-1} p(x_j, y_k) \tag{21}\\
&= \sum_{j=0}^{J-1} p(x_j)\, \log_2\!\left[\frac{1}{p(x_j)}\right] + \sum_{k=0}^{K-1} p(y_k)\, \log_2\!\left[\frac{1}{p(y_k)}\right] \tag{22}\\
&= H(\mathcal{X}) + H(\mathcal{Y}) \tag{23}
\end{align}
Substituting Eq. 23 in Eq. 18 gives H(X,Y) = −I(X;Y) + H(X) + H(Y), that is,
\[
I(\mathcal{X};\mathcal{Y}) = H(\mathcal{X}) + H(\mathcal{Y}) - H(\mathcal{X},\mathcal{Y})
\]
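A short numerical verification of Property 3 (sketch only; the channel values are assumed examples):

```python
import numpy as np

p_x = np.array([0.6, 0.4])
p_y_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])
p_xy = p_y_given_x * p_x[:, None]
p_y = p_xy.sum(axis=0)

H_X  = np.sum(p_x  * np.log2(1.0 / p_x))
H_Y  = np.sum(p_y  * np.log2(1.0 / p_y))
H_XY = np.sum(p_xy * np.log2(1.0 / p_xy))                       # joint entropy
I_direct = np.sum(p_xy * np.log2(p_xy / (p_x[:, None] * p_y)))  # Eq. 15
print(I_direct, H_X + H_Y - H_XY)   # both expressions agree
```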
Problems
The following problems may be given as exercises.
1. Show that the mutual information is zero for a deterministic
channel.
2. Prove that I(X; Y) ≤ min(H(X), H(Y))
3. Prove that I(X; Y) ≤ min(log(|Y|), log(|X|))