An Introduction to Information Theory
Adrish Banerjee
Department of Electrical Engineering
Indian Institute of Technology Kanpur
Kanpur, Uttar Pradesh
India
Conditional Entropy
Problem # 1: Give examples of jointly distributed random variables X and Y such that
i) H(Y|X = x) < H(Y)
ii) H(Y|X = x) > H(Y)
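The slides do not include a worked solution for Problem 1, so here is a minimal numerical sketch using an assumed joint distribution (the standard textbook example, not taken from the slides): conditioning on X = 1 removes all uncertainty about Y, while conditioning on X = 2 leaves more uncertainty than H(Y).

```python
# Assumed example for Problem 1: X, Y take values in {1, 2} with joint pmf
# p(1,2) = 3/4, p(2,1) = p(2,2) = 1/8, p(1,1) = 0.
from math import log2

p = {(1, 1): 0.0, (1, 2): 0.75, (2, 1): 0.125, (2, 2): 0.125}

def H(dist):
    """Entropy in bits of an iterable of probabilities (zero terms contribute 0)."""
    return sum(q * log2(1 / q) for q in dist if q > 0)

p_y = [sum(p[(x, y)] for x in (1, 2)) for y in (1, 2)]  # marginal pmf of Y
print("H(Y)      =", round(H(p_y), 3))                  # ~0.544 bits

for x in (1, 2):
    p_x = sum(p[(x, y)] for y in (1, 2))
    p_y_given_x = [p[(x, y)] / p_x for y in (1, 2)]
    print(f"H(Y|X={x}) =", round(H(p_y_given_x), 3))
# H(Y|X=1) = 0 bit  < H(Y): observing X = 1 removes all uncertainty about Y
# H(Y|X=2) = 1 bit  > H(Y): observing X = 2 increases the uncertainty about Y
```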
Mutual Information
Problem # 2: Give examples of jointly distributed random variables X, Y and Z such that
i) I(X; Y|Z) < I(X; Y)
ii) I(X; Y|Z) > I(X; Y)
Solution:
ii) Let X and Y be independent fair binary random variables, and let Z = X + Y.
Then I(X; Y) = 0, but
I(X; Y|Z) = H(X|Z) - H(X|Y, Z) = H(X|Z) = P(Z = 1) H(X|Z = 1) = 1/2 bit.
(Given Y and Z, X = Z - Y is determined, so H(X|Y, Z) = 0; given Z = 0 or Z = 2, X is determined, while given Z = 1, X is 0 or 1 with equal probability.)
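As a quick check of part (ii), the sketch below enumerates the joint distribution of (X, Y, Z) with Z = X + Y and computes both mutual informations directly (the helper names are illustrative, not from the slides).

```python
# Verify part (ii) of Problem 2: X, Y independent fair bits, Z = X + Y,
# so I(X;Y) = 0 while I(X;Y|Z) = 1/2 bit.
from itertools import product
from math import log2

# joint pmf over (x, y, z) with z = x + y
p = {(x, y, x + y): 0.25 for x, y in product((0, 1), repeat=2)}

def marginal(indices):
    """Marginal pmf over the chosen coordinate positions of (x, y, z)."""
    m = {}
    for xyz, q in p.items():
        key = tuple(xyz[i] for i in indices)
        m[key] = m.get(key, 0.0) + q
    return m

def H(dist):
    """Entropy in bits of a pmf given as a dict."""
    return -sum(q * log2(q) for q in dist.values() if q > 0)

# I(X;Y) = H(X) + H(Y) - H(X,Y)
I_xy = H(marginal((0,))) + H(marginal((1,))) - H(marginal((0, 1)))
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(Z) - H(X,Y,Z)
I_xy_given_z = H(marginal((0, 2))) + H(marginal((1, 2))) - H(marginal((2,))) - H(p)

print("I(X;Y)   =", I_xy)          # 0.0
print("I(X;Y|Z) =", I_xy_given_z)  # 0.5
```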
Divergence
Problem # 3: Let PX(X = 0) = PX(X = 1) = 0.5, QX(X = 0) = 0.25, QX(X = 1) = 0.75, and RX(X = 0) = 0.2, RX(X = 1) = 0.8. Show that the triangle inequality does not hold for divergence, i.e.
D(PX||RX) > D(PX||QX) + D(QX||RX)
Solution:
D(PX||QX) = 0.5 log(0.5/0.25) + 0.5 log(0.5/0.75) = 0.208
D(QX||RX) = 0.25 log(0.25/0.2) + 0.75 log(0.75/0.8) = 0.011
D(PX||RX) = 0.5 log(0.5/0.2) + 0.5 log(0.5/0.8) = 0.322
Since 0.322 > 0.208 + 0.011 = 0.219, we have D(PX||RX) > D(PX||QX) + D(QX||RX), so divergence does not satisfy the triangle inequality.
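The arithmetic above can be checked with a few lines of Python (a sketch, using base-2 logarithms as in the slides):

```python
# Numerical check of Problem 3: compute the three binary divergences in bits
# and confirm that D(P||R) exceeds D(P||Q) + D(Q||R).
from math import log2

def D(p, q):
    """KL divergence in bits between two pmfs given as lists."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

P, Q, R = [0.5, 0.5], [0.25, 0.75], [0.2, 0.8]

print("D(P||Q) =", round(D(P, Q), 3))   # 0.208
print("D(Q||R) =", round(D(Q, R), 3))   # 0.011
print("D(P||R) =", round(D(P, R), 3))   # 0.322  > 0.208 + 0.011 = 0.219
```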
Mutual Information
Problem # 4: Consider a discrete memoryless channel with input X and output Y. The input X takes values from a ternary set with equal probability, and it is known that the probability of error for the system is p. Using Fano's lemma, find a lower bound on the mutual information I(X; Y) as a function of p.
Solution: The mutual information can be written as
I(X; Y) = H(X) - H(X|Y)
By Fano's inequality, we get
H(X|Y) ≤ H(Pe) + Pe log(3 - 1) = H(p) + p
Thus
I(X; Y) ≥ H(X) - H(p) - p = log 3 - H(p) - p
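As an illustration, the sketch below evaluates the resulting lower bound log 3 - H(p) - p for a few error probabilities (the choice of p values is arbitrary):

```python
# Evaluate the Problem 4 lower bound I(X;Y) >= log2(3) - H(p) - p, in bits.
from math import log2

def Hb(p):
    """Binary entropy H(p) in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5):
    bound = log2(3) - Hb(p) - p
    print(f"p = {p:.2f}: I(X;Y) >= {bound:.3f} bits")
# At p = 0 the bound is log2(3) ~ 1.585 bits (error-free ternary input);
# it decreases as the error probability grows.
```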
Concave Function
Problem # 5: Let (X, Y) ~ p(x, y) = p(x)p(y|x). Show that the mutual information I(X; Y) is a concave function of p(x) for fixed p(y|x).
Solution: To prove this, we expand the mutual information as
I(X; Y) = H(Y) - H(Y|X) = H(Y) - Σ_x p(x) H(Y|X = x)
For fixed p(y|x), the output distribution p(y) = Σ_x p(x) p(y|x) is linear in p(x), so H(Y) is a concave function of p(x); the term Σ_x p(x) H(Y|X = x) is linear in p(x) since each H(Y|X = x) is fixed, and a concave function minus a linear function is concave.
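A small numerical illustration of the concavity statement, using an assumed binary channel p(y|x) (not from the slides): the mutual information at a mixture of two input distributions is at least the corresponding mixture of their mutual informations.

```python
# Illustrate Problem 5: for a fixed channel W = p(y|x), I(X;Y) is concave
# in the input pmf p(x).  The channel and input pmfs below are assumed.
from math import log2

W = [[0.9, 0.1],    # p(y|x=0)
     [0.3, 0.7]]    # p(y|x=1)

def mutual_info(px):
    """I(X;Y) in bits for input pmf px over the fixed channel W."""
    py = [sum(px[x] * W[x][y] for x in range(2)) for y in range(2)]
    total = 0.0
    for x in range(2):
        for y in range(2):
            pxy = px[x] * W[x][y]
            if pxy > 0:
                total += pxy * log2(W[x][y] / py[y])
    return total

p1, p2, lam = [0.2, 0.8], [0.9, 0.1], 0.5
mix = [lam * a + (1 - lam) * b for a, b in zip(p1, p2)]
print("I at the mixed input  =", round(mutual_info(mix), 4))
print("mixture of I values   =", round(lam * mutual_info(p1) + (1 - lam) * mutual_info(p2), 4))
# The first number is >= the second, consistent with concavity in p(x).
```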