Module 2
H(X,Y) Joint entropy (combined randomness of X and Y)
P(X,Y) Joint probability (combined probability of occurrence of X and Y)
H(X|Y) Conditional entropy (the uncertainty in X when Y is known)
H(Y|X) Conditional entropy (the uncertainty in Y when X is known)
P(X|Y) Conditional probability (the probability of occurrence of X when Y is known)
P(Y|X) Conditional probability (the probability of occurrence of Y when X is known)
I(X;Y) Mutual information (amount of information shared between X and Y)
H(X;Y) Mutual entropy (total information contained in X and Y)
Joint entropy
The joint entropy of two random variables X and Y, denoted H(X,Y), measures the total uncertainty or information contained in the pair (X,Y), where p(x,y) is the joint probability. It generalizes the concept of entropy for a single random variable to the case where two variables are involved:
H(X,Y) = − Σ_{x∈X} Σ_{y∈Y} p(x,y) log p(x,y)
Properties:
1) The joint entropy of a set of random variables is a nonnegative number, i.e., H(X,Y) ≥ 0.
2) The joint entropy of a set of variables is greater than or equal to the maximum of the individual entropies of the variables in the set, i.e., H(X,Y) ≥ max [H(X), H(Y)].
3) The joint entropy of a set of variables is less than or equal to the sum of the individual entropies of the variables in the set, i.e., H(X,Y) ≤ H(X) + H(Y).
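A minimal Python sketch of the definition and a check of the three properties; the joint distribution values below are assumed purely for illustration:

```python
import math

# Illustrative joint distribution p(x, y); the values are assumed for this sketch.
p_xy = {('x1', 'y1'): 0.5, ('x1', 'y2'): 0.25,
        ('x2', 'y1'): 0.125, ('x2', 'y2'): 0.125}

def entropy(dist):
    """-sum p log2 p over the probabilities in dist (terms with p = 0 are skipped)."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginals p(x) and p(y), obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

H_xy, H_x, H_y = entropy(p_xy), entropy(p_x), entropy(p_y)
print(H_xy)                     # 1.75 bits
print(H_xy >= 0)                # property 1: True
print(H_xy >= max(H_x, H_y))    # property 2: True
print(H_xy <= H_x + H_y)        # property 3: True
```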
Marginal Probability
Example: let X ∈ {S, R} and Y ∈ {Y, N} have the joint probabilities p(S,Y) = 0.1, p(S,N) = 0.4, p(R,Y) = 0.4, p(R,N) = 0.1. The marginal probability of each variable is obtained by summing the joint probabilities over the other variable:
P(X=S) = 0.1 + 0.4 = 0.5
P(X=R) = 0.4 + 0.1 = 0.5
P(Y=Y) = 0.1 + 0.4 = 0.5
P(Y=N) = 0.4 + 0.1 = 0.5
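The same marginalization can be written as a short Python sketch (the joint table values are the ones from the example above):

```python
# Joint probability table from the example above: X ∈ {S, R}, Y ∈ {Y, N}.
p_xy = {('S', 'Y'): 0.1, ('S', 'N'): 0.4,
        ('R', 'Y'): 0.4, ('R', 'N'): 0.1}

# Marginalize: p(x) = sum_y p(x, y) and p(y) = sum_x p(x, y).
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

print(p_x)    # {'S': 0.5, 'R': 0.5}
print(p_y)    # {'Y': 0.5, 'N': 0.5}
```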
Conditional entropy
H(Y|X) = − Σ_{x∈X} Σ_{y∈Y} p(x,y) log p(y|x) = − Σ_{x∈X} Σ_{y∈Y} p(x,y) log [p(x,y) / p(x)]
where p(x,y) = p(x|y) · p(y) = p(y|x) · p(x)
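A small Python sketch of H(Y|X) for the same example table, evaluated with p(y|x) = p(x,y)/p(x); the check at the end uses the standard chain rule H(X,Y) = H(X) + H(Y|X):

```python
import math

# Example joint table (X ∈ {S, R}, Y ∈ {Y, N}) and its X-marginal from above.
p_xy = {('S', 'Y'): 0.1, ('S', 'N'): 0.4,
        ('R', 'Y'): 0.4, ('R', 'N'): 0.1}
p_x = {'S': 0.5, 'R': 0.5}

# H(Y|X) = -sum_{x,y} p(x,y) log2 p(y|x), with p(y|x) = p(x,y)/p(x).
H_y_given_x = -sum(p * math.log2(p / p_x[x])
                   for (x, y), p in p_xy.items() if p > 0)
print(round(H_y_given_x, 4))    # 0.7219 bits

# Chain-rule check: H(X,Y) = H(X) + H(Y|X).
H_xy = -sum(p * math.log2(p) for p in p_xy.values() if p > 0)
H_x = -sum(p * math.log2(p) for p in p_x.values() if p > 0)
print(round(H_xy - H_x, 4) == round(H_y_given_x, 4))    # True
```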
Properties of mutual information I(X;Y):
1) I(X;Y) = I(Y;X)
2) I(X;Y) ≥ 0
3) I(X;Y) = H(X) + H(Y) – H(X,Y)
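A short Python check of these properties, again on the example joint table (the code itself is an added sketch, not part of the notes):

```python
import math

def entropy(dist):
    """-sum p log2 p over the given probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Example joint distribution and its marginals (same table as above).
p_xy = {('S', 'Y'): 0.1, ('S', 'N'): 0.4,
        ('R', 'Y'): 0.4, ('R', 'N'): 0.1}
p_x = {'S': 0.5, 'R': 0.5}
p_y = {'Y': 0.5, 'N': 0.5}

# Property 3: I(X;Y) = H(X) + H(Y) - H(X,Y).
I_xy = entropy(p_x) + entropy(p_y) - entropy(p_xy)
print(round(I_xy, 4))    # 0.2781 bits
# Property 2: mutual information is never negative.
print(I_xy >= 0)         # True
# Property 1 (symmetry) is immediate from this formula, since it treats X and Y alike.
```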
P_ij = P(b_j | a_i)
P = [P(b_j | a_i)]  (the channel matrix, with one row per input symbol a_i and one column per output symbol b_j)
Noisy: Say the channel is noisy and introduces a bit inversion 1% of the time; then the channel matrix is given by
P = [ 0.99  0.01
      0.01  0.99 ]
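As an added sketch, the output probabilities follow from P(b_j) = Σ_i P(a_i) P(b_j | a_i); the input distribution below is an assumed value chosen only for illustration:

```python
# Channel matrix of the 1% bit-inversion channel: row i = input a_i, column j = output b_j.
P = [[0.99, 0.01],
     [0.01, 0.99]]

# Assumed input distribution P(a_1), P(a_2) -- chosen only for illustration.
p_a = [0.7, 0.3]

# P(b_j) = sum_i P(a_i) * P(b_j | a_i)
p_b = [sum(p_a[i] * P[i][j] for i in range(len(p_a))) for j in range(len(P[0]))]
print(p_b)    # ≈ [0.696, 0.304]
```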
Binary symmetric channel (BSC)
• In a BSC the inputs to the channel are the binary digits {0, 1}.
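A minimal Python sketch of a BSC with crossover probability p (the helpers bsc_matrix and binary_entropy are made up for this illustration); for the 1% noisy channel above, p = 0.01:

```python
import math

def bsc_matrix(p):
    """Channel matrix of a binary symmetric channel with crossover probability p."""
    return [[1 - p, p],
            [p, 1 - p]]

def binary_entropy(p):
    """H(p) = -p log2 p - (1 - p) log2 (1 - p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.01
print(bsc_matrix(p))    # [[0.99, 0.01], [0.01, 0.99]]

# With equally likely inputs, H(Y) = 1 bit and H(Y|X) = H(p),
# so the mutual information across the BSC is I(X;Y) = 1 - H(p).
print(round(1 - binary_entropy(p), 4))    # ≈ 0.9192 bits
```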