Uncertainty, Information, and Entropy
We can define the amount of information contained in each symbol $s_k$ as
\[
I(s_k) = \log\!\left(\frac{1}{p_k}\right)
\]
Here, we generally use $\log_2$, since in digital communications we will be talking about bits. The above expression also tells us that when there is more uncertainty about a symbol (i.e., a lower probability of it occurring), it conveys more information. Some properties of information are summarized here (a short numerical sketch follows the list):
1. For a certain event, i.e., $p_k = 1$, the information it conveys is zero: $I(s_k) = 0$.
2. For events with $0 \le p_k \le 1$, the information is always $I(s_k) \ge 0$.
3. If, for two events, $p_k > p_i$, then the information content satisfies $I(s_k) < I(s_i)$.
4. $I(s_k s_i) = I(s_k) + I(s_i)$ if $s_k$ and $s_i$ are statistically independent.
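As a quick check of these properties, here is a minimal Python sketch (the probability values are illustrative, not taken from the text) that computes $I(s_k) = \log_2(1/p_k)$ and verifies the additivity property for independent symbols.

    import math

    def self_information(p):
        # Self-information I(s_k) = log2(1/p_k), measured in bits
        return math.log2(1.0 / p)

    # Property 1: a certain event (p_k = 1) conveys zero information
    print(self_information(1.0))          # 0.0

    # Property 3: the less probable symbol carries more information
    print(self_information(0.5))          # 1.0 bit
    print(self_information(0.125))        # 3.0 bits

    # Property 4: for statistically independent symbols, information adds
    p_k, p_i = 0.5, 0.25                   # assumed illustrative probabilities
    print(self_information(p_k * p_i))                      # 3.0 bits
    print(self_information(p_k) + self_information(p_i))    # 3.0 bits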
The amount of information $I(s_k)$ produced by the source during an arbitrary signalling interval depends on the symbol $s_k$ emitted by the source at that time. Indeed, $I(s_k)$ is a discrete random variable that takes on the values $I(s_0), I(s_1), \ldots, I(s_{K-1})$ with probabilities $p_0, p_1, \ldots, p_{K-1}$, respectively. The mean of $I(s_k)$ over the source alphabet $S$ is given by
\[
H(S) = E[I(s_k)] = \sum_{k=0}^{K-1} p_k I(s_k) = \sum_{k=0}^{K-1} p_k \log_2\!\left(\frac{1}{p_k}\right)
\]
The quantity $H(S)$ is called the entropy of the source; it is a measure of the average information content per source symbol.
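As an illustration, a minimal Python sketch (with an assumed probability distribution, not one given in the text) that evaluates $H(S) = \sum_k p_k \log_2(1/p_k)$:

    import math

    def entropy(probs):
        # H(S) = sum_k p_k * log2(1/p_k); terms with p_k = 0 contribute nothing
        return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    # Assumed source alphabet probabilities p_0, ..., p_{K-1}
    probs = [0.5, 0.25, 0.125, 0.125]
    print(entropy(probs))   # 1.75 bits per source symbol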
For a source encoder that assigns a codeword of length $l_k$ (in bits) to symbol $s_k$, the average codeword length is
\[
\bar{L} = \sum_{k=0}^{K-1} p_k l_k
\]
and the coding efficiency is
\[
\eta = \frac{H(S)}{\bar{L}}
\]
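Continuing the same assumed example, a short sketch that computes the average codeword length and the resulting coding efficiency; the codeword lengths used here are illustrative (those of a prefix code such as 0, 10, 110, 111), not values given in the text.

    import math

    probs   = [0.5, 0.25, 0.125, 0.125]    # assumed symbol probabilities
    lengths = [1, 2, 3, 3]                  # assumed codeword lengths l_k (bits)

    # Average codeword length: L_bar = sum_k p_k * l_k
    L_bar = sum(p * l for p, l in zip(probs, lengths))

    # Source entropy: H(S) = sum_k p_k * log2(1/p_k)
    H = sum(p * math.log2(1.0 / p) for p in probs if p > 0)

    # Coding efficiency: eta = H(S) / L_bar
    eta = H / L_bar
    print(L_bar, H, eta)    # 1.75  1.75  1.0 (this code is 100% efficient)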