Siruganur, Tiruchirappalli - 621 105
Communication system
Discrete message
During one time interval, one message from the set is transmitted; during the next time
interval, another message from the set is transmitted.
Memory source
A source with memory for which each symbol depends on the previous
symbols.
Memoryless source
A source for which each symbol is emitted independently of all previous symbols,
each symbol occurring with a fixed probability.
Source alphabet
The set of symbols or letters that the source can emit.
Uncertainty
Bit (binit): logarithm base 2
Natural unit (nat): logarithm base e
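The choice of logarithm base only changes the unit. A quick Python check (not part of the original notes; p = 0.25 is an assumed example probability) shows that 1 nat = log2(e) ≈ 1.4427 bits:

```python
import math

# Information content of an event with probability p, in each unit.
p = 0.25
info_bits = math.log2(1 / p)  # base-2 logarithm  -> bits (binits)
info_nats = math.log(1 / p)   # natural logarithm -> nats

print(info_bits)                       # 2.0 bits
print(info_nats)                       # about 1.386 nats
print(info_nats * math.log2(math.e))   # converts nats to bits: 2.0
```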
3. If for two events pk > pi, then the information content satisfies I(sk) < I(si).
The less probable an event is, the more information we gain when it occurs.
4. I(sj sk) = I(sj) + I(sk) if sj and sk are statistically independent.
Proof:
p(sj, sk) = p(sj) p(sk) if sj and sk are statistically independent.
I(sj, sk) = log (1 / p(sj, sk))
= log (1 / (p(sj) p(sk)))
= log (1 / p(sj)) + log (1 / p(sk))
= I(sj) + I(sk)
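The additivity property above can be checked numerically. This is a small illustrative sketch (not part of the original notes); the probabilities 0.5 and 0.125 are assumed purely for the example.

```python
import math

def info(p):
    """Information content I = log2(1/p), in bits."""
    return math.log2(1 / p)

# Two statistically independent symbols (illustrative probabilities).
p_j, p_k = 0.5, 0.125
p_joint = p_j * p_k  # independence: p(sj, sk) = p(sj) p(sk)

# I(sj, sk) equals I(sj) + I(sk)
print(info(p_joint))           # 4.0
print(info(p_j) + info(p_k))   # 1.0 + 3.0 = 4.0
```

The joint information (4 bits) is exactly the sum of the individual informations, matching the proof.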
Average information or entropy
The important quantity H(φ) is called the entropy of a discrete memoryless source
with source alphabet φ. It is a measure of the average information content per
source symbol:
H(φ) = Σk pk log2 (1 / pk)
Note that the entropy H(φ) depends only on the probabilities of the
symbols in the alphabet φ of the source, not on the symbols themselves.
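The average information content per symbol can be computed directly from the symbol probabilities. This sketch (not part of the original notes; the probability lists are assumed examples) sums pk log2(1/pk) over the alphabet:

```python
import math

def entropy(probs):
    """Entropy H = sum of p * log2(1/p) over the alphabet, in bits/symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# Illustrative source alphabets.
print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits/symbol
print(entropy([0.25] * 4))         # 2.0 bits/symbol (equiprobable: log2 K)
```

As the second example shows, an equiprobable alphabet of K symbols has entropy log2 K, the maximum for that alphabet size.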