
Information Theory

INTRODUCTION

Information theory is a branch of probability theory which can be applied to the study of communication systems. In general, communication of information is statistical in nature, and the main aim of information theory is to study simple ideal statistical communication models. Information theory was invented by communication scientists while they were studying the statistical structure of electrical communication equipment.

Communication systems deal with the flow of some sort of information in some network. The information may be electrical signals, words, digital data, pictures, music, etc. There are three basic blocks of a communication system:
(i) Transmitter or source.
(ii) Channel or transmission network, which conveys the communique from transmitter to receiver.
(iii) Receiver or destination.

Figure 10.1 shows the simplest form of a communication system. In practice, generally, there are a number of transmitters and receivers with a complex network. In such cases, it is desirable to study the distribution of information in the system. Therefore, some sort of transmission efficiency is to be defined which will lead to the most efficient transmission.

Transmitter → Channel → Receiver

Fig. 10.1 A Communication System

When the communique is readily measurable, such as an electric current, the study of the communication system is relatively easy. But when the communique is information, the study becomes rather difficult. How to define the measure for an amount of information? And having defined a suitable measure, how can it be applied to improve the communication of information? Information theory answers these questions.
10.1 MEASURE OF INFORMATION

A communication system can never be considered deterministic; its performance is of statistical nature, i.e. it can never be described in a deterministic sense. It is always described in statistical terms.

The most significant feature of the communication system shown in Fig. 10.1 is its randomness or unpredictability. The transmitter transmits at random any one of the pre-specified messages, and the probability of transmitting each individual message is known. When the communication system model is statistically defined, we are able to describe its overall or average performance. Thus, our quest for an amount of information is virtually a search for a statistical parameter associated with the probability scheme. The parameter should indicate a relative measure of the importance of each message in the message ensemble.

The old principle of the media world, "if a dog bites a man, it's not news, but if a man bites a dog, it's news", helps us in this regard. The probability of a dog biting a man is quite high, so this is not news, i.e. very little information is communicated by the message "a dog bites a man". On the other hand, the probability of a man biting a dog is extremely small, so this becomes news, i.e. quite an amount of information is communicated by the message "a man bites a dog". Thus, we see that there should be some sort of inverse relationship between the probability of an event and the amount of information associated with it. The more probable an event, the less is the amount of information associated with it, and vice versa. Thus,

    I(x_j) = f[1/p(x_j)]        (10.1.1)

where x_j is an event with a probability p(x_j) and the amount of information associated with it is I(x_j).

Now, let there be another event y_k such that x_j and y_k are independent. Hence, the probability of the joint event is p(x_j, y_k) = p(x_j) p(y_k), with associated information content

    I(x_j, y_k) = f[1/p(x_j, y_k)] = f[1/(p(x_j) p(y_k))]        (10.1.2)

The total information I(x_j, y_k) must be equal to the sum of the individual informations I(x_j) and I(y_k), where I(y_k) = f[1/p(y_k)]. Thus, it can be seen that the function on the RHS of Eq. 10.1.2 must be a function which converts the operation of multiplication into addition. Logarithm is one such function. Thus,

    I(x_j, y_k) = log[1/(p(x_j) p(y_k))]
                = log[1/p(x_j)] + log[1/p(y_k)]
                = I(x_j) + I(y_k)

Hence, the basic equation defining the amount of information (or self-information) is

    I(x_j) = log[1/p(x_j)] = -log p(x_j)        (10.1.3)

Different units of information can be defined for different bases of logarithms. When the base is 2, the unit is the bit; when the base is e, the unit is the nat; and when the base is 10, the unit is the decit or Hartley. The base 2, or binary, system is of particular importance. Hence, when no base is mentioned, it is to be assumed as 2 and the unit of information as the bit.

Table 10.1.1  Conversion of Information Units

Unit               Bits (base 2)               Nats (base e)               Decits (base 10)
Bits (base 2)      1 bit = 1 bit               1 bit = 1/log2(e)           1 bit = 1/log2(10)
                                                     = 0.6932 nat                = 0.3010 decit
Nats (base e)      1 nat = 1/ln 2              1 nat = 1 nat               1 nat = 1/ln 10
                         = 1.4426 bits                                           = 0.4342 decit
Decits (base 10)   1 decit = 1/log10(2)        1 decit = 1/log10(e)        1 decit = 1 decit
                           = 3.3219 bits               = 2.3026 nats

Useful values: log2(2) = 1, log2(e) = 1.4426, log2(10) = 3.3219; ln 2 = 0.6932, ln e = 1, ln 10 = 2.3026; log10(2) = 0.3010, log10(e) = 0.4342, log10(10) = 1.
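Eq. 10.1.3 and the conversions of Table 10.1.1 are easy to verify numerically. The following Python sketch is purely illustrative (the function name self_information is chosen here, not taken from the text); it computes the self-information of an event in bits, nats and decits and checks the table's conversion factors.

    import math

    def self_information(p, base=2):
        # I(x) = -log_base p(x), Eq. 10.1.3
        if not 0 < p <= 1:
            raise ValueError("probability must lie in (0, 1]")
        return -math.log(p, base)

    p = 0.25
    bits = self_information(p, 2)          # 2.0 bits
    nats = self_information(p, math.e)     # about 1.386 nats
    decits = self_information(p, 10)       # about 0.602 decit

    # Table 10.1.1: 1 bit = 0.6932 nat = 0.3010 decit
    assert abs(nats - bits * math.log(2)) < 1e-12
    assert abs(decits - bits * math.log10(2)) < 1e-12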

10.2 ENTROPY

A communication system is not only meant to deal with a single message, but with all possible messages. Hence, although the instantaneous information flows corresponding to individual messages from the source may be erratic, we may describe the source in terms of the average information per individual message, known as the entropy of the source.

It is necessary at this stage to understand the difference between 'arithmetic average' and 'statistical average'.

The concept of 'arithmetic average' is applicable when the quantities to be dealt with are deterministic in nature and hence are to be considered just once. An example is to find the average height of students in a class. Let there be M students in the class and their heights be h_j (j = 1, 2, 3, ..., M). The average height will then be (arithmetic average)

    (Σ_{j=1}^{M} h_j) / M
Let us apply the same definition to a problem involving non-deterministic (probabilistic, statistical) quantities, like the informations associated with every transmitter message (symbol). Let there be M transmitter messages and let the information associated with the j-th message be I_xj (j = 1, 2, 3, ..., M). Then, according to the definition of arithmetic average, the average information per transmitted message would be

    (Σ_{j=1}^{M} I_xj) / M

But this definition is not correct. The reason is simple. The transmitted messages are not transmitted just once. (As a comparison, the height of every student is considered just once.) Over a given length of time, they are transmitted many times, but the number of times a message is transmitted depends on its probability of occurrence. For example, if a message occurring with probability 0.1 is transmitted 100 times in a given time interval, then another message occurring with probability 0.15 will be transmitted 150 times in the same time interval. Thus, it is clear that the contribution of the second message's information to the average information per message will be 50% more than that of the first message's information. In other words, the contribution of an individual message's information to the average information per message is weighted, the weighting factor being the probability of occurrence of the respective message.

Therefore, the procedure to find the average will be to find the total of all quantities over a sufficiently long period of time, divided by the number of quantities occurring during that time interval.

This is the 'statistical average'. It is to be noted that since in communication theory we have to deal with statistical quantities and not deterministic quantities, whenever 'average' is referred to, it always means 'statistical average' and not 'arithmetic average'.

The average information per individual message can now be calculated in the following manner. Let there be M different messages m1, m2, ..., mM with their respective probabilities of occurrence p1, p2, ..., pM. Let us assume that in a long time interval, L messages have been generated. Let L be very large so that L >> M; then, the number of messages m1 = p1 L.

The amount of information in message m1 = log (1/p1). Thus, the total amount of information in all m1 messages = p1 L log (1/p1).

The total amount of information in all L messages will then be

    I = p1 L log (1/p1) + p2 L log (1/p2) + ··· + pM L log (1/pM)

The average information per message, or entropy, will then be

    H = I/L = p1 log (1/p1) + p2 log (1/p2) + ··· + pM log (1/pM)

      = Σ_{k=1}^{M} p_k log (1/p_k) = -Σ_{k=1}^{M} p_k log p_k        (10.2.1)
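As a quick numerical check of Eq. 10.2.1, the short Python sketch below (illustrative only; the helper name entropy is an assumption, not from the text) computes H for a given set of message probabilities, treating p log(1/p) as zero when p = 0.

    import math

    def entropy(probs, base=2):
        # H = sum_k p_k log(1/p_k), Eq. 10.2.1; terms with p_k = 0 contribute nothing
        assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return sum(p * math.log(1.0 / p, base) for p in probs if p > 0)

    print(entropy([1.0]))               # 0.0 -> a single certain message carries no information
    print(entropy([0.5, 0.5]))          # 1.0 bit/message
    print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits/message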
If there is only a single possible message, i.e. M = 1 and p_k = p1 = 1, then

    H = p1 log (1/p1) = 1 log (1/1) = 0

Thus, it can be seen that in the case of a single possible message, the reception of that message conveys no information.

On the other hand, let there be only one message out of the M messages having a probability of 1 and all others 0. In that case,

    H = Σ_{k=1}^{M} p_k log (1/p_k)
      = p1 log (1/p1) + lim_{p→0} [ p log (1/p) + p log (1/p) + ··· ]
      = 1 log (1/1) + 0
      = 0

Thus, if all the probabilities are zero except for one, which ought to be unity, the entropy is zero. In all the other cases, the entropy is greater than 0, as can be seen from Eq. 10.2.1.

For a binary system (M = 2), the entropy is

    H = p1 log (1/p1) + p2 log (1/p2)

Let p1 = p; then p2 = 1 - p1 = 1 - p = q. Hence,

    H = p log (1/p) + (1 - p) log (1/(1 - p))        (10.2.2)
      = p log (1/p) + q log (1/q) = H(p) = H(q)

A plot of H as a function of p, as in Eq. 10.2.2, is shown in Fig. 10.2.1.

The condition for maximum entropy and its value can be found as follows. Differentiating Eq. 10.2.2 w.r.t. p and equating it to zero yields

    dH/dp = 0 = -log e - log p + log e + log (1 - p)
    i.e.  log p = log (1 - p)
    i.e.  p = 1 - p
    i.e.  p = 0.5

This concludes that there is either a maxima or a minima at p = 0.5. If the second derivative of H is positive, then there is a minima, and if it is negative, then it is a maxima. Now,

    d^2 H/dp^2 = -(1/p + 1/(1 - p)) log e < 0

Hence, H has a maximum at p = 0.5.

Fig. 10.2.1  Entropy H of a binary source as a function of p

The maximum value of H can be found from Eq. 10.2.2 by putting p = 0.5 in it. Thus,

    H_max = H|_{p = 0.5} = 0.5 log 2 + 0.5 log 2 = 1 bit/message

We have seen that for the binary case (M = 2), the entropy is maximum when p = 0.5, i.e. when both the messages are equally likely. Similarly, it can be shown that for an M-ary case, the entropy is maximum when all the messages are equally likely. Thus, p1 = p2 = ··· = pM = 1/M. In this case the maximum entropy is

    H_max = Σ_{k=1}^{M} p_k log (1/p_k) = M (1/M) log M

or

    H_max = log M bits/message        (10.2.3)
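Eq. 10.2.2 and Eq. 10.2.3 can also be checked numerically. The sketch below is illustrative only (binary_entropy is a name chosen here); it confirms that an equiprobable M-ary source gives H = log M and that the binary entropy peaks at p = 0.5.

    import math

    def binary_entropy(p):
        # H(p) = p log(1/p) + (1 - p) log(1/(1 - p)), Eq. 10.2.2, in bits
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    # Equiprobable M-ary source: H_max = log2(M) bits/message (Eq. 10.2.3)
    M = 8
    H_eq = sum((1 / M) * math.log2(M) for _ in range(M))
    assert abs(H_eq - math.log2(M)) < 1e-12      # 3 bits/message for M = 8

    # Binary source: entropy is largest at p = 0.5, where it equals 1 bit/message
    p_grid = [k / 100 for k in range(1, 100)]
    p_best = max(p_grid, key=binary_entropy)
    assert abs(p_best - 0.5) < 1e-12 and abs(binary_entropy(0.5) - 1.0) < 1e-12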


[It is interesting to consider a situation when all messages are equiprobable. In this case,

    p1 = p2 = ··· = p_k = ··· = pM = 1/M

Hence,

    Average information = Σ_{k=1}^{M} p_k I_xk = (Σ_{k=1}^{M} I_xk) / M

This is the same as the arithmetic average. The reason is simple. The relative weight of all the quantities is the same, because they are equiprobable; hence the weighting is redundant and the definition of arithmetic average is applicable.

Thus, it can be concluded that the statistical average is equal to the arithmetic average when all quantities are equiprobable.]
The important properties of entropy can now be summarised as follows:
(i) log M ≥ H ≥ 0.
(ii) H = 0 if all probabilities are zero, except for one, which must be unity.
(iii) H = log M if all the probabilities are equal, so that p(x_i) = p_i = 1/M for all i's.

Now, let us examine H under different cases for M = 2:

Case I:   p1 = 0.01, p2 = 0.99, H = 0.08
Case II:  p1 = 0.4,  p2 = 0.6,  H = 0.97
Case III: p1 = 0.5,  p2 = 0.5,  H = 1.00

In case I, it is very easy to guess whether the message m1 with a probability p1 (= 0.01) will occur, or the message m2 with a probability p2 (= 0.99) will occur. (Most of the time message m2 will occur.) Thus, in this case, the uncertainty is less. In case II, it is somewhat difficult to guess whether m1 will occur or m2 will occur, as their probabilities are nearly equal. Thus, in this case, the uncertainty is more. In case III, it is extremely difficult to guess whether m1 or m2 will occur, as their probabilities are equal. Thus, in this case, the uncertainty is maximum. We have seen that the entropy is less when uncertainty is less and it is more when uncertainty is more. Thus, we can say that entropy is a measure of uncertainty.
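The three cases above can be reproduced with a couple of lines of Python (illustrative only), showing the entropy rising as the two messages become harder to tell apart:

    import math

    for p1, p2 in [(0.01, 0.99), (0.4, 0.6), (0.5, 0.5)]:
        H = -(p1 * math.log2(p1) + p2 * math.log2(p2))
        print(f"p1 = {p1}, p2 = {p2} -> H = {H:.2f} bits/message")
    # prints H = 0.08, 0.97 and 1.00 for cases I, II and III respectively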

10.3 RATE OF INFORMATION

If a message source generates messages at the rate of r messages per second, the rate of information R is defined as the average number of bits of information per second. H is the average number of bits of information per message. Hence,

    R = rH bits/sec        (10.3.1)

Let us consider two sources of equal entropy H, generating r1 and r2 messages per second, respectively. The first source will transmit the information at a rate R1 = r1 H and the second source will transmit the information at a rate R2 = r2 H. Now, if r1 > r2, then R1 > R2. Thus, in a given period, more information is transmitted from the first source than from the second source, placing greater demands on the communication channel. Hence, the source is not described by its entropy alone but also by its rate of information. Sometimes, R is referred to as the bits/sec entropy, and H is referred to as the bits/message entropy.
Example 10.3.1  An event has six possible outcomes with the probabilities p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16, p5 = 1/32 and p6 = 1/32. Find the entropy of the system. Also find the rate of information if there are 16 outcomes per second.

Solution  The entropy H is

    H = Σ_{k=1}^{6} p_k log (1/p_k)
      = (1/2) log 2 + (1/4) log 4 + (1/8) log 8 + (1/16) log 16 + (1/32) log 32 + (1/32) log 32
      = 31/16 bits/message

Now, r = 16 outcomes/sec. Hence, the rate of information R is

    R = rH = 16 × (31/16) = 31 bits/sec
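The arithmetic of Example 10.3.1 can be checked with a few lines of Python (an illustrative sketch, not part of the original solution):

    import math

    probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
    H = sum(p * math.log2(1 / p) for p in probs)   # 1.9375 bits/message, i.e. 31/16
    R = 16 * H                                     # 16 outcomes/sec -> 31.0 bits/sec
    print(H, R)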

Example 10.3.2  A continuous signal is bandlimited to 5 kHz. The signal is quantized in 8 levels of a PCM system with the probabilities 0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05 and 0.05. Calculate the entropy and the rate of information.

Solution  The signal should be sampled at a frequency 5 × 2 = 10 kHz (sampling theorem). Each sample is then quantized to one of the eight levels. Looking at each quantized level as a message, we get

    H = -(0.25 log 0.25 + 0.2 log 0.2 + 0.2 log 0.2 + 0.1 log 0.1 + 0.1 log 0.1
          + 0.05 log 0.05 + 0.05 log 0.05 + 0.05 log 0.05)
      = 2.74 bits/message

As the sampling frequency is 10 kHz, the message rate = 10,000 messages/sec. Hence, the rate of information is

    R = rH = 10,000 × 2.74 = 27,400 bits/sec
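Example 10.3.2 can be verified the same way (illustrative sketch; the small difference from 27,400 comes from rounding H to 2.74 before multiplying):

    import math

    levels = [0.25, 0.2, 0.2, 0.1, 0.1, 0.05, 0.05, 0.05]   # quantisation-level probabilities
    H = -sum(p * math.log2(p) for p in levels)              # about 2.74 bits/message
    fs = 2 * 5_000                                          # Nyquist sampling rate for 5 kHz
    R = fs * H                                              # about 27,400 bits/sec
    print(round(H, 2), round(R))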
