
M.A.M. COLLEGE OF ENGINEERING AND TECHNOLOGY
Siruganur, Tiruchirappalli – 621 105.

Course Code     : EC8501
Course Name     : Digital Communication
Year / Semester : III / V
Department      : ECE
Faculty Name    : S.Kavitha
Regulations     : AU-R2017
Academic year   : 2020-21


Communication system

The function of any communication system is to convey information from the source to the destination.


Discrete message

A discrete message is one selected from a finite number of predetermined messages. During one time interval one message is transmitted; during the next time interval the next message from the set is transmitted.

Memory source

A source with memory is one for which each emitted symbol depends on the previous symbols.

Memoryless source

A source is memoryless in the sense that the symbol emitted at any time is independent of previous choices.

The probabilistic experiment involves the observation of the output emitted by a discrete source during every unit of time. The source output is modeled as a discrete random variable, S, which takes on symbols from a fixed finite alphabet

S = {s0, s1, s2, · · · , sK-1}

with probabilities

P(S = sk) = pk,  k = 0, 1, · · · , K – 1

We assume that the symbols emitted by the source during successive signaling intervals are statistically independent. A source having the properties described above is called a discrete memoryless source: memoryless in the sense that the symbol emitted at any time is independent of previous choices.
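As a small illustrative sketch (the alphabet and symbol probabilities below are made-up example values, not from the notes), such a discrete memoryless source can be simulated by drawing each symbol independently of all previous ones:

import random

# Hypothetical source alphabet and symbol probabilities (must sum to 1).
alphabet = ["s0", "s1", "s2", "s3"]
probs = [0.5, 0.25, 0.125, 0.125]

def emit(n):
    """Emit n source symbols, each drawn independently of all
    previous choices (the memoryless property)."""
    return random.choices(alphabet, weights=probs, k=n)

print(emit(10))  # e.g. ['s0', 's2', 's0', 's0', 's1', ...]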

Source alphabet

The set of source symbols is called the source alphabet.


Symbols or letters

Each element of the set is called a symbol or letter.

Uncertainty

The amount of information contained in each symbol is closely related to its uncertainty or surprise.

If an event is certain (that is, there is no surprise; it has probability 1), it conveys zero information.

We can define the amount of information contained in each symbol as

I(sk) = logb(1/pk)

Here we generally use log2, since in digital communications we will be talking about bits. The above expression also tells us that when there is more uncertainty about a symbol (a lower probability of occurrence), it conveys more information.
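For example (a minimal sketch, with arbitrarily chosen probabilities), I(sk) in bits can be computed directly from this definition:

import math

def information_bits(pk):
    """Self-information I(sk) = log2(1/pk), in bits."""
    return math.log2(1.0 / pk)

# Halving the probability adds one bit of information:
for pk in (0.5, 0.25, 0.125):
    print(pk, information_bits(pk))   # 1.0, 2.0, 3.0 bits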

Unit of the information

The unit of information depends on the base b of the logarithmic function:

Unit                  Base b
bit (binit)           2
decit (or hartley)    10
natural unit (nat)    e

When pk = ½, we have I(sk) = 1 bit.
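The base of the logarithm only changes the unit, not the underlying quantity. A quick check of the three units for pk = ½ (assuming only the definition above):

import math

pk = 0.5
print(math.log(1 / pk, 2))    # 1.0    bit (base 2)
print(math.log(1 / pk, 10))   # ~0.301 decits/hartleys (base 10)
print(math.log(1 / pk))       # ~0.693 nats (base e)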

Some properties of information are summarized here; a small numerical check follows the list:

1. For a certain event, i.e. pk = 1, the information it conveys is zero: I(sk) = 0. We are absolutely certain of the outcome of the event.
2. For events with 0 ≤ pk ≤ 1, the information is always I(sk) ≥ 0. An event either provides some information or none, but never brings about a loss of information.


3. If for two events pk > pi, the information content is always I(sk) < I(si). The less probable an event is, the more information we gain when it occurs.
4. I(sj sk) = I(sj) + I(sk) if sj and sk are statistically independent.

Proof: If sj and sk are statistically independent, then p(sj, sk) = p(sj) p(sk), and so

I(sj, sk) = log(1 / p(sj, sk))
          = log(1 / (p(sj) p(sk)))
          = log(1 / p(sj)) + log(1 / p(sk))
          = I(sj) + I(sk)
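A quick numerical check of properties 1–4 (the probabilities are arbitrary illustrative values):

import math

def I(p):
    """Self-information in bits."""
    return math.log2(1.0 / p)

print(I(1.0))                  # property 1: certain event gives 0.0 bits
print(I(0.3) >= 0)             # property 2: information is never negative -> True
print(I(0.2) > I(0.8))         # property 3: less probable, more information -> True

# Property 4: for independent events p(sj, sk) = p(sj) * p(sk),
# so the joint information equals the sum of the individual amounts.
pj, pk = 0.5, 0.25
print(math.isclose(I(pj * pk), I(pj) + I(pk)))   # True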
Average information or entropy

The amount of information I(sk) produced by the source during an arbitrary signalling interval depends on the symbol sk emitted by the source at that time. Indeed, I(sk) is a discrete random variable that takes on the values I(s0), I(s1), · · · , I(sK-1) with probabilities p0, p1, · · · , pK-1, respectively.

The mean of I(sk) over the source alphabet S is given by

H(S) = E[I(sk)] = Σ pk log2(1/pk), summed over k = 0, 1, · · · , K – 1

The important quantity H(S) is called the entropy of a discrete memoryless source with source alphabet S. It is a measure of the average information content per source symbol. Note that the entropy H(S) depends only on the probabilities of the symbols in the alphabet S of the source.
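As an illustration (the four symbol probabilities below are made-up example values, not from the notes), the entropy of a discrete memoryless source can be computed directly from this formula:

import math

def entropy(probs):
    """H(S) = sum of pk * log2(1/pk) over all symbols, in bits per symbol."""
    # Terms with pk = 0 are skipped, following the convention 0 * log(1/0) = 0.
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # hypothetical symbol probabilities (sum to 1)
print(entropy(probs))               # 1.75 bits per symbol

As expected, H(S) depends only on the probabilities, not on which particular symbols carry them.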

