
Information Theory and Coding

Lecture 1

By Dr Wassan Saad

4th Electrical Engineering
Syllabus

The aims of this course are to introduce the principles and applications of
information theory. The course will study how information is measured in terms of
probability and entropy, and the relationships among conditional and joint entropies;
how these are used to calculate the capacity of a communication channel, with and
without noise; coding schemes, including error correcting codes; how discrete
channels and measures of information generalize to their continuous forms.

Resources

1. Stefan M. Moser and Po-Ning Chen, A Student's Guide to Coding and Information Theory, National Chiao Tung University (NCTU), Hsinchu, Taiwan.
2. Monica Borda, Fundamentals in Information Theory and Coding.
3. David J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Version 7.2 (fourth printing), March 28, 2005.
4. Thomas M. Cover and Joy A. Thomas, Elements of Information Theory.
5. Mark Kelbert (Swansea University and Universidade de São Paulo) and Yuri Suhov (University of Cambridge and Universidade de São Paulo), Information Theory and Coding by Example.

Scores

Attendance 15%
Homework 15%
Quizzes 15%
Other 5%
Mid 1 25%
Mid 2 25%

Outline

• Introduction to Information Theory
  o Information theory versus coding theory
  o Model and basic operations of information processing systems
  o Physical interpretation of the amount of information
  o Definition of information (measure of information)
  o Properties of information
• Entropy
  o Properties of entropy
• Information Rate

Introduction

Modern digital communication depends on information theory, which was founded in the 1940s by Claude E. Shannon. Shannon published "A Mathematical Theory of Communication" in 1948, providing a mathematical model for communication.

Systems dedicated to the communication or storage of information are commonplace in everyday life. Generally speaking, a communication system is a system which sends information from one place to another. Examples include telephone networks, computer networks, audio/video broadcasting, etc. Storage systems, e.g. magnetic and optical disk drives, are systems for the storage and later retrieval of information. In a sense, such systems may be regarded as communication systems which transmit information from now (the present) to then (the future). Whenever or wherever problems of information processing arise, there is a need to know how to compress the textual material and how to protect it against possible corruption.

1.1 Information theory versus coding theory

Information theory is a branch of probability theory with extensive applications to communication systems. Like several other branches of mathematics, information theory has a physical origin. It was initiated by communication scientists who were studying the statistical structure of electrical communication equipment and was principally founded by Claude E. Shannon through the landmark contribution [Sha48] on the mathematical theory of communication.

Coding theory is mainly concerned with explicit methods for efficient and reliable data transmission or storage, which can be roughly divided into data compression and error-control techniques. Of the two, the former attempts to compress the data from a source in order to transmit or store them more efficiently. This practice is found every day on the Internet, where data are usually transformed into the ZIP format to make files smaller and reduce the network load.

The latter adds extra data bits to make the transmission of data more robust to channel disturbances. Although people may not be aware of its existence in many applications, its impact has been crucial to the development of the Internet, the popularity of compact discs (CD), the feasibility of mobile phones, the success of deep-space missions, etc.

Logically speaking, coding theory leads to information theory, and information theory provides the performance limits on what can be done by suitable encoding of the information. Thus the two theories are intimately related, although in the past they have been developed to a great extent quite separately.

Information theory answers two fundamental questions in communication theory: what is the ultimate data compression (answer: the entropy H), and what is the ultimate transmission rate of communication (answer: the channel capacity C).

1.2 Model and basic operations of information processing systems

Communication and storage systems can be regarded as examples of information processing systems and may be represented abstractly by the block diagram in Figure 1.1. In all cases, there is a source from which the information originates. The information source may be many things; for example, a book, music, or video are all information sources in daily life.

The source output is processed by an encoder to facilitate the transmission (or storage) of the information. In communication systems, this function is often called a transmitter, while in storage systems we usually speak of a recorder. In general, three basic operations can be executed in the encoder: source coding, channel coding, and modulation.

The output of the encoder is then transmitted through some physical communication
channel (in the case of a communication system) or stored in some physical storage
medium (in the case of a storage system). As examples of the former we mention
wireless radio transmission based on electromagnetic waves, telephone
communication through copper cables, and wired high-speed transmission through
fiber optic cables. As examples of the latter we indicate magnetic storage media,
such as those used by a magnetic tape, a hard-drive, or a floppy disk drive, and
optical storage disks, such as a CD-ROM or a DVD. Each of these examples is
subject to various types of noise disturbances. On a telephone line, the disturbance
may come from thermal noise, switching noise, or crosstalk from other lines. On
magnetic disks, surface defects and dust particles are regarded as noise disturbances.
Regardless of the explicit form of the medium, we shall refer to it as the channel.

Information conveyed through (or stored in) the channel must be recovered at the
destination and processed to restore its original form. This is the task of the decoder.
In the case of a communication system, this device is often referred to as the receiver.
In the case of a storage system, this block is often called the playback system. The
output of the decoder is then presented to the final user, which we call the
information sink.

1.3 Physical Interpretation of information

The performance of a communication system is measured in terms of its error probability. Errorless transmission is possible when the probability of error at the receiver approaches zero.

The performance of the system depends upon the available signal power, channel noise, and bandwidth. Based on these parameters it is possible to establish the conditions for errorless transmission. These conditions are referred to as Shannon's theorem.

Consider a source which emits discrete symbols randomly from a fixed alphabet, i.e.

$$X = \{x_0, x_1, x_2, \ldots, x_{K-1}\} \qquad (1.1)$$

The various symbols in $X$ have probabilities $p_0, p_1, p_2, \ldots$, which can be written as

$$P(X = x_k) = p_k, \quad k = 0, 1, 2, \ldots, K-1 \qquad (1.2)$$

The set of probabilities satisfies the condition

$$\sum_{k=0}^{K-1} p_k = 1$$
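As a small illustration (not part of the original notes; the alphabet and probabilities below are hypothetical), such a discrete source can be represented in Python as a probability mass function and the normalization condition checked directly:

```python
import math

# Hypothetical discrete source: alphabet and symbol probabilities p_k
alphabet = ["x0", "x1", "x2", "x3"]
probs = [0.5, 0.25, 0.125, 0.125]

assert len(alphabet) == len(probs)
# Normalization condition: the p_k must sum to 1
assert math.isclose(sum(probs), 1.0)
```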

Such a source is called a discrete information source. The concept of the "information" produced by the source is discussed in the next section. The idea of information is related to "uncertainty" or "surprise". Consider the emission of a symbol $X = x_k$ from the source. If the probability $p_k = 0$, then the symbol is impossible; similarly, if $p_k = 1$, then the symbol is certain. In both cases there is no surprise, and hence no information is produced when $x_k$ is emitted. The lower the probability $p_k$, the more surprise or uncertainty there is.

Let us consider the information content of the following messages:

1. Tomorrow the sun will rise from the east.
2. It will snow in Baghdad this winter.
3. The prime minister will call you in the next one hour.

The first statement does not carry any information, since it is certain that the sun always rises from the east. The probability of occurrence of the first event is high, so it carries negligible information.

In the winter season, snowfall in Baghdad is rare. The probability of occurrence of this event is therefore low to medium, so it carries a moderate amount of information.

The third statement predicts a phone call from the prime minister, which is a very rare event and almost impossible within the next hour. Hence this message carries a large amount of information.

1.4 Definition of Information (measure of Information)

Consider a communication system which transmits messages $m_1, m_2, m_3, \ldots$ with probabilities of occurrence $p_1, p_2, p_3, \ldots$. The amount of information carried by the message $m_k$ with probability $p_k$ is given by

$$I_k = \log_2 \frac{1}{p_k}$$

Unit of information

Normally the unit of information is the 'bit', which we use here as an abbreviation for 'binary digit'.
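As a quick illustration (added here; not part of the original notes), the formula can be evaluated in a few lines of Python:

```python
import math

def information_bits(p):
    """Amount of information I = log2(1/p), in bits, of a symbol with probability p."""
    if not 0 < p <= 1:
        raise ValueError("probability must lie in (0, 1]")
    return math.log2(1 / p)

print(information_bits(0.25))  # 2.0 bits: a symbol with p = 1/4 carries 2 bits
print(information_bits(1.0))   # 0.0 bits: a certain symbol carries no information
```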

1.5 Properties of Information

1. If there is more uncertainty about the message, the information carried is also more.
2. If the receiver already knows the message being transmitted, the amount of information carried is zero.
3. If $I_1$ is the information carried by message $m_1$, and $I_2$ is the information carried by $m_2$, then the amount of information carried jointly by $m_1$ and $m_2$ is $I_1 + I_2$.
4. If there are $M = 2^N$ equally likely messages, then the amount of information carried by each message will be $N$ bits.

Example 1: Calculate the amount of information if $p_k = 1/4$.

Example 2: Calculate the amount of information if binary digits (binits) occur with equal likelihood in binary PCM.
Solution:

Example 3: In binary PCM, if "0" occurs with probability 1/4 and "1" occurs with probability 3/4, calculate the amount of information conveyed by each binit.
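The arithmetic behind these three examples can be checked with the following Python sketch (added as an illustration, not part of the original notes):

```python
import math

def information_bits(p):
    return math.log2(1 / p)

# Example 1: p_k = 1/4  ->  I = log2(4) = 2 bits
print(information_bits(1 / 4))   # 2.0

# Example 2: equally likely binits, p = 1/2  ->  I = 1 bit per binit
print(information_bits(1 / 2))   # 1.0

# Example 3: p("0") = 1/4 and p("1") = 3/4
print(information_bits(1 / 4))   # 2.0 bits conveyed by "0"
print(information_bits(3 / 4))   # ~0.415 bits conveyed by "1"
```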

Example 4: If there are $M$ equally likely and independent messages, then prove that the amount of information carried by each message will be $N$ bits, where $M = 2^N$ and $N$ is an integer.
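A one-line proof sketch (added for completeness): each message has probability $p_k = 1/M$, so

$$I_k = \log_2 \frac{1}{1/M} = \log_2 M = \log_2 2^N = N \ \text{bits}$$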

Example 5: If $I_1$ is the information carried by message $m_1$ and $I_2$ is the information carried by message $m_2$, then prove that the amount of information carried jointly by $m_1$ and $m_2$ is $I_{1,2} = I_1 + I_2$.
Solution:
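A proof sketch (added for completeness, assuming the two messages are independent): the joint probability is the product $p_1 p_2$, so

$$I_{1,2} = \log_2 \frac{1}{p_1 p_2} = \log_2 \frac{1}{p_1} + \log_2 \frac{1}{p_2} = I_1 + I_2$$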

Homework 1: Prove that if the receiver knows the message being transmitted, the amount of information carried is zero.

2.1 Entropy

Entropy is the average information content of symbols in long, independent sequences.
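For a discrete source with symbol probabilities $p_0, \ldots, p_{K-1}$ as defined above, this average is the standard expression $H = \sum_k p_k \log_2(1/p_k)$ bits/symbol. A minimal Python sketch (added as an illustration) computes it:

```python
import math

def entropy_bits(probs):
    """Entropy H = sum of p*log2(1/p), in bits/symbol; terms with p = 0 contribute nothing."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits/symbol for 4 equally likely symbols
print(entropy_bits([0.5, 0.5]))                # 1.0 bit/symbol for a fair binary source
```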

2.2 Properties of Entropy

Homework 2: Calculate the entropy when $p_k = 0$ and when $p_k = 1$.

3.1 Information Rate

3.2 Information Rate second law

Example 1: If some transmission scheme contains 4 messages, calculate the information rate if all messages are equally likely, knowing that r = 1.875 messages/sec.

Solution: With 4 equally likely messages, the entropy is $H = \log_2 4 = 2$ bits/message, so the information rate is

$$R = rH = 1.875 \times 2 = 3.75 \ \text{bits/second}$$
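A short Python check of this calculation (added as an illustration):

```python
import math

def entropy_bits(probs):
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

r = 1.875                      # message rate, messages/sec
H = entropy_bits([0.25] * 4)   # 4 equally likely messages -> H = 2 bits/message
print(r * H)                   # information rate R = r*H = 3.75 bits/second
```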

Homework 3: A source produces dots "." and dashes "—" with p(dot) = 0.65. If the time duration of a dot is 200 ms and that of a dash is 800 ms, find the average source entropy rate.

