ECE503: Communications Engineering 4: Dr. Fatma Khallaf
Lecture 1
Email: [email protected]
Spring 2025
Course Code
cecf57n
ECE503: Communications Engineering 4 (2025) | General | Microsoft Teams
General Information
Course Materials
1 Communications Engineering 1
All types of AM (DSB-LC, DSB-SC, SSB, VSB, QAM) – AM modulators and
demodulators, advantages and disadvantages – Synchronization circuits – AM
applications: telephone-channel multiplexing and the superheterodyne receiver –
Angle modulation – Narrow-band angle-modulated signals – Spectrum of a
sinusoidal signal (narrow-band and wide-band) – Generation of wide-band FM
(indirect and direct methods) – Demodulation (slope detector, PLL) – De-emphasis
and pre-emphasis filtering – Compatible stereo – Intersystem comparison –
Sampling process – PAM – Quantization (uniform and non-uniform) – PCM –
Time-division multiplexing – Delta and adaptive delta modulation – Differential
PCM – Random processes – Stationary and ergodic processes – Mean, correlation,
and covariance functions – Power spectral density – Narrow-band noise.
2 Communications Engineering 2
3 Communications Engineering 3
DFT and its properties – Fading (fast, slow, and flat) – Frequency-selective and
non-selective fading – Discrete Multi-Tone (DMT) – OFDM – Multi-path
propagation – Delay-spread values – Guard time and cyclic extension – OFDM
parameters – OFDM versus single-carrier modulation – Spread spectrum – PN
sequence generators – Direct-sequence spread spectrum – Probability of error –
Frequency-hopping spread spectrum – CDMA – DS-CDMA.
4 Communications Engineering 4
Contents
• Introduction
• Overview: What is Information Theory?
• Historical Background
• Probability Rules
• Entropy
• Mutual Information
Introduction
➢ Information is the reduction of uncertainty. Imagine your
friend invites you to dinner for the first time. When you
arrive at the building where he lives, you find that you have
misplaced his apartment number. He lives in a building with
4 floors and 8 apartments on each floor. If a passing neighbor
tells you that your friend lives on the top floor, your
uncertainty about where he lives drops from 32 choices to 8.
By reducing your uncertainty, the neighbor has conveyed
information to you. How can we quantify the amount of
information?
➢ Information theory is the branch of mathematics that describes
how uncertainty should be quantified, manipulated, and
represented.
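The apartment example can be checked numerically: information is the reduction in uncertainty, measured in bits as a difference of base-2 logarithms. A minimal sketch (variable names are mine, not from the slides):

```python
import math

# 4 floors x 8 apartments = 32 equally likely apartments,
# as in the dinner-invitation example above.
total_choices = 4 * 8

# Uncertainty before the hint: log2(32) = 5 bits.
before = math.log2(total_choices)

# After "he lives on the top floor", 8 apartments remain: log2(8) = 3 bits.
after = math.log2(8)

# Information conveyed by the neighbor = reduction in uncertainty.
information = before - after
print(information)  # 2.0 bits
```

Halving the number of possibilities would correspond to exactly 1 bit; here the hint cuts 32 choices down to 8, a factor of 4, hence 2 bits.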
Overview
What is Information Theory?
Historical Background
Probability Rules
What are random variables? What is probability?
• Random variables are variables that take on values determined by
probability distributions.
1. Sum rule (marginalization): p(x) = Σy p(x, y).
Probability Rules
• Probability theory rests upon two rules:
2. Product rule: p(x, y) = p(x|y) p(y).
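The two rules of probability, the sum rule (marginalization) and the product rule, can be illustrated on a small joint distribution. A sketch with hypothetical weather values:

```python
# A small joint distribution p(x, y) as a dict (hypothetical values).
p_xy = {
    ('sun', 'warm'): 0.4, ('sun', 'cold'): 0.1,
    ('rain', 'warm'): 0.2, ('rain', 'cold'): 0.3,
}

# Sum rule (marginalization): p(x) = sum over y of p(x, y).
p_x = {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p

# Product rule: p(x, y) = p(y|x) p(x), so p(y|x) = p(x, y) / p(x).
p_y_given_x = {(x, y): p / p_x[x] for (x, y), p in p_xy.items()}

# Check: the product rule reconstructs the joint exactly.
for (x, y), p in p_xy.items():
    assert abs(p_y_given_x[(x, y)] * p_x[x] - p) < 1e-12

print(p_x)  # marginal over the weather: {'sun': 0.5, 'rain': 0.5}
```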
Entropy
• The information gained by observing an event of probability p is

I = log2(1/p) = −log2 p (1)
Entropy
Why the Logarithm?
• The logarithmic measure is justified by the desire for
information to be additive. Independent events have
probabilities that multiply, and the logarithm turns that
product into a sum, so the algebra of our measure reflects
the probability rules.
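The additivity can be seen directly: for two independent events the joint probability is the product p1·p2, and −log2 turns that product into a sum of the individual information values. A small sketch (the function name is mine):

```python
import math

def self_information(p):
    """Surprise (in bits) of an event with probability p: I = -log2(p)."""
    return -math.log2(p)

# Two independent events (hypothetical probabilities).
p1, p2 = 0.5, 0.25

# Information of the joint event vs. the sum of the individual amounts.
joint = self_information(p1 * p2)                       # -log2(0.125) = 3 bits
separate = self_information(p1) + self_information(p2)  # 1 + 2 = 3 bits
print(joint, separate)  # 3.0 3.0
```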
Information Measure
Entropy of Ensembles
• Solution:

H(X|Y) = − Σx,y p(x, y) log2 p(x|y) (5)

This measures the average uncertainty that remains about X when Y is known.
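The conditional entropy of equation (5) can be evaluated on a toy ensemble. A sketch, with hypothetical probabilities for two binary variables:

```python
import math

# Joint distribution p(x, y) over two binary variables (hypothetical values).
p_xy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.5, (1, 1): 0.0}

# Marginal p(y), needed to form p(x|y) = p(x, y) / p(y).
p_y = {}
for (x, y), p in p_xy.items():
    p_y[y] = p_y.get(y, 0.0) + p

# H(X|Y) = - sum over x, y of p(x, y) * log2 p(x|y);
# zero-probability terms contribute nothing and are skipped.
h_x_given_y = 0.0
for (x, y), p in p_xy.items():
    if p > 0:
        h_x_given_y -= p * math.log2(p / p_y[y])

print(h_x_given_y)  # about 0.689 bits of uncertainty about X remain given Y
```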
Chain Rule for Entropy
• H(X, Y) = H(X) + H(Y|X): the joint uncertainty of X and Y is the
uncertainty of X plus whatever uncertainty about Y remains once X is known.
Mutual Information
Venn Diagram
• In the Venn diagram, the portion of H(X) that does not lie
within I(X;Y) is just H(X|Y). The portion of H(Y) that
does not lie within I(X;Y) is just H(Y|X).
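These Venn-diagram relations can be verified numerically: the overlap is I(X;Y) = H(X) + H(Y) − H(X,Y), and the part of H(X) outside it is H(X|Y) = H(X,Y) − H(Y). A sketch with a hypothetical joint distribution:

```python
import math

def H(dist):
    """Shannon entropy (bits) of a distribution {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical joint distribution over two binary variables; the marginals
# below are its row and column sums.
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.5, 1: 0.5}

# Overlap of the two circles: I(X;Y) = H(X) + H(Y) - H(X,Y).
i_xy = H(p_x) + H(p_y) - H(p_xy)

# Part of H(X) outside the overlap: H(X|Y) = H(X,Y) - H(Y).
h_x_given_y = H(p_xy) - H(p_y)

# The two pieces tile the H(X) circle: I(X;Y) + H(X|Y) = H(X).
print(i_xy, h_x_given_y, H(p_x))
```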
Venn Diagram
Fig. Mutual information’s relationship with joint entropy and conditional entropy.
Let’s Conclude
The relationships between entropy, joint entropy, conditional entropy, and mutual
information. The width of each bar represents its size (in bits).
THANKS
☺