Digital Communication Intro2

Digital communication systems transmit digital data over an analog channel. They involve modulating an analog carrier with a digital signal. Three key aspects are: 1. Information capacity is a measure of how much information can be transmitted through the system as a function of bandwidth and noise. It represents the maximum number of independent symbols that can be sent per unit of time. 2. Shannon's source coding and channel coding theorems establish the theoretical limits of lossless data compression and reliable communication over noisy channels. 3. Digital systems use a binary interface between the source and channel to simplify implementation and allow separation of source and channel coding according to Shannon's separation theorem.


Digital Communication
- Includes systems in which relatively high-frequency analog carriers are modulated by relatively low-frequency digital information (digital radio), as well as systems involving the transmission of digital pulses (digital transmission).
- Refers to communication systems that use such a digital (typically binary) sequence as an interface between the source and the channel input, and similarly between the channel output and the final destination.

Simplified Block Diagram of a Digital Radio System
- The source encoder converts the source output to a binary sequence.
- The channel encoder (often called a modulator) processes the binary sequence for transmission over the channel.
- The channel decoder (demodulator) recreates the incoming binary sequence (hopefully reliably).
- The source decoder recreates the source output.
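
The chain above can be illustrated with a minimal Python sketch (not part of the original notes). The function names, the trivial bit-to-±1 "modulation", and the noiseless channel are illustrative assumptions only:

```python
# Toy sketch of the digital radio chain:
# source encoder -> channel encoder (modulator) -> channel -> channel decoder -> source decoder.

def source_encode(text: str) -> list[int]:
    """Source encoder: convert the source output (text) to a binary sequence."""
    return [int(b) for ch in text.encode("ascii") for b in format(ch, "08b")]

def channel_encode(bits: list[int]) -> list[int]:
    """Channel encoder / modulator: map bits {0,1} to symbols {-1,+1}."""
    return [1 if b else -1 for b in bits]

def channel(symbols: list[int]) -> list[int]:
    """Ideal (noiseless) channel: delivers the symbols unchanged."""
    return symbols

def channel_decode(symbols: list[int]) -> list[int]:
    """Channel decoder / demodulator: recreate the incoming binary sequence."""
    return [1 if s > 0 else 0 for s in symbols]

def source_decode(bits: list[int]) -> str:
    """Source decoder: recreate the source output from the binary sequence."""
    data = bytes(int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8))
    return data.decode("ascii")

message = "hello"
received = source_decode(channel_decode(channel(channel_encode(source_encode(message)))))
print(received)  # -> "hello"
```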

Why have communication systems become digital?
• Digital hardware has become so cheap, reliable, and miniaturized that digital interfaces are eminently practical.
• A standardized binary interface between source and channel simplifies implementation and understanding.
• A standardized binary interface between source and channel simplifies networking, which now reduces to sending binary sequences through the network.
• One of the most important of Shannon's information-theoretic results is that if a source can be transmitted over a channel in any way at all, it can be transmitted using a binary interface between source and channel. This is known as the source/channel separation theorem.

Information
- The knowledge or information that is communicated between two or more points.

Information Theory
- A highly theoretical study of the efficient use of bandwidth to propagate information through an electronic communication system.
- Used to determine the information capacity of a digital communication system.
- It should provide answers to two fundamental questions:
  1. What is the irreducible complexity below which a signal cannot be compressed?
  2. What is the ultimate rate for reliable communication over a noisy channel?

Information Capacity
- A measure of how much information can be propagated through a communication system; it is a function of the bandwidth and the transmission time.
- Represents the number of independent symbols that can be carried through a system in a given unit of time.
- Expressed as a bit rate.

Bit rate
- The number of bits transmitted during one second, expressed in bits per second (bps).

Hartley's Law: states the information capacity of a noiseless (ideal) channel:

C = 2 × BW × log2(n)

Where: C = information capacity (bps)
       BW = channel bandwidth (Hz)
       n = number of coding levels (n = 2 for binary, n = 8 for octal, n = 10 for decimal)
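
A quick numeric check of Hartley's law; the channel bandwidth and coding levels below are illustrative assumptions, not values from the notes:

```python
from math import log2

# Hartley's law: C = 2 * BW * log2(n) for a noiseless channel.
BW = 3100                                # assumed channel bandwidth in Hz (voice-grade line)
for n in (2, 8, 16):                     # number of coding levels
    C = 2 * BW * log2(n)                 # information capacity in bits per second
    print(f"n = {n:2d}: C = {C:,.0f} bps")
# n =  2: C = 6,200 bps
# n =  8: C = 18,600 bps
# n = 16: C = 24,800 bps
```

Doubling the number of levels adds one bit per symbol, which is why capacity grows only logarithmically in n.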

Claude E. Shannon – Father of Information Theory

Shannon Noisy-Channel Coding Theorem
- States that reliable communication is possible over a noisy channel provided that the rate of communication is below a certain threshold called the channel capacity.

Channel Capacity
- Achieved with an appropriate encoding and decoding technique.

Shannon Source Coding Theorem
- Establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by the entropy.

Entropy
- Defined in terms of the probabilistic behavior of a source of information.
- If the entropy of a source is less than the capacity of the channel, then error-free communication over the channel can be achieved.

Shannon Limit for Information Capacity:

C = BW × log2(1 + S/N)

Where: C = information capacity (bps)
       BW = channel bandwidth (Hz)
       S/N = signal-to-noise ratio (as an absolute power ratio, not in dB)
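
A short calculation using the Shannon limit; the bandwidth and signal-to-noise ratio below are illustrative assumptions, not values from the notes:

```python
from math import log2

# Shannon limit: C = BW * log2(1 + S/N), with S/N as an absolute ratio (not dB).
BW = 3100                    # assumed channel bandwidth in Hz
snr_db = 30                  # assumed signal-to-noise ratio in dB
snr = 10 ** (snr_db / 10)    # convert dB to an absolute power ratio: 1000
C = BW * log2(1 + snr)       # maximum rate for reliable communication, bits per second
print(f"S/N = {snr:.0f} (absolute), C = {C:,.0f} bps")
# S/N = 1000 (absolute), C = 30,898 bps
```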

Source Coding Theorem
- Source encoding is the process by which the representation of the data generated by a discrete source is accomplished; the device that performs it is known as the source encoder.
- For the encoder to be efficient:
  1. A variable-length code is used, where:
     a. short code words are assigned to frequent source symbols
     b. long code words are assigned to rare source symbols
  2. It must satisfy two functional requirements:
     a. The code words produced by the encoder are in binary form.
     b. The source code is uniquely decodable, so that the original source sequence can be reconstructed perfectly from the encoded binary sequence.

Source Encoding Block Diagram:
Discrete memoryless source --(sk)--> Source encoder --(bk)--> Binary sequence

Source Coding Theorem:
The average codeword length is

L = Σ (k = 0 to K−1) pk lk     (1)

Given a discrete memoryless source of entropy H(S), the average codeword length L for any distortionless source encoding is bounded as:

L ≥ H(S)     (2)

Where: L = average codeword length
       pk = probability of the kth symbol
       sk = the kth source symbol
       lk = length (in bits) of the code word assigned to symbol sk
       K = number of different symbols of the source
       S = the source alphabet
       H(S) = entropy of the source
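
A small worked check of equations (1) and (2), using an assumed four-symbol source and an assumed prefix code; the probabilities and code words are illustrative, not taken from the notes:

```python
from math import log2

# Assumed discrete memoryless source with K = 4 symbols and an assumed prefix code.
symbols = ["s0", "s1", "s2", "s3"]
p = [0.5, 0.25, 0.125, 0.125]                              # probabilities, sum to 1
code = {"s0": "0", "s1": "10", "s2": "110", "s3": "111"}   # uniquely decodable prefix code

# Equation (1): average codeword length L = sum of pk * lk
L = sum(pk * len(code[sk]) for sk, pk in zip(symbols, p))

# Source entropy, as in equation (6) later in the notes: H = sum of pk * log2(1/pk)
H = sum(pk * log2(1 / pk) for pk in p)

print(f"L = {L} bits/symbol, H(S) = {H} bits/symbol")      # L = 1.75, H(S) = 1.75
# Equation (2): L >= H(S); here the code meets the bound with equality because
# every probability is a negative power of two.
```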

Discrete Memoryless Source:
- A source that emits statistically independent symbols during successive signaling intervals.
- The symbol emitted at any time is independent of the previous choices.

Probabilistic Experiment:
- Involves the observation of the output emitted by a discrete source during every unit of time (signaling interval).
- Given the probabilities:

P(S = sk) = pk,  k = 0, 1, …, K−1     (3)

- The probabilities must satisfy the condition:

Σ (k = 0 to K−1) pk = 1     (4)

- If the probability pk = 1 and pi = 0 for all i ≠ k, then there is no surprise and therefore no information when symbol sk is emitted.
- If the source symbols occur with different probabilities and the probability pk is low, then there is more surprise when the symbol sk is emitted by the source.
- Before the event S = sk occurs there is uncertainty; when the event S = sk occurs there is an amount of surprise and a gain of information.
- Thus the amount of information is related to the inverse of the probability of occurrence:

I(sk) = log(1/pk)     (5)

Where: I(sk) = amount of information gained after observing the event S = sk
       pk = probability of the symbol sk
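
A brief numeric illustration of equation (5), using a base-2 logarithm so the information is measured in bits; the probabilities are illustrative assumptions. It also checks the additivity property for independent events (property 4 in the list that follows):

```python
from math import log2

def info(p: float) -> float:
    """Equation (5) with a base-2 logarithm: information (in bits) gained
    when a symbol of probability p is observed."""
    return log2(1 / p)

print(info(1.0))    # 0.0  -> a certain event carries no information
print(info(0.5))    # 1.0  -> one bit
print(info(0.125))  # 3.0  -> rarer events carry more information

# For statistically independent sk and sl:
# I(sk sl) = log2(1/(pk*pl)) = I(sk) + I(sl)
print(info(0.5 * 0.125), info(0.5) + info(0.125))   # 4.0 4.0
```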

Properties of Equation 5:
1. If the outcome of an event is certain, even before it occurs, no information is gained.
2. The occurrence of an event S = sk either provides some information or none, but it never brings about a loss of information.
3. The less probable the event is, the more information is gained.
4. I(sk sl) = I(sk) + I(sl) if sk and sl are statistically independent.

The Entropy H(S)
- Measures the average information content per source symbol.
- S denotes the source alphabet, or the label for the source.

H(S) = Σ (k = 0 to K−1) pk log2(1/pk)     (6)

Some Properties of Entropy
1. The entropy of a discrete memoryless source is bounded as follows: 0 ≤ H(S) ≤ log2 K.
2. H(S) = 0 if and only if the probability pk = 1 for some k and the remaining probabilities in the set are all zero; this lower bound on entropy corresponds to no uncertainty.
3. H(S) = log2 K if and only if pk = 1/K for all k (all the symbols in the alphabet S are equiprobable); this upper bound on entropy corresponds to maximum uncertainty.
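
A quick check of these bounds with equation (6); this is a minimal sketch, and the example distributions are assumptions chosen for illustration:

```python
from math import log2

def entropy(p):
    """Equation (6): H(S) = sum over k of pk * log2(1/pk), in bits per symbol.
    Terms with pk = 0 are skipped (p * log2(1/p) tends to 0 as p -> 0)."""
    return sum(pk * log2(1 / pk) for pk in p if pk > 0)

K = 4
certain = [1.0, 0.0, 0.0, 0.0]   # property 2: one symbol is certain
uniform = [1 / K] * K            # property 3: all symbols equiprobable
skewed  = [0.5, 0.25, 0.125, 0.125]

print(entropy(certain))   # 0.0            -> lower bound, no uncertainty
print(entropy(uniform))   # 2.0 = log2(4)  -> upper bound, maximum uncertainty
print(entropy(skewed))    # 1.75           -> strictly between the two bounds
```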
