Digital Communication Intro2
Bit rate
- The number of bits transmitted during one second, expressed in bits per second (bps). For example, a link that transfers 64,000 bits each second operates at 64 kbps.
Why have communication systems become digital?

• Digital hardware has become so cheap, reliable, and miniaturized that digital interfaces are eminently practical.

• A standardized binary interface between source and channel simplifies implementation and understanding.

• A standardized binary interface between source and channel simplifies networking, which now reduces to sending binary sequences through the network.

One of the most important of Shannon's information-theoretic results is that if a source can be transmitted over a channel in any way at all, it can be transmitted using a binary interface between source and channel. This is known as the source/channel separation theorem.

Shannon Noisy-Channel Coding Theorem
- States that reliable communication is possible over a noisy channel, provided that the rate of communication is below a certain threshold called the channel capacity.

Channel Capacity
- The maximum rate at which reliable communication is possible over a channel; it is achieved with appropriate encoding and decoding techniques.

Shannon Source Coding Theorem
- Establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy.

Entropy
- Defined in terms of the probabilistic behavior of a source of information.
- If the entropy of a source is less than the capacity of the channel, then error-free communication over the channel can be achieved.
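Entropy can be computed directly from the symbol probabilities. Below is a minimal Python sketch; the four-symbol source and its probabilities are made-up values for illustration:

    import math

    def entropy(probs):
        # Average information per symbol: H = -sum(p_k * log2(p_k)),
        # skipping zero-probability terms, which contribute nothing.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits/symbol
    print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0  (a certain symbol: no information)
    print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0  (equally likely symbols: maximum)

By the source coding theorem, no uniquely decodable code for the first source can do better than 1.75 bits per symbol on average.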
Information
- The knowledge or intelligence that is communicated between two or more points.

Information Theory
- A highly theoretical study of the efficient use of bandwidth to propagate information through electronic communication systems.
- Used to determine the information capacity of a digital communication system.

Shannon Limit for Information Capacity:

C = BW log2 (1 + S/N)

Where: C = information capacity (bits per second)
       BW = channel bandwidth (hertz)
       S/N = signal-to-noise power ratio (absolute value, not in dB)
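A short worked example of the formula in Python; the 2.7 kHz bandwidth and 30 dB S/N below are made-up values, roughly typical of a voice-grade telephone channel:

    import math

    def shannon_capacity(bw_hz, snr_abs):
        # Shannon limit: C = BW * log2(1 + S/N), with S/N as an absolute ratio.
        return bw_hz * math.log2(1 + snr_abs)

    snr_db = 30.0
    snr_abs = 10 ** (snr_db / 10)            # 30 dB -> absolute ratio of 1000
    print(shannon_capacity(2700, snr_abs))   # ~26,900 bits per second

Note the dB-to-absolute conversion on the second-to-last line: the formula requires the raw power ratio, as the gloss above warns.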
Source Coding Theorem
- The process by which an efficient representation of the source data is accomplished; it is performed by a device known as the source encoder.

For the encoder to be efficient:
1. A variable-length code is used, where:
   a. short code words are assigned to frequent source symbols
   b. long code words are assigned to rare source symbols
2. It must satisfy two functional requirements:
   a. The code words produced by the encoder are in binary form.
   b. The source code is uniquely decodable, so that the original source sequence can be reconstructed perfectly from the encoded binary sequence.
(A Huffman code, sketched after the block diagram below, meets all of these requirements.)

The source symbol probabilities p_k must satisfy the condition:

∑_{k=0}^{K−1} p_k = 1    (4)

where:
- If the probability p_k = 1 and p_i = 0 for all i ≠ k, then there is no surprise, and therefore no information, when symbol s_k is emitted.
- If the source symbols occur with different probabilities and the probability p_k is low, then there is more surprise on the symbol s_k when it is emitted by the source.
- Before the event S = s_k occurs there is uncertainty; when the event S = s_k occurs there is an amount of surprise, and hence a gain of information.
- Thus the amount of information is related to the inverse of the probability of occurrence.
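That inverse relationship is conventionally made precise as the self-information I(s_k) = log2(1/p_k) bits; this formula is standard information theory, though it is not stated explicitly above. A minimal Python sketch with made-up probabilities:

    import math

    def self_information(p):
        # I = log2(1/p) bits: the lower the probability, the greater the surprise.
        return math.log2(1 / p)

    print(self_information(1.0))   # 0.0   (certain symbol: no surprise, no information)
    print(self_information(0.5))   # 1.0   (a fair coin flip conveys one bit)
    print(self_information(0.01))  # ~6.64 (a rare symbol conveys much more)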
Source Encoding Block Diagram:
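(The diagram itself is not reproduced here.) As a concrete instance of a source encoder meeting the efficiency requirements listed above, here is a sketch of Huffman coding in Python; the four-symbol source is a made-up example, and Huffman coding is one standard technique, not necessarily the one these notes had in mind:

    import heapq
    import itertools

    def huffman_code(probs):
        # Build a binary Huffman code: frequent symbols get short code words,
        # rare symbols get long ones, and the result is prefix-free.
        counter = itertools.count()  # tie-breaker so heapq never compares dicts
        heap = [(p, next(counter), {sym: ""}) for sym, p in probs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            p1, _, code1 = heapq.heappop(heap)
            p2, _, code2 = heapq.heappop(heap)
            # Prepend a bit distinguishing the two merged subtrees.
            merged = {s: "0" + c for s, c in code1.items()}
            merged.update({s: "1" + c for s, c in code2.items()})
            heapq.heappush(heap, (p1 + p2, next(counter), merged))
        return heap[0][2]

    # Hypothetical four-symbol source (probabilities sum to 1).
    probs = {"s0": 0.5, "s1": 0.25, "s2": 0.125, "s3": 0.125}
    print(huffman_code(probs))  # e.g. {'s0': '0', 's1': '10', 's2': '110', 's3': '111'}

The resulting code is prefix-free, hence uniquely decodable, and its average code-word length (0.5·1 + 0.25·2 + 0.125·3 + 0.125·3 = 1.75 bits/symbol) equals the entropy of this source computed earlier.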