Notes
- Transmission, reception, and processing of information with the use of electronic circuits.
Digital Modulation
- Analog carrier signals that are modulated by digital information.
- Also known as digital radio.
Digital Transmission
- Medium may be a physical cable or free space.
- Any physical facility may be used.
Output Transducer
- Converts the signal into the desired output format, which may be analog or digital.
Channel
- The physical medium over which the signal is sent; wired channels include telephone lines.
Source Encoder
- Converts the source signal into a digital signal.
Data Compression
- Efficiently converting the output of either an analog or a digital source into a sequence of binary digits.
Source Decoder
- Attempts to decode the received sequence back into the original source output.
Channel Encoder
- Introduces some redundancy into the binary information sequence to overcome the effects of noise and interference.
Channel Decoder
- Attempts to reconstruct the original information sequence.
Digital Modulator
- Maps the binary information sequence into signal waveforms suitable for transmission.
Digital Demodulator
- Reduces the received waveforms back to a sequence of numbers that estimate the transmitted data symbols.
Digital Modulation
Amplitude Shift Keying - amplitude of the carrier is varied.
Frequency Shift Keying - frequency of the carrier is varied.
Phase Shift Keying - phase of the carrier is varied.
Quadrature Amplitude Modulation (QAM) - both the amplitude and the phase of the carrier are varied.
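The three basic keying schemes above can be sketched in a few lines. The carrier frequency, sample rate, and the ASK amplitude levels below are illustrative assumptions, not values from the notes.

```python
import numpy as np

def modulate(bits, scheme, fc=4.0, fs=100, bit_dur=1.0):
    """Generate a digitally modulated analog carrier for a bit sequence.

    scheme: 'ask', 'fsk', or 'psk'. All parameter values are illustrative.
    """
    t = np.arange(0, bit_dur, 1 / fs)  # sample times for one bit period
    out = []
    for b in bits:
        if scheme == "ask":            # amplitude carries the bit
            out.append((1.0 if b else 0.2) * np.sin(2 * np.pi * fc * t))
        elif scheme == "fsk":          # frequency carries the bit
            f = fc * 2 if b else fc
            out.append(np.sin(2 * np.pi * f * t))
        elif scheme == "psk":          # phase carries the bit
            phase = 0.0 if b else np.pi
            out.append(np.sin(2 * np.pi * fc * t + phase))
    return np.concatenate(out)

wave = modulate([1, 0, 1], "psk")
print(wave.shape)  # 3 bits x 100 samples each -> (300,)
```

QAM would extend the same idea by letting each symbol pick both an amplitude and a phase, so more than one bit rides on each symbol.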
Advantages of digital
● Ease of regeneration
● Noise Immunity
● Ease of Transmission
● Use of modern technology
● Ease of encryption
Disadvantages of digital
● Requires greater bandwidth
● Requires reliable system synchronization
● Requires A/D conversion at a high rate
● Nongraceful degradation
Performance Criteria
● Probability of error, or Bit Error Rate (BER)
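Bit Error Rate is simply the fraction of received bits that differ from the transmitted ones; a minimal sketch:

```python
def bit_error_rate(sent, received):
    """Fraction of bit positions that differ between two equal-length sequences."""
    assert len(sent) == len(received)
    errors = sum(s != r for s, r in zip(sent, received))
    return errors / len(sent)

print(bit_error_rate([1, 0, 1, 1], [1, 1, 1, 0]))  # 2 errors in 4 bits -> 0.5
```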
Information Theory
- Theoretical study of the efficient use of bandwidth to propagate information through
electronic communications systems.
- To determine the information capacity of data communication systems.
Information Capacity
- Measure of how much information can be propagated through a communications system.
Hartley’s Law
- Developed by R. Hartley of Bell Telephone Laboratories in 1928.
- States that the information capacity is proportional to bandwidth and transmission time (I ∝ B × t).
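Hartley's law (I ∝ B × t) can be sketched directly; the proportionality constant k below is an illustrative assumption, since the law only states a proportionality:

```python
def hartley_capacity(bandwidth_hz, time_s, k=1.0):
    """Hartley's law: I = k * B * t, with k an assumed system-dependent constant."""
    return k * bandwidth_hz * time_s

# Doubling either bandwidth or transmission time doubles the information sent.
print(hartley_capacity(3000, 1))  # 3000.0
print(hartley_capacity(3000, 2))  # 6000.0
```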
M-ary Encoding
- The term M-ary is derived from the word binary; M represents the number of possible conditions or levels.
- It is often advantageous to encode at a level higher than binary.
Baud
- Rate of change of a signal on the transmission medium after encoding and modulation
have occurred.
Nyquist Bandwidth
- Minimum theoretical bandwidth necessary to propagate a signal
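Assuming the usual Nyquist relation f_b = 2B log2 M for M-ary signaling (so the minimum bandwidth is B = f_b / (2 log2 M), and the baud rate is f_b / log2 M), a quick check:

```python
import math

def nyquist_min_bandwidth(bit_rate, M):
    """Minimum theoretical (Nyquist) bandwidth for an M-ary signal:
    B = f_b / (2 * log2(M))."""
    return bit_rate / (2 * math.log2(M))

def baud(bit_rate, M):
    """Symbol rate after M-ary encoding: baud = f_b / log2(M)."""
    return bit_rate / math.log2(M)

# 9600 bps with 4 levels (2 bits per symbol):
print(nyquist_min_bandwidth(9600, 4))  # 2400.0 Hz
print(baud(9600, 4))                   # 4800.0 symbols per second
```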
ENTROPY
- Measures disorder or randomness in a system
Clausius Entropy
- Extensive quantity that can’t be measured directly.
- Measure of disorder and randomness in a physical system.
Boltzmann-Gibbs Entropy
- Gibbs extended the concept of Boltzmann entropy to cases where the microstates are not equally likely.
Shannon Entropy
- Measures the uncertainty of a probability distribution.
- Related to the maximum possible data compression of a source.
Properties of Shannon-Entropy
1. Non-Negativity
2. Maximum Entropy for Uniform Distribution
3. Additivity
4. Concavity
5. Subadditivity
6. Continuity
7. Extensibility
8. Monotonicity - for equally likely outcomes, entropy increases monotonically with the number of outcomes.
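Shannon entropy and two of the properties above (non-negativity, maximum for a uniform distribution) can be checked numerically:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4           # four equally likely outcomes
skewed = [0.7, 0.1, 0.1, 0.1]  # same outcomes, unequal probabilities

print(shannon_entropy(uniform))  # 2.0 bits -- the maximum for 4 outcomes
print(shannon_entropy(skewed))   # below 2.0, but never negative
```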
Source Coding
- Method of efficiently representing the information produced by a source.
Huffman Coding
- A variable-length coding algorithm that assigns shorter codes to more probable symbols.
Steps:
1. List all the items.
2. List the probability of each item, respectively.
3. Take the two least probable entries and add their probabilities.
4. Repeat step 3 until you reach a probability equal to 1.
5. Read the binary bits from right to left, not including the 1.
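The steps above can be sketched with a min-heap, which always surfaces the two least probable entries; the symbols and probabilities below are made-up examples:

```python
import heapq

def huffman_codes(freqs):
    """Build a Huffman code from {symbol: probability}.

    Repeatedly merges the two least probable nodes (steps 3-4 above); the
    bits are accumulated from the final merge back toward each symbol,
    matching the right-to-left read-off in step 5.
    """
    # Heap entries: (probability, tie-breaker, {symbol: code-so-far})
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # least probable
        p2, _, c2 = heapq.heappop(heap)  # second least probable
        # Prefix one branch with '0' and the other with '1', then merge.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman_codes({"a": 0.5, "b": 0.25, "c": 0.15, "d": 0.10})
print(codes)  # code lengths 1, 2, 3, 3; exact bit patterns may vary
```

More probable symbols get shorter codes, and no code is a prefix of another, so the bit stream can be decoded unambiguously.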
Block Code
- Efficient at error detection but not at error correction.
Codewords - a group or block of symbols.
Hamming Code
- Hamming Distance - the number of bit positions at which two codewords differ.
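Hamming distance as defined above, for two equal-length codewords:

```python
def hamming_distance(a, b):
    """Number of bit positions at which two equal-length codewords differ."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

print(hamming_distance("1011101", "1001001"))  # differs in 2 positions -> 2
```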