
Electronic communications

- Transmission, reception, and processing of information with the use of electronic circuits.

Digital Modulation
- Transmittal of digitally modulated analog signals (carriers).
- Also known as digital radio.

Digital Transmission
- Transmittal of digital pulses between two or more points.
- Uses any physical facility, such as a metallic cable or free space.

Elements of Digital Communication

Information Source and Input Transducer


- The information source can be analog or digital.
- The signal produced is converted into a digital signal consisting of 1’s and 0’s.

Output Transducer
- Converts the signal into the desired format (analog or digital) at the output.
Channel
- The physical medium that carries the signal; wired channels include telephone lines.

Source Encoder
- Converts the source signal into a digital signal.

Data Compression
- Efficiently converting the output of either an analog or a digital source into a sequence
of binary digits.

Source Decoder
- Attempts to decode the received binary sequence back into the original source signal.

Channel Encoder
- Introduces redundancy into the binary information sequence to overcome the effects of
noise and interference.

Channel Decoder
- Attempts to reconstruct the original information sequence

Digital Modulator
- Maps the binary information sequence into signal waveforms for transmission.

Digital Demodulator
- Reduces the received waveform to a sequence of binary digits (an estimate of the
transmitted data).

Digital Modulation
Amplitude Shift Keying (ASK) - the amplitude of the carrier is varied.
Frequency Shift Keying (FSK) - the frequency of the carrier is varied.
Phase Shift Keying (PSK) - the phase of the carrier is varied.
Quadrature Amplitude Modulation (QAM) - both the amplitude and the phase of the carrier are varied.
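
A minimal numpy sketch of the three basic keying schemes (the carrier frequency, frequency shift, bit rate, and sample rate below are arbitrary illustrative values, not from the notes):

import numpy as np

fs = 10_000                  # sample rate, Hz (assumed)
fc = 1_000                   # carrier frequency, Hz (assumed)
bit_rate = 100               # bits per second (assumed)
bits = [1, 0, 1, 1, 0]

samples_per_bit = fs // bit_rate
t = np.arange(len(bits) * samples_per_bit) / fs
b = np.repeat(bits, samples_per_bit)    # hold each bit for its full duration

ask = b * np.sin(2 * np.pi * fc * t)            # amplitude varies with the bit
fsk = np.sin(2 * np.pi * (fc + 200 * b) * t)    # frequency shifts 200 Hz for a 1 (phase-discontinuous, for illustration)
psk = np.sin(2 * np.pi * fc * t + np.pi * b)    # phase flips 180 degrees for a 1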

Advantages of digital
● Ease of regeneration
● Noise Immunity
● Ease of Transmission
● Use of modern technology
● Ease of encryption

Disadvantages of digital
● Requires greater bandwidth
● Requires reliable system synchronization
● Requires A/D conversions at a high rate
● Nongraceful degradation

Performance criteria
● Probability of error, or Bit Error Rate (BER)

Information Theory
- The theoretical study of the efficient use of bandwidth to propagate information through
electronic communications systems.
- Used to determine the information capacity of data communication systems.

Information Capacity
- A measure of how much information can be propagated through a communications system.

Bit - the most basic unit used to represent information; short for binary digit.


Bit Rate - the number of bits transmitted during one second; expressed in bits per second (bps).

Hartley’s Law
- Developed by R. Hartley of Bell Telephone Laboratories in 1928.
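
Hartley’s law is commonly stated as follows (a standard formulation, added for reference): the amount of information that can be transmitted is proportional to the bandwidth and the transmission time,

I ∝ B × t

where I = information, B = channel bandwidth (Hz), and t = transmission time (s).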

Shannon’s Limit for Information Capacity
- Developed by Claude E. Shannon of Bell Telephone Laboratories in 1948.
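
The Shannon limit is normally written as (a standard result, restated here for reference):

I = B log2(1 + S/N)   or equivalently   I ≈ 3.32 B log10(1 + S/N)

where I = information capacity (bps), B = bandwidth (Hz), and S/N = signal-to-noise power ratio (unitless).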

Signal-to-Noise ratio required for an ideal channel
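
Rearranging Shannon’s formula gives the signal-to-noise ratio an ideal channel needs to reach a given capacity:

S/N = 2^(I/B) − 1

For example, a spectral efficiency of I/B = 4 bps/Hz requires S/N = 2^4 − 1 = 15, or about 11.8 dB.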

M-ary Encoding
- The term M-ary is derived from the word binary; M represents the number of conditions or levels possible.
- It is often advantageous to encode at a level higher than binary.
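
The standard relation between bits and conditions (added for reference):

N = log2 M

where N = number of bits and M = number of conditions, levels, or combinations possible with N bits; e.g., N = 3 bits give M = 2^3 = 8 levels.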

Baud
- Rate of change of a signal on the transmission medium after encoding and modulation
have occurred.
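
Baud can also be written as the reciprocal of the time of one output signaling element (a standard definition consistent with the one above):

baud = 1 / ts

where ts = time of one signaling element (s). Since one signaling element may carry more than one bit, the baud is at most equal to the bit rate.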

Nyquist Bandwidth
- Minimum theoretical bandwidth necessary to propagate a signal
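
For multilevel signaling, the Nyquist relation for a noiseless channel is commonly given as:

fb = 2 B log2 M

so the minimum Nyquist bandwidth is B = fb / (2 log2 M), where fb = channel capacity (bps), B = bandwidth (Hz), and M = number of signal levels.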
Entropy
- Measures disorder or randomness in a system.

Clausius Entropy
- Extensive quantity that can’t be measured directly.
- Measure of disorder and randomness in a physical system.

Boltzmann-Gibbs Entropy
- Gibbs extended the concept of Boltzmann entropy to cases where the microstates are not
equally likely.

Shannon Entropy
- Relates a probability distribution to the maximum possible data compression.
- Measures the uncertainty of a probability distribution.
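
The standard definition (added for reference): for a discrete source with symbol probabilities p(xi),

H(X) = − Σ p(xi) log2 p(xi)

in bits per symbol. A fair coin has H = 1 bit; any biased coin has less.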

Properties of Shannon Entropy
1. Non-negativity
2. Maximum entropy for a uniform distribution
3. Additivity
4. Concavity
5. Subadditivity
6. Continuity
7. Extensibility
8. Monotonicity - probabilities should be less than 1 but greater than 0.

Source Coding
- A method of efficiently representing the information produced by a source.

Source Coding Theorem
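
A common statement of the theorem (standard result, added for completeness): a source with entropy H(X) can be losslessly encoded with an average codeword length L̄ satisfying

H(X) ≤ L̄ < H(X) + 1

so entropy is the lower bound on compression.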


Assumptions
● The data is noiseless
● The source has an infinite number of elements
● The statistics of the source are known
Goal
● The representation is optimal
● Represent the source at the minimum rate
Intuition
● Assign shorter (longer) codewords to frequent (rare) source symbols.

Huffman Coding
- A variable-length coding algorithm that assigns shorter codewords to more probable symbols.

Steps (a minimal code sketch follows the steps):
1. List all the items.
2. List the probability of each item.
3. Take the two smallest probabilities and add them.
4. Repeat step 3 until the combined probability equals 1.
5. Read the binary bits from right to left, not including the 1.
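
A minimal Python sketch of the steps above, using a heap to repeatedly combine the two least probable nodes (the example probabilities are assumed, not from the notes):

import heapq

def huffman_codes(probabilities):
    # Heap entries are (probability, tie_breaker, tree); a tree is either a
    # symbol (leaf) or a (left, right) pair (internal node).
    heap = [(p, i, sym) for i, (sym, p) in enumerate(probabilities.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Step 3: take the two least probable nodes and combine them
        p1, _, left = heapq.heappop(heap)
        p2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, count, (left, right)))
        count += 1
    # Read codewords off the tree: 0 for one branch, 1 for the other
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"
    walk(heap[0][2], "")
    return codes

print(huffman_codes({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1}))
# {'A': '0', 'B': '10', 'D': '110', 'C': '111'} - frequent symbols get shorter codewords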

Block Code
- Efficient in error detection but not in error correction.
Codewords - a group or block of symbols.

Systematic Block Code
- The message and parity bits are arranged systematically: the message bits appear unchanged in the codeword, followed by the parity bits.

Non-Systematic Block Code
- The message and parity bits are not arranged systematically; the message bits do not appear as a contiguous block in the codeword.

Block code (n, k) parameters
● Total codewords required as per n block code bits = 2^n
● Total codewords required as per k information bits = 2^k
● Total redundant codewords required as per r parity bits = 2^n − 2^k
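
A quick check of these counts in Python for an assumed (7, 4) block code (the specific n and k are illustrative):

n, k = 7, 4                    # codeword length and message length (assumed)
r = n - k                      # parity bits
total = 2 ** n                 # all possible n-bit words: 128
valid = 2 ** k                 # valid codewords: 16
redundant = 2 ** n - 2 ** k    # words that are not valid codewords: 112
print(total, valid, redundant)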

Block Code for Parity Check
- A simple form of error control.
- Computed using the exclusive-OR (XOR) of the message bits.
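
A minimal sketch of even-parity generation and checking with XOR (the message bits are an arbitrary example):

from functools import reduce
from operator import xor

def even_parity_bit(bits):
    # XOR of all bits is 1 when the count of 1s is odd, so appending it
    # makes the total number of 1s even
    return reduce(xor, bits, 0)

message = [1, 0, 1, 1, 0, 1, 0]                  # assumed example message
codeword = message + [even_parity_bit(message)]
# Receiver check: XOR over the whole codeword is 0 if no single-bit error occurred
assert reduce(xor, codeword, 0) == 0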

Hamming Code
- Hamming Distance - the number of bit positions at which two codewords differ.

Minimum Hamming distance - dmin
- The smallest Hamming distance between any pair of codewords in the encoding scheme.
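
A short sketch computing pairwise Hamming distances and dmin for an assumed set of codewords (by the standard relations, a code can detect dmin − 1 errors and correct ⌊(dmin − 1)/2⌋):

from itertools import combinations

def hamming_distance(a, b):
    # Count the bit positions where two equal-length codewords differ
    return sum(x != y for x, y in zip(a, b))

codewords = ["0000000", "1011010", "0101101", "1110111"]   # assumed example
dmin = min(hamming_distance(a, b) for a, b in combinations(codewords, 2))
print(dmin)   # 4 for this example: detects up to 3 errors, corrects 1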
