Introduction To Digital Communications
Digital communications may be defined as the process of transmitting digital information from a source to a sink. A digital waveform is a function used to represent digital information and has a discrete set of values. However, this does not mean that digital communications deals only with digital signals or information. In some cases, analog signals are used as carriers modulated by a digital information signal. The information signal may also be an analog waveform; in this case, analog-to-digital conversion is needed prior to transmission, and digital-to-analog conversion is done at the recipient side. Digital communication systems offer a number of advantages over analog communication systems:
1. Digital signals are more robust to noise and may be regenerated for long-distance communications without accumulation of noise.
2. They can provide better data security by encrypting the digital information prior to transmission.
3. They can provide a means of error detection and control, thus preserving the integrity of the data.
4. Digital signals coming from different sources can be combined and transmitted over a common channel (multiplexing).
5. They are more economical, as digital circuits are easier to implement and maintain.
The disadvantages are:
1. Higher bandwidth is generally required.
2. Synchronization is also required.
The information source may be an analog source such as audio or video, or a digital source such as teletype or computer data. The function of the source encoder is to convert the information into a sequence of binary digits; the source encoder is sometimes referred to as a data compressor. The channel encoder adds redundancy to the binary sequence to overcome the effects of noise and interference the signal will encounter when transmitted over the channel or medium. In effect, the integrity of the data and the fidelity of the received signal are improved. The digital modulator serves as the interface to the communication channel or medium; it maps the binary information into signal waveforms.
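As an illustrative sketch of this transmit chain, the fragment below chains a source encoder, a channel encoder, and a digital modulator. The 8-bit ASCII mapping, the (3,1) repetition code, and the BPSK symbol mapping are simple assumed choices for illustration, not schemes prescribed by the text:

```python
# Sketch of the transmitter chain: source encoder -> channel encoder
# -> digital modulator. The concrete schemes below are assumptions
# chosen for simplicity; real systems use far more elaborate ones.

def source_encode(text):
    """Source encoder: map each character to 8 bits (ASCII)."""
    return [int(b) for ch in text for b in format(ord(ch), "08b")]

def channel_encode(bits, n=3):
    """Channel encoder: add redundancy by repeating each bit n times."""
    return [b for bit in bits for b in [bit] * n]

def modulate(bits):
    """Digital modulator: map each bit to a BPSK symbol (+1 or -1)."""
    return [1.0 if b == 1 else -1.0 for b in bits]

bits = source_encode("Hi")       # 16 information bits
coded = channel_encode(bits)     # 48 coded bits (redundancy added)
symbols = modulate(coded)        # 48 waveform symbols
print(len(bits), len(coded))     # prints: 16 48
```

The repetition code triples the bit count, which is exactly the bandwidth cost of redundancy mentioned among the disadvantages above.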
The communication channel is the physical medium used to send the signal from the transmitter to the receiver. It can be a wired or guided medium such as wires, cables, and fiber optic cables, or a wireless or unguided medium in which the signal propagates through the air. The signal is always subjected to noise as it travels down the medium, and the primary goal is to preserve the integrity of the signal as it passes through the medium, in other words, to lessen the noise's effect. At the receive end, the digital demodulator maps the received signal back to a sequence of binary numbers. If the signal greatly suffered from noise, it is possible that errors may be present in the received signal. The counterpart of the channel encoder, the channel decoder, extracts the information contained in the demodulated signal; it checks for errors and in some cases tries to correct them. The source decoder transforms the sequence of binary numbers to its original form or the form needed at the output.

Sources of information
There are four main sources of information: speech, music, image and video, and computer data. These sources are characterized mostly by the signal that bears the information. A signal may be one-dimensional, such as speech and music; two-dimensional, such as images; three-dimensional, as in video; or four-dimensional, such as a volume of data in time. This means that a signal may be defined as a single-valued function of one or more independent variables (dimensions). Information can also be in the form of blocks or streams.

Speech
Speech is considered a primary way of communicating. The sound is produced in the vocal tract, starting from the glottis (the opening of the vocal cords) to the lips. As the sound propagates through the vocal tract, it is shaped by the frequency selectivity of the vocal tract; this process may be considered a form of sound filtering.
It is important to note that the power spectrum of speech is zero at zero frequency and reaches its maximum at a few hundred hertz; human hearing is frequency sensitive. In telecommunications, 300 Hz to 3100 Hz may be assigned as adequate bandwidth for human speech or voice.

Music
In music, sounds are produced by musical instruments such as the guitar and piano. A musical note may last a short time or may be sustained for some time. Music has a melodic structure, which consists of a time sequence of sounds, and a harmonic structure, which is composed of a set of simultaneous sounds. Musical signals are bipolar, like the human voice; however, they occupy a wider band of frequencies in the spectrum (about 15 kHz).

Images and video
Images and video are information based on human visual perception, produced by a camera, a facsimile machine, or a television receiver. Images and video are represented by pixels. A camera contains a number of photosensitive elements; light falling on an element is recorded as an electrical charge in proportion to the incoming light's intensity. The camera and the lens have means of controlling the amount of light in order to record images within the camera's dynamic range. Dynamic range is the range of brightness a camera can record, from black to white values. The recording in a camera may be analog or digital depending on the light sensor installed in the camera.

Computer data
Computer data are information processed in computing devices. These are digital in nature and follow encoding schemes such as ASCII and EBCDIC. They are characterized as wide-band signals since they occupy a wide bandwidth, which is a requirement in the digital representation of signals (complex waves). This information may also be transmitted in bursts, and compression techniques have been developed to deal with the bursty characteristic of this information source.
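As a toy illustration of source compression, the sketch below uses run-length encoding, an assumed simple scheme (real data compressors are far more sophisticated), to collapse repeated symbols in a data stream:

```python
# Toy source-compression sketch: run-length encoding collapses runs
# of repeated symbols into (symbol, count) pairs. This is an assumed
# illustrative scheme, not one named in the text.
def rle_encode(data):
    """Encode a string as a list of [symbol, run_length] pairs."""
    runs = []
    for sym in data:
        if runs and runs[-1][0] == sym:
            runs[-1][1] += 1
        else:
            runs.append([sym, 1])
    return runs

def rle_decode(runs):
    """Expand the [symbol, run_length] pairs back to the original string."""
    return "".join(sym * count for sym, count in runs)

packed = rle_encode("AAAABBBCCD")
print(packed)  # prints: [['A', 4], ['B', 3], ['C', 2], ['D', 1]]
```

Decoding (`rle_decode`) recovers the original stream exactly, which is the defining property of lossless source coding.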
impedance, velocity factor, attenuation at 10 MHz, physical dimensions, and capacitance. For example, RG-59 has a 75 ohm characteristic impedance, while RG-8 and RG-58 have 50 ohms. See Table 12-3, Coaxial Cable Characteristics, of Electronic Communications Systems by Wayne Tomasi for the list of RG reference numbers and their characteristics. The figure below illustrates the range of frequencies for each guided channel.
[Figure: frequency ranges of guided media — twisted pair around 1 kHz, coaxial cable around 1 MHz, waveguides from 1 GHz to 100 GHz, and fiber optic cables from 10^14 Hz to 10^15 Hz.]
Waveguides are hollow tubes used in microwave transmission, since coaxial cable and twisted pair can no longer be used effectively at these frequencies due to skin effect. Instead of propagating along the medium's surface, the wave is bounced back and forth along the walls until it reaches the other end of the waveguide. A waveguide can be rectangular, circular, or elliptical, based on the cross-sectional dimensions of the tube.

2. Fiber optic channels
Fiber optic channels are made up of either a plastic or a glass core, where signals propagate in the form of light. A cladding just at the surface of the core is responsible for making the light bounce back and forth along the FOC as it traverses its length; this is based on Snell's law. The cladding is protected by a special lacquer, silicone, or acrylate coating. The buffer jacket provides additional support for the protective coating, a strength member increases the tensile strength of the overall cable, and finally a polyurethane outer jacket covers the entire cable on the outside. FOC offers advantages over metallic cables such as wider bandwidth, higher information capacity, immunity to crosstalk, immunity to static interference, immunity to environmental factors such as operating temperature and corrosion, safety and convenience, lower transmission loss, security, durability and reliability, and economics. However, there are some disadvantages, such as interfacing costs, lower tensile strength (more fragile than copper wire), the need for remote electrical power for remote interfacing and regenerating equipment, greater susceptibility to losses introduced by bending the cable, and the need for specialized tools, equipment, and training. It can have a bandwidth of about 10,000 GHz to 40,000 GHz.

Wireless or Unguided Media
In wireless propagation, the electromagnetic energy is coupled to the medium by an antenna that serves as a radiator.
The operating frequency of the antenna depends mainly on its physical dimensions. The electromagnetic spectrum is divided into various frequency bands, and each is allocated to a specific application; this is to prevent signals from interfering with each other. There are three propagation modes in the atmosphere, namely ground-wave propagation, sky-wave propagation, and line-of-sight (LOS) propagation. In the Very Low Frequency (VLF) and Audio Frequency bands, the ionosphere can act as a waveguide for electromagnetic wave propagation; thus, in this range of frequencies, communication signals can practically propagate around the globe. These frequencies are primarily used for maritime navigational aids. The information transmitted through these channels is relatively slow speed and generally confined to digital transmission. Thunderstorms are general sources of noise, as is interference from other users of the frequency band. Ground-wave propagation is used in the Medium Frequency (MF) band. The wave is propagated along the ground surface with vertical polarization. AM radio and maritime radio broadcasting are typical applications of this type of propagation. Atmospheric noise, man-made noise, and thermal noise are the general sources of noise for ground waves in the MF band.
Sky-wave propagation results from transmitted signals being reflected from the ionosphere. The ionosphere is about 50 to 400 km above the earth's surface and consists of several layers of charged particles. During the daytime, lower layers such as the D layer form at altitudes below 120 km; the D layer absorbs signals below 2 MHz. At night, the electron density in the lower layers drops, reducing the absorption experienced during the daytime. Consequently, powerful transmitters can propagate over larger distances via the F layer of the ionosphere, which ranges from 140 km to 400 km above the surface of the earth. High Frequency (HF) band signals propagating by sky wave can experience signal multipath, resulting in signal fading and intersymbol interference in a digital communication system. Atmospheric noise and thermal noise are the dominant sources of noise in sky-wave propagation.
Line-of-sight (LOS) propagation is possible at frequencies above 30 MHz. This means that the transmitter and the receiver must be in direct line of sight with little or no obstruction. The VHF and UHF bands use this propagation. In general, the curvature of the earth limits the coverage of propagation. The major sources of noise in LOS propagation are thermal noise generated at the receiver front end and cosmic noise picked up by the antenna. Atmospheric conditions can disrupt LOS propagation in the SHF band: at 10 GHz, light to heavy rains can cause attenuation ranging from 0.003 dB/km to 0.3 dB/km, respectively.
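As a quick worked example of these figures, the total rain loss on a 10 GHz LOS link is the attenuation rate multiplied by the path length (the 25 km path below is an assumed value for illustration):

```python
# Worked example of the rain attenuation figures quoted above for a
# 10 GHz LOS link: total loss = (attenuation rate) x (path length).
path_km = 25.0                  # assumed link length, illustration only

light_rain = 0.003 * path_km    # 0.003 dB/km in light rain
heavy_rain = 0.3 * path_km      # 0.3 dB/km in heavy rain

print(f"light rain: {light_rain:.3f} dB, heavy rain: {heavy_rain:.1f} dB")
# prints: light rain: 0.075 dB, heavy rain: 7.5 dB
```

A 7.5 dB loss means the received power in heavy rain is less than a fifth of its clear-sky value over the same path.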
Storage channels
Information storage and retrieval systems play significant roles in data handling activities. The process of storing data on a drive is similar to transmitting data over a wired communication channel, while reading data out from the drive is similar to receiving data. Signal processing to retrieve stored data is likewise similar to demodulating or decoding a received signal. Mostly, noise is generated by the electronic components and by interference from adjacent tracks.

Characteristics of a Communications Channel/Medium
A communications channel may be characterized by bandwidth, distortion, distraction, interference, and attenuation.

Bandwidth
The frequency of a digital signal can be related to the number of bits a transmission channel can carry per unit time. It is directly related to the lower and upper limits of the usable frequency range. Unlike a periodic analog signal, where the frequencies are definite for a given signal, the frequency of a digital signal depends on the time interval between states. Take for example the figure below: the first waveform shows a slow transition between the two states of a binary waveform, while the second waveform shows a fast transition between the two states. The frequency of these two example waveforms can easily be computed by taking the reciprocal of the time period from the trailing edge of one pulse to the trailing edge of the next pulse. In the third example, however, the occurrence of the pulses is non-uniform. In this case, the frequency is computed where the time interval between transitions is shortest; its reciprocal is then labelled the highest bit rate in the data stream. A communication channel can be limited in the amount of bit rate it can handle. A passband differs from a bandwidth in the sense that a passband defines a particular slot in the electromagnetic spectrum, while bandwidth is the range of frequencies covered by a signal. For example, a coaxial cable used for a TV channel has a 6 MHz bandwidth; at this point the specific slot in the electromagnetic spectrum is not defined. But if we say that a TV station operates from 54 MHz to 60 MHz, then this gives the passband (not the bandwidth). In order to facilitate optimum transmission of data, it is necessary to limit the channel capacity to a reasonable amount of bandwidth. Cut-off frequencies may be assigned to trim down the passband of a channel; this can minimize interference between users transmitting simultaneously. Transmission media also possess inherent cut-off frequencies. In order to make sure that a signal will propagate effectively over the channel, the signal should be within the cut-off frequencies and, thus, the passband of the given medium.

Distortion
Distortions result from unwanted delays in the communication channel. Ideally, signals propagating through a communication channel should travel at the same speed regardless of their frequencies. In this manner, the linear relationship between frequency and the phase angle of any signal with respect to time is maintained, and the channel is said to be distortionless. However, this is not the case for a non-ideal channel. Distortion degrades signal quality and, in some cases, can cause errors in the received signals.
There are three types of distortion, namely phase-delay distortion, attenuation distortion, and jitter. The figure below illustrates phase-delay distortion, also called envelope delay distortion or phase distortion.
The figure on the left shows a distortionless channel, while the figure on the right shows phase-delay distortion. The input (fundamental) signal and its harmonics should arrive at the receiving end at the same time. Phase-delay distortion happens when some harmonics of the complex signal propagating through the transmission medium arrive later than expected at the receiving end because of the nonlinearity of the medium. This can cause phase shifting of the original signal, and the resulting received signal is badly shaped. Mathematically, envelope delay is the first derivative of the phase with respect to frequency, which means that the shape of the envelope delay curve reflects the degree of change of the slope of the phase vs. frequency curve. In attenuation distortion, the higher frequencies of transmitted signals are attenuated at a higher rate than the lower frequencies. Since some components of the original signal are lost during transmission, the received signal can be distorted. The figure below illustrates this type of distortion.
The green waveform is the transmitted or original signal, while the red one shows an attenuation-distorted signal. Phase jitter is basically a low-index modulation of the transmitted signal by low-frequency harmonics of the power line frequency (50 to 60 Hz). It can pose a serious problem in digital transmission at high rates. At the receiving end, it can be tracked and compensated for.

Distraction
Distractions are defined as unwanted audible signals being received during a telephone conversation. Crosstalk and echo are typical causes of distraction. Crosstalk is caused by induction from a line to its adjacent lines, as long as the distance allows it. The induced signal is audible, so the other line can hear it. It depends on frequency, high current ratings, and the distance between parallel lines or circuits. Echo, on the other hand, is the return of a transmitted signal back to its source, so that callers can hear themselves during the telephone conversation. Echoes may be caused by electrical imbalance in the circuit and are especially manifested on long-distance calls. Echo suppressors are circuits that close the receive path while the caller is speaking, then open it when he or she pauses or stops talking.

Interference
Interference can cause errors in transmitted data and may come from noise, induction, and multi-frequency tones.
Induction is caused by interference from both the electric and magnetic fields of parallel (adjacent) lines, similar to crosstalk. Noise is commonly defined as any unwanted signal that interferes with the communication, measurement, perception, or processing of an information-bearing signal. It can bear information regarding the sources of noise and the environment in which the signal is propagated. Noise and distortion are the main factors that limit the capacity of data in a communications channel and the accuracy of results in signal measurement systems. Noise processing depends on the ability to characterise and model the noise process, and to use the noise characteristics to differentiate the signal from the noise. Noise may be classified by its source, physics, frequency spectrum, and/or time characteristics. Multi-frequency tones are tones that may be heard in a telephone conversation, usually coming from the touch-tone dialers of telephones. They can be misinterpreted by some machines as valid data or control signals.

Attenuation
Attenuation refers to the loss of power in the transmitted signal as it propagates through the communication medium. This loss is caused by absorption or dissipation of power in the transmission line. It usually follows the basic formula for wire resistance; thus, longer wires cause more attenuation and thicker wires have less attenuation.

Mathematical Models for Communication Channels
Mathematical models are representations of the most important characteristics of a system, such as a transmission medium. A mathematical model of a communication channel can be used in the design of the channel encoder/decoder and the modulator/demodulator. There are three frequently used models to characterize a physical channel.
Additive Noise Channel
The additive noise channel is the simplest and predominant communications channel model, because it can be applied to a broad class of communication systems and because of its mathematical tractability. In this model, noise n(t) corrupts the transmitted signal s(t), so the received signal is

r(t) = s(t) + n(t)

The noise may come from electronic components and amplifiers at the receiver, or from interference encountered during transmission. This model is also known as the additive Gaussian noise channel, because thermal noise is statistically a Gaussian process. If channel attenuation is considered, an attenuation factor α is introduced:

r(t) = αs(t) + n(t)
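A minimal simulation sketch of this model follows; the BPSK-like samples, the attenuation factor, and the noise level are assumed illustrative values:

```python
import random

# Sketch of the additive Gaussian noise channel with attenuation:
# r(t) = alpha * s(t) + n(t). Signal and noise parameters are assumed
# values for illustration.
random.seed(1)

def channel(s, alpha=0.8, noise_std=0.1):
    """Attenuate the signal and add zero-mean Gaussian noise to each sample."""
    return [alpha * x + random.gauss(0.0, noise_std) for x in s]

s = [1.0, -1.0, 1.0, 1.0, -1.0]            # transmitted BPSK-like samples
r = channel(s)                             # received, noisy samples
decisions = [1.0 if x > 0 else -1.0 for x in r]
print(decisions == s)  # with this mild noise level, the signs survive
```

Raising `noise_std` toward the signal amplitude makes sign flips, and hence bit errors, increasingly likely, which is exactly the situation channel coding is meant to handle.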
Linear Filter Channel
This model is used when filters are employed to ensure that transmitted signals do not exceed a specified bandwidth, in order to avoid interference with adjacent channels. This type of channel is characterized mathematically as a linear filter with additive noise, such that

r(t) = s(t) * c(t) + n(t)

where c(t) is the impulse response of the linear filter and * denotes convolution.
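The linear filter channel described above can be sketched in discrete time as a convolution of the transmitted samples with the channel impulse response; the filter taps below are assumed values, and the noise term is set to zero to isolate the filtering effect:

```python
import numpy as np

# Discrete-time sketch of the linear filter channel: the received
# sequence is the convolution of the transmitted samples with the
# channel impulse response (noise omitted, n = 0). The taps are
# assumed illustrative values.
s = np.array([1.0, -1.0, 1.0, 1.0])   # transmitted samples
c = np.array([0.5, 0.3, 0.2])         # assumed channel impulse response

r = np.convolve(s, c)                 # noise-free received signal
print(r)
```

Note that each received sample blends several transmitted samples, which is how a band-limiting channel smears symbols into one another (intersymbol interference).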
Linear Time-Variant Filter Channel

r(t) = s(t) * c(τ;t) + n(t)
This model is used for physical channels characterized by time-variant multipath propagation, such as underwater acoustic channels and ionospheric radio channels. Mathematically, such a channel is characterized by a time-variant channel impulse response c(τ;t), where τ represents elapsed time or age. For multipath propagation, the impulse response takes the form

c(τ;t) = Σ_{k=1..L} a_k(t) δ(τ − τ_k)

so that the received signal is

r(t) = Σ_{k=1..L} a_k(t) s(t − τ_k) + n(t)

This means that the received signal is composed of L multipath components, each attenuated by a_k(t) and delayed by τ_k.
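A discrete-time sketch of this multipath sum follows; the path gains and delays are assumed sample values, and noise is omitted:

```python
# Sketch of the multipath model: the received signal is the sum of L
# attenuated and delayed copies of s, r = sum_k a_k * s(t - tau_k)
# (noise omitted). Gains and delays below are assumed sample values.
def multipath(s, paths):
    """paths: list of (gain, delay_in_samples) pairs, one per path."""
    n = len(s) + max(d for _, d in paths)
    r = [0.0] * n
    for gain, delay in paths:
        for i, x in enumerate(s):
            r[i + delay] += gain * x   # add this path's delayed, scaled copy
    return r

s = [1.0, 0.0, 0.0, 0.0]               # a single transmitted pulse
r = multipath(s, [(1.0, 0), (0.5, 2), (0.25, 3)])
print(r)  # the pulse arrives three times with decreasing strength
```

With a single transmitted pulse, the output directly traces the channel impulse response: one echo per path, scaled by that path's gain.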