
REMOTE SENSING AND GIS

UNIT - III
RS-II
1. Introduction
In remote sensing, electromagnetic radiation emitted or reflected by targets is recorded by remotely
located sensors, and these signals are analysed to interpret the target characteristics. The characteristics of the
signals recorded at the sensor depend on the characteristics of the source of radiation/energy, the characteristics
of the target and the atmospheric interactions.
2. Electromagnetic energy
Electromagnetic (EM) energy includes all energy moving in a harmonic sinusoidal wave pattern with a velocity
equal to that of light. A harmonic pattern means waves occurring at equal intervals of time.
Electromagnetic energy has both electric and magnetic components which oscillate perpendicular to each other
and also perpendicular to the direction of energy propagation as shown in Fig. 1.
It can be detected only through its interaction with matter.

Fig.1. Electromagnetic wave


Examples of different forms of electromagnetic energy: light, heat, etc.
EM energy can be described in terms of its velocity, wavelength and frequency.
All EM waves travel at the speed of light, c, which is approximately equal to 3 × 10^8 m/s.
The wavelength λ of an EM wave is the distance from any point on one wave to the same position on the next wave
(e.g., the distance between two successive peaks). The wavelengths commonly used in remote sensing are very small.
Wavelength is normally expressed in micrometres (µm); 1 µm is equal to 1 × 10^-6 m.
Frequency f is the number of waves passing a fixed point per unit time. It is expressed in Hertz (Hz).
The three attributes are related by

c = fλ                (1)

which implies that wavelength and frequency are inversely related, since c is a constant: longer wavelengths have
lower frequencies than shorter wavelengths, as the sketch below illustrates.
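As a quick numerical check of Eq. (1), this minimal Python sketch (assuming the approximate value c = 3 × 10^8 m/s used in this text) converts a wavelength to its frequency:

```python
# Sketch of Eq. (1): c = f * lambda, so f = c / lambda.
C = 3.0e8  # speed of light, m/s (approximate value used in this text)

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an EM wave with the given wavelength in metres."""
    return C / wavelength_m

# Example: green light at 0.55 um has a much higher frequency than a
# 1 cm microwave, illustrating the inverse relation.
print(frequency_hz(0.55e-6))  # ~5.5e14 Hz
print(frequency_hz(1e-2))     # 3.0e10 Hz
```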
Engineers use the frequency attribute to describe the radio and radar regions. In remote sensing, however, EM
waves are categorized in terms of their wavelength location in the EMR spectrum.
Another important theory about the electromagnetic radiation is the particle theory, which suggests that
electromagnetic radiation is composed of discrete units called photons or quanta.
3. Electro-Magnetic Radiation (EMR) spectrum
The distribution of the continuum of radiant energy can be plotted as a function of wavelength (or frequency) and
is known as the electromagnetic radiation (EMR) spectrum. The EMR spectrum is divided into regions, or intervals,
of different wavelengths, and such regions are denoted by different names. However, there is no strict dividing
line between one spectral region and the next. The different regions of the EMR spectrum are indicated in Fig. 2.

Fig. 2. Electromagnetic radiation spectrum: wavelength in µm on a logarithmic scale, running from gamma rays,
X-rays and ultraviolet through the visible band (violet 0.40, blue 0.48, green 0.54, yellow 0.58, orange 0.60,
red 0.65 µm) to the near infrared, thermal infrared, microwave and radio regions.

The EM spectrum ranges from gamma rays with very short wavelengths to radio waves with very long
wavelengths. The EM spectrum is shown on a logarithmic scale in order to portray the shorter wavelengths.
The visible region (the human eye is sensitive to this region) occupies a very small range, between 0.4
and 0.7 µm. The approximate range of the colour blue is 0.4–0.5 µm,
green is 0.5–0.6 µm and red is 0.6–0.7 µm. The ultraviolet (UV) region adjoins the blue end of the visible region
and the infrared (IR) region adjoins the red end.
The infrared (IR) region, spanning 0.7 to 100 µm, has four subintervals of special interest for remote
sensing:
(1) the reflected IR (0.7–3.0 µm);
(2) a film-responsive subset of it, the photographic IR (0.7–0.9 µm); and
(3), (4) the thermal bands at 3–5 µm and 8–14 µm.
Wavelength intervals beyond this region are expressed in units ranging from 0.1 to 100 cm. The microwave
region spreads across 0.1 to 100 cm and includes all the intervals used by radar systems, which
generate their own active radiation and direct it towards the targets of interest. The details of the various regions and
the corresponding wavelengths are given in Table 1.

Table 1. Spectrum of electromagnetic radiation

Region           | Wavelength (µm)        | Remarks
Gamma rays       | < 3 × 10^-5            | Not available for remote sensing; incoming radiation is absorbed by the atmosphere.
X-rays           | 3 × 10^-5 to 3 × 10^-3 | Not available for remote sensing, since it is absorbed by the atmosphere.
Ultraviolet (UV) | 0.03 to 0.4            | Wavelengths shorter than 0.3 µm are absorbed by the ozone layer in the upper atmosphere; wavelengths between 0.3 and 0.4 µm are transmitted and termed the photographic UV band.
Visible          | 0.4 to 0.7             | Detectable with film and photodetectors.
Infrared (IR)    | 0.7 to 100             | Atmospheric windows exist which allow maximum transmission. The portion between 0.7 and 0.9 µm is called the photographic IR band, since it is detectable with film. Two principal atmospheric windows exist in the thermal IR region (3–5 µm and 8–14 µm).
Microwave        | 10^3 to 10^6           | Can penetrate rain, fog and clouds. Both active and passive remote sensing are possible; radar uses wavelengths in this range.
Radio            | > 10^6                 | The longest wavelengths; used for remote sensing by some radars.

Energy in the gamma ray, X-ray and most of the UV regions is absorbed by the Earth's atmosphere and hence
not used in remote sensing. Most remote sensing systems operate in the visible, infrared (IR) and microwave
regions of the spectrum. Some systems also use the long-wave portion of the UV spectrum.
4. Energy sources and radiation principles
4.1 Solar radiation
The primary source of energy that illuminates different features on the earth surface is the Sun. Solar radiation (also
called insolation) arrives at the Earth at wavelengths determined by the temperature of the Sun's photosphere
(peaking near 5,600 °C). Although the Sun produces electromagnetic radiation across a wide range of wavelengths,
the amount of energy it produces is not uniform across all wavelengths.
Fig. 3 shows the solar irradiance (power of electromagnetic radiation per unit area incident on a surface)
distribution of the Sun. Almost 99% of the solar energy lies within the wavelength range 0.28–4.96 µm. Within
this range, 43% is radiated in the visible wavelength region between 0.4 and 0.7 µm. The maximum energy (E) is
available at a wavelength of 0.48 µm, which is in the visible green region.

Fig.3. Irradiance distribution of the Sun and Earth

Using the particle theory, the energy of a quantum (Q) is considered to be proportional to the frequency. The
relationship can be represented as

Q = hf                (2)

where h is Planck's constant (6.626 × 10^-34 J s) and f is the frequency.
Using the relationship between c, λ and f (Eq. 1), the above equation can be written as

Q = hc/λ              (3)

The energy per unit quantum is thus inversely proportional to the wavelength: shorter wavelengths are associated
with higher energy than longer wavelengths. For example, longer-wavelength radiation such as microwaves
carries lower energy than radiation in the IR region and is more difficult to sense in remote sensing. When
operating with long-wavelength radiation, the coverage area must be large enough to obtain a detectable signal.
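Eq. (3) can be checked numerically with a short Python sketch; the constants are those quoted above, and the wavelength choices are illustrative:

```python
# Sketch: photon energy Q = h*c/lambda (Eq. 3). Longer wavelengths carry
# less energy per quantum.
H = 6.626e-34  # Planck's constant, J s
C = 3.0e8      # speed of light, m/s

def photon_energy_joules(wavelength_m: float) -> float:
    return H * C / wavelength_m

for name, wl in [("blue (0.45 um)", 0.45e-6),
                 ("thermal IR (10 um)", 10e-6),
                 ("microwave (1 cm)", 1e-2)]:
    print(f"{name}: {photon_energy_joules(wl):.2e} J")
# The microwave photon carries roughly 22,000x less energy than the blue
# photon, which is why long-wavelength systems need large coverage areas
# to gather a detectable signal.
```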
5. ENERGY INTERACTIONS IN THE ATMOSPHERE
In remote sensing, all radiation traverses some distance through the atmosphere before reaching the sensor. As the
radiation passes through the atmosphere, the gases and particles in the atmosphere interact with it, causing
changes in its magnitude, wavelength, velocity, direction and polarization.
5.1 Energy Interactions
The radiation from the energy source passes through some distance of atmosphere before being detected by the
remote sensor as shown in Fig. 4.

Fig. 4. Interactions in the atmosphere



The effect of the atmosphere on the radiation depends on the properties of the radiation (such as its magnitude and
wavelength), the atmospheric conditions and the path length. The intensity and spectral composition of the incident
radiation are altered by these atmospheric effects. The interaction of electromagnetic radiation with
atmospheric particles may be a surface phenomenon (e.g., scattering) or a volume phenomenon (e.g., absorption).
Scattering and absorption are the main processes that alter the properties of electromagnetic radiation in the
atmosphere.
6 SCATTERING
Atmospheric scattering is the process by which small particles in the atmosphere diffuse a portion of the incident
radiation in all directions. There is no energy transformation during scattering, but the spatial distribution of the
energy is altered.

Fig. 5. Scattering of the electromagnetic radiation in the atmosphere


There are three different types of scattering:
1. Rayleigh scattering
2. Mie scattering
3. Non-selective scattering

6.1 Rayleigh scattering


Rayleigh scattering is caused mainly by atmospheric molecules and other tiny particles. It occurs when the
particles responsible are much smaller in diameter (less than one tenth) than the wavelength of the radiation
interacting with them. For Rayleigh scattering, the intensity of the scattered light is inversely proportional to the
fourth power of the wavelength; hence, shorter wavelengths are scattered much more than longer wavelengths.

Fig. 6. Rayleigh scattering


Rayleigh scattering is also known as selective scattering or molecular scattering.
Molecules of oxygen and nitrogen (the dominant gases in the atmosphere) cause this type of scattering of the
visible part of the electromagnetic radiation. Within the visible range, the shorter-wavelength blue light is
scattered more than green or red: blue light is scattered around 4 times, and UV light about 16 times, as much as
red light. A "blue" sky is thus a manifestation of Rayleigh scatter. At sunrise and sunset, however, the Sun's rays
travel a longer path through the atmosphere, causing almost complete scattering (and absorption) of the shorter
wavelengths; as a result, only the less-scattered longer wavelengths (orange and red) remain visible.
The haze in imagery, and the bluish-grey cast of colour images taken from high altitude, are mainly due to
Rayleigh scatter.
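The inverse fourth-power law can be verified with a few lines of Python; the wavelengths below are illustrative choices, not values from the text:

```python
# Sketch of the Rayleigh law: scattered intensity is proportional to
# 1 / wavelength**4, so shorter wavelengths scatter far more strongly.
RED = 0.65  # red wavelength in micrometres (illustrative)

for name, wl in [("UV", 0.32), ("blue", 0.45), ("green", 0.55)]:
    ratio = (RED / wl) ** 4
    print(f"{name} ({wl} um) scatters {ratio:.1f}x as much as red light")
# Blue comes out ~4x and UV ~17x red, close to the figures quoted above.
```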

6.2 Mie Scattering


Another type of scattering is Mie scattering, which occurs when the wavelength of the energy is almost equal to
the diameter of the atmospheric particles. In this type of scattering, longer wavelengths are also scattered,
compared to Rayleigh scatter (Fig. 7).

Fig.7 Rayleigh and Mie scattering


In Mie scattering, the intensity of the scattered light varies approximately as the inverse of the wavelength.
Mie scattering is usually caused by aerosol particles such as dust, smoke and pollen. Gas molecules in the
atmosphere are too small to cause Mie scattering of the radiation commonly used for remote sensing.
6.3 Non-selective scattering
A third type of scattering is non-selective scatter, which occurs when the diameters of the atmospheric particles
are much larger (approximately 10 times) than the wavelengths being sensed. Particles such as pollen, cloud
droplets, ice crystals and raindrops can cause non-selective scattering of visible light.
For visible light (wavelength 0.4–0.7 µm), non-selective scattering is generally caused by water droplets, which
commonly have diameters in the range of 5 to 100 µm. This scattering is non-selective with respect to
wavelength, since all visible and IR wavelengths are scattered almost equally, giving a white or even grey colour
to the clouds.
7 Absorption
Absorption is the process in which incident energy is retained by particles in the atmosphere at a given
wavelength. Unlike scattering, atmospheric absorption causes an effective loss of energy to atmospheric
constituents. The absorbing medium not only absorbs a portion of the total energy, but may also reflect, refract
or scatter it. The absorbed energy may later be re-emitted back to the atmosphere.
The most efficient absorbers of solar radiation are water vapour, carbon dioxide and ozone. The gaseous components
of the atmosphere are selective absorbers of electromagnetic radiation, i.e., they absorb electromagnetic
energy in specific wavelength bands. The arrangement of the gaseous molecules and their energy levels determines
the wavelengths that are absorbed.
Since the atmosphere contains many different gases and particles, it absorbs and transmits many different
wavelengths of electromagnetic radiation. Even though all the wavelengths from the Sun reach the top of the
atmosphere, due to atmospheric absorption only limited wavelengths can pass through it. The
ranges of wavelength that are partially or wholly transmitted through the atmosphere are known as "atmospheric
windows."

Sensor selection for remote sensing

While selecting a sensor, the following factors should be considered:
- The spectral sensitivity of the available sensors.
- The available atmospheric windows in the spectral range(s) considered. The spectral range of the sensor is
  selected by considering the energy interactions with the features under investigation.
- The source, magnitude and spectral composition of the energy available in the particular range.

Multispectral sensors sense simultaneously through multiple, narrow wavelength ranges that can be
located at various points in the visible through the thermal spectral regions.
8. ENERGY INTERACTIONS WITH EARTH SURFACE FEATURES


Energy incident on the Earth's surface is absorbed, transmitted or reflected depending on the wavelength and the
characteristics of the surface features (such as barren soil, vegetation or water bodies). The interaction of
electromagnetic radiation with the surface features depends on the characteristics of both the incident radiation
and the features. After interaction with the surface features, the energy that is reflected or re-emitted
from the features is recorded at the sensors and analysed to identify the target features and to interpret the distance
of the object and/or its characteristics.
8.1. Energy Interactions
The incident electromagnetic energy may interact with the earth surface features in three possible ways:
Reflection, Absorption and Transmission. These three interactions are illustrated in Fig. 8.
Fig. 8. Energy interactions with earth surface features (incident radiation is partitioned into reflection,
absorption and transmission at the Earth's surface)
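By conservation of energy, the three interactions in Fig. 8 can be written as a balance (a standard relation, stated here for completeness), with all terms functions of wavelength λ:

E_I(λ) = E_R(λ) + E_A(λ) + E_T(λ)

where E_I is the incident energy and E_R, E_A and E_T are the reflected, absorbed and transmitted components, respectively. The proportions of each vary for different surface features and, for the same feature, with wavelength.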

9. SPECTRAL REFLECTANCE SIGNATURE CURVES


Spectral reflectance, ρ(λ), is the ratio of reflected energy to incident energy as a function of wavelength. Various
materials of the earth's surface have different spectral reflectance characteristics. Spectral reflectance is
responsible for the colour or tone of an object in a photographic image: trees appear green because they reflect
more of the green wavelength. The values of the spectral reflectance of objects, averaged over different,
well-defined wavelength intervals, comprise the spectral signature of the objects or features, by which they can be
distinguished. To obtain the necessary ground truth for the interpretation of multispectral imagery, the spectral
characteristics of various natural objects have been extensively measured and recorded.

Fig.9
The plot of ρ(λ) against λ is called a spectral reflectance curve. It varies with variations in the chemical
composition and physical condition of the feature, which results in a range of values. The spectral response
patterns are averaged to obtain a generalized form, called the generalized spectral response pattern for the object
concerned. Spectral signature is the term used for a unique spectral response pattern that is characteristic of a
terrain feature. Figure 9 shows typical reflectance curves for three basic types of earth surface features: healthy
vegetation, dry bare soil (grey-brown and loamy) and clear lake water.
The spectral characteristics of these three main earth surface features are discussed below:

Vegetation: The spectral characteristics of vegetation vary with wavelength. The plant pigment chlorophyll in
leaves strongly absorbs radiation in the red and blue wavelengths but reflects the green wavelength. The internal
structure of healthy leaves acts as a diffuse reflector of near-infrared wavelengths. Measuring and monitoring
near-infrared reflectance is one way that scientists determine how healthy particular vegetation may be.
Water: The majority of the radiation incident upon water is not reflected but is either absorbed or transmitted.
Longer visible wavelengths and near-infrared radiation are absorbed more by water than the shorter visible
wavelengths. Thus water looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and
darker when viewed at red or near-infrared wavelengths. The factors affecting the variability in reflectance of a
water body are the depth of the water, the materials within the water and the roughness of the water surface.
Soil: The majority of radiation incident on a soil surface is either reflected or absorbed, and little is transmitted.
The characteristics of soil that determine its reflectance properties are its moisture content, organic matter content,
texture, structure and iron oxide content. The soil curve shows fewer peak-and-valley variations. The presence of
moisture in soil decreases its reflectance. By measuring the energy reflected by targets on the earth's surface
over a variety of different wavelengths, we can build up a spectral signature for each object, and by comparing
the response patterns of different features we may be able to distinguish between them, which we may not be able
to do if we compare them at only one wavelength. For example, water and vegetation reflect somewhat similarly
in the visible wavelengths but not in the infrared, as the sketch below illustrates.
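As one illustration of such multi-wavelength comparison, the widely used Normalized Difference Vegetation Index (NDVI), which is not discussed in the text above, contrasts the red and NIR responses; the reflectance values in this Python sketch are made-up but typical:

```python
import numpy as np

# Sketch: separating water from vegetation by comparing red and NIR bands
# via NDVI = (NIR - Red) / (NIR + Red).
red = np.array([0.05, 0.08])  # [water, vegetation] red reflectance (illustrative)
nir = np.array([0.02, 0.45])  # [water, vegetation] NIR reflectance (illustrative)

ndvi = (nir - red) / (nir + red)
print(ndvi)  # water ~ -0.43 (dark in NIR), vegetation ~ +0.70 (bright in NIR)
```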

The following graph shows the typical reflectance spectra of five materials: clear water, turbid water, bare soil
and two types of vegetation.

Fig.10 Reflectance Spectrum of Five Types of Landcover


The reflectance of clear water is generally low. However, the reflectance is maximum at the blue end of the
spectrum and decreases as wavelength increases. Hence, clear water appears dark-bluish. Turbid water has some
sediment suspension which increases the reflectance in the red end of the spectrum, accounting for its brownish
appearance. The reflectance of bare soil generally depends on its composition. In the example shown, the
reflectance increases monotonically with increasing wavelength. Hence, it should appear yellowish-red to the eye.
Vegetation has a unique spectral signature which enables it to be distinguished readily from other types of land
cover in an optical/near-infrared image. The reflectance is low in both the blue and red regions of the spectrum,
due to absorption by chlorophyll for photosynthesis. It has a peak at the green region which gives rise to the green
colour of vegetation. In the near infrared (NIR) region, the reflectance is much higher than that in the visible band
due to the cellular structure in the leaves. Hence, vegetation can be identified by the high NIR but generally low
visible reflectance. This property has been used in early reconnaissance missions during war times for
"camouflage detection".
The shape of the reflectance spectrum can be used to identify vegetation type. For example, the
reflectance spectra of vegetation 1 and 2 in the above figure can be distinguished although they exhibit the
general characteristics of high NIR but low visible reflectance: vegetation 1 has higher reflectance in the
visible region but lower reflectance in the NIR region. For the same vegetation type, the reflectance spectrum also
depends on other factors, such as the leaf moisture content and the health of the plants.

10. IMAGE PROCESSING


A digital remotely sensed image is typically composed of picture elements (pixels) located at the intersection of
each row i and column j in each of K bands of imagery. Associated with each pixel is a number, known as the Digital
Number (DN) or Brightness Value (BV), which depicts the average radiance of a relatively small area within the
scene. A small number indicates low average radiance from the area, while a high number indicates high radiance.
The size of this area affects the reproduction of detail within the scene: as the pixel size is reduced, more scene
detail is preserved in the digital representation.

Pixels
A digital image comprises a two-dimensional array of individual picture elements, called pixels, arranged in
columns and rows. Each pixel represents an area on the Earth's surface. A pixel has an intensity value and a
location address in the two-dimensional image.
The intensity value represents the measured physical quantity, such as the solar radiance in a given wavelength
band reflected from the ground, emitted infrared radiation or backscattered radar intensity. This value is normally
the average value for the whole ground area covered by the pixel.

10.1 Multilayer Image


Several types of measurement may be made from the ground area covered by a single pixel. Each type of
measurement forms images which carry some specific information about the area. By "stacking" these images
from the same area together, a multilayer image is formed. Each component image is a layer in the multilayer
image. Multilayer images can also be formed by combining images obtained from different sensors, and other
subsidiary data. For example, a multilayer image may consist of three layers from a SPOT multispectral image,
a layer of synthetic aperture radar SAR image, and perhaps a layer consisting of the digital elevation map of the
area being studied.
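A minimal sketch of such stacking, assuming five co-registered layers held as NumPy arrays (the shapes, dtypes and band names are hypothetical):

```python
import numpy as np

# Sketch: build a multilayer image by stacking co-registered layers.
rows, cols = 512, 512
xs1 = np.zeros((rows, cols), dtype=np.uint8)    # SPOT XS1 (green) layer
xs2 = np.zeros((rows, cols), dtype=np.uint8)    # SPOT XS2 (red) layer
xs3 = np.zeros((rows, cols), dtype=np.uint8)    # SPOT XS3 (NIR) layer
sar = np.zeros((rows, cols), dtype=np.float32)  # SAR backscatter layer
dem = np.zeros((rows, cols), dtype=np.float32)  # digital elevation layer

# Stack along a new leading axis: one layer per measurement type.
multilayer = np.stack([xs1, xs2, xs3, sar, dem])
print(multilayer.shape)  # (5, 512, 512)
```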

10.2 Multispectral Images


A multispectral image consists of several bands of data. For visual display, each band of the image may be
displayed one band at a time as a grey-scale image, or in a combination of three bands at a time as a colour
composite image. Interpretation of a multispectral colour composite image requires knowledge of the
spectral reflectance signatures of the targets in the scene; in this case, the spectral information content of the
image is utilized in the interpretation. The following three images show the three bands of a multispectral
image extracted from a SPOT multispectral scene at a ground resolution of 20 m. The area covered is the same
as that shown in the above panchromatic image. Note that both the XS1 (green) and XS2 (red) bands look
almost identical to the panchromatic image shown above. In contrast, the vegetated areas now appear bright in
the XS3 (NIR) band, due to the high reflectance of leaves in the near infrared wavelength region. Several shades
of grey can be identified for the vegetated areas, corresponding to different types of vegetation. Water masses
(both the river and the sea) appear dark in the XS3 (near IR) band.

Band images: Green, Red, NIR

10.3 Superspectral Image


The more recent satellite sensors are capable of acquiring images at many more wavelength bands. For example,
some satellite sensors acquire images in 36 spectral bands, covering wavelength regions from the visible and near
infrared through the short-wave infrared to the thermal infrared. The bands have narrower bandwidths, enabling
the finer spectral characteristics of the targets to be captured by the sensor. The term "superspectral" has been
coined to describe such sensors.
10.4 Hyperspectral Image
A hyperspectral image consists of about a hundred or more contiguous spectral bands, forming a three-dimensional
image cube (two spatial dimensions and one spectral dimension). The characteristic spectrum of each
target pixel is acquired in a hyperspectral image. The precise spectral information contained in a hyperspectral
image enables better characterisation and identification of targets. Hyperspectral images have potential
applications in fields such as precision agriculture (e.g. monitoring the types, health, moisture status and maturity
of crops) and coastal management (e.g. monitoring of phytoplankton, pollution and bathymetry changes).

COLOR COMPOSITES:
When displaying the different bands of a multispectral data set, if the images obtained in different bands are
displayed in image planes other than their own, the colour composite is termed a False Colour Composite (FCC).
High spectral resolution is important when producing colour composites. For a true colour composite, the image
data acquired in the red, green and blue spectral regions must be assigned to the red, green and blue planes of the
image processor frame buffer memory, respectively. A colour-infrared composite (the standard false colour
composite) is displayed by placing the infrared, red and green bands in the red, green and blue frame buffer
planes, respectively, as sketched below.
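A minimal sketch of composing such displays, assuming each band is already a 2-D uint8 NumPy array:

```python
import numpy as np

# Sketch: a standard false colour composite assigns NIR -> red plane,
# red -> green plane and green -> blue plane of the display.
def false_colour_composite(nir, red, green):
    """Return an (H, W, 3) array: the classic colour-infrared display."""
    return np.dstack([nir, red, green])

def true_colour_composite(red, green, blue):
    """Return an (H, W, 3) array with each band in its own colour plane."""
    return np.dstack([red, green, blue])
```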
11. IMAGE RESOLUTIONS
The quality of remote sensing data is characterized by its spectral, radiometric, spatial and temporal resolutions.
11.1-Spatial Resolution
Spatial resolution refers to the size of the smallest object that can be resolved on the ground. In a digital image,
the resolution is limited by the pixel size, i.e. the smallest resolvable object cannot be smaller than the pixel size.
The intrinsic resolution of an imaging system is determined primarily by the instantaneous field of view (IFOV)
of the sensor, which is a measure of the ground area viewed by a single detector element at a given instant in
time (see the sketch below). However, this intrinsic resolution can often be degraded by other factors that introduce
blurring of the image, such as improper focusing, atmospheric scattering and target motion. The pixel size is
determined by the sampling distance.
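As a rough illustration (not from the text) of how the IFOV fixes the intrinsic resolution, the ground footprint of one detector element can be approximated by the IFOV (in radians) times the altitude; the numbers below are illustrative:

```python
# Sketch: small-angle approximation for the ground footprint of a detector.
ifov_rad = 42.5e-6  # hypothetical IFOV of 42.5 microradians
altitude_m = 705e3  # hypothetical orbital altitude of 705 km

footprint_m = ifov_rad * altitude_m
print(footprint_m)  # ~30 m ground footprint at nadir
```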
A "High Resolution" image refers to one with a small resolution size. Fine details can be seen in a high resolution
image. On the other hand, a "Low Resolution" image is one with a large resolution size, i.e. only coarse features
can be observed in the image. An image sampled at a small pixel size does not necessarily have a high resolution.
The following three images illustrate this point. The first image is a SPOT image of 10 m pixel size. It was
derived by merging a SPOT panchromatic image of 10 m resolution with a SPOT multispectral image of 20 m
resolution. The merging procedure "colours" the panchromatic image using the colours derived from the
multispectral image. The effective resolution is thus determined by the resolution of the panchromatic image,
which is 10 m. This image was further processed to degrade the resolution while maintaining the same pixel size.
The next two images are blurred versions of the first, with larger resolution sizes but still digitized at the
same pixel size of 10 m. Even though they have the same pixel size as the first image, they do not have the same
resolution (see the sketch after the image captions below).
The following images illustrate the effect of pixel size on the visual appearance of an area. The first image is a
SPOT image of 10 m pixel size, derived by merging a SPOT panchromatic image with a SPOT multispectral
image. The subsequent images show the effects of digitizing the same area with larger pixel sizes.

10 m resolution, 10 m pixel; 30 m resolution, 10 m pixel; 80 m resolution, 10 m pixel
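A minimal sketch of such degradation, using a Gaussian blur as a stand-in for the (unspecified) blurring described above; the sigma values are rough illustrative choices:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Sketch: degrade effective resolution while keeping the same pixel grid.
img = np.random.rand(256, 256).astype(np.float32)  # stand-in for a 10 m scene

blur_30m = gaussian_filter(img, sigma=1.5)  # roughly 30 m effective resolution
blur_80m = gaussian_filter(img, sigma=4.0)  # roughly 80 m effective resolution
# All three arrays share the 10 m pixel grid; only the resolvable detail differs.
```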

11. 2-Radiometric Resolution


Radiometric Resolution refers to the smallest change in intensity level that can be detected by the sensing system.
The intrinsic radiometric resolution of a sensing system depends on the signal to noise ratio of the detector. In a
digital image, the radiometric resolution is limited by the number of discrete quantization levels used to digitize
the continuous intensity value.
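A small Python sketch of quantization, showing how the number of discrete levels (2^bits) limits the radiometric resolution of a digital image:

```python
import numpy as np

# Sketch: digitize a continuous intensity into 2**bits discrete levels.
def quantize(signal, bits):
    levels = 2 ** bits
    return np.round(signal * (levels - 1)).astype(np.int32)

signal = np.array([0.0, 0.1, 0.5, 0.9, 1.0])  # normalized intensities
print(quantize(signal, 8))   # 256 levels:  [0 26 128 230 255]
print(quantize(signal, 11))  # 2048 levels: finer intensity steps
```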
11. 3-Spectral resolution
Spectral resolution is the wavelength width of the different frequency bands recorded; usually this is related to
the number of frequency bands recorded by the platform. The current Landsat collection comprises seven bands,
including several in the infrared spectrum, ranging in spectral resolution from 0.07 to 2.1 µm. The Hyperion
sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 µm, with a spectral resolution of 0.10 to 0.11 µm
per band.
11. 4-Temporal resolution
Temporal resolution is the frequency of flyovers by the satellite or aircraft, and is relevant mainly in time-series
studies or those requiring an averaged or mosaic image, as in deforestation monitoring. It was first exploited by
the intelligence community, where repeated coverage revealed changes in infrastructure, the deployment of units
or the modification/introduction of equipment. Cloud cover over a given area or object may make it necessary to
repeat the collection over that location.
12. IMAGE CORRECTIONS:
12.1 Radiometric correction
Radiometric correction removes radiometric errors and distortions. The illumination of objects on the Earth's
surface is uneven because of the different properties of the relief; this factor is taken into account in the method
of radiometric distortion correction. Radiometric correction also gives a physical scale to the pixel values, e.g. a
monochromatic scale of 0 to 255 is converted to actual radiance values.
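A minimal sketch of that rescale, assuming the common linear model L = gain × DN + offset; the gain and offset values here are hypothetical, since real values come from the sensor's calibration metadata:

```python
import numpy as np

# Sketch: convert digital numbers (DN) to at-sensor radiance values.
def dn_to_radiance(dn, gain, offset):
    return gain * dn.astype(np.float32) + offset

dn = np.array([[0, 128], [255, 64]], dtype=np.uint8)  # 0-255 grey scale
radiance = dn_to_radiance(dn, gain=0.75, offset=1.2)  # units assumed W/(m^2 sr um)
print(radiance)
```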
12.2 Topographic correction (also called terrain correction)
In rugged mountains, the effective illumination of pixels varies considerably as a result of the terrain. In a remote
sensing image, a pixel on a shady slope receives weak illumination and has a low radiance value; in contrast,
a pixel on a sunny slope receives strong illumination and has a high radiance value. For the same object, the
pixel radiance value on a shady slope therefore differs from that on a sunny slope; additionally, different
objects may have similar radiance values. These ambiguities seriously affect the accuracy of information
extraction from remote sensing images in mountainous areas and are a main obstacle to the further application of
such images. The purpose of topographic correction is to eliminate this effect, recovering the true reflectivity or
radiance that objects would have under horizontal conditions.
12.3 Atmospheric correction
Atmospheric correction eliminates atmospheric haze by rescaling each frequency band so that its minimum value
(usually realised in water bodies) corresponds to a pixel value of 0, as sketched below. The digitizing of data also
makes it possible to manipulate the data by changing grey-scale values.
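A minimal sketch of this per-band rescaling (often called dark-object subtraction), assuming the bands are stacked as a NumPy array of shape (bands, rows, cols):

```python
import numpy as np

# Sketch: shift each band so that its minimum value maps to 0,
# removing a constant haze offset per band.
def dark_object_subtraction(bands):
    minima = bands.min(axis=(1, 2), keepdims=True)  # per-band minimum
    return bands - minima

bands = np.random.randint(10, 255, size=(3, 4, 4))  # toy 3-band image
corrected = dark_object_subtraction(bands)
print(corrected.min(axis=(1, 2)))  # [0 0 0]
```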


13. VISUAL INTERPRETATION


Analysis of remote sensing imagery involves the identification of various targets in an image; those targets
may be environmental or artificial features consisting of points, lines or areas. Targets may be defined in
terms of the way they reflect or emit radiation. This radiation is measured and recorded by a sensor and is
ultimately depicted as an image product such as an air photo or a satellite image. Observing the differences
between targets and their backgrounds involves comparing different targets based on any, or all, of the visual
elements of tone, shape, size, pattern, texture, shadow and association.
13.1 BASIC ELEMENTS OF VISUAL INTERPRETATION:
1-Tone refers to the relative brightness or colour of objects in an image. Generally, tone is the fundamental element
for distinguishing between different targets or features. Variations in tone also allow the elements of shape,
texture and pattern of objects to be distinguished.
2-Shape refers to the general form, structure, or outline of individual objects. Shape can be a very distinctive
clue for interpretation. Straight edge shapes typically represent urban or agricultural (field) targets, while natural
features, such as forest edges, are generally more irregular in shape, except where man has created a road or clear
cuts. Farm or crop land irrigated by rotating sprinkler systems would appear as circular shapes.
3-Size of objects in an image is a function of scale. It is important to assess the size of a target relative to other
objects in a scene, as well as the absolute size, to aid in the interpretation of that target. A quick approximation
of target size can direct interpretation to an appropriate result more quickly. For example, if an interpreter had to
distinguish zones of land use, and had identified an area with a number of buildings in it, large buildings such as
factories or warehouses would suggest commercial property, whereas small buildings would indicate residential
use.
4-Pattern refers to the spatial arrangement of visibly discernible objects. Typically an orderly repetition of
similar tones and textures will produce a distinctive and ultimately recognizable pattern. Orchards with evenly
spaced trees and urban streets with regularly spaced houses are good examples of pattern.
5-Texture refers to the arrangement and frequency of tonal variation in particular areas of an image. Rough
textures consist of a mottled tone where the grey levels change abruptly in a small area, whereas smooth textures
have very little tonal variation. Smooth textures are most often the result of uniform, even
surfaces, such as fields, asphalt or grasslands. A target with a rough surface and irregular structure, such as a
forest canopy, results in a rough-textured appearance. Texture is one of the most important elements for
distinguishing features in radar imagery.
6-Shadow is also helpful in interpretation as it may provide an idea of the profile and relative height of a target
or targets which may make identification easier. However, shadows can also reduce or eliminate interpretation
in their area of influence, since targets within shadows are much less (or not at all) discernible from their
surroundings. Shadow is also useful for enhancing or identifying topography and landforms, particularly in
radar imagery.
7-Association takes into account the relationship between other recognizable objects or features in proximity to
the target of interest. The identification of features that one would expect to associate with other features may
provide information to facilitate identification. In the example given above, commercial properties may be
associated with proximity to major transportation routes, whereas residential areas would be associated with
schools, playgrounds, and sports fields. In our example, a lake is associated with boats, a marina, and adjacent
recreational land.
14. SENSOR PLATFORMS / SATELLITES:
Earth resource satellites:
There are three distinct groups of Earth resource satellites: the first group records visible
and near-visible wavelengths; the second group records thermal infrared wavelengths; the third group
carries sensors that record microwave wavelengths.
Landsat Satellite program:
The National Aeronautics and Space Administration (NASA) renamed the ERTS (Earth Resources Technology
Satellite) program as the Landsat program, to distinguish it from its series of meteorological and oceanographic
satellites.
Landsat images have found a large number of applications, in fields such as
agriculture, cartography, geography, geology, land use planning and forestry.
SPOT Satellite program:
France, Sweden and Belgium joined together to launch SPOT-1 from French Guiana on February 21, 1986.
The high resolution data obtained from the SPOT sensors, namely the High Resolution Visible (HRV) instruments,
have been extensively used in urban planning, urban growth assessment, transportation planning, etc.

Indian Remote Sensing satellite (IRS)


IRS has application potential in a wide range of disciplines such as management of Agricultural resources,
Inventory of forest resources, Geological mapping, Estimation of water resources and water quality survey.
AEM satellites (Applications Explorer Mission):
Heat Capacity Mapping Mission (HCMM) satellite: the first of NASA's small, relatively inexpensive Applications
Explorer Mission (AEM) satellites. Its images have found wide application in vegetation mapping, vegetation stress
detection, microclimatology, soil moisture mapping and the monitoring of industrial thermal pollution.
Meteorological satellites
These are designed specifically for weather prediction and monitoring.
a) NOAA satellites: operated by the National Oceanic and Atmospheric Administration; Sun-synchronous; carry
the AVHRR (Advanced Very High Resolution Radiometer). Used extensively for studies of vegetation dynamics,
flood monitoring, regional soil moisture analysis, dust and sandstorm monitoring, forest wildfire mapping and
sea surface temperature mapping.

b) GOES satellites: the Geostationary Operational Environmental Satellites (GOES) are used for local weather
forecasting, regional snow cover mapping, etc.

c) NIMBUS satellites: launched in 1978; carry the Coastal Zone Colour Scanner (CZCS), designed to measure
ocean parameters such as surface temperature and the chlorophyll and suspended solids of near-shore and
coastal waters.

d) Meteosat series: used for meteorological applications such as studies of synoptic climatology, sea surface
temperature, land surface temperature and the monitoring of all types of disasters.

e) Satellites carrying microwave sensors: the clear advantage of a microwave sensor is its capacity to penetrate
cloud cover. Examples: Seasat with its Synthetic Aperture Radar (SAR), the European Remote Sensing Satellite
(ERS)-1 and Radarsat.

IKONOS satellite series: a high resolution satellite launched on September 24, 1999; Sun-synchronous, with an
equatorial crossing time of 10:30 AM and a revisit time of 11 days. It provides a spatial resolution of 1 m; the
panchromatic and multispectral bands can be combined to produce pan-sharpened multispectral imagery with an
effective resolution of 1 m, with data collected over 2048 grey scales. IKONOS data have been extensively used
for urban growth assessment studies, municipal planning, utility management, Cadastral Information Systems
(CIS), etc.
QuickBird: launched on 18 October 2001; a current high resolution commercial satellite with 0.6 m resolution and
a 16.5 km swath. The imagery acquired is used for a wide range of applications, such as the management of land,
infrastructure and natural resources.
CARTOSAT-1: launched by ISRO's PSLV-C6 (Polar Satellite Launch Vehicle) from SDSC (Satish Dhawan
Space Centre), Sriharikota, on May 5, 2005. CARTOSAT-1 carries two panchromatic cameras, with a spatial
resolution of 2.5 m and a swath cover of 30 km.
Resourcesat-1:
It is conceptualized and designed to provide continuity in operational remote sensing with superior capabilities.
Its main objective is to provide continued remote sensing data for integrated land and water management and for
agriculture and its related applications.

Swath Width: The strip of the Earth's surface from which geographic data are collected by a moving vehicle
such as a satellite, aircraft or ship in the course of swath mapping.
Nadir: It refers to the downward-facing viewing geometry of an orbiting satellite, such as is employed
during remote sensing of the atmosphere
