Remote Sensing Studies Notes - Short
With rapid technological progress in remote sensing, it has become possible to
study a wide range of processes on the earth's surface. Earth-observing satellites
can produce synoptic views of the earth and generate a wealth of information.
Remote sensing offers this perspective and allows a researcher to examine other
reference and ancillary data simultaneously and synergistically. The nature and
pattern of deformation that the earth has undergone are clearly displayed in
satellite images, enabling us to study them in detail.
Certain remote sensing devices offer unique information on structures, such as the
relief expression provided by radar sensors. A benefit of side-looking radar is that the
illumination conditions can be controlled, so the most appropriate geometry can be
used for the type of terrain being examined.
1. Energy Source or Illumination (A) – the first requirement for remote sensing is to
have an energy source which illuminates or provides electromagnetic energy to the
target of interest.
2. Radiation and the Atmosphere (B) – as the energy travels from its source to the
target, it will come in contact with and interact with the atmosphere it passes through.
This interaction may take place a second time as the energy travels from the target to
the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target through
the atmosphere, it interacts with the target depending on the properties of both the target
and the radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or
emitted from the target, we require a sensor (remote - not in contact with the target) to
collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor
has to be transmitted, often in electronic form, to a receiving and processing station
where the data are processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or
digitally or electronically, to extract information about the target which was illuminated.
7. Application (G) - the final element of the remote sensing process is achieved when
we apply the information we have been able to extract from the imagery about the target
in order to better understand it, reveal some new information, or assist in solving a
particular problem.
Electromagnetic Radiation : The first requirement for remote sensing is to
have an energy source to illuminate the target (unless the sensed energy is
being emitted by the target).
Electromagnetic Spectrum:
For most purposes, the ultraviolet or UV portion of the spectrum has the
shortest wavelengths practical for remote sensing. The light which our eyes - our "remote sensors" - can
detect is part of the visible spectrum. The visible wavelengths cover a range
from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the
shortest is violet. Common wavelengths of what we perceive as particular colours
from the visible portion of the spectrum are listed below. It is important to note
that this is the only portion of the spectrum we can associate with the concept of
colours.
The next portion of the spectrum of interest is the infrared (IR) region which can
be divided into two categories based on their radiation properties - the reflected
IR, and the emitted or thermal IR. Radiation in the reflected IR region is used for
remote sensing purposes in ways very similar to radiation in the visible portion.
The portion of the spectrum of more recent interest to remote sensing is the
microwave region from about 1 mm to 1 m. This covers the longest
wavelengths used for remote sensing. The shorter wavelengths have properties
similar to the thermal infrared region while the longer wavelengths approach the
wavelengths used for radio broadcasts.
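The wavelength limits of the microwave region quoted above can be expressed as frequencies using f = c / λ. A minimal sketch (function name and the rounded speed of light are my own choices):

```python
# Convert the wavelength limits of the microwave region to frequency (f = c / wavelength).
C = 3.0e8  # speed of light in m/s (approximate)

def wavelength_to_frequency_ghz(wavelength_m):
    """Return frequency in GHz for a given wavelength in metres."""
    return C / wavelength_m / 1e9

print(wavelength_to_frequency_ghz(1e-3))  # 1 mm -> 300 GHz
print(wavelength_to_frequency_ghz(1.0))   # 1 m  -> 0.3 GHz (300 MHz)
```

So the 1 mm to 1 m microwave region corresponds to roughly 300 GHz down to 300 MHz, the latter indeed approaching radio-broadcast frequencies.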
Interactions with the Atmosphere : Before radiation (used for remote sensing)
reaches the Earth's surface it has to travel through some distance of the Earth's
atmosphere. Particles and gases in the atmosphere can affect the incoming light
and radiation. These effects are caused by the mechanisms of scattering and
absorption.
Rayleigh scattering occurs when particles are very small compared to the
wavelength of the radiation. These could be particles such as small specks of
dust or nitrogen and oxygen molecules. Rayleigh scattering causes shorter
wavelengths of energy to be scattered much more than longer wavelengths.
Rayleigh scattering is the dominant scattering mechanism in the upper
atmosphere.
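Rayleigh scattering intensity varies as 1/λ⁴, which is why shorter (blue) wavelengths dominate the scattered skylight. A quick sketch of the ratio (the 0.45 µm and 0.65 µm values are illustrative choices for blue and red):

```python
# Rayleigh scattering intensity varies as 1/wavelength**4, so blue light
# (~0.45 um) is scattered far more strongly than red light (~0.65 um),
# which is why the cloud-free daytime sky appears blue.
def rayleigh_ratio(short_um, long_um):
    """Relative scattering strength of the shorter vs the longer wavelength."""
    return (long_um / short_um) ** 4

print(round(rayleigh_ratio(0.45, 0.65), 2))  # blue is scattered roughly 4x more than red
```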
Mie scattering occurs when the particles are just about the same size as the
wavelength of the radiation. Dust, pollen, smoke and water vapour are common
causes of Mie scattering which tends to affect longer wavelengths than those
affected by Rayleigh scattering. Mie scattering occurs mostly in the lower
portions of the atmosphere where larger particles are more abundant, and
dominates when cloud conditions are overcast.
Nonselective scattering occurs when the particles are much larger than the
wavelength of the radiation. Water droplets and large dust particles can
cause this type of scattering. Nonselective scattering gets its name from the fact
that all wavelengths are scattered about equally.
Atmospheric Window
Those areas of the spectrum which are not severely influenced by atmospheric
absorption, and are therefore useful to remote sensors, are called atmospheric windows.
The visible portion of the spectrum, to which our eyes are most sensitive,
corresponds to both an atmospheric window and the peak energy level of the
sun. Energy emitted by the Earth corresponds to a window around 10 µm in the
thermal IR portion of the spectrum. The large window at wavelengths beyond 1
mm is associated with the microwave region.
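The match between emitted energy and these windows follows from Wien's displacement law, λ_max = b / T with b ≈ 2898 µm·K: the ~300 K Earth peaks near 10 µm (the thermal-IR window above), while the ~6000 K Sun peaks in the visible. A sketch (constant rounded, temperatures illustrative):

```python
# Wien's displacement law: peak emission wavelength of a blackbody at
# temperature T is lambda_max = b / T, with b ~= 2898 um*K.
WIEN_B = 2898.0  # um * K (approximate displacement constant)

def peak_wavelength_um(temperature_k):
    """Peak emission wavelength (micrometres) of a blackbody at T kelvin."""
    return WIEN_B / temperature_k

print(round(peak_wavelength_um(300), 1))   # Earth (~300 K): ~9.7 um, thermal IR
print(round(peak_wavelength_um(6000), 2))  # Sun (~6000 K): ~0.48 um, visible
```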
There are three (3) forms of interaction that can take place when energy strikes,
or is incident (I) upon the surface. These are:
Absorption (A), Transmission (T), and Reflection (R).
The total incident energy will interact with the surface in one or more of these
three ways. The proportions of each will depend on the wavelength of the energy
and the material and condition of the feature.
When a surface is smooth we get specular or mirror-like reflection where all (or
almost all) of the energy is directed away from the surface in a single direction.
Diffuse reflection occurs when the surface is rough and the energy is reflected
almost uniformly in all directions.
If the wavelengths are much smaller than the surface variations or the particle
sizes that make up the surface, diffuse reflection will dominate. Most earth
surface features lie somewhere between perfectly specular or perfectly diffuse
reflectors.
Leaves strongly absorb radiation in the red and blue wavelengths but reflect
green wavelengths, producing their green appearance. Water absorbs longer-wavelength
radiation more strongly than shorter visible wavelengths; thus water typically looks
blue or blue-green due to stronger reflectance at these shorter wavelengths.
Active sensors, on the other hand, provide their own energy source for
illumination. Some examples of active sensors are a laser and a synthetic
aperture radar (SAR).
Geostationary Satellite
Sun-synchronous orbits: Many of these satellite orbits are also sun-
synchronous such that they cover each area of the world at a constant local time
of day called local sun time. At any given latitude, the position of the sun in the
sky as the satellite passes overhead will be the same within the same season.
Sensor Technology
CCD Detector
Resolution
Spatial Resolution
Spectral Resolution
Radiometric Resolution
Temporal Resolution
The temporal resolution of a satellite is the time required to complete one entire
orbit cycle. The revisit period of a satellite sensor is usually several days.
Therefore the absolute temporal resolution of a remote sensing system, the time
needed to image the exact same area at the same viewing angle a second time, is
equal to this period.
Thermal Imaging
All matter on the earth radiates energy at thermal infrared wavelengths (3 µm to
15 µm), both day and night. Thermal sensors use photodetectors, sensitive to
the direct contact of photons on their surface, to detect emitted thermal radiation.
The detectors are cooled to temperatures close to absolute zero (0 K) in order to
limit their own thermal emissions.
The thermal inertia of water is similar to that of soils and rocks, but in
daytime, water bodies have a cooler surface temperature than soils and
rocks. At night the relative surface temperatures are reversed, so that
water is warmer than soils and rocks.
If water bodies have warm signatures relative to the adjacent terrain, the
image was acquired at night. Whereas, relatively cool water bodies
indicate daytime imagery. Damp soil is cooler than dry soil, both day and
night.
Land cover classification and mapping
Estimate sea surface temperatures
Estimate soil moisture
Monitor plant stress
Detect ground water and geological structures and materials
Detect and map thermal discharges
Measure heat loss of buildings
Assess urban heat island effects
Map forest fires
Monitor volcanic activity
RADAR Basics
A radar is essentially a ranging or distance measuring device. It consists
fundamentally of a transmitter, a receiver, an antenna, and an electronics system
to process and record the data. The transmitter generates successive short
bursts (or pulses) of microwave energy (A) at regular intervals, which are focused by the
antenna into a beam (B). The radar beam illuminates the surface obliquely at a
right angle to the motion of the platform.
The antenna receives a portion of the transmitted energy reflected (or
backscattered) from various objects within the illuminated beam (C). By
measuring the time delay between the transmission of a pulse and the reception
of the backscattered "echo" from different targets, their distance from the radar
and thus their location can be determined. As the sensor platform moves
forward, recording and processing of the backscattered signals builds up a two-
dimensional image of the surface.
The microwave region of the spectrum is quite large, relative to the visible and
infrared, and there are several wavelength ranges or bands commonly used.
Slant range is the direct distance from the radar to a target, while ground range
is the horizontal distance between the emitter and its target; calculating ground
range requires knowledge of the target's elevation. Since the waves travel to a
target and back, the round-trip time is divided by two in order to obtain the
time the wave took to reach the target.
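The slant range for a measured echo delay therefore follows directly from R = c·t / 2. A minimal sketch (function name and the 100 µs example delay are my own):

```python
# Slant range from a measured echo delay: the pulse travels out and back,
# so the one-way distance is c * t / 2.
C = 3.0e8  # speed of light, m/s (approximate)

def slant_range_m(round_trip_s):
    """One-way slant range in metres for a given round-trip echo delay in seconds."""
    return C * round_trip_s / 2.0

print(slant_range_m(100e-6))  # a 100-microsecond echo -> 15000 m (15 km)
```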
Spatial resolution in both the range (look) direction and azimuth (flight) direction
is determined by the engineering characteristics of the radar system. Depression
angle is defined as the angle between a horizontal plane and a beam from the
antenna to a target on the ground. The depression angle is steeper at the near
range side of an image strip and shallower at the far-range side. The average
depression angle is measured for a beam to the midline of an image strip.
Incidence angle is defined as the angle between a radar beam and a line
perpendicular to the surface.
Spatial Resolution
The spatial resolution of a radar image is determined by the dimensions of the
ground resolution cell, which are controlled by the combination of range resolution
and azimuth resolution.
Range Resolution
Range resolution (Rr), or resolution in the radar look direction, is determined
by the depression angle and by the pulse length. Pulse length (τ) is the duration
of the transmitted pulse and is measured in microseconds. It is converted from
time into distance by multiplying by the speed of electromagnetic radiation.
Targets A and B are not resolved because they are closer together (20 m) than
the range resolution distance. They are within a single ground resolution cell and
cannot be separated on the image.
Targets C and D are also spaced 20 m apart but are imaged with a depression
angle of 35°. For this depression angle, range resolution is calculated as 18 m.
Targets C and D are resolved because they are more widely spaced than the
ground resolution cell.
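The usual range-resolution formula is Rr = τ·c / (2·cos γ), with γ the depression angle. A sketch reproducing the 18 m figure above; the 0.1 µs pulse length is an assumption consistent with that figure, and the 50° depression angle for targets A and B is purely illustrative (not given in the text):

```python
import math

C = 3.0e8  # speed of light, m/s (approximate)

def range_resolution_m(pulse_s, depression_deg):
    """Ground-range resolution: Rr = (pulse length * c) / (2 * cos(depression angle))."""
    return (pulse_s * C) / (2.0 * math.cos(math.radians(depression_deg)))

# Assuming a 0.1-microsecond pulse (consistent with the 18 m figure in the text):
print(round(range_resolution_m(0.1e-6, 35), 1))  # ~18.3 m: 20 m targets C, D are resolved
print(round(range_resolution_m(0.1e-6, 50), 1))  # ~23.3 m: 20 m apart is not resolvable
```

Note how the steeper depression angle gives a coarser ground-range resolution, matching the resolved/unresolved contrast between the two target pairs.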
One method of improving range resolution is to shorten the pulse length,
but this reduces the total amount of energy in each transmitted pulse.
Azimuth Resolution
Azimuth resolution (Ra), or resolution in the azimuth direction, is determined
by the width of the terrain strip illuminated by the radar beam. To be resolved,
targets must be separated in the azimuth direction by a distance greater than the
beam width as measured on the ground. The fan-shaped beam is narrower in the
near range than in the far range, causing azimuth resolution to be smaller in the
near-range portion of the image.
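For a real-aperture radar, the ground beam width, and hence Ra, is commonly approximated as slant range times wavelength over antenna length (some texts include an additional ~0.7 beam-width factor, omitted here). The X-band wavelength, antenna length and ranges below are illustrative assumptions:

```python
# Azimuth resolution of a real-aperture radar grows with slant range:
# Ra ~ S * wavelength / D for an antenna of length D (approximation).
def azimuth_resolution_m(slant_range_m, wavelength_m, antenna_len_m):
    """Approximate azimuth resolution: radar beam width on the ground at slant range S."""
    return slant_range_m * wavelength_m / antenna_len_m

# Hypothetical X-band system (3 cm wavelength, 5 m antenna):
print(azimuth_resolution_m(10_000, 0.03, 5.0))  # near range (10 km): 60 m
print(azimuth_resolution_m(20_000, 0.03, 5.0))  # far range (20 km): 120 m
```

The doubling from near to far range shows why azimuth resolution degrades across the swath, and why synthetic-aperture processing (below) is needed for fine resolution from orbit.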
Slant Range Distortion
Shadowing
SAR Image
A digital SAR image can be seen as a mosaic (i.e. a two-dimensional array
formed by columns and rows) of small picture elements (pixels). Each pixel is
associated with a small area of the Earth’s surface (called a resolution cell).
Each pixel gives a complex number that carries amplitude and phase
information about the microwave field backscattered by all the scatterers (rocks,
vegetation, buildings etc.) within the corresponding resolution cell projected on
the ground.
Different rows of the image are associated with different azimuth locations,
whereas different columns indicate different slant range locations.
The radiation transmitted from the radar has to reach the scatterers on the
ground and then come back to the radar in order to form the SAR image (two-
way travel). Scatterers at different distances from the radar (different slant
ranges) introduce different delays between transmission and reception of the
radiation.
Due to the almost purely sinusoidal nature of the transmitted signal, this delay
is equivalent to a phase change between transmitted and received signals.
The phase change is thus proportional to the two-way travel distance 2R of the
radiation divided by the transmitted wavelength λ.
In other words, the phase of the SAR signal is a measure of just the last fraction
of the two-way travel distance that is smaller than the transmitted wavelength.
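This proportionality is usually written φ = (4π/λ)·R, wrapped modulo 2π. A sketch illustrating the wrapping (the C-band wavelength and ranges are illustrative assumptions):

```python
import math

def sar_phase_rad(slant_range_m, wavelength_m):
    """Phase of the received echo: (4*pi/lambda) * R, wrapped to [0, 2*pi)."""
    return (4.0 * math.pi / wavelength_m) * slant_range_m % (2.0 * math.pi)

# Two targets whose slant ranges differ by half a wavelength produce (nearly)
# the same phase, since the two-way path then differs by a full wavelength:
wavelength = 0.056  # ~5.6 cm, typical of C-band (illustrative value)
print(sar_phase_rad(100.0, wavelength))
print(sar_phase_rad(100.0 + wavelength / 2, wavelength))
```

This ambiguity is exactly why a single SAR phase measures only "the last fraction" of the travel distance; interferometry (below) exploits phase *differences* instead.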
A Synthetic Aperture Radar (SAR) is a coherent side-looking airborne system
which uses the flight path of the platform to simulate an extremely large antenna
or aperture electronically, generating high-resolution remote sensing imagery.
The signal processing uses magnitude and phase of the received signals over
successive pulses from elements of a synthetic aperture. After a given number of
cycles, the stored data is recombined (taking into account the Doppler effects
inherent in the different transmitter to target geometry in each succeeding cycle)
to create a high-resolution image of the terrain being overflown.
Interferometry
Interferometric SAR or InSAR, allows accurate measurements of the radiation
travel path because it is coherent.
Measurements of travel path variations as a function of the satellite position and
time of acquisition allow generation of Digital Elevation Models (DEM) and
measurement of centimetric surface deformations of the terrain.
A satellite SAR can observe the same area from slightly different look angles.
This can be done either simultaneously (with two radars mounted on the same
platform) or at different times by exploiting repeated orbits of the same satellite.
The distance between the two satellites (or orbits) in the plane
perpendicular to the orbit is called the interferometer baseline and its
projection perpendicular to the slant range is the perpendicular baseline.
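The perpendicular baseline is conventionally obtained by projecting the baseline onto the direction perpendicular to the slant range, B⊥ = B·cos(θ − α), where θ is the look angle and α the baseline orientation angle. A sketch under that convention (all numeric values illustrative):

```python
import math

def perpendicular_baseline_m(baseline_m, baseline_angle_deg, look_angle_deg):
    """Project the baseline perpendicular to the slant-range (look) direction:
    B_perp = B * cos(theta - alpha), with theta the look angle and alpha the
    baseline orientation angle measured from horizontal."""
    return baseline_m * math.cos(math.radians(look_angle_deg - baseline_angle_deg))

# Illustrative case: a 150 m horizontal baseline viewed at a 23-degree look angle:
print(round(perpendicular_baseline_m(150.0, 0.0, 23.0), 1))  # ~138.1 m
```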