
Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
INTRODUCTION
Remote Sensing is the science and art of obtaining information about an
object, area or phenomenon through the analysis of data acquired by a device
that is not in contact with the object, area or phenomenon under investigation.

Examples of remote sensing devices: the human eye, sonar, and the gravity meter.
REMOTE SENSING OF ELECTROMAGNETIC ENERGY
“Remote sensing is detecting and measuring electromagnetic energy emanating or reflected from distant objects made of various materials, so that we can identify and categorize these objects by class or type, substance and spatial distribution.” [American Society of Photogrammetry, 1975]
Remote Sensing is “a technology for sampling electromagnetic radiation to acquire and interpret non-immediate geospatial data from which to extract information about features and objects on the Earth’s land surface, oceans, and atmosphere.” - Dr. Nicholas Short

➢ Remote sensing of electromagnetic energy is used for earth observation
➢ Surface parameters are inferred through the measurement and interpretation of the electromagnetic energy / radiation from the Earth’s surface
➢ Variation in electromagnetic energy can be measured using photographic or non-photographic sensors

http://geoportal.icimod.org
STAGES OF REMOTE SENSING

• Stages of remote sensing:
• A: the energy source which illuminates or provides electromagnetic radiation (EMR) to the target (sun / self-emission)
• B: transmission of energy from the source to the earth’s surface, and absorption and scattering
• C: interaction of EMR with the earth’s surface: reflection and emission
• D: transmission of energy from the surface to the sensor
• E: recording of the energy at the sensor (photographic or non-photographic)
• F: transmission of the recorded information to the ground station
• G: processing of the data into a digital or hard copy image
• H: analysis of data

ELECTROMAGNETIC RADIATION (EMR)

 Every substance in the universe having temperature above absolute zero radiates electromagnetic energy. Remote sensing detects and measures electromagnetic energy from distant objects made of various materials, to identify and categorize those objects by class or type, substance and spatial distribution.
 Travels with the velocity of light
 Visible light, ultraviolet rays, infrared, heat, radio waves and x-rays are different forms
 Shorter wavelengths have higher energy content and longer wavelengths have lower energy content
ELECTROMAGNETIC RADIATION (EMR)

 Electromagnetic energy or electromagnetic radiation (EMR) is energy propagated in the form of an advancing interaction between electric and magnetic fields (Sabins, 1978), expressed in terms of frequency (f) or wavelength (λ) of radiation:

E = h·f = h·c / λ

where, h = Planck's constant (6.626 × 10^-34 joule-sec)
c = speed of light (3 × 10^8 m/sec)
f = frequency expressed in hertz (Hz)
λ = wavelength in micrometers (µm)
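As a quick numerical illustration (not part of the original slides), the following short Python sketch evaluates E = h·c/λ for a few representative wavelengths, using the constants quoted above; the wavelength choices are illustrative.

# Photon energy E = h*c/lambda, using the constants quoted above.
h = 6.626e-34   # Planck's constant, J*s
c = 3.0e8       # speed of light, m/s

for name, wavelength_um in [("blue", 0.45), ("red", 0.65),
                            ("thermal IR", 10.0), ("microwave", 1.0e5)]:
    wavelength_m = wavelength_um * 1.0e-6      # micrometers -> meters
    energy_j = h * c / wavelength_m            # joules per photon
    print(f"{name:10s} {wavelength_um:10.2f} um  E = {energy_j:.3e} J")

The output confirms the bullet above: the shorter the wavelength, the higher the energy content.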
ELECTROMAGNETIC SPECTRUM

➢ Distribution of the continuum of energy plotted as a function of wavelength (or frequency)
➢ Gamma rays, X-rays and most of the UV rays are mostly absorbed by the earth’s atmosphere and hence not used in remote sensing
➢ Most of the remote sensing systems operate in the visible, infrared (IR) and microwave regions
➢ Some systems use the long-wave portion of the UV spectrum
Passive/ Active Remote Sensing

➢ Depending on the source of electromagnetic energy, remote sensing can be classified as:
➢ Passive: the source of energy is naturally available, such as the Sun. It is similar to taking a picture with an ordinary camera.
➢ Active: energy is generated and sent from the remote sensing platform towards the targets. It is analogous to taking a picture with a camera having a built-in flash.
Passive/ Active Remote Sensing
➢ PASSIVE REMOTE SENSING

• Passive Remote Sensing: Source of energy is naturally available
– Solar energy
– Energy emitted by the Earth, etc.
• Most remote sensing systems work in passive mode using solar energy
– Solar energy reflected by the targets at specific bands is recorded using sensors
– To ensure ample signal strength at the sensor, wavelengths capable of traversing the atmosphere without significant loss are generally used
• The Earth also emits some radiation, since its ambient temperature is about 300 K
– Passive sensors can also be used to measure the Earth’s radiance
– Not very popular, as the energy content is very low
Passive/ Active Remote Sensing
➢ ACTIVE REMOTE SENSING

• Active remote sensing: Energy is generated and emitted from the sensing platform towards the targets
• The energy reflected back by the targets is recorded
• Longer wavelength bands are used
• Example: Active microwave remote sensing (radar)
– Pulses of microwave signals are sent towards the target from the radar antenna located on the air- / space-borne platform
– The energy reflected back (echoes) is recorded at the sensor

Basic components of an ideal Remote Sensing System

1) A uniform energy source
2) A non-interfering atmosphere
3) A series of unique energy/matter interactions at the Earth's surface
4) A super sensor
5) A real-time data handling system
6) Multiple data users

Basic components of an ideal Remote Sensing System
1) Energy source:
Ideal: Constant EMR over all wavelengths
Real: Usually non-uniform over different wavelengths and varies with time and space; varies with the source of energy and earth surface features. Needs calibration for source characteristics.
2) A non-interfering atmosphere:
Ideal: Does not modify the EMR transmitted through it
Real: Atmospheric interaction varies with the wavelength, sensor used and application. Needs calibration to eliminate or diminish these atmospheric effects.
3) Energy/matter interactions at the Earth's surface:
Ideal: Generates reflected / emitted signals that depend on wavelength and on the target earth surface feature
Real: Similar spectral signatures for different targets lead to difficult or error-prone identification; there is a lack of complete understanding of the energy/matter interactions for surface features.
Basic components of an ideal Remote Sensing System
4) Super sensor:
Ideal: Simple, highly sensitive in all wavelengths, accurate and cost-effective. Yields data on the absolute brightness (or radiance) from a scene as a function of wavelength.
Real: Not sensitive to all wavelengths; limited efficiency in recording spatial details.
5) Real-time data handling system:
Ideal: Processes the radiance-versus-wavelength response into an interpretable format in real time.
Real: Real-time data handling is almost impossible, as human intervention is necessary for processing sensor data.
6) Multiple data users:
Ideal: Possess knowledge of RS techniques in their respective disciplines to apply the procured information.
Real: Users need a thorough understanding of the problem, wide knowledge of data generation, and skill in data interpretation to make the best use of the data.
Applications of Remote Sensing

• Major advantages of remote sensing are


➢ Provides data for large areas
➢ Provides data of very remote and inaccessible regions
➢ Able to obtain imagery of any area over a continuous period of time
– Possible to monitor any anthropogenic or natural changes in the landscape
➢ Relatively inexpensive when compared to employing a team of surveyors
➢ Easy and rapid collection of data
➢ Rapid production of maps for interpretation

Applications of Remote Sensing

• Some of the drawbacks of remote sensing are


➢ The interpretation of imagery requires a certain skill level
➢ Needs cross verification with ground (field) survey data
➢ Data from multiple sources may create confusion
➢ Objects can be misclassified or confused
➢ Distortions may occur in an image due to the relative motion of sensor and source

Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
Electromagnetic Energy
• Electromagnetic energy: all energy moving in a harmonic sinusoidal wave pattern with a velocity equal to that of light
– A harmonic pattern means waves occurring at equal intervals of time.
• Contains both electric and magnetic components, which oscillate
– Perpendicular to each other and
– Perpendicular to the direction of energy propagation
• It can be detected only through its interaction with matter.
– Examples: light, heat, etc.

ELECTROMAGNETIC RADIATION (EMR)

 Every substance in the universe having temperature above absolute zero radiates electromagnetic energy.
Remote sensing detects and measures electromagnetic energy from distant objects made of various
materials, to identify and categorize those objects by class or type, substance and spatial distribution.

 Electromagnetic energy or electromagnetic radiation (EMR) is energy propagated in the form of an advancing interaction between electric and magnetic fields (Sabins, 1978), expressed in terms of frequency (f) or wavelength (λ) of radiation:

E = h·f or h·c / λ
**Since, in wave theory, c = λ·f

where, h = Planck's constant (6.626 × 10^-34 joule-sec)
c = speed of light (3 × 10^8 m/sec)
f = frequency expressed in hertz (Hz)
λ = wavelength in micrometers (µm)
➢ Shorter wavelengths have higher energy content and longer wavelengths have lower energy content
Electromagnetic Energy
Characteristics of electromagnetic (EM) energy – wave theory
Velocity (c)
EM waves travel at the speed of light (3 × 10^8 m/s).
Wavelength (λ)
Distance from any point of one wave to the same position on the next wave.
The wavelengths commonly used in remote sensing are very small.
It is normally expressed in micrometers (1 μm = 1 × 10^-6 m).
In remote sensing, EM waves are categorized in terms of their wavelength location in the EMR spectrum.
Frequency (f)
Number of waves passing a fixed point per unit time. It is expressed in hertz (Hz).
Electromagnetic Spectrum
Region | Wavelength (μm) | Remarks
Gamma rays | < 3×10^-5 | Not available for remote sensing; incoming radiation is absorbed by the atmosphere
X-rays | 3×10^-5 – 3×10^-3 | Not available for remote sensing; incoming radiation is absorbed by the atmosphere
Ultraviolet (UV) rays | 0.03 – 0.4 | Wavelengths < 0.3 μm are absorbed by the ozone layer; wavelengths between 0.3–0.4 μm are transmitted and termed the “photographic UV band”
Visible | 0.4 – 0.7 | Detectable with film and photodetectors
Infrared (IR) | 0.7 – 100 | Specific atmospheric windows allow maximum transmission; the photographic IR band (0.7–0.9 μm) is detectable with film; principal atmospheric windows exist in the thermal IR region (3–5 μm and 8–14 μm)
Microwave | 10^3 – 10^6 | Can penetrate rain, fog and clouds; both active and passive remote sensing are possible; radar uses wavelengths in this range
Radio | > 10^6 | The longest wavelengths; used for remote sensing radars
Energy Sources and Radiation Principle - Solar Radiation
• Sun is the primary source of energy
that illuminates features on the earth
surface
• SOLAR RADIATION
➢ Solar radiation (insolation) arrives at the earth at different wavelengths
➢ The amount of energy the sun produces is not uniform across all wavelengths
‒ Almost 99% is within the range of 0.28–4.96 μm
‒ Within this range, 43% is radiated in the visible region between 0.4–0.7 μm
‒ Maximum energy (E) is available at 0.48 μm wavelength (visible green)
[Figure: Irradiance distribution of Sun and Earth (http://www.csulb.edu)]
Irradiance: Power of electromagnetic radiation per unit area incident on a surface
Radiation from the Earth
• The Earth and terrestrial objects also emit electromagnetic radiation
‒ All matter at a temperature above absolute zero (0 K or −273 °C) emits electromagnetic radiation continuously
‒ The amount of radiation from such objects is a function of the temperature of the object
‒ Applicable for objects that behave as a blackbody
• Ambient temperature of the earth ~ 300 K
‒ Emits thermal IR radiation
‒ Maximum exitance in the region of 9.7 μm
‒ Can be sensed using scanners and radiometers
Blackbody Radiation

• Blackbody : A hypothetical, ideal radiator that absorbs


and re-emits the entire energy incident upon it
• Spectral distribution or spectral curve : Energy
distribution over different wavelengths for different
temperature

➢ Area under the spectral curve for any


temperature = Total radiant exitance at that
temperature
➢ As the temperature increases, the total radiant exitance increases, and hence so does the area under the curve
➢ The spectral curves represent the Stefan-Boltzmann law graphically (total exitance M = σT^4)
Blackbody Radiation
• The wavelength at which radiant exitance peaks varies with temperature (Wien’s displacement law)

• Solar radiation
– The Sun’s temperature is around 6000 K
– In the spectral curve at 6000 K, the visible part of the energy (0.4–0.7 μm) dominates
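A small sketch (standard physics, not from the slides) connecting the two laws mentioned here: the Stefan-Boltzmann law for total radiant exitance, M = σT^4, and Wien's displacement law for the peak wavelength, λ_max = b/T.

# Stefan-Boltzmann law (area under the spectral curve) and Wien's
# displacement law (wavelength of peak exitance) for Sun and Earth.
SIGMA = 5.670e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2898.0     # Wien's displacement constant, um*K

for body, temp_k in [("Sun", 6000.0), ("Earth", 300.0)]:
    exitance = SIGMA * temp_k ** 4      # total radiant exitance, W/m^2
    peak_um = WIEN_B / temp_k           # wavelength of maximum exitance, um
    print(f"{body}: M = {exitance:.3e} W/m^2, peak at {peak_um:.2f} um")

The computed peaks, about 0.48 μm for a 6000 K source and about 9.7 μm for a 300 K source, match the values quoted earlier in these slides.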
INTERACTIONS WITH THE ATMOSPHERE
EMR has to travel through some distance of the Earth's atmosphere before it reaches the Earth’s surface. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.

Scattering occurs when particles present in the atmosphere interact with the EMR and cause it to be redirected from its original path. It depends on the wavelength of the radiation, the abundance of particles or gases, and the distance the radiation travels through the atmosphere.

Absorption, in contrast to scattering, occurs when molecules in the atmosphere absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric constituents which absorb radiation.
INTERACTIONS WITH THE ATMOSPHERE

Atmospheric windows
• Gases absorb electromagnetic energy in very specific regions of the spectrum; they influence where (in the spectrum) we can look for remote sensing purposes
• Those areas of the spectrum that are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows
• Selective wavelength bands are used in remote sensing
• Electromagnetic energy interacts with the atmospheric gases and particles
– Scattering and absorption
– The atmosphere absorbs / backscatters a fraction of the energy and transmits the remainder
• Atmospheric windows: Wavelength regions through which most of the energy is transmitted through the atmosphere
INTERACTIONS WITH THE ATMOSPHERE
[Figure: Atmospheric windows]
Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
Interactions with the Atmosphere
• Atmosphere: The gaseous envelope that surrounds the Earth’s surface; most of the gases are concentrated within the lower 100 km of the atmosphere.
• Interactions of the direct solar radiation, and of the radiation reflected from the target, with the atmospheric constituents interfere with the process of remote sensing and are called “atmospheric effects”.
• Information carried by EMR reflected/emitted by the earth’s surface is modified while traversing through the atmosphere.
• The interaction of EMR with the atmosphere can also be used to obtain useful information about the atmosphere itself.
• The gases and particles present in the atmosphere cause scattering and absorption of the electromagnetic radiation passing through it, modulating the radiation reflected from the target by attenuating it and changing its spatial distribution.
Interactions with the Atmosphere
• From the source to the sensor, the radiation passes through the atmosphere
• Path length: The distance traveled by the radiation through the atmosphere
– Varies depending on the remote sensing technique and source
– Space photography using solar energy: path length = twice the thickness of the earth’s atmosphere
– Airborne thermal sensors using energy emitted by objects on the earth: path length = one-way distance from the earth’s surface to the sensor
• The intensity and the spectral composition of the incident radiation are altered by the atmospheric effects
• Atmospheric interaction depends on
– Properties of the radiation, such as magnitude and wavelength
– Atmospheric conditions
– Path length
Atmospheric Scattering
• Scattering is the redirection of EMR by particles suspended in the atmosphere or by large molecules of atmospheric gases, which diffuses a portion of the incident radiation in all directions.
• The amount of scattering depends upon the size of the particles, their abundance, the wavelength of
radiation, depth of the atmosphere through which the energy is travelling and the concentration of
the particles.
• There is no energy transformation during scattering
• Scattering not only reduces the image contrast but also changes the spectral signature of ground
objects as seen by the sensor.
• Types of Scattering

➢ Rayleigh scattering
➢ Mie scattering
➢ Non-selective scattering

http://www.geog.ucsb.edu/~joel/g110_w08/lecture_notes/radiation_atmosphere/radiation_atmosphere.html
Rayleigh Scattering
• Also known as Selective scattering or Molecular scattering caused by the atmospheric molecules and other
tiny particles
• Dependent on the wavelength
• It predominates where EMR interacts with particles that are much smaller than the wavelength of the
incoming light (Particle size less than (1/10)th of the wavelength)
• Shorter wavelengths are scattered more than longer wavelengths.
• In the absence of these particles and scattering the sky would appear black.
• The Rayleigh scattering is the most important type of scattering in RS.
• Scattering of the visible bands is caused mainly by the molecules of oxygen and nitrogen
▪ Blue (shorter wavelength) is scattered more
- Blue light is scattered around four times as much as red light
- UV light is scattered about 16 times as much as red light
- A “blue” sky is a manifestation of Rayleigh scatter
▪ Orange or red colour during sunrise and sunset
- Sun rays have to travel a longer path
- Complete scattering (and absorption) of shorter wavelength radiations
- Only the longer wavelength (orange and red) which are less scattered are visible
▪ Other examples
- The haze in imagery
- Bluish-grey cast in a color image when taken from high altitude
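The “four times” and “16 times” figures can be checked against the λ^-4 dependence with a few lines of Python; the wavelengths below (red 0.70 μm, blue 0.50 μm, UV 0.35 μm) are illustrative choices, and the exact ratios depend on which wavelengths are compared.

# Rayleigh scattering intensity varies as 1/lambda^4; ratios relative to red.
red_um = 0.70
for name, lam_um in [("UV", 0.35), ("blue", 0.50), ("red", 0.70)]:
    ratio = (red_um / lam_um) ** 4
    print(f"{name:4s} ({lam_um:.2f} um) scatters {ratio:5.1f}x as much as red")

This gives roughly 16x for UV and 4x for blue, consistent with the statements above.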
Mie Scattering
• Mie scattering occurs when the wavelength of the incoming radiation is almost equal to the diameter of the
atmospheric particles.
• These are caused by aerosols: a mixture of gases, water vapour, dust, smoke and pollen; Gas molecules are too
small to cause Mie scattering of the radiation commonly used for remote sensing
• It is generally restricted to the lower atmosphere where the larger particles are abundant and dominates under
overcast cloud conditions.
• Longer wavelengths also get scattered compared to Rayleigh scatter
• It influences the entire spectral region from ultra violet to near infrared regions

Source: http://hyperphysics.phy-astr.gsu.edu
Non-Selective Scattering

• This type of scattering occurs when the particle size is much larger than the wavelength of the incoming radiation (diameter greater than 10 times the wavelengths being sensed).
• Particles responsible for this effect are water droplets and larger dust particles. Particles such as pollen, cloud droplets, ice crystals and raindrops can cause non-selective scattering of the visible light.
• The scattering is independent of the wavelength; all the wavelengths are scattered equally.
• The most common example of non-selective scattering is the appearance of clouds as white. As clouds consist of water droplets and all wavelengths are scattered in equal amounts, clouds appear white.

Source: http://hyperphysics.phy-astr.gsu.edu
Atmospheric Absorption
• Absorption is a process in which the incident energy is retained by particles in the atmosphere; it reduces the amount of light that reaches our eye, making the scene look relatively duller.
• Unlike scattering, atmospheric absorption causes an effective loss of energy, as energy is transformed into other forms.
• Mainly three gases are responsible for most of the absorption of solar radiation, viz. ozone, carbon dioxide and water vapour.
• Ozone absorbs the high-energy, short-wavelength portions of the ultraviolet spectrum (< 0.24 μm), thereby preventing the transmission of this radiation to the lower atmosphere.
• Carbon dioxide is important in RS as it effectively absorbs radiation in the mid- and far-infrared regions of the spectrum.
• Water vapour absorption bands lie at 5.5–7 μm and around 27 μm.
• Absorption depends on the wavelength of the energy, the atmospheric composition, and the arrangement of the gaseous molecules and their energy levels.
• The absorbing medium will not only absorb a portion of the total energy, but will also reflect, refract or scatter the energy. The absorbed energy may also be transmitted back to the atmosphere.
Atmospheric window
• The ranges of wavelength that are partially or wholly transmitted through the atmosphere
• Remote sensing data acquisition is limited to these atmospheric windows
Atmospheric Window
• Wavelengths shorter than 0.1 μm
– Absorbed by nitrogen and other gaseous components
• Wavelengths shorter than 0.3 μm (X-rays, gamma rays and part of ultraviolet rays)
– Mostly absorbed by ozone (O3)
• Visible part of the spectrum
– Little absorption occurs
• Oxygen in the atmosphere causes absorption centered at 6.3 μm
• Infrared (IR) radiation
– Mainly absorbed by water vapour and carbon dioxide molecules
• Far infrared region
– Mostly absorbed by the atmosphere
• Microwave region
– Absorption is almost nil
Atmospheric Window
Major atmospheric windows used in remote sensing and their characteristics

Atmospheric window | Wavelength band (μm) | Characteristics
Upper ultraviolet, visible and photographic IR | 0.3–1 | Approx. 95% transmission
Reflected infrared | 1.3, 1.6, 2.2 | Three narrow bands
Thermal infrared | 3.0–5.0 and 8.0–14.0 | Two broad bands
Microwave | > 5000 | Atmosphere is mostly transparent
Energy Interactions with Target or Earth Surface Features
• Electromagnetic energy interactions with the surface features
• There are three forms of interaction that can take place when energy strikes, or is incident (I) upon, the surface:
➢ Reflection (R): radiation bounces off the target and is redirected
➢ Absorption (A): radiation is absorbed into the target
➢ Transmission (T): radiation passes through the target
[Figure: Relationship between reflection, absorption and transmission: the principle of conservation of energy as a function of wavelength]
• Reflection, absorption or transmission?
➢ Energy incident on a surface may be partially reflected, absorbed or transmitted
➢ Which process takes place on a surface depends on the following factors:
• Wavelength of the radiation
• Angle at which the radiation intersects the surface
• Composition and physical properties of the surface
Energy Interactions with Target or Earth Surface Features

• Reflection
➢ Radiation is redirected after hitting the target
➢ Angle of incidence = angle of reflectance
• Transmission
➢ Radiation is allowed to pass through the
target
➢ Changes the velocity and wavelength of the
radiation
➢ Transmitted energy may be further scattered
or absorbed in the medium  Absorption
➢ Radiation is absorbed by the target
➢ A portion absorbed by the Earth’s surface is available
for emission as thermal radiation

42
Reflection vs Scattering

Reflection
• Incident energy is redirected
• Angle of incidence = angle of reflection
➢ The reflected radiation leaves the surface at the
same angle as it approached

Scattering
 A special type of reflection
 Incident energy is diffused in many directions
 Often called Diffuse Reflection

Reflection or Scattering?
Depends on the roughness of the surface with respect to the incident wavelength
Roughness of the surface < Incident wavelength → Smooth surface → Reflection
Roughness of the surface > Incident wavelength → Rough surface → Scattering
Roughness of the surface controls how the energy is reflected
Mainly two types
➢Specular reflection
➢Diffuse (Lambertian) reflection
Spectral Reflectance
• Represents the reflectance characteristics of earth surface features
• Ratio of energy reflected by the surface to the energy incident on the surface
• Measured as a function of wavelength
• Also known as albedo
• Mathematical representation of spectral reflectance or albedo

ρλ = ER(λ) / EI(λ)
   = [Energy of wavelength λ reflected from the object / Energy of wavelength λ incident on the object] × 100

• Spectral reflectance characteristics of the surface features is used to identify the features and to
study their characteristics

• Requires basic understanding of the general reflectance characteristics of different features
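As a minimal illustration of the formula above, the sketch below computes ρλ per wavelength from reflected and incident energy; the arrays and values are hypothetical, chosen to mimic healthy vegetation.

import numpy as np

# Spectral reflectance: rho(lambda) = 100 * E_reflected / E_incident
wavelengths_um = np.array([0.45, 0.55, 0.65, 0.85])      # blue, green, red, NIR
e_incident     = np.array([100.0, 100.0, 100.0, 100.0])  # incident energy
e_reflected    = np.array([  5.0,  12.0,   4.0,  45.0])  # reflected energy

rho = 100.0 * e_reflected / e_incident   # spectral reflectance, percent
for lam, r in zip(wavelengths_um, rho):
    print(f"lambda = {lam:.2f} um  reflectance = {r:.1f} %")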


Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
Spectral Reflectance of Earth Surface Features

Albedo of fresh snow is generally very high; dry snow reflects almost 80% of the energy incident on it.
➢ Clouds also reflect a majority of the incident energy.
➢ Dark soil and concrete generally show very low albedo.
➢ Albedo of vegetation is also generally low, but varies with the canopy density; the albedo of forest areas with good canopy cover is as low as 5–10%.
➢ Albedo of water ranges from 5 to 70 percent, due to its specular reflection characteristics: it is low at low incidence angles and increases at higher incidence angles.

Surface type | Albedo (%)
Grass | 25
Concrete | 20
Water | 5–70
Fresh snow | 80
Forest | 5–10
Thick cloud | 75
Dark soil | 5–10
Spectral Reflectance Curve
• Graphical representation of the spectral response over different wavelengths of the electromagnetic spectrum
– Gives an insight into the spectral characteristics of different objects
– Used for the selection of a particular wavelength band for remote sensing data acquisition
• Spectral reflectance within one class is not unique, and hence ranges are shown

Example: Generalized spectral reflectance curves for deciduous and coniferous trees
• Sensor selection to differentiate deciduous and coniferous trees
– The curves overlap in the visible portion; maximum reflectance in green gives the green colour
– Both classes will be seen in shades of green, so deciduous and coniferous trees cannot be differentiated through the visible spectrum
– The spectral reflectances are quite different in the NIR, so deciduous and coniferous trees can be differentiated through the NIR spectrum
Spectral Reflectance Curve

Panchromatic photograph using reflected sunlight over the visible wavelengths: coniferous and deciduous trees are not differentiable.
Black and white infrared photograph using reflected sunlight over the 0.7 to 0.9 μm wavelengths: deciduous trees show a bright signature compared to coniferous trees.
(Source: Lillesand et al., 2004)

Spectral Reflectance Curve
• Typical Spectral reflectance Curves for vegetation, Soil and Water
Spectral Reflectance of Soil
• The majority of radiation incident on a soil surface is either reflected or
absorbed and a little is transmitted.
• Factors determining soil reflectance properties
➢ Moisture content
➢ Soil texture (proportion of sand, silt, and clay)
➢ Surface roughness
➢ Presence of iron oxide and organic matter
• The presence of moisture in soil decreases its reflectance
• Soil moisture content is strongly related to the soil texture. For example,
coarse, sandy soils are usually well drained, resulting in low moisture
content and relatively high reflectance. On the other hand, poorly drained
fine textured soils generally have lower reflectance.

Spectral Reflectance of Soil

• In the absence of water, the soil exhibits the reverse tendency


i.e., coarse textured soils appear darker than fine textured soils.
• Two other factors that reduce soil reflectance are surface
roughness and the content of organic matter.
• Presence of iron oxide in a soil also significantly decreases
reflectance, at least in the visible region of wavelengths.
Spectral reflectance curve for soil shows considerably less peak-
and-valley variation compared to that for vegetation

Spectral Reflectance of Water
• The majority of radiation incident on water is not reflected but
is either absorbed or transmitted.
• Water provides a semi-transparent medium for the
electromagnetic radiation
• Longer visible wavelengths and near infra red wavelengths are
absorbed more by water than shorter visible wavelengths.
• Thus, water looks blue or blue green due to stronger reflectance
at these shorter wavelengths and darker if viewed at red or near
infrared wavelengths.
• The factors that affect the variability in reflectance of a water body are the depth of the water, the materials within the water, and the surface roughness of the water.
• Spectral response varies with
– Wavelength of the radiation
– Physical and chemical characteristics of the water
• Water in liquid phase
– High reflectance in the visible region between 0.4 μm and 0.6 μm
– Wavelengths beyond 0.7 μm are completely absorbed
• Water in solid phase (ice or snow)
– Good reflection at all visible wavelengths
Spectral Reflectance of Water

• The reflectance properties of a water body depend on the materials present in the water
• Clear water: absorbs relatively little energy at wavelengths shorter than 0.6 μm
• Turbidity: the presence of suspended sediments increases visible reflectance
• Chlorophyll: decreases blue-wavelength reflection and increases green-wavelength reflection
• Variation in the spectral reflectance in the visible region can be used to differentiate
• Shallow and deep waters
• Clear and turbid waters
• Rough and smooth water bodies
• Reflectance in the NIR range is generally used to
• Delineate water bodies
• Study algal blooms and phytoplankton concentration in water
Spectral Reflectance of Vegetation
• The spectral characteristics of vegetation vary with
wavelength.
• Plant pigment in leaves called chlorophyll strongly absorbs
radiation in the red and blue wavelengths but reflects green
wavelength.
• The internal structure of healthy leaves acts as a diffuse reflector of near infrared wavelengths.
• Measuring and monitoring the near infrared reflectance is
one way that scientists determine how healthy particular
vegetation may be.
• Spectral reflectance curve for healthy green vegetation
exhibits the "peak-and-valley" configuration
– Peaks indicate strong reflection in the wavelength
bands
– Valleys indicate predominant absorption of the energy
in the wavelength band

Spectral Reflectance of Vegetation

• Spectral response of vegetation depends on the structure of the plant leaves


• Chlorophyll strongly absorbs energy in the bands centered at 0.45 and 0.67 μm (blue and red)
• Reflection peaks for green in the visible region that is why healthy vegetation is perceived as green in
colour
• Only 10-15% of the incident energy is reflected in the green band
• High reflectance in the reflected IR or NIR region
• Healthy vegetation shows brighter response in the NIR region compared to the green region
• Most of the remaining energy is transmitted and absorption is minimum
• At wavelengths beyond 1.3 μm, leaf reflectance is approximately inversely related to the total water present in a leaf, which is a function of both the moisture content and the thickness of the leaf
• Little to no energy is transmitted beyond 1.3 μm; energy beyond 1.3 μm is either absorbed or reflected
• Dips in reflectance occur at 1.4, 1.9, and 2.7 μm, as water in the leaf strongly absorbs energy at these wavelengths; these are named the water absorption bands
Spectral Reflectance of Vegetation
• Healthy vegetation
• Chlorophyll content absorbs blue and red in the visible region
• Mesophyll cells strongly reflects the NIR radiation
• Stressed vegetation
• Decrease in the chlorophyll content
• Less absorption in the blue and red bands
• Red and blue bands also get reflected along with the green band, giving yellow or brown
colour
• NIR bands are absorbed by the stressed mesophyll cells causing dark tones in the image
• Transmittance
• Transmittance is less in the visible region and increases in the infrared region
• A little to no transmittance of energy beyond 1.3 μm
• Total infrared reflection from thicker canopies will be more compared to thin canopy cover
• Example:
• For a densely grown agricultural area, the NIR signature will be more
• Deciduous and coniferous trees
• Spectral reflectance may be similar in the green band
• Coniferous trees show higher reflection in the NIR band
Multi Spectral Remote Sensing-Example

Aerial photograph of a stadium in normal colour: the artificial turf inside the stadium and the natural vegetation appear in the same colour.
Aerial photograph of a stadium in colour IR: the artificial turf appears dark, whereas the natural vegetation shows high reflectance in the IR region.
(Images are taken from Lillesand et al., 2004)

Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
PRINCIPLE OF REMOTE SENSING
• The amount of energy reflected or emitted by different objects in different bands of the electromagnetic spectrum depends on the properties of
– The target material
– The incident energy (angle of incidence, intensity and wavelength)
Remote Sensing Platforms

• Sensor & Platform in Remote Sensing
➢ Sensor: A device used to detect the reflected or emitted electromagnetic radiation
– Cameras and scanners
➢ Platform: A vehicle used to carry the sensor
– Aircraft and satellites
PLATFORMS
Ground-based Platforms: These are often used to record detailed information about the surface, which is compared with information collected from aircraft or satellite sensors. In some cases, this can be used to better characterize the target which is imaged by these other sensors, making it possible to better understand the information in the image.
• Mobile hydraulic platforms (up to 15 m height)
• Towers: better stability and rigidity than masts
• Portable masts: unstable in windy conditions
• Weather surveillance radar: detects and tracks typhoons and cloud masses
PLATFORMS
Airborne Platforms:
• Aerial platforms are generally stable-wing platforms, although helicopters are occasionally used.
• Balloon-based: a tool for probing the atmosphere; useful for testing tools under development (up to 22–40 km height)
• Radiosonde: measures pressure, temperature and relative humidity in the atmosphere
• Rawinsonde: measures wind velocity, temperature, pressure and relative humidity
• Aircraft: often used to collect very detailed images and facilitate the collection of data over virtually any portion of the Earth’s surface at any time
• High spatial resolution (~20 cm)
• Flexible scheduling to avoid weather problems
• Sensor maintenance and repair are easy
• High cost per unit area where it takes many passes to cover a large area
• Swath is smaller compared to satellites
PLATFORMS
Spaceborne Platforms:

• Satellites are objects which revolve


around another object. Example :
The moon revolves around the
Earth.
• Man-made satellites include those
platforms launched for remote
sensing, communication, and
telemetry (location and navigation)
purposes.
• Satellites permit repetitive coverage
of the Earth’s surface on a
continuing basis because of their
orbits.
• Rockets, satellites and space shuttles
• Covers large area
• Repetitive coverage of an area
of interest
SATELLITE ORBITS
Satellite Orbits: The path followed by a satellite in space.
• Circular or near-circular
• Elliptical

Types of satellite orbits:
• Geo-synchronous orbit
• Polar orbit
• Sun-synchronous orbit
GEO-SYNCHRONOUS ORBIT
Geo-synchronous or Geo-stationary Orbits

From any point on the equator, the satellite appears stationary:
Time required for the satellite to cover one revolution = time required for the Earth to rotate once about its polar axis

Geo-stationary Satellites
• A geostationary satellite is launched in such a way that it follows an orbit in the plane of the equator (inclination = 0º) and travels in the same direction as the earth’s rotation (west to east) with the same period of about 24 hours.
• It always views the same area on the earth, thus monitoring a location continuously.
• A large area of the earth can also be covered by this satellite.
• These are located at a high altitude of about 36,000 km from the Earth’s surface. At this distance, only low resolution images are acquired.
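The ~36,000 km figure follows from Kepler's third law: an orbit whose period matches the Earth's rotation period must have one specific radius. The sketch below (standard orbital mechanics, not from the slides) derives it.

import math

# Geostationary altitude from Kepler's third law:
# T = 2*pi*sqrt(a^3/mu)  =>  a = (mu * (T/(2*pi))**2) ** (1/3)
MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
T_SIDEREAL = 86164.1        # Earth's rotation period (sidereal day), s
R_EARTH = 6378.137e3        # Earth's equatorial radius, m

a = (MU_EARTH * (T_SIDEREAL / (2.0 * math.pi)) ** 2) ** (1.0 / 3.0)
altitude_km = (a - R_EARTH) / 1000.0
print(f"orbital radius = {a / 1000.0:.0f} km, altitude = {altitude_km:.0f} km")

This prints an altitude of about 35,786 km, i.e. the “about 36,000 km” quoted above.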
POLAR ORBIT
Polar or Near-Polar Orbits
• Due to the rotation of the Earth, it is possible to combine the
advantages of low-altitude orbits with global coverage, using near
polar orbiting satellites, which have an orbital plane crossing the
poles.
• These satellites are launched into orbits at high inclinations to the
Earth’s rotation (at low angles with longitude lines), such that they
pass across high latitudes near the poles.
• Inclined at nearly 90 degrees
• Usually low altitude orbits (700-800 km)
• Satellites make more than one revolution around the Earth in a
single day and due to the Earth’s rotation, in each revolution the
satellite passes over different areas
• Gives complete coverage of the earth’s surface during an orbit
cycle
• Orbit cycle: When the nadir point of the satellite passes over
the same point on the Earth’s surface for a second time
• Revisit period: Time lapsed between two successive views of the
same area by a satellite
SUN-SYNCHRONOUS ORBIT
• Sun-synchronous orbits

➢ Special case of polar orbit

➢ The satellite passes over the same part of the earth at roughly the same local time each day

➢ Used for satellites that need a constant amount of sunlight

➢ Satellites revolving in sun-synchronous orbits : Landsat, IRS


satellites

SENSORS
• The broad classes of sensors are:
Passive: The energy leading to the radiation received comes from an external source, e.g., the Sun; the MSS is an example.
Active: Energy generated from within the sensor system is beamed outward, and the fraction returned is measured; RADAR is an example.
• Another attribute in this classification is:
Scanning mode: If the scene is sensed point by point (equivalent to small areas within the scene) along successive lines over a finite time, this mode of measurement makes up a scanning system.
Non-scanning: If the entire scene is sensed directly by the sensor, it is termed a non-scanning system.
• Sensors can be
Non-imaging: Measures the radiation received from all points in the sensed target, integrates this, and reports the result as electrical signal strength or some other quantitative attribute, such as radiance.
Imaging: The electrons released are used to excite or ionize a substance like silver in film, or to drive an image-producing device like a TV or computer monitor, a cathode ray tube, an oscilloscope, or a battery of electronic detectors.
INFORMATION COLLECTED BY SENSORS
SENSOR CHARACTERISTICS
• In remote sensing, resolution means resolving power
➢ The capability to identify the presence of two objects
➢ The capability to identify the properties of the two objects
• An image that shows finer details is said to be of finer resolution compared to an image that shows coarser details
• Four types of resolution are defined for remote sensing systems:
• Spatial resolution: resolving power, the smallest area that can be mapped, i.e. the pixel size
• Spectral resolution: the number of spectral bands and the bandwidth over which the sensor is sensitive
• Temporal resolution: repetitivity, i.e. the time interval between two consecutive visits of a point on the Earth by the sensor
• Radiometric resolution: the smallest amount of radiant energy that can be quantized to generate an electrical signal
SPATIAL RESOLUTION

• Spatial resolution: size of the smallest dimension on


the earth’s surface over which an independent
measurement can be made by the sensor
➢ Expressed by the size of the pixel on the
ground in meters
➢ Controlled by the instantaneous field of view
(IFOV)
➢ IFOV - Area on the Earth’s surface that is seen
at one particular moment of time
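A minimal sketch of the relationship implied here: the ground-projected size of the IFOV (and hence the nominal pixel size) is the platform altitude times the IFOV angle, under a small-angle, nadir-viewing assumption. The IFOV and altitude values below are illustrative.

# Ground-projected IFOV (small-angle approximation): D = H * beta
ifov_mrad = 0.043       # illustrative IFOV, milliradians
altitude_km = 705.0     # illustrative satellite altitude, km

d_m = (altitude_km * 1000.0) * (ifov_mrad * 1.0e-3)    # ground cell size, m
print(f"ground resolution cell ~ {d_m:.1f} m")          # ~30 m pixel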
SPECTRAL RESOLUTION
Spectral resolution
• Ability of a sensor to define fine wavelength intervals
• Ability of a sensor to resolve the energy received in a spectral bandwidth so as to characterize different constituents of the earth surface
Depends on
• The spectral bandwidth of the filter
• The sensitivity of the detector
The finer the spectral resolution, the narrower the wavelength range for a particular channel or band.
[Figure: Using the broad wavelength band 1, features A and B cannot be differentiated; the spectral reflectances of A and B are different in the narrow bands 2 and 3, and hence they can be differentiated.]
TEMPORAL RESOLUTION
Temporal resolution: the number of times an object is sampled, or how often data are obtained for the same area
❖ The absolute temporal resolution of a remote sensing system, i.e. the time to image the same area at the same viewing angle a second time, is equal to the repeat cycle of the satellite.
RADIOMETRIC RESOLUTION

Radiometric resolution: Sensitivity of the sensor to the magnitude of the electromagnetic energy
❖ How many grey levels are measured between pure black (no reflectance) and pure white (maximum reflectance)
❖ The finer the radiometric resolution of a sensor, the more sensitive it is in detecting small differences in the energy
❖ The finer the radiometric resolution, the greater the number of grey levels the system can measure
• Radiometric resolution is measured in bits
➢ Each bit records an exponent of a power of 2
➢ The maximum number of brightness levels available depends on the number of bits used to represent the recorded energy

Radiometric resolution and the corresponding brightness levels:
Radiometric resolution | Number of levels | Example
1 bit | 2^1 = 2 levels |
7 bit | 2^7 = 128 levels | IRS 1A & 1B
8 bit | 2^8 = 256 levels | Landsat TM
11 bit | 2^11 = 2048 levels | NOAA-AVHRR
RADIOMETRIC RESOLUTION
• Tones in an image vary from black to white
• Black → digital number = 0 → no reflectance
• White → the maximum digital number (1 for 1-bit data, 255 for 8-bit data)
• Image data are generally displayed in a range of grey tones, with black representing a digital number of 0 and white representing the maximum value (for example, 255 in 8-bit data).
• In an 8-bit system, black is measured as 0 and white as 255; the variation between black and white is scaled into 256 classes ranging from 0 to 255. Similarly, 2048 levels are used in an 11-bit system.
• The finer the radiometric resolution, the more grey levels the system can record, and hence the more detail can be captured in the image.
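The bit-to-level relationship in the table above is simply levels = 2^bits, as the short sketch below illustrates.

# Number of grey levels for a given radiometric resolution: levels = 2**bits
for bits in (1, 7, 8, 11):
    levels = 2 ** bits
    print(f"{bits:2d}-bit data: {levels:5d} levels (DN 0 to {levels - 1})")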
Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
IMAGE : DIGITAL AND ANALOGUE

DIGITAL IMAGE
Character of Digital Image Data:
Although the image appears to be a continuous-tone photograph, it is actually composed of a two-dimensional array of discrete picture elements, or pixels. The intensity of each pixel corresponds to the average brightness, or radiance, measured electronically over the ground area corresponding to each pixel.
Whereas the individual pixels are virtually impossible to discern in figure (a), they are observable in the enlargement shown in (b). The enlargement corresponds to a sub-area located in (a). Part (c) shows the individual digital numbers (DNs) corresponding to the average radiance measured in each pixel. These values are simply positive integers that result from quantizing the original electrical signal from the sensor into positive integer values using a process called ‘Analogue-to-Digital’ (A-to-D) signal conversion.
DIGITAL IMAGE PROCESSING
GEOMETRIC CORRECTIONS
[Figure: Sources of geometric errors of an image]
RADIOMETRIC CORRECTIONS

• The radiance measured by any given system over a given object is influenced by such factors as changes in
scene illumination, atmospheric conditions, viewing geometry, and instrument response characteristics. Some
of these effects, such as viewing geometry variations, are greater in the case of airborne data collection than
in satellite image acquisition. Also, the need to perform correction for any or all of these influences depends
directly upon the particular application at hand.

• Over the course of the year, there are systematic, seasonal changes in the intensity of solar irradiance incident
on the earth’s surface. If remotely sensed images taken at different times of the year are being compared, it is
usually necessary to apply a sun elevation correction and an earth–sun distance correction. The sun elevation
correction accounts for the seasonal position of the sun relative to the earth. Through this process, image data
acquired under different solar illumination angles are normalized by calculating pixel brightness values
assuming the sun was at the zenith on each date of sensing. The correction is usually applied by dividing
each pixel value in a scene by the sine of the solar elevation angle (or cosine of the solar zenith angle) for the
particular time and location of imaging.
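A minimal sketch of this correction, assuming the solar elevation angle is read from the image metadata; the DN values are illustrative.

import math
import numpy as np

# Sun elevation correction: divide each pixel value by the sine of the
# solar elevation angle (equivalently, the cosine of the solar zenith
# angle), normalizing the image as if the sun had been at the zenith.
dn = np.array([[52.0, 60.0],
               [48.0, 55.0]])          # raw pixel values for one band
solar_elevation_deg = 35.0             # from the image metadata

dn_normalized = dn / math.sin(math.radians(solar_elevation_deg))
print(dn_normalized)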
RADIOMETRIC CORRECTIONS
• The earth–sun distance correction is applied to normalize for the seasonal changes in the distance between the earth and the sun. The earth–sun distance is usually expressed in astronomical units. (An astronomical unit is equivalent to the mean distance between the earth and the sun, approximately 149.6 × 10^6 km.) The irradiance from the sun decreases as the square of the earth–sun distance.

• Ignoring atmospheric effects, the combined influence of solar zenith angle and earth–sun distance on the
irradiance incident on the earth’s surface can be expressed as,
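The equation referred to here appears to have been lost in conversion; in its standard textbook form (e.g., Lillesand et al.) it reads

E = (E0 · cos θ0) / d²

where E = normalized solar irradiance, E0 = solar irradiance at the mean earth–sun distance, θ0 = solar zenith angle, and d = earth–sun distance in astronomical units.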
RADIOMETRIC CORRECTIONS
The influence of solar illumination variation is compounded by atmospheric effects. The atmosphere
affects the radiance measured at any point in the scene in two contradictory ways. First, it attenuates
(reduces) the energy illuminating a ground object. Second, it acts as a reflector itself, adding a
scattered, extraneous “path radiance” to the signal detected by a sensor. Thus, the composite signal
observed at any given pixel location can be expressed by,
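The equation referred to here was likewise lost in conversion; its standard textbook form (e.g., Lillesand et al.) is

Ltot = (ρ · E · T) / π + Lp

where Ltot = total spectral radiance measured by the sensor, ρ = reflectance of the object, E = irradiance on the object, T = transmittance of the atmosphere, and Lp = path radiance, all of which depend on wavelength.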
RADIOMETRIC CORRECTIONS
• Only the first term in the equation contains valid information about ground reflectance. The second
term represents the scattered path radiance, which introduces “haze” in the imagery and reduces image
contrast.
• Haze compensation procedures are designed to minimize the influence of
path radiance effects. One means of haze compensation in multispectral data is to observe the radiance
recorded over target areas of essentially zero reflectance. For example, the reflectance of deep clear water
is essentially zero in the near-infrared region of the spectrum. Therefore, any signal observed over such an
area represents the path radiance, and this value can be subtracted from all pixels in that band. This
process is referred to as dark object subtraction.
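A minimal sketch of dark object subtraction, assuming a band array and a mask of known deep-water pixels (both illustrative):

import numpy as np

# Dark object subtraction: treat the radiance recorded over a near-zero-
# reflectance target (e.g., deep clear water in the near infrared) as path
# radiance, and subtract it from every pixel in the band.
band = np.array([[23.0, 41.0, 87.0],
                 [19.0, 55.0, 96.0],
                 [21.0, 33.0, 70.0]])          # one band, illustrative DNs
water_mask = np.array([[True, False, False],
                       [True, False, False],
                       [True, False, False]])  # known deep-water pixels

path_radiance = band[water_mask].min()         # darkest "zero-reflectance" DN
corrected = np.clip(band - path_radiance, 0.0, None)
print(f"estimated path radiance (haze): {path_radiance}")
print(corrected)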
• For convenience, haze compensation routines are often applied uniformly throughout a scene. This may or may not be valid, depending on the uniformity of the atmosphere over the scene. When extreme viewing angles are involved in image acquisition, it is often necessary to compensate for the influence of the varying atmospheric path length through which the scene is recorded.
• More advanced methods have been developed for atmospheric correction of optical and thermal images,
when simple haze-removal techniques like dark object subtraction are insufficient. These algorithms are
broadly divided into those based on empirical correction using spectral data from the imagery itself and
those using radiative transfer methods to model atmospheric scattering and absorption from physical
principles. In many cases, these algorithms may require information about local atmospheric conditions at
the time of image acquisition.
RADIOMETRIC CORRECTIONS
• When spectral data from more than one image need to be compared, but there is not sufficient information
available for a complete atmospheric correction process, an alternative is radiometric normalization of the
images. This process involves adjusting the brightness values of one or more secondary images to match a
single base image. The images must at least partially overlap, and the overlap area must contain several
temporally stable targets, features whose true surface reflectance is assumed to be constant over time.
Typically, the analyst identifies a set of these targets covering a range of brightness values, then uses a
statistical method such as linear regression to establish a model relating the brightness values in each
secondary image to the corresponding brightness values in the base image. Each secondary image is then
normalized using its own regression model.
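A minimal sketch of this normalization, assuming the analyst has already extracted the DNs of the temporally stable targets from both images (all values illustrative):

import numpy as np

# Radiometric normalization: fit a linear model between stable-target DNs
# in the secondary image and the same targets in the base image, then
# apply the model to the whole secondary image.
base_dn      = np.array([30.0,  75.0, 120.0, 180.0, 220.0])  # base image
secondary_dn = np.array([42.0,  90.0, 138.0, 200.0, 244.0])  # same targets

gain, offset = np.polyfit(secondary_dn, base_dn, deg=1)   # least squares fit
print(f"DN_base ~ {gain:.3f} * DN_secondary + {offset:.2f}")

secondary_image = np.array([[40.0, 150.0],
                            [95.0, 230.0]])
normalized = gain * secondary_image + offset   # matched to the base image
print(normalized)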
• Another radiometric data processing activity involved in many quantitative applications of digital image data
is conversion of DNs to absolute radiance (or reflectance) values. This operation accounts for the exact form
of the A-to-D response functions for a given sensor and is essential in applications where measurement of
absolute radiances is required. For example, such conversions are necessary when changes in the absolute
reflectance of objects are to be measured over time using different sensors (e.g., the TM on Landsat-5 versus
the OLI on Landsat-8). Likewise, such conversions are important in the development of mathematical
models that physically relate image radiance or reflectance data to quantitative ground measurements (e.g.,
water quality measurements).
DIGITAL IMAGE PROCESSING
• Atmospheric Corrections
Atmospheric models can be used to account for the effects of scattering and absorption in the
atmosphere. A number of parameters are required to accurately apply atmospheric correction,
including properties such as the amount of water vapour and the distribution of aerosols. Sometimes
these data can be collected by field instruments that measure atmospheric gases and aerosols, but this is
often expensive and time consuming. Other satellite data can also be used to help estimate the
amount and distribution of atmospheric aerosols. Many software packages include special
atmospheric correction modules that use atmospheric radiation transfer models to produce an
estimate of the true surface reflectance.
DIGITAL IMAGE PROCESSING
GEOMETRIC REGISTRATION
• It involves identifying the image coordinates of several clearly discernible points, called Ground Control Points (GCPs). The true ground coordinates are typically measured from a map (b – b1 to b4), either in paper or digital format. Several well-distributed GCP pairs are identified, and the coordinate information is processed by the computer to determine the proper transformation equations to apply to the original image coordinates to map them into their new ground coordinates.
• It is also performed by registering one (or more)
images to another image, instead of to geographic
coordinates
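One common choice for the “transformation equations” mentioned above is a six-parameter affine transformation fitted to the GCP pairs by least squares; the sketch below shows this with illustrative coordinates.

import numpy as np

# Each GCP pair links image coordinates (col, row) to map coordinates (E, N).
img_xy = np.array([[ 10.0,  12.0],
                   [480.0,  25.0],
                   [ 35.0, 500.0],
                   [460.0, 470.0]])
map_xy = np.array([[300100.0, 4501900.0],
                   [314200.0, 4501500.0],
                   [300900.0, 4487400.0],
                   [313600.0, 4488300.0]])

# Solve [E, N] = [col, row, 1] @ A for the 3x2 coefficient matrix A.
design = np.hstack([img_xy, np.ones((len(img_xy), 1))])
coeffs, _, _, _ = np.linalg.lstsq(design, map_xy, rcond=None)

pixel = np.array([240.0, 250.0, 1.0])   # any image coordinate (col, row, 1)
print(pixel @ coeffs)                   # its estimated map coordinate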
RESAMPLING
• Resampling is used to determine the digital values to place in the new pixel locations of the corrected output image.

Nearest neighbour resampling: uses the digital value from the pixel in the original image which is nearest to the new pixel location in the corrected image. This method tends to result in a disjointed or blocky image appearance.

Bilinear interpolation resampling: takes a weighted average of the four pixels in the original image nearest to the new pixel location. The averaging process alters the original pixel values and creates entirely new digital values in the output image.
RESAMPLING

Cubic convolution resampling: goes even further, calculating a distance-weighted average of a block of sixteen pixels from the original image which surround the new output pixel location. The bilinear and cubic convolution methods both produce images which have a much sharper appearance and avoid the blocky appearance of the nearest neighbour method.
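The sketch below contrasts nearest neighbour and bilinear resampling at a single (fractional) position in an illustrative 3x3 image; cubic convolution extends the same idea to a 4x4 neighbourhood.

import numpy as np

image = np.array([[10.0, 20.0, 30.0],
                  [40.0, 50.0, 60.0],
                  [70.0, 80.0, 90.0]])
r, c = 1.3, 0.6   # fractional position in the original image that maps
                  # to the new, corrected output pixel

# Nearest neighbour: take the single closest original pixel.
nearest = image[int(round(r)), int(round(c))]

# Bilinear: distance-weighted average of the four surrounding pixels.
r0, c0 = int(np.floor(r)), int(np.floor(c))
dr, dc = r - r0, c - c0
bilinear = (image[r0,     c0    ] * (1 - dr) * (1 - dc) +
            image[r0 + 1, c0    ] * dr       * (1 - dc) +
            image[r0,     c0 + 1] * (1 - dr) * dc +
            image[r0 + 1, c0 + 1] * dr       * dc)

print(nearest, bilinear)   # 50.0 (unaltered DN) vs 55.0 (new, averaged DN)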
IMAGE ENHANCEMENT
• To improve the visual interpretability of an image by increasing the apparent distinction between the features in the scene.
• The process of visually interpreting digitally enhanced imagery attempts to optimize the complementary abilities of the human mind and the computer.
• The range of possible image enhancement and display options available to the image analyst is virtually limitless. Most enhancement techniques may be categorized as either point or local operations.
• Point operations modify the brightness value of each pixel in an image data set independently.
• Local operations modify the value of each pixel based on neighboring brightness values.

• Most commonly applied digital image enhancement techniques:
1. Contrast manipulation: gray-level thresholding, level slicing, contrast stretching.
2. Spatial feature manipulation: spatial filtering, edge enhancement and Fourier analysis.
3. Multi-image manipulation: multispectral band ratioing and differencing, principal components, canonical components, vegetation components, intensity-hue-saturation (IHS) colour space transformations.
CONTRAST MANIPULATION

a) Gray-level thresholding: Used to segment an input image into two classes: one for those pixels having values below an analyst-defined gray level and one for those above this value.
b) Level slicing: The DNs distributed along the x-axis of an image histogram are divided into a series of analyst-specified intervals or “slices”. All of the DNs falling within a given interval in the input image are then displayed at a single DN in the output image. Consequently, if six different slices are established, the output image contains only six different gray levels. Each level can also be shown as a single colour.
c) Contrast stretching: Expands the narrow range of brightness values present in an input image over a wider range of gray values. The result is an output image designed to provide contrast between the features of interest to the image analyst.
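A minimal sketch of a linear contrast stretch, expanding an illustrative narrow DN range to the full 8-bit display range:

import numpy as np

# Linear contrast stretch: map [dn_min, dn_max] onto [0, 255].
band = np.array([[60.0,  65.0,  90.0],
                 [72.0, 104.0,  88.0],
                 [61.0,  95.0, 108.0]])

dn_min, dn_max = band.min(), band.max()
stretched = (band - dn_min) / (dn_max - dn_min) * 255.0
print(stretched.astype(np.uint8))

# Level slicing is similar in spirit: np.digitize(band, bins) assigns each
# DN to one of a few analyst-defined slices, each shown as one grey level.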
Remote Sensing

Course No: CE205


Course Instructor
Dr. Manali Pal, Department Of Civil Engineering
NIT Warangal
GRAY-LEVEL THRESHOLDING
Fig: (a) Original Landsat-8 OLI band 4 (red band) image containing a continuous distribution of gray tones; (b) OLI band 5 (near-infrared band) image; (c) OLI band 5 histogram, with the original 12-bit OLI radiometric range scaled to 16-bit DNs; (d) OLI band 4 brightness variation in water areas only; (e) level slicing operation applied to OLI band 4 data in areas determined to be water.
➢ Gray‐level thresholding : Figure (a) shows Landsat-8 OLI band 4 (visible red band) image of the coastline of New
Zealand’s South Island displaying a broad range of gray levels over both land and water. We wish to show the brightness
variations in this band in the water areas only. Because many of the gray levels for land and water overlap in this band, it
would be impossible to separate these two classes using a threshold set in this band. This is not the case in the OLI band
5 (near-infrared band) shown in Figure (b). The histogram of DNs for the band 5 image (Figure c) shows that water
strongly absorbs the incident energy in this near-infrared band (low DNs), while the land areas are highly reflective (high
DNs). A threshold set at DN = 6000 permits separation of these two classes in the band 5 data. This binary classification
can then be applied to the band 4 data to enable display of brightness variations in only the water areas. This is illustrated
in Figure (d) where the band 4 land pixel values have all been set to 0 (black) based on their classification in the band 5
binary mask. The band 4 water pixel values have been preserved and show an enhanced representation of variability in
suspended sediment along the shoreline and in rivers and ponds.
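A compact sketch of this masking workflow, assuming band4 and band5 are NumPy arrays of the same shape; the DN threshold of 6000 follows the example above.

```python
# Sketch of gray-level thresholding used as a binary mask (assumed arrays).
import numpy as np

def water_mask_display(band4, band5, threshold=6000):
    """Show band-4 brightness only where band 5 classifies the pixel as water."""
    water = band5 < threshold          # water absorbs NIR, so low band-5 DNs
    return np.where(water, band4, 0)   # land pixels set to 0 (black)
```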
➢ Level Slicing : In this, the DNs distributed along the x-axis of an image histogram are divided into a series of analyst-specified intervals or "slices." All of the DNs falling within a given interval in the input image are then displayed at a single DN in the output image. Consequently, if six different slices are established, the output image contains only six different gray levels. The result looks something like a contour map, except that the areas between boundaries are occupied by pixels displayed at the same DN. Each level can also be shown as a single color.
The application of level slicing to the "water" portion of the scene is illustrated in Figure (e). Here, Landsat-8 OLI band 4 data have been level sliced into multiple levels in those areas previously determined to be water from the band 5 binary mask. Level slicing is used extensively in the display of thermal infrared images in order to show discrete temperature ranges coded by gray level or color.
Contrast Stretching: expands the narrow range of brightness values present in an input image over a wider range of gray values. The result is an output image designed to provide increased contrast between features of interest to the image analyst.

Linear Stretch:
A more expressive display results if we expand the range of image levels present in the scene (60 to 158) to fill the range of display values (0 to 255), so that the range of image values is uniformly expanded to fill the total range of the output device. The linear stretch is applied to each pixel in the image using the algorithm

DN' = ((DN - MIN) / (MAX - MIN)) x 255

where, DN' = digital number assigned to pixel in output image
DN = original digital number of pixel in input image
MIN = minimum value of input image, to be assigned a value of 0 in the output image
MAX = maximum value of input image, to be assigned a value of 255 in the output image
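A minimal NumPy sketch of this linear stretch; the example scene values (60 to 158, matching the discussion below) are illustrative.

```python
# Sketch of a linear contrast stretch to the 8-bit display range.
import numpy as np

def linear_stretch(band):
    mn, mx = band.min(), band.max()
    # DN' = ((DN - MIN) / (MAX - MIN)) * 255
    return ((band - mn) / (mx - mn) * 255).astype(np.uint8)

scene = np.random.randint(60, 159, (100, 100))  # DNs confined to 60..158
stretched = linear_stretch(scene.astype(float))  # now spans 0..255
```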
Consider a hypothetical sensing system whose image output levels can vary
from 0 to 255.
Figure (a) illustrates a histogram of brightness levels recorded in one spectral
band over a scene. Assume that our hypothetical output device (e.g., computer
monitor) is also capable of displaying 256 gray levels (0 to 255). Note that the
histogram shows scene brightness values occurring only in the limited range of
60 to 158. If we were to use these image values directly in our display device
(Figure b), we would be using only a small portion of the full range of possible
display levels. Display levels 0 to 59 and 159 to 255 would not be utilized.
Consequently, the tonal information in the scene would be compressed into a
small range of display values, reducing the interpreter’s ability to discriminate
radiometric detail.
A more expressive display would result if we were to expand the range of
image levels present in the scene (60 to 158) to fill the range of display values
(0 to 255). In Figure c, the range of image values has been uniformly expanded
to fill the total range of the output device. This uniform expansion is called a
linear stretch. Subtle variations in input image data values would now be
displayed in output tones that would be more readily distinguished by the
interpreter. Light tonal areas would appear lighter and dark areas would appear
darker.
In Figure (d) a histogram-equalized stretch can be applied, where image values are assigned to the display levels on the basis of their frequency of occurrence. More display values (and hence more radiometric detail) are assigned to the frequently occurring portion of the histogram. The image value range of 109 to 158 is now stretched over a large portion of the display levels (39 to 255). A smaller portion (0 to 38) is reserved for the infrequently occurring image values of 60 to 108.
Fig: Principle of contrast stretch enhancement.
Fig: Effect of contrast stretching Landsat-8 OLI data acquired over the Nile Delta: (a) original image; (b) stretch that enhances contrast in bright image areas; (c) stretch that enhances contrast in dark image areas.
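The histogram-equalized stretch described above can be sketched with a lookup table built from the cumulative histogram; the 8-bit range and example scene are assumptions.

```python
# Sketch of a histogram-equalized stretch: display levels allotted by frequency.
import numpy as np

def histogram_equalize(band, levels=256):
    hist, _ = np.histogram(band, bins=levels, range=(0, levels))
    cdf = hist.cumsum() / hist.sum()            # cumulative frequency per DN
    lut = np.round(cdf * (levels - 1)).astype(np.uint8)
    return lut[band]                            # remap each input DN via the table

scene = np.random.randint(60, 159, (100, 100))  # same 60..158 example scene
equalized = histogram_equalize(scene)           # frequent DNs get more levels
```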
IMAGE FILTERING
Spatial filtering :
Used to enhance the appearance of an image.
Designed to highlight specific features in an image based on their spatial frequency.
Spatial filtering is commonly used for the following purposes.
• To restore imagery by suppressing noise
• To enhance the imagery for better interpretation
• To extract features such as edges and lineaments
• In contrast to spectral filters, which serve to block or pass energy over various spectral ranges, spatial
filters emphasize or deemphasize image data of various spatial frequencies. Spatial frequency refers
to the “roughness” of the tonal variations occurring in an image. Image areas of high spatial frequency
are tonally “rough.” That is, the gray levels in these areas change abruptly over a relatively small
number of pixels (e.g., across roads or field borders). “Smooth” image areas are those of low spatial
frequency, where gray levels vary only gradually over a relatively large number of pixels (e.g., large
agricultural fields or water bodies).
• Low-pass filters are designed to emphasize low frequency features (large-area changes in brightness)
and deemphasize the high frequency components of an image (local detail).
• High-pass filters do just the reverse. They emphasize the detailed high frequency components of an
image and deemphasize the more general low frequency information.
IMAGE FILTERING
• A common filtering procedure involves moving a 'window' of a few pixels in dimension
(e.g. 3x3, 5x5, etc.) over each pixel in the image, applying a mathematical calculation
using the pixel values under that window, and replacing the central pixel with the new value.
Low-pass filter:
• Designed to emphasize larger, homogeneous areas of similar tone and reduce the smaller detail in an image.
• Generally serves to smooth the appearance of an image.
• Examples: average and median filters.
High-pass filters:
• Do the opposite and serve to sharpen the appearance of fine detail in an image.
• One common implementation first applies a low-pass filter to an image and then subtracts the result from the original, leaving behind only the high spatial frequency information.
• Directional, or edge detection filters are designed to highlight
linear features, such as roads or field boundaries.
• Useful in applications such as geology, for the detection of
linear geologic structures.
IMAGE FILTERING
▪ Spatial filtering is a “neighborhood” operation in that pixel values in an original image are modified on the basis of the
gray levels of neighboring pixels. For example, a simple low-pass filter may be implemented by passing a moving
window throughout an original image and creating a second image whose DN at each pixel corresponds to the
neighborhood average within the moving window at each of its positions in the original image.
▪ A simple high-pass filter may be implemented by subtracting a low-pass filtered image (pixel by pixel) from the
original, unprocessed image.
The original image is shown in Figure (a). Figure (b) shows the low frequency component image, and Figure (c) illustrates the high
frequency component image. Note that the low frequency component image (b) reduces deviations from the neighborhood average,
which smooths or blurs the detail in the original image, reduces the gray-level range, but emphasizes the large-area brightness
regimes of the original image. The high frequency component image (c) enhances the spatial detail in the image at the expense of
the large-area brightness information. Both images have been contrast stretched. (Such stretching is typically required because
spatial filtering reduces the gray-level range present in an image.)
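A minimal sketch of these two neighborhood operations, assuming a 2D NumPy array: a 3x3 moving-window average implements the low-pass filter, and subtraction from the original implements the high-pass filter, exactly as described above.

```python
# Sketch of simple spatial filters: 3x3 mean low-pass and subtraction high-pass.
import numpy as np

def low_pass(image, size=3):
    pad = size // 2
    padded = np.pad(image, pad, mode='edge')     # replicate edges for the window
    out = np.zeros_like(image, dtype=float)
    rows, cols = image.shape
    for i in range(rows):                        # moving-window neighborhood average
        for j in range(cols):
            out[i, j] = padded[i:i + size, j:j + size].mean()
    return out

def high_pass(image):
    # High-frequency component = original minus low-pass filtered image.
    return image - low_pass(image)

band = np.random.rand(64, 64)
smoothed = low_pass(band)    # emphasizes large-area brightness regimes
detail = high_pass(band)     # emphasizes edges and local detail
```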
SPECTRAL RATIOING
• Ratioing is one of the most common transforms applied to image data.
• Serves to highlight subtle variations in the spectral responses of various land covers.
• By ratioing the data from two different spectral bands, the resultant image enhances variations in the slopes of the spectral reflectance curves between the two different spectral ranges.
Example:
Healthy vegetation reflects strongly in the NIR portion of the spectrum while absorbing strongly in the visible red. Other surfaces, such as soil and water, show near-equal reflectances in both the NIR and red portions. Thus, a ratio image of Landsat MSS Band 7 (NIR, 0.8 to 1.1 µm) divided by Band 5 (Red, 0.6 to 0.7 µm) would result in ratios much greater than 1.0 for vegetation, and ratios around 1.0 for soil and water. Thus the discrimination of vegetation from other surface cover types is significantly enhanced.
Normalized Difference Vegetation Index (NDVI)
Because spectral ratioing looks at relative values instead of absolute brightness values, variations in scene illumination as a result of topographic effects are reduced. More complex ratios, involving the sums of and differences between spectral bands for various sensors, have been developed for monitoring vegetation conditions. A widely used image transform is the Normalized Difference Vegetation Index (NDVI), used to monitor vegetation conditions on continental and global scales:

NDVI = (NIR - Red) / (NIR + Red)
where, NIR and Red are the spectral reflectance in the sensor’s near-infrared and red bands, respectively. High
NDVI values will result from the combination of a high reflectance in the near infrared and lower reflectance in the
red band. This combination is typical of the spectral “signature” of vegetation. Non-vegetated areas, including bare
soil, open water, snow/ice, and most construction materials, will have much lower NDVI values.
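The NDVI computation itself is a one-liner; in this sketch nir and red are assumed reflectance arrays, and the small epsilon guarding against division by zero is an implementation detail rather than part of the index definition.

```python
# Sketch of NDVI from near-infrared and red reflectance bands.
import numpy as np

def ndvi(nir, red, eps=1e-9):
    # NDVI = (NIR - Red) / (NIR + Red); eps guards against divide-by-zero
    return (nir - red) / (nir + red + eps)

nir = np.random.rand(50, 50)   # assumed reflectance values in [0, 1]
red = np.random.rand(50, 50)
index = ndvi(nir, red)         # values near +1 indicate dense healthy vegetation
```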
NORMALIZED DIFFERENCE VEGETATION INDEX (NDVI)
Fig: Normalized Difference Vegetation Index (NDVI) images from Landsat imagery in (a) 2000 and (b) 2011, Xinjiang, China. Brightest patches are irrigated agricultural fields; intermediate gray tones are riparian vegetation along current or former river channels; dark areas are sparsely vegetated or desert.
PRINCIPAL COMPONENT ANALYSIS (PCA)
▪ Extensive interband correlation is a problem frequently encountered in the analysis of multispectral image
data. That is, images generated by digital data from various wavelength bands often appear similar and
convey essentially the same information. Principal component transformations is a technique designed to
reduce such redundancy in multispectral data. These transformations may be applied either as an
enhancement operation prior to visual interpretation of the data or as a preprocessing procedure prior to
automated classification of the data. Stated differently, the purpose of these procedures is to compress all of
the information contained in an original n-band data set into fewer than n “new bands.” The new bands are
then used in lieu of the original data.
▪ PCA is a powerful mathematical tool for analyzing data:
• it reduces extensive interband correlation in multispectral image data;
• it is a way of identifying patterns in data, and expressing the data in such a way as to highlight their similarities and differences;
• it reduces the dimensions of the data without much loss of information, i.e. it is useful for data compression.
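A brief sketch of a principal component transform on an n-band stack, using an eigendecomposition of the band-to-band covariance matrix; the array shapes and band count are illustrative assumptions.

```python
# Sketch of a principal component transform on a multiband image stack.
import numpy as np

def principal_components(stack, n_keep):
    """stack: (rows, cols, bands) array; returns (rows, cols, n_keep)."""
    rows, cols, bands = stack.shape
    pixels = stack.reshape(-1, bands).astype(float)
    pixels -= pixels.mean(axis=0)                 # center each band
    cov = np.cov(pixels, rowvar=False)            # band-to-band covariance
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_keep]    # keep the largest components
    return (pixels @ eigvecs[:, order]).reshape(rows, cols, n_keep)

image = np.random.rand(100, 100, 6)               # assumed 6-band image
pcs = principal_components(image, n_keep=3)       # first 3 "new bands"
```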
IMAGE CLASSIFICATION
▪ A human analyst attempting to classify features in an image uses the elements of visual interpretation to identify
homogeneous groups of pixels which represent various features or land cover classes of interest.
▪ Digital image classification uses the spectral information represented by the digital numbers in one or more
spectral bands, and attempts to classify each individual pixel based on this spectral information.
▪ Objective is to assign all pixels in the image to particular classes or themes (e.g. water, coniferous forest,
deciduous forest, corn, wheat, etc.). The resulting classified image is comprised of a mosaic of pixels, each of
which belong to a particular theme.
IMAGE CLASSIFICATION
▪ Information classes are those categories of interest that the analyst is actually trying to identify in the imagery,
such as different kinds of crops, different forest types or tree species, different geologic units or rock types, etc.
▪ Spectral classes are groups of pixels that are uniform (or near‐similar) with respect to their brightness values in the
different spectral channels of the data
▪ Objective is to match the spectral classes in the data to the information classes of interest
▪ Common image classification procedures are divided into two broad subdivisions based on the method used:
• Supervised Classification
• Unsupervised Classification
SUPERVISED CLASSIFICATION
▪ In this, the analyst identifies in the imagery homogeneous representative samples (Training areas) of the different
surface cover types (information classes) of interest. The selection of appropriate training areas is based on the
analyst's familiarity with the geographical area & their knowledge of actual surface cover types present in the
image.
▪ The numerical information in all spectral bands for the pixels comprising these areas is used to "train" the computer to recognize spectrally similar areas for each class.
▪ The computer uses a special program or algorithm to determine the numerical "signatures" for each training class
and then each pixel in the image is compared to these signatures and labeled as the class it most closely
"resembles" digitally

Fig: Supervised Classification


SUPERVISED CLASSIFICATION
Figure shows the location of a single line of data collected for our hypothetical example over a landscape composed of several cover types. For each of the pixels shown along this line, the sensor has measured scene radiance in terms of DNs recorded in each of the five spectral bands of sensing: blue, green, red, near infrared, and thermal infrared. Below the scan line, typical DNs measured over six different land cover types are shown. The vertical bars indicate the relative gray values in each spectral band. These five outputs represent a coarse description of the spectral response patterns of the various terrain features along the scan line. If these spectral patterns are sufficiently distinct for each feature type, they may form the basis for image classification.
Fig: Selected multispectral sensor measurements made along one scan line. Sensor covers the following spectral bands: 1, blue; 2, green; 3, red; 4, near infrared; 5, thermal infrared.
SUPERVISED CLASSIFICATION
• Basic steps in supervised classification:
In the training stage (1), the analyst identifies representative training areas and develops a numerical description of the spectral attributes of each land cover type of interest in the scene. Next, in the classification stage (2), each pixel in the image data set is categorized into the land cover class it most closely resembles. If the pixel is insufficiently similar to any training data set, it is usually labeled "unknown." After all pixels in the input image have been categorized, the results are presented in the output stage (3). Being digital in character, the results may be used in a number of different ways. Three typical forms of output products are thematic maps, tables of statistics for the various land cover classes, and digital data files amenable to inclusion in a GIS. In this latter case, the classification "output" becomes a GIS "input".
Fig: Basic steps in supervised classification.
UNSUPERVISED CLASSIFICATION
In this, spectral classes are grouped first, based solely on the numerical information in the data, and are then matched by the analyst to information classes (if possible). Programs called clustering algorithms are used to determine the natural groupings in the data, e.g. the ISODATA algorithm.
Usually, the analyst specifies how many groups or clusters are to be looked for in the data. This results in some clusters that the analyst will want to subsequently combine, or clusters that should be broken down further, each of these requiring a further application of the clustering algorithm.
Thus, unsupervised classification is also not completely without human intervention. However, it does not start with a pre-determined set of classes as in a supervised classification.
Fig: Unsupervised classification.
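A toy sketch of unsupervised classification using basic k-means clustering, a simpler relative of ISODATA (which additionally splits and merges clusters); the band stack and cluster count are assumptions.

```python
# Sketch of unsupervised classification via basic k-means clustering.
import numpy as np

def kmeans_classify(stack, k=5, iterations=20, seed=0):
    pixels = stack.reshape(-1, stack.shape[-1]).astype(float)
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), k, replace=False)]  # initial means
    for _ in range(iterations):
        # Assign each pixel to its nearest cluster center.
        d = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute each center as the mean of its assigned pixels.
        for c in range(k):
            if np.any(labels == c):
                centers[c] = pixels[labels == c].mean(axis=0)
    return labels.reshape(stack.shape[:2])

stack = np.random.rand(60, 60, 5)          # assumed 5-band image
clusters = kmeans_classify(stack, k=5)     # analyst then names the clusters
```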