RS Complete Pdfs
Examples of remote sensing systems: sonar, the eye, the gravity meter.
REMOTE SENSING OF ELECTROMAGNETIC ENERGY
"Remote sensing is detecting and measuring electromagnetic energy emanating or reflected from distant objects made of various materials, so that we can identify and categorize these objects by class or type, substance and spatial distribution."
[American Society of Photogrammetry, 1975]
"Remote sensing is a technology for sampling electromagnetic radiation to acquire and interpret non-immediate geospatial data from which to extract information about features and objects on the Earth's land surface, oceans, and atmosphere."
- Dr. Nicholas Short
http://geoportal.icimod.org
STAGES OF REMOTE SENSING
E = h·f = h·c / λ
where E is the energy of a photon, h is Planck's constant, c is the speed of light, f is the frequency and λ is the wavelength of the radiation.
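As a quick check of this relation, here is a minimal Python sketch (the 0.48 μm wavelength is the solar-peak value quoted later in these notes):

    h = 6.626e-34  # Planck's constant (J*s)
    c = 3.0e8      # speed of light (m/s)

    wavelength = 0.48e-6        # 0.48 um, peak of solar irradiance (m)
    frequency = c / wavelength  # f = c / lambda (Hz)
    energy = h * frequency      # E = h*f, identical to h*c/lambda (J)

    print(f"f = {frequency:.3e} Hz, E = {energy:.3e} J")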
• Active remote sensing: Energy is generated and emitted from a sensing platform towards the targets
• Energy reflected back by the targets is recorded
• Longer wavelength bands are used
• Example: Active microwave remote sensing (radar)
–Pulses of microwave signals are sent towards the target from the radar antenna located on the air- / space-borne platform
–The energy reflected back (echoes) is recorded at the sensor
Basic components of an ideal Remote Sensing System
1) A uniform energy source
2) A non-interfering atmosphere
3) A series of unique energy–matter interactions at the Earth's surface
4) A super sensor
5) A real-time data handling system
6) Multiple data users
Applications of Remote Sensing
ELECTROMAGNETIC RADIATION (EMR)
Every substance in the universe having temperature above absolute zero radiates electromagnetic energy.
Remote sensing detects and measures electromagnetic energy from distant objects made of various
materials, to identify and categorize those objects by class or type, substance and spatial distribution.
Electromagnetic Spectrum
Region (wavelength in μm) and remarks:
• Gamma rays (< 3×10⁻⁵): Not available for remote sensing; incoming radiation is absorbed by the atmosphere.
• X-rays (3×10⁻⁵ – 3×10⁻³): Not available for remote sensing; incoming radiation is absorbed by the atmosphere.
• Ultraviolet (UV) rays (0.03 – 0.4): Wavelengths < 0.3 μm are absorbed by the ozone layer; wavelengths between 0.3 and 0.4 μm are transmitted and termed the "photographic UV band".
• Visible (0.4 – 0.7): Detectable with film and photodetectors.
• Infrared (IR) (0.7 – 100): Specific atmospheric windows allow maximum transmission; the photographic IR band (0.7–0.9 μm) is detectable with film; principal atmospheric windows exist in the thermal IR region (3–5 μm and 8–14 μm).
• Microwave (10³ – 10⁶): Can penetrate rain, fog and clouds; both active and passive remote sensing are possible; radar uses wavelengths in this range.
• Radio (> 10⁶): The longest wavelengths; used for remote sensing radars.
Energy Sources and Radiation Principle - Solar Radiation
• Sun is the primary source of energy
that illuminates features on the earth
surface
• SOLAR RADIATION
➢ Solar radiation (insolation)
arrives at the earth at different
wavelengths
➢ The amount of energy it
produces is not uniform across
all wavelengths
‒ Almost 99% is within the
range of 0.28-4.96 μm
‒ Within this range, 43% is
radiated in the visible region
between 0.4-0.7 μm
‒ Maximum energy (E) is available at 0.48 μm wavelength (visible green)
Figure: Irradiance distribution of Sun and Earth (http://www.csulb.edu)
Irradiance: Power of electromagnetic radiation per unit area incident on a surface
Radiation from the Earth
• Earth and terrestrial objects also emit electromagnetic radiation
‒ All matter at temperatures above absolute zero (0 K or −273 °C) emits electromagnetic radiation continuously
‒ The amount of radiation from such objects is a function of the temperature of the object
‒ Applicable for objects that behave as a blackbody
• Solar radiation
‒ Sun's temperature is around 6000 K
‒ In the spectral curve at 6000 K, the visible part of the energy (0.4–0.7 μm) dominates
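How the emission peak shifts with temperature can be sketched with Wien's displacement law (λ_max = b / T, a standard blackbody result not spelled out in these notes):

    b = 2898.0  # Wien's displacement constant (um*K)

    for name, T in [("Sun", 6000.0), ("Earth", 300.0)]:
        peak_um = b / T  # wavelength of maximum emission (um)
        print(f"{name} (~{T:.0f} K): peak emission near {peak_um:.2f} um")

    # Sun (~6000 K): peak near 0.48 um (visible green), matching the solar
    # peak quoted earlier; Earth (~300 K): peak near 9.66 um (thermal IR,
    # inside the 8-14 um atmospheric window).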
INTERACTIONS WITH THE ATMOSPHERE
EMR has to travel through some distance of the Earth's atmosphere before it reaches the Earth's surface. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.
Atmospheric windows
• Gases absorb electromagnetic energy in very specific regions of the spectrum; they influence where (in the spectrum) we can look for remote sensing purposes
• Those areas of the spectrum which are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows
• Selective wavelength bands are used in remote sensing
• Electromagnetic energy interacts with the atmospheric gases and particles
- Scattering and Absorption
- Atmosphere absorbs / backscatters a fraction of the energy and transmits the remainder
• Atmospheric windows : Wavelength regions through which most of the energy is transmitted through atmosphere
INTERACTIONS WITH THE ATMOSPHERE
The intensity and the spectral composition of the incident radiation are
altered by the atmospheric effects
Atmospheric interaction depends on the
- Properties of the radiation such as magnitude and wavelength
- Atmospheric conditions
- Path length
Atmospheric Scattering
• Scattering is the redirection of EMR by particles suspended in the atmosphere or by large molecules of atmospheric gases, which diffuses a portion of the incident radiation in all directions.
• The amount of scattering depends upon the size of the particles, their abundance, the wavelength of
radiation, depth of the atmosphere through which the energy is travelling and the concentration of
the particles.
• There is no energy transformation during scattering
• Scattering not only reduces the image contrast but also changes the spectral signature of ground
objects as seen by the sensor.
• Types of Scattering
➢ Rayleigh scattering
➢ Mie scattering
➢ Non-selective scattering
Source: http://www.geog.ucsb.edu/~joel/g110_w08/lecture_notes/radiation_atmosphere/radiation_atmosphere.html
Rayleigh Scattering
• Also known as selective scattering or molecular scattering; caused by atmospheric molecules and other tiny particles
• Strongly dependent on the wavelength: scattering is inversely proportional to the fourth power of the wavelength (∝ 1/λ⁴)
• It predominates where EMR interacts with particles that are much smaller than the wavelength of the incoming light (particle size less than 1/10th of the wavelength)
• Shorter wavelengths are scattered more than longer wavelengths
• In the absence of these particles and scattering, the sky would appear black
• Rayleigh scattering is the most important type of scattering in remote sensing
• Scattering of the visible bands is caused mainly by molecules of oxygen and nitrogen
▪ Blue (shorter wavelength) is scattered more
- Blue light is scattered around four times as much as red light
- UV light is scattered about 16 times as much as red light
- A "blue" sky is a manifestation of Rayleigh scatter
▪ Orange or red colour during sunrise and sunset
- Sun rays have to travel a longer path
- Complete scattering (and absorption) of the shorter wavelengths
- Only the longer wavelengths (orange and red), which are less scattered, remain visible
▪ Other examples
- The haze in imagery
- Bluish-grey cast in a colour image taken from high altitude
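The "four times" and "16 times" figures follow directly from the 1/λ⁴ law; a minimal sketch (the representative wavelengths of 0.47 μm for blue, 0.66 μm for red and 0.33 μm for UV are assumed for illustration):

    # Rayleigh scattering intensity varies as 1/lambda^4, so the ratio of
    # scattering at two wavelengths is (lam_long / lam_short)**4.
    def rayleigh_ratio(lam_short, lam_long):
        """How much more strongly lam_short is scattered than lam_long."""
        return (lam_long / lam_short) ** 4

    blue, red, uv = 0.47, 0.66, 0.33  # assumed wavelengths (um)

    print(f"blue vs red: {rayleigh_ratio(blue, red):.1f}x")  # ~3.9x, i.e. ~4x
    print(f"UV   vs red: {rayleigh_ratio(uv, red):.1f}x")    # ~16x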
Mie Scattering
• Mie scattering occurs when the wavelength of the incoming radiation is almost equal to the diameter of the atmospheric particles.
• It is caused by aerosols: a mixture of gases, water vapour, dust, smoke and pollen; gas molecules are too small to cause Mie scattering of the radiation commonly used for remote sensing.
• It is generally restricted to the lower atmosphere, where the larger particles are abundant, and dominates under overcast cloud conditions.
• Compared to Rayleigh scattering, longer wavelengths are also scattered.
• It influences the entire spectral region from the ultraviolet to the near-infrared.
Source: http://hyperphysics.phy-astr.gsu.edu
Non-Selective Scattering
• This type of scattering occurs when the particle size is much larger than the wavelength of the incoming radiation (diameter greater than 10 times the wavelength being sensed).
• Particles responsible for this effect are water droplets and larger dust particles. Particles such as pollen, cloud droplets, ice crystals and raindrops can cause non-selective scattering of visible light.
• The scattering is independent of the wavelength; all wavelengths are scattered equally.
• The most common example of non-selective scattering is the white appearance of clouds. As clouds consist of water droplets and all wavelengths are scattered in equal amounts, clouds appear white.
Source: http://hyperphysics.phy-astr.gsu.edu
Atmospheric Absorption
• Absorption is a process in which the incident energy is retained by particles in the atmosphere; it reduces the amount of light that reaches our eye, making the scene look relatively duller.
• Unlike scattering, atmospheric absorption causes an effective loss of energy: energy is transformed into other forms.
• Mainly three gases are responsible for most of the absorption of solar radiation, viz. ozone, carbon dioxide and water vapour.
• Ozone absorbs the high-energy, short-wavelength portions of the ultraviolet spectrum (< 0.24 μm), thereby preventing the transmission of this radiation to the lower atmosphere.
• Carbon dioxide is important in remote sensing as it effectively absorbs radiation in the mid and far infrared regions of the spectrum.
• Water vapour absorbs strongly in the bands 5.5–7 μm and around 27 μm.
• Absorption depends on the wavelength of the energy, the atmospheric composition, and the arrangement of the gaseous molecules and their energy levels.
• The absorbing medium will not only absorb a portion of the total energy, but will also reflect, refract or scatter the energy. The absorbed energy may also be transmitted back to the atmosphere.
Atmospheric window
• The ranges of wavelength that are partially or wholly transmitted through the atmosphere
• Remote sensing data acquisition is limited to these atmospheric windows
Atmospheric Window
• Wavelengths shorter than 0.1 μm
– Absorbed by nitrogen and other gaseous components
• Wavelengths shorter than 0.3 μm (X-rays, gamma rays and part of the ultraviolet)
– Mostly absorbed by the ozone (O3)
• Visible part of the spectrum
– Little absorption occurs
• Water vapour in the atmosphere causes strong absorption centered at 6.3 μm
• Infrared (IR) radiation
– Mainly absorbed by water vapour and carbon dioxide molecules
• Far infrared region
– Mostly absorbed by the atmosphere
• Microwave region
– Absorption is almost nil
Atmospheric Window
Major atmospheric windows used in remote sensing and their characteristics
• Reflection
➢ Radiation is redirected after hitting the target
➢ Angle of incidence = angle of reflection
• Transmission
➢ Radiation is allowed to pass through the target
➢ Changes the velocity and wavelength of the radiation
➢ Transmitted energy may be further scattered or absorbed in the medium
• Absorption
➢ Radiation is absorbed by the target
➢ A portion absorbed by the Earth's surface is available for emission as thermal radiation
Reflection vs Scattering
Reflection
• Incident energy is redirected
• Angle of incidence = angle of reflection
➢ The reflected radiation leaves the surface at the
same angle as it approached
Scattering
• A special type of reflection
• Incident energy is diffused in many directions
• Often called diffuse reflection
Reflection or Scattering?
Depends on the roughness of the surface with respect to the incident wavelength
Roughness of the surface < Incident wavelength → Smooth surface → Reflection
Roughness of the surface > Incident wavelength → Rough surface → Scattering
Roughness of the surface controls how the energy is reflected
Mainly two types
➢Specular reflection
➢Diffuse (Lambertian) reflection
Spectral Reflectance
• Represents the reflectance characteristics of earth surface features
• Ratio of energy reflected by the surface to the energy incident on the surface
• Measured as a function of wavelength
• Also known as albedo
• Mathematical representation of spectral reflectance or albedo:

R(λ) = E_R(λ) / E_I(λ)
     = [Energy of wavelength λ reflected from the object / Energy of wavelength λ incident on the object] × 100

• Spectral reflectance characteristics of the surface features are used to identify the features and to study their characteristics
Example:
Generalized spectral reflectance curves for deciduous and coniferous trees
• Sensor selection to differentiate deciduous and coniferous trees
– Curves overlap in the visible portion
– Both classes will be seen in shades of green (maximum reflectance in green gives the green colour)
• Panchromatic photograph using reflected sunlight over the visible wavelengths: coniferous and deciduous trees are not differentiable
• Black and white infrared photograph using reflected sunlight over the 0.7 to 0.9 μm wavelengths: deciduous trees show a bright signature compared to coniferous trees
Spectral Reflectance of Soil
Spectral Reflectance of Water
• The majority of radiation incident on water is not reflected but is either absorbed or transmitted.
• Water provides a semi-transparent medium for the electromagnetic radiation.
• Longer visible wavelengths and near-infrared wavelengths are absorbed more by water than shorter visible wavelengths.
• Thus, water looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near-infrared wavelengths.
• The factors that affect the variability in reflectance of a water body are the depth of the water, materials within the water, and the surface roughness of the water.
• Spectral response varies with
– Wavelength of the radiation
– Physical and chemical characteristics of the water
• Water in liquid phase
– High reflectance in the visible region between 0.4 μm and 0.6 μm
– Wavelengths beyond 0.7 μm are completely absorbed
• Water in solid phase (ice or snow)
– Good reflection at all visible wavelengths
Spectral Reflectance of Vegetation
Time required for the satellite to complete one revolution = Time required for the Earth to rotate once about its polar axis
Geo-stationary Satellites
• A geostationary satellite is launched in such a way that it follows an orbit parallel to the equator (inclination = 0º) and travels in the same direction as the Earth's rotation (west to east) with the same period of 24 hours.
• It always views the same area on the earth, thus monitoring a location continuously.
• A large area of the earth can also be covered by this satellite.
• These are located at a high altitude of about 36,000 km from the Earth's surface. At this distance, only low resolution images are acquired.
GEO-SYNCHRONOUS ORBIT
POLAR ORBIT
Polar or Near-Polar Orbits
• Due to the rotation of the Earth, it is possible to combine the
advantages of low-altitude orbits with global coverage, using near
polar orbiting satellites, which have an orbital plane crossing the
poles.
• These satellites are launched into orbits at high inclinations to the
Earth’s rotation (at low angles with longitude lines), such that they
pass across high latitudes near the poles.
• Inclined at nearly 90 degrees
• Usually low altitude orbits (700-800 km)
• Satellites make more than one revolution around the Earth in a
single day and due to the Earth’s rotation, in each revolution the
satellite passes over different areas
• Gives complete coverage of the earth’s surface during an orbit
cycle
• Orbit cycle: When the nadir point of the satellite passes over
the same point on the Earth’s surface for a second time
• Revisit period: Time lapsed between two successive views of the
same area by a satellite
SUN-SYNCHRONOUS ORBIT
• Sun-synchronous orbits
➢ Satellite passes over the same part of the earth at roughly the same local time each day
SENSORS
• The broad classes of sensors are:
Passive: Energy leading to the radiation received comes from an external source, e.g., the Sun; the MSS is an example.
Active: Energy generated from within the sensor system is beamed outward, and the fraction returned is measured; RADAR is an example.
• Another attribute in this classification is:
Scanning mode: If the scene is sensed point by point (equivalent to small areas within the scene) along successive lines over a finite time, this mode of measurement makes up a scanning system.
Non-scanning: If the entire scene is sensed directly with the sensor, then it is termed a non-scanning system.
• Sensors can be
Non-imaging: Measures the radiation received from all points in the sensed target, integrates this, and reports the result as electrical signal strength or some other quantitative attribute, such as radiance.
Imaging: The electrons released are used to excite or ionize a substance like silver in film, or to drive an image-producing device like a TV or computer monitor, a cathode ray tube, an oscilloscope, or a battery of electronic detectors.
INFORMATION COLLECTED BY SENSORS
SENSOR CHARACTERISTICS
• In remote sensing, resolution means the resolving power
• Radiometric resolution: Sensitivity of the sensor to the magnitude of the electromagnetic energy
❖ How many grey levels are measured between pure black (no reflectance) and pure white (maximum reflectance)
❖ The finer the radiometric resolution of a sensor, the more sensitive it is in detecting small differences in the energy
❖ The finer the radiometric resolution of a sensor, the more grey levels the system can measure

Radiometric resolution and the corresponding brightness levels available:
• 1 bit: 2¹ = 2 levels
• 7 bit: 2⁷ = 128 levels (IRS 1A & 1B)
• 8 bit: 2⁸ = 256 levels (Landsat TM)
• 11 bit: 2¹¹ = 2048 levels (NOAA-AVHRR)
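What radiometric resolution means in practice can be sketched by quantizing the same signal at different bit depths (the reflectance values below are made-up examples):

    import numpy as np

    # Quantize a normalized signal (0..1) to the DN range of a bit depth.
    def quantize(signal, bits):
        levels = 2 ** bits  # number of grey levels available
        return np.round(signal * (levels - 1)).astype(int)

    signal = np.array([0.00, 0.33, 0.50, 0.66, 1.00])  # made-up reflectances
    for bits in (1, 7, 8, 11):
        print(f"{bits:2d} bit ({2 ** bits:4d} levels): {quantize(signal, bits)}")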
DIGITAL IMAGE
Character of Digital Image Data:
Although the image appears to be a continuous-tone photograph, it is actually composed of a two-dimensional array of discrete picture elements, or pixels. The intensity of each pixel corresponds to the average brightness, or radiance, measured electronically over the ground area corresponding to each pixel.
Whereas the individual pixels are virtually impossible to discern in figure (a), they are observable in the enlargement shown in (b). The enlargement corresponds to a sub-area located in (a). Part (c) shows the individual Digital Numbers (DNs) corresponding to the average radiance measured in each pixel. These values are simply positive integers that result from quantizing the original electrical signal from the sensor into positive integer values using a process called 'Analogue-to-Digital' (A-to-D) signal conversion.
DIGITAL IMAGE PROCESSING
GEOMETRIC CORRECTIONS
Sources of Geometric Errors of Image
RADIOMETRIC CORRECTIONS
• The radiance measured by any given system over a given object is influenced by such factors as changes in
scene illumination, atmospheric conditions, viewing geometry, and instrument response characteristics. Some
of these effects, such as viewing geometry variations, are greater in the case of airborne data collection than
in satellite image acquisition. Also, the need to perform correction for any or all of these influences depends
directly upon the particular application at hand.
• Over the course of the year, there are systematic, seasonal changes in the intensity of solar irradiance incident
on the earth’s surface. If remotely sensed images taken at different times of the year are being compared, it is
usually necessary to apply a sun elevation correction and an earth–sun distance correction. The sun elevation
correction accounts for the seasonal position of the sun relative to the earth. Through this process, image data
acquired under different solar illumination angles are normalized by calculating pixel brightness values
assuming the sun was at the zenith on each date of sensing. The correction is usually applied by dividing
each pixel value in a scene by the sine of the solar elevation angle (or cosine of the solar zenith angle) for the
particular time and location of imaging.
RADIOMETRIC CORRECTIONS
• The earth–sun distance correction is applied to normalize for the seasonal changes in the distance between the earth and the sun. The earth–sun distance is usually expressed in astronomical units. (An astronomical unit is equivalent to the mean distance between the earth and the sun, approximately 149.6 × 10⁶ km.) The irradiance from the sun decreases as the square of the earth–sun distance.
• Ignoring atmospheric effects, the combined influence of solar zenith angle and earth–sun distance on the irradiance incident on the earth's surface can be expressed as

E = (E₀ cos θ₀) / d²

where E is the normalized solar irradiance, E₀ is the solar irradiance at the mean earth–sun distance, θ₀ is the solar zenith angle, and d is the earth–sun distance in astronomical units.
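A minimal sketch of both corrections applied to a pixel value (the DN, elevation angle and distance below are made-up example inputs):

    import math

    # Normalize a pixel value for solar elevation and earth-sun distance.
    def normalize_pixel(dn, sun_elevation_deg, earth_sun_dist_au):
        # Sun elevation correction: divide by sin(solar elevation),
        # equivalently cos(solar zenith angle).
        dn = dn / math.sin(math.radians(sun_elevation_deg))
        # Earth-sun distance correction: irradiance falls off as 1/d^2,
        # so multiply by d^2 to normalize to the mean distance (1 AU).
        return dn * earth_sun_dist_au ** 2

    # Made-up example: DN of 120, sun 35 degrees above horizon, d = 0.983 AU.
    print(normalize_pixel(120, 35.0, 0.983))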
RADIOMETRIC CORRECTIONS
The influence of solar illumination variation is compounded by atmospheric effects. The atmosphere affects the radiance measured at any point in the scene in two contradictory ways. First, it attenuates (reduces) the energy illuminating a ground object. Second, it acts as a reflector itself, adding a scattered, extraneous "path radiance" to the signal detected by a sensor. Thus, the composite signal observed at any given pixel location can be expressed by

L_tot = ρ·E·T / π + L_p

where L_tot is the total spectral radiance measured by the sensor, ρ is the reflectance of the object, E is the irradiance on the object, T is the transmittance of the atmosphere, and L_p is the path radiance.
RADIOMETRIC CORRECTIONS
• Only the first term in the equation contains valid information about ground reflectance. The second
term represents the scattered path radiance, which introduces “haze” in the imagery and reduces image
contrast.
• Haze compensation procedures are designed to minimize the influence of
path radiance effects. One means of haze compensation in multispectral data is to observe the radiance
recorded over target areas of essentially zero reflectance. For example, the reflectance of deep clear water
is essentially zero in the near-infrared region of the spectrum. Therefore, any signal observed over such an
area represents the path radiance, and this value can be subtracted from all pixels in that band. This
process is referred to as dark object subtraction.
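A minimal sketch of dark object subtraction on a single band, assuming the scene contains a target of essentially zero reflectance such as deep clear water (the array values are made-up DNs):

    import numpy as np

    # Dark object subtraction: treat the darkest pixels in a band as pure
    # path radiance and subtract that value from every pixel in the band.
    def dark_object_subtraction(band):
        haze = band.min()  # signal over the darkest target = path radiance
        return np.clip(band - haze, 0, None), haze

    band = np.array([[12, 40, 85],
                     [13, 55, 90],
                     [12, 60, 97]])  # made-up NIR DNs; 12 ~ deep clear water
    corrected, haze = dark_object_subtraction(band)
    print(f"estimated path radiance: {haze}")
    print(corrected)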
• For convenience, haze compensation routines are often applied uniformly throughout a scene. This may or may not be valid, depending on the uniformity of the atmosphere over a scene. When extreme viewing angles are involved in image acquisition, it is often necessary to compensate for the influence of the varying atmospheric path length through which the scene is recorded.
• More advanced methods have been developed for atmospheric correction of optical and thermal images,
when simple haze-removal techniques like dark object subtraction are insufficient. These algorithms are
broadly divided into those based on empirical correction using spectral data from the imagery itself and
those using radiative transfer methods to model atmospheric scattering and absorption from physical
principles. In many cases, these algorithms may require information about local atmospheric conditions at
the time of image acquisition.
RADIOMETRIC CORRECTIONS
• When spectral data from more than one image need to be compared, but there is not sufficient information
available for a complete atmospheric correction process, an alternative is radiometric normalization of the
images. This process involves adjusting the brightness values of one or more secondary images to match a
single base image. The images must at least partially overlap, and the overlap area must contain several
temporally stable targets, features whose true surface reflectance is assumed to be constant over time.
Typically, the analyst identifies a set of these targets covering a range of brightness values, then uses a
statistical method such as linear regression to establish a model relating the brightness values in each
secondary image to the corresponding brightness values in the base image. Each secondary image is then
normalized using its own regression model.
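A short sketch of this normalization using an ordinary least-squares fit over temporally stable targets (the DN pairs are made-up; numpy's polyfit stands in for whatever regression tool is at hand):

    import numpy as np

    # Brightness values of temporally stable targets sampled from the base
    # image and from one secondary image (made-up example DNs).
    base      = np.array([20, 45, 80, 130, 190, 240], dtype=float)
    secondary = np.array([30, 52, 83, 126, 178, 222], dtype=float)

    # Fit secondary -> base with a degree-1 (linear) least-squares model.
    gain, offset = np.polyfit(secondary, base, deg=1)
    print(f"normalized DN = {gain:.3f} * DN + {offset:.2f}")

    # Apply the model to every pixel of the secondary image.
    secondary_image = np.array([[30, 100], [150, 222]], dtype=float)
    print(gain * secondary_image + offset)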
• Another radiometric data processing activity involved in many quantitative applications of digital image data
is conversion of DNs to absolute radiance (or reflectance) values. This operation accounts for the exact form
of the A-to-D response functions for a given sensor and is essential in applications where measurement of
absolute radiances is required. For example, such conversions are necessary when changes in the absolute
reflectance of objects are to be measured over time using different sensors (e.g., the TM on Landsat-5 versus
the OLI on Landsat-8). Likewise, such conversions are important in the development of mathematical
models that physically relate image radiance or reflectance data to quantitative ground measurements (e.g.,
water quality measurements).
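Conversion of DNs to at-sensor radiance is typically a linear rescaling with sensor calibration coefficients; a generic sketch (the gain and offset below are placeholders, not real calibration values for any sensor):

    import numpy as np

    # Generic linear DN-to-radiance conversion: L = gain * DN + offset,
    # where gain and offset come from the sensor's calibration metadata.
    def dn_to_radiance(dn, gain, offset):
        return gain * dn.astype(float) + offset

    dn = np.array([[0, 128], [200, 255]])
    print(dn_to_radiance(dn, gain=0.05, offset=1.2))  # placeholder values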
DIGITAL IMAGE PROCESSING
• Atmospheric Corrections
Atmospheric models can be used to account for the effects of scattering and absorption in the atmosphere. A number of parameters are required to accurately apply atmospheric correction, including properties such as the amount of water vapour and the distribution of aerosols. Sometimes these data can be collected by field instruments that measure atmospheric gases and aerosols, but this is often expensive and time consuming. Other satellite data can also be used to help estimate the amount and distribution of atmospheric aerosols. Many software packages include special atmospheric correction modules that use atmospheric radiative transfer models to produce an estimate of the true surface reflectance.
GEOMETRIC REGISTRATION
• It involves identifying the image coordinates of several clearly discernible points, called Ground Control Points (GCPs). The true ground coordinates are typically measured from a map (b ‐ b1 to b4), either in paper or digital format. Several well-distributed GCP pairs are identified, and the coordinate information is processed by the computer to determine the proper transformation equations to apply to the original image coordinates to map them into their new ground coordinates.
• It is also performed by registering one (or more)
images to another image, instead of to geographic
coordinates
RESAMPLING
• Resampling is used to determine the digital values to place in the new pixel locations of the corrected output image.
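The simplest resampling scheme is nearest neighbour: each pixel of the corrected output image takes the value of the closest input pixel. A minimal sketch (the tiny image and the 2x upscale are made-up examples):

    import numpy as np

    # Nearest-neighbour resampling: each output pixel takes the value of
    # the nearest pixel in the original input image.
    def resample_nearest(image, out_shape):
        rows = np.round(np.linspace(0, image.shape[0] - 1, out_shape[0])).astype(int)
        cols = np.round(np.linspace(0, image.shape[1] - 1, out_shape[1])).astype(int)
        return image[np.ix_(rows, cols)]

    img = np.array([[10, 20],
                    [30, 40]])  # made-up 2x2 image
    print(resample_nearest(img, (4, 4)))  # 2x upscaling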
a) Gray-level thresholding: Used to segment an input image into two classes - one for those pixels having values below an analyst-defined gray level and one for those above this value.
b) Level slicing: In this, the DNs distributed along the x-axis of an image histogram are divided into a series of analyst-specified intervals or "slices". All of the DNs falling within a given interval in the input image are then displayed at a single DN in the output image. Consequently, if six different slices are established, the output image contains only six different gray levels. Each level can also be shown as a single colour.
c) Contrast stretching: Contrast stretching expands the narrow range of brightness values present in an input image over a wider range of gray values. The result is an output image designed to provide a contrast between features of interest to the image analyst.
GRAY‐LEVEL THRESHOLDING
➢ Level slicing: In this, the DNs distributed along the x-axis of an image histogram are divided into a series of analyst-specified intervals or "slices." All of the DNs falling within a given interval in the input image are then displayed at a single DN in the output image. Consequently, if six different slices are established, the output image contains only six different gray levels. The result looks something like a contour map, except that the areas between boundaries are occupied by pixels displayed at the same DN. Each level can also be shown as a single color.
The application of level slicing to the "water" portion of the scene is illustrated in Figure (d). Here, Landsat-8 OLI band 4 data have been level sliced into multiple levels in those areas previously determined to be water from the band 5 binary mask. Level slicing is used extensively in the display of thermal infrared images in order to show discrete temperature ranges coded by gray level or color.
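A small numpy sketch of level slicing (the slice boundaries and DNs are made-up):

    import numpy as np

    # Level slicing: divide the DN range into analyst-specified intervals;
    # every DN falling in an interval displays at a single output level.
    def level_slice(band, boundaries):
        return np.digitize(band, boundaries)  # slice index for each pixel

    band = np.array([[5, 60, 120],
                     [35, 90, 200]])  # made-up DNs
    boundaries = [32, 64, 128]        # three cuts -> four slices (levels 0-3)
    print(level_slice(band, boundaries))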
Contrast stretching: Contrast stretching expands the narrow range of brightness values present in an input image over a wider range of gray values. The result is an output image designed to provide a contrast between features of interest to the image analyst.
Linear stretch:
A more expressive display would result if we were to expand the range of image levels present in the scene (60 to 158) to fill the range of display values (0 to 255). The range of image values is uniformly expanded to fill the total range of the output device. The linear stretch is applied to each pixel in the image using the algorithm

DN' = (DN − MIN) / (MAX − MIN) × 255

where DN' is the output digital number, DN is the input digital number, and MIN and MAX are the minimum and maximum values in the input image (here 60 and 158).
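The same stretch as a numpy sketch (the 60 to 158 input range comes from the example above):

    import numpy as np

    # Linear contrast stretch: map the scene range [60, 158] onto the full
    # display range [0, 255].
    def linear_stretch(band, lo=60, hi=158):
        out = (band.astype(float) - lo) / (hi - lo) * 255.0
        return np.clip(out, 0, 255).astype(np.uint8)

    band = np.array([[60, 109], [140, 158]])  # made-up DNs within 60-158
    print(linear_stretch(band))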
Low‐pass filter:
• Designed to emphasize larger, homogeneous areas of
similar tone and reduce the smaller detail in an image.
• Generally serve to smooth the appearance of an image.
• Examples: Average and median filters.
High‐pass filters:
• Do the opposite and serve to sharpen the appearance of
fine detail in an image.
• It first applies a low‐pass filter to an image and then
subtracts the result from the original, leaving behind only
the high spatial frequency information.
• Directional, or edge detection filters are designed to highlight
linear features, such as roads or field boundaries.
• Useful in applications such as geology, for the detection of
linear geologic structures.
IMAGE FILTERING
▪ Spatial filtering is a “neighborhood” operation in that pixel values in an original image are modified on the basis of the
gray levels of neighboring pixels. For example, a simple low-pass filter may be implemented by passing a moving
window throughout an original image and creating a second image whose DN at each pixel corresponds to the
neighborhood average within the moving window at each of its positions in the original image.
▪ A simple high-pass filter may be implemented by subtracting a low-pass filtered image (pixel by pixel) from the
original, unprocessed image.
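A compact sketch of both operations using a 3x3 moving-window average, in plain numpy (the image values are made-up):

    import numpy as np

    # Low-pass: 3x3 neighborhood average computed with a moving window.
    def low_pass(image):
        padded = np.pad(image.astype(float), 1, mode="edge")
        out = np.zeros(image.shape, dtype=float)
        for dr in (-1, 0, 1):          # sum the nine shifted copies
            for dc in (-1, 0, 1):
                out += padded[1 + dr : 1 + dr + image.shape[0],
                              1 + dc : 1 + dc + image.shape[1]]
        return out / 9.0

    # High-pass: original minus low-pass, leaving only the fine detail.
    def high_pass(image):
        return image - low_pass(image)

    img = np.array([[10, 10, 10, 10],
                    [10, 90, 90, 10],
                    [10, 90, 90, 10],
                    [10, 10, 10, 10]], dtype=float)  # made-up DNs
    print(low_pass(img))
    print(high_pass(img))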
The original image is shown in Figure (a). Figure (b) shows the low frequency component image, and Figure (c) illustrates the high
frequency component image. Note that the low frequency component image (b) reduces deviations from the neighborhood average,
which smooths or blurs the detail in the original image, reduces the gray-level range, but emphasizes the large-area brightness
regimes of the original image. The high frequency component image (c) enhances the spatial detail in the image at the expense of
the large-area brightness information. Both images have been contrast stretched. (Such stretching is typically required because
spatial filtering reduces the gray-level range present in an image.)
SPECTRAL RATIOING
It is one of the most common transforms applied to image data.
• Serves to highlight subtle variations in the spectral responses of various land covers.
• By ratioing the data from two different spectral bands, the resultant image
enhances variations in the slopes of the spectral reflectance curves between the two
different spectral ranges.
Example:
Healthy vegetation reflects strongly in the NIR portion of the spectrum while absorbing strongly in the visible red. Other surfaces, such as soil and water, show near-equal reflectances in both the NIR and red portions. Thus, a ratio image of Landsat MSS Band 7 (NIR, 0.8 to 1.1 μm) divided by Band 5 (Red, 0.6 to 0.7 μm) would result in ratios much greater than 1.0 for vegetation, and ratios around 1.0 for soil and water. Thus the discrimination of vegetation from other surface cover types is significantly enhanced.
Normalized Difference Vegetation Index (NDVI)
In spectral ratioing, we are looking at relative values instead of absolute brightness values, so variations in scene illumination as a result of topographic effects are reduced. More complex ratios, involving the sums of and differences between spectral bands for various sensors, have been developed for monitoring vegetation conditions. A widely used image transform is the Normalized Difference Vegetation Index (NDVI), used to monitor vegetation conditions on continental and global scales:

NDVI = (NIR − Red) / (NIR + Red)

where NIR and Red are the spectral reflectance in the sensor's near-infrared and red bands, respectively. High NDVI values will result from the combination of a high reflectance in the near-infrared and lower reflectance in the red band. This combination is typical of the spectral "signature" of vegetation. Non-vegetated areas, including bare soil, open water, snow/ice, and most construction materials, will have much lower NDVI values.
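A minimal per-pixel NDVI computation in numpy (the reflectance arrays are made-up values):

    import numpy as np

    # NDVI = (NIR - Red) / (NIR + Red), computed pixel by pixel.
    def ndvi(nir, red):
        nir, red = nir.astype(float), red.astype(float)
        return (nir - red) / (nir + red + 1e-10)  # epsilon avoids divide-by-zero

    red = np.array([[0.05, 0.20], [0.08, 0.30]])  # made-up red reflectance
    nir = np.array([[0.45, 0.22], [0.50, 0.28]])  # made-up NIR reflectance
    print(ndvi(nir, red).round(2))  # vegetation pixels approach +0.8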
NORMALIZED DIFFERENCE VEGETATION INDEX (NDVI)
▪ Extensive interband correlation is a problem frequently encountered in the analysis of multispectral image data. That is, images generated by digital data from various wavelength bands often appear similar and convey essentially the same information. Principal component transformation is a technique designed to reduce such redundancy in multispectral data. These transformations may be applied either as an enhancement operation prior to visual interpretation of the data or as a preprocessing procedure prior to automated classification of the data. Stated differently, the purpose of these procedures is to compress all of the information contained in an original n-band data set into fewer than n "new bands." The new bands are then used in lieu of the original data.
▪ PCA is a powerful mathematical tool for analyzing data:
- It reduces extensive interband correlation in multispectral image data
- It is a way of identifying patterns in data, and expressing the data in such a way as to highlight their similarities and differences
- It reduces the dimensions of the data without much loss of information, i.e., it is useful for data compression
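A compact PCA sketch on a small multiband data set using numpy's eigendecomposition (the band values are made-up; in practice each band of the image would be flattened into one column):

    import numpy as np

    # PCA on multispectral data: rows = pixels, columns = bands.
    def principal_components(pixels):
        centered = pixels - pixels.mean(axis=0)  # remove each band's mean
        cov = np.cov(centered, rowvar=False)     # band-to-band covariance
        eigvals, eigvecs = np.linalg.eigh(cov)   # symmetric matrix -> eigh
        order = np.argsort(eigvals)[::-1]        # largest variance first
        return centered @ eigvecs[:, order], eigvals[order]

    # Made-up data: 6 pixels x 3 highly correlated bands.
    pixels = np.array([[10, 12, 11], [20, 22, 21], [30, 33, 31],
                       [40, 41, 42], [50, 53, 52], [60, 62, 61]], dtype=float)
    pcs, variances = principal_components(pixels)
    print(variances.round(2))  # nearly all variance in the first component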
IMAGE CLASSIFICATION
▪ A human analyst attempting to classify features in an image uses the elements of visual interpretation to identify
homogeneous groups of pixels which represent various features or land cover classes of interest.
▪ Digital image classification uses the spectral information represented by the digital numbers in one or more
spectral bands, and attempts to classify each individual pixel based on this spectral information.
▪ The objective is to assign all pixels in the image to particular classes or themes (e.g. water, coniferous forest, deciduous forest, corn, wheat, etc.). The resulting classified image is composed of a mosaic of pixels, each of which belongs to a particular theme.
IMAGE CLASSIFICATION
▪ Information classes are those categories of interest that the analyst is actually trying to identify in the imagery,
such as different kinds of crops, different forest types or tree species, different geologic units or rock types, etc.
▪ Spectral classes are groups of pixels that are uniform (or near‐similar) with respect to their brightness values in the
different spectral channels of the data
▪ Objective is to match the spectral classes in the data to the information classes of interest
▪ Common Image classification procedures are divided in to two broad subdivisions based on the method used:
• Supervised Classification
• Unsupervised Classification
SUPERVISED CLASSIFICATION
▪ In this, the analyst identifies in the imagery homogeneous representative samples (Training areas) of the different
surface cover types (information classes) of interest. The selection of appropriate training areas is based on the
analyst's familiarity with the geographical area & their knowledge of actual surface cover types present in the
image.
▪ The numerical information in all spectral bands for the pixels comprising these areas is used to "train" the computer to recognize spectrally similar areas for each class.
▪ The computer uses a special program or algorithm to determine the numerical "signatures" for each training class, and then each pixel in the image is compared to these signatures and labeled as the class it most closely "resembles" digitally.