Remote Sensing and Satellite Image-2021

This document discusses remote sensing. It defines remote sensing as acquiring information about an object without physical contact using devices like satellites and aircraft. The document outlines the electromagnetic spectrum used in remote sensing, including visible, infrared, and microwave portions. It describes active and passive sensors, with active sensors providing their own illumination and passive sensors detecting naturally occurring radiation. The advantages of remote sensing are presented, such as its ability to systematically collect data and provide biophysical information about location, biomass, and temperature from a distance.


Lectures Remote Sensing

Presented by
Prof. Dr. Safaa Mohamed Hasan

1
What is remote sensing?
• In the broadest sense, remote sensing is the small- or large-scale acquisition of information about an object or phenomenon, by the use of either recording or real-time sensing devices that are not in physical or intimate contact with the object (such as by way of aircraft, spacecraft, satellite, or ship).

2
Remote sensing is…

the practice of deriving information about the Earth’s land and water surfaces using images acquired from an overhead perspective, by employing electromagnetic radiation in one or more regions of the electromagnetic spectrum, reflected or emitted from the Earth’s surface.

Campbell and Wynne, Introduction to Remote Sensing, p. 6, 5th ed. (2011)

3
A remote sensing instrument collects information about an object or phenomenon within the field-of-view (FOV) of the sensor system without being in direct physical contact with it. The sensor is located on a suborbital or satellite platform.

4
Interaction Model Depicting the Relationships of the Mapping Sciences as
they relate to Mathematics and Logic, and the Physical, Biological, and Social
Sciences

5
RS as Source of information

Traditionally:
•Field equipment.
•Manual recording.
•Field work.

Sensor:
•Active RS.
•Passive RS.
6
Active and Passive

Diagram: a passive sensor measures radiance reflected or emitted from the surface through the atmosphere; an active sensor illuminates the surface with its own energy source.

Passive vs. Active Sensing


So far, throughout this chapter, we have made
various references to the sun as a source of
energy or radiation. The sun provides a very
convenient source of energy for remote sensing.
The sun's energy is either reflected, as it is for
visible wavelengths, or absorbed and then reemitted,
as it is for thermal infrared
wavelengths. Remote sensing systems which
measure energy that is naturally available are
called passive sensors. Passive sensors can
only be used to detect energy when the naturally
occurring energy is available. For all reflected
energy, this can only take place during the time
when the sun is illuminating the Earth. There is
no reflected energy available from the sun at night. Energy that is naturally emitted (such as
thermal infrared) can be detected day or night, as long as the amount of energy is large
enough to be recorded.
Active sensors, on the other hand, provide their own
energy source for illumination. The sensor emits radiation
which is directed toward the target to be investigated. The
radiation reflected from that target is detected and
measured by the sensor. Advantages for active sensors
include the ability to obtain measurements anytime,
regardless of the time of day or season. Active sensors can
be used for examining wavelengths that are not sufficiently
provided by the sun, such as microwaves, or to better
control the way a target is illuminated. However, active
systems require the generation of a fairly large amount of
energy to adequately illuminate targets. Some examples of
active sensors are a laser fluorosensor and a synthetic aperture radar (SAR).

7
8
RS Properties
•Overview of large regions.
•Variety of sensors, techniques, and image-processing algorithms.
•Up-to-date information.
•More than direct human observation can provide.
•Capability of monitoring with time series of data (earth, environment).
•The invisible becomes visible (inaccessible areas).

9
Person in photo is holding a sensor similar to the ones used on satellites. By recording a highly
accurate spectral signature for this crop type using this hand-held spectrometer, scientists can
then search for and extract this signature from satellite imagery and develop detailed maps of
this crop type over very large areas.

10
Advantages of Remote Sensing
• Remote sensing is unobtrusive if the sensor passively records the EMR reflected or emitted by the object of interest. Passive remote sensing does not disturb the object or area of interest.

• Remote sensing devices may be programmed to collect data systematically, such as within a 9 × 9 in. frame of vertical aerial photography. This systematic data collection can remove the sampling bias introduced in some in situ investigations.

• Under controlled conditions, remote sensing can provide fundamental biophysical information, including x,y location, z elevation or depth, biomass, temperature, and moisture content.

11
Advantages of Remote Sensing

• Remote sensing–derived information is now critical to the successful modeling of numerous natural (e.g., water-supply estimation; eutrophication studies; nonpoint source pollution) and cultural (e.g., land-use conversion at the urban fringe; water-demand estimation; population estimation) processes (Walsh et al., 1999; Stow et al., 2003).

12
13
The Electromagnetic Spectrum

Diagram: sensor types arranged across the electromagnetic spectrum, from reflective sensors (AVHRR, OLS, MODIS, MERIS) through emissive/thermal sensors to microwave sensors (SSM/I, SeaWinds, SAR, ASAR).

The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and
x-rays) to the longer wavelengths (including microwaves and broadcast radio waves). There
are several regions of the electromagnetic spectrum which are useful for remote sensing.
For most purposes, the ultraviolet or UV
portion of the spectrum has the shortest
wavelengths which are practical for remote
sensing. This radiation is just beyond the
violet portion of the visible wavelengths,
hence its name. Some Earth surface
materials, primarily rocks and minerals,
fluoresce or emit visible light when illuminated
by UV radiation. The light which our eyes - our "remote
sensors" - can detect is part of the visible
spectrum. It is important to recognize how
small the visible portion is relative to the rest
of the spectrum. There is a lot of radiation
around us which is "invisible" to our eyes, but
can be detected by other remote sensing
instruments and used to our advantage. The
visible wavelengths cover a range from
approximately 0.4 to 0.7 μm. The longest
visible wavelength is red and the shortest is
violet. Common wavelengths of what we
perceive as particular colours from the visible
portion of the spectrum are listed below. It is
important to note that this is the only portion
of the spectrum we can associate with the
concept of colours. Blue, green, and red are the primary
colours or wavelengths of the visible
spectrum. They are defined as such because
no single primary colour can be created from
the other two, but all other colours can be
formed by combining blue, green, and red in
various proportions. Although we see sunlight
as a uniform or homogeneous colour, it is
actually composed of various wavelengths of
radiation in primarily the ultraviolet, visible
and infrared portions of the spectrum. The visible portion of this radiation can be shown in its component colours when sunlight is passed through a prism. The next portion of the spectrum of interest is
the infrared (IR) region which covers the
wavelength range from approximately 0.7 μm
to 100 μm - more than 100 times as wide as
the visible portion! The infrared region can be
divided into two categories based on their
radiation properties - the reflected IR, and
the emitted or thermal IR. Radiation in the
reflected IR region is used for remote sensing
purposes in ways very similar to radiation in
the visible portion. The reflected IR covers
wavelengths from approximately 0.7 μm to
3.0 μm. The thermal IR region is quite
different than the visible and reflected IR
portions, as this energy is essentially the
radiation that is emitted from the Earth's
surface in the form of heat. The thermal IR
covers wavelengths from approximately 3.0
μm to 100 μm. The portion of the spectrum of more recent
interest to remote sensing is the microwave
region from about 1 mm to 1 m. This covers
the longest wavelengths used for remote
sensing. The shorter wavelengths have
properties similar to the thermal infrared
region while the longer wavelengths approach
the wavelengths used for radio broadcasts.
Because of the special nature of this region
and its importance to remote sensing in
Canada, an entire chapter (Chapter 3) of the
tutorial is dedicated to microwave sensing.

14
The Electromagnetic Spectrum

CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=2521356

15
Visible Spectrum

• Visible region (0.4–0.7 µm)
  – Blue (0.4–0.5 µm)
  – Green (0.5–0.6 µm)
  – Red (0.6–0.7 µm)
• Near-infrared region (0.7–1.3 µm)
• Middle-infrared (MIR) or shortwave infrared (SWIR) region (1.3–3 µm)
• Thermal infrared region (8–14 µm)
• Microwave region (>1 mm)
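The region boundaries listed above can be encoded as a small lookup, which is handy for sanity-checking band assignments. This is an illustrative sketch: the limits are the approximate slide values, not strict standards, and the function name is our own.

```python
# Approximate spectral-region boundaries (in micrometres) from the
# list above. Illustrative slide values, not strict standards.
REGIONS_UM = [
    ("blue", 0.4, 0.5),
    ("green", 0.5, 0.6),
    ("red", 0.6, 0.7),
    ("near-infrared", 0.7, 1.3),
    ("middle-infrared", 1.3, 3.0),
    ("thermal-infrared", 8.0, 14.0),
    ("microwave", 1_000.0, 1_000_000.0),  # > 1 mm, up to ~1 m
]

def spectral_region(wavelength_um):
    """Return the region name for a wavelength in micrometres, or
    None if it falls in a gap between the listed bands."""
    for name, lo, hi in REGIONS_UM:
        if lo <= wavelength_um < hi:
            return name
    return None
```

Note that a wavelength of, say, 5 µm returns None: it sits in the gap between the listed MIR and thermal bands, which the slide's coarse list does not label.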

16
Types of Sensors
                     Visible &        Thermal IR       Microwave        Microwave
                     Reflected IR RS  RS               RS (passive)     RS (active)
Radiation source     Sun              Object           Object           Sensor
Object               Reflecting       Radiating        Radiating        Reflecting
Type                 Passive          Passive          Passive          Active

© Asian Institute of Technology, 2014, All Rights Reserved

17
Electromagnetic Wave Interaction/ Propagation
through the Atmosphere
• What happens to solar radiation as it passes through the atmosphere?
• What happens to radiation as it reaches the earth?

• Transmission
• Absorption
• Scattering/Reflection
• Emission

Before radiation used for remote sensing reaches the Earth's surface it has to travel through
some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect
the incoming light and radiation. These effects are caused by the mechanisms of scattering
and absorption. Scattering occurs when particles or large gas molecules present in the atmosphere interact
with and cause the electromagnetic radiation to be redirected from its original path. How much
scattering takes place depends on several factors including the wavelength of the radiation,
the abundance of particles or gases, and the distance the radiation travels through the
atmosphere. There are three (3) types of scattering which take place. Absorption is the other main mechanism at work
when electromagnetic radiation interacts with the
atmosphere. In contrast to scattering, this
phenomenon causes molecules in the atmosphere to
absorb energy at various wavelengths. Ozone,
carbon dioxide, and water vapour are the three main
atmospheric constituents which absorb radiation. Ozone serves to absorb the harmful (to most living
things) ultraviolet radiation from the sun. Without this
protective layer in the atmosphere our skin would
burn when exposed to sunlight.
You may have heard carbon dioxide referred to as
a greenhouse gas. This is because it tends to absorb radiation strongly in the far infrared
portion of the spectrum - that area associated with thermal heating - which serves to trap this
heat inside the atmosphere. Water vapour in the atmosphere absorbs much of the incoming
longwave infrared and shortwave microwave radiation (between 22μm and 1m). The
presence of water vapour in the lower atmosphere varies greatly from location to location and
at different times of the year. For example, the air mass above a desert would have very little
water vapour to absorb energy, while the tropics would have high concentrations of water
vapour (i.e. high humidity). Because these gases absorb
electromagnetic energy in very
specific regions of the spectrum, they
influence where (in the spectrum) we
can "look" for remote sensing
purposes. Those areas of the
spectrum which are not severely
influenced by atmospheric absorption
and thus, are useful to remote
sensors, are called atmospheric
windows. By comparing the
characteristics of the two most
common energy/radiation sources
(the sun and the earth) with the
atmospheric windows available to us, we can define those wavelengths that we can
use most effectively for remote sensing. The visible portion of the spectrum, to
which our eyes are most sensitive, corresponds to both an atmospheric window and
the peak energy level of the sun. Note also that heat energy emitted by the Earth
corresponds to a window around 10 μm in the thermal IR portion of the spectrum,
while the large window at wavelengths beyond 1 mm is associated with the microwave region.

18
Radiation - Target Interactions

Incident (I)
Absorption (A)
Transmission (T)
Reflection (R)

Specular or mirror reflection: all (or almost all) of the energy is directed away from the surface in a single direction.

Diffuse reflection: the surface is rough and the energy is reflected almost uniformly in all directions.

Radiation that is not absorbed or scattered in


the atmosphere can reach and interact with
the Earth's surface. There are three (3) forms
of interaction that can take place when energy
strikes, or is incident (I) upon the surface.
These are: absorption (A); transmission
(T); and reflection (R). The total incident
energy will interact with the surface in one or
more of these three ways. The proportions of
each will depend on the wavelength of the
energy and the material and condition of the
feature. Absorption (A) occurs when radiation (energy) is absorbed into the target while transmission
(T) occurs when radiation passes through a target. Reflection (R) occurs when radiation
"bounces" off the target and is redirected. In remote sensing, we are most interested in
measuring the radiation reflected from targets. We refer to two types of reflection, which
represent the two extreme ends of the way in which energy is reflected from a target:
specular reflection and diffuse reflection. When a surface is smooth we get specular or mirror-like
reflection where all (or almost all) of
the energy is directed away from the surface in a single direction. Diffuse reflection occurs
when the surface is rough and the energy is reflected almost uniformly in all directions. Most
earth surface features lie somewhere between perfectly specular or perfectly diffuse
reflectors. Whether a particular target reflects specularly or diffusely, or somewhere in
between, depends on the surface roughness of the feature in comparison to the wavelength of
the incoming radiation. If the wavelengths are much smaller than the surface variations or the
particle sizes that make up the surface, diffuse reflection will dominate. For example, fine-grained sand would appear fairly smooth to long wavelength microwaves but would appear
quite rough to the visible wavelengths.
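The roughness-versus-wavelength comparison in this paragraph has a common quantitative form, the Rayleigh criterion. It is not stated in the slides, but is a standard rule of thumb; the sketch below uses it to reproduce the sand example.

```python
import math

def appears_smooth(height_variation_m, wavelength_m, incidence_deg=0.0):
    """Rayleigh criterion (a standard rule of thumb, not from the
    slides): a surface acts as a specular reflector when its mean
    height variation h satisfies h < wavelength / (8 cos theta)."""
    limit = wavelength_m / (8.0 * math.cos(math.radians(incidence_deg)))
    return height_variation_m < limit

# Fine-grained sand (~1 mm height variation) is rough at visible
# wavelengths but smooth at L-band (~23 cm) microwave wavelengths.
```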

19
1.5 Radiation - Target Interactions

Leaf interactions: a chemical compound in leaves called chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths.

Water interactions: longer wavelength visible and near infrared radiation is absorbed more by water than shorter visible wavelengths.

Let's take a look at a couple of examples of targets at the Earth's surface and how energy at
the visible and infrared wavelengths interacts with them.
Leaves: A chemical compound in leaves
called chlorophyll strongly absorbs
radiation in the red and blue
wavelengths but reflects green
wavelengths. Leaves appear "greenest"
to us in the summer, when chlorophyll
content is at its maximum. In autumn,
there is less chlorophyll in the leaves, so
there is less absorption and
proportionately more reflection of the red
wavelengths, making the leaves appear
red or yellow (yellow is a combination of
red and green wavelengths). The
internal structure of healthy leaves act as excellent diffuse reflectors of near-infrared
wavelengths. If our eyes were sensitive to near-infrared, trees would appear extremely bright
to us at these wavelengths. In fact, measuring and monitoring the near-IR reflectance is one
way that scientists can determine how healthy (or unhealthy) vegetation may be.
Water: Longer wavelength visible and near
infrared radiation is absorbed more by water
than shorter visible wavelengths. Thus water
typically looks blue or blue-green due to
stronger reflectance at these shorter
wavelengths, and darker if viewed at red or
near infrared wavelengths. If there is
suspended sediment present in the upper
layers of the water body, then this will allow
better reflectivity and a brighter appearance
of the water. The apparent colour of the
water will show a slight shift to longer wavelengths. Suspended sediment can be easily confused with shallow (but clear) water, since these two phenomena appear very
similar. Chlorophyll in algae absorbs more of the blue wavelengths and reflects the green,
making the water appear more green in colour when algae is present. The topography of the
water surface (rough, smooth, floating materials, etc.) can also lead to complications for
water-related interpretation due to potential problems of specular reflection and other
influences on colour and brightness. We can see from these examples that, depending on the complex make-up of the target that
is being looked at, and the wavelengths of radiation involved, we can observe very different
responses to the mechanisms of absorption, transmission, and reflection. By measuring the
energy that is reflected (or emitted) by targets on the Earth's surface over a variety of different
wavelengths, we can build up a spectral response for that object. By comparing the
response patterns of different features we may be able to distinguish between them, where
we might not be able to, if we only compared them at one wavelength. For example, water
and vegetation may reflect somewhat similarly in the visible wavelengths but are almost
always separable in the infrared. Spectral response can be quite variable, even for the same
target type, and can also vary with time (e.g. "green-ness" of leaves) and location. Knowing
where to "look" spectrally and understanding the factors which influence the spectral response
of the features of interest are critical to correctly interpreting the interaction of electromagnetic
radiation with the surface.
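The water/vegetation example can be made concrete with toy spectral signatures. The reflectance values below are invented purely for illustration, chosen only to mimic the behaviour described: similar in the visible, clearly separable in the near-infrared.

```python
# Invented mean reflectances (0-1) in two broad bands, for illustration.
SIGNATURES = {
    "water":      {"visible": 0.08, "nir": 0.02},
    "vegetation": {"visible": 0.10, "nir": 0.50},
}

def separability(target_a, target_b, band):
    """Absolute reflectance difference between two targets in one
    band; a larger value means the band separates them more easily."""
    return abs(SIGNATURES[target_a][band] - SIGNATURES[target_b][band])
```

With these toy numbers the visible-band difference is only 0.02, while the near-infrared difference is 0.48, matching the point that the two classes are "almost always separable in the infrared".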

20
Microwave Radiation

Frequency and wavelengths for microwave bands (after NASA/JPL, 1988):

Band   Wavelength range [mm]   Frequency range [GHz]
Ka      7.5 – 11.0              40.0 – 26.5
K      11.0 – 16.7              26.5 – 18.0
Ku     16.7 – 24.0              18.0 – 12.5
X      24.0 – 37.5              12.5 – 8.0
C      37.5 – 75.0               8.0 – 4.0
S      75.0 – 150                4.0 – 2.0
L       150 – 300                2.0 – 1.0
P       300 – 1000               1.0 – 0.3

The microwave bands sit beyond the UV, visible, and infrared portions of the spectrum; the atmospheric spectral transmittance diagram (after NASA/JPL, 1988) is omitted here. X, C, S, and L bands are used for SAR on-board satellites.

© Remote Sensing Technology Center of Japan, 2014, All Rights Reserved
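The band table maps directly between frequency and wavelength via λ = c/f. The sketch below encodes the table and looks up which band a given radar frequency falls in (the function names are our own).

```python
C_M_PER_S = 299_792_458.0  # speed of light, m/s

# Radar band wavelength limits in millimetres, from the table above.
BANDS_MM = {
    "Ka": (7.5, 11.0), "K": (11.0, 16.7), "Ku": (16.7, 24.0),
    "X": (24.0, 37.5), "C": (37.5, 75.0), "S": (75.0, 150.0),
    "L": (150.0, 300.0), "P": (300.0, 1000.0),
}

def wavelength_mm(frequency_ghz):
    """wavelength = c / f, returned in millimetres."""
    return C_M_PER_S / (frequency_ghz * 1e9) * 1e3

def radar_band(frequency_ghz):
    """Look up the band whose wavelength range contains c / f."""
    lam = wavelength_mm(frequency_ghz)
    for name, (lo, hi) in BANDS_MM.items():
        if lo <= lam < hi:
            return name
    return None
```

For example, Sentinel-1's roughly 5.4 GHz carrier gives a wavelength of about 55 mm, which the table places in C-band.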

21
Surface scatter (Reflection) mechanism

Panels: forest, grass, lake.

©METI/JAXA

© Remote Sensing Technology Center of Japan, 2014, All Rights Reserved

22
Backscatter change in a paddy field

Stages: open water → planting → vegetated.
HH backscattering: weak (open water) to strong (vegetated).

23
Seasonal change
Seasonal changes in microwave backscatter provide useful information for detecting paddy field areas.

SAR Backscatter

Low High

24
Sentinel 1A
11 Aug 2015

Inundated paddies are dark

25
Sentinel 1A
22 Oct 2015

After growth and/or harvest, backscatter increases. The river, however, has flooded!

26
Landsat 8 OLI Multispectral sensor
15 Apr 2016

27
Sentinel 1A
17 Jan 2016

Healthy vegetation is bright. Resolution is nominally 10 m, but despeckling reduces that a bit.

28
Sentinel 1A
05 Mar 2016

29
Sentinel 1A
29 Mar 2016

30
Sentinel 1A
22 Apr 2016

31
Sentinel 1A
16 May 2016

32
Sentinel 1A
09 Jun 2016

33
Sentinel 1A
03 Jul 2016

34
Sentinel 1A
20 Aug 2016

35
Sentinel 1A
13 Sep 2016

36
Landsat 8 OLI
04 Jul 2016

Landsat is good, but its 30 m resolution loses to Sentinel 2A on the next slide.

37
Sentinel 2A
30 Jun 2016

38
Sentinel 1A
03 Jul 2016

Sentinel 1 is not as good as Sentinel 2 because it has much less spectral resolution; however, given the prevalence of clouds, it is a great resource.

39
Characteristics of Images

An image refers to any pictorial representation, regardless of what wavelengths or remote sensing device has been used to detect and record the electromagnetic energy.

A digital remotely sensed image is typically composed of picture elements (pixels) located at the intersection of each row i and column j in each of K bands of imagery. Associated with each pixel is a number known as a Digital Number (DN) or Brightness Value (BV) that depicts the average radiance of a relatively small area within a scene (Fig. 1). A small number indicates low average radiance from the area and a high number indicates high radiant properties of the area.

Electromagnetic energy may be detected either photographically or electronically. The photographic process uses chemical reactions on the surface of light-sensitive film to detect and record energy variations. It is important to distinguish between the terms images and photographs in remote sensing. An image refers to any pictorial representation, regardless of what wavelengths or remote sensing device has been used to detect and record the electromagnetic energy. A photograph refers specifically to images that have been detected as well as recorded on photographic film. The black and white photo to the left, of part of the city of Ottawa, Canada, was taken in the visible part of the spectrum. Photos are normally recorded over the wavelength range from 0.3 µm to 0.9 µm, the visible and reflected infrared. Based on these definitions, we can say that all photographs are images, but not all images are photographs. Therefore, unless we are talking specifically about an image recorded photographically, we use the term image.

A photograph could also be represented and displayed in a digital format by subdividing the image into small equal-sized and shaped areas, called picture elements or pixels, and representing the brightness of each area with a numeric value or digital number. Indeed, that is exactly what has been done to the photo to the left. In fact, using the definitions we have just discussed, this is actually a digital image of the original photograph! The photograph was scanned and subdivided into pixels, with each pixel assigned a digital number representing its relative brightness. The computer displays each digital value as a different brightness level. Sensors that record electromagnetic energy electronically record the energy as an array of numbers in digital format right from the start. These two different ways of representing and displaying remote sensing data, either pictorially or digitally, are interchangeable as they convey the same information (although some detail may be lost when converting back and forth).

In previous sections we described the visible portion of the spectrum and the concept of colours. We see colour because our eyes detect the entire visible range of wavelengths and our brains process the information into separate colours. Can you imagine what the world would look like if we could only see very narrow ranges of wavelengths or colours? That is how many sensors work. The information from a narrow wavelength range is gathered and stored in a channel, also sometimes referred to as a band. We can combine and display channels of information digitally using the three primary colours (blue, green, and red). The data from each channel is represented as one of the primary colours and, depending on the relative brightness (i.e. the digital value) of each pixel in each channel, the primary colours combine in different proportions to represent different colours.

When we use this method to display a single channel or range of wavelengths, we are actually displaying that channel through all three primary colours. Because the brightness level of each pixel is the same for each primary colour, they combine to form a black and white image, showing various shades of gray from black to white. When we display more than one channel each as a different primary colour, then the brightness levels may be different for each channel/primary colour combination and they will combine to form a colour image.
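The pixel/DN structure described above can be sketched as a tiny nested-list image with K = 3 bands. All DN values below are invented for illustration.

```python
# A 2 x 2 scene with three bands; image[band][row][col] holds one
# Digital Number (DN) per pixel per band. Values are invented.
image = [
    [[10, 200], [30, 40]],   # band 1 (e.g. blue)
    [[12, 210], [90, 35]],   # band 2 (e.g. green)
    [[ 8, 190], [20, 30]],   # band 3 (e.g. red)
]

def pixel_dns(img, row, col):
    """All band DNs for one pixel location (row i, column j)."""
    return [band[row][col] for band in img]

def single_band_display(img, band_index):
    """Displaying one channel through all three primary colours in
    equal proportion: each pixel becomes (DN, DN, DN), i.e. a shade
    of gray, exactly as the text describes."""
    return [[(dn, dn, dn) for dn in row] for row in img[band_index]]
```

Displaying three different bands through the three primaries instead would give each pixel an (R, G, B) triple of generally unequal values, producing a colour composite.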

40
RS Resolution
•Temporal Resolution.
•Spatial Resolution.
•Spectral Resolution.
•Radiometric Resolution

41
the concept of temporal resolution is also important to consider in a remote sensing system. We
alluded to this idea in section
2.2 when we discussed the concept of revisit period, which refers to the length of time it takes
for a satellite to complete one entire orbit cycle. The revisit period of a satellite sensor is
usually several days. Therefore the absolute temporal resolution of a remote sensing system
to image the exact same area at the same viewing angle a second time is equal to this period.
However, because of some degree of overlap in the imaging swaths of adjacent orbits for
most satellites and the increase in this overlap with increasing latitude, some areas of the
Earth tend to be re-imaged more frequently. Also, some satellite systems are able to point
their sensors to image the same area between different satellite passes separated by
periods from one to five days. Thus, the actual temporal resolution of a sensor depends on a
variety of factors, including the satellite/sensor capabilities, the swath overlap, and latitude. The ability
to collect imagery of the same area of the Earth's surface at different periods of
time is one of the most important elements for applying remote sensing data. Spectral
characteristics of features may change over time and these changes can be detected by
collecting and comparing multi-temporal imagery. For example, during the growing season,
most species of vegetation are in a continual state of change and our ability to monitor those
subtle changes using remote sensing is dependent on when and how frequently we collect
imagery. By imaging on a continuing basis at different times we are able to monitor the
changes that take place on the Earth's surface, whether they are naturally occurring (such as
changes in natural vegetation cover or flooding) or induced by humans (such as urban
development or deforestation). The time factor in imaging is important when:
1. persistent clouds offer limited clear views of the Earth's surface (often in the tropics)
2. short-lived phenomena (floods, oil slicks, etc.) need to be imaged
3. multi-temporal comparisons are required (e.g. the spread of a forest disease from one year to the
next)
4. the changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat / maize)
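The revisit arithmetic is simple to sketch. Assuming a fixed 16-day revisit period and ignoring the swath overlap and sensor pointing discussed above, nominal same-geometry acquisition dates are just a fixed-step date sequence (the function name is our own):

```python
from datetime import date, timedelta

def acquisition_dates(start, revisit_days, count):
    """Nominal same-geometry acquisition dates for a sensor with a
    fixed revisit period. Actual revisits can be more frequent due
    to swath overlap, latitude, and pointable sensors (see text)."""
    return [start + timedelta(days=revisit_days * k) for k in range(count)]

# A 16-day revisit starting June 1, 2004 gives June 1, June 17, July 3.
dates = acquisition_dates(date(2004, 6, 1), 16, 3)
```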

42
Temporal Resolution

Remote Sensor Data Acquisition

June 1, 2004 June 17, 2004 July 3, 2004

16 days

43
Spatial Resolution, Pixel Size, and Scale

The detail discernible in an image depends on the spatial resolution of the sensor, which refers to the size of the smallest possible feature that can be detected.

Instantaneous Field of View (IFOV).


The IFOV is the angular cone of
visibility of the sensor (A) and
determines the area on the Earth's
surface which is "seen" from a given
altitude at one particular moment in
time (B).

Pixels are the smallest units of an image.

Scale = distance on an image or map / actual ground distance.

For some remote sensing instruments, the distance between the target being imaged and the
platform, plays a large role in determining the detail of information obtained and the total area
imaged by the sensor. Sensors onboard platforms far away from their targets, typically view a
larger area, but cannot provide great detail. Compare what an astronaut onboard the space
shuttle sees of the Earth to what you can see from an airplane. The astronaut might see your
whole province or country in one glance, but couldn't distinguish individual houses. Flying over
a city or town, you would be able to see individual buildings and cars, but you would be
viewing a much smaller area than the astronaut. There is a similar difference between satellite
images and airphotos.
The detail discernible in an image is dependent on the
spatial resolution of the sensor and refers to the size of
the smallest possible feature that can be detected.
Spatial resolution of passive sensors (we will look at the
special case of active microwave sensors later) depends
primarily on their Instantaneous Field of View (IFOV).
The IFOV is the angular cone of visibility of the sensor (A)
and determines the area on the Earth's surface which is
"seen" from a given altitude at one particular moment in
time (B). The size of the area viewed is determined by
multiplying the IFOV by the distance from the ground to
the sensor (C). This area on the ground is called the
resolution cell and determines a sensor's maximum
spatial resolution. For a homogeneous feature to be
detected, its size generally has to be equal to or larger than the resolution cell. If the feature is
smaller than this, it may not be detectable as the average brightness of all features in that
resolution cell will be recorded. However, smaller features may sometimes be detectable if
their reflectance dominates within a particular resolution cell, allowing sub-pixel or resolution cell
detection. As we mentioned in Chapter 1, most remote sensing images are composed of a matrix of
picture elements, or pixels, which are the smallest units of an image. Image pixels are normally square and represent a certain area on an image. It is important to
distinguish between pixel size and spatial resolution - they are not interchangeable. If a sensor has a
spatial resolution of 20 metres and an image from that sensor is displayed at full resolution,
each pixel represents an area of 20m x 20m on the ground. In this case the pixel size and
resolution are the same. However, it is possible to display an image with a pixel size different
than the resolution. Many posters of satellite images of the Earth have their pixels averaged to
represent larger areas, although the original spatial resolution of the sensor that collected the
imagery remains the same. Images where only large features are visible are said to have coarse or low resolution. In fine or high resolution images, small objects can be
detected. Military sensors for example, are designed to view as much detail as possible, and therefore have very fine resolution. Commercial satellites provide imagery
with resolutions varying from a few metres to several kilometres. Generally speaking, the finer the resolution, the less total ground area can be seen. The ratio of distance
on an image or map, to actual ground distance is referred to as scale. If you had a map with a scale of 1:100,000, an object of 1cm length on the map would actually be an
object 100,000cm (1km) long on the ground. Maps or images with small "map-to-ground ratios" are referred to as small scale (e.g. 1:100,000), and those with larger ratios
(e.g. 1:5,000) are called large scale.
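The scale arithmetic above can be sketched in a few lines of Python (the function name is ours, for illustration):

```python
def ground_distance_m(map_distance_cm, scale_denominator):
    """Convert a distance measured on a map to metres on the ground."""
    return map_distance_cm * scale_denominator / 100.0  # 100 cm per metre

# On a 1:100,000 (small scale) map, 1 cm represents 1 km on the ground
assert ground_distance_m(1.0, 100_000) == 1000.0
# On a 1:5,000 (large scale) map, 1 cm represents only 50 m
assert ground_distance_m(1.0, 5_000) == 50.0
```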

44
45
Spatial domain
Spatial domain refers to the spatial scale of
observations being collected including the spatial
resolution and the area covered

What is spatial resolution?

Distance between measurements

46
Spatial resolution example

47
Low spatial resolution: pixels are greater than the objects in the image.
High spatial resolution: pixels are smaller than the objects in the image.

48
Spatial
Resolution

Imagery of residential housing in Mechanicsville, New York, obtained on June 1, 1998, at a nominal spatial resolution of 0.3 x 0.3 m (approximately 1 x 1 ft.) using a digital camera.

49
Spectral Resolution

In Chapter 1, we learned about spectral response and spectral emissivity curves which characterize the reflectance and/or emittance of a feature or target over a variety of wavelengths. Different classes of features and details in an image can often be distinguished by comparing their responses over distinct wavelength ranges. Broad classes, such as water and vegetation, can usually be separated using very broad wavelength ranges - the visible and near infrared - as we learned in section 1.5. Other more specific classes, such as different rock types, may not be easily distinguishable using either of these broad wavelength ranges and would require comparison at much finer wavelength ranges to separate them. Thus, we would require a sensor with higher spectral resolution. Spectral resolution describes the ability of a sensor to define fine wavelength intervals. The finer the spectral resolution, the narrower the wavelength range for a particular channel or band.

50
Spectral
Resolution

51
In a multispectral image, individual bands can be
used to make individual images, which can then
be combined in various ways to bring out
particular details on the surface.
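One widely used band combination is the Normalized Difference Vegetation Index (NDVI), which contrasts the red and near-infrared bands. A minimal per-pixel sketch, with hypothetical reflectance values:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index for one pixel."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Healthy vegetation absorbs red light and reflects strongly in the
# near-infrared, so its NDVI approaches +1; water is typically negative.
# (The reflectance values below are hypothetical.)
vegetation = ndvi(red=0.05, nir=0.50)   # ≈ 0.82
water = ndvi(red=0.10, nir=0.02)        # ≈ -0.67
```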

52
Multispectral/hyperspectral image

Ultraspectral
(1000’s of bands)

Hyperspectral
(100’s of bands)

Multispectral
(10’s of bands)

Panchromatic

IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2011), Vancouver, Canada, July 24 – 29, 2011

Many electronic (as opposed to photographic) remote sensors acquire data using scanning
systems, which employ a sensor with a narrow field of view (i.e. IFOV) that sweeps over the
terrain to build up and produce a two-dimensional image of the surface. Scanning systems
can be used on both aircraft and satellite platforms and have essentially the same operating
principles. A scanning system used to collect data over a variety of different wavelength
ranges is called a multispectral scanner (MSS), and is the most commonly used scanning
system. There are two main modes or methods of scanning employed to acquire multispectral
image data - across-track scanning, and along-track scanning.
Across-track scanners scan the Earth in a
series of lines. The lines are oriented
perpendicular to the direction of motion of the
sensor platform (i.e. across the swath). Each line
is scanned from one side of the sensor to the
other, using a rotating mirror (A). As the
platform moves forward over the Earth,
successive scans build up a two-dimensional
image of the Earth's surface. The incoming
reflected or emitted radiation is separated into
several spectral components that are detected
independently. The UV, visible, near-infrared, and thermal radiation are dispersed into their constituent
wavelengths. A bank of internal detectors (B), each sensitive to a specific range of wavelengths, detects and
measures the energy for each spectral band; these electrical signals are then converted to digital data
and recorded for subsequent computer processing. The IFOV (C) of the sensor and the altitude of the platform
determine the ground resolution cell viewed (D), and thus the spatial resolution. The angular field of view (E) is
the sweep of the mirror, measured in degrees, used to record a scan line, and determines the width of the
imaged swath (F). Airborne scanners typically sweep large angles (between 90º and 120º), while satellites,
because of their higher altitude, need only sweep fairly small angles (10–20º) to cover a broad region. Because
the distance from the sensor to the target increases towards the edges of the swath, the ground resolution cells
also become larger and introduce geometric distortions to the images. Also, the length of time the IFOV "sees" a
ground resolution cell as the rotating mirror scans (called the dwell time), is generally quite short and influences
the design of the spatial, spectral, and radiometric resolution of the sensor.
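The geometric relationships described above (letters refer to the figure) can be sketched as follows; the IFOV, altitude, and field-of-view numbers are hypothetical examples, not the specifications of any particular sensor:

```python
import math

def resolution_cell_m(ifov_mrad, altitude_m):
    """Ground resolution cell (D): IFOV (C, a small angle) times altitude."""
    return ifov_mrad / 1000.0 * altitude_m

def swath_width_m(angular_fov_deg, altitude_m):
    """Imaged swath width (F) from the angular field of view (E) at nadir."""
    return 2.0 * altitude_m * math.tan(math.radians(angular_fov_deg) / 2.0)

# Hypothetical satellite: a 0.0425 mrad IFOV at 705 km altitude gives a
# ~30 m resolution cell; a 15 degree sweep images a swath ~186 km wide.
cell = resolution_cell_m(0.0425, 705_000)
swath = swath_width_m(15.0, 705_000)
```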

53
Challenges in Hyperspectral Image Classification

Concept of hyperspectral imaging using NASA Jet Propulsion Laboratory’s Airborne Visible Infra-Red Imaging Spectrometer

IEEE International Geoscience and Remote Sensing Symposium (IGARSS 2011), Vancouver, Canada, July 24 – 29, 2011

54
Airborne Visible Infrared
Imaging Spectrometer
(AVIRIS) Datacube of
Sullivan’s Island Obtained
on October 26, 1998

55
Spectral Resolution

Spectral resolution: the ability of a sensor to define fine wavelength intervals.


The finer the spectral resolution ➔ the narrower the wavelength range for a particular band.

Black and white film: coarse spectral resolution

Colour film: fine spectral resolution

Some materials may not be easily distinguishable using broad wavelength ranges.

They would require comparison at much finer spectral resolution (hyperspectral sensors).

Black and white film records wavelengths extending over much, or all of the visible portion of
the electromagnetic spectrum. Its spectral resolution is fairly coarse, as the various
wavelengths of the visible spectrum are not individually distinguished and the overall
reflectance in the entire visible portion is recorded. Colour film is also sensitive to the
reflected energy over the visible portion of the spectrum, but has higher spectral resolution,
as it is individually sensitive to the reflected energy at the blue, green, and red wavelengths
of the
spectrum. Thus, it can represent features of various colours based on their reflectance in
each
of these distinct wavelength ranges. Many remote sensing systems record energy over
several separate wavelength ranges at
various spectral resolutions. These are referred to as multi-spectral sensors and will be
described in some detail in following sections. Advanced multi-spectral sensors called
hyperspectral sensors, detect hundreds of very narrow spectral bands throughout the visible,
near-infrared, and mid-infrared portions of the electromagnetic spectrum. Their very high
spectral resolution facilitates fine discrimination between different targets based on their
spectral response in each of the narrow bands.

56
Spectral Bandwidths of SPOT and Landsat Sensor Systems

57
Radiometric Resolution

A bit is the smallest unit of storage on a computer.

A byte is the smallest addressable unit of storage: one byte = 8 bits.

If we have 11-bit digital values, they should be stored in ……… bytes on our computer.

If we have 6-bit digital values, what range of grey levels can be stored on our computer?

Radiometric characteristics describe the actual information content in an image. Every time an image is acquired on film or by a sensor, its sensitivity to the magnitude of the electromagnetic energy determines the radiometric resolution. The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy. Imagery data are represented by positive digital numbers which vary from 0 to (one less than) a selected power of 2. This range corresponds to the number of bits used for coding numbers in binary format. Each bit records an exponent of a power of 2 (e.g. 1 bit = 2^1 = 2). The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded. Thus, if a sensor used 8 bits to record the data, there would be 2^8 = 256 digital values available, ranging from 0 to 255. However, if only 4 bits were used, then only 2^4 = 16 values ranging from 0 to 15 would be available, and the radiometric resolution would be much lower. Image data are generally displayed in a range of grey tones, with black representing a digital number of 0 and white representing the maximum value (for example, 255 in 8-bit data). By comparing a 2-bit image with an 8-bit image, we can see that there is a large difference in the level of detail discernible depending on their radiometric resolutions.
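The digital-number arithmetic above is easy to verify in code:

```python
def grey_levels(bits):
    """Number of digital values available at a given bit depth."""
    return 2 ** bits

def value_range(bits):
    """Minimum and maximum digital number for a given bit depth."""
    return 0, 2 ** bits - 1

assert grey_levels(8) == 256 and value_range(8) == (0, 255)
assert grey_levels(4) == 16 and value_range(4) == (0, 15)
assert grey_levels(2) == 4      # a 2-bit image has only four grey tones
```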

58
Radiometric Resolution

•The sensor sensitivity to the magnitude of the electromagnetic


energy
•The finer the radiometric resolution of a sensor, the more
sensitive it is to detecting small differences in reflected or emitted
energy.

If 4 bits are recorded: 2^4 = 16 digital values, ranging from 0 to 15 (low radiometric resolution)

If 6 bits are recorded: 2^6 = 64 digital values, ranging from 0 to 63

If 8 bits are recorded: 2^8 = 256 digital values, ranging from 0 to 255 (high radiometric resolution)

59
Radiometric Resolution

7-bit: 0 – 127

8-bit: 0 – 255

9-bit: 0 – 511

10-bit: 0 – 1023

60
Low Radiometric Resolution vs. High Radiometric Resolution

61
62
63
It is useful to examine the image Histograms before performing any image enhancement. The x-axis of the histogram is
the range of the available digital numbers, i.e. 0 to 255. The y-axis is the number of pixels in the image having a given
digital number. The histograms of the three bands of this image are shown in the following figures.

64
IMAGE ENHANCEMENT TECHNIQUES

65
66
67
68
69
70
71
Apply histogram stretching for the following sub
image value
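A minimal linear (min-max) contrast stretch, the most common form of histogram stretching, can be sketched as below; the sub-image values are hypothetical stand-ins, since the exercise values appear only in the figure:

```python
def linear_stretch(pixels, new_min=0, new_max=255):
    """Linearly rescale pixel values so they span [new_min, new_max]."""
    old_min, old_max = min(pixels), max(pixels)
    scale = (new_max - new_min) / (old_max - old_min)
    return [round(new_min + (p - old_min) * scale) for p in pixels]

# Hypothetical 3x3 sub-image occupying only a narrow part of the 0-255 range
sub_image = [91, 95, 100, 105, 110, 115, 120, 125, 130]
stretched = linear_stretch(sub_image)
# After stretching, the darkest pixel becomes 0 and the brightest 255
assert stretched[0] == 0 and stretched[-1] == 255
```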

72
73
74
75
76
If we have an image with 3 bits/pixel, the possible range of values is 0 to 7.
We have an image with the following histogram
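Histogram equalization for such a 3-bit image maps each grey level through the cumulative distribution; the histogram counts below are hypothetical stand-ins for the values in the figure:

```python
def equalize(histogram, levels=8):
    """Map each grey level to a new level via the cumulative distribution."""
    total = sum(histogram)
    cdf, running = [], 0
    for count in histogram:
        running += count
        cdf.append(running)
    # Standard mapping: round(cdf / total * (levels - 1))
    return [round(c / total * (levels - 1)) for c in cdf]

# Hypothetical histogram for a 3-bit image (grey levels 0..7), 64 pixels total
hist = [8, 10, 10, 2, 12, 16, 4, 2]
mapping = equalize(hist)
# Each old grey level 0..7 is assigned a new, more evenly spread level
assert len(mapping) == 8 and mapping[-1] == 7
```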

77
78
79
Remote Sensing data

80
Satellite imagery
Low resolution satellite imagery is generally available for free, while high-resolution imagery must be purchased, and licensing usually restricts its distribution

Freely available low resolution multi-spectral & radar satellite imagery


• Landsat 7 ETM+, Landsat 8… : Earth Explorer (http://earthexplorer.usgs.gov)
• Sentinel-1,2,3 : Open Access Hub (https://scihub.copernicus.eu)

Ocean monitoring
• Sea Surface Temperature, Chlorophyll-a (SeaWIFS), MODIS : NOAA/NASA (https://oceancolor.gsfc.nasa.gov)

Commercial high-resolution imagery


• IKONOS, GeoEye, QuickBird, WorldView-1,2,3,4
Pléiades,…

Capricorn Seamount

81
Remote Sensing

Sun-synchronous orbits Geostationary orbits

LANDSAT Weather and communications


SPOT satellites commonly have these types
IRS of orbits.
ASTER METEOSAT
ERS GOES
JERS Telecom Sat.

Many remote sensing platforms are designed to follow an orbit (basically north-
south) which, in conjunction with the Earth's rotation (west-east), allows them to
cover most of the Earth's surface over a certain period of time. These are near-polar
orbits, so named for the inclination of the orbit relative to a line running between the
North and South poles. Many of these satellite orbits are also sun-synchronous such
that they cover each area of the world at a constant local time of day called local sun
time. At any given latitude, the position of the sun in the sky as the satellite passes
overhead will be the same within the same season. This ensures consistent
illumination conditions when acquiring images in a specific season over successive
years, or over a particular area over a series of days. This is an important factor for
monitoring changes between images or for mosaicking adjacent images together, as
they do not have to be corrected for different illumination conditions.
Most of the remote sensing satellite platforms today are in near-polar orbits, which
means that the satellite travels northwards on one side of the Earth and then toward
the southern pole on the second half of its orbit. These are called ascending and
descending passes, respectively. If the orbit is also sun-synchronous, the ascending
pass is most likely on the shadowed side of the Earth while the descending pass is on
the sunlit side. Sensors recording reflected solar energy only image the surface on a
descending pass, when solar illumination is available. Active sensors which provide

82
their own illumination, or passive sensors that record emitted (e.g. thermal) radiation, can also image the surface on ascending passes.
As a satellite revolves around the Earth, the sensor "sees" a certain portion of the
Earth's surface. The area imaged on the surface, is referred to as the swath. Imaging
swaths for spaceborne sensors generally vary between tens and hundreds of
kilometres wide. As the satellite orbits the Earth from pole to pole, its east-west
position wouldn't change if the Earth didn't rotate. However, as seen from the Earth,
it seems that the satellite is shifting westward because the Earth is rotating (from
west to east) beneath it. This apparent movement allows the satellite swath to cover
a new area with each consecutive pass. The satellite's orbit and the rotation of the
Earth work together to allow complete coverage of the Earth's surface, after it has
completed one complete cycle of orbits.

82
Remote Sensing
Landsat

• Landsat 1: originally named ERTS-1 (Earth Resources Technology Satellite 1),


launched 1972, terminated 1978
• Landsat 2: launched 1975, terminated 1981
• Landsat 3: launched 1978, terminated 1983
• Landsat 4: launched 1982, terminated 1993
• Landsat 5: launched 1984, still functioning but severe problems since November
2011. Will be terminated
• Landsat 6: launched 1993, failed to reach orbit
• Landsat 7: launched 1999, still functioning, but with faulty scan line corrector
• Landsat 8: Landsat Data Continuity Mission (LDCM) launched February 11, 2013.
Turned over to USGS and renamed Landsat 8

83
Remote Sensing
SPOT

• SPOT 1: launched 1986, terminated 1990


• SPOT 2: launched 1990, terminated 2009
• SPOT 3: launched 1993, stopped functioning 1997
• SPOT 4: launched 1998, stopped functioning 2013
• SPOT 5: launched 2002
• SPOT 6: launched 2012

84
Remote Sensing
Landsat 1 to 8 - TM

Spectral Band Wavelength Resolution


Band 1 - Blue 0.45 - 0.52 µm 30 m
Band 2 - Green 0.52 - 0.60 µm 30 m
Band 3 - Red 0.63 - 0.69 µm 30 m
Band 4 - Near Infrared 0.76 - 0.90 µm 30 m
Band 5 - Short Wavelength Infrared 1.55 - 1.75 µm 30 m
Band 6 - Long Wavelength Infrared 10.40 – 12.50 µm 60 m
Band 7 - Short Wavelength Infrared 2.08 – 2.35 µm 30 m

85
Remote Sensing
SPOT

Spectral Band | SPOT 5 | SPOT 6

Panchromatic | 5 m, 0.48 - 0.71 µm | 1.5 m, 0.45 - 0.75 µm
Blue | - | 6 m, 0.40 - 0.52 µm
Green | 10 m, 0.50 - 0.59 µm | 6 m, 0.53 - 0.59 µm
Red | 10 m, 0.61 - 0.68 µm | 6 m, 0.62 - 0.69 µm
Near Infrared | 10 m, 0.78 - 0.89 µm | 6 m, 0.76 - 0.89 µm
Mid Infrared | 20 m, 1.58 - 1.75 µm | -

86
Remote Sensing
SPOT

Spectral Band SPOT 4 & 5 VEGETATION

Blue ~1 km 0.43 - 0.47 µm

Red ~1 km 0.61 - 0.68 µm

Near Infrared ~1 km 0.78 - 0.89 µm

Mid Infrared ~1 km 1.58 - 1.75 µm

You can buy primary products or synthesis products (e.g. S10)

Price is ~35 Euros per million km²
S10 products older than three months are now available for free

87
Remote Sensing
Landsat 7 - ETM

Spectral Band Wavelength Resolution


Band 1 - Blue 0.45 - 0.52 µm 30 m
Band 2 - Green 0.52 - 0.60 µm 30 m
Band 3 - Red 0.63 - 0.69 µm 30 m
Band 4 - Near Infrared 0.77 - 0.90 µm 30 m
Band 5 - Short Wavelength Infrared 1.55 - 1.75 µm 30 m
Band 6 - Long Wavelength Infrared 10.40 – 12.50 µm 60 m
Band 7 - Short Wavelength Infrared 2.08 – 2.35 µm 30 m
Band 8 - Panchromatic 0.52 - 0.90 µm 15 m

88
Remote Sensing
Landsat 8 - OLI & TIRS

Spectral Band Wavelength Resolution


Band 1 - Coastal / Aerosol 0.43 - 0.45 µm 30 m
Band 2 - Blue 0.45 - 0.51 µm 30 m
Band 3 - Green 0.53 - 0.59 µm 30 m
Band 4 - Red 0.64 - 0.67 µm 30 m
Band 5 - Near Infrared 0.85 - 0.88 µm 30 m
Band 6 - Short Wavelength Infrared 1.57 - 1.65 µm 30 m
Band 7 - Short Wavelength Infrared 2.11 - 2.29 µm 30 m
Band 8 - Panchromatic 0.50 - 0.68 µm 15 m
Band 9 - Cirrus 1.36 - 1.38 µm 30 m
Band 10 - Long Wavelength Infrared 10.60 - 11.19 µm 100 m
Band 11 - Long Wavelength Infrared 11.50 - 12.51 µm 100 m

89
Sensing in the Visible and NIR

http://landsat.gsfc.nasa.gov/wp-content/uploads/2015/06/Landsat.v.Sentinel-2.png

90
Remote Sensing
ASTER
Band | Wavelength (µm) | Resolution (m) | Pointing | Description
VNIR_Band1 | 0.520–0.600 | 15 | Nadir | Green
VNIR_Band2 | 0.630–0.690 | 15 | Nadir | Red
VNIR_Band3N | 0.760–0.860 | 15 | Nadir | Near infrared
VNIR_Band3B | 0.760–0.860 | 15 | Backward | Near infrared
SWIR_Band4 | 1.600–1.700 | 30 | Nadir | Short-wave infrared
SWIR_Band5 | 2.145–2.185 | 30 | Nadir | Short-wave infrared
SWIR_Band6 | 2.185–2.225 | 30 | Nadir | Short-wave infrared
SWIR_Band7 | 2.235–2.285 | 30 | Nadir | Short-wave infrared
SWIR_Band8 | 2.295–2.365 | 30 | Nadir | Short-wave infrared
SWIR_Band9 | 2.360–2.430 | 30 | Nadir | Short-wave infrared
TIR_Band10 | 8.125–8.475 | 90 | Nadir | Long-wave infrared
TIR_Band11 | 8.475–8.825 | 90 | Nadir | Long-wave infrared
TIR_Band12 | 8.925–9.275 | 90 | Nadir | Long-wave infrared
TIR_Band13 | 10.250–10.950 | 90 | Nadir | Long-wave infrared
TIR_Band14 | 10.950–11.650 | 90 | Nadir | Long-wave infrared

91
92
Remote Sensing

ASTER Global Digital Elevation Model (GDEM)

93
Remote Sensing
ASTER vs. Landsat ETM

94
Remote Sensing
High Resolution Satellites
• GeoEye (US)
• Ikonos
• OrbView
• GeoEye
• DigitalGlobe (US)
• QuickBird
• WorldView
• Astrium (EU/France)
• Pléiades
• BlackBridge (Germany)
• RapidEye
• ImageSat (Israel)
• EROS A & B
• etc. etc.

95
Remote Sensing
Example: GeoEye

Band Wavelength Resolution


Panchromatic 0.45 - 0.80 μm 50 cm
Blue 0.45 - 0.51 μm 2m
Green 0.51 - 0.58 μm 2m
Red 0.65 - 0.69 μm 2m
Near Infrared 0.78 - 0.92 μm 2m

96
Remote Sensing data available for free download

• USGS EarthExplorer
• http://earthexplorer.usgs.gov
• USGS Global Visualization Viewer
• http://glovis.usgs.gov
• Global Land Cover Facility (GLCF)
• http://glcf.umd.edu
• ITC's database of Satellites and Sensors
• http://www.itc.nl/research/products/sensordb/AllSatellites.aspx

97
Image acquisition
As there is no reception station in the South Pacific, image acquisition must be scheduled ahead, stored on-board
and transferred when above a receiving station.
As a result there are not many images available for a given area, and they might be cloudy

Transfer picture

Schedule acquisition

Acquire picture

Capricorn Seamount

98
Image acquisition
Imagery is acquired along an orbit, and the desired area may overlap several orbits

The Landsat 8 and Landsat 7 satellites both maintain a near-polar, sun-synchronous orbit, following the World Reference System (WRS-2). They each make an orbit in about 99 minutes, complete over 14 orbits per day, and provide complete coverage of the Earth every 16 days.
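These orbital figures are mutually consistent, as a quick arithmetic check shows (the 233 orbits per cycle matches the standard WRS-2 path count):

```python
MINUTES_PER_DAY = 24 * 60
orbit_minutes = 99                          # approximate orbital period
orbits_per_day = MINUTES_PER_DAY / orbit_minutes
assert orbits_per_day > 14                  # "over 14 orbits per day"

repeat_cycle_days = 16
orbits_per_cycle = round(orbits_per_day * repeat_cycle_days)
assert orbits_per_cycle == 233              # WRS-2 paths per repeat cycle
```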

99
Obtaining free imagery
For freely available imagery (Landsat, Sentinel), you just browse through the catalogue of archived images and select
one with not too many clouds above the area of interest (using quicklooks as guidance). Then you submit the
request and receive a link to download the image. An image would typically be around 1 GB

100
Ordering commercial imagery
For high resolution imagery, you create a polygon for the area of interest and query available imagery for that
polygon. Because the image is billed per sq km, you only select the area you need, yet there are restrictions on the
shape of the polygon and a minimum area for the order (25 km² for WorldView imagery for example)

Total area 25 km²


Min width

101
Image mosaic
Because of paths and cloud cover, a combination (mosaic) of images taken at different times is generally necessary to
cover a big area

102
Satellite sensors
Imagery is the output of the satellite sensors, at a given spatial resolution with bands corresponding to a range of
visible, infrared (or radar wavelengths for radar satellites)

Capricorn Seamount

103
Radar imagery

Water is opaque to radar wavelengths. Radar imagery can be used to detect boats, but it is often impractical for coastal applications:

• A small boat would only be a few pixels on a radar image


• The image is not received in real time
• Once a boat is detected and there is suspicion of illegal activity, you still need to send aerial and naval forces to
control the boat and gather evidence.

Boats

Capricorn Seamount

104
Multispectral imagery
Satellite Resolution Bands
Landsat-8 30m/15m Pan+8 MS +TIR
Sentinel-2 10m-60m 12 bands
IKONOS 4m/1m Pan+4 MS (defunct)
QuickBird 3m/.7m Pan+4 MS (destroyed)
GeoEye-1 1.8m/.5m Pan+4 MS
WorldView-1 .6m Pan
WorldView-2 2m/.5m Pan+8 MS
WorldView-3 1.2m/.3m Pan+8 MS +SWIR+CAVIS

Capricorn Seamount

105
Hyperspectral imagery
Airborne (AVIRIS, CASI) and satellite (EO-1, HySIS)

Capricorn Seamount

106
Sea surface temperature
https://www.ospo.noaa.gov/Products/ocean/sst/50km_night/index.html

Capricorn Seamount

107
Winds
https://manati.star.nesdis.noaa.gov/datasets/ASCATData.php

Capricorn Seamount

108
Satellite bands
WorldView-2 : MSS bands 1-8 have a resolution of 2m, while panchromatic is 50 cm

Band 1: Coastal blue Band 5: Red Band 8: NIR 2

Panchromatic – Res .5m


Capricorn Seamount

109
Pan-sharpening

Bands 4,3,2 - Resolution 2m


+ Panchromatic – Resolution .5m

Pan-sharpening increases the spatial


resolution of the multispectral
image by merging it with the higher
resolution panchromatic image

Capricorn Seamount
Pan-sharpened 4,3,2 Resolution .5m
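The slides do not name a specific pan-sharpening algorithm; one simple and widely used option is the Brovey transform, sketched here for a single pixel with hypothetical digital numbers:

```python
def brovey_pansharpen(r, g, b, pan):
    """Brovey transform for one pixel: scale each band by pan / mean(RGB)."""
    intensity = (r + g + b) / 3.0
    if intensity == 0:
        return (0.0, 0.0, 0.0)
    ratio = pan / intensity
    return (r * ratio, g * ratio, b * ratio)

# Hypothetical digital numbers for one multispectral pixel and the
# co-registered (resampled) panchromatic value at the same location
sharp = brovey_pansharpen(r=60.0, g=90.0, b=120.0, pan=100.0)
# The band ratios are preserved while the brightness follows the pan band
assert sum(sharp) / 3.0 == 100.0
```

In practice the multispectral bands are first resampled to the panchromatic pixel size, then this per-pixel scaling is applied across the whole image.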

110
Image rectification

Capricorn Seamount

111
Image rectification

112
Image rectification

113
Image classification: isodata

Capricorn Seamount

114
Image classification: k-means

Capricorn Seamount

115
Image classification
Image classification can be supervised or unsupervised
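A minimal unsupervised classifier in the spirit of the k-means results shown can be sketched in pure Python for a single band; the pixel values below are hypothetical:

```python
import random

def kmeans_classify(pixels, k, iterations=20, seed=0):
    """Unsupervised classification: cluster pixel values into k classes."""
    rng = random.Random(seed)
    centers = rng.sample(sorted(set(pixels)), k)
    for _ in range(iterations):
        # Assign each pixel to the nearest cluster centre
        labels = [min(range(k), key=lambda c: abs(p - centers[c]))
                  for p in pixels]
        # Recompute each centre as the mean of its assigned pixels
        for c in range(k):
            members = [p for p, l in zip(pixels, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels, centers

# Hypothetical single-band pixel values: dark water vs. bright sand
pixels = [10, 12, 11, 13, 200, 205, 198, 202]
labels, centers = kmeans_classify(pixels, k=2)
# All dark pixels fall in one class and all bright pixels in the other
assert labels[:4] == [labels[0]] * 4 and labels[4:] == [labels[4]] * 4
assert labels[0] != labels[4]
```

Real classifications run the same loop over all bands at once (distances in spectral space rather than on a single value), which is what isodata and k-means tools in image-processing packages do.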

Capricorn Seamount

116
Image classification: k-means
Compare the classified image with the original image

Capricorn Seamount

117
Image classification: raster to vector
Set the symbology and remove classes for deep water

Capricorn Seamount

118