Remote Sensing and Satellite Images - 2021
Presented by
Prof. Dr. Safaa Mohamed Hasan
1
What is remote sensing?
• In the broadest sense, remote sensing is the small- or large-scale acquisition of information about an object or phenomenon, by the use of either recording or real-time sensing device(s) that are not in physical or intimate contact with the object (such as by way of aircraft, spacecraft, satellite, or ship).
2
Remote sensing is…
3
A remote sensing instrument
collects information about an
object or phenomenon within the
field-of-view (FOV) of the
sensor system without being in
direct physical contact with it.
The sensor is located on a
suborbital or satellite platform.
4
Interaction Model Depicting the Relationships of the Mapping Sciences as
they relate to Mathematics and Logic, and the Physical, Biological, and Social
Sciences
5
RS as a Source of Information
Traditionally:
• Field equipment.
• Manual recording.
• Field work.
Sensor:
• Active RS.
• Passive RS.
6
Active and Passive
[Figure: a passive sensor measuring radiance from the surface through the atmosphere, contrasted with an active sensor that provides its own illumination.]
7
8
RS Properties
• Overview of large regions.
9
The person in the photo is holding a sensor similar to the ones used on satellites. By recording a highly accurate spectral signature for this crop type using this hand-held spectrometer, scientists can then search for and extract this signature from satellite imagery and develop detailed maps of this crop type over very large areas.
10
Advantages of Remote Sensing
• Remote sensing is unobtrusive if the sensor passively records the EMR reflected or emitted by the object of interest. Passive remote sensing does not disturb the object or area of interest.
11
Advantages of Remote Sensing
12
13
The Electromagnetic Spectrum
[Figure: sensor types arranged along the electromagnetic spectrum, from the reflective region through the emissive/thermal region to the microwave region, with example sensors such as AVHRR, OLS, MODIS, MERIS, SSM/I, SeaWinds, SAR, and ASAR.]
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and
x-rays) to the longer wavelengths (including microwaves and broadcast radio waves). There
are several regions of the electromagnetic spectrum which are useful for remote sensing.
For most purposes, the ultraviolet (UV) portion of the spectrum has the shortest wavelengths which are practical for remote sensing. This radiation is just beyond the violet portion of the visible wavelengths, hence its name. Some Earth surface materials, primarily rocks and minerals, fluoresce or emit visible light when illuminated by UV radiation.

The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum. It is important to recognize how small the visible portion is relative to the rest of the spectrum. There is a lot of radiation around us which is "invisible" to our eyes, but can be detected by other remote sensing instruments and used to our advantage. The visible wavelengths cover a range from approximately 0.4 to 0.7 μm. The longest visible wavelength is red and the shortest is violet. Common wavelengths of what we perceive as particular colours from the visible portion of the spectrum are listed below. It is important to note that this is the only portion of the spectrum we can associate with the concept of colours.

Blue, green, and red are the primary colours or wavelengths of the visible spectrum. They are defined as such because no single primary colour can be created from the other two, but all other colours can be formed by combining blue, green, and red in various proportions. Although we see sunlight as a uniform or homogeneous colour, it is actually composed of various wavelengths of radiation in primarily the ultraviolet, visible and infrared portions of the spectrum. The visible portion of this radiation can be shown in its component colours when sunlight is passed through a prism, which bends the light in differing amounts according to wavelength.

The next portion of the spectrum of interest is the infrared (IR) region, which covers the wavelength range from approximately 0.7 μm to 100 μm - more than 100 times as wide as the visible portion! The infrared region can be divided into two categories based on their radiation properties: the reflected IR, and the emitted or thermal IR. Radiation in the reflected IR region is used for remote sensing purposes in ways very similar to radiation in the visible portion. The reflected IR covers wavelengths from approximately 0.7 μm to 3.0 μm. The thermal IR region is quite different from the visible and reflected IR portions, as this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat. The thermal IR covers wavelengths from approximately 3.0 μm to 100 μm.

The portion of the spectrum of more recent interest to remote sensing is the microwave region, from about 1 mm to 1 m. This covers the longest wavelengths used for remote sensing. The shorter wavelengths have properties similar to the thermal infrared region, while the longer wavelengths approach the wavelengths used for radio broadcasts. Because of the special nature of this region and its importance to remote sensing in Canada, an entire chapter (Chapter 3) of the tutorial is dedicated to microwave sensing.
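All of these regions are tied to frequency by the same relation. As a worked aside (standard physics, not from the slides), wavelength and frequency are linked through the speed of light:

```latex
\lambda\,\nu = c \approx 3\times10^{8}\ \mathrm{m/s}
\quad\Rightarrow\quad
\nu(0.7\ \mu\mathrm{m,\ red}) = \frac{3\times10^{8}\ \mathrm{m/s}}{0.7\times10^{-6}\ \mathrm{m}} \approx 4.3\times10^{14}\ \mathrm{Hz}
```

Shorter wavelengths therefore mean higher frequencies, which is part of why the UV end and the microwave end of the spectrum behave so differently.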
14
The Electromagnetic Spectrum
15
Visible Spectrum
16
Types of Sensors
• Visible & reflected IR remote sensing - radiation source: the Sun (reflected spectral radiance).
• Thermal IR remote sensing - radiation source: the object itself (emitted spectral radiance).
• Microwave remote sensing - radiation source: the object (passive) or the sensor itself (active).
17
Electromagnetic Wave Interaction/ Propagation
through the Atmosphere
• What happens to solar radiation as it passes through the atmosphere?
• What happens to radiation as it reaches the earth?
• Transmission
• Absorption
• Scattering/Reflection
• Emission
Before radiation used for remote sensing reaches the Earth's surface it has to travel through some distance of the Earth's atmosphere. Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.

Scattering occurs when particles or large gas molecules present in the atmosphere interact with and cause the electromagnetic radiation to be redirected from its original path. How much scattering takes place depends on several factors, including the wavelength of the radiation, the abundance of particles or gases, and the distance the radiation travels through the atmosphere. There are three (3) types of scattering which take place.

Absorption is the other main mechanism at work when electromagnetic radiation interacts with the atmosphere. In contrast to scattering, this phenomenon causes molecules in the atmosphere to absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric constituents which absorb radiation.

Ozone serves to absorb the harmful (to most living things) ultraviolet radiation from the sun. Without this protective layer in the atmosphere our skin would burn when exposed to sunlight.

You may have heard carbon dioxide referred to as a greenhouse gas. This is because it tends to absorb radiation strongly in the far infrared portion of the spectrum - the area associated with thermal heating - which serves to trap this heat inside the atmosphere.

Water vapour in the atmosphere absorbs much of the incoming longwave infrared and shortwave microwave radiation (between 22 μm and 1 m). The presence of water vapour in the lower atmosphere varies greatly from location to location and at different times of the year. For example, the air mass above a desert would have very little water vapour to absorb energy, while the tropics would have high concentrations of water vapour (i.e. high humidity).

Because these gases absorb electromagnetic energy in very specific regions of the spectrum, they influence where (in the spectrum) we can "look" for remote sensing purposes. Those areas of the spectrum which are not severely influenced by atmospheric absorption, and thus are useful to remote sensors, are called atmospheric windows. By comparing the characteristics of the two most common energy/radiation sources (the sun and the earth) with the atmospheric windows available to us, we can define those wavelengths that we can use most effectively for remote sensing. The visible portion of the spectrum, to which our eyes are most sensitive, corresponds to both an atmospheric window and the peak energy level of the sun. Note also that heat energy emitted by the Earth corresponds to a window around 10 μm in the thermal IR portion of the spectrum, while the large window at wavelengths beyond 1 mm is associated with the microwave region.
18
Radiation - Target Interactions
• Incident energy (I)
• Absorption (A)
• Transmission (T)
• Reflection (R)
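These three mechanisms partition the incident energy: at any given wavelength, the absorbed, transmitted, and reflected portions must sum to the incident energy. In the slide's I/A/T/R notation (a standard radiation-budget identity, added here for completeness):

```latex
E_I(\lambda) = E_A(\lambda) + E_T(\lambda) + E_R(\lambda)
```

Dividing through by E_I(λ) gives the absorptance, transmittance, and reflectance - three fractions that sum to 1 and vary with both the target and the wavelength.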
19
1.5 Radiation - Target Interactions
Let's take a look at a couple of examples of targets at the Earth's surface and how energy at the visible and infrared wavelengths interacts with them.

Leaves: A chemical compound in leaves called chlorophyll strongly absorbs radiation in the red and blue wavelengths but reflects green wavelengths. Leaves appear "greenest" to us in the summer, when chlorophyll content is at its maximum. In autumn, there is less chlorophyll in the leaves, so there is less absorption and proportionately more reflection of the red wavelengths, making the leaves appear red or yellow (yellow is a combination of red and green wavelengths). The internal structure of healthy leaves acts as an excellent diffuse reflector of near-infrared wavelengths. If our eyes were sensitive to near-infrared, trees would appear extremely bright to us at these wavelengths. In fact, measuring and monitoring the near-IR reflectance is one way that scientists can determine how healthy (or unhealthy) vegetation may be.

Water: Longer wavelength visible and near infrared radiation is absorbed more by water than shorter visible wavelengths. Thus water typically looks blue or blue-green due to stronger reflectance at these shorter wavelengths, and darker if viewed at red or near infrared wavelengths. If there is suspended sediment present in the upper layers of the water body, then this will allow better reflectivity and a brighter appearance of the water. The apparent colour of the water will show a slight shift to longer wavelengths. Suspended sediment can be easily confused with shallow (but clear) water, since these two phenomena appear very similar. Chlorophyll in algae absorbs more of the blue wavelengths and reflects the green, making the water appear more green in colour when algae is present. The topography of the water surface (rough, smooth, floating materials, etc.) can also lead to complications for water-related interpretation due to potential problems of specular reflection and other influences on colour and brightness.

We can see from these examples that, depending on the complex make-up of the target being looked at and the wavelengths of radiation involved, we can observe very different responses to the mechanisms of absorption, transmission, and reflection. By measuring the energy that is reflected (or emitted) by targets on the Earth's surface over a variety of different wavelengths, we can build up a spectral response for that object. By comparing the response patterns of different features we may be able to distinguish between them, where we might not be able to if we only compared them at one wavelength. For example, water and vegetation may reflect somewhat similarly in the visible wavelengths but are almost always separable in the infrared. Spectral response can be quite variable, even for the same target type, and can also vary with time (e.g. "green-ness" of leaves) and location. Knowing where to "look" spectrally, and understanding the factors which influence the spectral response of the features of interest, are critical to correctly interpreting the interaction of electromagnetic radiation with the surface.
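The red-vs-NIR contrast described above is exactly what the normalized difference vegetation index (NDVI) summarizes; the index is not named on the slide, but it is the standard way to exploit this behaviour. A minimal sketch with illustrative reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    # NDVI = (NIR - red) / (NIR + red): high for healthy vegetation
    # (strong NIR reflection, strong red absorption), negative for water.
    nir = np.asarray(nir, dtype=np.float64)
    red = np.asarray(red, dtype=np.float64)
    return (nir - red) / (nir + red + eps)

# Toy (red, NIR) reflectances for three targets:
targets = {"healthy leaf": (0.05, 0.50),
           "clear water":  (0.04, 0.01),
           "bare soil":    (0.25, 0.30)}
for name, (red_r, nir_r) in targets.items():
    print(f"{name:12s} NDVI = {float(ndvi(nir_r, red_r)):+.2f}")
```

Vegetation comes out strongly positive, water negative, and soil near zero, which is why this single number separates targets that look similar in the visible.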
20
Microwave Radiation
[Figure: the electromagnetic spectrum from the UV (0.2 μm) through the visible, near, middle, and thermal infrared to the microwave range (1 mm - 1 m), with the microwave bands Ka, K, Ku, X, C, S, L, and P marked (after NASA/JPL, 1988). A second panel shows the characteristic atmospheric spectral transmittance (%) versus wavelength (after NASA/JPL, 1988).]

Frequency and wavelength ranges of the shorter microwave bands:

Band   Wavelength range [mm]   Frequency range [GHz]
Ka     7.5 - 11.0              40.0 - 26.5
K      11.0 - 16.7             26.5 - 18.0
Ku     16.7 - 24.0             18.0 - 12.5
X      24.0 - 37.5             12.5 - 8.0

The X, C, S, and L bands are used for SAR on-board satellites.
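The wavelength and frequency columns in the table are consistent with λ = c/f; as a quick check of the X-band lower-frequency edge:

```latex
\lambda = \frac{c}{f} = \frac{3\times10^{8}\ \mathrm{m/s}}{8.0\times10^{9}\ \mathrm{Hz}} = 3.75\times10^{-2}\ \mathrm{m} = 37.5\ \mathrm{mm}
```

which matches the 37.5 mm upper wavelength limit given for the X band.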
21
Surface scatter (Reflection) mechanism
©METI/JAXA
22
Backscatter change in a paddy field
[Figure: HH backscattering over a paddy field, shown on a scale from weak to strong.]
23
Seasonal change
Seasonal changes in microwave backscatter provide useful information for detecting paddy field areas.
[Figure: SAR backscatter, shown on a scale from low to high.]
24
Sentinel 1A
11 Aug 2015
25
Sentinel 1A
22 Oct 2015
After the crop has grown and/or been harvested, there is more backscatter. The river, however, has flooded!
26
Landsat 8 OLI Multispectral sensor
15 Apr 2016
27
Sentinel 1A
17 Jan 2016
28
Sentinel 1A
05 Mar 2016
29
Sentinel 1A
29 Mar 2016
30
Sentinel 1A
22 Apr 2016
31
Sentinel 1A
16 May 2016
32
Sentinel 1A
09 Jun 2016
33
Sentinel 1A
03 Jul 2016
34
Sentinel 1A
20 Aug 2016
35
Sentinel 1A
13 Sep 2016
36
Landsat 8 OLI
04 Jul 2016
Landsat is good, but its 30 m resolution compares unfavourably with Sentinel-2A, shown on the next slide.
37
Sentinel 2A
30 Jun 2016
38
Sentinel 1A
03 Jul 2016
Sentinel-1 is not as good as Sentinel-2 because it has much lower spectral resolution; however, given the prevalence of clouds, it is a great resource.
39
Characteristics of Images
A digital remotely sensed image is typically composed of picture elements (pixels) located at the intersection of each row i and column j in each of K bands of imagery. Associated with each pixel is a number known as the Digital Number (DN) or Brightness Value (BV) that depicts the average radiance of a relatively small area within a scene (Fig. 1). A smaller number indicates low average radiance from the area, and a higher number indicates higher radiance from the area.

Electromagnetic energy may be detected either photographically or electronically. The photographic process uses chemical reactions on the surface of light-sensitive film to detect and record energy variations. It is important to distinguish between the terms images and photographs in remote sensing. An image refers to any pictorial representation, regardless of what wavelengths or remote sensing device has been used to detect and record the electromagnetic energy. A photograph refers specifically to images that have been detected as well as recorded on photographic film. The black and white photo to the left, of part of the city of Ottawa, Canada, was taken in the visible part of the spectrum. Photos are normally recorded over the wavelength range from 0.3 μm to 0.9 μm - the visible and reflected infrared. Based on these definitions, we can say that all photographs are images, but not all images are photographs. Therefore, unless we are talking specifically about an image recorded photographically, we use the term image.

A photograph could also be represented and displayed in a digital format by subdividing the image into small equal-sized and shaped areas, called picture elements or pixels, and representing the brightness of each area with a numeric value or digital number. Indeed, that is exactly what has been done to the photo to the left. In fact, using the definitions we have just discussed, this is actually a digital image of the original photograph! The photograph was scanned and subdivided into pixels, with each pixel assigned a digital number representing its relative brightness. The computer displays each digital value as a different brightness level. Sensors that record electromagnetic energy electronically record the energy as an array of numbers in digital format right from the start. These two different ways of representing and displaying remote sensing data, either pictorially or digitally, are interchangeable as they convey the same information (although some detail may be lost when converting back and forth).

In previous sections we described the visible portion of the spectrum and the concept of colours. We see colour because our eyes detect the entire visible range of wavelengths and our brains process the information into separate colours. Can you imagine what the world would look like if we could only see very narrow ranges of wavelengths or colours? That is how many sensors work. The information from a narrow wavelength range is gathered and stored in a channel, also sometimes referred to as a band. We can combine and display channels of information digitally using the three primary colours (blue, green, and red). The data from each channel is represented as one of the primary colours and, depending on the relative brightness (i.e. the digital value) of each pixel in each channel, the primary colours combine in different proportions to represent different colours.

When we use this method to display a single channel or range of wavelengths, we are actually displaying that channel through all three primary colours. Because the brightness level of each pixel is the same for each primary colour, they combine to form a black and white image, showing various shades of gray from black to white. When we display more than one channel, each as a different primary colour, the brightness levels may be different for each channel/primary colour combination and they will combine to form a colour image.
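A minimal NumPy sketch of these ideas (toy arrays, not real imagery): a band is a 2-D array of digital numbers indexed by row i and column j; one band shown through all three primaries gives shades of gray, while three different bands driving blue, green, and red give a colour composite.

```python
import numpy as np

rows, cols, n_bands = 4, 4, 3
rng = np.random.default_rng(0)
# dn[i, j, k]: 8-bit digital number (0-255) of pixel (i, j) in band k.
dn = rng.integers(0, 256, size=(rows, cols, n_bands), dtype=np.uint8)
print("DNs of pixel (1, 2) in all bands:", dn[1, 2, :])

# Single-channel display: the same band drives all three primaries,
# so every pixel is a shade of gray.
gray = np.stack([dn[:, :, 0]] * 3, axis=-1)

# Colour composite: a different band drives each primary, so the
# per-channel brightness differs and mixes into colour.
rgb = np.stack([dn[:, :, 2], dn[:, :, 1], dn[:, :, 0]], axis=-1)
```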
40
RS Resolution
•Temporal Resolution.
•Spatial Resolution.
•Spectral Resolution.
•Radiometric Resolution.
41
The concept of temporal resolution is also important to consider in a remote sensing system. We alluded to this idea in section 2.2 when we discussed the concept of revisit period, which refers to the length of time it takes for a satellite to complete one entire orbit cycle. The revisit period of a satellite sensor is usually several days. Therefore the absolute temporal resolution of a remote sensing system - the time to image the exact same area at the same viewing angle a second time - is equal to this period. However, because of some degree of overlap in the imaging swaths of adjacent orbits for most satellites, and the increase in this overlap with increasing latitude, some areas of the Earth tend to be re-imaged more frequently. Also, some satellite systems are able to point their sensors to image the same area between different satellite passes separated by periods from one to five days. Thus, the actual temporal resolution of a sensor depends on a variety of factors, including the satellite/sensor capabilities, the swath overlap, and latitude.

The ability to collect imagery of the same area of the Earth's surface at different periods of time is one of the most important elements for applying remote sensing data. Spectral characteristics of features may change over time, and these changes can be detected by collecting and comparing multi-temporal imagery. For example, during the growing season, most species of vegetation are in a continual state of change, and our ability to monitor those subtle changes using remote sensing is dependent on when and how frequently we collect imagery. By imaging on a continuing basis at different times we are able to monitor the changes that take place on the Earth's surface, whether they are naturally occurring (such as changes in natural vegetation cover or flooding) or induced by humans (such as urban development or deforestation). The time factor in imaging is important when:

1. persistent clouds offer limited clear views of the Earth's surface (often in the tropics)
2. short-lived phenomena (floods, oil slicks, etc.) need to be imaged
3. multi-temporal comparisons are required (e.g. the spread of a forest disease from one year to the next)
4. the changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat / maize)
42
Temporal Resolution
16 days
43
Spatial Resolution, Pixel Size, and Scale
For some remote sensing instruments, the distance between the target being imaged and the platform plays a large role in determining the detail of information obtained and the total area imaged by the sensor. Sensors onboard platforms far away from their targets typically view a larger area, but cannot provide great detail. Compare what an astronaut onboard the space shuttle sees of the Earth to what you can see from an airplane. The astronaut might see your whole province or country in one glance, but couldn't distinguish individual houses. Flying over a city or town, you would be able to see individual buildings and cars, but you would be viewing a much smaller area than the astronaut. There is a similar difference between satellite images and airphotos.

The detail discernible in an image is dependent on the spatial resolution of the sensor, which refers to the size of the smallest possible feature that can be detected. Spatial resolution of passive sensors (we will look at the special case of active microwave sensors later) depends primarily on their Instantaneous Field of View (IFOV). The IFOV is the angular cone of visibility of the sensor (A) and determines the area on the Earth's surface which is "seen" from a given altitude at one particular moment in time (B). The size of the area viewed is determined by multiplying the IFOV by the distance from the ground to the sensor (C). This area on the ground is called the resolution cell and determines a sensor's maximum spatial resolution. For a homogeneous feature to be detected, its size generally has to be equal to or larger than the resolution cell. If the feature is smaller than this, it may not be detectable, as the average brightness of all features in that resolution cell will be recorded. However, smaller features may sometimes be detectable if their reflectance dominates within a particular resolution cell, allowing sub-pixel or resolution cell detection.

As we mentioned in Chapter 1, most remote sensing images are composed of a matrix of picture elements, or pixels, which are the smallest units of an image. Image pixels are normally square and represent a certain area on an image. It is important to distinguish between pixel size and spatial resolution - they are not interchangeable. If a sensor has a spatial resolution of 20 metres and an image from that sensor is displayed at full resolution, each pixel represents an area of 20 m x 20 m on the ground. In this case the pixel size and resolution are the same. However, it is possible to display an image with a pixel size different from the resolution. Many posters of satellite images of the Earth have their pixels averaged to represent larger areas, although the original spatial resolution of the sensor that collected the imagery remains the same.

Images where only large features are visible are said to have coarse or low resolution. In fine or high resolution images, small objects can be detected. Military sensors, for example, are designed to view as much detail as possible, and therefore have very fine resolution. Commercial satellites provide imagery with resolutions varying from a few metres to several kilometres. Generally speaking, the finer the resolution, the less total ground area can be seen.

The ratio of distance on an image or map to actual ground distance is referred to as scale. If you had a map with a scale of 1:100,000, an object of 1 cm length on the map would actually be an object 100,000 cm (1 km) long on the ground. Maps or images with small "map-to-ground ratios" are referred to as small scale (e.g. 1:100,000), and those with larger ratios (e.g. 1:5,000) are called large scale.
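Two bits of arithmetic from this passage, as a minimal sketch; the IFOV and altitude are made-up values, chosen so the resolution cell comes out at the 20 m used in the example above.

```python
# Resolution cell: ground size = IFOV (radians) x sensor-to-ground distance.
ifov_mrad = 0.025          # hypothetical IFOV of 0.025 milliradians
altitude_m = 800_000       # hypothetical 800 km platform altitude
cell_m = (ifov_mrad * 1e-3) * altitude_m
print(f"resolution cell = {cell_m:.0f} m")   # 2.5e-5 rad * 8e5 m = 20 m

# Scale: at 1:100,000, 1 cm on the map is 100,000 cm = 1 km on the ground.
scale = 100_000
ground_km = 1.0 * scale / 100_000
print(f"1 cm at 1:{scale:,} = {ground_km:g} km on the ground")
```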
44
45
Spatial domain
Spatial domain refers to the spatial scale of the observations being collected, including the spatial resolution and the area covered.
46
Spatial resolution example
47
Low spatial resolution: pixels are larger than the objects in the image.
High spatial resolution: pixels are smaller than the objects in the image.
48
Spatial Resolution
49
Spectral Resolution
50
Spectral Resolution
51
In a multispectral image, individual bands can be
used to make individual images, which can then
be combined in various ways to bring out
particular details on the surface.
52
Multispectral/hyperspectral image
Ultraspectral
(1000’s of bands)
Hyperspectral
(100’s of bands)
Multispectral
(10’s of bands)
Panchromatic
Many electronic (as opposed to photographic) remote sensors acquire data using scanning systems, which employ a sensor with a narrow field of view (i.e. IFOV) that sweeps over the terrain to build up and produce a two-dimensional image of the surface. Scanning systems can be used on both aircraft and satellite platforms and have essentially the same operating principles. A scanning system used to collect data over a variety of different wavelength ranges is called a multispectral scanner (MSS), and is the most commonly used scanning system. There are two main modes or methods of scanning employed to acquire multispectral image data: across-track scanning and along-track scanning.

Across-track scanners scan the Earth in a series of lines. The lines are oriented perpendicular to the direction of motion of the sensor platform (i.e. across the swath). Each line is scanned from one side of the sensor to the other, using a rotating mirror (A). As the platform moves forward over the Earth, successive scans build up a two-dimensional image of the Earth's surface. The incoming reflected or emitted radiation is separated into several spectral components that are detected independently. The UV, visible, near-infrared, and thermal radiation are dispersed into their constituent wavelengths. A bank of internal detectors (B), each sensitive to a specific range of wavelengths, detects and measures the energy for each spectral band; the signals are then converted to digital data and recorded for subsequent computer processing.

The IFOV (C) of the sensor and the altitude of the platform determine the ground resolution cell viewed (D), and thus the spatial resolution. The angular field of view (E) is the sweep of the mirror, measured in degrees, used to record a scan line, and determines the width of the imaged swath (F). Airborne scanners typically sweep large angles (between 90º and 120º), while satellites, because of their higher altitude, need only sweep fairly small angles (10-20º) to cover a broad region. Because the distance from the sensor to the target increases towards the edges of the swath, the ground resolution cells also become larger and introduce geometric distortions to the images. Also, the length of time the IFOV "sees" a ground resolution cell as the rotating mirror scans (called the dwell time) is generally quite short and influences the design of the spatial, spectral, and radiometric resolution of the sensor.
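A minimal numeric sketch of the swath geometry just described: swath width follows from the angular field of view (E) and the platform altitude as 2·H·tan(FOV/2). The values below are illustrative, though the satellite case roughly reproduces a Landsat-like 705 km altitude / 185 km swath.

```python
import math

def swath_width_km(altitude_km, fov_deg):
    # Flat-Earth approximation: swath = 2 * H * tan(FOV / 2).
    return 2 * altitude_km * math.tan(math.radians(fov_deg) / 2)

print(f"airborne scanner, 10 km, 90 deg FOV:  {swath_width_km(10, 90):6.1f} km")
print(f"satellite scanner, 705 km, 15 deg FOV: {swath_width_km(705, 15):6.1f} km")
```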
53
Challenges in Hyperspectral Image Classification
Concept of hyperspectral imaging using NASA Jet Propulsion Laboratory’s Airborne Visible Infra-Red Imaging Spectrometer
54
Airborne Visible Infrared
Imaging Spectrometer
(AVIRIS) Datacube of
Sullivan’s Island Obtained
on October 26, 1998
55
Spectral Resolution
[Figure: colour film as an example of fine spectral resolution.]
Black and white film records wavelengths extending over much, or all, of the visible portion of the electromagnetic spectrum. Its spectral resolution is fairly coarse, as the various wavelengths of the visible spectrum are not individually distinguished and the overall reflectance in the entire visible portion is recorded. Colour film is also sensitive to the reflected energy over the visible portion of the spectrum, but has higher spectral resolution, as it is individually sensitive to the reflected energy at the blue, green, and red wavelengths of the spectrum. Thus, it can represent features of various colours based on their reflectance in each of these distinct wavelength ranges.

Many remote sensing systems record energy over several separate wavelength ranges at various spectral resolutions. These are referred to as multi-spectral sensors and will be described in some detail in following sections. Advanced multi-spectral sensors called hyperspectral sensors detect hundreds of very narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum. Their very high spectral resolution facilitates fine discrimination between different targets based on their spectral response in each of the narrow bands.
56
Spectral Bandwidths of SPOT and Landsat Sensor Systems
57
Radiometric Resolution
58
Radiometric Resolution
59
Radiometric Resolution
7-bit: 0 - 127
8-bit: 0 - 255
9-bit: 0 - 511
10-bit: 0 - 1023
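The ranges above follow directly from the bit depth: an n-bit sensor quantizes radiance into 2^n levels, numbered 0 to 2^n - 1. A one-loop check:

```python
for bits in (7, 8, 9, 10):
    print(f"{bits:2d}-bit: 0 - {2**bits - 1}")  # 127, 255, 511, 1023
```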
60
[Figure: the same image at low radiometric resolution (left) and high radiometric resolution (right).]
61
62
63
It is useful to examine the image histograms before performing any image enhancement. The x-axis of the histogram is the range of the available digital numbers, i.e. 0 to 255. The y-axis is the number of pixels in the image having a given digital number. The histograms of the three bands of this image are shown in the following figures.
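A minimal sketch of computing such a histogram with NumPy; the `band` array is a random stand-in for one 8-bit band of the image.

```python
import numpy as np

rng = np.random.default_rng(1)
band = rng.integers(0, 256, size=(512, 512), dtype=np.uint8)  # stand-in band

# One bin per digital number: x-axis DN 0-255, y-axis pixel counts.
counts, _ = np.histogram(band, bins=256, range=(0, 256))
print("pixels with DN 0:", counts[0])
```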
64
IMAGE ENHANCEMENT TECHNIQUES
65
66
67
68
69
70
71
Apply histogram stretching to the following sub-image values
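The slide poses this as an exercise on a sub-image shown in the figure; since those values are not in the text, here is a minimal sketch of the standard linear (min-max) stretch on a hypothetical 3x3 sub-image.

```python
import numpy as np

def linear_stretch(img, out_min=0, out_max=255):
    # DN' = (DN - min) / (max - min) * (out_max - out_min) + out_min
    img = img.astype(np.float64)
    lo, hi = img.min(), img.max()
    return np.round((img - lo) / (hi - lo) * (out_max - out_min) + out_min)

sub = np.array([[52, 55, 61],
                [59, 79, 61],
                [85, 61, 52]])  # hypothetical sub-image DNs
print(linear_stretch(sub))      # 52 -> 0 and 85 -> 255; the rest in between
```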
72
73
74
75
76
If we have an image with 3 bits/pixel, the possible range of values is 0 to 7. Suppose we have an image with the following histogram:
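This 3-bit set-up is the classic input for histogram equalization, which maps each DN through the cumulative distribution. A minimal sketch, with hypothetical pixel counts standing in for the histogram in the figure:

```python
import numpy as np

levels = 8                                        # 3 bits/pixel: DNs 0-7
counts = np.array([8, 10, 10, 2, 12, 16, 4, 2])   # hypothetical histogram
cdf = np.cumsum(counts) / counts.sum()

# Equalization: DN' = round((levels - 1) * CDF(DN)).
mapping = np.round((levels - 1) * cdf).astype(int)
for dn, new_dn in enumerate(mapping):
    print(f"DN {dn} -> {new_dn}")
```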
77
78
79
Remote Sensing data
80
Satellite imagery
Low-resolution satellite imagery is generally available for free, while high-resolution imagery must be purchased, and licensing usually restricts its distribution.
Ocean monitoring
• Sea Surface Temperature, Chlorophyll-a (SeaWIFS), MODIS : NOAA/NASA (https://oceancolor.gsfc.nasa.gov)
Capricorn Seamount
81
Remote Sensing
Many remote sensing platforms are designed to follow an orbit (basically north-south) which, in conjunction with the Earth's rotation (west-east), allows them to cover most of the Earth's surface over a certain period of time. These are near-polar orbits, so named for the inclination of the orbit relative to a line running between the North and South poles. Many of these satellite orbits are also sun-synchronous, such that they cover each area of the world at a constant local time of day, called local sun time. At any given latitude, the position of the sun in the sky as the satellite passes overhead will be the same within the same season. This ensures consistent illumination conditions when acquiring images in a specific season over successive years, or over a particular area over a series of days. This is an important factor for monitoring changes between images or for mosaicking adjacent images together, as they do not have to be corrected for different illumination conditions.

Most of the remote sensing satellite platforms today are in near-polar orbits, which means that the satellite travels northwards on one side of the Earth and then toward the southern pole on the second half of its orbit. These are called ascending and descending passes, respectively. If the orbit is also sun-synchronous, the ascending pass is most likely on the shadowed side of the Earth while the descending pass is on the sunlit side. Sensors recording reflected solar energy only image the surface on a descending pass, when solar illumination is available. Active sensors which provide their own illumination, or passive sensors that record emitted (e.g. thermal) radiation, can also image the surface on ascending passes.

As a satellite revolves around the Earth, the sensor "sees" a certain portion of the Earth's surface. The area imaged on the surface is referred to as the swath. Imaging swaths for spaceborne sensors generally vary between tens and hundreds of kilometres wide. As the satellite orbits the Earth from pole to pole, its east-west position wouldn't change if the Earth didn't rotate. However, as seen from the Earth, it seems that the satellite is shifting westward because the Earth is rotating (from west to east) beneath it. This apparent movement allows the satellite swath to cover a new area with each consecutive pass. The satellite's orbit and the rotation of the Earth work together to allow complete coverage of the Earth's surface after it has completed one complete cycle of orbits.
82
Remote Sensing
Landsat
83
Remote Sensing
SPOT
84
Remote Sensing
Landsat 1 to 8 - TM
85
Remote Sensing
SPOT
Blue 6m 0.40-0.52 µm
86
Remote Sensing
SPOT
87
Remote Sensing
Landsat 7 - ETM
88
Remote Sensing
Landsat 8 - OLI & TIRS
89
Sensing in the Visible and NIR
http://landsat.gsfc.nasa.gov/wp-content/uploads/2015/06/Landsat.v.Sentinel-2.png
90
Remote Sensing
ASTER
Band          Wavelength (µm)   Resolution (m)   Pointing   Description
VNIR_Band1    0.520–0.600       15               Nadir      Green
VNIR_Band2    0.630–0.690       15               Nadir      Red
VNIR_Band3N   0.760–0.860       15               Nadir      Near infrared
VNIR_Band3B   0.760–0.860       15               Backward   Near infrared
SWIR_Band4    1.600–1.700       30               Nadir      Short-wave infrared
SWIR_Band5    2.145–2.185       30               Nadir      Short-wave infrared
SWIR_Band6    2.185–2.225       30               Nadir      Short-wave infrared
SWIR_Band7    2.235–2.285       30               Nadir      Short-wave infrared
SWIR_Band8    2.295–2.365       30               Nadir      Short-wave infrared
SWIR_Band9    2.360–2.430       30               Nadir      Short-wave infrared
TIR_Band10    8.125–8.475       90               Nadir      Long-wave infrared
TIR_Band11    8.475–8.825       90               Nadir      Long-wave infrared
TIR_Band12    8.925–9.275       90               Nadir      Long-wave infrared
TIR_Band13    10.250–10.950     90               Nadir      Long-wave infrared
TIR_Band14    10.950–11.650     90               Nadir      Long-wave infrared
91
92
Remote Sensing
93
Remote Sensing
ASTER vs. Landsat ETM
94
Remote Sensing
High Resolution Satellites
• GeoEye (US)
• Ikonos
• OrbView
• GeoEye
• DigitalGlobe (US)
• QuickBird
• WorldView
• Astrium (EU/France)
• Pléiades
• BlackBridge (Germany)
• RapidEye
• ImageSat (Israel)
• EROS A & B
• etc. etc.
95
Remote Sensing
Example: GeoEye
96
Remote Sensing data available for free download
• USGS EarthExplorer
• http://earthexplorer.usgs.gov
• USGS Global Visualization Viewer
• http://glovis.usgs.gov
• Global Land Cover Facility (GLCF)
• http://glcf.umd.edu
• ITC's database of Satellites and Sensors
• http://www.itc.nl/research/products/sensordb/AllSatellites.aspx
97
Image acquisition
As there is no reception station in the South Pacific, image acquisitions must be scheduled ahead; images are stored on-board and transferred when the satellite passes above a receiving station.
As a result there are not many images available for a given area, and they might be cloudy.
[Figure: 1. schedule acquisition; 2. acquire picture; 3. transfer picture.]
Capricorn Seamount
98
Image acquisition
Imagery is acquired along an orbit, and the desired area may overlap several orbits
99
Obtaining free imagery
For freely available imagery (Landsat, Sentinel), you just browse through the catalogue of archived images and select
one with not too many clouds above the area of interest (using quicklooks as guidance). Then you submit the
request and receive a link to download the image. An image would typically be around 1 GB
100
Ordering commercial imagery
For high resolution imagery, you create a polygon for the area of interest and query available imagery for that
polygon. Because the image is billed per sq km, you only select the area you need, yet there are restrictions on the
shape of the polygon and a minimum area for the order (25 km² for WorldView imagery for example)
101
Image mosaic
Because of paths and cloud cover, a combination (mosaic) of images taken at different times is generally necessary to cover a large area.
102
Satellite sensors
Imagery is the output of the satellite sensors, at a given spatial resolution, with bands corresponding to ranges of visible and infrared wavelengths (or radar wavelengths for radar satellites).
Capricorn Seamount
103
Radar imagery
Water is opaque to radar wavelengths. Radar imagery can be used to detect boats but is often impractical for coastal applications.
Boats
Capricorn Seamount
104
Multispectral imagery
Satellite Resolution Bands
Landsat-8 30m/15m Pan+8 MS +TIR
Sentinel-2 10m-60m 12 bands
IKONOS 4m/1m Pan+4 MS (defunct)
QuickBird 3m/.7m Pan+4 MS (destroyed)
GeoEye-1 1.8m/.5m Pan+4 MS
WorldView-1 .6m Pan
WorldView-2 2m/.5m Pan+8 MS
WorldView-3 1.2m/.3m Pan+8 MS +SWIR+CAVIS
Capricorn Seamount
105
Hyperspectral imagery
Airborne (AVIRIS, CASI) and satellite (EO-1, HySIS)
Capricorn Seamount
106
Sea surface temperature
https://www.ospo.noaa.gov/Products/ocean/sst/50km_night/index.html
Capricorn Seamount
107
Winds
https://manati.star.nesdis.noaa.gov/datasets/ASCATData.php
Capricorn Seamount
108
Satellite bands
WorldView-2: multispectral bands 1-8 have a resolution of 2 m, while the panchromatic band is 50 cm.
109
Pan-sharpening
Capricorn Seamount
Pan-sharpened bands 4,3,2 - resolution 0.5 m
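The slides do not say which pan-sharpening algorithm produced this result; one common choice is the Brovey transform, sketched minimally below. All arrays are hypothetical stand-ins, assumed co-registered and already resampled to the pan grid.

```python
import numpy as np

def brovey_pansharpen(ms, pan, eps=1e-6):
    # Brovey transform: scale each MS band by pan / mean(MS bands),
    # injecting the pan band's spatial detail into the colour bands.
    ms = ms.astype(np.float64)
    intensity = ms.mean(axis=2)
    return ms * (pan / (intensity + eps))[:, :, np.newaxis]

rng = np.random.default_rng(2)
ms = rng.uniform(0, 255, size=(8, 8, 3))   # stand-in bands 4, 3, 2
pan = rng.uniform(0, 255, size=(8, 8))     # stand-in pan band (0.5 m grid)
sharpened = brovey_pansharpen(ms, pan)
```

Other standard choices (IHS substitution, Gram-Schmidt) differ mainly in how the intensity component is estimated and replaced.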
110
Image rectification
Capricorn Seamount
111
Image rectification
112
Image rectification
113
Image classification: isodata
Capricorn Seamount
114
Image classification: k-means
Capricorn Seamount
115
Image classification
Image classification can be supervised or unsupervised
Capricorn Seamount
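A minimal sketch of the unsupervised k-means case shown on these slides, using scikit-learn; the `bands` array and the choice of 6 classes are illustrative assumptions, not taken from the slides.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
bands = rng.uniform(0, 255, size=(64, 64, 4))   # stand-in 4-band image

# Unsupervised classification: cluster pixels by spectral similarity.
pixels = bands.reshape(-1, bands.shape[2])      # (n_pixels, n_bands)
kmeans = KMeans(n_clusters=6, n_init=10, random_state=0).fit(pixels)
classified = kmeans.labels_.reshape(bands.shape[:2])
```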
116
Image classification: k-means
Compare the classified image with the original image
Capricorn Seamount
117
Image classification: raster to vector
Set the symbology and remove classes for deep water
Capricorn Seamount
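The slides do not name the software used for the raster-to-vector step; a minimal sketch with GDAL's Polygonize (the file, layer, and field names below are placeholders):

```python
from osgeo import gdal, ogr, osr

src = gdal.Open("classified.tif")            # placeholder input raster
band = src.GetRasterBand(1)

drv = ogr.GetDriverByName("ESRI Shapefile")
dst = drv.CreateDataSource("classes.shp")    # placeholder output
srs = osr.SpatialReference()
srs.ImportFromWkt(src.GetProjection())
layer = dst.CreateLayer("classes", srs=srs, geom_type=ogr.wkbPolygon)
layer.CreateField(ogr.FieldDefn("class", ogr.OFTInteger))

# Trace contiguous runs of identical class values into polygons,
# writing the class value into field 0 ("class").
gdal.Polygonize(band, None, layer, 0, [], callback=None)
dst = None  # flush vector output to disk
```

Unwanted classes (e.g. deep water) can then be removed by deleting features with those class values before setting the symbology.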
118