REMOTE Final

The document outlines the course objectives and fundamental concepts of Geographic Information Systems (GIS) and Remote Sensing, emphasizing their importance in data acquisition, processing, and analysis. It discusses the principles of remote sensing, including the electromagnetic spectrum, types of sensors, and the classification of remote sensing systems. Additionally, it covers key concepts such as resolution types and their significance in interpreting spatial data.


Course objective

• Upon successful completion of this course, students will be able to understand the basic concepts of GIS and Remote Sensing in relation to data acquisition, organization, storage, processing, interpretation, and analysis.

Introduction to Remote Sensing & GIS
Cont

• What is GIS?
• The Importance of Remote Sensing in GIScience.
• RS Methods Used in GIS Data Acquisition.
Data vs. Information
• Data, by itself, generally differs from information.
• Data is of little use unless it is transformed into information.
• Data is transformed into information through the use of an Information System.
What is an Information System?
What is a GIS?

Information System: a means of storing, retrieving, sorting, and comparing data to support some analytic process.

GIS = Information System + spatial data (Geographic Position).
Cont
• A Geographic Information System allows the viewing and analysis of multiple layers of spatially related information associated with a geographic region/location.
• Both spatial and attribute (tabular) data are integrated.
• Information has always been the cornerstone of effective decisions.
• Spatial information is particularly complex, as it requires two descriptors: where is what.
What is a GIS?
• An integration of five basic components
GIS functions
The Importance of RS
• Large amounts of data are needed, and Remote Sensing can provide them.
• Reduces manual field work dramatically.
• Allows retrieval of data for regions difficult or impossible to reach:
  – Open ocean
  – Hazardous terrain (high mountains, extreme weather areas, etc.)
  – Ocean depths
  – Atmosphere
• Allows for the collection of much more data in a shorter amount of time.
• Digital imagery greatly enhances a GIS:
  – DIRECTLY: imagery can serve as a visual aid.
  – INDIRECTLY: it can serve as a source to derive information such as…
    • Land use/land cover
    • Vegetation
    • Water bodies
    • Change detection
Remote Sensing:
Remote Sensing is the acquisition and measurement of data/information on some property or properties of a phenomenon, object, or material by a recording device not in physical, intimate contact with the feature(s) under surveillance.

The art and science of obtaining information about an object without physical contact between the object and the sensor.
Remote sensing cycle

• Remote Sensing includes:

A) The mission plan and choice of sensors.
B) The reception, recording, and processing of the signal data.
C) The analysis of the resultant data.

Remote Sensing
Electromagnetic energy
• Electromagnetic energy refers to all energy that moves with the velocity of light in a harmonic wave pattern.
• The word harmonic implies that the component waves are equally and repetitively spaced in time.
• The wave concept explains the propagation of electromagnetic energy, but this energy is detectable only in terms of its interaction with matter.
Cont
Electromagnetic waves can be described in terms of their:

• Velocity: the speed of light.
• Wavelength: the distance from any position in a cycle to the same position in the next cycle, measured in the standard metric system. Two units are usually used: the micrometer and the nanometer.
• Frequency: the number of wave crests passing a given point in a specific unit of time, with one hertz being the unit for a frequency of one cycle per second.
Cont

• Wavelength and frequency are related by the following formula:

Velocity = wavelength × frequency
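The formula above can be rearranged to compute frequency from wavelength. A quick illustration in Python (the red-light wavelength used here is just an example value):

```python
# Relationship between wavelength and frequency: velocity = wavelength * frequency
C = 3.0e8  # approximate speed of light in m/s

def frequency_hz(wavelength_m: float) -> float:
    """Return the frequency (Hz) of an EM wave of the given wavelength (m)."""
    return C / wavelength_m

# Example: red light at 0.65 micrometres (650 nm) is on the order of 10**14 Hz.
red = 0.65e-6  # metres
print(f"{frequency_hz(red):.3e} Hz")
```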
The Electromagnetic Spectrum

• The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and x-rays) to the longer wavelengths (including microwaves and broadcast radio waves).
• There are several regions of the electromagnetic spectrum which are useful for remote sensing.
Cont
• For most purposes, the ultraviolet or UV portion of the spectrum has the shortest wavelengths which are practical for remote sensing.
• This radiation is just beyond the violet portion of the visible wavelengths.
• Some Earth surface materials, primarily rocks and minerals, emit visible light when illuminated by UV radiation.
Cont

• The light which our eyes - our "remote sensors" - can detect is part of the visible spectrum.
• It is important to recognize how small the visible portion is relative to the rest of the spectrum.
• There is a lot of radiation around us which is "invisible" to our eyes, but can be detected by other remote sensing instruments and used to our advantage.
• The visible wavelengths cover a range from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the shortest is violet.
Cont

• Violet: 0.4 - 0.446 µm
• Blue: 0.446 - 0.500 µm
• Green: 0.500 - 0.578 µm
• Yellow: 0.578 - 0.592 µm
• Orange: 0.592 - 0.620 µm
• Red: 0.620 - 0.7 µm
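The band boundaries listed above can be written as a small lookup table. A sketch in Python (the `colour_of` helper is illustrative, not from the slides):

```python
# Map a visible wavelength (micrometres) to its colour name,
# using the band boundaries listed above.
VISIBLE_BANDS = [
    ("Violet", 0.400, 0.446),
    ("Blue",   0.446, 0.500),
    ("Green",  0.500, 0.578),
    ("Yellow", 0.578, 0.592),
    ("Orange", 0.592, 0.620),
    ("Red",    0.620, 0.700),
]

def colour_of(wavelength_um: float) -> str:
    """Return the colour band a wavelength falls into, if any."""
    for name, lo, hi in VISIBLE_BANDS:
        if lo <= wavelength_um < hi:
            return name
    return "outside visible range"

print(colour_of(0.55))  # Green
print(colour_of(0.65))  # Red
```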
Interactions with the Atmosphere

• Before radiation used for remote sensing reaches the Earth's surface, it has to travel through some distance of the Earth's atmosphere.
• Particles and gases in the atmosphere can affect the incoming light and radiation. These effects are caused by the mechanisms of scattering and absorption.
Scattering
• Scattering occurs when particles or large gas molecules present in the atmosphere interact with electromagnetic radiation and cause it to be redirected from its original path.
• How much scattering takes place depends on several factors, including the wavelength of the radiation, the abundance of particles or gases, and the distance the radiation travels through the atmosphere.
Cont

There are three types of scattering.

• Rayleigh scattering occurs when particles are very small compared to the wavelength of the radiation. These could be particles such as small specks of dust or nitrogen and oxygen molecules.
• Rayleigh scattering causes shorter wavelengths of energy to be scattered much more than longer wavelengths.
Rayleigh scattering

• Rayleigh scattering is the dominant scattering mechanism in the upper atmosphere.
• The fact that the sky appears "blue" during the day is because of this phenomenon. As sunlight passes through the atmosphere, the shorter wavelengths (i.e. blue) of the visible spectrum are scattered more than the other (longer) visible wavelengths.
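The wavelength dependence behind this effect can be quantified: Rayleigh scattering intensity varies roughly as the inverse fourth power of wavelength (a standard physics result, not stated explicitly on the slide). A quick Python check of how much more strongly blue light is scattered than red:

```python
# Rayleigh scattering intensity is roughly proportional to 1/wavelength**4;
# the proportionality constant cancels when comparing two wavelengths.
def rayleigh_ratio(short_um: float, long_um: float) -> float:
    """How much more strongly the shorter wavelength is scattered."""
    return (long_um / short_um) ** 4

# Blue (~0.45 um) vs red (~0.65 um) light: blue is scattered roughly 4x more,
# which is why the daytime sky looks blue.
print(round(rayleigh_ratio(0.45, 0.65), 1))
```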
Mie scattering

• Mie scattering occurs when the particles are just about the same size as the wavelength of the radiation. Dust, pollen, smoke, and water vapour are common causes of Mie scattering.
• Mie scattering occurs mostly in the lower portions of the atmosphere, where larger particles are more abundant.
Nonselective scattering

• This occurs when the particles are much larger than the wavelength of the radiation. Water droplets and large dust particles can cause this type of scattering.
• Nonselective scattering gets its name from the fact that all wavelengths are scattered about equally.
• This type of scattering causes fog and clouds to appear white to our eyes because blue, green, and red light are all scattered in approximately equal quantities (blue + green + red light = white light).
Basic Principles of Remote Sensing
The process of remote sensing

A. Radiation and the atmosphere
B. Interaction with target
C. Energy recorded and converted by sensor
D. Reception and processing
E. Interpretation and analysis

Text by the Canadian Centre for Remote Sensing
Elements of Remote Sensing:
Energy Source or Illumination (A) –

The first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.

Radiation and the Atmosphere (B) –

As the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
Interaction with the Target (C) –

Once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.

Recording of Energy by the Sensor (D) –

After the energy has been scattered by, or emitted from, the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.
Transmission, Reception, and Processing (E) –

The energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).

Interpretation and Analysis (F) –

The processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.

Application (G) –

The final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.
Classification of Remote Sensing Systems:
As mentioned, remote sensing is based on reflected and/or emitted energy, and this energy may come from different sources. Based on the source of energy, remote sensing systems can be classified into two distinct types:

Passive Systems
Active Systems
Passive systems
• The sun provides a very convenient source of energy for remote sensing.
• The sun's energy is either reflected, as it is for visible wavelengths, or absorbed and then reemitted, as it is for thermal infrared wavelengths.
• Remote sensing systems which measure energy that is naturally available are called passive sensors.

Passive Remote Sensing System
Active Systems
• Active sensors, on the other hand, provide their own energy source for illumination.
• The sensor emits radiation which is directed toward the target to be investigated. The radiation reflected from that target is detected and measured by the sensor.
• Advantages of active sensors include the ability to obtain measurements anytime, regardless of the time of day or season.
Active Systems

• Active sensors can be used for examining wavelengths that are not sufficiently provided by the sun, such as microwaves, or to better control the way a target is illuminated.
• However, active systems require the generation of a fairly large amount of energy to adequately illuminate targets.

Cont…
Some examples of active sensors are a laser fluorosensor and a Synthetic Aperture Radar (SAR).

Active Remote Sensing System
Platforms of Remote Sensing
There is a substantial distance between the sensor and the earth's surface features. This means that remote sensing sensors must be held aloft while imaging. The major platforms used in remote sensing are:

Ø On the Ground
Ø Aircraft
Ø Satellites
Space borne Platforms

• In space, remote sensing is sometimes conducted from satellites.
• Satellites are objects which revolve around another object - in this case, the Earth.
• For example, the moon is a natural satellite, whereas man-made satellites include those platforms launched for remote sensing, communication, and telemetry (location and navigation) purposes.
Sun Synchronous orbit

• An orbit chosen in such a way that the satellite is always overhead at the same local sun time is called sun-synchronous.
• Most sun-synchronous orbits cross the equator at midmorning (around 10:30 h). At that moment the sun angle is low and the resultant shadows reveal terrain relief.
• Sun-synchronous orbits allow a satellite to record images at two fixed times during one 24-hour period: one during the day and one at night.
• Examples of near-polar sun-synchronous satellites are Landsat, SPOT, and IRS.
Geostationary Orbit:

• This refers to orbits in which the satellite is placed above the equator (inclination angle is 0º) at a distance of some 36,000 km.
• At this distance, the orbital period of the satellite is equal to the rotation period of the Earth.
• The result is that the satellite is at a fixed position relative to the Earth.
• Geostationary orbits are used for meteorological and telecommunication satellites.

Geo-stationary orbit.
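The "some 36,000 km" figure can be sanity-checked from Kepler's third law, since a geostationary satellite's orbital period must equal one sidereal day (the physical constants below are standard values, not from the slides):

```python
# Sanity-check the geostationary altitude from Kepler's third law:
# r = (G * M_earth * T**2 / (4 * pi**2)) ** (1/3), then subtract Earth's radius.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
R_EARTH = 6.371e6    # mean Earth radius, m
T = 86164.0          # one sidereal day in seconds (~23 h 56 min)

r = (G * M_EARTH * T**2 / (4 * math.pi**2)) ** (1 / 3)
altitude_km = (r - R_EARTH) / 1000
print(round(altitude_km), "km")  # close to the ~36,000 km quoted above
```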
Fundamental term

Resolution
• All remote sensing systems have four types of resolution:
– Spatial
– Spectral
– Temporal
– Radiometric
Resolution
• "Resolution" in remote sensing is the ability of a sensor to distinguish or resolve objects that are physically near or spectrally similar to other adjacent objects.
• High resolution will allow a user to distinguish small, adjacent targets, while objects and their boundaries will be difficult to pinpoint in images with low resolution.
Spatial Resolution
• Spatial resolution is a measure of the smallest separation between two objects that can be resolved by the sensor.
• Spatial resolution is best described by the size of an image pixel.
• A large area covered by a pixel means low spatial resolution, and vice versa.


Spectral band

• A spectral band of a remote sensor is an interval of the EM spectrum for which the average radiance is measured.

Examples:
• Single band: panchromatic camera, radar sensor, laser scanner.
• Multi-band: multispectral sensors.
Spectral Resolution

• Is the ability to resolve spectral features and bands into their separate components.
• A greater number of bands in a specified bandwidth means higher spectral resolution, and vice versa.
• Spectral resolution is the size and number of wavelength intervals or divisions of the spectrum that a system is able to detect.
• A digital sensor that collects data in different portions of the EM spectrum is called a multi-spectral sensor.
Cont

Spectral resolution describes the ability of a sensor to define fine wavelength intervals. The finer the spectral resolution, the narrower the wavelength ranges for a particular channel or band.
Temporal Resolution

• The frequency at which images are recorded/captured of a specific place on the earth.
• The more frequently images are captured, the better or finer the temporal resolution is said to be.
• For example, a sensor that captures an image of an agricultural area twice a day has better temporal resolution than a sensor that only captures that same image once a week.
Cont
The ability to collect imagery of the same area of the Earth's surface at different periods of time is one of the most important elements for applying remote sensing data.

Spectral characteristics of features may change over time, and these changes can be detected by collecting and comparing multi-temporal imagery.

For example, during the growing season, most species of vegetation are in a continual state of change, and our ability to monitor those subtle changes using remote sensing depends on when and how frequently we collect imagery.
Radiometric Resolution

• The sensitivity of the sensor to the magnitude of the received electromagnetic energy determines the radiometric resolution.
• The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy.
Cont

• Advanced multi-spectral sensors, called hyper-spectral sensors, detect hundreds of very narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum.
• Very high spectral resolution facilitates fine discrimination between different targets based on their spectral response in each of the narrow bands.
Cont
• While the arrangement of pixels describes the spatial structure of an image, the radiometric characteristics describe the actual information content in an image.
• The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy.
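Radiometric resolution is commonly expressed as a bit depth: an n-bit sensor records 2^n distinct brightness levels (a standard convention, not stated explicitly on these slides). A minimal Python illustration:

```python
# Radiometric resolution as bit depth: an n-bit sensor can record
# 2**n distinct brightness (digital number) levels.
def brightness_levels(bits: int) -> int:
    """Number of distinct brightness levels for an n-bit sensor."""
    return 2 ** bits

print(brightness_levels(8))   # 256 levels (typical 8-bit imagery)
print(brightness_levels(11))  # 2048 levels -> finer radiometric resolution
```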
Cont
• For some remote sensing instruments, the distance between the target being imaged and the platform plays a large role in determining the detail of information obtained and the total area imaged by the sensor.
• Sensors onboard platforms far away from their targets typically view a larger area, but cannot provide great detail.
• Different classes of features and details in an image can often be distinguished by comparing their responses over distinct wavelength ranges.
Comparison of Satellite image and aerial photograph
Cont

• Images where only large features are visible are said to have coarse or low resolution.
• In fine or high resolution images, small objects can be detected.
• Generally speaking, the finer the resolution, the less total ground area can be seen.
Spectral response
• Spectral response is a characteristic used to identify individual objects present on an image or photograph.
• Spectral response and spectral emissivity curves characterize the reflectance and/or emittance of a feature or target over a variety of wavelengths.
• Different classes of features and details in an image can often be distinguished by comparing their responses over distinct wavelength ranges.
Cont
• Broad classes, such as water and vegetation, can usually be separated using very broad wavelength ranges - the visible and near infrared.
• Other more specific classes, such as different rock types, may not be easily distinguishable using either of these broad wavelength ranges and would require comparison at much finer wavelength ranges to separate them.
• Thus, we would require a sensor with higher spectral resolution.
Wavelength Bands
Visual image interpretation
Visual interpretation

• Spontaneous recognition: identification at first glance.
• Logical inference: converging evidence; associative thinking.
• External information: additional data from fieldwork.
Interpretation elements

Terms to express characteristics of an image:

• Tone/brightness and Hue/colour
• Texture
• Pattern
• Shape
• Size
• Height (by stereo or shadow)
• Location/Association
Digital image classification
Lecture topics

• Basic concepts of pixel-based classification
• Review of principal terms (image space vs. feature space)
• Decision boundaries in feature space
• Unsupervised vs. supervised classification
• Training of the classifier
• Classification algorithms available
• Validation of results
Multispectral classification

What is it?
• Grouping of similar features
• Separation of dissimilar ones
• Assigning class labels to pixels
• Resulting in a manageable number of classes
Cont
What are the advantages of using image classification?

• We are not interested in brightness values, but in thematic characteristics.
• To translate the continuous variability of image data into map patterns that provide meaning to the user.
• To obtain insight into the data with respect to ground cover and surface characteristics.
Cont
Why use it?

• Cost-efficient in the analysis of large data sets
• Results can be reproduced; more objective than visual interpretation
• Effective analysis of complex multi-band (spectral) interrelationships
• Classification achieves data size reduction
• Together with manual digitizing and photogrammetric processing (for map making), classification is the most commonly used image processing technique.
Supervised vs. unsupervised classification

UNSUPERVISED APPROACH
• Considers only spectral distance measures
• Minimum user interaction
• Requires interpretation after classification
• Based on spectral groupings

SUPERVISED APPROACH
• Incorporates prior knowledge
• Requires a training set (samples)
• Based on spectral groupings
• More extensive user interaction
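A minimal sketch of the supervised approach is a minimum-distance-to-means classifier: class means are estimated from a training set, and each pixel is assigned to the class whose mean is nearest in feature space. The two "bands" and the class means below are invented purely for illustration:

```python
# Minimal supervised, pixel-based classification:
# minimum-distance-to-means in a 2-band feature space.
import math

# Class means in (band1, band2) feature space, as estimated from a
# training set (values invented for illustration).
CLASS_MEANS = {
    "water":      (20.0, 10.0),
    "vegetation": (40.0, 90.0),
    "bare soil":  (80.0, 60.0),
}

def classify_pixel(pixel):
    """Assign the class whose mean is closest to the pixel in feature space."""
    return min(CLASS_MEANS, key=lambda c: math.dist(pixel, CLASS_MEANS[c]))

print(classify_pixel((25.0, 15.0)))  # water
print(classify_pixel((45.0, 85.0)))  # vegetation
```

In a real workflow the class means would come from training samples digitized by the analyst, which is exactly the "prior knowledge" and "training set" the supervised approach requires.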
Validation - terminology
User accuracy:
• The probability that a pixel labelled as a certain class on the map actually represents that class on the ground (e.g. 57% of what has been classified as A is A).

Producer accuracy:
• The probability that a reference pixel of a particular class has been labelled as that class. It indicates how well the reference pixels for that class have been classified (e.g. 66% of the reference pixels A were classified as A).

Kappa statistic:
• Takes into account that even assigning labels at random achieves a certain degree of accuracy. Kappa allows us to detect whether two datasets have statistically different accuracies.
Cont
• The error matrix provides information on the overall accuracy = proportion correctly classified (PCC).
• PCC tells us about the amount of error, not where the errors are located.
• PCC = sum of the diagonal elements / total number of pixels sampled for accuracy assessment.
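The PCC formula, together with the user and producer accuracies defined earlier, can be computed directly from an error matrix. A sketch in Python (the 3×3 matrix is invented for illustration; rows are classified labels, columns are reference labels):

```python
# Accuracy measures from an error (confusion) matrix.
# Rows = classified labels, columns = reference labels (invented data).
matrix = [
    [50, 10,  5],  # classified as class 0
    [ 8, 60,  7],  # classified as class 1
    [ 2,  5, 53],  # classified as class 2
]

total = sum(sum(row) for row in matrix)
diagonal = sum(matrix[i][i] for i in range(len(matrix)))

# Overall accuracy: PCC = sum of diagonal elements / total sampled pixels
pcc = diagonal / total

# User accuracy per class: diagonal / row total
# (fraction correct among pixels *classified* as that class)
user = [matrix[i][i] / sum(matrix[i]) for i in range(len(matrix))]

# Producer accuracy per class: diagonal / column total
# (fraction correct among *reference* pixels of that class)
producer = [matrix[i][i] / sum(row[i] for row in matrix) for i in range(len(matrix))]

print(f"Overall accuracy (PCC): {pcc:.2f}")
print("User accuracies:", [round(u, 2) for u in user])
print("Producer accuracies:", [round(p, 2) for p in producer])
```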
