
Fundamentals of Remote Sensing (GeES 3081)

Oda Bultum University
CSSH, Geography Department
Course Instructor: Sisay Getahun (MSc, GIS and Remote Sensing)
Course contents:
 Introduction: concepts and definition of remote sensing
 Electromagnetic energy and remote sensing
 Aerial imagery, satellite imagery, and platforms
 Image enhancement and classification
 Visual image interpretation
 GPS and remote sensing
Unit One: Introduction

1.1 Definition of remote sensing

What is remote sensing? Broadly, any information acquired about an object without touching it is remote sensing.
• Remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy.
Cont’
• It is also defined by other scholars as follows: "RS is the science of acquiring, processing and interpreting images that record the interaction between electromagnetic energy and matter".
Components in Remote Sensing

1. Platform: the vehicle which carries a sensor, e.g. a satellite, aircraft or balloon.
2. Sensor: a device that receives EMR and converts it into a signal, which is displayed as either numerical data or an image.
Types of Remote Sensing
1. Passive sensor: remote sensing systems which measure energy that is naturally available.
• For all reflected energy, this can only take place during the time when the sun is illuminating (lighting) the Earth. There is no reflected energy available from the sun at night. E.g. thermal infrared, Landsat.

[Figure: passive remote sensing — the Sun illuminates the Earth's surface and the sensor records the reflected EM energy]
Cont’
2. Active sensor: provides its own energy source for illumination. The radiation reflected from the target is detected and measured by the sensor.
• Advantage: the ability to obtain measurements at any time, day or night, in any season. E.g. laser, SAR, RADARSAT-1 & RADARSAT-2, and LiDAR.
1.2 Uses and applications
 Remote sensing is used because:
• It provides spatial data: images or map data.
• It provides surface information: estimates of surface and subsurface characteristics.
• It is often the only practical means of access: large areas can be covered with consistent measurements (global monitoring).
• It provides multipurpose data: mapping of vegetation, infrastructure, air pollution, etc.
• It is cost effective: compare the cost of extensive ground-based data collection with remote sensing-based data acquisition.


Application of remote sensing technology
• The remote sensing process involves an interaction between incident radiation and the targets of interest, and requires seven specific elements.
Cont’

1. Energy Source or Illumination (A) – the first requirement


for remote sensing is to have an energy source which
illuminates or provides electromagnetic energy to the target
of interest.
2. Radiation and the Atmosphere (B) – as the energy travels
from its source to the target, it will come in contact with and
interact with the atmosphere it passes through. This
interaction may take place a second time as the energy
travels from the target to the sensor.
Cont’
3. Interaction with the Target (C) - once the energy makes its way to the

target through the atmosphere, it interacts with the target depending on the

properties of both the target and the radiation.

4. Recording of Energy by the Sensor (D) - after the energy has been

scattered by, or emitted from the target, we require a sensor (remote - not in

contact with the target) to collect and record the electromagnetic radiation.

5. Transmission, Reception, and Processing (E) - the energy recorded by

the sensor has to be transmitted, often in electronic form, to a receiving and

processing station where the data are processed into an image (hardcopy

and/or digital).
Cont’
6. Interpretation and Analysis (F) - the processed image
is interpreted, visually and/or digitally or electronically, to
extract information about the target which was illuminated.

7. Application (G) - the final element of the remote sensing


process is achieved when we apply the information we have
been able to extract from the imagery about the target in
order to better understand it, reveal some new information,
or assist in solving a particular problem.
1.3 Spatial data acquisition
• Geospatial data are acquired through Earth observation.
• Earth observation is the gathering of information about the physical, chemical, biological and geometrical properties of our planet; it helps us to assess the status and monitor changes of the natural and cultural environment.
• Purposes of Earth observation:
- mapping
- monitoring the natural and man-made environment
- forecasting
 Spatial data acquired from remote sensing come in the form of images.

[Figure: the same scene visualized with different band combinations — bands 3-2-1, bands 4-3-2 and bands 3-5-2]
Classification of Earth Observation

• Earth observation methods are divided into two groups: ground-based methods and remote sensing-based methods.
I. Ground-based methods: such as making field observations, taking in situ measurements and performing land surveying.
Cont’
II. Remote sensing methods: which are based on the use of image data acquired by a sensor such as an aerial camera, a scanner or a radar.
 Taking a remote sensing approach means that information is derived from the image data.
Chapter Two
2. Electromagnetic energy and remote sensing
 Electromagnetic energy is considered to propagate through space in the form of sine waves.
 The first requirement for remote sensing is to have an energy source to illuminate the target. This energy takes the form of electromagnetic radiation.
 Electromagnetic energy has two oscillating components: electrical energy and magnetic energy.
• Electromagnetic radiation consists of two fields: an electrical field (E), which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M), oriented at right angles to the electrical field. Both fields travel at the speed of light (c).
Cont’

• Two characteristics of electromagnetic radiation are particularly important for understanding remote sensing: the wavelength and the frequency.

I. Wavelength: the length (in m) of one wave cycle, measured as the distance between two successive wave crests.
• Wavelength is usually represented by the Greek letter lambda (λ).
• Wavelength is measured in metres (m) or some factor of metres, such as the nanometre (nm, 10⁻⁹ m) or the micrometre (µm, 10⁻⁶ m).
Cont’

II. Frequency: refers to the number of cycles of a wave passing a fixed point per unit of time.
• Frequency is normally measured in hertz (Hz), equivalent to one cycle per second, and various multiples of hertz.
Models of EME measurement

There are two models by which electromagnetic energy is measured: the wave model and the particle model.

1. Wave model: in this model wavelength and frequency are related by the following formula:

c = λv

where:
c = speed of light (3 × 10⁸ m/s)
λ = wavelength (m)
v = frequency (Hz)
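
As a minimal sketch of this relation in Python (the wavelength value below is illustrative):

# Wave model: c = lambda * v, so v = c / lambda
c = 3.0e8             # speed of light, m/s
wavelength = 0.55e-6  # 0.55 micrometres (green light), in metres

frequency = c / wavelength
print(f"{frequency:.3e} Hz")  # ~5.455e+14 Hz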
2. Particle model (quantum theory): according to this theory
• Light travels as discrete particles ("photons").
• Photon energy (Q) and frequency (v) are positively related:

Q = hv or Q = hc/λ

where:
Q = energy of a quantum (or photon), in joules (J)
h = Planck's constant (6.626 × 10⁻³⁴ J s)
v = frequency = c/λ

• The energy of a quantum is inversely proportional to its wavelength (λ).
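
Similarly, a minimal sketch of the photon energy calculation (the wavelength is again illustrative):

# Particle model: Q = h * v = h * c / lambda
h = 6.626e-34         # Planck's constant, J s
c = 3.0e8             # speed of light, m/s
wavelength = 0.55e-6  # metres

Q = h * c / wavelength
print(f"{Q:.3e} J")   # ~3.614e-19 J; shorter wavelengths give higher Q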
Electromagnetic spectrum

• The electromagnetic spectrum ranges from the shorter wavelengths (including cosmic rays, gamma rays and X-rays) to the longer wavelengths (including microwaves and broadcast radio waves):

Visible: 0.4 - 0.7 µm
Near infrared: 0.7 - 1.3 µm
Middle infrared: 1.3 - 3 µm
Thermal infrared: 3 - 100 µm
Microwaves: 1 mm - 1 m

• Shorter wavelengths have higher frequency and higher energy; longer wavelengths have lower frequency and lower energy.
Cont’

• Illustrated below is the portion of the electromagnetic


spectrum that is useful in remote sensing of the
Earth’s surface.

Figure ----: Portions of the electromagnetic spectrum that are useful for remote sensing
Ultraviolet (UV) light has shorter wavelengths than visible light.

 The ultraviolet has the shortest wavelengths which are practical for remote sensing.

• Some Earth surface materials, primarily rocks and minerals, fluoresce or emit visible light when illuminated by UV radiation.
• Astronomers have to put ultraviolet telescopes on satellites to measure the ultraviolet light from stars and galaxies.
Cont’
• The visible wavelengths cover a range from approximately 0.4 to 0.7 µm.
• Most sensors operate in the visible, infrared and microwave regions of the spectrum, as illustrated in the figure above.
• The longest visible wavelength is red and the shortest is violet:
Red: 0.620 - 0.7 µm
Orange: 0.592 - 0.620 µm
Yellow: 0.578 - 0.592 µm
Green: 0.500 - 0.578 µm
Blue: 0.446 - 0.500 µm
Violet: 0.4 - 0.446 µm

 The visible portion can be shown in its component colours when sunlight is passed through a prism.

 Blue, green, and red are the primary colours; all other colours can be formed by combining blue, green, and red in various proportions.
The next portion of the spectrum of interest is the infrared (IR) region, which covers the wavelength range from approximately 0.7 µm to 100 µm.
The infrared region can be divided into two categories based on their radiation properties:

1. Reflected IR: it is used for remote sensing purposes in ways very similar to radiation in the visible portion.
 The reflected IR covers wavelengths from approximately 0.7 µm to 3.0 µm.

2. Thermal IR: it is quite different from the visible and reflected IR portions, as this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat.
 The thermal IR covers wavelengths from approximately 3.0 µm to 100 µm.


• The portion of the spectrum of more recent interest to remote sensing is the microwave region, from about 1 mm to 1 m.
• This covers the longest wavelengths used for remote sensing.
• The shorter microwave wavelengths have properties similar to the thermal infrared region, while the longer wavelengths approach the wavelengths used for radio broadcasts.
• The microwave region of the spectrum is quite large relative to the visible and infrared, and there are several commonly used wavelength ranges or bands; these were given code letters during World War II, which remain in use to this day.
 Ka, K, and Ku bands: very short wavelengths used in early airborne radar systems but uncommon today.
 X-band: used extensively on airborne systems for military reconnaissance and terrain mapping.
 C-band: common on many airborne research systems (CCRS Convair-580 and NASA AirSAR) and spaceborne systems (including ERS-1 and 2 and RADARSAT).
 S-band: used on board the Russian ALMAZ satellite.
 L-band: used onboard the American SEASAT and Japanese JERS-1 satellites and NASA airborne systems.
 P-band: longest radar wavelengths, used on NASA experimental airborne research systems.
2.1 Energy interaction in the atmosphere
• The most important source of energy is the Sun.
• Before the Sun's energy reaches the Earth's surface, three fundamental interactions in the atmosphere are possible: (1) absorption, (2) transmission and (3) scattering.

1. Absorption: electromagnetic energy travelling through the atmosphere is partly absorbed by various molecules.
• The most efficient absorbers of solar radiation in the atmosphere are ozone (O3), water vapor (H2O) and carbon dioxide (CO2).
Atmospheric transmission windows

• About half of the spectrum between 0 and 22 μm is useless for remote sensing of the Earth's surface, simply because the energy cannot penetrate the atmosphere due to the O3, H2O and CO2 present in it.

2. Transmission: only the spectrum portions outside the main absorption ranges of the atmospheric gases can be used for remote sensing.
• These useful ranges are referred to as atmospheric transmission windows, e.g. the window from 0.4 μm to 2 μm. The radiation in this range (visible, NIR, SWIR) is mainly reflected energy.
3. Atmospheric scattering: occurs when the particles or gaseous molecules present in the atmosphere cause the EM radiation to be redirected from its original path.

 The amount of scattering depends on several factors, including:
 the wavelength of the radiation,
 the amount of particles and gases, and
 the distance the radiant energy travels through the atmosphere.

 Types of scattering:
i. Rayleigh scattering
ii. Mie scattering
iii. Non-selective scattering
I. Rayleigh scattering
• Rayleigh scattering occurs when electromagnetic radiation interacts with particles that are smaller than the wavelength of the incoming light.
• Examples of these particles are tiny specks of dust and nitrogen (N2) and oxygen (O2) molecules.
• The effect of Rayleigh scattering is inversely proportional to the fourth power of the wavelength: shorter wavelengths are scattered much more strongly than longer wavelengths.
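
As a short worked example of this λ⁻⁴ relationship: blue light at 0.4 µm is scattered roughly (0.7 / 0.4)⁴ ≈ 9.4 times more strongly than red light at 0.7 µm, which is why a clear daytime sky looks blue.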
Effect of scattering on our vision of the sky

• In the absence of particles and scattering, the sky would appear black.
• At daytime the solar energy travels the shortest distance through the atmosphere; Rayleigh scattering then causes a clear sky to be observed as blue.
• At sunrise and sunset, the sunlight travels a longer distance through the Earth's atmosphere before reaching us. All the radiation of shorter wavelengths is scattered after some distance, and only the longer wavelengths reach the Earth's surface. As a result we do not see a blue but an orange or red sky.
Cont’
II. Mie scattering
• Mie scattering occurs when the wavelength of the incoming radiation is similar in size to the atmospheric particles.
• The most important causes of Mie scattering are aerosols: a mixture of gases, water vapor and dust.
• Mie scattering is generally restricted to the lower atmosphere, where larger particles are more abundant, and dominates under overcast cloud conditions.
• Mie scattering influences the entire spectral region from the near-ultraviolet up to and including the near-infrared.
III. Non-selective scattering

• Non-selective scattering occurs when the particle size is much larger than the radiation wavelength.
• Typical particles responsible for this effect are water droplets and larger dust particles.
• Non-selective scattering is independent of wavelength, with all wavelengths scattered about equally.
• The most prominent example of non-selective scattering is the effect of clouds (clouds consist of water droplets). Since all wavelengths are scattered equally, a cloud appears white.
2.2 Energy interaction at the Earth's surface

• Reflection: occurs when radiation 'bounces' off the target and is then redirected.
• Absorption: occurs when radiation is absorbed by the target.
• Transmission: occurs when radiation passes through a target.
Cont’
• Many remote sensing systems are designed to measure the radiation reflected from targets.
Spectral Reflectance and Earth Surface Interaction

 Normalized Difference Vegetation Index (NDVI): it is calculated by the general formula

NDVI = (NIR − R) / (NIR + R)

where NIR = near-infrared reflectance and R = red reflectance.

 Landsat TM and Landsat OLI images are important inputs for the NDVI calculation.
 For Landsat TM, NIR is band 4 and red is band 3, so NDVI = (Band 4 − Band 3) / (Band 4 + Band 3).
 For Landsat OLI, NIR is band 5 and red is band 4, so NDVI = (Band 5 − Band 4) / (Band 5 + Band 4).
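
A minimal sketch of the NDVI calculation in Python with NumPy, assuming the NIR and red bands have already been read into arrays (the array names and reflectance values are illustrative):

import numpy as np

# Illustrative NIR and red reflectance arrays
# (for Landsat OLI these would come from band 5 and band 4)
nir = np.array([[0.40, 0.35], [0.50, 0.45]])
red = np.array([[0.10, 0.12], [0.08, 0.20]])

# NDVI = (NIR - R) / (NIR + R); guard against division by zero
denom = np.where((nir + red) == 0.0, 1e-10, nir + red)
ndvi = (nir - red) / denom
print(ndvi)  # values near +1 indicate dense green vegetation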
Unit Three
3. Aerial imagery, satellite imagery, and platforms

3.1 Platforms
• In order for a sensor to collect and record energy reflected or emitted from a target or surface, it must reside on a stable platform removed from the target or surface being observed.
• The platform is the vehicle carrying the remote sensing device.
1. Ground-based sensors
• Ground-based sensors are often used to record detailed information about the surface.
• Sensors may be placed on a ladder, scaffolding, a tall building, a cherry-picker or a crane.
2. Aerial platforms
• Aerial platforms are primarily fixed-wing aircraft, although helicopters are occasionally used.
• They are often used to collect very detailed images.
• They facilitate the collection of data over virtually any portion of the Earth's surface at any time.
3. Spaceborne remote sensing
• Spaceborne remote sensing is carried out using sensors that are mounted on satellites.
• Satellites are objects which revolve around another object; in this case, the Earth.
• Satellites permit repetitive coverage of the Earth's surface on a continuing basis.
Satellite Orbits and Swaths

[Figures: satellite orbit characteristics and swath patterns]
3.2 Scanners and cameras
B. Scanning systems

1. Push broom scanner: the push broom sensor has a geometry similar to a vertical aerial photograph.
• This is also called an "along-track scanner".

2. Whiskbroom scanner: the Earth's surface is scanned point by point and line after line as the satellite moves forward.
• This is also called an "across-track scanner".
3.3 Image data characteristics
Image characteristics:
• Spatial: area measured
• Spectral: wavelengths sensor is sensitive to
• Radiometric: energy levels measured
• Temporal: time of acquisition
Each of these is specified by:
• Coverage: range between min. and max.
• Resolution: smallest units distinguished
Spatial characteristics

• Spatial coverage: total area covered by one image


• Depends on: Total Field of View (FOV) (as angle or
in km) and swath width (in km).
• Spatial Resolution: The smallest unit of area
detected.
Spectral characteristics
• Spectral coverage: total wavelength range observed by the sensor.
• Spectral resolution: related to the widths of the spectral bands to which a sensor is sensitive.

[Figure: example of spectral resolution]
Radiometric characteristics
• Radiometric coverage/Dynamic range: minimum
and maximum energy levels that can be distinguished
by the sensor
• Radiometric Resolution: smallest differences in
energy distinguished.
Temporal characteristics
• Temporal coverage: span of time over which images are
recorded and archived.
• Temporal resolution/Revisit time: minimum time
between two images of the same area.
Additional properties of image data
• Pixel size: area on the ground covered by one pixel.
• Number of bands: number of distinct wavelength
ranges stored.
• Quantization: number of values used to represent
different energy levels.
• Image size: number of rows and columns
• Data Storage = f(size, number of bands, quantization).
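
As a rough sketch of this storage relationship in Python (the scene dimensions below are illustrative, loosely Landsat-sized):

# Data storage = rows * columns * bands * (quantization bits / 8)
rows, cols = 7000, 7000  # image size in pixels (illustrative)
bands = 7                # number of spectral bands
bits = 8                 # quantization: 8 bits = 256 energy levels

size_bytes = rows * cols * bands * bits // 8
print(f"{size_bytes / 1e6:.0f} MB")  # ~343 MB uncompressed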
3.4 Data selection criteria

Data are selected based on the following criteria:
 Availability (in archives, or still to be sensed?)
 Costs, which are related to:
- resolution
- quality
- availability

[Table: a few satellite images and their characteristics]
Stereoscopic imagery
• Needed in order to measure three-dimensionally.
• Achieved by forward overlap in aerial photography.
• Can be achieved with satellites (e.g. SPOT), but only from different orbital passes.
Unit Four
4.1 Image enhancement and visualization

• Image enhancement is used to make imagery easier to interpret and understand visually.
• Image enhancement deals with the procedure of making raw images better interpretable/suitable for a particular application.
Remote sensing data pre-processing
(a) Atmospheric correction
(b) Radiometric correction
(c) Geometric correction
Cont’

Methods of geometric correction (a sketch of the GCP-based approach follows this list):
1. Using the satellite header file (satellite onboard GPS)
2. Image-to-image registration
3. Image-to-map registration
4. Manually entered GCPs (Ground Control Points)
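
As an illustration of the GCP-based approach, the sketch below fits a 2D affine transform from image coordinates to map coordinates by least squares; the GCP values are made up for the example:

import numpy as np

# Hypothetical GCPs: (column, row) in the image vs (x, y) on the map
img = np.array([[10., 10.], [900., 15.], [20., 800.], [850., 780.]])
map_xy = np.array([[500010., 4199990.], [502680., 4199970.],
                   [500040., 4197600.], [502530., 4197660.]])

# Solve [col, row, 1] @ A = [x, y] for the six affine coefficients
G = np.column_stack([img, np.ones(len(img))])
A, _, _, _ = np.linalg.lstsq(G, map_xy, rcond=None)

# Apply the fitted transform to a new pixel location
print(np.array([450., 400., 1.]) @ A)  # map coordinates of pixel (450, 400)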
4.2 Image classification
• Image classification refers to the computer-assisted
interpretation of remotely sensed images.
• Image classification is based on the different spectral
characteristics of different materials on the Earth’s
surface.
Cont’
Principles of image classification
 A digital image is a 2D array of elements. Each element stores the energy reflected or emitted from the corresponding area on the Earth's surface.
 The spatial arrangement of the measurements defines the image, or image space.

[Figure: examples of image spaces]

 Feature space: in one pixel, the values in (for example) two bands can be regarded as components of a two-dimensional vector, the feature vector.
 The feature vector can be plotted in a two-dimensional graph.
 Similarly, this approach can be visualized for a three-band situation in a three-dimensional graph.
Image classification process
1. Selection and preparation of image data: select the most appropriate sensor, date(s) and wavelength bands
2. Definition of clusters in the feature space:
- Supervised classification: the operator defines the clusters during the training process
- Unsupervised classification: a clustering algorithm automatically finds and defines a number of clusters in the feature space
3. Selection of classification algorithms: the operator needs to decide
on how the pixels are assigned to the classes
4. Running the actual classification: based on its DN-values, each
individual pixel in the image is assigned to one of the predefined
classes
5. Validation of the result: Once the classified image has been
produced its quality is assessed by comparing it to reference data
(ground truth). This requires selection of a sampling technique,
generation of an error matrix, and the calculation of error parameters
As listed above, the process of image classification typically involves five steps.
• Common classification procedures can be broken down into two broad subdivisions based on the method used: supervised classification and unsupervised classification.

1. Supervised classification: the most used technique for the quantitative analysis of RS image data, based on their reflectance properties. It uses the spectral signatures obtained from training samples to classify an image, as sketched below.
• Thus, the analyst is "supervising" the categorization of a set of specific classes.
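
A minimal sketch of a supervised classifier, here scikit-learn's k-nearest-neighbour classifier on made-up training pixels (the band values and class labels are illustrative; real workflows use full training areas):

import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Training samples: (red, NIR) feature vectors with known classes
X_train = np.array([[0.10, 0.45], [0.12, 0.50],   # vegetation
                    [0.30, 0.35], [0.28, 0.33],   # bare soil
                    [0.05, 0.03], [0.06, 0.04]])  # water
y_train = np.array(["veg", "veg", "soil", "soil", "water", "water"])

clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

# Assign unknown pixels to classes by their spectral signature
print(clf.predict(np.array([[0.11, 0.48], [0.07, 0.05]])))  # ['veg' 'water']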
[Figure: supervised classification]
2. Unsupervised classification: unsupervised classification in essence reverses the supervised classification process.
• It is a computerized method, without direction from the analyst, in which pixels with similar digital numbers are grouped together into spectral classes using statistical procedures such as nearest neighbour and cluster analysis.
• Unsupervised classification is not completely without human intervention, because the analyst needs to specify how many groups or clusters are to be looked for in the data.
Cont’
• Unsupervised classification uses statistical clustering methods to conglomerate pixels into groups according to the similarity of their reflectance values in each spectral band; a minimal sketch follows.
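
A minimal unsupervised sketch using scikit-learn's k-means clustering; the pixel values are illustrative, and the analyst supplies only the number of clusters to look for:

import numpy as np
from sklearn.cluster import KMeans

# Flattened image pixels as (red, NIR) feature vectors (illustrative)
pixels = np.array([[0.10, 0.45], [0.11, 0.48], [0.30, 0.35],
                   [0.28, 0.33], [0.05, 0.03], [0.06, 0.04]])

# The analyst specifies only how many spectral clusters to look for
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
print(kmeans.labels_)  # each pixel is assigned to one of the 3 clusters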
Unit Five
5. Visual image interpretation
• Information extraction methods for remote sensing imagery can be subdivided into two groups:
1. Information extraction based on visual analysis or interpretation of the data, e.g. visual interpretation methods for land use or soil mapping.
2. Information extraction based on semi-automatic processing by the computer.
• The most intuitive way to extract information from remote sensing images is by visual image interpretation, which is based on the human ability to relate colors and patterns in an image to real-world features.
Cont.
1. Spontaneous recognition: refers to the ability of an interpreter to identify objects or phenomena at first glance. This might be because of:
- earlier (professional) experience
- knowledge of the area/object ("I see because I know").
2. Logical inference: means that the interpreter applies reasoning. In the reasoning the interpreter will use his/her professional knowledge and experience.
• E.g. concluding that a rectangular shape is a swimming pool because of its location in a garden and near to a house.
• Sometimes logical inference alone cannot help you in interpreting images, so field observations are required.
• Hue refers to the color of an object.
• We can distinguish objects based on their color.
• Variations in hue are primarily related to the spectral characteristics of the measured areas, and also to the bands selected for visualization.
• Tone is defined as the relative brightness of a black/white image.
• Tonal variations are important elements in image interpretation.
• The tonal expression of objects in the image is directly related to the amount of light (energy) reflected from the surface.
• Shape or form characterizes many terrain objects visible in the image.
• The shape of objects often helps to determine the character of the object (built-up areas, roads and railroads, agricultural fields, etc.).
• The size of objects can be considered in a relative or an absolute sense.
• The width of roads, for example, can be estimated from the scale of the image and measurement on the image.
Shadow
• Pattern refers to the spatial arrangement of objects
and implies the characteristic repetition of certain
forms or relationship.
• Pattern can be described by terms such as concentric,
radial, checkerboard, etc.
• Other typical examples include the hydrological system (a river with its branches) and patterns related to erosion.
• Texture relates to the frequency of tonal change.
• Texture may be described by terms such as coarse or fine, smooth or rough, even or uneven, mottled, speckled, granular, linear, woolly, etc.
• Texture can often be related to terrain roughness.
• Texture is strongly related to the spatial resolution of the sensor applied.
• A pattern on a large-scale image may show as texture on a small-scale image.
• Site relates to the topographic or geographic location
• A typical example of this interpretation element is
that back-swamps can be found in a flood plain but
not in the centre of a city area.
• Similarly, a large building at the end of a number of
converging railroads is likely to be a railway station –
we would not expect a hospital at this site.
Association
Convergence
Other interpretation aids include:
Quality of interpretation

The quality of the result of interpretation


depends on a number of factors:
• The interpreter
• The image data used
• The guidelines provided
Unit Six
GPS and remote sensing

• The Global Positioning System (GPS) is a satellite-based navigation system which provides approximate position and time information on the Earth at any time, anywhere, in any weather.
• GPS allows an equipped user to determine their three-dimensional position, velocity and time.
• A network of 24 satellites orbits the Earth and transmits signals that can be detected by anyone with a GPS receiver.
Cont’
• GPS consists of three major segments:

Structure of GPS
1. Space segment
• 24 satellites orbiting the Earth,
• in six orbits with 4 satellites each,
• about 20,000 km above the Earth.
• Each satellite produces and broadcasts a GPS signal towards the Earth.
• The space segment is controlled and managed by the GPS control segment (US).
2. User segment
• The user segment consists of a GPS receiver and antenna that allow the user to receive GPS signals and compute their position, velocity and time.
3. Control Segment
• The control segment is composed of a master control station, an
alternate master control station, and a host of dedicated and shared
ground antennas and monitor stations.
Cont’
Basic concept of GPS
• A GPS receiver calculates its position by precisely timing
the signals sent by GPS satellites high above the Earth.
• The receiver uses the messages it receives to determine
the transit time of each message and computes the
distance to each satellite using the speed of light.
• These distances and satellites' locations are used to compute
the location of the receiver using the navigation equations.
• Many GPS units show derived information such as
direction and speed, calculated from position changes.
Cont’

Speed of light: c = 300,000 km/s ≈ 300,000,000 m/s = 3 × 10⁸ m/s
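
A minimal sketch of the ranging idea in Python (the transit time below is made up; a real receiver must also solve for its own clock error):

# Distance to a satellite = speed of light * signal transit time
c = 3.0e8              # speed of light, m/s
transit_time = 0.070   # seconds (illustrative; tens of ms from a GPS satellite)

distance = c * transit_time
print(f"{distance / 1000:.0f} km")  # ~21000 km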


GPS error sources
• GPS errors are affected by geometric dilution of precision and depend on signal arrival time errors, numerical errors, atmospheric effects, ephemeris errors, multipath errors and other effects.
• Variability in solar radiation pressure has an indirect effect on GPS accuracy through its effect on ephemeris errors.
Applications of GPS

• Resource mapping
• Feature mapping
- points, lines & areas
- points joined to form lines & areas
- calculation of sizes & features
• Educational mapping
• Health mapping
Cont’

• Population distribution (people & animals)
• Soil mapping
• Water resource mapping (sources & distribution)
• Infrastructure mapping (points & sources)
• Market mapping (product outlets, clients' distribution & focus)
• Monitoring for disaster management (dams, buildings)
Absolute positioning
• Absolute positioning involves the use of only one passive receiver at one station to collect data from multiple satellites to determine the station's location.
• It is not sufficiently accurate for precise surveying and positioning use.

Relative positioning
• Relative positioning is used to determine the coordinates of an unknown point with respect to other known points.
• The vector between the two points is called the baseline vector.
Thank you for your attention!!
