PGDGIS 03
1.1 Introduction
1.3.1 Introduction
1.4 Resolutions
1.5 Summary
1.6 Glossary
1.7 References
1.1 Introduction
"Remote sensing is the science (and to some extent, art) of acquiring information about
the Earth's surface without actually being in contact with it. This is done by sensing and
recording reflected or emitted energy and processing, analyzing, and applying that
information." In much of remote sensing, the process involves an interaction between
incident radiation and the targets of interest. This is exemplified by the use of imaging
systems where the following seven elements are involved. Note, however, that remote
sensing also involves the sensing of emitted energy and the use of non-imaging sensors.
1. Energy Source or Illumination (A) – the first requirement for remote sensing
is to have an energy source which illuminates or provides electromagnetic energy
to the target of interest.
2. Radiation and the Atmosphere (B) – as the energy travels from its source to
the target, it will come in contact with and interact with the atmosphere it passes
through. This interaction may take place a second time as the energy travels from
the target to the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target
through the atmosphere, it interacts with the target depending on the properties of
both the target and the radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered
by, or emitted from the target, we require a sensor (remote - not in contact with
the target) to collect and record the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally, to extract information about the target which was illuminated.
7. Application (G) - the final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.
These seven elements comprise the remote sensing process from beginning to end. We
will be covering all of these in sequential order throughout the five chapters of this
tutorial, building upon the information learned as we go. Enjoy the journey!
As was noted in the previous section, the first requirement for remote sensing is to
have an energy source to illuminate the target (unless the sensed energy is being
emitted by the target). This energy is in the form of electromagnetic radiation.
The wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests. Wavelength is usually represented by the Greek letter lambda (λ). Wavelength is measured in metres (m) or some factor of metres such as nanometres (nm, 10^-9 metres), micrometres (µm, 10^-6 metres) or centimetres (cm, 10^-2 metres). Frequency refers to the number of cycles of a wave passing a fixed point per unit of time. Frequency is normally measured in hertz (Hz), equivalent to one cycle per second, and various multiples of hertz. Wavelength and frequency are related by the following formula:

c = λ × ν

where λ is the wavelength (m), ν is the frequency (cycles per second, Hz) and c is the speed of light (3 × 10^8 m/s).
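As a quick numeric check of this relationship (a minimal sketch in Python; the 0.55 µm value is an illustrative choice for green light):

    # Frequency of green light from the relation c = wavelength x frequency.
    c = 3.0e8              # speed of light, m/s
    wavelength = 0.55e-6   # 0.55 micrometres, expressed in metres
    print(c / wavelength)  # ~5.45e14 Hz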
Therefore, the two are inversely related to each other. The shorter the
wavelength, the higher the frequency. The longer the wavelength, the lower the
frequency. Understanding the characteristics of electromagnetic radiation in
terms of their wavelength and frequency is crucial to understanding the
information to be extracted from remote sensing data. Next we will be examining
the way in which we categorize electromagnetic radiation for just that purpose.
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma
and x-rays) to the longer wavelengths (including microwaves and broadcast radio
waves). There are several regions of the electromagnetic spectrum which are useful for
remote sensing.
For most purposes, the ultraviolet or UV portion of the spectrum has the shortest
wavelengths which are practical for remote sensing. This radiation is just beyond the
violet portion of the visible wavelengths, hence its name. Some Earth surface materials,
primarily rocks and minerals, fluoresce or emit visible light when illuminated by UV
radiation.
The light which our eyes - our "remote sensors" - can detect is part of the visible
spectrum. It is important to recognize how small the visible portion is relative to the rest
of the spectrum. There is a lot of radiation around us which is "invisible" to our eyes, but
can be detected by other remote sensing instruments and used to our advantage. The
visible wavelengths cover a range from approximately 0.4 to 0.7 µm. The longest visible
wavelength is red and the shortest is violet. Common wavelengths of what we perceive as particular colours from the visible portion of the spectrum are listed below. It is important to note that this is the only portion of the spectrum we can associate with the concept of colours.
Violet: 0.4 - 0.446 µm
Blue: 0.446 - 0.500 µm
Green: 0.500 - 0.578 µm
Yellow: 0.578 - 0.592 µm
Orange: 0.592 - 0.620 µm
Red: 0.620 - 0.7 µm
Blue, green, and red are the primary colours or wavelengths of the visible spectrum.
They are defined as such because no single primary colour can be created from the other
two, but all other colours can be formed by combining blue, green, and red in various proportions.
The next portion of the spectrum of interest is the infrared (IR) region which covers the
wavelength range from approximately 0.7 µm to 100 µm - more than 100 times as wide as
the visible portion! The infrared region can be divided into two categories based on their
radiation properties - the reflected IR, and the emitted or thermal IR.
Radiation in the reflected IR region is used for remote sensing purposes in ways very
similar to radiation in the visible portion. The reflected IR covers wavelengths from
approximately 0.7 µm to 3.0 µm. The thermal IR region is quite different than the visible
and reflected IR portions, as this energy is essentially the radiation that is emitted from
the Earth's surface in the form of heat. The thermal IR covers wavelengths from
approximately 3.0 µm to 100 µm.
The portion of the spectrum of more recent interest to remote sensing is the microwave
region from about 1 mm to 1 m. This covers the longest wavelengths used for remote
sensing. The shorter wavelengths have properties similar to the thermal infrared region
while the longer wavelengths approach the wavelengths used for radio broadcasts.
Because of the special nature of this region and its importance to remote sensing in
Canada, an entire chapter (Chapter 3) of the tutorial is dedicated to microwave sensing.
Before radiation used for remote sensing reaches the Earth's surface it has to
travel through some distance of the Earth's atmosphere. Particles and gases in
the atmosphere can affect the incoming light and radiation. These effects are
caused by the mechanisms of scattering and absorption.
Rayleigh scattering occurs when particles are very small compared to the
wavelength of the radiation.
These could be particles such as small specks of dust or nitrogen and oxygen
molecules. Rayleigh scattering causes shorter wavelengths of energy to be
scattered much more than longer wavelengths. Rayleigh scattering is the
dominant scattering mechanism in the upper atmosphere. The fact that the sky
appears "blue" during the day is because of this phenomenon. As sunlight passes
through the atmosphere, the shorter wavelengths (i.e. blue) of the visible spectrum
are scattered more than the other (longer) visible wavelengths. At sunrise and
sunset the light has to travel farther through the atmosphere than at midday and
the scattering of the shorter wavelengths is more complete; this leaves a greater
proportion of the longer wavelengths to penetrate the atmosphere.
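The wavelength dependence here is strong: the intensity of Rayleigh scattering varies approximately as the inverse fourth power of wavelength (a standard result, not stated explicitly above). A minimal sketch with illustrative wavelengths:

    # Rayleigh scattering intensity scales roughly as wavelength^-4.
    blue, red = 0.45e-6, 0.65e-6   # illustrative wavelengths, in metres
    ratio = (red / blue) ** 4
    print(f"Blue light is scattered about {ratio:.1f} times more than red")  # ~4.4x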
Mie scattering occurs when the particles are just about the same size as the
wavelength of the radiation.
Dust, pollen, smoke and water vapour are common causes of Mie scattering
which tends to affect longer wavelengths than those affected by Rayleigh
scattering. Mie scattering occurs mostly in the lower portions of the atmosphere
where larger particles are more abundant, and dominates when cloud conditions
are overcast. The final scattering mechanism of importance is called nonselective
scattering. This occurs when the particles are much larger than the wavelength of
the radiation. Water droplets and large dust particles can cause this type of
scattering. Nonselective scattering gets its name from the fact that all wavelengths
are scattered about equally. This type of scattering causes fog and clouds to
appear white to our eyes because blue, green, and red light are all scattered in
approximately equal quantities (blue+green+red light = white light).
Ozone, carbon dioxide, and water vapour are the three main atmospheric
constituents which absorb radiation. Ozone serves to absorb the harmful (to most
living things) ultraviolet radiation from the sun. Without this protective layer in
the atmosphere our skin would burn when exposed to sunlight. You may have
heard carbon dioxide referred to as a greenhouse gas. This is because it tends to
absorb radiation strongly in the far infrared portion of the spectrum - that area
associated with thermal heating - which serves to trap this heat inside the
atmosphere. Water vapour in the atmosphere absorbs much of the incoming
longwave infrared and shortwave microwave radiation (between 22µm and 1m).
The presence of water vapour in the lower atmosphere varies greatly from
location to location and at different times of the year. For example, the air mass
above a desert would have very little water vapour to absorb energy, while the
tropics would have high concentrations of water vapour (i.e. high humidity).
Because these gases absorb electromagnetic energy in very specific regions of the
spectrum, they influence where (in the spectrum) we can "look" for remote
sensing purposes.
Those areas of the spectrum which are not severely influenced by atmospheric
absorption and thus, are useful to remote sensors, are called atmospheric
windows. By comparing the characteristics of the two most common
energy/radiation sources (the sun and the earth) with the atmospheric windows
available to us, we can define those wavelengths that we can use most effectively
for remote sensing.
Radiation that is not absorbed or scattered in the atmosphere can reach and
interact with the Earth's surface. There are three (3) forms of interaction that can
take place when energy strikes, or is incident (I) upon the surface. These are:
absorption (A); transmission (T); and reflection (R). The total incident energy
will interact with the surface in one or more of these three ways. The proportions
of each will depend on the wavelength of the energy and the material and
condition of the feature.
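One convenient way to express this balance, for each wavelength λ, is:

E_I(λ) = E_A(λ) + E_T(λ) + E_R(λ)

where E_I is the incident energy and E_A, E_T and E_R are the absorbed, transmitted and reflected portions respectively (an equation-form restatement of the proportions described above).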
Absorption (A) occurs when radiation (energy) is absorbed into the target while
transmission (T) occurs when radiation passes through a target.
Reflection (R) occurs when radiation "bounces" off the target and is redirected. In
remote sensing, we are most interested in measuring the radiation reflected from
targets. We refer to two types of reflection, which represent the two extreme ends
of the way in which energy is reflected from a target: specular reflection and
diffuse reflection.
When a surface is smooth we get specular or mirror-like reflection where all (or
almost all) of the energy is directed away from the surface in a single direction.
Diffuse reflection occurs when the surface is rough and the energy is reflected
almost uniformly in all directions.
Most earth surface features lie somewhere between perfectly specular or perfectly diffuse reflectors. Whether a particular target reflects specularly or diffusely, or somewhere in between, depends on the surface roughness of the feature in comparison to the wavelength of the incoming radiation. Consider, as an example, how energy at visible and infrared wavelengths interacts with water.
Water: Longer wavelength visible and near infrared radiation is absorbed more
by water than shorter visible wavelengths. Thus water typically looks blue or
blue-green due to stronger reflectance at these shorter wavelengths, and darker if
viewed at red or near infrared wavelengths. If there is suspended sediment
present in the upper layers of the water body, then this will allow better
reflectivity and a brighter appearance of the water. The apparent colour of the
water will show a slight shift to longer wavelengths. Suspended sediment (S) can
be easily confused with shallow (but clear) water, since these two phenomena
appear very similar. Chlorophyll in algae absorbs more of the blue wavelengths
and reflects the green, making the water appear more green in colour when algae
is present. The topography of the water surface (rough, smooth, floating
materials, etc.) can also lead to complications for water-related interpretation
due to potential problems of specular reflection and other influences on colour
and brightness. We can see from these examples that, depending on the complex
make-up of the target that is being looked at, and the wavelengths of radiation
involved, we can observe very different responses to the mechanisms of
absorption, transmission, and reflection. By measuring the energy that is reflected
(or emitted) by targets on the Earth's surface over a variety of different
wavelengths, we can build up a spectral response for that object. By comparing
the response patterns of different features we may be able to distinguish between
them, where we might not be able to, if we only compared them at one
wavelength. For example, water and vegetation may reflect somewhat similarly in
the visible wavelengths but are almost always separable in the infrared. Spectral
response can be quite variable, even for the same target type, and can also vary
with time (e.g. "green-ness" of leaves) and location. Knowing where to "look"
spectrally and understanding the factors which influence the spectral response of
the features of interest are critical to correctly interpreting the interaction of
electromagnetic radiation with the surface.
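A simple band-ratio index such as the NDVI (not named in this tutorial, but built on exactly this idea of comparing responses at two wavelengths) illustrates the water/vegetation example; the reflectance values below are assumed for illustration:

    # Normalized Difference Vegetation Index from red and near-infrared reflectance.
    def ndvi(red, nir):
        return (nir - red) / (nir + red)

    print(ndvi(red=0.08, nir=0.04))  # water absorbs NIR -> about -0.33
    print(ndvi(red=0.06, nir=0.50))  # vegetation reflects NIR strongly -> about 0.79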
1.3.1 Introduction
In previous sections we described the visible portion of the spectrum and the
concept of colours. We see colour because our eyes detect the entire visible range
of wavelengths and our brains process the information into separate colours. Can
you imagine what the world would look like if we could only see very narrow
ranges of wavelengths or colours? That is how many sensors work. The
information from a narrow wavelength range is gathered and stored in a channel, also sometimes referred to as a band.
For any given material, the amount of solar radiation that it reflects, absorbs,
transmits, or emits varies with wavelength.
Fig. 1.22: EMR
When that amount (usually intensity, as a percent of maximum) coming from the
material is plotted over a range of wavelengths, the connected points produce a
curve called the material’s spectral signature (spectral response curve). Here is a
general example of a reflectance plot for some (unspecified) vegetation type (bio-
organic material), with the dominating factor influencing each interval of the
curve so indicated; note the downturns of the curve that result from selective
absorption:
For example, at some wavelengths, sand reflects more energy than green
vegetation but at other wavelengths it absorbs more (reflects less) than does the
vegetation. In principle, we can recognize various kinds of surface materials and
distinguish them from each other by these differences in reflectance. Of course,
there must be some suitable method for measuring these differences as a function
of wavelength and intensity (as a fraction [normally in percent] of the amount of
irradiating radiation). Using reflectance differences, we may be able to
distinguish the four common surface materials in the above signatures (GL =
grasslands; PW = pinewoods; RS = red sand; SW = silty water) simply by
plotting the reflectances of each material at two wavelengths, commonly a few
tens (or more) of micrometers apart.
So far, throughout this chapter, we have made various references to the sun as a
source of
energy or radiation. The sun provides a very convenient source of energy for
remote sensing. The sun's energy is either reflected, as it is for visible
wavelengths, or absorbed and then reemitted, as it is for thermal infrared
wavelengths. Remote sensing systems which measure energy that is naturally
available are called passive sensors. Passive sensors can only be used to detect
energy when the naturally occurring energy is available. For all reflected energy,
this can only take place during the time when the sun is illuminating the Earth.
There is no reflected energy available from the sun at night. Energy that is
naturally emitted (such as thermal infrared) can be detected day or night, as long
as the amount of energy is large enough to be recorded.
These sensors are called radiometers and they can detect EMR within the
ultraviolet to microwave wavelengths. Two important spatial characteristics of
passive sensors are:
Their “instantaneous field of view” (IFOV) - this is the angle over which the
detector is sensitive to radiation. This will control the picture element (pixel) size
which gives the ground (spatial) resolution of the ultimate image, i.e. the spatial resolution is a function of the detector angle and the height of the sensor above the ground. (Spatial, spectral, radiometric and temporal resolutions are discussed in more detail in Section 1.4.)
The Concept of IFOV and AFOV (after Avery and Berlin, 1985)
The “swath width” - this is the linear ground distance over which the scanner is
tracking (at right angles to the line of flight). It is determined by the angular field
of view (AFOV - or scanning angle) of the scanner. The greater the scanning
angle, the greater the swath width.
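Both quantities follow from simple viewing geometry. A minimal sketch, assuming a nadir-looking sensor; the numeric values are illustrative, not specifications from the text:

    import math

    # Ground cell size from the IFOV (small-angle approximation): size = IFOV x height.
    def ground_resolution_m(ifov_rad, height_m):
        return ifov_rad * height_m

    # Swath width on the ground from the angular field of view (AFOV) of the scanner.
    def swath_width_m(afov_deg, height_m):
        return 2 * height_m * math.tan(math.radians(afov_deg) / 2)

    print(ground_resolution_m(0.086e-3, 705_000))  # ~61 m for an assumed 0.086 mrad IFOV
    print(swath_width_m(14.9, 705_000))            # ~184 km for an assumed 14.9 degree AFOV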
The push broom scanner, by contrast, uses a wide-angle optical system in which the whole scene across the AFOV is imaged onto a detector array at one time, i.e. there is no mechanical scanning movement. As the sensor moves along the flight line, successive lines are imaged by the sensor and sampled by a multiplexer for transmission. The push broom system is generally better than the mechanical scanner since there is less noise in the signal, there are no moving parts, and it has high geometric accuracy.
Active sensors, on the other hand, provide their own energy source for
illumination. The sensor emits radiation which is directed toward the target to be
investigated. The radiation reflected from that target is detected and measured by
the sensor. Advantages for active sensors include the ability to obtain
measurements anytime, regardless of the time of day or season. Active sensors
can be used for examining wavelengths that are not sufficiently provided by the
sun, such as microwaves, or to better control the way a target is illuminated.
However, active systems require the generation of a fairly large amount of energy
to adequately illuminate targets. Some examples of active sensors are a laser
fluorosensor and synthetic aperture radar (SAR).
We will briefly review airborne and satellite active systems, which are commonly called radar, and which are generally classified as either imaging or non-imaging:
Imaging Radars. These display the radar backscatter characteristics of the earth's surface in the form of a strip map or a picture of a selected area. A type used in aircraft is the SLAR (side-looking airborne radar), whose sensor scans an area not directly below the aircraft, but at an angle to the vertical, i.e. it looks sideways to record the relative intensity of the reflections so as to produce an image of a narrow strip of terrain.
Sequential strips are recorded as the aircraft moves forward allowing a complete
image to be built up. The SLAR is unsuitable for satellites since, to achieve a
useful spatial resolution, it would require a very large antenna. A variant used in
satellites is the SAR whose short antenna gives the effect of being several hundred
times longer by recording and processing modified data.
Fig. 1.27: The Synthetic Aperture Radar System (after Avery and Berlin, 1985)
1.4 Resolutions
For some remote sensing instruments, the distance between the target being
imaged and the platform, plays a large role in determining the detail of
information obtained and the total area imaged by the sensor. Sensors onboard
platforms far away from their targets, typically view a larger area, but cannot
provide great detail. Compare what an astronaut onboard the space shuttle sees
of the Earth to what you can see from an airplane. The astronaut might see your
whole province or country in one glance, but couldn't distinguish individual
houses. Flying over a city or town, you would be able to see individual buildings
and cars, but you would be viewing a much smaller area than the astronaut.
There is a similar difference between satellite images and airphotos. The detail
discernible in an image is dependent on the spatial resolution of the sensor and
refers to the size of the smallest possible feature that can be detected. Spatial
resolution of passive sensors (we will look at the special case of active microwave
sensors later) depends primarily on their Instantaneous Field of View (IFOV).
The IFOV is the angular cone of visibility of the sensor (A) and determines the
area on the Earth's surface which is "seen" from a given altitude at one particular
moment in time (B). The size of the area viewed is determined by multiplying the
IFOV by the distance from the ground to the sensor (C). This area on the ground
is called the resolution cell and determines a sensor's maximum spatial
resolution. For a homogeneous feature to be detected, its size generally has to be
equal to or larger than the resolution cell. If the feature is smaller than this, it
may not be detectable as the average brightness of all features in that resolution
cell will be recorded. However, smaller features may sometimes be detectable if their reflectance dominates within a particular resolution cell, allowing sub-pixel or resolution cell detection.
Images where only large features are visible are said to have coarse or low
resolution. In fine or high resolution images, small objects can be detected.
Military sensors, for example, are designed to view as much detail as possible,
and therefore have very fine resolution. Commercial satellites provide imagery
with resolutions varying from a few metres to several kilometres. Generally
speaking, the finer the resolution, the less total ground area can be seen. The
ratio of distance on an image or map, to actual ground distance is referred to as
scale. If you had a map with a scale of 1:100,000, an object of 1cm length on the
map would actually be an object 100,000cm (1km) long on the ground. Maps or
images with small "map-to-ground ratios" are referred to as small scale (e.g.
1:100,000), and those with larger ratios (e.g. 1:5,000) are called large scale.
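A minimal sketch of this scale arithmetic (the function is illustrative):

    # Ground distance corresponding to a map measurement at scale 1:scale_denominator.
    def ground_distance_km(map_cm, scale_denominator):
        # 1 cm on the map equals scale_denominator cm on the ground; 100,000 cm = 1 km.
        return map_cm * scale_denominator / 100_000

    print(ground_distance_km(1, 100_000))  # 1 cm at 1:100,000 -> 1.0 km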
Spectral resolution describes the ability of a sensor to define fine wavelength intervals; the finer the spectral resolution, the narrower the wavelength range for a particular channel or band. Many remote sensing systems record energy over several separate wavelength ranges at various spectral resolutions. These are referred to
as multi-spectral sensors and will be described in some detail in following
sections. Advanced multi-spectral sensors called hyperspectral sensors, detect
hundreds of very narrow spectral bands throughout the visible, near-infrared, and
mid-infrared portions of the electromagnetic spectrum. Their very high spectral
resolution facilitates fine discrimination between different targets based on their
spectral response in each of the narrow bands.
While the arrangement of pixels describes the spatial structure of an image, the
radiometric characteristics describe the actual information content in an image.
Every time an image is acquired on film or by a sensor, its sensitivity to the
magnitude of the electromagnetic energy determines the radiometric resolution.
The radiometric resolution of an imaging system describes its ability to discriminate very slight differences in energy. The finer the radiometric resolution of a sensor, the more sensitive it is to detecting small differences in reflected or emitted energy. Imagery data are represented by positive digital numbers which vary from 0 to (one less than) a selected power of 2. This range corresponds to the number of bits used for coding numbers in binary format. Each bit records an exponent of power 2 (e.g. 1 bit = 2^1 = 2). The maximum number of brightness levels available depends on the number of bits used in representing the energy recorded. Thus, if a sensor used 8 bits to record the data, there would be 2^8 = 256 digital values available, ranging from 0 to 255. However, if only 4 bits were used, then only 2^4 = 16 values ranging from 0 to 15 would be available. Thus, the
radiometric resolution would be much less. Image data are generally displayed in
a range of grey tones, with black representing a digital number of 0 and white
representing the maximum value (for example, 255 in 8-bit data). By comparing a
2-bit image with an 8-bit image, we can see that there is a large difference in the
level of detail discernible depending on their radiometric resolutions.
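The level counts quoted above follow directly from the bit depth; a minimal sketch:

    # Number of brightness levels and digital number (DN) range per bit depth.
    for bits in (2, 4, 8):
        levels = 2 ** bits
        print(f"{bits}-bit data: {levels} levels, DN range 0..{levels - 1}")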
Temporal resolution refers to how frequently a sensor can image the same area of the Earth's surface. The time factor in imaging is important when:
• persistent clouds offer limited clear views of the Earth's surface (often in the tropics);
• short-lived phenomena (floods, oil slicks, etc.) need to be imaged;
• multi-temporal comparisons are required (e.g. the spread of a forest disease from one year to the next);
• the changing appearance of a feature over time can be used to distinguish it from near-similar features (wheat / maize).
1.5 Summary
The unit begins with an introduction to remote sensing and its basic concepts. The electromagnetic spectrum, the key component, has been discussed in detail. We also learned about the various techniques of satellite remote sensing, along with the components of satellite remotely sensed data. The resolution of satellite remote sensing data and its various types have also been covered here.
1.6 Glossary
Electromagnetic- Relating to electric and magnetic fields and their interactions with each other and with electric charges and currents.
Radar- Acronym for radio detection and ranging. A device or system that detects surface
features on the earth by bouncing radio waves off them and measuring the energy
reflected back.
Radiation- The emission and propagation of energy through space in the form of waves.
Electromagnetic energy and sound are examples of radiation.
Resolution- The detail with which a map depicts the location and shape of geographic
features. The larger the map scale, the higher the possible resolution. As scale decreases,
resolution diminishes and feature boundaries must be smoothed, simplified, or not shown
at all; for example, small areas may have to be represented as points.
Sensors- An electronic device for detecting energy, whether emitted or radiated, and converting it into a signal that can be recorded and displayed as numbers or as an image.
Spatial- Related to or existing within space.
1.7 References
2. www.ccrs.nrcan.gc.ca/resource/tutor/fundam/pdf/fundamentals_e.pdf
3. http://jersey.uoregon.edu/~imamura/122/lecture-2/lecture-2.html
4. http://outreach.atnf.csiro.au/education/senior/cosmicengine/sun_earth.html
5. http://rst.gsfc.nasa.gov/Intro/Part2_5.html
6. http://xkcd.com/273/
7. http://learn.uci.edu/oo/getOCWPage.php?course=OC0811004&lesson=005&top
ic=002&page=21
8. http://www.fao.org/docrep/003/T0446E/T0446E04.htm
9. http://www.cdioinstitute.org/papers/Day2/basic%20image%20processing.pdf
5. Which are the two objects whose reflectance is shown in the picture? Explain why.
6. What are the three different types of resolutions? What do you understand by high resolution?
***
2.1 Introduction
2.2 Platforms
2.3.1 GOES
2.3.3 SPOT
2.3.4 Radarsat
2.4 Summary
2.5 Glossary
2.6 References
2.1 Introduction
In order for a sensor to collect and record energy reflected or emitted from a target or
surface, it must reside on a stable platform removed from the target or surface being
observed. Platforms for remote sensors may be situated on the ground, on an aircraft or
balloon (or some other platform within the Earth's atmosphere), or on a spacecraft or satellite outside of the Earth's atmosphere.
2.2 Platforms
Ground-based sensors are often used to record detailed information about the surface
which is compared with information collected from aircraft or satellite sensors. In some
cases, this can be used to better characterize the target which is being imaged by these
other sensors, making it possible to better understand the information in the imagery.
Sensors may be placed on a ladder, scaffolding, tall building, cherry-picker, crane, etc.
Aerial platforms are primarily stable wing aircraft, although helicopters are
occasionally used. Aircraft are often used to collect very detailed images and facilitate
the collection of data over virtually any portion of the Earth's surface at any time.
In space, remote sensing is sometimes conducted from the space shuttle or, more
commonly, from satellites. Satellites are objects which revolve around another object - in
this case, the Earth. For example, the moon is a natural satellite, whereas man-made
satellites include those platforms launched for remote sensing, communication, and
telemetry (location and navigation) purposes. Because of their orbits, satellites permit repetitive coverage of the Earth's surface on a continuing basis.
In the 1960s, a revolution in remote sensing technology began with the deployment of
space satellites. From their high vantage-point, satellites have a greatly extended view of
the Earth's surface. The first meteorological satellite, TIROS-1, was launched by the
United States using an Atlas rocket on April 1, 1960. This early weather satellite used
vidicon cameras to scan wide areas of the Earth's surface. Early satellite remote sensors
did not use conventional film to produce their images. Instead, the sensors digitally captured the images using a device similar to a television camera. Once captured, the data were then transmitted electronically to receiving stations on the Earth's surface.
The image below is from TIROS-7 of a mid-latitude cyclone off the coast of New Zealand.
Fig. 1.5: TIROS-7 image of a mid-latitude cyclone off the coast of New Zealand, August
24, 1964
2.3.1 GOES
The GOES (Geostationary Operational Environmental Satellite) series of satellites provides most of the remotely sensed weather information for North America.
To cover the complete continent and adjacent oceans two satellites are employed in
a geostationary orbit. The western half of North America and the eastern Pacific Ocean
is monitored by GOES-10, which is directly above the equator and 135° West longitude.
The eastern half of North America and the western Atlantic are covered by GOES-8. The
GOES-8 satellite is located overhead of the equator and 75° West longitude. Advanced
sensors aboard the GOES satellite produce a continuous data stream so images can be
viewed at any instance. The imaging sensor produces visible and infrared images of the
Earth's terrestrial surface and oceans. Infrared images can depict weather conditions
even during the night. Another sensor aboard the satellite can determine vertical
temperature profiles, vertical moisture profiles, total precipitable water, and atmospheric
stability.
Fig. 1.6: Color image from GOES-8 of hurricanes Madeline and Lester off the coast of Mexico,
In the 1970s, the second revolution in remote sensing technology began with the
deployment of the Landsat satellites. Since 1972, several generations of Landsat satellites with their Multispectral Scanners (MSS) have been providing continuous coverage of the Earth for almost 30 years. Currently, Landsat satellites orbit the Earth at an altitude of approximately 700 kilometers, and the spatial resolution of objects on the ground surface is 79 x 56 meters. Complete coverage of the globe requires 233 orbits
and occurs every 16 days. The Multispectral Scanner records a zone of the Earth's
surface that is 185 kilometers wide in four wavelength bands: band 4 at 0.5 to 0.6
micrometers, band 5 at 0.6 to 0.7 micrometers, band 6 at 0.7 to 0.8 micrometers, and
band 7 at 0.8 to 1.1 micrometers. Bands 4 and 5 receive the green and red wavelengths in
the visible light range of the electromagnetic spectrum. The last two bands image near-
infrared wavelengths. A second sensing system was added to Landsat satellites launched
after 1982. This imaging system, known as the Thematic Mapper, records seven wavelength bands, and the ground resolution was improved to 30 x 30 meters. This modification allows for greatly improved clarity of imaged objects.
2.3.3 SPOT
The usefulness of satellites for remote sensing has resulted in several other organizations
launching their own devices. In France, the SPOT (Satellite Pour l'Observation de la
Terre) satellite program has launched five satellites since 1986. Since 1986, SPOT
satellites have produced more than 10 million images. SPOT satellites use two different
sensing systems to image the planet. One sensing system produces black and white
panchromatic images from the visible band (0.51 to 0.73 micrometers) with a ground resolution of 10 x 10 meters. The other sensing system is multispectral, capturing green, red, and reflected infrared bands at 20 x 20 meters. SPOT-5, which was launched in
2002, is much improved from the first four versions of SPOT satellites. SPOT-5 has a
maximum ground resolution of 2.5 x 2.5 meters in both panchromatic mode and
multispectral operation.
Fig. 1.8: SPOT false-color image of the southern portion of Manhattan Island and part of Long Island, New York. The bridges on the image are (left to right): Brooklyn Bridge, Manhattan Bridge, and Williamsburg Bridge.
2.3.4 Radarsat
As an active remote sensing device, Radarsat is quite different from the Landsat and SPOT satellites. Radarsat is an active remote sensing system that transmits and receives microwave radiation. Its microwave energy penetrates clouds, rain, dust, or haze and produces images regardless of the Sun's illumination, and its ground resolution ranges from 8 to 100 meters. This sensor has found important applications in crop monitoring and geological mapping, among others.
Fig. 1.9: Radarsat image acquired on March 21, 1996, over Bathurst Island in Nunavut,
Canada.
This image shows Radarsat's ability to distinguish different types of bedrock. The light shades
on this image (C) represent areas of limestone, while the darker regions (B) are composed of
sedimentary siltstone. The very dark area marked A is Bracebridge Inlet which joins the Arctic
ocean. (Source: Canadian Centre for Remote Sensing - Geological Mapping, Bathurst Island)
A list of the sensors that have been used in Indian Remote Sensing satellites:
Satellite Microwave Radiometer (SAMIR): SAMIR was the payload for the Bhaskara I and II satellites launched in 1979 and 1981. They successfully provided data on the sea surface temperature, ocean winds, and moisture content over the land and sea. It was a Dicke-type radiometer.
TV Payload: The Bhaskara satellites I and II had a two-band TV payload for land applications. It gave images of the earth from a height of 525 km. The data were used in meteorology.
Smart Sensor: Rohini RS-D2 (the successor to the failed RS-D1) was launched in April 1983. It carried a Smart Sensor, which was a 2-band solid-state device. It had the first
LISS-I (Linear Imaging Self Scanner) was a payload for the IRS-1A satellite. This camera
CCD array. It was again used in IRS-1B. It used 7-bit quantization and had a swath of 148 km. Images of LISS-I were extensively used in forestry, crop acreage and yield estimation.
LISS-II was similar to LISS-I, but with higher spatial resolution and smaller swath. It was
LISS-III is onboard the two satellites IRS-1C and IRS-1D. This is a multi-spectral camera which operates in four bands and provides color images. Its images were used widely in
PAN: This was carried by the IRS-1C and IRS-1D satellites. The PAN camera enables the acquisition of images at a resolution of 5.8 m, which was the highest resolution offered by a civilian satellite until recently, when the American satellite Ikonos, with a resolution of 1 m, surpassed it.
IRS-1C, IRS-1D and IRS-P3, which are all second-generation Indian remote sensing satellites, carried the WIFS sensor. The WIFS camera uses an 8-element refractive optics like in LISS-III. Two such cameras are mounted with overlapping pixels of imaging. WIFS data was used in assessment of rabi cropped area, crop inventory, and observation of
IRS-P4, also called Oceansat, carried the Ocean Colour Monitor (OCM) and was launched on board PSLV-C2. This payload is meant for oceanographic applications. The OCM is a solid-state camera operating in the push-broom scanning mode, using linear array CCDs as detectors.
All the INSAT-1, INSAT-2 and INSAT-3 series communications satellites carry the VHRR (Very High Resolution Radiometer) to provide various remote sensing applications. Since INSAT satellites are geostationary, the VHRR can image the same region continuously.
2.4 Summary
This unit begins with an introduction to the various platforms available for remote sensing. The most widely used satellites and sensors have been covered here. The Indian satellites and their sensor payloads have also been described.
2.5 Glossary
Band- A range of wavelengths in the electromagnetic spectrum. For example, visible light is one band of the electromagnetic spectrum, which also includes radio waves, microwaves, infrared, ultraviolet, x-rays, and gamma rays.
Satellite- a device designed to be launched into orbit around the earth, another planet, the sun, or another celestial body.
Sensor- An electronic device for detecting energy, whether emitted or radiated, and converting it into a signal that can be recorded and displayed as numbers or as an image.
Geostationary- Positioned in an orbit above the earth's equator with an angular velocity the same as that of the earth and an inclination and eccentricity approaching zero. A geostationary satellite will orbit as fast as the earth rotates on its axis, so that it remains stationary relative to a point on the earth's surface.
2.6 References
1. http://www.angelfire.com/co/pallav/sensorindian.html
2. http://www.physicalgeography.net/fundamentals/2e.html
3. http://envisat.esa.int/webcam/earth.html
4. http://picturesofsatellites.com/
5. http://www.asc-csa.gc.ca/eng/programs/esa/canada.asp
6. ISBN-13: 9780131889507
7. Lillesand, T.M., Kiefer, R.W. and Chipman, J.W., 2004. Remote Sensing and Image Interpretation, 5th ed. John Wiley & Sons.
3. Name one active remote sensing satellite. What kind of information can be obtained from it?
4. LISS-III is the sensor of which satellite? How many bands are there in LISS-III?
5. What is the full form of WIFS? Name two uses of WIFS data.
***
3.1 Introduction
3.6 Summary
3.7 Glossary
3.8 References
3.1 Introduction
Global Positioning System (GPS) technology is a great boon to anyone who has the need
to navigate either great or small distances. This wonderful navigation technology was
actually first available for government use back in the late 1970s. In the past ten or so
years, it has been made available to the general public in the form of handheld receivers
that use this satellite technology provided by the U.S. government.
GPS, formally known as the NAVSTAR (Navigation Satellite Timing and Ranging) Global Positioning System, was originally developed for the military. Because of its popular navigation capabilities, and because you can access GPS technology using small, inexpensive equipment, the government made the system available for civilian use. The
USA owns GPS technology and the Department of Defense maintains it. The first
satellite was placed in orbit on 22nd February 1978, and there are currently 28
operational satellites orbiting the Earth at a height of 20,180 km on 6 different
orbital planes. Their orbits are inclined at 55° to the equator, ensuring that at least 4
satellites are in radio communication with any point on the planet. Each satellite orbits
the Earth in approximately 12 hours and has four atomic clocks on board. During the
development of the GPS system, particular emphasis was placed on the following three
aspects:
1. It had to provide users with the capability of determining position, speed and time, whether in motion or at rest.
2. It had to have a continuous, global, three-dimensional positioning capability with a high degree of accuracy, irrespective of the weather.
3. It had to offer potential for civilian use.
GPS has also demonstrated a significant benefit to the civilian community who are
applying GPS to a rapidly expanding number of applications. What attracts us to GPS is:
1. The relatively high positioning accuracies, from tens of meters down to the millimeter level.
2. The capability of determining velocity and time, to an accuracy commensurate with position.
3. The signals are available to users anywhere on the globe: in the air, on the ground, or at sea.
4. It is a positioning system with no user charges, that simply requires the use of relatively low-cost hardware.
Using the Global Positioning System (GPS, a process used to establish a position at any
point on the globe) the following two values can be determined anywhere on Earth:
1. One's exact location (longitude, latitude and height co-ordinates), accurate to within a range of 20 m to approximately 1 mm.
2. The precise time (Universal Time Coordinated, UTC), accurate to within a range of 60 ns to approximately 5 ns.
Speed and direction of travel (course) can be derived from these co-ordinates as
well as the time. The coordinates and time values are determined by 28 satellites
orbiting the Earth.
Generating GPS signal transit time: 28 satellites inclined at 55° to the equator orbit
the Earth every 11 hours and 58 minutes at a height of 20,180 km on 6 different
orbital planes (Figure 3). Each one of these satellites has up to four atomic clocks
on board. Atomic clocks are currently the most precise instruments known, losing a
maximum of one second every 30,000 to 1,000,000 years. In order to make them even
more accurate, they are regularly adjusted or synchronised from various control
points on Earth. Each satellite transmits its exact position and its precise on board clock
time to Earth at a frequency of 1575.42 MHz. These signals are transmitted at the
speed of light (300,000 km/s) and therefore require approx. 67.3 ms to reach a
position on the Earth’s surface located directly below the satellite. The signals
require a further 3.33 µs for each excess kilometer of travel. If you wish to
establish your position on land (or at sea or in the air), all you require is an accurate
clock. By comparing the arrival time of the satellite signal with the on board
clock time the moment the signal was emitted, it is possible to determine the transit
time of that signal (Figure 4).
The distance S to the satellite can be determined by using the known transit time τ:
S = τ × c
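Plugging in the figures quoted above gives a quick consistency check (a minimal sketch, using the rounded speed of light from the text):

    # Distance from measured signal transit time, S = tau x c.
    c = 300_000.0    # speed of light, km/s (the rounded value used in this text)
    tau = 67.3e-3    # transit time in seconds for a satellite directly overhead
    print(tau * c)   # ~20,190 km, consistent with the 20,180 km orbital height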
Measuring signal transit time and knowing the distance to a satellite is still not enough to
calculate one’s own position in 3-D space. To achieve this, four independent transit
time measurements are required. It is for this reason that signal communication with
four different satellites is needed to calculate one’s exact position. Why this should be so,
can best be explained by initially determining one’s position on a plane.
The GPS system consists of three segments:
• The Space Segment: comprising the satellites and the transmitted signals.
• The Control Segment: the ground facilities carrying out the task of satellite tracking, orbit computations, telemetry and supervision necessary for the daily control of the space segment.
• The User Segment: the applications, equipment and computational techniques available to users.
The Space Segment consists of the constellation of spacecraft and the signals
broadcast by them
which allow users to determine position, velocity and time. The basic functions of
the satellites are to:
• receive and store data transmitted by the Control Segment;
• maintain accurate time by means of their onboard atomic clocks; and
• transmit information and signals to users.
Satellite signals can be received anywhere within a satellite’s effective range. The
effective range (shaded area) of a satellite located directly above the equator/zero
meridian intersection. The distribution of the 28 satellites at any given time can
be seen. It is due to this ingenious pattern of distribution and to the great
height at which they orbit that communication with at least 4 satellites is
ensured at all times anywhere in the world.
The control segment also oversees the artificial distortion of signals (SA,
Selective Availability), in order to degrade the system’s positional accuracy for
civil use. System accuracy had been intentionally degraded up until May 2000 for
political and tactical reasons by the U.S. Department of Defense (DoD), the
satellite operators. It was shut down in May 2000, but it can be started up again,
if necessary, either on a global or regional basis.
Four different signals are generated in the receiver having the same
structure as those received from the 4 satellites. By synchronising the signals
generated in the receiver with those from the satellites, the four satellite signal
time shifts ∆t are measured as a timing mark.
Imagine that you are wandering across a vast plateau and would like to know
where you are. Two satellites are orbiting far above you transmitting their own on
board clock times and positions. By using the signal transit time to both satellites
you can draw two circles with the radii S1 and S2 around the satellites. Each
radius corresponds to the distance calculated to the satellite. All possible
distances to the satellite are located on the circumference of the circle. If the
position above the satellites is excluded, the location of the receiver is at
the exact point where the two circles intersect beneath the satellites.
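A minimal 2-D sketch of this plateau example, intersecting two range circles around satellites at known positions (all coordinates and ranges are illustrative assumptions):

    import math

    # Intersection points of two circles (centre p, radius r): the 2-D analogue
    # of finding a receiver position from two measured satellite ranges.
    def circle_intersections(p1, r1, p2, r2):
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        d = math.hypot(dx, dy)
        a = (r1**2 - r2**2 + d**2) / (2 * d)   # distance from p1 to the chord midpoint
        h = math.sqrt(r1**2 - a**2)            # half the chord length
        mx, my = p1[0] + a * dx / d, p1[1] + a * dy / d
        return ((mx + h * dy / d, my - h * dx / d),
                (mx - h * dy / d, my + h * dx / d))

    # One intersection is the receiver; the other, above the satellites, is excluded
    # as described in the text.
    print(circle_intersections((0, 100), 100.0, (60, 100), 80.0))  # (60, 20) and (60, 180)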
In three-dimensional space, the distance to each satellite defines a sphere rather than a circle. The position sought is at the point where all three surfaces of the spheres intersect.
All statements made so far will only be valid, if the terrestrial clock and the
atomic clocks on board the satellites are synchronised, i.e. signal transit time can
be correctly determined.
Google Earth and Google Maps are made to work with GPS data
Many services allow you to upload your GPS tracks and waypoints to Google Earth.
Others also let you upload your photos and even geo-reference them for you, so they are
projected exactly on the spots where they have been taken.
Hybrid GeoTools make custom and standard software to extend the functionality of
popular geographic tools such as Google Earth.
Hybrid GeoTools' Active GPX Route Player for Google Earth. The “Media Player” of
GPS playback. Simple to use yet endlessly customizable, up to 50 routes can be played
back at the same time. Adjust time, speed scale, viewing behavior, track and icon
appearance and watch progress against an altitude profile. Every turn, acceleration and
stop is faithfully recreated.
New in Version 1.1 - Virtual Cyclist - Set the power, weight, aerodynamics and see how
you’d perform on the climbs of the Tour de France.
Hybrid GeoTools' 3D Route Builder is a GPS Editor for Google Earth. It offers fine
grain control of routes directly in Google Earth not only in terms of positioning and
altitude but also in time. Easily shift and scale time, correct barometric drift, synch to
video files and build accurate GPS (GPX), KML/KMZ and Garmin TCX files from
scratch or from existing files. Playback routes in real-time and optionally with absolute
altitude - that means tunnels, bridges, cable car rides and flights take on new levels of
realism.
3dtracking Ltd has just launched a new range of completely free GPS services through
their website http://www.3dtracking.net. Simply put, through using your mobile phone or
PDA, along with your GPS receiver, you can record and view your movements in detail
on Google Earth or Google Maps. You can even use the free service for live tracking
using Google Earth or Google Maps. Download of the required 3dtracking GPS software
application, as well as use of the website, is completely free (and there are no future plans to charge for this, ever). The web server also retains all the data you've ever
recorded and submitted, so you can always go back and view your older recorded data at
any time.
Adam Schneider has added Google Maps as an output format in GPS Visualizer. You
can upload your GPS data file (in a supported format) and instantly view it in Google
Maps. It is also available as an output choice in the other input forms, including the
address form.
Phone2GEarth is an easy Nokia Series 80 GPS software application that allows you to log tracks which are directly saved as Google Earth KML files. New and useful features include: English, Spanish, German and French languages; place marks supported with timestamps in the track; Bluetooth autostart for easy use; complete Series 80 support (9300, 9500); and configurable colour and phone name. It allows different phones, tracks, etc. Requirements: Series 80 (Symbian) smartphone (9500, 9300); Bluetooth GPS (NMEA protocol); Google Earth (Windows).
Earth Bridge is designed to bridge the gap between Google Earth and your GPS
receiver. See your location on Google Earth in real-time and easily control your view.
Record your track as you move. Earth Bridge GPS software requires an NMEA 0183
compatible GPS device connected via a serial interface.
Supported formats: GPX, XML, OziExplorer. This software can:
• Plot GPX or OziExplorer format waypoint files in Google Earth (as placemarks).
• Read live (NMEA) data from any GPS (COM port) and plot location and track in Google Earth.
• Read a track file directly from a Garmin GPS and plot it in Google Earth.
• Put tracks (including live data) on a server and email a recipient for remote viewing.
• Read Ham radio tracking data from Findu.com or from a receiver and plot the location and track in Google Earth.
GPS Radar GPS software from JGUI allows you to use your Windows Mobile device for the following purposes:
• review your movements on dedicated web pages with Google Maps street or satellite images.
• review your movements directly on the Google Earth interface screen with all its features.
This GPS software version works well with any Windows Mobile device with a built-in GSM network connection (so-called Phone Edition devices). A GPS receiver is required.
GPS Track GPS software connects to a GPS and records the path that you travel. Tracks
can be uploaded to a web site, sent by email, transferred via Bluetooth, or written to a
flash memory card. Google Maps and Google Earth are used to view the tracks. File
formats such as GPX and CSV are also supported. Compatibility: This GPS software
requires a cell phone or other mobile device with:
EveryTrail is an online platform that enables you to visualize your travel and outdoor activities and share them with like-minded people from all over the world. With EveryTrail you can easily upload GPS data you recorded while out on the trail and add your photos and notes, to create a visual record of your outdoor activity. EveryTrail was created by a small group of passionate travel and outdoor enthusiasts, out of dissatisfaction with current solutions for sharing trips with friends and like-minded people.
Antenna and Preamplifier: Antennas used for GPS receivers have broadbeam
characteristics, thus they do not have to be pointed to the signal source like
satellite TV receiving dishes. The antennas are compact and a variety of designs
are possible. There is a trend to integrating the antenna assembly with the
receiver electronics.
Radio Frequency Section and Computer Processor: The RF section contains the
signal processing electronics. Different receiver types use somewhat different
techniques to process the signal. There is a powerful processor onboard not only
to carry out computations such as extracting the ephemerides and determining the
elevation/azimuth of the satellites, etc., but also to control the tracking and
measurement function within modern digital circuits, and in some cases to carry
out digital signal processing.
Control Unit Interface: The control unit enables the operator to interact with the
microprocessor. Its size and type varies greatly for different receivers, ranging
from a handheld unit to soft keys surrounding an LCD screen fixed to the receiver
"box".
Recording Device: in the case of GPS receivers intended for specialised uses such as surveying, the measured data must be stored in some way for later data processing. In the case of ITS applications such as the logging of vehicle
movement, only the GPS-derived coordinates and velocity may be recorded. A
variety of storage devices were utilised in the past, including cassette and tape
recorders, floppy disks and computer tapes, etc., but these days almost all
receivers utilise solid state (RAM) memory or removable memory "cards".
Power Supply: Transportable GPS receivers these days need low voltage DC
power. The trend towards more energy efficient instrumentation is a strong one
and most GPS receivers operate from a number of power sources, including
internal NiCad or Lithium batteries, external batteries such as wet cell car
batteries, or from mains power.
TRACK: This indicates the direction in which you move. Sometimes this is called
HEADING. For navigation on land this is OK, but a boat or a plane can travel in a different direction than the one in which it is headed, due to wind or current.
TRACKLOG: This is the electronic equivalent of the famous bread crumb trail. If
you turned (automatic) tracklog on, your receiver will, at fixed intervals or at
special occasions, save the position, together with the time, to its memory. This
can be invaluable if at any moment during your trip you (have to) decide to go
back exactly along the route that brought you to your actual position.
TRACBACK: Among the best known GPS terms, it is the navigation method that
will bring you back to your point of departure along the same trail that you
traveled to your actual position. In order to be able to use this method, you may need to copy the tracklog to one of the free track channels. (This is where you need your manual.) Often a saved track can only contain 250 points, but be assured that your GPS receiver will do a wonderful job of choosing the points which best represent your traveled track.
WAYPOINT: Probably one of the most used general GPS terms. A waypoint is
nothing more or less than a saved set of co-ordinates. It does not have to
represent a physical point on land. Even at sea or in the air, one can mark a
waypoint. Once saved in your GPS receiver, you can turn back to exactly that set
of co-ordinates. You can give waypoints meaningful names. They can be created
‘on the fly’, which means that you can register them at 130 km/h on the road or
even at 800 km/h in a plane. Your GPS will assign it a number, which you can change to any name you want once you have the time. You can also manually enter a set of co-ordinates that you found on a map. This way you can plan ahead
a trip or a walk with as much detail as you like.
Waypoints are very powerful navigation aids, and for really critical operations you should consider storing their co-ordinates not only in your GPS receiver but also in your paper notebook. After all, a highly sophisticated device such as a GPS receiver could stop functioning correctly for a lot of reasons.
ROUTE: This is a series of waypoints, saved in the order in which you want to travel them, but you can easily navigate them in
reverse order. You can add waypoints and delete others, but once saved, the order
in which your GPS will guide you along the waypoints is fixed.
This is a great way to plan ahead a walk. You can even create waypoints and
routes on your desktop PC and transfer them to your GPS receiver. All you need
for this is a cable which links your GPS to an RS232 (COM) port on your computer
and a piece of software, that enables you to mark points on a map at your screen.
We will treat this in more detail elsewhere on the site. You will see that this is
absolutely not rocket-science.
ROUTE LEG is the straight line between two adjacent waypoints in a route.
GOTO is also among the best known GPS terms and probably the most used
navigation method with a GPS receiver, because it is easily understood and
executed. If you tell your companion that you will GOTO waypoint X, it will
calculate the direction and distance from your actual location to the set of co-
ordinates, represented by the indicated waypoint. Your GPS receiver is unable to
know what obstacles, hazards or whatever, if any, there are between you and
waypoint X, so it will guide you in a straight line to the indicated point. This is
great on open water or in the air, but on land it is often not the best method.
BEARING: Once you have told your GPS to which point you want to travel, it will continuously calculate in which direction that point is situated, seen from your actual position. That direction is the bearing. If you navigate along a route, the bearing will be the direction to the NEXT waypoint in the route. If you do not or cannot travel in a straight line to the waypoint, the bearing will fluctuate all the time.
TURN: This GPS term indicates the difference between the direction you should
travel in (BEARING) and the direction in which you are actually traveling
(TRACK). An indication of ‘28L’ means that you should modify your actual
direction of travel with 28° to the Left, if you wish to ever reach your point. In
principle, when you have the reading of TURN on your navigation page, you
don’t need the readings of those other two GPS terms BEARING and TRACK, but
most people prefer reading these two.
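A minimal sketch of the TURN calculation described above (the function and sample values are illustrative):

    # TURN is the signed difference between BEARING and TRACK,
    # normalised to the range -180..+180 degrees.
    def turn(bearing_deg, track_deg):
        d = (bearing_deg - track_deg + 180) % 360 - 180
        side = "L" if d < 0 else "R"
        return f"{abs(d):.0f}{side}"

    print(turn(62, 90))  # -> "28L": steer 28 degrees to the left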
We have been assuming up until now that it has been possible to measure signal
transit time precisely. However, this is not the case. For the receiver to measure
time precisely a highly accurate, synchronised clock is needed. If the transit time
is out by just 1 µs this produces a positional error of 300m. As the clocks on
board all three satellites are synchronised, the transit time in the case of all
three measurements is inaccurate by the same amount. Mathematics is the
only thing that can help us now. We are reminded when producing
calculations that if N variables are unknown, we need N independent equations.
In 3-D space, three variables are unknown:
• longitude (X)
• latitude (Y)
• height (Z)
The receiver clock error (Δt) adds a fourth unknown, which is why, as noted earlier, four independent satellite measurements are needed.
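The 300 m figure quoted above follows directly from the speed of light (a minimal check):

    # Range error caused by a 1 microsecond transit-time error.
    c = 3.0e8        # speed of light, m/s
    print(c * 1e-6)  # 300.0 m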
If a receiver sees four satellites that are all arranged, for example, in the north-west, this leads to a "bad" geometry. In the worst case, no position determination is possible at all, when all distance determinations point in the same direction. Even if a position is determined, its error may be up to 100-150 m. If, on the other hand, the four satellites are well distributed over the whole sky, the determined position will be much more accurate. Let's assume the satellites are positioned in the north, east, south and west in 90° steps; distances can then be measured in four different directions, reflecting a "good" satellite geometry.
If the two satellites are in an advantageous position, from the view of the receiver they can be seen at an angle of approximately 90° to each other. As explained earlier, the signal runtime cannot be determined absolutely precisely; the possible positions are therefore marked by the grey circles. The area of intersection A of the two circles is a rather small, more or less square field (blue), so the determined position will be rather accurate.
If the satellites are positioned more or less in one line from the view of the receiver, the area of intersection of possible positions is considerably larger and elongated, and the determination of the position is less accurate.
The satellite geometry is also relevant when the receiver is used in vehicles or close to high buildings. If some of the signals are blocked, the remaining satellites determine the quality of the position determination, and whether a position fix is possible at all. This can be observed inside buildings close to windows: if a position determination is possible at all, it is mostly not very accurate. The larger the obscured part of the sky, the more difficult the position determination gets. Most GPS receivers indicate not only the number of received satellites, but also their position in the sky. This enables the user to judge whether a relevant satellite is obscured by an obstacle and whether moving a couple of metres might improve the accuracy. Many instruments also provide a statement of the expected accuracy of the determined position.
To indicate the quality of the satellite geometry, the DOP values (dilution of precision) are commonly used. Depending on which factors are used for the calculation, different variants are distinguished:
• GDOP (geometric DOP): overall accuracy (3-D position and time)
• PDOP (positional DOP): accuracy of the 3-D position
• HDOP (horizontal DOP): accuracy of the horizontal position
• VDOP (vertical DOP): accuracy of the height
• TDOP (time DOP): accuracy of the time
A sketch of how these values follow from the geometry is given below.
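The DOP values depend on the satellite geometry alone. Here is a minimal Python sketch, assuming unit line-of-sight vectors in a local east/north/up frame; the satellite directions are made-up examples of a "good" and a "bad" constellation.

```python
import numpy as np

def dop_values(los):
    """DOP values from line-of-sight vectors (receiver -> satellite) in a
    local east/north/up frame; vectors are normalised internally."""
    los = np.asarray(los, dtype=float)
    los = los / np.linalg.norm(los, axis=1, keepdims=True)
    G = np.hstack([los, np.ones((len(los), 1))])   # geometry matrix
    Q = np.linalg.inv(G.T @ G)                     # cofactor matrix
    gdop = np.sqrt(np.trace(Q))
    pdop = np.sqrt(Q[0, 0] + Q[1, 1] + Q[2, 2])
    hdop = np.sqrt(Q[0, 0] + Q[1, 1])
    vdop = np.sqrt(Q[2, 2])
    tdop = np.sqrt(Q[3, 3])
    return gdop, pdop, hdop, vdop, tdop

# "Good" geometry: satellites N, E, S, W near the horizon plus one overhead.
el = np.radians(15.0)
h, u = np.cos(el), np.sin(el)
good = np.array([[0, h, u], [h, 0, u], [0, -h, u], [-h, 0, u], [0, 0, 1]])
print([f"{v:.2f}" for v in dop_values(good)])   # small DOP values

# "Bad" geometry: all satellites bunched together in the north-west.
bad = np.array([[-0.5, 0.5, 0.707], [-0.45, 0.55, 0.707], [-0.55, 0.45, 0.707],
                [-0.5, 0.52, 0.69], [-0.48, 0.5, 0.72]])
print([f"{v:.2f}" for v in dop_values(bad)])    # very large DOP values
```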
Although the satellites are positioned in very precise orbits, slight shifts of the orbits are possible due to gravitational forces; the sun and moon have a weak influence on the orbits. The orbit data are controlled and corrected regularly and are sent to the receivers in the package of ephemeris data. The influence of these orbit errors on the correctness of the position determination is therefore rather low, the resulting error being no more than 2 m.
Another error source is multipath propagation: for GPS signals this effect mainly appears in the neighbourhood of large buildings or other elevations. The reflected signal takes more time to reach the receiver than the direct signal. The resulting error typically lies in the range of a few metres.
Atmospheric (ionospheric) runtime errors are mostly corrected by the receiver by calculation. The typical variations of the signal velocity while passing the ionosphere for low and high frequencies are well known for standard conditions, and these variations are taken into account in all position calculations. However, civil receivers are not capable of correcting unforeseen runtime changes, for example those caused by strong solar winds.
Despite the synchronisation of the receiver clock with the satellite time during the position determination, the remaining inaccuracy of the time still leads to an error of about 2 m in the position determination. Rounding and calculation errors of the receiver add up to approximately 1 m.
The following section is not intended to provide a comprehensive explanation of the theory of relativity. In everyday life we are quite unaware of its omnipresence; however, it has an influence on many processes, among them the proper functioning of the GPS system. This influence is explained briefly in the following.
As we have already learned, time is a relevant factor in GPS navigation and must be accurate to 20-30 nanoseconds to ensure the necessary accuracy. Therefore the fast movement of the satellites themselves (nearly 14,000 km/h) must be considered.
Whoever has already dealt with the theory of relativity knows that time runs more slowly during very fast movement. For satellites moving at a speed of 3,874 m/s, clocks run slower when viewed from earth. This relativistic time dilation leads to a clock inaccuracy of approximately 7.2 microseconds per day (1 microsecond = 10-6 seconds).
The theory of relativity also says that time moves more slowly the stronger the field of gravitation is. For an observer on the earth's surface, the clock on board a satellite runs faster (as the satellite at 20,000 km altitude is exposed to a much weaker gravitational field than the observer). This second effect is about six times stronger than the time dilation explained above, as the short calculation below confirms.
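Both quoted numbers, and the fact that the gravitational effect dominates, can be checked with the standard first-order formulas (velocity effect v^2/2c^2, gravitational effect GM/c^2 * (1/R_earth - 1/R_sat)). A short Python check with textbook constants:

```python
C  = 299_792_458.0        # speed of light, m/s
GM = 3.986004418e14       # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6         # mean Earth radius, m
R_SAT   = 2.66e7          # GPS orbital radius (~20,200 km altitude), m
V_SAT   = 3874.0          # orbital speed, m/s
DAY     = 86_400.0        # seconds per day

# Special relativity: the moving clock runs SLOWER by ~ v^2 / (2 c^2).
sr = -V_SAT**2 / (2 * C**2) * DAY
# General relativity: the clock in weaker gravity runs FASTER by ~ dPhi / c^2.
gr = GM / C**2 * (1 / R_EARTH - 1 / R_SAT) * DAY

print(f"velocity effect:      {sr * 1e6:+.1f} us/day")        # about -7.2
print(f"gravitational effect: {gr * 1e6:+.1f} us/day")        # about +45.7
print(f"net drift:            {(sr + gr) * 1e6:+.1f} us/day") # about +38.5
```

The ratio of the two printed effects is indeed about six, and the satellite clocks are adjusted before launch to compensate for the net drift.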
That's right - we are the 'Users'. All kinds of people use GPS for all kinds of purposes. While GPS was designed for the military, civilian users now greatly outnumber military users. Some of the more common uses of GPS are:
Emergency Services - fire, ambulance and other 911 services use it to locate people in distress.
Ground Transportation - GPS technology helps with automatic vehicle location and in-vehicle navigation systems. Many navigation systems show the vehicle's location on an electronic street map, allowing drivers to keep track of where they are and to look up other destinations. Some systems automatically create a route and give turn-by-turn directions. GPS technology also helps monitor and plan routes for delivery vans and emergency vehicles.
GIS (Geographic Information System) Data Collection - cities use it to locate services such as power lines, water hydrants and even streets.
Marine - fishermen and vessels at sea use it as a guide to steer their boats or to identify a location at sea.
Rail - GPS-based systems help locate trains and support vehicle-tracking and train-control applications.
3.6 Summary
In this unit we have discussed the technology of the global positioning system. The key components essential for a global positioning system to function have also been discussed. We also learned about the errors that can be present in GPS data and, therefore, the most suitable methods of collecting data with a GPS. The types of GPS receivers enable us to understand the various functions of GPS, and their applications have also been covered.
3.7 Glossary
Google Earth/Maps - software or an interface where online maps and satellite data can be viewed, launched by Google.
GPS receiver - a satellite-based device that records x, y, z coordinates and other data. Ground locations are calculated from signals sent by satellites orbiting the Earth.
Orbital Plane - the geometrical plane in which an orbit is embedded. For example, the orbits of the planets, comets and asteroids of the solar system around the Sun line up with each other, forming a semi-flat disk.
3.8 References
1. http://www.gmat.unsw.edu.au/snap/gps/gps_notes1.pdf
2. http://www8.garmin.com/manuals/GPSGuideforBeginners_Manual.pdf
3. http://www.inovatrack.com/gps/GPS_basics_u_blox_en.pdf
4. http://www.gps-practice-and-fun.com/gps-receivers.html
5. http://www.diginetlink.com/Buyer_s_Guide_s/262.htm
6. http://gps.about.com/od/beforeyoubuy/tp/GPS_do_for_you.htm
7. http://www.trimble.com/gps/howgps-error.shtml
8. http://www.kowoma.de/en/gps/errors.htm
9. http://www.geod.nrcan.gc.ca/edu/geod/gps/gps09_e.php
10. http://www.dw-world.de/dw/article/0,,4543691,00.html
11. http://www.gpsworld.com/transportation/rail/news/antenna-based-control-
system-can-stop-a-train-8802
12. http://www.vehicle-tracking-gps.com/buyhere-payhere.htm
13. http://www.vosizneias.com/60544/2010/07/21/goshen-ny-orange-county-
emergency-services-to-receive-automatic-vehicle-locator-dispatching-system
3.9 Model Questions
1. Name any four errors in acquiring GPS data. Explain geometric error and its corrections.
***