Unit II Notes RS & GIS
Introduction:
Remote sensing is an art and science of obtaining information about an object or feature
without physically coming in contact with that object or feature. Humans apply remote sensing
in their day-to-day business, through vision, hearing and sense of smell. The data collected can
be of many forms: variations in acoustic wave distributions (e.g., sonar), variations in force
distributions (e.g., gravity meter), variations in electromagnetic energy distributions (e.g., eye)
etc. These remotely collected data through various sensors may be analyzed to obtain
information about the objects or features under investigation. In this course we will deal with
remote sensing through electromagnetic energy sensors only.
Thus, remote sensing is the process of inferring surface parameters from measurements of the
electromagnetic radiation (EMR) from the Earth’s surface. This EMR can either be reflected or
emitted from the Earth’s surface. In other words, remote sensing is detecting and measuring
electromagnetic (EM) energy emanating or reflected from distant objects made of various
materials, so that we can identify and categorize these objects by class or type, substance and
spatial distribution [American Society of Photogrammetry, 1975].
Remote sensing provides a means of observing large areas at finer spatial and temporal
frequencies. It finds extensive applications in civil engineering including watershed studies,
hydrological states and fluxes simulation, hydrological modeling, disaster management services
such as flood and drought warning and monitoring, damage assessment in case of natural
calamities, environmental monitoring, urban planning etc.
‘Remote’ means far away, and ‘sensing’ means observing or acquiring some information.
Of our five senses, we use three as remote sensors
1. Watch a cricket match from the stadium (sense of sight)
2. Smell freshly cooked curry in the oven (sense of smell)
3. Hear a telephone ring (sense of hearing)
Then what are our other two senses and why are they not used “remotely”?
4. Try to feel smoothness of a desktop (Sense of touch)
5. Eat a mango to check the sweetness (sense of taste)
In the last two cases, we are actually touching the object by our organs to collect the
information about the object.
Distance of Remote sensing:
Remote sensing occurs at a distance from the object or area of interest. Interestingly,
there is no clear definition of this distance. It could be 1 m, 1,000 m, or greater than 1
million meters from the object or area of interest. In fact, virtually all astronomy is based on RS.
Many of the most innovative RS systems, and visual and Digital image processing methods were
originally developed for RS of extraterrestrial landscapes such as moon, Mars, Saturn, Jupiter,
etc.
RS techniques may also be used to analyse inner space. For example, an electron
microscope and its associated hardware may be used to obtain photographs of extremely small
objects on the skin, in the eye, etc. Similarly, an X-ray device is a RS instrument used to examine bones
and organs inside the body. In such cases, the distance is less than 1 m.
1. Remote Sensing Data Collection:
Data collection may take place directly in the field, or at some remote distance
from the object or area of interest. Data that are collected directly in the field (the study
site or the ground for which data are to be collected) are termed in situ data, and data
collected remotely are called remote sensing data.
In Situ Data: In situ data are measurements made at the exact location of the
object or area of interest. For example, when collecting remote sensing data,
in situ measurements are used to verify that the remotely collected data agree
with conditions at the actual location.
Transducers are the devices that convert variations in physical quantities (such as
pressure or brightness) into electrical signals, or vice versa. Many different transducers
are available. A scientist could use a thermometer to measure the temperature of the air,
soil, or water; a spectrometer to measure spectral reflectance; an anemometer to measure
the speed of the wind; or a psychrometer to measure the humidity of the air. The data
recorded by the transducer may be an analog signal with voltage variations related to the
intensity of the property being measured. Often these analog signals are transformed into
digital values using analog to digital conversion procedures.
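To make that last step concrete, here is a minimal sketch of analog-to-digital conversion; the 0-5 V input range and 8-bit depth are illustrative assumptions, not properties of any particular instrument.

```python
# Minimal sketch of analog-to-digital (A/D) conversion.
# The 0-5 V range and 8-bit depth are illustrative assumptions.
def analog_to_digital(voltage, v_max=5.0, bits=8):
    """Quantize an analog voltage into an n-bit digital number."""
    levels = 2 ** bits                      # 8 bits -> 256 levels (0..255)
    v = min(max(voltage, 0.0), v_max)       # clamp to the measurable range
    return round(v / v_max * (levels - 1))  # scale to the digital range

print(analog_to_digital(2.5))               # mid-range voltage -> 128
```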
1-Passive sensors detect natural radiation that is emitted or reflected by the object or
surrounding area being observed. Reflected sunlight is the most common source of
radiation measured by passive sensors. Examples of passive remote sensors include film
photography, infrared sensors, and radiometers.
2-Active remote sensing, on the other hand, emits energy in order to scan objects and
areas whereupon a sensor then detects and measures the radiation that is reflected or
backscattered from the target. RADAR is an example of active remote sensing where the
time delay between emission and return is measured, establishing the location, height,
speed and direction of an object.
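Because an active system like RADAR measures the round-trip time delay of its own pulse, the range to a target follows directly from the speed of light. A minimal sketch of that calculation:

```python
# Converting a radar echo time delay into target range.
# The pulse travels out and back, hence the division by 2.
C = 3e8  # speed of light in m/s (rounded)

def radar_range(delay_s):
    """Range to the target given the round-trip delay in seconds."""
    return C * delay_s / 2

print(radar_range(1e-4))  # a 100-microsecond delay -> 15,000 m
```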
3. Classification Based on Image Media:
Reflected or emitted energy from the terrain may be recorded as either:
Photographic images
Digital images
5. Explain in detail step by step procedure in Remote sensing process with neat
diagram? (Or) Describe briefly the different elements of RS?
The remote sensing process requires a source of energy. For
example, when we view the screen of a computer monitor, we are actively engaged in
RS. A physical quantity (light) emanates from the screen, which is a source of radiation.
The radiated light passes over a distance, and thus is remote to some extent, until it
encounters and is captured by a sensor (eyes). Each eye sends a signal to a processor
(brain) which records the data and interprets this into information.
Now consider, if the energy being remotely sensed comes from the sun, the
energy is radiated by atomic particles at the source (the sun), propagates through the
vacuum of space at the speed of light, interacts with the earth’s atmosphere, interacts with
the earth’s surface, some amount of energy reflects back, interacts with the earth’s
atmosphere once again, and finally reaches the remote sensor, where it interacts with
various optical systems, filters, film emulsions, or detectors.
"Remote sensing is the science (and to some extent, art) of acquiring information about the
Earth's surface without actually being in contact with it. This is done by sensing and recording
reflected or emitted energy and processing, analyzing, and applying that information."
In much of remote sensing, the process involves an interaction between incident
radiation and the targets of interest. This is exemplified by the use of imaging systems where the
following seven elements are involved. Note, however, that remote sensing also involves the
sensing of emitted energy and the use of non-imaging sensors.
1. Energy Source or Illumination (A) - the first requirement for remote sensing is to have an
energy source which illuminates or provides electromagnetic energy to the target of interest.
(Active RS or Passive RS)
2. Radiation and the Atmosphere (B) - as the energy travels from its source to the target, it will
come in contact with and interact with the atmosphere it passes through. This interaction may
take place a second time as the energy travels from the target to the sensor.
3. Interaction with the Target (C) - once the energy makes its way to the target through the
atmosphere, it interacts with the target depending on the properties of both the target and the
radiation.
4. Recording of Energy by the Sensor (D) - after the energy has been scattered by, or emitted
from the target, we require a sensor (remote - not in contact with the target) to collect and record
the electromagnetic radiation.
5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be
transmitted, often in electronic form, to a receiving and processing station where the data are
processed into an image (hardcopy and/or digital).
6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally
or electronically, to extract information about the target which was illuminated.
7. Application (G) - The final element of the remote sensing process is achieved when we apply
the information we have been able to extract from the imagery about the target in order to better
understand it, reveal some new information, or assist in solving a particular problem.
These seven elements comprise the remote sensing process from beginning to end. We will be
covering all of these in sequential order throughout the five chapters of this tutorial, building
upon the information learned as we go.
6. Explain about EMR with neat diagram?
To understand how EMR (Electro Magnetic Radiation) is produced, how it propagates
through space, and how it interacts with other matter, it is useful to describe the
electromagnetic energy using two different models: Wave model and Particle model.
WAVE MODEL:
In the 1860s, James Clerk Maxwell conceptualized EMR as an electromagnetic wave
that travels through space at the speed of light, that is 299,792.458 km/s or about
186,282 miles/s (commonly rounded off to 3 × 10⁸ m/s). The electromagnetic wave consists of
two fluctuating fields – one electrical and the other magnetic. These two fluctuating fields
are at right angles (90°) to one another and both are perpendicular to the direction of
propagation. Both have the same amplitudes (strengths), which reach their maxima and
minima at the same time. Unlike other wave types that require a material medium (e.g. sound
waves), electromagnetic waves can travel through a vacuum (such as in space).
Electromagnetic radiation is generated whenever an electrical charge is accelerated.
Wavelength and frequency are the two important characteristics of EMR which are
particularly important for understanding remote sensing. The wavelength is the length of
one complete wave cycle, which can be measured as the distance between two successive
crests or troughs. A crest is the point on a wave with the greatest positive value or upward
displacement in a cycle; a trough is its opposite. The wavelength of the EMR
depends upon the length of time that the charged particle is accelerated. It is usually
represented by the Greek letter lambda (λ) and is measured in meters (m) or some factor
of a meter, such as nanometers (nm, 10⁻⁹ m), micrometers (μm, 10⁻⁶ m), or centimeters
(cm, 10⁻² m).
Frequency refers to the number of cycles of a wave passing a fixed point per unit of time.
It is represented by the Greek letter nu (ν) and is normally measured in hertz (Hz),
equivalent to one cycle per second. A wave that completes one cycle every second has
a frequency of one cycle per second, or one hertz (1 Hz).
The relationship between the wavelength (λ) and frequency (ν) of EMR is
based on the following formula:
c = λν, so ν = c/λ and λ = c/ν
where c is the velocity of light.
Note that frequency is inversely proportional to wavelength. The relationship is
shown diagrammatically in the figure: the longer the wavelength, the lower the frequency; the
shorter the wavelength, the higher the frequency. When EMR passes from one
medium to another, the speed of light and the wavelength change while the
frequency remains constant.
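A quick worked example of the formula, using a wavelength of 0.5 μm (green light; the value is chosen only for illustration):

```python
# Worked example of c = lambda * nu.
c = 3e8              # speed of light, m/s (rounded)
wavelength = 0.5e-6  # 0.5 micrometers, expressed in meters

frequency = c / wavelength  # nu = c / lambda
print(frequency)            # 6e14 Hz, i.e. 600 THz
```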
These differences make it possible to identify different earth surface features or materials by
analysing their spectral reflectance patterns or spectral signatures. These signatures can be
visualised in so-called spectral reflectance curves as a function of wavelength. The figure
shows typical spectral reflectance curves of three basic types of Earth
features: green vegetation, dry bare soil and clear water.
The spectral reflectance curve of healthy green vegetation has a significant minimum of
reflectance in the visible portion of the electromagnetic spectrum resulting from the pigments in
plant leaves. Reflectance increases dramatically in the near infrared. Stressed vegetation can also
be detected because stressed vegetation has a significantly lower reflectance in the infrared.
Vegetation covers a large portion of the Earth's land surface. Its role in the regulation of the
global temperature, absorption of CO2, and other important functions make it a land cover type
of great significance and interest. Remote sensing can take advantage of the particular manner
that vegetation reflects the incident electromagnetic energy and obtain information about the
vegetation.
Cellular leaf structure and its interaction with electromagnetic energy. Most visible light is
absorbed, while almost half of the near infrared energy is reflected.
Under the upper epidermis (the thin layer of cells that forms the top surface of the leaf) there are
primarily two layers of cells. The top one is the palisade parenchyma and consists of elongated
cells, tightly arranged in a vertical manner. In this layer resides most of the chlorophyll, the pigment
that is responsible for capturing solar energy and powering the process of photosynthesis. The
lower level is the spongy parenchyma, consisting of irregularly shaped cells, with a lot of air
spaces between them, in order to allow the circulation of gases.
The spectral reflectance curve of bare soil is considerably less variable. The reflectance curve is
affected by moisture content, soil texture, surface roughness, presence of iron oxide and organic
matter. These factors are less dominant than the absorbance features observed in vegetation
reflectance spectra.
The water curve is characterized by high absorption in the near infrared wavelengths and
beyond. Because of this absorption property, water bodies as well as features containing water
can easily be detected, located and delineated with remote sensing data. Turbid water has a
higher reflectance in the visible region than clear water. This is also true for waters containing
high chlorophyll concentrations. These reflectance patterns are used to detect algae colonies as
well as contaminations such as oil spills or industrial waste water (more about different
reflections in water can be found in the tutorial Ocean Colour in the Coastal Zone).
Features on the Earth reflect, absorb, transmit, and emit electromagnetic energy
from the sun. Special digital sensors have been developed to measure all types of
electromagnetic energy as it interacts with objects in all of the ways listed above. The
ability of sensors to measure these interactions allows us to use remote sensing to
measure features and changes on the Earth and in our atmosphere. A measurement of
energy commonly used in remote sensing of the Earth is reflected energy (e.g., visible
light, near-infrared, etc.) coming from land and water surfaces. The amount of energy
reflected from these surfaces is usually expressed as a percentage of the amount of energy
striking the objects. Reflectance is 100% if all of the light striking an object bounces off
and is detected by the sensor. If none of the light returns from the surface, reflectance is
said to be 0%. In most cases, the reflectance value of each object for each area of the
electromagnetic spectrum is somewhere between these two extremes. Across any range
of wavelengths, the percent reflectance values for landscape features such as water, sand,
roads, forests, etc. can be plotted and compared
Most remote sensing applications process digital images to extract spectral
signatures at each pixel and use them to divide the image into groups of similar pixels
(segmentation) using different approaches. As a last step, they assign a class to each
group (classification) by comparing with known spectral signatures. Depending on pixel
resolution, a pixel can represent many spectral signatures "mixed" together - that is why
much remote sensing analysis is done to "unmix mixtures". Ultimately, correct matching
of the spectral signature recorded by an image pixel with the spectral signature of existing
elements leads to accurate classification in remote sensing.
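As an illustration of that matching step, the sketch below assigns a pixel to the class whose reference signature is nearest in Euclidean distance (a minimum-distance classifier). The band values and signatures are made-up numbers; real workflows use calibrated spectral libraries and richer classifiers.

```python
import numpy as np

# Hypothetical reference signatures: reflectance in green, red, NIR bands.
signatures = {
    "water":      np.array([0.05, 0.03, 0.02]),
    "vegetation": np.array([0.10, 0.06, 0.50]),
    "bare_soil":  np.array([0.20, 0.25, 0.30]),
}

def classify_pixel(pixel):
    """Assign the class whose signature is nearest (minimum distance)."""
    return min(signatures, key=lambda c: np.linalg.norm(pixel - signatures[c]))

print(classify_pixel(np.array([0.09, 0.07, 0.45])))  # -> vegetation
```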
9. Energy interaction with the atmosphere in remote sensing? (or) Atmospheric effects
in RS?
Before radiation used for remote sensing reaches the Earth's surface it has to travel
through some distance of the Earth's atmosphere. Particles and gases in the atmosphere
can affect the incoming light and radiation. These effects are caused by the mechanisms
of Scattering and Absorption.
Mie scattering: occurs when the particles are roughly the same size as the
wavelength of the radiation.
11. Interaction of EMR with Earth surface features (Soil, Vegetation and water)?
The interaction of electro-magnetic radiation with the Earth's surface is driven by three physical
processes: reflection, absorption, and transmission of radiation. Absorption involves a reduction
in radiation intensity as its energy is converted on reaching an object on the Earth's surface.
Reflection involves the returning or throwback of the radiation incident on an object on the
Earth's surface, whilst transmission entails the passage of radiant energy through the object to
surrounding bodies. Together, these three processes make up an object's
radiative flux.
12. What is atmospheric window?
Atmospheric windows are those portions of the EM radiation spectrum with low
absorption/high transmission.
Following are some examples of the atmospheric windows:
(0.3 – 1.3 μm): Visible/near infrared window.
(1.5 – 1.8, 2.0 – 2.5, and 3.5 – 4.1 μm): Mid infrared window.
(7.0 – 15.0 μm): Thermal/far infrared window.
Some wavelengths cannot be used in remote sensing because our atmosphere absorbs
essentially all the photons at these wavelengths that are produced by the sun. In particular, the
molecules of water, carbon dioxide, oxygen, and ozone in our atmosphere block solar radiation.
The wavelength ranges in which the atmosphere is transparent are called atmospheric windows.
Remote sensing projects must be conducted in wavelengths that occur within atmospheric
windows. Outside of these windows, there is simply no radiation from the sun to detect--the
atmosphere has blocked it.
The figure above shows the percentage of light transmitted at various wavelengths from
the near ultraviolet to the far infrared, and the sources of atmospheric opacity are also given. You
can see that there is plenty of atmospheric transmission of radiation at 0.5 microns, 2.5 microns,
and 3.5 microns, but in contrast there is a great deal of atmospheric absorption at 2.0, 3.0, and
about 7.0 microns. Both passive and active remote sensing technologies do best if they operate
within the atmospheric windows.
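A small sketch that encodes the windows listed above and tests whether a given sensor wavelength falls inside one of them (band edges are taken from the list; the test wavelengths are arbitrary):

```python
# Atmospheric windows (micrometers) from the list above.
WINDOWS = [(0.3, 1.3), (1.5, 1.8), (2.0, 2.5), (3.5, 4.1), (7.0, 15.0)]

def in_atmospheric_window(wavelength_um):
    """True if the wavelength lies inside one of the listed windows."""
    return any(lo <= wavelength_um <= hi for lo, hi in WINDOWS)

print(in_atmospheric_window(0.5))  # True  - visible light is transmitted
print(in_atmospheric_window(3.0))  # False - strongly absorbed
```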
13. What are Black Body and its laws?
A black body or blackbody is an idealized physical body that absorbs all
incident electromagnetic radiation, regardless of frequency or angle of incidence. The name
"black body" is given because it absorbs all colors of light. A black body also emits black-body
radiation. In contrast, a white body is one with a "rough surface that reflects all incident rays
completely and uniformly in all directions."
A black body is also a diffuse emitter: measured per unit area perpendicular to the direction of
observation, the energy is radiated isotropically, independent of direction.
Planck's Law:
Planck's Law can be generalized as such: Every object emits radiation at all times and
at all wavelengths. If you think about it, this law is pretty hard to wrap your brain around. We
know that the sun emits visible light (below left), infrared waves, and ultraviolet waves (below
right), but did you know that the sun also emits microwaves, radio waves, and X-rays? OK...
you are probably saying, the sun is a big nuclear furnace, so it makes sense that it emits all sorts
of electromagnetic radiation. However, Planck's Law states that every object emits over the entire
electromagnetic spectrum. That means that you emit radiation at all wavelengths -- so does
everything around you!
Two images of the sun taken at different wavelengths of the electromagnetic spectrum. The left
image shows the sun's emission at a wavelength in the visible range. The right image is the
ultraviolet emission of the sun. Note: colors in these images and the ones above are deceptive.
There is no sense of "color" in spectral regions other than visible light. The use of color in these
"false-color" images is only used as an aid to show radiation intensity at one particular
wavelength. Credit: NASA/JPL
Now before you dismiss this statement out-of-hand, let me say that you are not emitting X-rays
in any measurable amount (thank goodness!). The mathematics behind Planck's Law hinge on the
fact that there is a wide distribution of vibration speeds for the molecules in a substance. This
means that it is possible for matter to emit radiation at any wavelength, and in fact it does.
Another common misconception that Planck's Law dispels is that matter selectively emits
radiation. Consider what happens when you turn off a light bulb. Is it still emitting radiation?
You might be tempted to say "No" because the light is off. However, Planck's Law tells us that
while the light bulb may no longer be emitting radiation that we can see, it is still emitting at all
wavelengths (most likely, it is emitting copious amounts of infrared radiation). Another example
that you hear occasionally on TV weathercasts goes something like this. "When the sun sets, the
ground begins to emit infrared radiation..." This is certainly not true by nature of Planck's Law
(and besides, how does the ground know when the sun sets anyway). We'll talk more about
radiation emission from the ground in a future lesson. For now, please dismiss such statements
as hogwash. The surface of the earth emits radiation all the time and at all wavelengths.
Wien's Law:
At this point I know what you are thinking... there must be a "catch". In fact there is. While all
matter emits radiation at all wavelengths, it does not do so equally. This is where the next
radiation law comes in. Wien's Law states that the wavelength of peak emission is inversely
proportional to the temperature of the emitting object. Put another way, the hotter the object,
the shorter the wavelength of max emission. You have probably observed this law in action
all the time without even realizing it. Want to know what I mean? Check out this steel bar.
Which end might you pick up? Certainly not the right end... it looks hot. Why does it "look
hot"? Well, the wavelength of peak emission for the right side of the bar is obviously shorter
than the left side's peak emission wavelength. You see this shift in the peak emission
wavelength as a color changes from red to orange to yellow as the metal's temperature increases.
Note: I should point out that even though the steel bar is a yellow-white color at the end, the peak
emission is still in the infrared part of the electromagnetic spectrum. However, the peak is so
close to the visible part of the spectrum, that there is a significant amount of visible light also
being emitted from the steel. Judging by the look of this photograph, the steel has a temperature
of roughly 1500 kelvins, resulting in a max emission wavelength of 2 microns (remember visible
light is 0.4-0.7 microns). Here is a chart showing how I estimated the steel temperature. To the
left of the visibly red metal, the bar is still likely several hundred degrees Celsius. However, in
this section of the bar, the peak emission wavelength is far into the IR portion of the spectrum --
so much so that no visible light emission is discernible with the human eye.
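Wien's Law is commonly written λmax = b / T with b ≈ 2898 μm·K. The sketch below reproduces the steel-bar estimate quoted above and the sun's peak discussed next:

```python
# Wien's displacement law: wavelength of peak emission vs. temperature.
WIEN_B = 2898.0  # Wien's constant, micrometer-kelvins

def peak_wavelength_um(temp_k):
    """Wavelength of peak emission (micrometers) for a temperature in K."""
    return WIEN_B / temp_k

print(peak_wavelength_um(1500))  # steel bar at ~1500 K -> ~1.9 microns
print(peak_wavelength_um(6000))  # sun-like temperature -> ~0.48 microns
```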
So, now that we've established Wien's Law, how do we apply it to the emission sources that
affect the atmosphere? Consider the chart below showing the emission curves (called Planck
functions) for both the sun and the earth.
The emission spectrum of the sun (orange curve) compared to the earth's emission (dark red
curve). The x-axis shows wavelength in factors of 10 (called a "log scale"). The y-axis is the
amount of energy per unit area per unit time per unit wavelength. I have kept the units arbitrary
because as you can see, they are messy. Credit: David Babb
Note the idealized spectrum for the earth's emission (dark red line) of electromagnetic radiation
compared to the sun's electromagnetic spectrum (orange line). The radiating temperature of the
sun is 6000 degrees Celsius compared to the earth's measly 15 degrees Celsius. This means that
given its high radiating temperature, the sun's peak emission occurs near 0.5 microns, on the
short-wave end of the visible spectrum. Meanwhile the Earth's peak emission is located in the
infrared portion of the electromagnetic spectrum.
By the way, because the sun's peak emission is located around 0.5 microns, we see it as having a
yellow quality. But this is not the case for all stars. Some stars in our galaxy are somewhat
cooler and exhibit a reddish hue, while others are much hotter and appear blue. The constellation
Orion(link is external) contains the red supergiant Betelgeuse and several blue supergiants, the
largest being Rigel and Bellatrix. Can you spot them in this photograph of Orion?
Stefan–Boltzmann Law:
Examine once again the graph of the sun's emission curve versus the Earth's emission curve. Pay
particular attention to the energy values on the left axis (for the sun) and right axis (for the
earth). The first thing to notice is that the energy values are given in powers of 10 (that is, 10 6 is
equal to 1,000,000). This means that if we compare the peak emissions from the earth and sun
we see that the sun at its peak wavelength emits 30,000 times more energy than the earth at its
peak. In fact, if we add up the total energy emitted by each body (by adding the energy
contribution at each wavelength), we see that the sun emits over 150,000 times more energy per
unit area than the earth!
I calculated the numbers above using the third radiation law that you need to know, the Stefan-
Boltzmann Law. The Stefan-Boltzmann Law states that the total amount of energy per unit area
emitted by an object is proportional to the 4th power of the temperature. We'll talk more
about this relationship when we discuss satellite remote sensing. It is also particularly useful when
we want to understand how much energy the earth's surface emits in the form of infrared
radiation.
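As a rough check of the figures above, the Stefan-Boltzmann law is E = σT⁴. Using commonly quoted approximate radiating temperatures (about 5800 K for the sun and 288 K for the earth, assumed round values for this sketch):

```python
# Stefan-Boltzmann law: total energy emitted per unit area, E = sigma * T^4.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def total_emission(temp_k):
    return SIGMA * temp_k ** 4

# Ratio of sun to earth emission per unit area (approximate temperatures).
ratio = total_emission(5800) / total_emission(288)
print(round(ratio))  # on the order of 160,000, close to the figure quoted above
```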
Kirchhoff's Law:
In the preceding radiation laws, we have been talking about the ideal amount of radiation
that can be emitted by an object. This theoretical limit is called "black body radiation".
However, the actual radiation emitted by an object can be much less than the ideal, especially at
certain wavelengths. Kirchhoff's Law describes the linkage between an object's ability to emit at
a particular wavelength with its ability to absorb radiation at that same wavelength. In plain
language, Kirchhoff's Law states that for an object whose temperature is not changing, an object
that absorbs radiation well at a particular wavelength will also emit radiation well at that
wavelength. One implication of Kirchhoff's law is as follows: If we want to measure a
particular constituent in the atmosphere (water vapor for example), we need to choose a
wavelength that is emitted well by water vapor (otherwise we wouldn't detect it). However,
since water vapor readily emits at our chosen wavelength, it also readily absorbs radiation at this
wavelength -- which is going to cause some problems measurement-wise.
We'll look at the implications of Kirchhoff's Law in a later section. For now, we need to
complete our discussion of radiation by looking at the possible things that can happen to a beam of
radiation as it passes through a medium.
14. What are the different platforms that are used in RS?
For remote sensing applications, sensors should be mounted on suitable stable
platforms. These platforms can be ground-based, airborne, or spaceborne. As the
platform height increases, the area observed increases while the level of ground detail
decreases. Thus, the higher the sensor is mounted, the larger the synoptic view obtained.
The types or characteristics of platform depend on the type of sensor to be attached and
its application. Depending on task, platform can vary from ladder to satellite. For some
task sensors are also placed on ground platforms. Though aircrafts and satellites are
commonly used platforms, balloons and rockets are also used.
Three types of platforms are used to mount the remote sensors –
1. Ground Observation Platform
Some operate at ranges of 50-100 m with panoramic scanning and are often used to map
building interiors or small objects.
Others can measure at distances of up to 1 km and are frequently used in open-pit mining and
topographic survey applications.
2. Airborne Observation Platform
Airborne platforms were the sole non-ground-based platforms for early remote sensing
work. Aircraft remote sensing system may also be referred to as sub-orbital or airborne, or aerial
remote sensing system. At present, airplanes are the most common airborne platform. Other
airborne observation platforms include balloons, drones (short sky spy) and high altitude
sounding rockets. Helicopters are occasionally used.
2.1 Balloon
Balloons are used for remote sensing observation (aerial photography) and nature
conservation studies. The first aerial images were acquired with a camera carried aloft by a
balloon in 1859. Balloon floats at a constant height of about 30 km. It consists of a rigid circular
base plate for supporting the entire sensor system which is protected by an insulating and shock
proof light casing. The payload used for the Indian balloon experiment consisted of three
Hasselblad cameras with different film-filter combinations, providing panchromatic, infrared
black-and-white, and infrared false color images. Because the flight altitude is high compared to
the normal aircraft heights used for aerial survey, balloon imagery gives larger synoptic views.
The balloon is governed by the wind at the
floating altitude. Balloons are rarely used today because they are not very stable and the course
of flight is not always predictable, although small balloons carrying expendable probes are still
used for some meteorological research.
2.2 Drone
2.3 Aircraft
Special aircraft with cameras and sensors on vibration-free platforms are traditionally
used to acquire aerial photographs and images of land surface features. While low altitude aerial
photography results in large scale images providing detailed information on the terrain, high
altitude smaller scale images offer the advantage of covering a larger study area at lower spatial
resolution.
The National High Altitude Photography (NHAP) program (1978), coordinated by the US
Geological Survey, began acquiring coverage of the United States at a uniform scale and
format. Besides aerial photography, multispectral, hyperspectral and microwave imaging are
also carried out from aircraft.
Aircraft platforms offer an economical method of remote sensing data collection for small to
large study areas with cameras, electronic imagers, across- track and along-track scanners, and
radar and microwave scanners. AVIRIS is a well-known airborne hyperspectral imaging
program (operated by NASA/JPL).
2.4 High Altitude Sounding Rockets
High altitude sounding rocket platforms are useful in assessing the reliability of the
remote sensing techniques as regards their dependence on the distance from the target.
Balloons have a maximum altitude of approximately 37 km, while satellites cannot
orbit below 120 km. High altitude sounding rockets can be flown to moderate altitudes above the
terrain. Imagery with a moderate synoptic view can be obtained from such rockets for areas of
some 500,000 square kilometers per frame. The high altitude sounding rocket is fired from a
mobile launcher. During the flight its scanning work is done from a stable altitude; the payload
and the spent motor are returned to the ground gently by parachute, enabling the recovery of the
data. One important limitation of this system is the need to ensure that the descending rocket
does not cause damage.
3. Spaceborne Observation Platforms
In spaceborne remote sensing, sensors are mounted on-board a spacecraft (space shuttle or
satellite) orbiting the earth. Spaceborne or satellite platforms involve a high one-time cost but a
relatively low cost per unit area of coverage, and can acquire imagery of the entire earth without
requiring permission. Spaceborne imaging ranges in altitude from about 250 km to 36,000 km.
There are two types of well recognized satellite platforms- manned satellite platform and
unmanned satellite platform.
Manned Satellite Platforms:
Manned satellite platforms are used as the last step, for rigorous testing of the remote
sensors on board, so that they can finally be incorporated in unmanned satellites. This multi-
level remote sensing concept was presented earlier. The crew in a manned satellite operates the
sensors as per the program schedule.
Unmanned Satellite Platforms:
The Landsat series, SPOT series and IRS series of remote sensing satellites, the NOAA series of
meteorological satellites, the entire constellation of GPS satellites, and the GOES and INSAT
series of geostationary environmental, communication, television broadcast, weather and earth
observation satellites are examples of the unmanned satellite category.
15. Characteristics of satellite orbits?
The path followed by a satellite in the space is called the orbit of the satellite. Orbits may
be circular (or near circular) or elliptical in shape.
Orbital period: The time taken by a satellite to complete one revolution in its orbit around
the earth is called the orbital period.
It varies from around 100 minutes for a near-polar earth observing satellite to 24 hours
for a geo-stationary satellite.
Altitude: The altitude of a satellite is its height with respect to the surface immediately
below it. Depending on the designed purpose of the satellite, the orbit may be located at
low (160-2000 km), moderate, or high (~36,000 km) altitude.
Apogee and perigee: Apogee is the point in the orbit where the satellite is at maximum
distance from the Earth. Perigee is the point in the orbit where the satellite is nearest to
the Earth as shown in Fig.
Schematic representation of the satellite orbit showing the Apogee and Perigee
Inclination: The inclination is the angle between the orbital plane and the equatorial plane.
Orbital inclination for a sun-synchronous remote sensing satellite is typically about 99 degrees.
A satellite orbiting in the equatorial plane has an inclination of 0 degrees (or 180 degrees if
moving retrograde).
Nadir, ground track and zenith: Nadir is the point of interception on the surface of the
Earth of the radial line between the center of the Earth and the satellite. This is the point
of shortest distance from the satellite to the earth’s surface. Any point just opposite to the
nadir, above the satellite is called zenith. The circle on the earth’s surface described by
the nadir point as the satellite revolves is called the ground track. In other words, it is the
projection of the satellite's orbit on the ground surface.
Swath: The swath of a satellite is the width of the area on the surface of the Earth that is imaged
by the sensor during a single pass. For example, the swath width of the IRS-1C LISS-3 sensor
is 141 km in the visible bands and 148 km in the shortwave infrared band.
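The orbital period quoted earlier (about 100 minutes for a near-polar earth observation satellite) follows from Kepler's third law for a circular orbit, T = 2π·sqrt(a³/μ), where a is the orbital radius and μ is the Earth's gravitational parameter. A minimal sketch, with the 700 km altitude chosen only as a typical example:

```python
import math

# Kepler's third law for a circular orbit: T = 2*pi*sqrt(a^3 / mu).
MU_EARTH = 398600.0  # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6371.0     # mean Earth radius, km

def orbital_period_minutes(altitude_km):
    a = R_EARTH + altitude_km  # orbital radius, km
    return 2 * math.pi * math.sqrt(a ** 3 / MU_EARTH) / 60

print(orbital_period_minutes(700))    # ~99 min: near-polar EO satellite
print(orbital_period_minutes(35786))  # ~1436 min (~24 h): geostationary
```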
16. What are the different satellite orbits? Explain with diagram.
When a satellite is launched into the space, it moves in a well-defined path around
the Earth, which is called the orbit of the satellite. Gravitational pull of the Earth and the
velocity of the satellite are the two basic factors that keep the satellites in any particular
orbit. Spatial and temporal coverage of the satellite depends on the orbit. There are three
basic types of orbits in use.
1. By Inclination:
Equatorial Orbit
Inclined Orbit
Polar Orbit
2. BY ALTITUDE:
Low Earth Orbit (LEO)
Medium Earth Orbit (MEO)
Geostationary Earth Orbit (GEO)
LEO: LOW EARTH ORBIT (160 – 2,000 km); takes roughly 90 – 120 minutes to circle the Earth.
MEO: MEDIUM EARTH ORBIT (2,000 – 35,786 km); orbital periods of about 2 to 24 hours.
GEO: GEOSTATIONARY EARTH ORBIT (~36,000 km); 24 hours, remaining over the same point on the Earth.
3. By Shape:
Circular Orbit:- The satellite remains at a fixed distance from the barycenter, tracing the shape of a circle.
o Geostationary orbit, Polar Orbit and Equatorial Orbit.
Elliptical Orbit:- One object revolves around another in an oval-shaped path
called an ellipse.
The closest point is the perigee and the farthest point is the apogee.
17. Explain about any five IRS satellite characteristics?
18. Define satellite and its types explain in detail?
A satellite is a body that orbits around another body in space. There are two
different types of satellites – natural and man-made. Examples of natural satellites are the
Earth and Moon. Earth is a satellite because it orbits the sun. Likewise, the moon is a
satellite because it orbits Earth. A man-made satellite is a machine that is launched into
space and orbits around a body in space.
Types Of Satellites:
Navigation satellites. The GPS (global positioning system) is made up of 24 satellites
that orbit at an altitude of 20,200 km above the surface of the Earth.
Communication satellites.
Weather satellites.
Earth observation satellites.
Astronomical satellites.
International Space Station (ISS).
Navigation Satellites:-
A satellite navigation or satnav system is a system that uses satellites to provide
autonomous geo-spatial positioning. It allows small electronic receivers to determine
their location (longitude, latitude, and altitude/elevation) to high precision (within a
few centimeters to metres) using time signals transmitted along a line of
sight by radio from satellites. The system can be used for providing position,
navigation or for tracking the position of something fitted with a receiver (satellite
tracking). The signals also allow the electronic receiver to calculate the current local
time to high precision, which allows time synchronization.
A satellite navigation system with global coverage may be termed a global
navigation satellite system (GNSS). As of September 2020, the United
States' Global Positioning System (GPS), Russia's Global Navigation Satellite
System (GLONASS), China's BeiDou Navigation Satellite System (BDS) and
the European Union's Galileo are fully operational GNSSs. Japan's Quasi-Zenith
Satellite System (QZSS) is a (US) GPS satellite-based augmentation system to
enhance the accuracy of GPS, with satellite navigation independent of GPS scheduled
for 2023.
The Indian Regional Navigation Satellite System (IRNSS) plans to expand to a global
version in the long term
Communication Satellites:-
A communications satellite is an artificial satellite that relays and
amplifies radio telecommunication signals via a transponder; it creates
a communication channel between a source transmitter and a receiver at different
locations on Earth. Communications satellites are used
for television, telephone, radio, internet, and military applications. As of 1 January
2021, there are 2,224 communications satellites in Earth orbit. Most communications
satellites are in geostationary orbit 22,236 miles (35,785 km) above the equator, so
that the satellite appears stationary at the same point in the sky; therefore the satellite
dish antennas of ground stations can be aimed permanently at that spot and do not
have to move to track the satellite.
Weather Satellites:-
A weather satellite is a type of satellite that is primarily used to monitor
the weather and climate of the Earth. Satellites can be polar orbiting (covering the
entire Earth asynchronously), or geostationary (hovering over the same spot on
the equator).
While primarily used to detect the development and movement of storm systems
and other cloud patterns, meteorological satellites can also detect other phenomena
such as city lights, fires, effects of pollution, auroras, sand and dust storms, snow
cover, ice mapping, boundaries of ocean currents, and energy flows. Other types of
environmental information are collected using weather satellites. Weather satellite
images have also helped in monitoring volcanic ash.
Earth Observation Satellites:-
An Earth observation satellite or Earth remote sensing satellite is a satellite used
or designed for Earth observation (EO) from orbit, including spy satellites and similar
ones intended for non-military uses such
as environmental monitoring, meteorology, cartography and others. The most
common type are Earth imaging satellites, that take satellite images, analogous
to aerial photographs.
Astronomical Satellites:-
An astronomy satellite is basically a very big telescope floating in space.
Astronomy satellites have many different applications: they can be used to make star
maps, to study mysterious phenomena such as black holes and quasars, and to take
pictures of the planets in the solar system.
The International Space Station (ISS) is a modular space station (habitable artificial
satellite) in low Earth orbit. It is a multinational collaborative project involving five
participating space agencies: NASA (United
States), Roscosmos (Russia), JAXA (Japan), ESA (Europe), and CSA (Canada). The
ownership and use of the space station is established by intergovernmental treaties
and agreements. The station serves as a microgravity and space environment research
laboratory in which scientific research is conducted
in astrobiology, astronomy, meteorology, physics, and other fields. The ISS is suited
for testing the spacecraft systems and equipment required for possible future long-
duration missions to the Moon and Mars
3. Radiometric resolution:
Sensor’s sensitivity to the magnitude of the electromagnetic energy,
Sensor’s ability to discriminate very slight differences in (reflected or emitted)
energy,
The finer the radiometric resolution of a sensor, the more sensitive it is to
detecting small differences in energy, as illustrated below.
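Radiometric resolution is usually quoted as a bit depth: an n-bit sensor can distinguish 2ⁿ brightness levels. A small illustration (the bit depths shown are common values, chosen only as examples):

```python
# Number of distinguishable gray levels for a given bit depth.
for bits in (6, 8, 11):
    print(bits, "bits ->", 2 ** bits, "gray levels")
# 6 bits -> 64, 8 bits -> 256, 11 bits -> 2048
```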
4. Temporal resolution and coverage:
Temporal resolution is the revisit period: the length of time it takes a satellite
to complete one entire orbit cycle, i.e. to return to the exact same area at the
same viewing angle. For example, Landsat needs 16 days, MODIS needs one day, and
NEXRAD needs 6 minutes in rain mode and 10 minutes in clear sky mode.
Temporal coverage is the time period over which a sensor has collected data. For
example,
o MODIS/Terra: 2/24/2000 through present
o Landsat 5: 1/3/1984 through present
o ICESat: 2/20/2003 to 10/11/2009
20. What are the advantages and Disadvantages of using remotely sensed data?
Advantages of using Remote Sensed data:
1. Large area coverage: Remote sensing allows coverage of very large areas which enables
regional surveys on a variety of themes and identification of extremely large features.
2. Remote sensing allows repetitive coverage which comes in handy when collecting data on
dynamic themes such as water, agricultural fields and so on.
3. Remote sensing allows for easy collection of data over a variety of scales and resolutions.
4. A single image captured through remote sensing can be analyzed and interpreted for use in
various applications and purposes. There is no limitation on the extent of information that can
be gathered from a single remotely sensed image.
5. Remotely sensed data can easily be processed and analyzed fast using a computer and the data
utilized for various purposes.
6. Remote sensing is unobtrusive, especially if the sensor is passively recording the
electromagnetic energy reflected from or emitted by the phenomena of interest. This means
that passive remote sensing does not disturb the object or the area of interest.
7. Data collected through remote sensing is analyzed in the laboratory, which minimizes the work
that needs to be done in the field.
8. Remote sensing allows for map revision at a small to medium scale which makes it a bit
cheaper and faster.
9. A color composite can be produced from three separate band images, which ensures
the details of the area are far better defined than when only a single band image or aerial
photograph is reproduced.
10. It is easier to locate floods or forest fires that have spread over a large region, which makes it
easier to plan a rescue mission quickly.
11. Remote sensing is a relatively cheap and fast method of constructing a base map in the
absence of detailed land surveys.
21. What are the different applications of Remote Sensing? State its uses?
There are probably hundreds of applications - these are typical:
Meteorology - Study of atmospheric temperature, pressure, water vapour, and
wind velocity.
Oceanography: Measuring sea surface temperature, mapping ocean currents, and
wave energy spectra and depth sounding of coastal and ocean depths
Glaciology- Measuring ice cap volumes, ice stream velocity, and sea ice
distribution. (Glacial)
Geology- Identification of rock type, mapping faults and structure.
Geodesy- Measuring the figure of the Earth and its gravity field.
Topography and cartography - Improving digital elevation models.
Agriculture- Monitoring the biomass of land vegetation (Crop Type, Crop
Condition Assessment, Crop Yield Estimation, Mapping of soil characteristic, soil
moisture estimation)
Forest- monitoring the health of forests, mapping soil moisture
Botany- forecasting crop yields.
Hydrology- Assessing water resources from snow, rainfall and underground
aquifers.
(Watershed mapping and management, Flood delineation and mapping)
Disaster warning and assessment - Monitoring of floods and landslides,
monitoring volcanic activity, assessing damage zones from natural disasters.
Planning applications - Mapping ecological zones, monitoring deforestation,
monitoring urban land use.
Oil and mineral exploration- Locating natural oil seeps and slicks, mapping
geological structures, monitoring oil field subsidence.
Military- developing precise maps for planning, monitoring military
infrastructure, monitoring ship and troop movements
Urban- Land parcel mapping, Infrastructure mapping, Land use change detection,
Future urban expansion planning.
Climate- the effects of climate change on glaciers and Arctic and Antarctic
regions
Sea- Monitoring the extent of flooding (Storm forecasting, Water quality
monitoring, Aquaculture inventory and monitoring, Navigation routing, Coastal
vegetation mapping, Oil spills, Coastal hazard monitoring & Assessment)
Rock- Recognizing rock types
Space program- remote sensing is a backbone of the space program
Seismology- detecting precursors to earthquakes
22. What are the basic elements to be considered during visual interpretation of satellite
image?
IMAGE INTERPRETATION:
Image is a pictorial representation of an object or a scene.
Image can be analog or digital.
Aerial photographs are generally analog, while satellite data is in digital form.
A digital image is made up of square or rectangular areas called pixels.
Each pixel has an associated pixel value, which depends on the amount of energy
reflected from the ground.
Advantages of aerial photographs/Satellite Images over ground observation
Synoptic view
Time freezing ability
Permanent record
Spectral resolution
Spatial resolution
Cost and time effective
Stereoscopic view
Brings out relationship between objects
Methods of Image Interpretation:
Visual
Image interpretation on a hardcopy image/photograph
Visual image interpretation on a digital image
Digital image processing
Why do we process images?
It has been developed to deal with 3 major problems —
To improve the image data to suppress the unwanted distortions.
To enhance some features of the input image.
As a means of translation between the human visual system and digital imaging
devices.
ACTIVITIES OF IMAGE INTERPRETATION:
Detection
Recognition
Analysis
Deduction
Classification
Idealization
Convergence of evidence
Elements of Visual Image Interpretation:
Location, Size, Shape, Shadow, Tone, Colour, Texture, Pattern, Height and
Depth, Site, Situation, and Association
Location
There are two primary methods to obtain a precise location in the form of coordinates. 1)
survey in the field by using traditional surveying techniques or global positioning system
instruments, or 2) collect remotely sensed data of the object, rectify the image and then
extract the desired coordinate information. Most scientists who choose option 1 now use
relatively inexpensive GPS instruments in the field to obtain the desired location of an
object. If option 2 is chosen, most aircraft used to collect the remotely sensed data have a
GPS receiver.
Size
The size of an object is one of the most distinguishing characteristics and one of the most
important elements of interpretation. Most commonly, length, width and perimeter are
measured. To be able to do this successfully, it is necessary to know the scale of the
photo. Measuring the size of an unknown object allows the interpreter to rule out possible
alternatives. It has proved to be helpful to measure the size of a few well-known objects
to give a comparison to the unknown-object. For example, field dimensions of major
sports like soccer, football, and baseball are standard throughout the world. If objects like
this are visible in the image, it is possible to determine the size of the unknown object by
simply comparing the two.
Shape
There is an infinite number of uniquely shaped natural and man-made objects in the
world. A few examples of shape are the triangular shape of modern jet aircraft and the
shape of a common single-family dwelling. Humans have modified the landscape in very
interesting ways that has given shape to many objects, but nature also shapes the
landscape in its own ways. In general, straight, recti-linear features in the environment
are of human origin. Nature produces more subtle shapes.
Shadow
Virtually all remotely sensed data are collected within 2 hours of solar noon to avoid
extended shadows in the image or photo. This is because shadows can obscure other
objects that could otherwise be identified. On the other hand, the shadow cast by an
object acts as a key for its identification, as the length of the shadow can be
used to estimate the height of the object, which is vital for its recognition.
Take for example, the Washington Monument in Washington D.C. While viewing this
from above, it can be difficult to discern the shape of the monument, but with a shadow
cast, this process becomes much easier. It is a good practice to orient the photos so that
the shadows are falling towards the interpreter. A pseudoscopic illusion can be produced
if the shadow is oriented away from the observer. This happens when low points appear
high and high points appear low.
Texture
This is defined as the “characteristic placement and arrangement of repetitions of tone or
color in an image.” Adjectives often used to describe texture are smooth (uniform,
homogeneous), intermediate, and rough (coarse, heterogeneous). It is important to
remember that texture is a product of scale. On a large scale depiction, objects could
appear to have an intermediate texture. But, as the scale becomes smaller, the texture
could appear to be more uniform, or smooth. A few examples of texture could be the
"smoothness" of a paved road, or the "coarseness" of a pine forest.
Pattern
Pattern is the spatial arrangement of objects in the landscape. The objects may be
arranged randomly or systematically. They can be natural, as with a drainage pattern of a
river, or man-made, as with the squares formed from the United States Public Land
Survey System. Typical adjectives used in describing pattern are: random, systematic,
circular, oval, linear, rectangular, and curvilinear to name a few.
The correction of deficiencies and the removal of flaws present in the data are called pre-
processing (sometimes referred to as image restoration, image correction or image
rectification).
Pre-processing techniques involved in remote sensing may be categorized into two broad
categories
Radiometric corrections:
When the emitted or reflected electro-magnetic energy is observed by a sensor on-
board an aircraft or spacecraft, the observed energy does not coincide with the energy
emitted or reflected from the same object observed from a short distance.
1. Detector response calibration
De-striping
Removal of missing scan line
Random noise removal
Vignetting removal (corner-to-center brightness differences)
2. Sun angle and topographic correction
3. Atmospheric Correction
Geometric Corrections:
Geometric errors that arise from
The Earth Curvature
Platform Motion
Relief Displacement
Non-linearities in scanning motion
The Earth's rotation
These are handled through systematic corrections, non-systematic corrections, and coordinate transformations.
2. Image Enhancement:
Image enhancement is the procedure of improving the quality and information
content of the original data before further processing. Common practices include contrast
enhancement, spatial filtering, density slicing, and false colour composites (FCC). Contrast
enhancement or stretching is performed by a linear transformation expanding the original
range of gray levels. Spatial filtering enhances naturally occurring linear features such as
faults, shear zones, and lineaments. Density slicing converts the continuous gray tone range
into a series of density intervals, each marked by a separate color or symbol to represent
different features.
FCC is commonly used in remote sensing rather than true colors because of the
absence of a pure blue color band, as scattering is dominant in the blue
wavelengths. The FCC is standardized so that it presents the maximum information about
objects on Earth consistently for all users. In standard FCC, vegetation looks red
(Fig. 3.6) because vegetation is highly reflective in NIR and the color applied to that band is
red. Water bodies look dark if they are clear or deep because IR is an absorption band for
water. Turbid or shallow water bodies give shades of blue because such water bodies reflect
in the green wavelengths and the color applied to that band is blue.
3. Image Transformation:
A function or operator that takes an image as its input and produces an image as
its output. Fourier transforms, principal component analysis (also called Karhunen-Loeve
analysis), and various spatial filters, are examples of frequently used image
transformation procedures.
Image Reduction:
Image reduction techniques allow the analyst to obtain a regional
perspective of the remotely sensed data.
A common screen resolution is 1024 × 768, which is much lower than the number of
pixels generally present in an image.
The computer screen cannot display the entire image unless the visual
representation of the image is reduced. This is commonly known as zooming out.
Image Magnification:
Referred to as zooming in, this technique is most commonly employed for two purposes:
To improve the display scale of the image for enhanced visual interpretation.
To match the display scale of another image.
Colour Composition:
Remote sensing images are displayed in three primary colours. A true
color composite uses the visible light bands red (B04), green (B03) and blue (B02) in the
corresponding red, green and blue color channels, resulting in a naturally colored result
that is a good representation of the Earth as humans would see it naturally.
In particular, the colour composite with the assignment of the blue color gun to the
green band, the green gun to the red band, and the red gun to the NIR band is very popular,
and is called an infrared colour composite, which is the same as that found in colour infrared
film. A sketch is given below.
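A minimal sketch of building such a standard false colour composite by stacking the bands into display channels (the tiny band arrays are hypothetical; the assignment follows the scheme described above: red gun from NIR, green gun from red, blue gun from green):

```python
import numpy as np

def false_color_composite(nir, red, green):
    """Stack NIR, red and green bands into the R, G, B display channels."""
    return np.dstack([nir, red, green])

# Hypothetical 2 x 2 band rasters, scaled to 0..1 reflectance.
nir = np.array([[0.5, 0.6], [0.4, 0.5]])
red = np.array([[0.1, 0.1], [0.2, 0.1]])
green = np.array([[0.1, 0.2], [0.1, 0.1]])
print(false_color_composite(nir, red, green).shape)  # (2, 2, 3)
```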
Transect Extraction:
The ability to extract brightness values along a user-specified transect (also referred
to as a spatial profile) between two points in a single-band or multiple-band color
composite image is important in many remote sensing image interpretation
applications. Basically, the spatial profile, in histogram format, depicts the magnitude of
the brightness value at each pixel along the transect.
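A sketch of extracting such a spatial profile along a straight transect between two pixel coordinates; the single-band image here is a made-up gradient, used only to show the sampling:

```python
import numpy as np

def transect(image, start, end, n=10):
    """Sample brightness values at n points along a straight transect."""
    rows = np.linspace(start[0], end[0], n).round().astype(int)
    cols = np.linspace(start[1], end[1], n).round().astype(int)
    return image[rows, cols]

img = np.arange(100).reshape(10, 10)  # made-up single-band image
print(transect(img, (0, 0), (9, 9)))  # brightness along the diagonal
```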
Contrast Enhancement:
One material would reflect a tremendous amount of energy in a certain wavelength
and another material would reflect much less energy in the same wavelength. This
would result in contrast between two types of material when recorded by the remote
sensing system.
Unfortunately, different materials often reflect similar amounts of radiant flux
throughout the visible, near infrared and middle-infrared portions of the
electromagnetic spectrum, resulting in a relatively low-contrast imagery. In addition,
besides this obvious low-contrast characteristic of biophysical materials, there are
cultural factors at work.
The detectors on remote sensing systems are designed to record a relatively wide
range of scene brightness values (e.g., 0-255) without becoming saturated.
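A minimal linear contrast stretch, expanding an image's actual brightness range to the full 0-255 display range; the tiny low-contrast input array is illustrative:

```python
import numpy as np

def linear_stretch(band):
    """Linearly rescale pixel values to the full 0-255 display range."""
    lo, hi = band.min(), band.max()
    return ((band - lo) / (hi - lo) * 255).astype(np.uint8)

band = np.array([[60, 70], [80, 90]], dtype=float)  # low-contrast input
print(linear_stretch(band))  # values now span 0..255
```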
Filtering:
Spatial filtering refers to filtering operations that are performed directly on the
pixels of an image.
Spatial frequency describes the periodic distributions of light and dark in an image.
High spatial frequencies correspond to features such as sharp edges and fine details,
whereas low spatial frequencies correspond to features such as global shape.
Filters are classified as:
Low-pass (i.e., preserve low frequencies)
High-pass (i.e., preserve high frequencies)
Band-pass (i.e., preserve frequencies within a band)
Band-reject (i.e., reject frequencies within a band)
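A sketch contrasting low-pass and high-pass filtering with a 3 × 3 mean kernel (using scipy's uniform_filter; the random test image is arbitrary):

```python
import numpy as np
from scipy.ndimage import uniform_filter

img = np.random.rand(8, 8) * 255        # arbitrary single-band image

low_pass = uniform_filter(img, size=3)  # 3x3 mean: keeps low frequencies
high_pass = img - low_pass              # residual: edges and fine detail

print(low_pass.std() < img.std())       # True: smoothing reduces variation
```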
Image Transformation:
Image transformations typically involve the manipulation of multiple bands of data,
whether from a single multispectral image or from two or more images of the same
area acquired at different times (i.e. multi-temporal image data). Basic image
transformations apply simple arithmetic operations to the image.
The basic transformations are scaling, rotation, translation, and shear. Other important
types of transformations are projections and mappings. By scaling relative to the
origin, all coordinates of the points defining an entity are multiplied by the same
factor, possibly different for each axis.
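A common example of such band arithmetic is the normalized difference vegetation index, NDVI = (NIR - Red) / (NIR + Red); a minimal sketch on hypothetical band values:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red bands."""
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids divide-by-zero

nir = np.array([0.50, 0.05])  # hypothetical reflectances: vegetation, water
red = np.array([0.08, 0.04])
print(ndvi(nir, red))         # high value -> vegetation, low -> water
```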
Sensors, or instruments, aboard satellites and aircraft use the Sun as a source of
illumination or provide their own source of illumination, measuring energy that is
reflected back. Sensors that use natural energy from the Sun are called passive sensors;
those that provide their own source of energy are called active sensors.
Active sensors include different types of radio detection and ranging (radar) sensors,
altimeters, and scatterometers. The majority of active sensors operate in the microwave band of
the electromagnetic spectrum, which gives them the ability to penetrate the atmosphere under
most conditions. These types of sensors are useful for measuring the vertical profiles of aerosols,
forest structure, precipitation and winds, sea surface topography, and ice, among others.
Sensors can be non-imaging (measuring the radiation received from all points in the
sensed target, integrating this, and reporting the result as an electrical signal strength or some
other quantitative attribute, such as radiance) or imaging (the electrons released are used
to excite or ionize a substance like silver (Ag) in film, or to drive an image-producing
device like a TV or computer monitor, a cathode ray tube, an oscilloscope, or a battery of
electronic detectors; since the radiation is related to specific points in the target, the end
result is an image [picture] or a raster display [as in the parallel horizontal lines on a
TV screen]).
Sensors are devices used to take observations
Sensors are characterized by spatial, spectral, and radiometric performance.
IRS Satellite sensors:
o Linear Imaging Self-Scanning sensors (LISS-I to IV)
o Panchromatic (PAN)
o Wide Field Sensor (WiFS) and Advanced Wide Field Sensor (AWiFS)
o Ocean Colour Monitor (OCM) sensor operating in eight narrow spectral bands,
for oceanographic applications.
o Multi-spectral Optoelectronic Scanner (MOS)
o Multi-frequency Scanning Microwave Radiometer (MSMR)
o Synthetic Aperture Radar (SAR) for active microwave RS.