Remote Sensing


DEFINITION AND PROCESS OF REMOTE SENSING

INTRODUCTION

Nowadays the field of Remote Sensing and GIS has become exciting, with rapidly expanding opportunities, and many organizations spend large amounts of money on these fields. Why have these fields become so important in recent years? There are two main reasons. 1) Scientists, researchers, students, and even lay people now show great interest in better understanding our environment. By environment we mean the geographic space of the study area and the events that take place there. In other words, we have come to realize that geographic space, along with the data describing it, is part of our everyday world; almost every decision we take is influenced or dictated by some fact of geography. 2) Advances in sophisticated space technology (which can provide large volumes of spatial data), together with declining costs of the computer hardware and software that can handle these data, have made Remote Sensing and GIS applicable not only to complex environmental and spatial problems but also affordable to an increasingly wide audience.


REMOTE SENSING AND ITS COMPONENTS:


Remote sensing is the science of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy, and then processing, analyzing, and applying that information. In much of remote sensing, the process involves an interaction between incident radiation and the targets of interest, as exemplified by imaging systems, in which the following seven elements are involved. Note, however, that remote sensing also involves the sensing of emitted energy and the use of non-imaging sensors.
1. Energy Source or Illumination (A) - the first requirement for remote sensing
is to have an energy source which illuminates or provides electromagnetic
energy to the target of interest.

2. Radiation and the Atmosphere (B) - as the energy travels from its source to
the target, it will come in contact with and interact with the atmosphere it
passes through. This interaction may take place a second time as the energy
travels from the target to the sensor.

3. Interaction with the Target (C) - once the energy makes its way to the target
through the atmosphere, it interacts with the target depending on the properties
of both the target and the radiation.

4. Recording of Energy by the Sensor (D) - after the energy has been scattered
by, or emitted from the target, we require a sensor (remote - not in contact with
the target) to collect and record the electromagnetic radiation.

5. Transmission, Reception, and Processing (E) - the energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).

6. Interpretation and Analysis (F) - the processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.

7. Application (G) - the final element of the remote sensing process is achieved
when we apply the information we have been able to extract from the imagery
about the target in order to better understand it, reveal some new information, or
assist in solving a particular problem.

HISTORY OF REMOTE SENSING:

1839 - first photograph

1858 - first photo from a balloon

1903 - first plane

1903-4 - B/W infrared film

1909 - first photo from a plane

WW I and WW II - aerial reconnaissance

1960 - space


ELECTROMAGNETIC SPECTRUM
The first requirement for remote sensing is to have an energy source to illuminate the target (unless the sensed energy is being emitted by the target). This energy is in the form of electromagnetic radiation. All electromagnetic radiation has fundamental properties and behaves in predictable ways according to the basics of wave theory.
Electromagnetic radiation consists of an electrical field (E) which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electrical field. Both these fields travel at the speed of light (c). Two characteristics of electromagnetic radiation are particularly important for understanding remote sensing: wavelength and frequency.

Electromagnetic radiation (EMR) is an electromagnetic wave that travels through space at the speed of light c, which is 3×10^8 metres per second. Theoretical models of random media, including anisotropic effects, random distributions of discrete scatterers and rough-surface effects, have been studied for remote sensing with electromagnetic waves. Light can be thought of as a wave in the electromagnetic field of the universe. A wave can be characterized by its wavelength or its frequency.

The wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests. Wavelength is usually represented by the Greek letter lambda (λ). Wavelength is measured in metres (m) or some factor of metres such as nanometres (nm, 10^-9 metres), micrometres (µm, 10^-6 metres) or centimetres (cm, 10^-2 metres). Frequency refers to the number of cycles of a wave passing a fixed point per unit of time. Frequency is normally measured in hertz (Hz), equivalent to one cycle per second, and various multiples of hertz.
Wavelength and frequency are related by the following formula:

c = λ × ν

where c is the speed of light, λ is the wavelength and ν is the frequency. Therefore, the two are inversely related to each other: the shorter the wavelength, the higher the frequency, and the longer the wavelength, the lower the frequency. Understanding the characteristics of electromagnetic radiation in terms of wavelength and frequency is crucial to understanding the information to be extracted from remote sensing data.
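The inverse relation between wavelength and frequency, c = λ × ν, can be checked numerically. A minimal sketch in Python (the function name and sample wavelengths are illustrative):

```python
# Relation between wavelength and frequency: c = wavelength * frequency
C = 3.0e8  # speed of light in m/s, as given in the text

def frequency(wavelength_m):
    """Return the frequency in Hz for a wavelength given in metres."""
    return C / wavelength_m

# Red light (~0.7 um) vs. violet light (~0.4 um):
f_red = frequency(0.7e-6)
f_violet = frequency(0.4e-6)
# Shorter wavelength gives higher frequency, as the text states.
assert f_violet > f_red
```

Running this confirms that violet light, with the shorter wavelength, has the higher frequency of the two.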

The electromagnetic spectrum ranges from the shorter wavelengths (including gamma and x-rays) to the longer wavelengths (including microwaves and broadcast radio waves). There are several regions of the electromagnetic spectrum which are useful for remote sensing.


WAVELENGTH REGIONS IMPORTANT TO REMOTE SENSING:

1 Ultraviolet or UV
For most purposes, the ultraviolet (UV) portion of the spectrum has the shortest wavelengths that are practical for remote sensing. These wavelengths lie beyond the violet portion of the visible spectrum, hence the name. Some earth surface materials, primarily rocks and minerals, emit visible radiation when illuminated by UV radiation.
2 Visible Spectrum
The light which our eyes - our "remote sensors" - can detect is part of
the visible spectrum. It is important to recognize how small the visible portion
is relative to the rest of the spectrum. There is a lot of radiation around us which
is "invisible" to our eyes, but can be detected by other remote sensing
instruments and used to our advantage. The visible wavelengths cover a range
from approximately 0.4 to 0.7 µm. The longest visible wavelength is red and the shortest is violet. Common wavelengths of what we perceive as particular colours from the visible portion of the spectrum are listed below. It is important to note that this is the only portion of the spectrum we can associate with the concept of colours.
Violet: 0.4 - 0.446 µm
Blue: 0.446 - 0.500 µm
Green: 0.500 - 0.578 µm
Yellow: 0.578 - 0.592 µm
Orange: 0.592 - 0.620 µm
Red: 0.620 - 0.7 µm
Blue, green, and red are the primary colours or wavelengths of the
visible spectrum. They are defined as such because no single primary colour can
be created from the other two, but all other colours can be formed by combining
blue, green, and red in various proportions. Although we see sunlight as a
uniform or homogeneous colour, it is actually composed of various wavelengths
of radiation in primarily the ultraviolet, visible and infrared portions of the
spectrum. The visible portion of this radiation can be shown in its component
colours when sunlight is passed through a prism, which bends the light in
differing amounts according to wavelength.
3 Infrared (IR)
The next portion of the spectrum of interest is the infrared (IR) region, which covers the wavelength range from approximately 0.7 µm to 100 µm, more than 100 times as wide as the visible portion. The infrared can be divided into three categories based on their radiation properties: the reflected near-IR, the middle IR and the thermal IR.
The reflected near-IR covers wavelengths from approximately 0.7 µm to 1.3 µm and is commonly used to expose black-and-white and colour-infrared sensitive film.
The middle-infrared region includes energy with a wavelength of 1.3 to 3.0 µm.
The thermal IR region is quite different from the visible and reflected IR portions, as this energy is essentially the radiation that is emitted from the Earth's surface in the form of heat. The thermal IR covers wavelengths from approximately 3.0 µm to 100 µm.
Microwave
The portion of the spectrum of more recent interest to remote sensing is the microwave region, from about 1 mm to 1 m. This covers the longest wavelengths used for remote sensing. A wavelength (or frequency) interval in the electromagnetic spectrum is commonly referred to as a band, channel or region. The shorter microwave wavelengths have properties similar to the thermal infrared region, while the longer wavelengths approach the wavelengths used for radio broadcasts.

WAVE THEORY AND PARTICLE THEORY

Light can exhibit both wave and particle behaviour at the same time. Much of the time, light behaves like a wave. Light waves are also called electromagnetic waves because they are made up of both electric (E) and magnetic (H) fields. Electromagnetic fields oscillate perpendicular to the direction of wave travel, and perpendicular to each other. Light waves are known as transverse waves because they oscillate in the direction transverse to the direction of wave travel.

Fig 1.4 - Electromagnetic propagation


Waves have two important characteristics - wavelength and frequency.

The sine wave is the fundamental waveform in nature. When dealing with light
waves, we refer to the sine wave. The period (T) of the waveform is one full 0
to 360 degree sweep. The relationship of frequency and the period is given by
the equation:

f = 1/T

T = 1/f

The waveforms are always in the time domain and go on for infinity.

The speed of a wave can be found by multiplying its wavelength by its frequency. The wave's speed is measured in units of length (distance) per second:

Wavelength × Frequency = Speed

As proposed by Einstein, light is composed of photons, very small packets of energy. Photons are able to travel at light speed because they have no mass, and therefore Einstein's famous equation E = mc² cannot be applied to them. Another formula, devised by Planck, describes the relation between photon energy and frequency. With Planck's constant h = 6.63×10^-34 joule-seconds:

E = hf or E = hc/λ

where E is the photon energy in joules, h is Planck's constant and f is the frequency in Hz.
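The photon-energy relation E = hc/λ can be evaluated directly. A minimal sketch (constants use the rounded values quoted in the text; the function name is illustrative):

```python
# Photon energy from Planck's relation: E = h*f = h*c / wavelength
H = 6.63e-34   # Planck's constant, J*s (value as given in the text)
C = 3.0e8      # speed of light, m/s

def photon_energy(wavelength_m):
    """Energy in joules of a single photon of the given wavelength."""
    return H * C / wavelength_m

# Green light at 0.55 um carries roughly 3.6e-19 J per photon;
# shorter (bluer) wavelengths carry more energy per photon.
e_green = photon_energy(0.55e-6)
e_blue = photon_energy(0.45e-6)
assert e_blue > e_green
```

Note how the energy per photon rises as the wavelength shortens, which is why UV photons are energetic enough to cause fluorescence in some minerals while thermal-IR photons are not.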

PARTICLE THEORY

The basic idea of quantum theory is that radiant energy is transmitted in indivisible packets whose energy is given in integral parts of size hv, where h is Planck's constant = 6.6252×10^-34 J·s, and v is the frequency of the radiation. These are called quanta or photons.
The dilemma of the simultaneous wave and particle nature of electromagnetic energy may be conceptually resolved by considering that energy is not supplied continuously throughout a wave, but rather is carried by photons. The classical wave theory does not give the intensity of energy at a point in space, but gives the probability of finding a photon at that point. Thus the classical concept of a wave yields to the idea that a wave simply describes the probability path for the motion of the individual photons.

The particular importance of the quantum approach for remote sensing is that it provides the concept of discrete energy levels in materials. The values and arrangement of these levels are different for different materials. Information about a given material is thus available in electromagnetic radiation as a consequence of transitions between these energy levels. A transition to a higher energy level is caused by the absorption of energy, while a transition from a higher to a lower energy level is caused by the emission of energy. The amounts of energy either absorbed or emitted correspond precisely to the energy difference between the two levels involved in the transition. Because the energy levels are different for each material, the amount of energy a particular substance can absorb or emit differs from that of any other material. Consequently, the positions and intensities of the bands in the spectrum of a given material are characteristic of that material.


STEFAN-BOLTZMANN LAW
The Stefan-Boltzmann law, also known as Stefan's law, describes the power radiated from a black body in terms of its temperature. Specifically, it states that the total energy radiated per unit surface area of a black body across all wavelengths per unit time (also known as the black-body radiant exitance or emissive power), J, is directly proportional to the fourth power of the black body's thermodynamic temperature T:

J = σT⁴

where σ is the Stefan-Boltzmann constant.
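The fourth-power dependence can be illustrated numerically. A minimal sketch (the constant is the standard SI value, rounded; the function name is illustrative):

```python
# Stefan-Boltzmann law: J = sigma * T**4
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(T_kelvin):
    """Total power radiated per unit area of a black body, in W/m^2."""
    return SIGMA * T_kelvin ** 4

# Doubling the absolute temperature multiplies the radiated power by 2**4 = 16:
ratio = radiant_exitance(600.0) / radiant_exitance(300.0)
assert abs(ratio - 16.0) < 1e-9
```

This steep dependence is why the sun, at roughly 20 times the Earth's surface temperature, radiates enormously more energy per unit area.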
WIEN'S DISPLACEMENT LAW
Wien's displacement law states that the black-body radiation curve for different temperatures peaks at a wavelength inversely proportional to the temperature. The shift of that peak is a direct consequence of the Planck radiation law, which describes the spectral brightness of black-body radiation as a function of wavelength at any given temperature. However, it had been discovered by Wilhelm Wien several years before Max Planck developed that more general equation, and it describes the entire shift of the spectrum of black-body radiation toward shorter wavelengths as temperature increases.

Formally, Wien's displacement law states that the spectral radiance of black-body radiation per unit wavelength peaks at the wavelength λmax given by:

λmax = b / T

where T is the absolute temperature in kelvins and b is a constant of proportionality called Wien's displacement constant, equal to 2.8977721(26)×10^-3 m·K, or, more conveniently to obtain the wavelength in microns, b ≈ 2900 µm·K. If one is considering the peak of black-body emission per unit frequency or per proportional bandwidth, one must use a different proportionality constant. However, the form of the law remains the same: the peak wavelength is inversely proportional to temperature (and the peak frequency is directly proportional to temperature).

Wien's displacement law may be referred to as "Wien's law", a term which is also used for the Wien approximation.
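The relation λmax = b / T gives the familiar contrast between solar and terrestrial emission. A minimal sketch (the function name is illustrative; temperatures are typical round values):

```python
# Wien's displacement law: lambda_max = b / T
B_WIEN = 2.8977721e-3  # Wien's displacement constant, m*K

def peak_wavelength_m(T_kelvin):
    """Wavelength (in metres) at which black-body emission peaks."""
    return B_WIEN / T_kelvin

# The sun (~5800 K) peaks in the visible (~0.5 um);
# the Earth's surface (~290 K) peaks in the thermal IR (~10 um).
sun_um = peak_wavelength_m(5800.0) * 1e6
earth_um = peak_wavelength_m(290.0) * 1e6
```

This is why reflected-solar sensors operate in the visible and near-IR while thermal sensors are designed around the 8-14 µm window.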


ENERGY INTERACTIONS WITH THE ATMOSPHERE

Before radiation used for remote sensing reaches the Earth's surface it has to
travel through some distance of the Earth's atmosphere. Particles and gases in
the atmosphere can affect the incoming light and radiation. These effects are
caused by the mechanisms of scattering and absorption.

1 SCATTERING
Scattering occurs when particles or large gas molecules present in
the atmosphere interact with and cause the electromagnetic radiation to be
redirected from its original path. How much scattering takes place depends
on several factors including the wavelength of the radiation, the abundance
of particles or gases, and the distance the radiation travels through the
atmosphere. There are three (3) types of scattering which take place.

2 RAYLEIGH SCATTERING

Rayleigh scattering occurs when particles are very small compared to the
wavelength of the radiation. These could be particles such as small specks of dust or nitrogen and oxygen molecules. Rayleigh scattering causes shorter wavelengths of energy to be scattered much more than longer wavelengths. Rayleigh scattering is the dominant scattering mechanism in the upper atmosphere. The fact that the sky appears "blue" during the day is because of
this phenomenon. As sunlight passes through the atmosphere, the shorter
wavelengths (i.e. blue) of the visible spectrum are scattered more than the other
(longer) visible wavelengths. At sunrise and sunset the light has to travel
farther through the atmosphere than at midday and the scattering of the shorter
wavelengths is more complete; this leaves a greater proportion of the longer
wavelengths to penetrate the atmosphere.
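The claim that shorter wavelengths scatter more strongly can be quantified: for Rayleigh scattering, scattered intensity varies as λ^-4 (a standard result not stated explicitly above). A minimal sketch with illustrative wavelengths:

```python
# Rayleigh scattering strength is proportional to 1 / wavelength**4,
# so blue light (~0.45 um) scatters far more strongly than red (~0.65 um).
def rayleigh_relative(wavelength_um):
    """Relative Rayleigh scattering strength (arbitrary units)."""
    return wavelength_um ** -4

ratio = rayleigh_relative(0.45) / rayleigh_relative(0.65)
# Blue light is scattered roughly four times as strongly as red,
# which is why the clear daytime sky looks blue.
```

The same relation explains red sunsets: over a long atmospheric path, so much blue light is scattered out of the beam that mostly the longer red wavelengths survive.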

3 ABSORPTION

Absorption is the other main mechanism at work when electromagnetic radiation interacts with the atmosphere. In contrast to scattering, this phenomenon causes molecules in the atmosphere to absorb energy at various wavelengths. Ozone, carbon dioxide, and water vapour are the three main atmospheric constituents which absorb radiation. Ozone serves to absorb the ultraviolet radiation from the sun that is harmful to most living things. Without this protective layer in the atmosphere our skin would burn when exposed to sunlight. Carbon dioxide is referred to as a greenhouse gas because it tends to absorb radiation strongly in the far-infrared portion of the spectrum, the area associated with thermal heating, which serves to trap this heat inside the atmosphere. Water vapour in the atmosphere absorbs much of the incoming longwave infrared and shortwave microwave radiation (between 22 µm and 1 m). The presence of water vapour in the lower atmosphere varies greatly from location to location and at different times of the year. For example, the air mass above a desert would have very little water vapour to absorb energy, while the tropics would have high concentrations of water vapour (i.e. high humidity).

4 MIE SCATTERING
Mie scattering occurs when the particles are just about the same size as
the wavelength of the radiation. Dust, pollen, smoke and water vapour are
common causes of Mie scattering which tends to affect longer wavelengths than
those affected by Rayleigh scattering. Mie scattering occurs mostly in the lower
portions of the atmosphere where larger particles are more abundant, and
dominates when cloud conditions are overcast.

The final scattering mechanism of importance is called nonselective scattering. This occurs when the particles are much larger than the wavelength of the radiation.

Water droplets and large dust particles can cause this type of scattering.
Nonselective scattering gets its name from the fact that all wavelengths are
scattered about equally. This type of scattering causes fog and clouds to appear
white to our eyes because blue, green, and red light are all scattered in
approximately equal quantities (blue+green+red light = white light).


ATMOSPHERIC WINDOWS
While EMR is transmitted from the sun to the surface of the earth, it passes through the atmosphere. Here, electromagnetic radiation is scattered and absorbed by gases and dust particles. Besides the major atmospheric gaseous components like molecular nitrogen and oxygen, other constituents like water vapour, methane, hydrogen, helium and nitrogen compounds play an important role in modifying electromagnetic radiation. This affects image quality. Regions of the electromagnetic spectrum in which the atmosphere is transparent are called atmospheric windows. In other words, certain spectral regions of electromagnetic radiation that pass through the atmosphere without much attenuation are called atmospheric windows. The atmosphere is practically transparent in the visible region of the electromagnetic spectrum and therefore many satellite-based remote sensing sensors are designed to collect data in this region. Some of the commonly used atmospheric windows are shown in the figure.

Figure. They are: 0.38-0.72 microns (visible), 0.72-3.00 microns (near infra-red and middle infra-red), and 8.00-14.00 microns (thermal infra-red).

Figure - Atmospheric transmission (per cent) versus wavelength from 0.3 microns to 1 mm, showing the UV, visible and infrared windows and the regions where energy is blocked.
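The window limits quoted above can be encoded as a small lookup table. A minimal sketch (the band names and limits follow the figure caption; the helper function is hypothetical):

```python
# Commonly used atmospheric windows, in micrometres (from the figure caption).
WINDOWS = {
    "visible": (0.38, 0.72),
    "near/middle infra-red": (0.72, 3.00),
    "thermal infra-red": (8.00, 14.00),
}

def in_window(wavelength_um):
    """Return the name of the window a wavelength falls in, or None."""
    for name, (lo, hi) in WINDOWS.items():
        if lo <= wavelength_um <= hi:
            return name
    return None

# A thermal sensor at 10 um sits inside a window; 5 um does not,
# so a band placed there would see heavy atmospheric absorption.
assert in_window(10.0) == "thermal infra-red"
assert in_window(5.0) is None
```

Sensor designers place spectral bands inside these windows precisely so that surface-leaving radiation reaches the sensor with little attenuation.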

SPECTRAL SIGNATURE CONCEPTS - TYPICAL SPECTRAL REFLECTANCE CHARACTERISTICS OF WATER, VEGETATION AND SOIL:

A basic assumption made in remote sensing is that a specific target has an individual and characteristic manner of interacting with incident radiation. The manner of interaction is described by the spectral response of the target. Spectral reflectance curves describe the spectral response of a target in a particular wavelength region of the electromagnetic spectrum, which in turn depends upon certain factors, namely the orientation of the sun (solar azimuth), the height of the sun in the sky (solar elevation angle), the direction in which the sensor is pointing relative to nadir (the look angle) and the nature of the target, that is, the state of health of vegetation.

Fig 1.8 Spectral reflectance Curve

Every object on the surface of the earth has its unique spectral reflectance. Fig. 1.8 shows the average spectral reflectance curves for three typical earth features: vegetation, soil and water. The spectral reflectance curve for vigorous vegetation manifests the "peak-and-valley" configuration. The valleys in the visible portion of the spectrum are indicative of pigments in plant leaves. Dips in reflectance (Fig. 1.8) that can be seen at wavelengths of 0.65 µm, 1.4 µm and 1.9 µm are attributable to absorption of water by leaves. The soil curve shows a more regular variation of reflectance. Factors that evidently affect soil reflectance are moisture content, soil texture, surface roughness, and presence of organic matter. The term spectral signature can also be used for spectral reflectance curves. A spectral signature is a set of characteristics by which a material or an object may be identified on any satellite image or photograph within the given range of wavelengths. Sometimes, spectral signatures are used to denote the spectral response of a target.

The characteristic spectral reflectance curve (Fig. 1.8) for water shows, from about 0.5 µm, a reduction in reflectance with increasing wavelength, so that in the near-infrared range the reflectance of deep, clear water is virtually zero (Mather, 1987). However, the spectral reflectance of water is significantly affected by the presence of dissolved and suspended organic and inorganic material and by the depth of the water body. Fig. 1.8 shows the spectral reflectance curves for visible and near-infrared wavelengths at the surface and at 20 m depth. Suspended solids in water scatter the downwelling radiation, the degree of scatter being proportional to the concentration and the colour of the sediment. Experimental studies in the field and in the laboratory, as well as experience with multispectral remote sensing, have shown that specific targets are characterized by an individual spectral response. Indeed, the successful development of remote sensing of the environment over the past decade bears witness to its validity. In the remaining part of this section, typical and representative spectral reflectance curves for characteristic types of surface materials are considered. Imagine a beach on a beautiful tropical island, and consider the interaction of electromagnetic radiation with the top layer of sand grains on the beach. When an incident ray of electromagnetic radiation strikes an air/grain interface, part of the ray is reflected and part of it is transmitted into the sand grain. The solid lines in the figure represent the incident rays, and dashed lines 1, 2, and 3 represent rays reflected from the surface that have never penetrated a sand grain. The latter are called specular rays by Vincent and Hunt (1968), and surface-scattered rays by Salisbury and Wald

(1992); these rays result from first-surface reflection from all grains encountered. For a given reflecting surface, all specular rays are reflected in the same direction, such that the angle of reflection (the angle between the reflected rays and the normal, or perpendicular, to the reflecting surface) equals the angle of incidence (the angle between the incident rays and the surface normal). The measure of how much electromagnetic radiation is reflected off a surface is called its reflectance, which is a number between 0 and 1.0. A measure of 1.0 means that 100% of the incident radiation is reflected off the surface, and a measure of 0 means that 0% is reflected.
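The spectral-signature idea, that each cover type interacts with radiation in a characteristic way, is the basis of image classification. A toy sketch (the reflectance numbers are illustrative, not measured values, and the nearest-signature classifier is a deliberately simplified stand-in for real methods):

```python
# Toy spectral signatures: characteristic reflectance (0..1) in two bands.
# Values are illustrative only, chosen to mimic the curves in Fig. 1.8.
SIGNATURES = {
    "vegetation": {"red": 0.08, "nir": 0.50},  # low red (pigments), high NIR
    "soil":       {"red": 0.25, "nir": 0.35},  # smooth, regular variation
    "water":      {"red": 0.05, "nir": 0.01},  # NIR reflectance near zero
}

def classify(red, nir):
    """Assign a pixel to the nearest signature by squared Euclidean distance."""
    return min(
        SIGNATURES,
        key=lambda name: (SIGNATURES[name]["red"] - red) ** 2
                       + (SIGNATURES[name]["nir"] - nir) ** 2,
    )

# A dark pixel with almost no NIR reflectance matches the water signature:
assert classify(0.06, 0.02) == "water"
# Low red but high NIR is the hallmark of healthy vegetation:
assert classify(0.10, 0.45) == "vegetation"
```

Operational classifiers use many more bands and statistical or machine-learning models, but the underlying assumption is exactly the one stated above: distinct targets have distinct spectral responses.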

Important Questions and Answers - EMR and Its Interaction with Atmosphere and Earth Material

1.What is Remote Sensing?

Remote sensing is the science and art of obtaining information about an object, area, or phenomenon through the analysis of data acquired by a device that is not in contact with the object, area, or phenomenon under investigation.

2.What are the applications of remote sensing?

In many respects, remote sensing can be thought of as a reading process. Using various sensors, we remotely collect data that may be analyzed to obtain information about the objects, areas, or phenomena being investigated. The remotely collected data can take many forms, including variations in force distributions, acoustic wave distributions, or electromagnetic energy distributions.

3.Write the physics of remote sensing?

Visible light is only one of many forms of electromagnetic energy. Radio waves, heat, ultraviolet rays, and X-rays are other familiar forms. All this energy is inherently similar and radiates in accordance with basic wave theory. This theory describes electromagnetic energy as traveling in a harmonic, sinusoidal fashion at the 'velocity of light' c. The distance from one wave peak to the next is the wavelength λ, and the number of peaks passing a fixed point in space per unit time is the wave frequency ν. From basic physics, waves obey the general equation

c = νλ

4.What are the Components of Remote Sensing?

The main elements of the remote sensing process are: an energy source or illumination, radiation and the atmosphere, interaction with the target, recording of energy by the sensor, transmission, reception and processing, interpretation and analysis, and application.

5.What is Electromagnetic radiation?


Electromagnetic (EM) radiation is a self-propagating wave in space or
through matter. EM radiation has electric and magnetic field components which oscillate in phase perpendicular to each other and to the direction of
energy propagation.
6.Write the type of Electromagnetic radiation?
Electromagnetic radiation is classified into types according to the
frequency of the wave, these types include (in order of increasing frequency):
radio waves, microwaves, terahertz radiation, infrared radiation, visible
light, ultraviolet radiation, X-rays and gamma rays.

7.Draw the quantum theory interaction?


A quantum theory of the interaction between electromagnetic radiation
and matter such as electrons is described by the theory of quantum
electrodynamics.
8.Write about refraction?
In refraction, a wave crossing from one medium to another of different density alters its speed and direction upon entering the new medium. The ratio of the refractive indices of the media determines the degree of refraction, and is summarized by Snell's law. Light disperses into a visible spectrum as light is shone through a prism because of refraction.
9.Draw the Wave model?

10.Write Planck's equation?

The frequency of the wave is proportional to the magnitude of the particle's energy. Moreover, because photons are emitted and absorbed by charged particles, they act as transporters of energy. The energy per photon can be calculated by Planck's equation:

E = hf

where E is the energy, h is Planck's constant, and f is frequency.

11.What is Black body ?


By definition a black body is a material that absorbs all the radiant energy that
strikes it. A black body also radiates the maximum amount of energy, which is
dependent on the kinetic temperature.

12.Write Stefan-Boltzmann law?

According to the Stefan-Boltzmann law, the radiant flux of a black body, Fb, at a kinetic temperature, Tkin, is Fb = σ·Tkin⁴, where σ is the Stefan-Boltzmann constant, 5.67×10^-12 W·cm^-2·K^-4.

13.What is emissivity?
Emissivity is a measure of the ability of a material to both radiate and
absorb energy. Materials with a high emissivity absorb and radiate large
proportions of incident and kinetic energy, respectively (and vice-versa).
14.Write Wien's Displacement law?
For an object at a constant temperature, the radiant power peak refers to the wavelength at which the maximum amount of energy is radiated, expressed as λmax. The sun, with a surface temperature of almost 6000 K, has its peak at 0.48 µm (the wavelength of yellow). The average surface temperature of the earth is 290 K (17 °C), also called the ambient temperature; the peak concentration of energy emitted from the earth is at 9.7 µm. This shift to longer wavelengths with decreasing temperature is described by Wien's displacement law, which states:
λmax = 2897 µm·K / Trad
15.Write Planck's Law?
The primary law governing blackbody radiation is the Planck Radiation Law, which governs the intensity of radiation emitted by unit surface area into a fixed direction (solid angle) from the blackbody as a function of wavelength for a fixed temperature. The Planck Law can be expressed through the following equation:

B(λ, T) = (2hc^2 / λ^5) × 1 / (exp(hc / (λkT)) - 1)

where h is Planck's constant, c is the speed of light, k is Boltzmann's constant and T is the absolute temperature.
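The Planck law can be evaluated numerically to confirm where emission peaks at a given temperature. A minimal sketch (constants rounded; function name illustrative):

```python
import math

# Planck's radiation law: spectral radiance per unit wavelength,
# B(lambda, T) = (2*h*c**2 / lambda**5) / (exp(h*c / (lambda*k*T)) - 1)
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck(wavelength_m, T_kelvin):
    """Black-body spectral radiance, W per m^2 per sr per m of wavelength."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    b = math.exp(H * C / (wavelength_m * K * T_kelvin)) - 1.0
    return a / b

# For a 290 K black body (the ambient Earth), radiance near 10 um
# exceeds that near 4 um, consistent with Wien's displacement law.
assert planck(10e-6, 290.0) > planck(4e-6, 290.0)
```

Integrating this curve over all wavelengths recovers the Stefan-Boltzmann law, and its maximum reproduces Wien's displacement law, tying the three laws in this section together.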
16.What is Scattering?
Scattering occurs when particles or large gas molecules present in the
atmosphere interact with and cause the electromagnetic radiation to be
redirected from its original path. How much scattering takes place depends on
several factors including the wavelength of the radiation, the abundance of
particles or gases, and the distance the radiation travels through the atmosphere.
There are three (3) types of scattering which take place.
17.What are the types of scattering?
(i) Rayleigh scattering occurs when particles are very small compared to
the wavelength of the radiation.
(ii) Mie scattering occurs when the particles are just about the same size as the wavelength of the radiation.
(iii) Non Selective Scattering
The final scattering mechanism of importance is called nonselective scattering.
This occurs when the particles are much larger than the wavelength of the
radiation.
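The wavelength dependence makes Rayleigh scattering easy to quantify: its strength varies as 1/λ^4 (a standard result not stated explicitly above). A small sketch comparing blue and red light, with illustrative wavelengths:

```python
# Rayleigh scattering strength varies as 1/lambda^4, so short (blue)
# wavelengths scatter far more than long (red) ones -- illustrative values.
def rayleigh_relative(lambda_um, reference_um=0.7):
    """Scattering intensity relative to a reference wavelength (red, 0.7 um assumed)."""
    return (reference_um / lambda_um) ** 4

print(rayleigh_relative(0.4))  # blue light scatters ~9.4x more than red
```

This λ^-4 dependence is why Rayleigh scattering dominates for small atmospheric molecules and why the clear sky appears blue.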

18.What are Atmospheric Windows?


The areas of the spectrum which are not severely influenced by atmospheric
absorption and thus, are useful to remote sensors, are called atmospheric
windows.

Remote Sensing Techniques: Types Of Platforms



TYPES OF PLATFORMS

The base on which remote sensors are placed to acquire information about
the Earth's surface is called a platform. Platforms can be stationary, like a
tripod (for field observation) and stationary balloons, or mobile, like aircraft
and spacecraft. The type of platform depends upon the needs as well as the
constraints of the observation mission.

There are three main types of platforms, namely 1) Ground borne, 2) Air
borne and 3) Space borne.
1.GROUND BORNE PLATFORMS:

These platforms are used on the surface of the Earth. The cherry-arm
configuration of a remote sensing van and the tripod are the two commonly used
ground borne platforms.

They have the capability of viewing the object from different angles and are
mainly used for collecting ground truth or for laboratory simulation studies.

2.AIR BORNE PLATFORMS:

These platforms are placed within the atmosphere of the Earth and can be
further classified into balloons and aircraft.

a. Balloons: Balloons as platforms are much less expensive than aircraft.


They have a great variety of shapes, sizes and performance capabilities.
Balloons have low acceleration, require no power and exhibit low vibration.
There are three main types of balloon systems, viz. free balloons, tethered
balloons and powered balloons. Free balloons can reach almost the top of the
atmosphere; hence they can provide a platform at intermediate altitudes
between those of aircraft and spacecraft.

Thousands of kilograms of scientific payload can be lifted by free balloons.

Unless a mobile launching system is developed, the flights can be carried out
only from a fixed launching station. Free balloons are dependent on
meteorological conditions, particularly winds, and the flight trajectory
cannot be controlled. All these make it extremely difficult to predict whether
the balloons will fly over the specific area of interest or not.
In India, at present, the Tata Institute of Fundamental Research, Mumbai,
has set up a National Balloon Facility at Hyderabad. Tethered balloons are
connected to the earth station by means of wires having high tensile
strength and high flexibility.
The tether line can carry the antenna, power lines, gas tubes, etc.
When the wind velocity is less than 35 km per hour at an altitude of 3000 m,
a sphere-type balloon is used; natural-shape balloons can be deployed only
when the wind velocity is less than 30 km per hour. Tethered balloons
have the capability of keeping the equipment at a fixed position for a long
time and are thus useful for many remote sensing programmes. Powered
balloons require some means of propulsion to maintain or achieve station
over a designated geographic location. These can be remotely controlled and
guided along a path or flown above a given area within certain limitations.

b. Aircraft: Aircraft are commonly used as remote sensing platforms for
obtaining aerial photographs. In India, four types of aircraft are being used
for remote sensing operations. These are as follows:

DAKOTA: Ceiling height is 5.6 to 6.2 km and minimum speed is 240 km/hr.
AVRO: Ceiling height is 7.5 km and minimum speed is 600 km/hr.
CESSNA: Ceiling height is 9 km and minimum speed is 350 km/hr.
CANBERRA: Ceiling height is 40 km and minimum speed is 560 km/hr.

The following special aircraft are being used abroad for remote sensing
operations involving high altitude photography:

U-2: Ceiling height is 21 km (for strategic photography). Minimum speed is
798 km/hr.

ROCKWELL X-15 (research craft): Ceiling height is 108 km and speed is 6620
km/hr.

The advantages of using aircraft as remote sensing platforms are: high
resolution of recorded data, the possibility of carrying large payloads, the
capability of imaging large areas economically, accessibility to remote areas,
convenience of selecting different scales, adequate control at all times, etc.
However, due to limitations of operating altitude and range, aircraft find
their greatest application in local or regional programmes rather than in
measurements on a global scale. Besides all these, aircraft have been playing
an important role in the development of space borne remote sensing
techniques. Testing of sensors and of the various systems and subsystems
involved in a space borne remote sensing programme is always undertaken in a
well-equipped aircraft.
3.SPACE BORNE PLATFORMS:

Platforms in space, i.e. satellites, are not affected by the earth's
atmosphere. These platforms move freely in their orbits around the earth, and
the entire earth or any part of it can be covered at specified intervals. The
coverage mainly depends on the orbit of the satellite. It is through these
space borne platforms that we get an enormous amount of remote sensing data,
and as a result Remote Sensing has gained international popularity. According
to the orbital mode, there are two types of satellites - geostationary or
earth-synchronous and sun-synchronous.

Orbit Types: Geo-Synchronous and Sun-Synchronous


ORBIT TYPES

GEO- SYNCHRONOUS AND SUN-SYNCHRONOUS


GEOSTATIONARY SATELLITES: Geostationary satellites are the satellites
which revolve round the earth above the equator at a height of about 36,000 to
41,000 km, in the direction of the earth's rotation. They make one revolution
in 24 hours, synchronous with the earth's rotation (Fig. 2). As a result, they
appear stationary with respect to the earth.
These platforms always cover a specific area and give continuous coverage
over the same area day and night. Their coverage is limited to 70°N and 70°S
latitudes, and one satellite can view one-third of the globe. These are mainly
used for communication and weather monitoring. Some of these satellites are
the INSAT, METSAT and ERS series.
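The geostationary altitude can be checked from Kepler's third law, r^3 = GM T^2 / (4π^2), using standard physical constants (the values below are not from the text):

```python
import math

# Altitude of a geostationary orbit from Kepler's third law:
# r^3 = GM * T^2 / (4 * pi^2), then subtract the Earth's radius.
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
T = 86164.0          # one sidereal day, s
R_EARTH = 6.378e6    # equatorial radius of the Earth, m

r = (GM * T ** 2 / (4.0 * math.pi ** 2)) ** (1.0 / 3.0)
altitude_km = (r - R_EARTH) / 1000.0
print(altitude_km)  # ~35,786 km, close to the ~36,000 km figure quoted above
```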

SUN-SYNCHRONOUS SATELLITES:
Sun-synchronous satellites are the satellites which revolve round the earth in
a north-south direction (pole to pole) at a height of about 300 to 1000 km
(Fig. 2.1). They pass over places on earth having the same latitude twice in
each orbit at the same local sun-time; hence they are called sun-synchronous
satellites. Through these satellites the entire globe is covered on a regular
basis, giving repetitive coverage at periodic intervals. All the remote
sensing resources satellites may be grouped in this category. A few of these
satellites are the LANDSAT, IRS, SPOT series and NOAA, SKYLAB, SPACE SHUTTLE,
etc.

Passive and Active Sensors


PASSIVE AND ACTIVE SENSORS


Remote sensors are the instruments which detect various objects on the earth's
surface by measuring electromagnetic energy reflected or emitted from them.
The sensors are mounted on the platforms discussed above. Different sensors
record different wavelength bands of electromagnetic energy coming from the
earth's surface. For example, an ordinary camera is the most familiar type of
remote sensor, and it uses the visible portion of electromagnetic radiation.
Classification of Sensors

Remote sensors can be classified in different ways as follows.

1. On the Basis of Source of Energy Used: On the basis of source of energy


used by the sensors, they can be classified into two types - Active sensors and
Passive sensors.

2.3.1 ACTIVE SENSORS: Active sensors use their own source of energy, and
the earth's surface is illuminated by this energy.

A part of this energy is then reflected back and received by the sensor to
gather information about the earth's surface (Fig. 3).

When a photographic camera uses its flash, it acts as an active sensor. Radar
and laser altimeters are active sensors. Radar is composed of a transmitter
and a receiver. The transmitter emits a wave, which strikes objects and is
then reflected or echoed back to the receiver. The properties of an active
sensor are: 1) It uses both transmitter and receiver units to produce imagery,
and hence requires high energy levels. 2) It mostly works in the microwave
region of the EMR spectrum, which can penetrate clouds and is not affected by
rain. 3) It is an all-weather, day-night system, independent of solar
radiation. 4) The RADAR signal does not detect colour or temperature
information, but it can detect the roughness, slope and electrical
conductivity of the objects under study.

2.3.2. PASSIVE SENSORS:


Passive sensors do not have their own source of energy; the earth's surface is
illuminated by solar energy. The solar energy reflected from the earth's
surface, or the electromagnetic energy emitted by the earth's surface itself,
is received by the sensor (Fig. 3). A photographic camera is a passive sensor
when it is used in sunlight, without its flash. The properties of a passive
sensor are:
1) It is relatively simple, both mechanically and electrically, and it does
not have a high power requirement.
2) In wavebands where natural emittance or reflectance levels are low, high
detector sensitivities and wide radiation-collection apertures are necessary
to obtain a reasonable signal level; therefore, most passive sensors are
relatively wide-band systems.
3) It depends upon good weather conditions.

2. On the Basis of Function of Sensors: On the basis of function of sensors,


they are divided into two main types - Framing System and Scanning System.

a. Framing System: In a framing system, two-dimensional images are formed at
one single instant. Here, a lens is used to gather the light, which is passed
through various filters and then focused on a flat photosensitive target. In
an ordinary camera the target is the film emulsion, whereas in a vidicon
camera the target is an electrically charged plate.

b. Scanning System: In a scanning system, a single detector or a number of
detectors with a specific field of view sweeps across a scene in a series of
parallel lines, collecting data for contiguous cells to produce an image. The
Multispectral Scanner, Microwave Radiometer, Microwave Radar and Optical
Scanners are a few examples of scanning-system sensors.

3. On the Basis of Technical Components of the System: The sensors can be
classified into three categories on the basis of the technical components of
the system and the capability of detection. These are: 1) multispectral
imaging sensor systems, 2) thermal remote sensing systems, and 3) microwave
radar sensing systems. The multispectral or multiband imaging systems may use
conventional-type cameras, or a combination of cameras and scanners, for the
various bands of electromagnetic energy. For example, the Return Beam Vidicon
(RBV) sensor of Landsat uses both photographic and scanning systems, and is
similar to an ordinary TV camera. The thermal systems use radiometers,
photometers, spectrometers and thermometers to detect temperature changes,
whereas microwave sensing systems use antenna arrays for collecting and
detecting the energy from the terrain elements.
Resolution Concept: Spatial, Spectral, Radiometric Resolution

RESOLUTION CONCEPT:

1 SPATIAL RESOLUTION:

It is a measure of the smallest angular or linear separation between two
objects that can be resolved by the sensor. The greater the sensor's
resolution, the greater the data volume and the smaller the area covered. In
fact, the area coverage and resolution are interdependent, and these factors
determine the scale of the imagery.

Spatial resolution is a complex concept which can, for the purpose of
remote sensing of polar regions, be defined as the smallest object that can be
detected and distinguished from a point. The most frequently used measure,
based upon the geometric properties of an imaging system, is the instantaneous
field of view (IFOV) of a sensor. The IFOV is the area on the surface that is
theoretically viewed by the instrument from a given altitude at a given time.
The spatial resolution is usually determined by instrumental parameters and by
the height of the satellite above the ground. With the exception of active
microwave systems, the resolution of a system cannot be better than
approximately Hλ/D (the diffraction limit), where H is the height, λ is the
wavelength and D is the diameter of the objective lens, objective mirror or
antenna. This limit is typically of the order of 10 to 100 m for VIS and IR
systems operating from satellites in low orbits, and typically 1 to 10 km when
the satellite is geostationary. For passive microwave observations, the
resolution limit is much coarser (of the order of tens of km) because of the
longer wavelengths measured.
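The Hλ/D diffraction limit can be made concrete with a short sketch. The heights, wavelengths and aperture sizes below are illustrative choices, not tied to any specific instrument:

```python
# Diffraction-limited ground resolution ~ H * lambda / D,
# where H is platform height, lambda the wavelength, D the aperture diameter.
def diffraction_limit_m(height_m, wavelength_m, aperture_m):
    """Approximate best resolvable ground distance for an imaging system."""
    return height_m * wavelength_m / aperture_m

# Visible sensor in low orbit: H = 800 km, lambda = 0.5 um, D = 5 cm
print(diffraction_limit_m(800e3, 0.5e-6, 0.05))   # ~8 m
# Passive microwave: H = 800 km, lambda = 1.55 cm, D = 1 m
print(diffraction_limit_m(800e3, 0.0155, 1.0))    # ~12 km, i.e. the tens-of-km scale
```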

It was stated that the best achievable spatial resolution is of the order of
Hλ/D (except for some types of radar system), although some non-radar systems
may not reach this resolution because of other instrumental effects. Two
important examples are sensors in which the incoming radiation is focused on
to an array of discrete detecting elements, and photographic systems. The
detecting element or film imposes its own maximum resolution, again
proportional to the height H, and if this is poorer than the
diffraction-limited resolution, it will dominate.

The spatial resolution achievable by radar systems is very dependent on
the way the data from the system are processed. Such systems are often pulsed,
and one important factor is the length of the emitted pulse. Synthetic
aperture radars (SARs) also integrate the return signal for a period of time
while the radar is carried forward on its platform, and the integration time
also influences the resolution. It is not possible to give here a statement of
the general principles determining radar spatial resolution; the interested
reader is referred to the treatments given by Ulaby, Moore and Fung (1981 and
1982), Elachi (1987) and Rees (1990). Spatial resolution of an imaging system
can be measured in a number of different ways. It is the size of the smallest
object that can be discriminated by the sensor. The greater the sensor's
resolution, the greater the data volume and the smaller the area covered. In
fact, area coverage and resolution are interdependent, and these two factors
determine the scale of an imagery. Alternatively, spatial resolution can be
said to be the length of the side of the area on the ground represented by a
pixel in an image. The basis for the definition of spatial resolution can
depend on four criteria, namely: (i) the geometrical properties of the imaging
system, (ii) the ability to distinguish between point targets, (iii) the
ability to measure the periodicity of repetitive targets, and (iv) the ability
to measure the spectral properties of small targets (Mather, 1999).

Spatial resolution of any satellite sensor applies to the image produced by
the system, whereas the resolving power of any photograph applies to an
imaging system or a component of the system. As mentioned earlier, the most
commonly used measure for the spatial resolution of any sensor, based on the
geometric properties of the imaging system, is the Instantaneous Field of View
(IFOV). The IFOV is defined as the area on the ground that is viewed by an
instrument from a given altitude at any given instant of time. Fig. 2.3
illustrates the relationship between the swath width and the IFOV. The IFOV
can be measured in one of two ways: (i) by measuring the angle "a", and
(ii) by measuring the distance XY on the ground.
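For a small IFOV angle, the ground distance XY is simply the altitude times the angle in radians. A sketch using the Landsat MSS figures (an IFOV of 0.086 mrad from ~913 km altitude) as an illustrative, not source-quoted, example:

```python
# Ground cell size from the IFOV angle "a" and platform altitude:
# for a small angle (in radians), ground distance XY ~ altitude * angle.
def ground_ifov_m(altitude_m, ifov_rad):
    """Side of the ground area instantaneously viewed by the sensor."""
    return altitude_m * ifov_rad

# Landsat MSS: 0.086 mrad IFOV at ~913 km altitude (illustrative figures).
print(ground_ifov_m(913e3, 0.086e-3))  # ~79 m per ground cell
```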
2 SPECTRAL RESOLUTION:

It refers to the dimension and number of specific wavelength intervals in
the electromagnetic spectrum to which a sensor is sensitive. Narrow bandwidths
in certain regions of the electromagnetic spectrum allow the discrimination of
various features more easily. Temporal resolution, by contrast, refers to how
often a given sensor obtains imagery of a particular area; ideally, the sensor
obtains data repetitively to capture the unique discriminating characteristics
of the phenomena of interest.

It is the width of the spectral band and the number of spectral bands in which
the image is taken. Narrow bandwidths in certain regions of the
electromagnetic spectrum allow us to discriminate between the various features
more easily. Consequently, we need a larger number of spectral bands, each
having a narrow bandwidth, and these bands should together cover the entire
spectral range of interest. The digital images collected by satellite sensors,
except microwave sensing systems like Seasat, SIR-B and Radarsat, have been
multi-band or multispectral, with individual images separately recorded in
discrete spectral bands. Multispectral imaging refers to viewing a given area
in several narrow bands to obtain better identification and classification of
objects. Multistage imaging refers to observations of the same area from
different positions of the platforms (stereoscopic data). Multidate imaging
refers to observations made over the same area on different dates to monitor
objects like crop growth; this is also called temporal resolution. The term
spectral resolution refers to the width of the spectral bands. Spectral
resolution can be explained by considering two points:

(i) the position of the spectrum, width and number of spectral bands will
determine the degree to which individual targets can be determined on the
multispectral image, and (ii) the use of multispectral imagery can lead to a
higher degree of discriminating power than any single band taken on its own.

3 RADIOMETRIC RESOLUTION:

It is the capability to differentiate the spectral reflectance/emittance
from various targets. This depends on the number of quantization levels within
the spectral band. In other words, the number of bits of digital data in the
spectral band decides the sensitivity of the sensor.
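The bits-to-levels relation is simply 2^n. A minimal sketch (the 6-bit case matches early Landsat MSS data, mentioned here only as an illustrative aside):

```python
# Radiometric resolution: n bits of digital data give 2^n quantization
# levels within a spectral band.
def quantization_levels(bits):
    """Number of distinguishable brightness levels for an n-bit sensor."""
    return 2 ** bits

print(quantization_levels(6))   # 64 levels (e.g., early Landsat MSS data)
print(quantization_levels(8))   # 256 levels
print(quantization_levels(11))  # 2048 levels
```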

It is the smallest difference in exposure that can be detected in a given film
analysis. It is also the ability of a given sensing system to discriminate
between density levels. In general, radiometric resolution is inversely
proportional to contrast, so that a higher-contrast film is able to resolve
smaller differences in exposure. Low-contrast films have a greater radiometric
range, while high-contrast films have a smaller exposure range and a lower
radiometric range.

Pay Load Description of Important Earth Resources and Meteorological Satellites

PAY LOAD DESCRIPTION OF IMPORTANT EARTH RESOURCES AND METEOROLOGICAL
SATELLITES

1 EARTH RESOURCES SATELLITES

There are three distinct groups of earth resources satellites. The first
group of satellites records visible and near-visible wavelengths. The five
satellites of the Landsat series, which are the first generation earth
resources satellites, are a classic example of this group. The four IRS
satellites and the improved SPOT series may be considered the second
generation earth resources satellites of the same group. Group two satellites
carry sensors that record thermal infrared wavelengths and include the Heat
Capacity Mapping Mission satellites, namely the Explorer series. Group three
satellites are deployed with sensors that record microwave wavelengths. The
Seasat series and ERS are examples of this group.

2 LANDSAT SATELLITE PROGRAMME

The National Aeronautics and Space Administration (NASA) of the USA, with the
cooperation of the U.S. Department of the Interior, planned the launching of a
series of Earth Resources Technology Satellites (ERTS). ERTS-1 was launched by
a Thor-Delta rocket on July 23, 1972 and operated until January 6, 1978. It
represented the first unmanned satellite designed to acquire data about earth
resources on a systematic, repetitive, medium resolution, multispectral basis.
Subsequently, NASA renamed the ERTS programme the "Landsat" programme to
distinguish it from the series of meteorological and oceanographic satellites
that the USA launched later, and ERTS-1 was retrospectively named Landsat-1.
Five Landsat satellites have been launched so far, and this experimental
programme has evolved into an operational global resource monitoring
programme. Three different types of sensors have been flown in various
combinations on the five missions. These are the Return Beam Vidicon (RBV)
camera system, the Multispectral Scanner (MSS) system and the Thematic
Mapper (TM).
Characteristics of Landsat Satellites and Their Sensors:
3 SPOT SATELLITE PROGRAMME
France, Sweden and Belgium joined together and pooled their resources to
develop the Système Pour l'Observation de la Terre (SPOT), an earth
observation satellite programme. The first satellite of the series, SPOT-1,
was launched from the Kourou Launch Range in French Guiana on February 21,
1986 aboard an Ariane launch vehicle. This was the first earth resource
satellite system to include a linear array sensor employing the push broom
scanning technique. This enables side-to-side off-nadir viewing capabilities
and affords full-scene stereoscopic imaging from two different viewing points
of the same area. The high resolution data obtained from the SPOT sensors,
namely the High Resolution Visible (HRV) cameras, have been extensively used
for urban planning, urban growth assessment and transportation planning,
besides the conventional applications related to natural resources.
Characteristics of SPOT Satellite and HRV Sensor
4 INDIAN REMOTE SENSING SATELLITE (IRS)

The IRS mission envisages the planning and implementation of a satellite based
remote sensing system for evaluating natural resources. The principal
components of the mission are: a three-axis stabilised polar sun-synchronous
satellite with multispectral sensors; ground based data reception, recording
and processing systems for the multispectral data; ground systems for in-orbit
satellite control, including the tracking network with the associated
supporting systems; and hardware and software elements for the generation of
user oriented data products, data analysis and archival. The principal aim of
the IRS mission is to use the satellite data, in conjunction with
supplementary/complementary information from other sources, for the survey and
management of natural resources in important areas such as agriculture,
geology and hydrology, in association with the user agencies. The IRS series
of satellites comprises IRS-1A, IRS-1B, IRS-1C, IRS-1D and IRS-P4, apart from
other satellites launched by the Government of India. The orbital and sensor
characteristics of IRS-1A and 1B are the same, and IRS-1C and IRS-1D have
almost similar characteristics. IRS-P4 is an oceanographic satellite, and this
will be discussed in the next section. IRS has application potential in a wide
range of disciplines such as management of agricultural resources, inventory
of forest resources, geological mapping, estimation of water resources, study
of coastal hydrodynamics, and water quality surveying. The sensor payload
system consists of two push broom cameras (LISS-II) of 36.25 m resolution and
one camera (LISS-I) of 72.5 m resolution, employing linear Charge Coupled
Device (CCD) arrays as detectors. Each camera system images in four spectral
bands in the visible and near IR region. The camera system consists of
collecting optics, imaging detectors, in-flight calibration equipment, and
processing

devices. The orbital characteristics of the IRS-1A and 1B satellites and their
sensor capabilities are given in Table 4.3. As IRS-1D is the latest satellite
of the series, the system overview of IRS-1D is provided below.

The IRS-1D is a three-axis body stabilized satellite, similar to IRS-1C.
Since IRS-1C and 1D are similar in orbital characteristics and sensor
capabilities, the details of IRS-1D are discussed, as it is a very recent
satellite. It will have an operational life of three years in a near polar
sun-synchronous orbit at a mean altitude of 780 km. The payload consists of
three sensors, namely the Panchromatic camera (PAN), the Linear Imaging and
Self-Scanning sensor (LISS-III) and the Wide Field Sensor (WiFS). The
satellite is equipped with an On-Board Tape Recorder (OBTR) capable of
recording a limited amount of specified sensor data. Operation of each of the
sensors can be programmed.
The payload operation sequence for the whole day can be loaded daily onto
the on-board command memory when the satellite is within visibility range. The
ground segment consists of a Telemetry, Tracking and Command (TTC) segment
comprising a TTC network, and an image segment comprising data acquisition,
data processing and product generation systems along with a data dissemination
centre. The overview of the IRS-1D mission is to provide optimum satellite
operation and a mission control centre for mission management, spacecraft
operations and scheduling. The three sensors on board IRS-1D and IRS-1C are
described in the following paragraphs.
The panchromatic camera provides data with a spatial resolution of 5.2-5.8 m
(at nadir) and a ground swath between 63 km and 70 km (at nadir). It operates
in the 0.50-0.75 micron spectral band. This camera can be steered up to ±26
deg across the track, shifting the view up to ±398 km from nadir, which in
turn increases the revisit capability to 3 days for most of the cycle and 7
days in some extreme cases.
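The steering geometry can be sanity-checked with simple trigonometry: in a flat-Earth approximation the across-track offset is roughly H·tan(θ). With the altitude and steering angle quoted for the PAN camera this gives about 380 km; Earth curvature stretches the true figure toward the quoted 398 km:

```python
import math

# Flat-Earth check of the PAN camera's across-track steering:
# offset from nadir ~ altitude * tan(steering angle).
H_KM = 780.0  # mean orbital altitude quoted for IRS-1D, km
offset_km = H_KM * math.tan(math.radians(26.0))
print(offset_km)  # ~380 km from nadir (flat-Earth approximation)
```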

Meterological Satellites

METEOROLOGICAL SATELLITES:

Meteorological satellites, designed specifically to assist in weather
prediction and monitoring, generally incorporate sensors that have very coarse
spatial resolution compared to land-oriented systems. These satellites,
however, afford high frequency global coverage. The USA has launched multiple
series of meteorological satellites with a wide range of orbit and sensing
system designs. The first of these series is called NOAA, an acronym for the
National Oceanic and Atmospheric Administration. These satellites are in
near-polar, sun-synchronous orbits similar to those of Landsat and IRS. In
contrast, another series of essentially meteorological satellites, the
Geostationary Operational Environmental Satellite (GOES) series and the
Meteosat operated by the European Space Agency, are geostationary, remaining
in a constant relative position over the equator.

1 NOAA SATELLITES

Several generations of satellites in the NOAA series have been placed in
orbit. The satellites NOAA-6 through NOAA-10 carried the Advanced Very High
Resolution Radiometer (AVHRR). The even-numbered missions have a daylight
(7.30 A.M.) north-to-south equatorial crossing, and the odd-numbered missions
have a night-time (2.30 A.M.) north-to-south equatorial crossing. The basic
characteristics of these missions and the AVHRR instrument are listed in Table
4.8. Apart from routine climatological analyses, the AVHRR data have been used
extensively in studies of vegetation dynamics, flood monitoring, regional soil
moisture analysis, dust and sandstorm monitoring, forest wildfire mapping, sea
surface temperature mapping, and various geological applications, including
observation of volcanic eruptions and mapping of regional drainage and
physiographic features.

Details of NOAA Satellite and AVHRR Sensor Characteristics
2 GOES SATELLITES

The GOES programme is a cooperative venture between NOAA and NASA. The
Geostationary Operational Environmental Satellites (GOES) are part of a global
network of meteorological satellites spaced about 70° of longitude apart
around the world. GOES images are distributed in near real-time for use in
local weather forecasting. They have also been used in certain large-area
analyses such as regional snow cover mapping.
3 NIMBUS SATELLITES

NIMBUS-7 is one of the ocean monitoring satellites, launched in October 1978.
This satellite carries the Coastal Zone Colour Scanner (CZCS), designed
specifically to measure ocean parameters. The details of the six bands in
which the CZCS operates and the characteristics of the NIMBUS-7 satellite are
presented in Table 4.10. The CZCS has been used to measure sea surface
temperatures and to detect chlorophyll and suspended solids in near-shore and
coastal waters.
