Remote Sensing

The document provides a comprehensive overview of remote sensing, including its definition, components, applications, and the electromagnetic spectrum's role in data acquisition. It discusses the importance of spectral signatures, resolutions, and the interaction of electromagnetic energy with the atmosphere and Earth's surface. Additionally, it highlights the advantages and challenges of remote sensing, emphasizing its utility in various fields such as environmental monitoring, urban planning, and disaster management.


Remote Sensing, GNSS and GIS
Remote Sensing
Remote Sensing: An Overview
Definition: Remote sensing is the art and science of acquiring information
about the Earth's surface from a distance. Sensors on platforms such as
satellites and aircraft capture images, providing valuable data for mapping
and monitoring changes over time.
Components of Remote Sensing Systems:
Energy Source: Provides constant energy over all wavelengths at a known
rate. The ideal source is uniform and has high output.
Radiation and Atmosphere: The atmosphere can scatter and absorb
energy, affecting the signals reaching the sensors.
Interaction with Object: Objects interact differently with electromagnetic
radiation depending on their properties, which is critical for remote
sensing.
Transmission, Processing, and Ground Receiving Station: Sensors capture
reflected or emitted signals, convert them into digital form, and transmit
them to the ground station for processing.
Sensors and Data:
Sensors capture incoming, reflected, and emitted radiation, which is then
processed and stored at the ground station.
Different types of sensors provide data at various resolutions, used for
applications like land use mapping, urban planning, and environmental
studies.
Applications:
Remote sensing data is used in land use and land cover mapping, forestry,
agriculture, telecommunication, environmental monitoring, urban
planning, infrastructure development, emergency management, and more.
It enables mapping of changes over time, such as river shifts, urban
expansion, or damage assessment (e.g., post-tsunami).
Advantages:
Cost-effective and efficient for large-scale analysis.
Provides temporal data, allowing change detection over time.
Easily integrated with GIS for enhanced analysis and visualization.
Disadvantages:
Requires ground verification (10-15%) to ensure accuracy.
Digital methods can misclassify objects, requiring a good understanding of
algorithms.
History and Evolution:
Began in the 1960s with aerial platforms and evolved to satellite platforms.
Recently, UAV drones have brought back aerial remote sensing.
Data Analysis and Interpretation:
Satellite data can be used for creating various maps, analyzing urban
landscapes, and detecting critical changes over time.
Medium and high-resolution images can help in detailed infrastructure
mapping and urban planning.
Use Cases:
Example images showed changes in the Ganges River and damage from the
2004 tsunami, highlighting the importance of remote sensing in change
detection and disaster management.
Challenges in Interpretation:
Interpretation of remote sensing data is complex and requires
understanding of algorithms to prevent misclassification.

Electromagnetic Spectrum and Remote Sensing


1. Electromagnetic Radiation (EMR):
Definition: All objects above absolute zero emit EMR, consisting of electric
and magnetic fields that are perpendicular to each other, traveling in a sine
wave form.
Key Characteristics:
Wavelength: Distance between successive wave peaks.
Frequency: Number of peaks passing a point per unit time.
Inverse Relationship: Wavelength and frequency are inversely related
(c = λν, where c is the speed of light).
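The inverse relationship above can be sketched numerically; this is a minimal illustration using the standard value of the speed of light (the constant and example wavelengths are not from the notes):

```python
# Illustration of the inverse wavelength-frequency relationship c = lambda * nu.
# The speed of light and the example wavelengths are standard physical values,
# not taken from the notes.

C = 3.0e8  # speed of light in vacuum, m/s

def frequency_from_wavelength(wavelength_m: float) -> float:
    """Return frequency (Hz) for a given wavelength (m)."""
    return C / wavelength_m

# Red light (0.7 micrometers) vs. blue light (0.4 micrometers):
f_red = frequency_from_wavelength(0.7e-6)
f_blue = frequency_from_wavelength(0.4e-6)

# Shorter wavelength -> higher frequency.
assert f_blue > f_red
```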
2. Electromagnetic Spectrum:
Ranges from gamma rays (high frequency) to radio waves (low frequency).
Important Regions:
Visible Light: 0.4 to 0.7 micrometers; most common in remote sensing.
Infrared (IR): Divided into reflected IR and thermal IR.
Microwave: 1 mm to 1 meter; used for penetrating clouds and fog.
Radio Waves: 30 cm to thousands of meters; used in communication.
Unusable for Remote Sensing: Gamma rays and X-rays are absorbed by the
atmosphere.
3. Visible Spectrum and Sensors:
The visible spectrum spans the colors that can be composed from the
primary colors red, green, and blue.
Most remote sensing sensors operate in the visible spectrum, but some
also work in ultraviolet and microwave regions.
4. Interaction of EMR with Objects:
Absorption: Some energy is absorbed by the object.
Transmission: Energy passes through the object.
Reflection: Energy bounces off the object and is captured by sensors.
Scattering: Energy is deflected in different directions.
5. Bands in Remote Sensing:
Different parts of the spectrum are divided into "bands" (e.g., Band 1, Band
2) to simplify communication.
6. Applications and Uses:
Infrared: Useful for thermal studies and capturing temperature data.
Microwave: Useful for imaging through clouds and fog, both in active and
passive modes.
Ultraviolet: Absorbed by the ozone layer; limited use in remote sensing.
7. Energy Interaction with Earth Objects:
When EMR strikes an object, energy is divided into reflected, transmitted,
absorbed, and scattered components.
Incident Energy Equation: E_I = E_R + E_T + E_A + E_S
(incident = reflected + transmitted + absorbed + scattered)
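The energy budget above can be checked with a small sketch; the component values below are made-up illustrative numbers, not measurements from the notes:

```python
# Sketch of the incident-energy budget: reflected + transmitted + absorbed
# + scattered must account for all incident energy. Component values in the
# example are illustrative only.

def check_energy_budget(reflected: float, transmitted: float, absorbed: float,
                        scattered: float, incident: float,
                        tol: float = 1e-9) -> bool:
    """Return True if the four components sum to the incident energy."""
    return abs((reflected + transmitted + absorbed + scattered) - incident) < tol

# Example: 100 units of incident energy split among the four components.
assert check_energy_budget(reflected=40, transmitted=10, absorbed=35,
                           scattered=15, incident=100)
```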

Electromagnetic Energy Interaction with Atmosphere and Earth Surface


1. Energy Interaction with Matter:
Types of Energy Captured by Sensors:
Reflected Energy: Light reflected from surfaces.
Emitted Energy: Radiation emitted by objects.
2. Atmospheric Processes:
Absorption: Energy absorbed and converted into other forms.
Key Absorbing Agents: Ozone, CO₂, water vapor, methane.
Absorption Spectrum: Minimum absorption in the visible spectrum.
Reflection: Energy bounced off surfaces; the angle of reflection equals the
angle of incidence (refraction at the interface follows Snell's Law).
Scattering: Dispersal of light in multiple directions.
Factors Affecting Scattering: Wavelength, particle size, distance.
3. Colors in Nature and Man-Made Objects:
Natural Colors: Result from scattering (e.g., blue sky, red sunset).
Man-Made Objects: Visible due to reflection of light.
Types of Scattering
1. Rayleigh Scattering:
Occurs when particles are smaller than the wavelength.
Scattering intensity ∝ 1/λ⁴
Causes blue color of the sky due to higher scattering of shorter
wavelengths.
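The 1/λ⁴ dependence can be quantified with a short sketch; the blue and red wavelengths used are typical textbook values, not from the notes:

```python
# Rayleigh scattering intensity scales as 1/lambda^4. Comparing blue
# (0.45 um) and red (0.70 um) light shows why the clear sky looks blue.
# The wavelength choices are typical textbook values, not from the notes.

def relative_rayleigh(wl_a: float, wl_b: float) -> float:
    """How many times more strongly wavelength a scatters than wavelength b."""
    return (wl_b / wl_a) ** 4

ratio = relative_rayleigh(0.45e-6, 0.70e-6)

# Blue light is scattered several times more strongly than red light.
assert ratio > 5
```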
2. Mie Scattering:
Caused by larger particles (0.1 to 10 times wavelength size).
Influenced by dust, smoke, pollen, and water vapor.
Occurs mostly in the lower atmosphere.
3. Non-Selective Scattering:
Occurs when particle size is much larger than the wavelength.
All wavelengths scattered equally, resulting in white light.
Absorption and Scattering: Determine optimal wavelengths for remote
sensing.
Visible Spectrum: Best region for remote sensing due to minimal absorption
and high sensitivity of human eyes.
Important Keywords
Absorption, Reflection, Scattering, Rayleigh Scattering, Mie Scattering, Non-
Selective Scattering, Atmospheric Absorption Spectrum, Snell's Law, Visible
Spectrum, Ultraviolet Radiation, Greenhouse Gases, Energy Interaction with
Matter, Reflected and Emitted Energy
Atmospheric Window and Black Body

1. Introduction to Wavelength Regions in Remote Sensing:


Remote sensing relies on specific wavelength regions to effectively study and
analyze objects.
The choice of wavelength is crucial to minimize energy loss due to absorption,
scattering, or transmittance.
2. Black Body Radiation:
Black Body: An ideal body that absorbs all incident electromagnetic radiation
without any transmission or reflection.
Emits thermal radiation depending on temperature and wavelength.
The sun and the Earth approximately behave like black bodies.
As the temperature of a black body increases, the peak of its emitted energy
shifts toward shorter wavelengths, from the infrared through the visible
toward the ultraviolet part of the spectrum.
3. Key Laws Governing Black Body Radiation:
Wien’s Displacement Law: The peak wavelength of radiation emitted by a black
body is inversely proportional to its temperature.
Higher temperatures result in peak emission at shorter wavelengths.
Planck’s Law: Describes the spectral density of electromagnetic radiation
emitted by a black body in thermal equilibrium.
Kirchhoff's Law: At a given temperature, the ratio of emitted to absorbed
energy is the same for all bodies; in thermal equilibrium, emissivity equals
absorptivity.
Defines the emissivity of an object, a measure of its efficiency in emitting
radiation.
Emissivity of a black body = 1 (perfect emitter).
Stefan-Boltzmann Law: Total energy emitted by a black body is directly
proportional to the fourth power of its absolute temperature.
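Wien's Displacement Law and the Stefan-Boltzmann Law can be applied directly to approximate black bodies; the Sun (~5800 K) and Earth (~288 K) temperatures and the physical constants below are standard values, not from the notes:

```python
# Wien's displacement law (lambda_max = b / T) and the Stefan-Boltzmann law
# (M = sigma * T^4) applied to the Sun and the Earth treated as black bodies.
# Constants and temperatures are standard physical values, not from the notes.

WIEN_B = 2.898e-3  # Wien's displacement constant, m*K
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def peak_wavelength(temp_k: float) -> float:
    """Peak emission wavelength (m) of a black body at temperature temp_k."""
    return WIEN_B / temp_k

def total_emittance(temp_k: float) -> float:
    """Total emitted power per unit area (W/m^2) of a black body."""
    return SIGMA * temp_k ** 4

sun_peak = peak_wavelength(5800)   # ~0.5e-6 m: visible light
earth_peak = peak_wavelength(288)  # ~10e-6 m: thermal infrared

# The Sun peaks in the visible and the Earth in the thermal IR, which is
# why reflective and thermal sensors are designed for different bands.
assert 0.4e-6 < sun_peak < 0.7e-6
assert 8e-6 < earth_peak < 14e-6
```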
4. Atmospheric Windows:
Definition: Wavelength regions where electromagnetic radiation can pass
through the atmosphere with minimal absorption and scattering.
These regions are crucial for effective remote sensing.
Key Atmospheric Windows:
Visible Part of the Spectrum (0.4 - 0.7 µm): Best for sensors capturing
reflected light.
Infrared Part (0.7 - 15 µm): Useful for thermal infrared imaging, detecting
thermal changes.
Microwave Part (0.1 - 100 cm): Suitable for studying surface properties,
even in cloudy conditions or during nighttime.
5. Application of Laws in Remote Sensing:
By understanding black body radiation and laws like Wien’s Displacement and
Planck’s Law, we can:
Determine optimal wavelength regions for sensors.
Maximize information content by selecting appropriate spectral bands.
6. Selecting Wavelengths for Remote Sensing:
Visible Region: Used for visual interpretation and mapping, where absorption
and scattering are minimum.
Infrared Region: Identifies vegetation, soil moisture, and temperature changes.
Microwave Region: Penetrates through clouds and vegetation, used for radar
sensing.
7. Importance of Emissivity:
Determines how different objects interact with radiation, crucial for selecting
sensor bands and analyzing remote sensing data.
8. Key Takeaways for GATE Preparation:
Understand the concepts of black body radiation, laws related to radiation
(Wien’s Displacement, Planck’s, Kirchhoff's, and Stefan-Boltzmann).
Focus on atmospheric windows and their importance in remote sensing.
Remember the specific wavelength ranges that are useful in remote sensing
studies (visible, infrared, and microwave regions).
Key Points/Keywords:
Black Body Radiation, Emissivity, Wien’s Displacement Law, Planck’s Law,
Kirchhoff's Law, Stefan-Boltzmann Law, Atmospheric Windows, Visible
Spectrum, Infrared Spectrum, Microwave Spectrum, Remote Sensing
Applications, Thermal Imaging, Sensor Wavelength Selection.

Spectral Signature in Remote Sensing

1. Definition of Spectral Signature:


Spectral Signature: Unique pattern of reflectance or emittance of an object
across different wavelengths.
Used to identify objects in remote sensing based on their spectral reflectance
characteristics.
2. Importance of Spectral Reflectance:
Spectral Reflectance: Ratio of reflected energy to incident energy, expressed
as a percentage.
Different objects have different spectral reflectance characteristics
depending on their physical and chemical properties.
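The ratio definition above can be expressed as a one-line sketch; the sample energy values are illustrative, not measurements from the notes:

```python
# Spectral reflectance = reflected energy / incident energy, expressed as a
# percentage. The example values are illustrative only.

def reflectance_percent(reflected: float, incident: float) -> float:
    """Spectral reflectance as a percentage of incident energy."""
    if incident <= 0:
        raise ValueError("incident energy must be positive")
    return 100.0 * reflected / incident

# E.g., 45 units reflected out of 100 incident -> 45% reflectance.
assert reflectance_percent(45.0, 100.0) == 45.0
```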
3. Factors Affecting Spectral Reflectance:
Wavelength Variation: Reflectance varies with different wavelengths, affecting
how objects appear.
Object Properties: Physical and chemical composition, roughness, moisture
content, temperature.
Temporal Factors: Time of day (diurnal cycles), seasons, and environmental
conditions impact reflectance.
4. Multispectral Imaging:
Utilizes multiple wavelength regions to detect variations in reflectance
patterns.
Helps differentiate objects that may appear similar in one wavelength but
distinct in others.
Application: Identification of vegetation, water bodies, soil types, etc.
5. Measuring Spectral Reflectance:
Spectroradiometers: Instruments used to measure spectral reflectance; can
be field-based or mounted on UAVs or aircraft.
Calibration with a white reflectance plate ensures accurate readings.
6. Spectral Reflectance Curve:
Curve Characteristics: Shows reflectance values of objects over various
wavelengths.
Used to differentiate objects like snow, vegetation, soil, and water.
7. Key Regions for Remote Sensing:
Visible Spectrum (0.4 - 0.7 µm): Best for visual inspection and identifying
general features.
Near Infrared (0.7 - 1.4 µm): Effective for identifying healthy vs. diseased
vegetation, high vs. low turbidity water.
Infrared (1.4 - 3.0 µm): Good for soil and vegetation studies, identifying plant
growth, and analyzing soil moisture.
8. Practical Applications:
Vegetation Analysis: Differentiate healthy and stressed vegetation using the
near-infrared region.
Water Studies: Identify clear vs. turbid water in the near-infrared region.
Soil Mapping: Use spectral curves to identify soil types and characteristics.
Environmental Monitoring: Detect pollutants, contaminants, and changes in
land cover.
9. Instrumentation Techniques:
Use of light, portable spectroradiometers for field data collection.
Calibration of instruments to minimize errors due to sun elevation and other
factors.
Timing: Morning or late afternoon measurements to avoid shadow effects.
10. Designing Remote Sensing Sensors:
Sensors designed to operate in specific spectral bands for optimal data
collection.
Spectral reflectance values guide sensor calibration and hyperspectral data
analysis.

Key Points/Keywords:
Spectral Signature, Spectral Reflectance, Multispectral Imaging, Reflectance
Curve, Near Infrared, Visible Spectrum, Infrared Spectrum, Vegetation
Analysis, Water Quality Detection, Soil Mapping, Spectroradiometers, Remote
Sensing Sensors, Calibration, Hyperspectral Data, Land Cover Monitoring.
Resolutions in Remote Sensing

1. Overview of Resolutions in Remote Sensing:


Resolution: Defines the ability to distinguish objects or details in an image.
Four types of resolution: Spectral, Spatial, Temporal, Radiometric.
2. Spectral Resolution:
Definition: Ability of a sensor to distinguish different wavelengths.
Characteristics: Number and width of wavelength bands used (e.g., blue, green,
red, near-infrared).
Higher Spectral Resolution: Narrower bands, more detailed information (useful
for distinguishing vegetation types, water absorption).
Application: Identifying objects with unique spectral signatures, thematic
mapping.
3. Spatial Resolution:
Definition: Size of the smallest object that can be resolved; defined by pixel
size.
High Spatial Resolution: Smaller pixels, better detail (e.g., distinguishing
buildings, road junctions).
Low Spatial Resolution: Larger pixels, less detail (e.g., general land use
patterns).
Examples:
Landsat (30 m, 80 m),
Aster (15 m),
QuickBird (sub-meter).
Application: Differentiating small features such as buildings, vegetation
patches.
4. Temporal Resolution:
Definition: Frequency at which a sensor revisits the same location.
High Temporal Resolution: Frequent observations (useful for monitoring
dynamic changes like urban growth, deforestation).
Low Temporal Resolution: Infrequent observations (useful for long-term
monitoring).
Examples:
Landsat (16 days),
Geostationary weather satellites (continuous coverage),
Sun-synchronous satellites (daily to monthly intervals).
Application: Monitoring changes over time (e.g., seasonal vegetation growth,
urban expansion).
5. Radiometric Resolution:
Definition: Sensitivity of the sensor to detect small differences in energy
levels.
Higher Radiometric Resolution: More gray levels (e.g., 8-bit = 256 levels, 16-bit
= 65,536 levels).
Lower Radiometric Resolution: Fewer gray levels, less detailed variations (e.g.,
7-bit = 128 levels).
Application: Capturing subtle differences in reflectance for applications like
vegetation health analysis.
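The bit-depth-to-gray-level relationship is simply 2 to the power of the bit depth; this sketch reproduces the figures given in the notes:

```python
# Radiometric resolution: an n-bit sensor quantizes energy into 2**n gray
# levels. The bit depths below match the examples in the notes.

def gray_levels(bits: int) -> int:
    """Number of distinguishable intensity levels for a given bit depth."""
    return 2 ** bits

assert gray_levels(7) == 128
assert gray_levels(8) == 256
assert gray_levels(16) == 65536
```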
6. Key Characteristics and Their Applications:
Color Composite Images: Combining different spectral bands to create more
informative images (e.g., using red, green, blue bands).
Identifying Objects: Urban areas appear in bluish tints, vegetation in red or
orange in false-color composites.
Mapping and Classification: Combining data from various resolutions to
identify and classify land cover, urban features, and vegetation types.
7. Guidelines for Selecting Remote Sensing Data:
For Thematic Mapping: Use high spectral and spatial resolution.
For Monitoring Changes: High temporal resolution is preferred.
For Detail Analysis: High radiometric resolution to detect minor variations.
8. Practical Examples:
Landsat Thematic Mapper: 28.5 m spatial resolution; useful for land cover
classification.
Aster Data: 15 m spatial resolution; useful for detailed analysis.
SPOT Satellite: Flexible view angles; can observe the same area from different
perspectives.
Geostationary Satellites: Continuous monitoring, ideal for weather
observation.
9. Summary of Resolution Selection:
Choose the right combination of spectral, spatial, temporal, and radiometric
resolutions based on the specific application:
Vegetation Studies: High spectral and spatial resolution.
Urban Monitoring: High spatial and temporal resolution.
Climate Change Studies: High temporal resolution.
10. Key Points/Keywords:
Resolution, Spectral Resolution, Spatial Resolution, Temporal Resolution,
Radiometric Resolution, Pixel Size, Wavelength Bands, Color Composite,
Thematic Mapping, Satellite Imagery, Geostationary Satellite, Sun-
Synchronous Satellite, Reflectance Levels.

Multi-Concepts in Remote Sensing

1. Introduction to Multi-Concept in Remote Sensing:


Definition: Multi-concept involves using multiple approaches, datasets, or
methods in remote sensing for various applications.
Applications: Multi-hazard analysis, multi-phase analysis, and engaging
multiple stakeholders in decision-making.
2. Multi-Stage Remote Sensing:
Definition: Collection of data from multiple platforms (ground, drones,
low/high altitude aircraft, satellites).
Purpose: Provides data at different resolutions, scales, and coverage areas.
Challenge: Integrating diverse datasets on a common platform for analysis.
3. Multi-Resolution Images:
Definition: Use of images at different resolutions to obtain information of
varying detail levels.
High Resolution (0.5 m): Provides detailed information (Level 4); Low
Resolution gives broader categories (Level 1).
Application: Tailored selection of resolution based on specific study
requirements.
4. Multi-Band Remote Sensing:
Definition: Use of multiband images to capture different spectral reflectance
characteristics.
Use Cases: Creating indices (e.g., vegetation indices), color composites.
Advantage: Enhanced clarity and information content compared to single-
band images.
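The notes mention creating vegetation indices from multiband images; a widely used example (not named in the notes) is the Normalized Difference Vegetation Index, NDVI = (NIR − Red) / (NIR + Red). The reflectance values below are illustrative:

```python
# NDVI, a common index built from two spectral bands. It is given here as an
# example of the "vegetation indices" the notes mention; the reflectance
# values in the examples are illustrative, not from the notes.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in NIR and absorbs red light:
assert ndvi(nir=0.50, red=0.08) > 0.5
# Water absorbs NIR, giving low or negative NDVI:
assert ndvi(nir=0.02, red=0.05) < 0
```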
5. Multi-Sensor Remote Sensing:
Definition: Combining data from multiple sensors (e.g., multispectral,
hyperspectral, RGB sensors).
Data Fusion: Fusing multi-sensor data with panchromatic data for detailed
analysis.
Application: Identifies ground features more clearly by leveraging the
strengths of different sensors.
6. Data Fusion Techniques:
Definition: Combining data from different satellites/sensors to maximize
information.
Process: Geocoding and fusing data sets for comprehensive analysis on a
single platform.
7. Multi-Temporal Remote Sensing:
Definition: Use of images from different times to monitor changes over
periods.
Application: Assessing damage (floods, fires, earthquakes), predicting future
changes, monitoring critical areas (e.g., flood protection).
8. Multi-Directional Remote Sensing:
Definition: Observing an area from different angles to create a 3D model.
Application: Enables 3D visualization and analysis for applications such as
urban planning.
9. Multi-Disciplinary Approach:
Definition: Collaboration of experts from different domains for data
interpretation (e.g., smart city planning, transportation alignment).
Purpose: Enhances analysis by integrating knowledge from multiple fields.
10. Multi-Thematic Maps:
Definition: Creation of various thematic maps (e.g., water, vegetation, land use,
agriculture, urban) from the same satellite data.
Application: Useful for comprehensive resource management and planning.
11. Utility of Multi-Concepts in Remote Sensing:
Purpose: Helps in selecting the appropriate satellite images based on
application needs, balancing factors like resolution, cost, and data availability.
Use Cases: Resource exploration, incident mapping, site selection, engineering
design.
12. Advantages and Challenges:
Advantages: Provides a holistic view, enhances data accuracy, supports
diverse applications.
Challenges: Data handling complexity, need for data fusion and integration,
higher costs for high-resolution data.
13. Key Points/Keywords:
Multi-Stage, Multi-Resolution, Multi-Band, Multi-Sensor, Data Fusion, Multi-
Temporal, Multi-Directional, Multi-Disciplinary, Multi-Thematic, Remote
Sensing, Satellite Imagery, 3D Modeling, Data Integration, Hazard Analysis,
Resource Management.

Remote Sensing - Satellite Orbits

1. Types of Orbits in Remote Sensing:


Earth Observing Satellites: Utilize polar and geosynchronous orbits.
Attributes: Differ by altitude, inclination, and orbital plane angle relative to the
North-South Line and the equator.
2. Polar Orbits (Sun-Synchronous Orbits):
Characteristics:
Low altitude, sun-synchronized, move from pole to pole.
Provide global coverage at regular intervals.
Designed to pass over the same location at the same time each day.
Function:
Capture sunlit parts of the Earth using passive sensors.
Active sensors are used when covering the dark parts of the Earth.
Application: Ideal for continuous global monitoring, environmental mapping,
and data collection.
Swath Width:
Width covered by satellite on Earth's surface; varies with orbit type and
sensor.
Determines the size of the scene captured; narrower swath widths mean
higher resolution.
3. Geosynchronous Orbits:
Characteristics:
High Earth orbits that match Earth’s rotation speed, remaining fixed over
one point.
Located in the equatorial plane.
Function:
Constant monitoring of the same area.
Medium to high-resolution images provided at regular intervals (every few
days).
Application: Used for weather forecasting, climate monitoring, disaster
management, communication, etc.
4. Polar Orbiting Satellites:
Details:
Provide full global coverage by rotating from North to South and vice
versa.
Utilize the sun as an illumination source for passive sensors during their
sunlit pass.
Can achieve frequent imaging with multiple satellites (reduces temporal
data gaps).
Image Overlap: Near the poles, successive ground tracks converge and overlap
strongly, allowing more frequent coverage.
5. Swath Width and Satellite Path:
Definition: Swath width is the width of Earth's surface covered by a satellite in
one pass.
Path and Row Numbers:
Unique identifiers used to grid the world for satellite coverage.
Important for selecting the correct satellite data for a specific region.
6. Satellite Life and Successive Launches:
Typical Life Span: 5 years, but many exceed this; replacement satellites must
be ready before the previous one's expiry.
7. Geosynchronous Satellites:
Details:
Provide continuous data at fixed intervals (e.g., every 15-30 minutes).
Positioned far from Earth in the equatorial plane.
Application Areas: Weather monitoring, disaster management, climate studies,
communication.
8. Utility of Satellite Data:
Polar Orbiting Satellite Uses:
Earth resource mapping, environmental monitoring, application
development.
Geosynchronous Satellite Uses:
Real-time monitoring, weather forecasting, climate change studies, disaster
management.
9. Key Points/Keywords:
Polar Orbit, Geosynchronous Orbit, Sun-Synchronous, Swath Width, Path and
Row Numbers, Passive Sensors, Active Sensors, Remote Sensing Applications,
Earth Observation Satellites, Real-Time Monitoring, Resource Mapping,
Climate Monitoring, Disaster Management, Satellite Life Span.

Various Sensors
1. Introduction to Remote Sensing Sensors:
Function: Sensors on satellites capture refracted, reflected, emitted, and
backscattered radiation from the Earth's surface.
Data Types: Provide multi-resolution and multispectral data; convert analog
signals to digital signals.
2. Digital Conversion and Reflectance:
Digital Signals: Analog signals are converted into digital numbers (DN)
represented as bits and bytes.
Reflectance Factors: Vary based on time of day, season, atmospheric
conditions, and object characteristics; used to identify changes in objects.
3. Data Transmission and Processing:
Ground Station: Digital signals are transmitted to a ground receiving station,
processed, segmented, and made available for user demand.
4. Instantaneous Field of View (IFOV):
Definition: Area observed by a sensor from a specific height.
Relationship with Altitude: IFOV increases with sensor height; affects
resolution (ground area covered by one pixel).
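The IFOV-altitude relationship can be sketched with the small-angle approximation: the ground-resolution-cell diameter is roughly altitude times IFOV (in radians). The IFOV and altitude values below are illustrative assumptions, chosen to land near a Landsat-class 30 m pixel:

```python
# Ground footprint of a single detector: for a small IFOV (radians), the
# ground-resolution-cell diameter is approximately H * IFOV, where H is the
# platform altitude. The numeric values below are illustrative assumptions.

def ground_cell_size(altitude_m: float, ifov_rad: float) -> float:
    """Approximate diameter (m) of the ground area seen by one detector."""
    return altitude_m * ifov_rad

# E.g., an IFOV of 0.043 mrad from 705 km altitude gives a ~30 m cell.
cell = ground_cell_size(705e3, 0.043e-3)
assert 25 < cell < 35
```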
5. Digital Image Characteristics:
Pixels: Comprise digital images, defined by path number, row number, and
digital number.
Intensity Levels: Pixel intensity stored in binary format; e.g., 8-bit data
gives 256 levels of contrast (black to white); more bits allow finer
intensity discrimination.
6. Sensor Types Based on Height:
Platform Variability: Different satellites operate at varying altitudes, offering
different resolutions.
Sensor Types:
Monochrome, Multispectral, Hyperspectral Sensors: Capture data across
various wavelengths.
7. Sensor Classification Based on Function:
Active Sensors: Provide their own illumination (e.g., Synthetic Aperture
Radar, microwave radar, laser scanners).
Passive Sensors: Depend on external illumination, like sunlight (e.g., TV
cameras, radiometers).
8. Active vs. Passive Sensors:
Active Sensors:
Operate independently of sunlight.
Examples: Microwave radar, laser scanners.
Passive Sensors:
Depend on sunlight.
Types: Scanning type, imaging type.
9. Sensor Classification Based on Scanning Mechanism:
Scanning Types:
Whisk Broom (Across Track): Rotating mirror covers entire swath width.
Push Broom (Along Track): Less distortion, captures more data; commonly
used.
Digital Frame Camera: Uses CCDs for different wavelengths.
10. Imaging vs. Non-Imaging Sensors:
Imaging Sensors: Used for topographical mapping, ocean surface mapping.
Non-Imaging Sensors: Used for sonar and acoustics.
11. Sensor Selection Criteria:
Considerations: Based on source of illumination, scanning mechanism, and
wavelength region.
Applications: Suitable sensors are selected for atmospheric and radiometric
corrections.
12. Key Points/Keywords:
Remote Sensing Sensors, Digital Conversion, Instantaneous Field of View
(IFOV), Active Sensors, Passive Sensors, Whisk Broom Scanner, Push Broom
Scanner, Hyperspectral, Multispectral, Monochrome, Imaging and Non-
Imaging Sensors, Radiometric Correction, Atmospheric Correction.

Sensors and Platforms

1. Introduction to Remote Sensing Satellites and Sensors:


Satellite Sensors: Different types of sensors used on satellites like
hyperspectral sensors, microwave radiometers, and synthetic aperture radars
for various applications.
Sensor Categorization: Based on platforms (ground-based, airborne, satellite,
space-based) and their purpose (weather monitoring, urban mapping, water
monitoring, etc.).
2. Categories of Sensors:
Hyperspectral Sensors, Microwave Radiometers, and Synthetic Aperture
Radars: Provide a large number of images within narrow spectral ranges.
Low to High-Resolution Sensors: Sensors are categorized into low, high, very
high, and very low resolutions based on their application.
3. First Remote Sensing Satellites:
ERTS-1 (Earth Resources Technology Satellite): First remote sensing satellite
launched in 1972 by NASA, later managed by NOAA for multiple applications.
4. Landsat Satellites Overview:
Landsat Series: Launched in 1972; several subsequent versions:
Landsat 1: Launched in 1972; completed 14.5 orbits per day, covered a
swath of 185 km.
Landsat 2, 3, 4, 5: Launched with improved sensors; Landsat 3 in 1978,
Landsat 4 in 1982, and Landsat 5 in 1984.
Landsat 7, 8, 9: Newer satellites launched in 1999, 2013, and 2021 with
further sensor advancements.
5. Sensor Systems on Landsat Satellites:
Initial Sensors:
Return Beam Vidicon (RBV) Cameras: Used in Landsat 1, 2, 3.
Multispectral Scanner (MSS): Provided multiple spectral bands.
Thematic Mapper (TM): Operated in 7 spectral bands; improvements seen
in ETM+ sensors on Landsat 7.
Landsat 7 ETM+: Included 8 bands (one panchromatic) providing high-
resolution data at 15 meters.
Landsat 8 OLI (Operational Land Imager): Operates in 9 bands, better suited
for vegetation and soil analysis.
6. Characteristics of Landsat Data:
Data Resolution and Usage:
Scene Size: 185 km x 185 km; covers large areas for comprehensive
monitoring.
Radiometric Resolution: Landsat TM data has 256 DN values, helping
distinguish vegetation types and growth stages.
Thermal Infrared Bands: Useful for thermal mapping and identifying
surface temperatures.
7. Applications of Landsat Data:
Vegetation and Urban Mapping: Identify different vegetation covers, urban
areas, and water bodies.
Temporal Change Analysis: Detect changes such as urbanization or
deforestation over time using historical data.
8. Data Availability and Access:
Free Data Access: Data older than two years is freely available for research
and education, distributed by the USGS (United States Geological Survey).
9. Key Points/Keywords:
Remote Sensing, Hyperspectral Sensors, Microwave Radiometers, Synthetic
Aperture Radar, Ground-Based Sensors, Airborne Sensors, Satellite Sensors,
Space-Based Sensors, Landsat Series, Return Beam Vidicon (RBV),
Multispectral Scanner (MSS), Thematic Mapper (TM), Enhanced Thematic
Mapper Plus (ETM+), Operational Land Imager (OLI), Radiometric Resolution,
Panchromatic Data, Temporal Resolution, Thermal Infrared Bands, Urban
Mapping, Free Data Access.
Remote Sensing Sensors and Platforms
1. SPOT Satellites Overview:
SPOT Satellites: Sun-synchronous polar satellites with an inclination of
about 98 degrees.
HRV Sensors: High-Resolution Visible (HRV) sensors on SPOT 1, 2, 3; additional
vegetation sensors on SPOT 4 and 5.
Dual-Sensor Scanning: Provides wider swath width; allows simultaneous multi-
spectral imaging in three bands.
2. Image Characteristics and Improvements:
Push-Broom Sensor System: Linear array sensors provide continuous imaging
along the satellite's track.
Resolution: Panchromatic images improved to 10m x 10m; useful for thematic
mapping and topographical maps.
SPOT 4 and 5: Provide multispectral data and panchromatic data; launched at
lower altitude (694 km) for better revisit time.
3. SPOT Satellite Features:
High Spatial Resolution: Panchromatic data at 10m resolution; supports
1:50,000 scale mapping.
Steering Capability: Allows the satellite to change view angles, increasing
temporal resolution to 3 days.
Stereo Imaging: Provides elevation models, contours, and updated
topographical maps.
4. Applications of SPOT Satellite Data:
False Color Composite Images: Useful for detecting vegetation, water bodies,
urban areas, and disasters (e.g., forest fires).
Sequential Imaging: Used for monitoring changes, disasters, land use,
agriculture, forestry, and water management.
5. Indian Remote Sensing (IRS) Satellites:
IRS 1A (1988): First satellite in the IRS series; polar orbit at 904 km, 22-day
repeat cycle.
IRS 1B (1991): Two cameras for visible and infrared light; multispectral data.
IRS 1C (1995) and IRS 1D (1997): Three cameras, panchromatic sensor;
improved resolution and Wide Field Sensor (WiFS).
IRS P6 (2003): LISS-IV camera with better resolution, covering 25 km x 25 km
swath.
6. Cartosat Series:
Cartosat: Panchromatic camera with 2.5 m resolution; used for topographical,
cadastral, and land boundary mapping.
Data Usage: Survey of India uses Cartosat data for creating and updating
topographical maps.
7. Applications and Advancements in IRS Satellites:
Ocean Sat: Focus on ocean characteristics; provides chlorophyll content, snow
cover, geology, minerals, vegetation indices.
IRS P4, P6, P5 Series: Specific uses like oceanography, agriculture,
cartography, and digital elevation models.
Improved Sensors: Better resolution and reduced repeat periods for enhanced
monitoring of resources.
8. Other Polar Satellites:
Other Satellites: Include ADEOS, Terra, Aqua, NOAA, AVHRR, MODIS, and
POLDER for resource mapping.
Key Points/Keywords:
SPOT Satellites, Sun-Synchronous Orbit, HRV Sensors, Push-Broom Sensor,
Panchromatic Imaging, Multispectral Data, Steering Capability, Stereo Imaging,
IRS Satellites, LISS-IV Camera, Cartosat, Topographical Maps, Ocean Sat,
Vegetation Index, Digital Elevation Model, Terra, Aqua, MODIS, POLDER.

Very High Resolution Remote Sensing Data


1. Overview of High Resolution Remote Sensing:
Definition: Very high resolution (VHR) remote sensing data refers to satellite
images with a spatial resolution of less than 5 meters.
Satellites: Examples include Indian satellites and commercial ones like Spot 5
(5m) and QuickBird (0.5m).
2. Commercial High-Resolution Satellites:
IKONOS (1999): Polar, sun-synchronous, collects data with 1 m panchromatic
and 4 m multispectral resolution.
Stereo mode for 3D mapping.
Used for utility mapping and large-scale map production (e.g., 1:5000,
1:4000).
QuickBird 2 (2001): 61 cm panchromatic, 2.4 m multispectral resolution.
Covers 100,000 sq km/day.
Useful for detailed vegetation and topographic mapping.
3. Other High-Resolution Satellites:
OrbView Series (1997–2003): Earlier OrbView missions carried atmospheric and
ocean-colour instruments; OrbView-3 (2003) provided 1 m panchromatic imagery.
WorldView 1, 2, 3, 4:
WorldView 3 (2014): Highest resolution at 31 cm panchromatic, with 8
multispectral bands and additional short-wave infrared bands.
WorldView 2 (2009): First commercial high-resolution satellite to offer 8 multispectral bands.
Used for civilian and military applications, 3D data with stereo capabilities.
4. Applications of High-Resolution Data:
Mapping and Planning:
Large-scale map creation, e.g., 1:5000 scale.
Mapping individual buildings, road networks, and vegetation.
Change Detection & Modeling:
Change analysis, stormwater runoff modeling, and urban planning.
Identification of pervious and impervious surfaces.
Topographical Mapping:
Contour mapping and elevation data used for topographical maps.
5. Advantages of High-Resolution Satellite Data:
Detailed Feature Identification:
Buildings, roads, cars, vegetation, and even object height.
Provides valuable and timely information.
Applications in Emergency Response:
Natural disaster modeling and response planning.
6. Panchromatic and Multispectral Data Fusion:
Pan-Sharpening: Merging panchromatic and multispectral data enhances
resolution and improves mapping accuracy.
7. Key Features of High-Resolution Satellites:
Spatial Resolution: Ranges from 5m to 31cm (e.g., WorldView 3).
Radiometric Resolution: Improved precision (e.g., QuickBird, WorldView).
Stereoscopic Capability: Allows for 3D mapping and height determination.
Key Points/Keywords:
High Resolution Data, VHR, IKONOS, QuickBird, WorldView, OrbView,
Panchromatic, Multispectral, Pan-sharpening, Spatial Resolution, Radiometric
Resolution, Stereoscopic Imagery, Mapping, Change Detection, Emergency
Response, Topographical Maps, Stormwater Runoff Modeling.
Thermal, Microwave, and Hyperspectral Remote Sensing
1. Thermal Imaging Overview:
Thermal Imaging: Detects temperature differences between features; brighter
areas are hotter.
Thermal Infrared Sensors: Operate in the 8–14 µm range; minimal atmospheric
losses.
Applications: Law enforcement, fire rescue, security, sea surface temperature
(SST), and volcanic activity.
2. Thermal Image Features:
Temperature Sensitivity: Thermal sensors capture emitted heat from natural
and man-made objects, distinguishing materials based on temperature.
Sea Surface Temperature (SST): Used for weather forecasting, climate studies,
and identifying thermal pollution (e.g., volcanic areas).
Sensor Examples: Landsat, Terra ASTER, NOAA AVHRR, Terra MODIS, Sentinel 3
SLSTR.
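Thermal sensors such as those listed above record spectral radiance, which is converted to at-sensor brightness temperature with the inverse Planck relation T = K2 / ln(K1 / L + 1). The sketch below assumes calibration constants close to the published Landsat 8 TIRS Band 10 values; in practice K1 and K2 come from the scene metadata:

```python
import math

# Illustrative calibration constants (approximate Landsat 8 TIRS Band 10
# values; real values must be read from the scene metadata file).
K1 = 774.89   # W / (m^2 * sr * um)
K2 = 1321.08  # Kelvin

def brightness_temperature(radiance: float) -> float:
    """At-sensor brightness temperature via the inverse Planck relation."""
    return K2 / math.log(K1 / radiance + 1.0)

t_kelvin = brightness_temperature(10.0)  # hypothetical radiance value
print(round(t_kelvin - 273.15, 1), "deg C")
```

Brightness temperature is not the true surface temperature; emissivity and atmospheric corrections are applied afterwards for SST or land-surface studies.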
3. Microwave Imaging Overview:
Microwave Region: Located between infrared and radio wavelengths; can
penetrate clouds, vegetation, dust, and rain.
Active vs. Passive Sensors: Active sensors emit their own radiation (e.g., radar),
while passive sensors rely on emitted radiation.
Applications: Useful in all weather conditions, day and night, for remote
sensing of vegetation, glaciers, sea dynamics, and rainfall.
4. Active Microwave Sensors:
Distance Measurement: Measure the reflected pulse's round-trip travel time;
the range is half that time multiplied by the speed of light.
SAR (Synthetic Aperture Radar): High-resolution, independent of cloud
coverage, useful for detecting objects, and monitoring displacement after
earthquakes.
Applications: Crop monitoring, snow and soil moisture studies, urban mapping,
hydrology, and oceanography.
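The time-of-flight ranging described above reduces to a single formula, range = c · t / 2, since the pulse travels to the target and back. The 5.5 ms echo delay below is a hypothetical value chosen to give a typical satellite slant range:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def slant_range_m(round_trip_seconds: float) -> float:
    """Radar slant range: half the round-trip travel time times the
    speed of light (the pulse covers the distance twice)."""
    return C * round_trip_seconds / 2.0

# A pulse returning after 5.5 ms corresponds to a target roughly 824 km away.
print(round(slant_range_m(5.5e-3) / 1000, 1), "km")
```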
5. Advantages of Microwave Data:
No Cloud Interference: Unlike optical data, microwave data can penetrate
cloud cover, allowing faster and clearer analysis.
Climatic and Planetary Studies: Used for climate change research and
planetary exploration.
6. Hyperspectral Imaging Overview:
Hyperspectral Sensors: Capture images in very narrow bands (~0.01 µm, i.e.
about 10 nm); used to analyze detailed spectral information from different materials.
Applications: Identifying land cover, minerals, vegetation, and crop types
based on spectral signatures.
7. Spectral Reflectance Curves:
Spectral Signature: Reflectance curves (wavelength vs. reflectivity) are used to
identify different objects or materials.
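Matching an observed reflectance curve against a library of known signatures can be done, in the simplest case, by minimum Euclidean distance. The toy library below uses made-up reflectance values sampled at a few wavelengths, purely to illustrate the idea:

```python
import math

# Toy spectral library: reflectance sampled at a few wavelengths.
# All values are illustrative, not measured signatures.
LIBRARY = {
    "vegetation": [0.05, 0.08, 0.06, 0.50, 0.45],
    "water":      [0.08, 0.06, 0.04, 0.02, 0.01],
    "bare_soil":  [0.15, 0.20, 0.25, 0.30, 0.32],
}

def classify(observed):
    """Return the library material whose signature is closest (Euclidean)."""
    def dist(signature):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(observed, signature)))
    return min(LIBRARY, key=lambda name: dist(LIBRARY[name]))

print(classify([0.06, 0.09, 0.07, 0.48, 0.43]))  # closest to vegetation
```

Operational hyperspectral classifiers use more robust measures (e.g. spectral angle) and handle hundreds of bands, but the principle is the same curve-matching.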
8. Hyperspectral Sensors and Applications:
Examples of Sensors: Hyperion (EO-1), with 220 contiguous narrow bands;
ultra-spectral sensors provide even finer spectral sampling.
Use in Classification: Hyperspectral sensors enable detailed classification and
are widely used in geology, agriculture, and environmental analysis.
Mineral Mapping: Hyperspectral imaging is critical for mineralogical mapping
and plant physiology studies.
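The band counts quoted for sensors like Hyperion follow from simple arithmetic: dividing the covered spectral range by the bandwidth. A quick check, assuming an illustrative 0.4–2.5 µm range at 10 nm per band:

```python
def band_count(start_um: float, end_um: float, bandwidth_um: float) -> int:
    """Approximate number of contiguous spectral bands covering the range."""
    return int(round((end_um - start_um) / bandwidth_um))

# Covering 0.4-2.5 um at ~0.01 um (10 nm) per band yields ~210 channels,
# the same order of magnitude as Hyperion's 220 bands.
print(band_count(0.4, 2.5, 0.01))
```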
9. Multi-spectral vs. Hyperspectral Imaging:
Multi-spectral Systems: Useful for general feature discrimination but limited
compared to hyperspectral sensors in precision.
Spectral Data Analysis: Complex data analysis is required to extract useful
information from hyperspectral data.
10. Global Hyperspectral Coverage:
Global Use: Hyperspectral sensors are being developed and launched by
several countries for applications like ocean analysis, agriculture, and mineral
mapping.
Key Points/Keywords:
Thermal Infrared Sensors, Temperature Difference, Urban Heat Island, SST
(Sea Surface Temperature), Landsat, Sentinel 3, Microwave Imaging,
Active/Passive Sensors, SAR (Synthetic Aperture Radar), Hyperspectral
Sensors, Spectral Reflectance Curve, Hyperion, Mineral Mapping, Climatic
Studies, Multi-spectral Imaging, Plant Physiology.

Visual Interpretation Method


1. Visual Interpretation Overview:
Definition: Visual interpretation involves identifying objects in satellite images
or aerial photographs using the human eye.
Economical Method: Particularly useful in developing countries for small area
coverage.
2. Photo-Interpretation Process:
Tools Required: Black and white/color photographs, stereoscopes, parallax
bar, light tables, magnifying lenses, tracing paper.
Hard Copy Analysis: Images are enlarged, analyzed based on color, pattern,
shape, size, etc., for qualitative analysis and resource mapping.
3. Mosaic Creation:
Mosaics: Created by merging multiple images for larger areas, making
interpretation easier. Example: Landsat Mosaic of 105 images.
4. Elements of Visual Interpretation:
Tone: Refers to the brightness or darkness in black and white images; used to
identify the relative reflectance of features.
Texture: Frequency of tonal changes related to terrain roughness; varies with
scale (e.g., forest cover appears rough on large-scale images but smooth on
small-scale).
Pattern: Spatial arrangement of features (e.g., roads, buildings, drainage).
Examples include radial, concentric, and checkerboard patterns.
Shape: Distinguishing objects by their shape (e.g., buildings, roads, water
bodies); natural features tend to have irregular shapes, while man-made
objects have geometric shapes.
Size: Depends on image scale; helps in distinguishing features like roads, cars,
buildings (e.g., width of roads, size of houses).
Shadow: Shadows help identify object height and shape but may obscure
ground information. Shadow length can be used to calculate height.
Site and Association: Contextual information that relates one feature to
another (e.g., large building next to railway lines); location and association help
in accurate feature identification.
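The shadow-based height estimate mentioned among the elements above uses simple trigonometry: height = shadow length × tan(sun elevation angle). A minimal sketch with hypothetical measurements:

```python
import math

def object_height_m(shadow_length_m: float, sun_elevation_deg: float) -> float:
    """Estimate object height from its shadow length and the sun's
    elevation angle at image acquisition: h = L * tan(elevation)."""
    return shadow_length_m * math.tan(math.radians(sun_elevation_deg))

# A 20 m shadow under a 45-degree sun implies a ~20 m tall object.
print(round(object_height_m(20.0, 45.0), 1))
```

The sun elevation at acquisition time is normally available in the image metadata; the method assumes flat terrain and a shadow cast on level ground.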
5. Practical Interpretation Example:
QuickBird Image: Identifying features like buildings, roads, and streets using
basic photo-interpretation elements.
6. Interpretation Elements and Their Use:
Tone, Texture, Pattern, Shape, Size, Shadow, Site, and Association: Used
together for thematic mapping and feature identification in satellite images
and aerial photos.
7. Application of Visual Interpretation:
Resource Mapping: Qualitative analysis of land features, infrastructure, natural
resources using cost-effective manual interpretation methods.
Key Points/Keywords:
Visual Interpretation, Photo-Interpretation, Mosaic, Tone, Texture, Pattern,
Shape, Size, Shadow, Site, Association, Resource Mapping, QuickBird Image,
Aerial Photographs, Thematic Maps.
