Topic 3 - Image Processing
Uploaded by Shameer Imran

TOPIC 3: IMAGE PROCESSING

CONTENT
3.1 Understand the types of image correction
3.1.1 Explain the meaning of Noise on satellite image
3.1.2 Identify techniques of noise correction
3.1.3 Conduct radiometric correction
3.1.4 Conduct Import Image, interpretation and geometric correction
a. Registration of image to map grid
b. Image-to-image registration
• Most of the common image processing functions available in image
analysis systems can be grouped into the following four categories:
- Preprocessing
- Image Enhancement
- Image Transformation
- Image Classification and Analysis

• Preprocessing functions involve those operations that are normally
required prior to the main data analysis and extraction of information,
and are generally grouped as radiometric or geometric corrections.
• Radiometric corrections include correcting the data for sensor
irregularities and unwanted sensor or atmospheric noise, and converting
the data so they accurately represent the reflected or emitted radiation
measured by the sensor.
• Geometric corrections include correcting for geometric distortions due to
sensor-Earth geometry variations, and conversion of the data to real
world coordinates (e.g. latitude and longitude) on the Earth's surface.
• Image enhancement is performed solely to improve the appearance of the imagery
to assist in visual interpretation and analysis.
• Examples of enhancement functions include contrast stretching to
increase the tonal distinction between various features in a scene, and
spatial filtering to enhance (or suppress) specific spatial patterns in an
image.
• Image transformations are operations similar in concept to those for
image enhancement.
• However, unlike image enhancement operations which are normally
applied only to a single channel of data at a time, image transformations
usually involve combined processing of data from multiple spectral
bands.
• Arithmetic operations (i.e. subtraction, addition, multiplication, division)
are performed to combine and transform the original bands into "new"
images which better display or highlight certain features in the scene.
• Image classification and analysis operations are used to digitally identify
and classify pixels in the data.
• Classification is usually performed on multi-channel data sets, and this
process assigns each pixel in an image to a particular class or theme
based on the statistical characteristics of the pixel brightness values.
• There are a variety of approaches taken to perform digital classification.
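The statistical assignment just described can be illustrated with a minimal sketch of a minimum-distance-to-means classifier; the two-band values and class means below are invented for illustration, not taken from any real scene.

```python
import numpy as np

def classify_min_distance(image, class_means):
    """Assign each pixel to the class whose mean spectral vector is
    nearest in Euclidean distance.
    image: (rows, cols, bands); class_means: (n_classes, bands)."""
    pixels = image.reshape(-1, image.shape[-1]).astype(float)
    # Distance from every pixel to every class mean.
    dist = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    return dist.argmin(axis=1).reshape(image.shape[:2])

# Hypothetical two-band scene with two classes:
# class 0 "water" (dark in both bands), class 1 "vegetation" (bright in band 2).
means = np.array([[20.0, 15.0], [40.0, 120.0]])
img = np.array([[[22, 18], [38, 110]],
                [[19, 14], [45, 125]]])
labels = classify_min_distance(img, means)
# labels: [[0, 1], [0, 1]]
```

Supervised classifiers such as maximum likelihood use richer statistics (covariances as well as means), but the per-pixel assignment step has the same shape.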
RADIOMETRIC CORRECTION
• Radiometric corrections may be necessary due to variations in
scene illumination and viewing geometry, atmospheric conditions,
and sensor noise and response.
• Each of these will vary depending on the specific sensor and
platform used to acquire the data and the conditions during data
acquisition.
• Also, it may be desirable to convert and/or calibrate the data to
known (absolute) radiation or reflectance units to facilitate
comparison between data.
• Scattering of radiation occurs as it passes through and interacts
with the atmosphere.
• This scattering may reduce, or attenuate, some of the energy
illuminating the surface.
• In addition, the atmosphere will further attenuate the signal
propagating from the target to the sensor.
• Various methods of atmospheric correction can be applied ranging
from detailed modeling of the atmospheric conditions during data
acquisition, to simple calculations based solely on the image data.
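One of the "simple calculations based solely on the image data" is dark-object subtraction, sketched below. It assumes the darkest pixels in a band (deep water, shadow) should be near zero, so any offset they show is attributed to atmospheric path radiance and removed; the array and percentile choice are illustrative.

```python
import numpy as np

def dark_object_subtract(band, percentile=0.1):
    """Estimate the haze offset from a low percentile of the band and
    subtract it, clipping so no digital number goes negative."""
    dark = np.percentile(band, percentile)
    return np.clip(band - dark, 0, None)

# Synthetic hazy band: even the darkest pixels sit around DN 10.
band = np.array([[12, 60, 80],
                 [10, 55, 90],
                 [11, 70, 85]], dtype=float)
corrected = dark_object_subtract(band)
# The darkest pixels drop to about 0; relative contrast is preserved.
```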
REASONS FOR RADIOMETRIC CORRECTION
1. Striping errors
2. Atmospheric effects
3. Instrument errors
4. Noise removal
NOISE
• Noise in an image may be due to irregularities or errors that occur in the
sensor response and/or data recording and transmission.
• Common forms of noise include systematic striping or banding and
dropped lines.
• Both of these effects should be corrected before further enhancement or
classification is performed.
• Striping was common in early Landsat MSS data due to variations and
drift in the response over time of the six MSS detectors.
• The "drift" was different for each of the six detectors, causing the same
brightness to be represented differently by each detector. The overall
appearance was thus a 'striped' effect.
• Dropped lines occur when there are systems errors which result in
missing or defective data along a scan line. Dropped lines are normally
'corrected' by replacing the line with the pixel values in the line above or
below, or with the average of the two.
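The dropped-line fix described above can be sketched as follows: a scan line whose values were lost (flagged here as all zeros, an assumed missing-data value) is replaced by the average of the lines above and below it. The data are synthetic.

```python
import numpy as np

def fix_dropped_lines(band, bad_value=0):
    """Replace any interior scan line consisting entirely of bad_value
    with the average of the lines above and below it."""
    out = band.astype(float).copy()
    for r in range(1, band.shape[0] - 1):
        if np.all(band[r] == bad_value):   # whole line dropped
            out[r] = (out[r - 1] + out[r + 1]) / 2.0
    return out

band = np.array([[10, 20, 30],
                 [ 0,  0,  0],            # dropped scan line
                 [14, 24, 34]])
fixed = fix_dropped_lines(band)
# Middle line becomes the neighbour average: [12.0, 22.0, 32.0]
```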
STRIPING AND MISSING LINES
[Figure: Landsat TM imagery, striped vs. de-striped]
NOISE REMOVAL
[Figure: Haze example from Indonesia, hazy vs. corrected]
1. DESTRIPING
• Destriping uses a mathematical model to correct digital numbers that
were not recorded at their proper values; the noise is displayed as
bands (stripes) in the image.
• This noise causes scan lines to look brighter or darker when
compared with the overall image displayed.
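One common destriping model, sketched below, rescales the lines from each detector so that their mean and spread match those of the whole image; the interleaving (every n-th line coming from the same detector, six for Landsat MSS) and the toy data are assumptions for illustration.

```python
import numpy as np

def destripe(band, n_detectors=6):
    """Match each detector's line statistics to the whole-image
    statistics (every n_detectors-th row comes from one detector)."""
    out = band.astype(float).copy()
    target_mean, target_std = out.mean(), out.std()
    for d in range(n_detectors):
        rows = out[d::n_detectors]
        std = rows.std()
        if std > 0:
            out[d::n_detectors] = (rows - rows.mean()) / std * target_std + target_mean
        else:
            out[d::n_detectors] = target_mean
    return out

# Two-detector toy example: detector 0 reads about 20 DN too bright.
band = np.array([[30., 40., 50.],
                 [10., 20., 30.],
                 [32., 42., 52.],
                 [12., 22., 32.]])
smooth = destripe(band, n_detectors=2)
# After destriping, both detectors' lines share the same mean and spread.
```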
2. DROP LINE
• This noise causes a scan line to have no digital numbers. Correction
is done by assigning the affected line the average of the lines above
and below it.
[Figure: Example of destriping; new DNs are calculated for pixels in
partially missing lines]
GEOMETRIC CORRECTION
• Geometric corrections include correcting for
geometric distortions due to sensor-Earth
geometry variations, and conversion of the
data to real world coordinates (e.g. latitude
and longitude) on the Earth's surface.
• Uncorrected digital images normally contain
geometric distortions so significant that they
cannot be used as maps.
Sources of distortion are:
• Variations in the altitude and velocity of the sensor platform
• Earth curvature
• Atmospheric refraction
• Relief displacement
• Nonlinearities in the sweep of a sensor’s IFOV
TYPES OF ERROR
1. Systematic Error
2. Non-systematic error
SYSTEMATIC ERROR
• Scan Skew: Caused by the forward motion of the platform during the time
required for each mirror sweep. The ground swath is not normal to the ground
track but is slightly skewed, producing cross-scan geometric distortion.
• Mirror-Scan Velocity Variance: The mirror scanning rate is usually not
constant across a given scan, producing along-scan geometric distortion.
• Panoramic Distortion: The ground area imaged is proportional to the
tangent of the scan angle rather than to the angle itself. Because data are
sampled at regular intervals, this produces along-scan distortion.
• Platform Velocity: If the speed of the platform changes, the ground track
covered by successive mirror scans changes, producing along-track scale
distortion.
• Earth Rotation: Earth rotates as the sensor scans the terrain. This results in a
shift of the ground swath being scanned, causing along-scan distortion.
• Perspective: For some applications it is desirable to have images represent
the projection of points on Earth onto a plane tangent to Earth, with all
projection lines normal to the plane. This introduces along-scan distortion.
NON-SYSTEMATIC ERROR
• Altitude Variance: If the sensor platform departs from
its normal altitude or the terrain increases in elevation,
this produces changes in scale
• Platform Attitude: One sensor system axis is usually
maintained normal to Earth's surface and the other
parallel to the spacecraft's direction of travel. If the
sensor departs from this attitude, geometric distortion
results.
• Conversion of the data to real-world coordinates is carried out by analyzing
well-distributed Ground Control Points (GCPs).
• This is done in two steps:
• Georeferencing: This involves the calculation of the appropriate
transformation from image to terrain coordinates.
• Geocoding: This step involves resampling the image to obtain a new
image in which all pixels are correctly positioned within the terrain
coordinate system.
• Resampling is used to determine the digital values to place in the
new pixel locations of the corrected output image.
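The georeferencing step can be sketched as a first-order (affine) least-squares fit from image (column, row) to map (x, y) coordinates. The four GCPs below are invented for illustration; a real correction would use many well-distributed points and report the residual RMS error.

```python
import numpy as np

def fit_affine(image_pts, map_pts):
    """Solve map = [col, row, 1] @ coeffs for six affine coefficients
    by least squares; coeffs has shape (3, 2) for x and y."""
    design = np.column_stack([image_pts, np.ones(len(image_pts))])
    coeffs, *_ = np.linalg.lstsq(design, map_pts, rcond=None)
    return coeffs

# Hypothetical GCPs: 30 m pixels, origin near map coordinate (500000, 420000).
image_pts = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
map_pts = np.array([[500000, 420000], [503000, 420000],
                    [500000, 417000], [503000, 417000]], float)
coeffs = fit_affine(image_pts, map_pts)
x, y = np.array([50, 50, 1.0]) @ coeffs
# Pixel (50, 50) maps to (501500, 418500), the centre of the four GCPs.
```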
RESAMPLING
The resampling process calculates the new pixel values from the original
digital pixel values in the uncorrected image. There are three common
methods for resampling.
• Nearest Neighbour
• Bilinear Interpolation
• Cubic Convolution
NEAREST NEIGHBOUR
• The nearest neighbour approach uses the value of the closest input pixel
for the output pixel value.
• To determine the nearest neighbour, the algorithm uses the inverse of the
transformation matrix to calculate the image file coordinates of
the desired geographic coordinate.
• The pixel value occupying the closest image file coordinate to the
estimated coordinate will be used for the output pixel value in the
georeferenced image.
Nearest neighbour: uses the input cell value closest to the output cell as the
assigned value of the output cell.
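A minimal sketch of nearest-neighbour resampling: the back-projected fractional (row, column) coordinate is rounded to the closest input pixel, so original DNs pass through unchanged. The coordinates and values below are illustrative.

```python
import numpy as np

def nearest_neighbour(image, rows, cols):
    """Sample image at fractional (row, col) positions by rounding to
    the nearest input pixel, clipped to the image bounds."""
    r = np.clip(np.rint(rows).astype(int), 0, image.shape[0] - 1)
    c = np.clip(np.rint(cols).astype(int), 0, image.shape[1] - 1)
    return image[r, c]

img = np.array([[10, 20],
                [30, 40]])
# Back-projected coordinate (0.4, 0.8) rounds to input pixel (0, 1).
val = nearest_neighbour(img, np.array([0.4]), np.array([0.8]))
# val[0] == 20: the original DN is kept, not averaged.
```

Because no new DN values are created, this method is often preferred when the image will later be classified.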


BILINEAR INTERPOLATION
• Bilinear interpolation resampling takes a weighted average of four pixels
in the original image nearest to the new pixel location.
• The averaging process alters the original pixel values and creates entirely
new digital values in the output image.
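The four-pixel weighted average can be sketched directly; at a point halfway between four pixels the result is their plain mean, which shows how bilinear resampling creates digital values that exist nowhere in the input.

```python
import numpy as np

def bilinear(image, row, col):
    """Distance-weighted average of the four input pixels surrounding
    the fractional (row, col) position."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    r1 = min(r0 + 1, image.shape[0] - 1)
    c1 = min(c0 + 1, image.shape[1] - 1)
    fr, fc = row - r0, col - c0
    top = (1 - fc) * image[r0, c0] + fc * image[r0, c1]
    bottom = (1 - fc) * image[r1, c0] + fc * image[r1, c1]
    return (1 - fr) * top + fr * bottom

img = np.array([[10.0, 20.0],
                [30.0, 40.0]])
value = bilinear(img, 0.5, 0.5)
# value == 25.0, a DN that appears nowhere in the original image.
```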
CUBIC CONVOLUTION
• Resampling goes even further to calculate a distance weighted average of
a block of sixteen pixels from the original image which surround the new
output pixel location.
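A sketch of cubic convolution over the surrounding 4x4 block, using the standard piecewise-cubic kernel with parameter a = -0.5 (a common choice; the exact kernel parameter varies between systems). At an exact pixel centre the kernel reproduces the original value.

```python
import numpy as np

def cubic_kernel(t, a=-0.5):
    """Standard piecewise-cubic interpolation kernel."""
    t = abs(t)
    if t <= 1:
        return (a + 2) * t**3 - (a + 3) * t**2 + 1
    if t < 2:
        return a * t**3 - 5 * a * t**2 + 8 * a * t - 4 * a
    return 0.0

def cubic_convolution(image, row, col):
    """Distance-weighted average of the 16 pixels around (row, col)."""
    r0, c0 = int(np.floor(row)), int(np.floor(col))
    value = 0.0
    for i in range(-1, 3):                # 4 rows ...
        for j in range(-1, 3):            # ... x 4 columns = 16 pixels
            r = min(max(r0 + i, 0), image.shape[0] - 1)
            c = min(max(c0 + j, 0), image.shape[1] - 1)
            value += (image[r, c]
                      * cubic_kernel(row - (r0 + i))
                      * cubic_kernel(col - (c0 + j)))
    return value

img = np.arange(16, dtype=float).reshape(4, 4)
result = cubic_convolution(img, 1.0, 2.0)
# result == 6.0, the original value at pixel (1, 2).
```

Cubic convolution gives the smoothest output of the three methods but, like bilinear interpolation, alters the original DNs.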
