
A Project Report

on
Spectral Filter Testing For DEM Generation Using InSAR Data

Submitted in partial fulfillment of the requirements


for the award of the degree of

Bachelor of Technology
in
Computer Science and Engineering

by
Ram Modwel Saurabh Agrawal Shubham Srivastava
(1109710082) (1109710091) (1109710102)

Semester-VIII
Under the Supervision of
Ms. Tanu Shree

Galgotias College of Engineering & Technology


Greater Noida 201306
Affiliated to

Uttar Pradesh Technical University


Lucknow
May 2015
GALGOTIAS COLLEGE OF ENGINEERING & TECHNOLOGY
GREATER NOIDA - 201306, UTTAR PRADESH, INDIA.

CERTIFICATE

This is to certify that the project report entitled “Spectral Filter Testing For

DEM Generation Using InSAR Data” submitted by Ram Modwel, Saurabh

Agrawal and Shubham Srivastava to the Uttar Pradesh Technical University,

Lucknow in partial fulfillment for the award of Degree of Bachelor of

Technology in Computer Science & Engineering is a bona fide record of the project

work carried out by them under my supervision during the year 2014-2015.

Prof.(Dr.) Bhawna Mallick Ms. Tanu Shree

Dean Academics, GCET Assistant Professor

Head, Deptt. of CSE Deptt. of CSE


ACKNOWLEDGEMENT

We owe a debt of deepest gratitude to our project supervisor Ms. Tanu Shree, Assistant

Professor, Department of Computer Science and Engineering, for her guidance,

support, motivation and encouragement throughout the period this work was carried

out. Her readiness for consultation at all times, her educative comments, her concern

and assistance even with practical things have been invaluable.

We would also like to thank our project coordinator Mr. Manish Kumar Sharma,

Assistant Professor, Department of Computer Science and Engineering for his

guidance, support and motivation.

We are grateful to Prof.(Dr.) Bhawna Mallick, Head of the Department, Computer

Science and Engineering for providing us the necessary opportunities for the

completion of our project. We also thank our friends for their invaluable help, guidance

and support.

RAM MODWEL(1109710082)

SAURABH AGRAWAL(1109710091)

SHUBHAM SRIVASTAVA(1109710102)
ABSTRACT

Synthetic aperture radar (SAR) is a coherent active microwave imaging method. In

remote sensing it is used for mapping the scattering properties of the Earth’s surface in

the respective wavelength domain. Many physical and geometric parameters of the

imaged scene contribute to the grey value of a SAR image pixel. Scene inversion suffers

from this high ambiguity and requires SAR data taken at different wavelengths,

polarizations, times, incidence angles, etc. Interferometric SAR (InSAR) exploits the

phase differences of at least two complex-valued SAR images acquired from different

orbit positions and/or at different times. The information derived from these

interferometric data sets can be used to measure several geophysical quantities, such as

topography, deformations (volcanoes, earthquakes, and ice fields), glacier flows, ocean

currents, vegetation properties, etc.

Digital Elevation Models (DEMs) are used in many applications in the context of earth

sciences such as in topographic mapping, environmental modelling, rainfall-runoff

studies, landslide hazard zonation, seismic source modelling, etc. In recent years a

multitude of scientific applications of Synthetic Aperture Radar Interferometry

(InSAR) techniques has evolved. It has been shown that InSAR is an established

technique for generating high-quality DEMs from spaceborne and airborne data, and

that it has advantages over other methods for the generation of large-area DEMs.

However, the processing of InSAR data is still a challenging task.

KEYWORDS: InSAR, DEM, Phase, Interferogram, Coherence, Phase Unwrapping.


CONTENTS

Title Page

LIST OF TABLES v
LIST OF FIGURES vi
ABBREVIATIONS ix
NOMENCLATURE x

CHAPTER 1 INTRODUCTION

1.1 Background 1

1.2 SAR Interferometry (InSAR) 3

1.3 Differential Interferometry(DInSAR) 5

1.3.1 DInSAR Principles 6

1.4 Digital Elevation Model(DEM) 8

1.5 Objective of project 9

1.5.1 Sub-objective 9

CHAPTER 2 LITERATURE SURVEY

2.1 Introduction 10

2.2 InSAR 10

2.3 How to get the Phase SAR Image 12

2.4 How to measure the terrain altitude using Interferometric phase 13

2.5 Spectral Shift Filters 15

2.5.1 Range Filter 15

2.5.2 Azimuth Common Band Filtering 17


CHAPTER 3 PROBLEM DOMAIN

3.1 Problem of noise in DEM 19

3.2 Research area 19

3.3 Scientific importance of research area 20

CHAPTER 4 MATERIALS AND METHODOLOGY

4.1 Materials 21

4.2 Methodology 22

4.2.1 Co-registration 23

4.2.2 Interferogram Computation 24

4.2.3 Coherence estimation 24

4.2.4 Interferogram Flattening 25

4.2.5 Phase unwrapping 26

4.2.6 Phase to Height coding and Geo-coding 27

CHAPTER 5 RESULTS AND RESULT ANALYSIS

5.1 Coherence Images 28

5.2 Interferogram 39

5.3 Phase Filtered Interferogram 47

5.4 Unwrapped Phase Image 56

5.5 Digital Elevation Model 65

CHAPTER 6 CONCLUSION AND FUTURE WORK

6.1 Conclusion 74

6.2 Future Scope 74

APPENDIX A LIST OF VARIABLES 76

APPENDIX B GLOSSARY 80
REFERENCES 84

LIST OF PUBLICATIONS 86
LIST OF TABLES

Table Title Page

1.1 Radar bands and their designation 1

3.1 Scene Boundaries 20

4.1 Dataset Specifications 21

LIST OF FIGURES

Figure Title Page

1.1 SAR imaging system 3

1.2 Geometry of a satellite Interferometric SAR system 4

1.3 InSAR data collection mode 5

1.4 Two-Pass DInSAR 6

1.5 Three-Pass DInSAR 7

1.6 Four-Pass DInSAR 7

2.1 Concept of Phase 12

2.2 Representation of Interferometric phase computation 13

2.3 Wavefront projected on terrain slanting at angle α 15

2.4 Frequency shift in master and slave image 16

2.5 Azimuth common band filtering 17

3.1 Location of research area (Google Earth Image) 19

4.1 Flow diagram of Methodology 22

4.2 Phase unwrapping 26

5.1 Coherence Image generated using non-filtered master & slave image 28

5.2 Statistics of coherence image generated using non-filtered master &


slave image 29

5.3 Coherence Image generated using range filtered master & slave image 30

5.4 Statistics of coherence image generated using range filtered master &
slave image 31

5.5 Coherence Image generated using azimuth filtered master & slave image 32

5.6 Statistics of coherence image generated using azimuth filtered master &
slave image 33

5.7 Coherence Image generated using range and azimuth filtered master &
slave image 34

5.8 Statistics of coherence image generated using range and azimuth filtered
master & slave image 35

5.9.1 Histogram of coherence image generated using non-filtered master &


slave image 36

5.9.2 Histogram of coherence image generated using range filtered master &
slave image 37

5.9.3 Histogram of coherence image generated using azimuth filtered master &
slave image 37

5.9.4 Histogram of coherence image generated using range and azimuth filtered
master & slave image 38

5.10 An Interferogram generated using non-filtered inputs 39

5.11 Statistics of an interferogram generated using non-filtered inputs 40

5.12 An Interferogram generated using range filtered inputs 41

5.13 Statistics of an interferogram generated using range filtered inputs 42

5.14 An Interferogram generated using azimuth filtered inputs 43

5.15 Statistics of an interferogram using azimuth filtered inputs 44

5.16 An Interferogram generated using range & azimuth filtered inputs 45

5.17 Statistics of an interferogram using range & azimuth filtered inputs 46

5.18 Phase filtered interferogram of non-filtered inputs 48

5.19 Statistics of phase filtered interferogram of non-filtered inputs 49

5.20 Phase filtered interferogram of range filtered inputs 50

5.21 Statistics of phase filtered interferogram of range filtered inputs 51

5.22 Phase filtered interferogram of azimuth filtered inputs 52

5.23 Statistics of phase filtered interferogram of azimuth filtered inputs 53

5.24 Phase filtered interferogram of range & azimuth filtered inputs 54

5.25 Statistics of phase filtered interferogram of range & azimuth filtered
Inputs 55

5.26 Unwrapped phase image of non-filtered inputs 57

5.27 Statistics of unwrapped phase image of non-filtered inputs 58

5.28 Unwrapped phase image of range filtered inputs 59

5.29 Statistics of unwrapped phase image of range filtered inputs 60

5.30 Unwrapped phase image of azimuth filtered inputs 61

5.31 Statistics of unwrapped phase image of azimuth filtered inputs 62

5.32 Unwrapped phase image of range & azimuth filtered inputs 63

5.33 Statistics of unwrapped phase image of range & azimuth filtered inputs 64

5.34 DEM generated from non-filtered inputs 66

5.35 Statistics of DEM generated from non-filtered inputs 67

5.36 DEM generated from range filtered inputs 68

5.37 Statistics of DEM generated from range filtered inputs 69

5.38 DEM generated from azimuth filtered inputs 70

5.39 Statistics of DEM generated from azimuth filtered inputs 71

5.40 DEM generated from range & azimuth filtered inputs 72

5.41 Statistics of DEM generated from range & azimuth filtered inputs 73

ABBREVIATIONS

SAR Synthetic Aperture Radar


ASAR Advanced SAR
ASI Agenzia Spaziale Italiana
DEM Digital Elevation Model
DESCW Display Earth remote sensing Swath Coverage for Windows
DInSAR Differential Interferometric SAR
DLR Deutsches Zentrum für Luft- und Raumfahrt
DORIS Doppler Orbitography and Radiopositioning Integrated by Satellite
ENL Equivalent Number of Looks
EOLI Earthnet On-Line Interactive
ESA European Space Agency
GCP Ground Control Point
InSAR Interferometric SAR
JAXA Japan Aerospace Exploration Agency
JPL Jet Propulsion Laboratory
NASA National Aeronautics and Space Administration
PAF Processing and Archiving Facility
PDF Probability Density Function
PolSAR Polarimetric SAR
PolInSAR Polarimetric Interferometric SAR
PRF Pulse Repetition Frequency
RADAR Radio Detection And Ranging
RAR Real Aperture Radar
SIR Shuttle Imaging Radar
SLC Single Look Complex
SRTM Shuttle Radar Topography Mission
UTM Universal Transverse Mercator
WGS World Geodetic System

NOMENCLATURE

English Symbols
A Amplitude
ɸ Phase Difference
c Speed of Light
λ Wavelength
f Frequency

CHAPTER 1
INTRODUCTION

1.1 BACKGROUND

Remote sensing has many applications and has been adjudged a technique with great
potential to help the nation's economic growth and resolve some of its problems.
These applications include better management of natural resources through wasteland
mapping, identification of water in catchment areas and of flood-prone areas,
assessment of the state of reservoirs, estimation of forest area, and prediction of
crop yield and resource scarcity. The applications of remote sensing depend on the
choice of frequency. The use of microwaves in remote sensing offers specific
advantages in applications such as mineral mapping, crop and vegetation monitoring,
water resource management, oceanography, soil moisture detection and DEM generation.

The utilization of the part of the electromagnetic spectrum enclosing microwaves has
led to remarkable inventions such as radio and television, mobile communication,
microwave ovens and Radio Detection and Ranging. Microwave signals are the part of
the electromagnetic spectrum with wavelengths between 1 cm and 1 m. Growth in the
field of microwave remote sensing has helped to map the topography of the world.
The most common microwave bands used in existing RADAR systems are listed in
Table 1.1.

Table 1.1 RADAR bands and their designation[25] : name of RADAR bands and
their corresponding wavelengths.
NAME OF BANDS WAVELENGTHS
Ka 0.75 cm - 1.10 cm
K 1.10 cm - 1.67 cm
Ku 1.67 cm - 2.40 cm
X 2.40 cm - 3.75 cm
C 3.75 cm - 7.50 cm
S 7.50 cm - 15.0 cm
L 15.0 cm - 30.0 cm
P 30.0 cm - 130 cm
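As a quick sanity check on Table 1.1, wavelength and frequency are related by f = c/λ. The short Python sketch below (illustrative only, not part of the report's processing chain) converts band-edge wavelengths to frequencies:

```python
# Convert a radar wavelength (in cm) to frequency using f = c / wavelength.
C = 3.0e8  # speed of light, m/s

def wavelength_cm_to_ghz(wavelength_cm):
    """Return the frequency in GHz for a wavelength given in centimetres."""
    return C / (wavelength_cm * 1e-2) / 1e9

# Band edges from Table 1.1: the C band spans 3.75 cm - 7.50 cm.
print(wavelength_cm_to_ghz(3.75))  # upper band edge -> 8 GHz
print(wavelength_cm_to_ghz(7.50))  # lower band edge -> 4 GHz
```

The shorter the wavelength, the higher the frequency, which is why the Ka band sits at the top of the frequency range and the P band at the bottom.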

Microwaves are used in conventional Radio Detection and Ranging (RADAR)
technology. RADAR is employed not only in detection and ranging but also in
portraying the surface of the Earth, and as such it is regarded as a remote sensing
technique. The surface of the Earth is illuminated by microwave signals, and an
image can be obtained from the reflected signals in offline processing. Compared to
the visible part of the electromagnetic spectrum, microwaves are relatively long, so
they can penetrate clouds and are independent of atmospheric conditions such as
haze.

Imaging of the Earth can also be done with Real Aperture RADAR (RAR) systems, but
the resolution of the data acquired from spaceborne systems is about 5-10 km. The
resolution is limited by the power and by the size of the footprint of the radar
antenna, which itself depends on the aperture size; thus its use is limited for some
remote sensing applications. This restriction on the achievable spatial resolution
is overcome by Synthetic Aperture RADAR (SAR) systems. In SAR techniques a large
antenna is constructed by means of offline processing. SAR combines signal
processing techniques with satellite orbit information and thus obtains a much
higher resolution radar image.

SAR[7] is a microwave imaging system. It can penetrate clouds because it uses
microwaves, and since it is an active system carrying its own illumination, it has
day-and-night operational capability. A SAR imaging system on a satellite is
portrayed in Figure 1.1. The satellite carries a radar with its antenna pointed
towards the surface of the Earth in the plane perpendicular to the orbit. The
inclination of the antenna with respect to nadir is known as the off-nadir angle;
in existing systems it is usually in the range between 20° and 50°. Due to the
curvature of the surface of the Earth, the incidence angle of the radiation on flat
horizontal terrain is greater than the off-nadir angle. For the sake of simplicity
we assume the Earth is flat, so the incidence angle equals the off-nadir angle.

A digital SAR image can be seen as a mosaic of small picture elements called pixels.
Each pixel is associated with a small area of the surface of the Earth, and holds a
complex number carrying amplitude and phase information about the microwave field
backscattered by all the scatterers within the corresponding resolution cell
projected on the ground.
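The amplitude and phase held by each complex pixel can be separated directly. The toy NumPy sketch below (with made-up pixel values, not real SAR data) shows the idea:

```python
import numpy as np

# A toy 2x2 patch of complex SAR pixels (hypothetical values); each pixel
# encodes the backscattered field as amplitude * exp(i * phase).
slc = np.array([[1.0 + 1.0j, 0.5 - 0.5j],
                [2.0 + 0.0j, 0.0 + 3.0j]])

amplitude = np.abs(slc)   # backscatter strength per pixel
phase = np.angle(slc)     # phase per pixel, in radians, wrapped to (-pi, pi]

print(amplitude)
print(phase)
```

A detected (amplitude-only) SAR product keeps just `amplitude`; interferometry needs the `phase` component as well, which is why SLC (single look complex) data are required.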

Figure 1.1 SAR imaging system[7] : Description of SAR imaging system from a
satellite.

1.2 SAR INTERFEROMETRY (InSAR)[7]

Synthetic Aperture Radar (SAR) interferometry is a processing technique that can be
used to calculate the topography of the surface of the Earth. In it, two different
SAR images are combined to obtain an interferogram, which is a phase interference
image. The raw SAR images are complex-valued, containing both amplitude and phase
information per pixel. An interferogram is created by taking the pixel-to-pixel
phase difference between the two SAR images: one image is multiplied by the complex
conjugate of the other, and the phase of the result is the phase difference image
called the interferogram. From the interferometric phase image, relative terrain
elevation can be obtained using the orbit data of the two images.
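The conjugate-multiplication step can be sketched with synthetic data (all values invented; a phase ramp stands in for the topographic signal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two co-registered complex SLC images (synthetic): the slave carries an
# extra phase ramp relative to the master, standing in for topography.
n = 64
master = np.exp(1j * rng.uniform(-np.pi, np.pi, (n, n)))
ramp = np.linspace(0.0, 4.0 * np.pi, n)            # phase ramp along range
slave = master * np.exp(1j * ramp[np.newaxis, :])

# Pixel-by-pixel multiplication of one image with the conjugate of the
# other yields the interferogram; its argument is the phase difference.
interferogram = master * np.conj(slave)
phase_diff = np.angle(interferogram)

# The recovered phase is the negative ramp, wrapped into (-pi, pi].
expected = np.angle(np.exp(-1j * ramp))
print(np.allclose(phase_diff, expected[np.newaxis, :]))
```

In real processing the two images must first be co-registered to sub-pixel accuracy; here they are aligned by construction, so the random scattering phase of the master cancels exactly.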

A SAR satellite can sense the same area from slightly different look angles. This
can be achieved either simultaneously (with two radar antennas mounted on the same
platform) or at different times by repeated orbits of the same satellite. The
distance between the two satellite positions in the plane perpendicular to the
orbit is called the interferometer baseline, and its projection perpendicular to
the slant range is the perpendicular baseline.

Figure 1.2 Geometry of a satellite Interferometric SAR system[7] : Geometric


representation of Interferometric SAR system from a Satellite.

InSAR can be classified into two categories based on the acquisition mode:

1. Single-Pass Interferometry

2. Repeat-Pass Interferometry

In single-pass interferometry two radar antennas are used to capture the radar
return of the same scene from two different angles, while in repeat-pass
interferometry a single antenna is used, so a revisit to the same scene is required,
hence the term repeat-pass interferometry. The two flight paths must be carefully
oriented to establish the desired baseline. The advantages of single-pass
interferometry are the relative ease of motion compensation and baseline
maintenance, the coupling of the two apertures, and the absence of any temporal
decorrelation of the scene between the two images. The major disadvantage is the
cost and complexity of the multi-receiver sensor. Conversely, the major advantage
of repeat-pass interferometry is the ability to use a conventional single-receiver
SAR sensor, while the major disadvantages are the difficulty of controlling the two
passes and compensating the data from the two acquisitions into carefully aligned
collection paths, as well as the possibility of temporal decorrelation of the scene
between passes.

Figure 1.3 InSAR data collection mode (a) Repeat Pass Interferometry (b) Single
Pass Interferometry[25] : Different data collection modes by sensors.

1.3 DIFFERENTIAL INTERFEROMETRY (DInSAR)[25]

SAR is an integral tool for observing the surface of the Earth and its changes over
a period of time. In SAR interferometry the deformation signal received from the
surface of the Earth is mixed with the topographic signal. To overcome this
problem, a differential interferogram is used. DInSAR is used in remote sensing for
quantifying the deformation of the surface of the Earth. This technique is
considered more accurate than InSAR for this purpose, as it is able to provide
relative measurements to within a few centimetres.

1.3.1 DInSAR Principles

This technique demands at least three single-look complex (SLC) images in order to
obtain a minimum of two interferometric phase measurements. The phase information
associated with each pixel of the SLC images is measured and differenced to
generate an interferogram.

There are three categories of DInSAR: two-pass, three-pass and four-pass DInSAR. A
differential interferogram can be obtained using interferometric image pairs and a
Digital Elevation Model (DEM). Figure 1.4, Figure 1.5 and Figure 1.6 show two-pass,
three-pass and four-pass DInSAR respectively.

[Flow diagram: SLC Data 1 and SLC Data 2 form an interferogram which, together with
a DEM, yields the Differential Interferogram]

Figure 1.4 Two–Pass DInSAR[25] : Flow diagram of Two-Pass Differential InSAR.

[Flow diagram: SLC Data 1, SLC Data 2 and SLC Data 3 form Interferogram 1 and
Interferogram 2, which are differenced to yield the Differential Interferogram]

Figure 1.5 Three-Pass DInSAR[25] : Flow diagram of Three-Pass Differential InSAR.

[Flow diagram: SLC Data 1 to SLC Data 4 form Interferogram 1 and Interferogram 2,
which are differenced to yield the Differential Interferogram]

Figure 1.6 Four-Pass DInSAR[25] : Flow diagram of Four-Pass Differential InSAR.

In three-pass DInSAR, three SLC images are used to generate two interferograms, and
the interferometric phase information of these two interferograms is then
differenced to produce another interferogram, termed the Double-Differential
Interferogram. This step separates out the phase changes due to topography, and a
new image is produced. The final DInSAR phase is composed of surface-change phase
contributions, atmospheric delay contributions and phase noise.
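The differencing step can be sketched with synthetic numbers (all values invented): the topographic phase common to both interferograms cancels, leaving only the change signal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic wrapped phases (radians) for two interferograms over the same
# scene: both share the topographic phase; the second also contains a
# deformation signal we want to isolate (all values are made up).
shape = (32, 32)
topo = rng.uniform(-np.pi, np.pi, shape)      # topographic phase
deformation = 0.3 * np.ones(shape)            # uniform surface change

igram1 = np.exp(1j * topo)                    # topography only
igram2 = np.exp(1j * (topo + deformation))    # topography + deformation

# Differencing the two interferograms (conjugate multiplication) cancels
# the common topographic phase, leaving the double-differential phase.
ddiff = np.angle(igram2 * np.conj(igram1))

print(np.allclose(ddiff, deformation))
```

In practice the cancellation is never this clean: atmospheric delay and phase noise, mentioned above, remain in the residual phase.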

1.4 DIGITAL ELEVATION MODEL (DEM)[8]

A Digital Elevation Model (DEM) is defined as a digital model or three-dimensional
representation of the terrain of a surface, commonly for a planet such as the
Earth. Digital elevation models are broadly classified into two categories: the
Digital Terrain Model (DTM) and the Digital Surface Model (DSM). A digital terrain
model depicts the bare surface of the ground without any features, while a digital
surface model represents the surface of the Earth including all objects present on
it. DEM data files comprise the elevation values of the terrain over a particular
area at a fixed grid interval. The intervals between the grid points are referenced
to a geographical coordinate system. The more closely spaced the grid points, the
more detailed the information about the terrain.
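The grid convention can be illustrated with a tiny sketch (origin coordinates and spacing below are hypothetical): each (row, col) index maps to a map coordinate through the grid origin and the grid interval.

```python
# A DEM stores one elevation per grid point; grid indices map to map
# coordinates through the origin and the grid spacing (values hypothetical).
x0, y0 = 500000.0, 3100000.0   # coordinates of the upper-left grid point, m
dx, dy = 30.0, 30.0            # grid spacing, m (e.g. a 30 m DEM)

def grid_to_coords(row, col):
    """Map a (row, col) grid index to (easting, northing)."""
    return x0 + col * dx, y0 - row * dy  # rows count downwards (southwards)

easting, northing = grid_to_coords(2, 3)
print(easting, northing)   # 500090.0 3099940.0
```

Halving `dx` and `dy` quadruples the number of grid points for the same area, which is the storage cost of the extra terrain detail mentioned above.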

Digital elevation models have a variety of applications, among them hydrological
modelling, correction of terrain-induced distortion, radiometric correction, and
aid in thematic interpretation (classification, snow cover mapping, biomass
estimation). The generation of digital elevation models (DEMs) has long been
associated with the remote sensing business, of which it forms a core activity. In
the conventional approach, a pair of images is processed using optical stereo
methods to create the elevation model. But another technique, known as InSAR
(Interferometric Synthetic Aperture RADAR), has gained attention. This technique
was first used in the 1960s to resolve ambiguity in radar measurements of Venus,
but it has undergone rapid development since 1991, due to the widespread
availability of ERS (European Remote Sensing Satellite) data, and has emerged as a
potential competitor to conventional optical stereo methods. Optical stereo
depends on high contrast; SAR interferometry has no such limitation. For high
resolution DEM generation, optical stereo and InSAR both depend on very costly
airborne campaigns. SAR interferometry, however, can potentially provide large
swaths, with a tandem or a one-pass dual-antenna configuration providing
systematic large-area coverage.

1.5 OBJECTIVE OF PROJECT

The main objective of this project is to test spectral shift filters for DEM
generation using InSAR data and to identify the potential of these filters to
enhance DEM generation in terms of visibility and accuracy of the elevation
information.

1.5.1 Sub-objectives
• Generation of the interferogram and coherence image.
• Generation of the phase filtered interferogram.
• Generation of a geocoded Digital Elevation Model.

CHAPTER 2
LITERATURE SURVEY

2.1 INTRODUCTION

The evolution of radar interferometry can be summarized briefly as Radio Detection
and Ranging (RADAR) - Synthetic Aperture Radar (SAR) - Interferometric SAR (InSAR)
- Differential Interferometric SAR (DInSAR). The background of radar can be traced
to the 19th century, in which the notable discovery of radio waves and
electromagnetism by Maxwell and Hertz opened a new chapter for the world.
Significant advances in radar technology in the following years led to the
development of SAR. Coherent radar, in which both phase and amplitude information
are received, proved to be the key factor in this upgrading of radar. A long
antenna was created synthetically using a moving antenna, combining the received
pulse returns within the synthetic antenna length. A satellite carrying a SAR
system was launched in June 1978 for oceanography. This led to the launch of many
SAR-equipped satellites, which became an essential data source for many
applications. SAR's inability to discriminate between two objects at the same
range but at different angles then became a subject of public interest. The answer
to this problem was to employ a pair of radars. This idea, along with the use of
phase information, paved the way for interferometry. With a couple of SAR images
it became possible to obtain distance as well as angular measurements.

2.2 InSAR

The use of a SAR system in an interferometric manner to produce topographic maps
was first demonstrated by Graham in the year 1974 (Graham, 1974)[24]. The
efficiency of data collection by aerial photography was found to be hindered by
clouds and poor sunlight conditions. Radar technology, as an all-weather
electronic system, was used to overcome the limitations of aerial photography. The
radar was expected to perform two functions. First, an image with sufficient
resolution had to exist in order to identify the various objects and features to
be mapped. Second, a three-dimensional measurement of the positions of a
sufficient number of points to define the terrain surface had to be made. These
could be attained by Synthetic Aperture Radar technology (to generate a fine
resolution image of the terrain) and radar interferometry (to achieve the
three-dimensional measurement).

In the year 1986, Zebker and Goldstein successfully derived a high resolution
topographic map of the San Francisco Bay area using the interferometric technique
(Zebker and Goldstein, 1986)[23]. Two images were acquired using two antennas
placed along the flight direction of the aircraft. A single image (interferogram)
was obtained by combining the two images, which were acquired at slightly
different angles. Interference fringes were obtained when the two images were
combined pixel by pixel, yielding a single image whose phase at each position was
the difference of the phases of the two original images and whose magnitude was
the product of their magnitudes. Using mathematical relations, this image was
converted into a topographically accurate map.

The repeat-pass InSAR method was first demonstrated by Li and Goldstein in the
year 1987 (Li and Goldstein, 1987)[22]. It provided exceptionally useful
topographic information, and a specific data set obtained by the SEASAT SAR was
employed to demonstrate it. The interferograms generated from the data pair were
similar to conventional topographic contours, with fringes consistent with the
conventional topographic map of the study area. This paper mainly concentrated on
topographic mapping attempted with a conventional single-antenna SAR in
repeat-pass mode.

The application of repeat-pass interferometry was also demonstrated by Gabriel and
Goldstein in the year 1988 (Gabriel and Goldstein, 1988)[21]. This study used
SIR-B (Shuttle Imaging Radar-B) data from two different orbits, not exactly
parallel but inclined at a small angle, to implement the interferometric process.
Crossed-orbit interferometry has since been promoted as a useful extension of the
SAR interferometric process.

In the year 1993, Madsen et al.[19] used C-band radar data to obtain rectified
topographic maps. They came up with a new processing scheme taking motion
compensation, absolute phase retrieval and three-dimensional location into
consideration. The performance of the new procedure was evaluated under typical
conditions by testing the process on data acquired with extreme aircraft motion.
A digital elevation model determined using conventional optical stereo techniques
was compared with the topographic maps obtained by the radar, and the accuracy was
thus studied. Digital elevation models (DEMs) are employed in many applications
such as topographic mapping, seismic source modelling, rainfall-runoff studies,
etc. Okeke in his paper explained that InSAR is an established technique for
producing high quality DEMs from airborne and spaceborne data (Okeke, 2006)[9].
That paper explained the processing steps needed for DEM generation using SLC SAR
data.

2.3 HOW TO GET THE PHASE SAR IMAGE[7]

The waves transmitted from the radar have to reach the scatterers on the ground
and then come back to the radar in order to form the SAR image; this is a two-way
travel. Scatterers are at different distances from the radar (slant range) and so
produce different delays between the transmission and the reception of the waves.

Because of the almost purely sinusoidal character of the transmitted and received
signals, this delay d corresponds to a phase change ɸ between the received and
transmitted signals. The phase change is proportional to the two-way distance 2R
of the radiation divided by the wavelength of the signal.

Figure 2.1 Concept of Phase[7] : A sinusoidal function sin φ is periodic with a 2π
radian period.

The phase of a wave that has travelled a distance d can be expressed using the
following formula:

ɸ = (2π / λ) · d

Taking the phase of the transmitted signal as zero, the phase of the received
signal, which covers the distance 2R from the radar to the target and back from
the target to the radar, is expressed as:

ɸ = (2π / λ) · 2R = (4π / λ) · R
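As a numerical illustration of ɸ = (4π/λ)R (the wavelength and slant range below are invented, not values from the report's dataset):

```python
import numpy as np

# Round-trip phase for a target at slant range R: phi = (4 * pi / lam) * R.
lam = 0.056          # e.g. a C-band wavelength of 5.6 cm, in metres
R = 850000.0         # slant range in metres (illustrative value)

phi = 4.0 * np.pi * R / lam               # total accumulated phase, radians
phi_wrapped = np.angle(np.exp(1j * phi))  # what the sensor measures, (-pi, pi]

print(phi, phi_wrapped)
```

The total phase runs to hundreds of millions of radians, but only its value modulo 2π is observable, which is why interferometry works with phase differences and later needs phase unwrapping.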

2.4 HOW TO MEASURE THE TERRAIN ALTITUDE USING


INTERFEROMETRIC PHASE[7]

SAR interferometry depends on the measurement of the phase difference between the
complex-valued resolution elements of two co-registered images, received by two
antennas separated by a baseline B, as shown in Figure 2.2. The measured
interferometric phase is wrapped modulo 2π, represented by ɸm in Figure 2.2. The
grid of wrapped phase values is transformed into a grid of unwrapped phase values
by a phase-unwrapping algorithm that adds to each measured phase value a constant
integer multiple of 2π, represented by ɸunw in Figure 2.2. The phase offset,
represented by ɸoff in Figure 2.2, is a constant phase component for the whole
scene that must be estimated and added to the unwrapped phase in order to obtain
the absolute interferometric phase ɸabs, from which a digital elevation model
(DEM) can be generated.
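The idea of restoring 2π cycles can be illustrated in one dimension with NumPy's `np.unwrap` (real InSAR unwrapping is two-dimensional and considerably harder, e.g. branch-cut or minimum-cost-flow algorithms; this is only a sketch):

```python
import numpy as np

# 1-D illustration: the true phase grows smoothly, but the sensor delivers
# it wrapped into (-pi, pi]. np.unwrap restores continuity by adding the
# appropriate multiple of 2*pi whenever a jump larger than pi is seen.
true_phase = np.linspace(0.0, 6.0 * np.pi, 50)   # smooth ramp, 3 full cycles
wrapped = np.angle(np.exp(1j * true_phase))      # wrapped measurement

unwrapped = np.unwrap(wrapped)                   # add back the 2*pi cycles

print(np.allclose(unwrapped, true_phase))
```

The sketch works because consecutive samples differ by less than π; in noisy or steep-terrain interferograms that assumption breaks down, which is what makes phase unwrapping the hardest step of the processing chain.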

Figure 2.2 Representation of Interferometric phase computation[7] :


Geometric representation of Interferometric phase.

The path difference between the two distances is given by:

Δr = (λ p / 4π) · ɸabs                                              (1)

r1 = r2 − Δr = r2 − (λ p / 4π) · ɸabs                               (2)

with p = 1 for monostatic acquisitions and p = 2 for bistatic acquisition schemes.
For the sake of simplicity, assuming a flat-Earth geometry, the terrain height can
be represented by:

h = H − r1 cos θ                                                    (3)

From equations (1), (2) and (3) it can be written as:

h = H − (r2 − (λ p / 4π) · ɸabs) cos θ

where h is the terrain height, H is the satellite altitude and θ is the look angle
of antenna A1.
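Plugging illustrative numbers into equations (1)-(3) (all values below are invented and do not correspond to the report's dataset):

```python
import numpy as np

# Worked example of h = H - (r2 - lam * p * phi_abs / (4 * pi)) * cos(theta)
# with made-up inputs.
lam = 0.056                # wavelength, m
p = 1                      # monostatic (repeat-pass) acquisition
H = 790000.0               # satellite altitude, m
r2 = 850000.0              # slant range to the second antenna, m
theta = np.deg2rad(23.0)   # look angle
phi_abs = 2500.0           # absolute interferometric phase, radians

delta_r = lam * p * phi_abs / (4.0 * np.pi)   # path difference, eq. (1)
r1 = r2 - delta_r                             # eq. (2)
h = H - r1 * np.cos(theta)                    # eq. (3)

print(delta_r, h)
```

Note how a path difference of only about 11 m, measured through thousands of radians of phase, controls the recovered height: this sensitivity is why accurate baseline and phase estimates matter so much.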

2.5 SPECTRAL SHIFT FILTERS[5]

There are two spectral shift filters in InSAR processing: the Range Spectral Shift
Filter and the Azimuth Common Band Filter. The aim of these two filtering steps is
to provide a kind of spectral co-registration, such that the contributions which
are correlated in the two SLC images are retained, while the contributions which
are uncorrelated (and behave like noise) are eliminated prior to the generation of
the interferogram cross-product.

2.5.1 Range Filter[5]

In SAR interferometry, two images acquired from slightly different positions are used to generate the interferometric phase pattern. Owing to the different look angles, different parts of the ground reflectivity spectrum are present in the two received signals. During interferogram generation the uncorrelated parts of the spectra are merged, which results in decreased coherence. The elimination of these uncorrelated parts is a standard technique in the interferometric processing of SAR data called 'range spectral filtering'. The terrain is assumed locally flat, slanting at an angle α that varies with range. The wavelength λ transmitted by the Radar changes when projected on the ground, depending on the incidence angle θ and the geometry shown in Figure 2.3. If the same wavefront is transmitted by a source with a slightly different incidence angle, a different wavelength is projected on the same terrain.

Figure 2.3 Wavefront projected on terrain slanting at angle α[5]: Projection and scattering of the wavefront on the ground.
The wavelength projected on the ground is expressed by:

λg = λ / sin(θ − α)

Converting from wavelength to frequency, it is easy to see that the measured reflectivity spectrum changes with the incidence angle θ:

fg = c/λg = f sin(θ − α)

The radar transmitted frequency would have to change in order to compensate for the frequency change on the ground due to the incidence angle:

f = fg / sin(θ − α)

Differentiating with respect to θ:

∂f/∂θ = − fg cos(θ − α) / sin²(θ − α) = − f / tan(θ − α)

so that

Δf = − [f / tan(θ − α)] Δθ

The above expression gives the amount of shift in the reflectivity spectrum introduced when the wavefront is projected on the terrain. The change in the incidence angle is a function of the perpendicular baseline Bn and the sensor-to-target distance R, and is given by:

Δθ = Bn / R

Hence the spectral shift is expressed as:

Δf = − [f / tan(θ − α)] · (Bn / R)

Figure 2.4 Frequency shift in master and slave image[5]: Representation of the frequencies of the master and slave images obtained from the SAR imaging system.

This spectral shift can be removed by filtering out the uncorrelated bands of the master and slave images. This is achieved by filtering the master and slave images with band-pass filters of bandwidth Wc centred at Δf/2 and −Δf/2 respectively, where Wc is the signal bandwidth common to both master and slave images:

Wc = W − |Δf|

where W is the bandwidth of the respective SAR system.
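The shift and the common bandwidth can be computed directly from these formulas. The sketch below uses the C-band carrier frequency from Table 4.1, but the baseline, slant range and system bandwidth values are illustrative assumptions:

```python
import numpy as np

def range_spectral_shift(f, theta, alpha, Bn, R):
    """Spectral shift df = -f / tan(theta - alpha) * (Bn / R)."""
    return -f / np.tan(theta - alpha) * (Bn / R)

def common_bandwidth(W, df):
    """Bandwidth common to master and slave: Wc = W - |df|."""
    return W - abs(df)

# Illustrative values: C-band carrier, flat terrain (alpha = 0).
f, theta, Bn, R, W = 5.405e9, np.deg2rad(39.3), 150.0, 850e3, 30e6
df = range_spectral_shift(f, theta, 0.0, Bn, R)   # about -1.17 MHz
Wc = common_bandwidth(W, df)
```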

2.5.2 Azimuth Common Band filtering[5]

The azimuth common band filtering is complementary to the range filtering; its goal is again to keep the mostly correlated contributions. The azimuth spectral shift due to terrain slope is quite small and can be neglected entirely. Instead, the shift is due to possible variations in antenna pointing between the two acquisitions. The impact of a different Doppler Centroid on the azimuth spectra of the two acquisitions is displayed in Figure 2.5. The concept is quite similar to the range spectral shift: there, the same portion of two shifted reflectivity spectra was observed; here, two different portions of the same reflectivity spectrum are observed.

Figure 2.5 Azimuth common band filtering[5] : Azimuth filtering based on the
common frequencies of master and slave image.

The different Doppler Centroids of the master and slave images introduce an azimuth spectral shift and a loss of coherence, both of which can be avoided by the azimuth common band filter.
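A minimal sketch of the common-band idea, assuming the processed azimuth bandwidth and the two Doppler centroids are known (all names and values here are illustrative):

```python
def azimuth_common_band(Ba, fdc_master, fdc_slave):
    """Azimuth bandwidth shared by both acquisitions, given their Doppler centroids.

    Ba is the processed azimuth bandwidth (Hz); the common band shrinks by the
    Doppler-centroid difference, and is clipped at zero when the spectra do not
    overlap at all.
    """
    return max(0.0, Ba - abs(fdc_master - fdc_slave))

# E.g. a 150 Hz Doppler-centroid difference leaves 1350 Hz of a 1500 Hz band.
print(azimuth_common_band(1500.0, 100.0, -50.0))  # 1350.0
```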

CHAPTER 3
PROBLEM DOMAIN

3.1 PROBLEM OF NOISE IN DEM


The interferogram is generally noisy, which affects the accuracy of the DEM owing to de-correlation in SAR interferometry. It was observed that the terrain coherence effect can dramatically degrade the phase unwrapping, and in turn the DEM quality, especially for highly vegetated or mountainous regions. In this project a filtering algorithm is used to minimize the de-correlation effect, and also the intricacy of phase unwrapping, by enhancing the signal in the interferogram spectrum. The results obtained from this project demonstrate the effectiveness of this filtering in producing a quality DEM.

3.2 RESEARCH AREA

The site chosen for the research is a part of the state of Arizona, located in the south-western region of the United States of America. It covers an area of 400 km². It lies between 33°11'25.90"N and 33°29'52.15"N latitude and between 112°16'23.28"W and 111°55'37.54"W longitude. The major cities in this area, Phoenix, Tolleson, Tempe and Guadalupe, cover a portion of the study site.

Figure 3.1 Location of research area (Google Earth Image) : Location chosen for
research objective.

3.3 SCIENTIFIC IMPORTANCE OF THE RESEARCH AREA

This area was chosen for the research particularly because of its varied land cover, in which different types of scatterers can be found. Permanent scatterers such as buildings and mountains act as corner reflectors for C-band Radar. The water bodies have plain surfaces that cause specular reflection, while agricultural land acts as a volume scatterer during crop growth and as a specular reflector when bare. All these features together make the research area a complete package for this research work.

The exact location of the research area is shown in Figure 3.1 and the scene boundary
is mentioned in the Table 3.1.

Table 3.1 Scene boundaries: Latitude and longitude of the chosen research area.

Position              Latitude       Longitude
Bottom Left Corner    33°11'25"N     112°16'23"W
Bottom Right Corner   33°11'36"N     111°55'37"W
Upper Right Corner    33°29'52"N     111°55'49"W
Upper Left Corner     33°29'41"N     112°16'24"W

CHAPTER 4
MATERIALS AND METHODOLOGY

This chapter is divided into two sections, materials and methodology. The first section gives detailed information about the data used in this work and the second gives a detailed explanation of the methodology used in this research.

4.1 MATERIALS

Interferogram generation requires at least two SLC images, which contain both amplitude and phase information. In this study, two RADARSAT-2 SLC images are used to generate the interferogram. RADARSAT-2 is an Earth observation satellite that was successfully launched on December 14, 2007. The satellite carries a Synthetic Aperture Radar (SAR) with multiple polarization modes, including a fully polarimetric mode in which HH, HV, VV and VH polarized data are recorded. Its finest resolution is 1 m in Spotlight mode (3 m in Ultra-Fine mode). Details of the data set are shown in Table 4.1.

Table 4.1 Dataset specification: Image metadata obtained from the SAR imaging system.

Serial No.                 1 (Master Image)    2 (Slave Image)
Source                     RADARSAT-2          RADARSAT-2
Sensor                     SAR                 SAR
Central Frequency          5.405 GHz           5.405 GHz
Band                       C-Band              C-Band
Pattern                    Fine Mode/HH        Fine Mode/HH
Off-nadir Angle            29.8°               29.8°
Incident Angle             39.3°               39.3°
Orbit No.                  2022                2022
Revisit Time               24 days             24 days
Date of Data Acquisition   4 May 2008          28 May 2008

4.2 METHODOLOGY[7]&[25]

The approach followed in this research work is shown in Figure 4.1. The processing chain is:

1. Import the raw SAR datasets as master and slave SLC images.
2. Apply the spectral shift filters (range and azimuth).
3. Co-registration of the slave image to the master image.
4. Interferogram computation (wrapped phase) and coherence estimation.
5. Interferogram flattening (flattened phase).
6. Phase unwrapping (unwrapped phase).
7. Phase to height coding (DEM).
8. Geocoding.

Figure 4.1 Flow diagram of methodology[25]: Methodology used for the interferometric process.

4.2.1 Co-registration

The co-registration step is a basic step in interferogram generation, as it ensures that each ground target contributes to the same pixel in both the master and the slave SLC image. In the ideal case of perfectly parallel orbits and aligned acquisitions, co-registration would only require compensating for the differing geometry due to the different view angles (parallax effect); this could be compensated by a proper cross-track stretching of one image. The spatial alignment between the two images should be performed on a pixel-by-pixel basis, with an accuracy of the order of one tenth of the resolution. In theory, co-registration should depend on the local topography; however, the effect of elevation is almost negligible in most cases. In spaceborne Synthetic Aperture Radars, the sensor velocity and attitude are so stable that the deformation between the master and slave frames can be approximated by the following polynomials:

rS = a·rM² + b·rM + c·aM + d

aS = e·rM² + f·rM + g·aM + h

where (rM, aM) are the range and azimuth coordinates of the master image and (rS, aS) are the range and azimuth coordinates at which the slave image should be evaluated. The convention presumes that the slave image is the one that is actually resampled, so that the final interferogram is in the same reference as the master image.

 A constant range shift (d) due to the perpendicular baseline component, and a constant azimuth shift (h) due to the different timing along the orbit.
 A stretch in range (b) due to the variation of the normal baseline with range, and a stretch in azimuth (g) due to variation in the PRF (pulse repetition frequency).
 A range and azimuth skew (c, f), which approximates an image rotation for small rotation angles.
 Two second-order terms (a, e) that are needed for processing large range swaths.
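The polynomial mapping above can be evaluated as below; the coefficient values are illustrative, since in practice they are estimated from the data:

```python
def slave_coordinates(rM, aM, range_coeffs, azimuth_coeffs):
    """Evaluate the co-registration polynomials mapping master (rM, aM) to slave (rS, aS)."""
    a, b, c, d = range_coeffs     # d: range shift, b: range stretch, c: skew, a: 2nd-order term
    e, f, g, h = azimuth_coeffs   # h: azimuth shift, g: azimuth stretch, f: skew, e: 2nd-order term
    rS = a * rM**2 + b * rM + c * aM + d
    aS = e * rM**2 + f * rM + g * aM + h
    return rS, aS

# Identity mapping plus a constant shift of (2.5, -1.0) pixels:
rS, aS = slave_coordinates(100.0, 200.0, (0.0, 1.0, 0.0, 2.5), (0.0, 0.0, 1.0, -1.0))
print(rS, aS)  # 102.5 199.0
```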

4.2.2 Interferogram Computation

The fundamental inputs of interferogram generation are Single Look Complex (SLC) images, which preserve the phase. An SLC image is a two-dimensional matrix holding the amplitude and phase information associated with each pixel. The amplitude is a factor of the target reflectivity and surface parameters. The phase measures changes at the surface and is related to the two-way distance between the platform and the ground. Each pixel represents a small surface of ground containing hundreds of scattering elements whose complex reflections contribute to the phase, so the phase of a single image does not form a meaningful parameter. It becomes meaningful when the phases of two different images are compared: from the phase difference of the two images, the height of each pixel relative to the altitude of the Radar can be calculated. These images are referred to as the master and the slave image. The computation of the interferogram requires the pixel-by-pixel Hermitian product of the two co-registered images:

G = UM · US*

where UM and US denote the master and the slave image and US* is the complex conjugate of the slave.

By convention the interferogram is registered in the same reference as the master image, and the phase of the interferogram is the difference between the phases of the master and the slave image, represented as:

ɸ = tan⁻¹ ( Imag(G) / Real(G) )

where Real(G) and Imag(G) are the real and imaginary parts of the interferogram respectively.
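The cross-product and its phase can be sketched with NumPy as follows (the synthetic patches and the 0.3 rad phase offset are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two tiny co-registered "SLC" patches with a known 0.3 rad phase difference.
amplitude = rng.random((4, 4)) + 0.1
U_M = amplitude * np.exp(1j * 0.5)
U_S = amplitude * np.exp(1j * 0.2)

# Hermitian product: master times the complex conjugate of the slave.
G = U_M * np.conj(U_S)

# Interferometric phase = atan2(Imag(G), Real(G)), i.e. the phase difference.
phase = np.angle(G)
print(np.allclose(phase, 0.3))  # True
```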

4.2.3 Coherence Estimation

The degree of similarity between two images is referred to as the coherence. The pixel values of the coherence image lie between 0 and 1: areas of high coherence between the two images show pixel values near 1, while areas of low coherence show pixel values near 0. The interferometric complex coherence is expressed as:

ϒ = ΣN P1·P2* / √( ΣN |P1|² · ΣN |P2|² )

Where

N = Number of pixels in the N-sample estimation window

P1 = Complex SAR image (Master)

P2 = Complex SAR image (Slave)

P2* = Complex conjugate of slave image

The coherence serves two prime purposes:

1) To determine the quality of the measurement (i.e. the interferometric phase). Usually InSAR pairs with coherence lower than 0.2 should not be considered for further processing.

2) To extract thematic information about objects on the ground.

There are many factors causing loss of coherence; some of them are listed below:

1. Different atmospheric conditions during the acquisition of the images

2. Phase errors introduced during processing

3. Changes in the position or properties of the objects between the data acquisitions

4. Different viewing positions
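A minimal sketch of the coherence magnitude, using the whole patch as the estimation window for simplicity (a real processor uses a small sliding N-sample window and takes the magnitude of the complex coherence, so values lie in [0, 1]):

```python
import numpy as np

def coherence_magnitude(P1, P2):
    """|gamma| over the estimation window (here the full patch)."""
    num = np.abs(np.sum(P1 * np.conj(P2)))
    den = np.sqrt(np.sum(np.abs(P1)**2) * np.sum(np.abs(P2)**2))
    return num / den

rng = np.random.default_rng(1)
P1 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
noise = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

print(np.isclose(coherence_magnitude(P1, P1), 1.0))  # True: identical images
print(coherence_magnitude(P1, noise) < 0.5)          # True: independent noise decorrelates
```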

4.2.4 Interferogram Flattening

Only the fringes contributed by the topographic terrain are present in the flattened interferogram. Flattening involves the elimination of the low-frequency phase trend from the interferogram. Before the phase-unwrapping step, this calculated phase is removed to produce the flattened interferogram, which is easier to unwrap. The variation of the interferometric phase ɸ depends on two contributions:

1. A phase variation proportional to the altitude difference q between point targets.
2. A phase variation proportional to the slant-range displacement s of the point target.
Δɸ = − (4π Bn q) / (λ R sinθ) − (4π Bn s) / (λ R tanθ)

where θ is the radiation incidence angle with respect to the reference, Bn is the perpendicular baseline, R is the distance between the Radar and the target, and λ is the transmitted wavelength.

The perpendicular baseline is known from the orbital data, so the second term of the equation can be computed and subtracted from the interferometric phase. This operation is known as interferogram flattening and results in a phase map proportional to the relative terrain altitude.
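The flat-earth term and its subtraction can be sketched as follows; all geometry values are illustrative:

```python
import numpy as np

def flat_earth_phase(s, lam, Bn, R, theta):
    """Second (flat-earth) term: -(4*pi*Bn*s) / (lam*R*tan(theta))."""
    return -4.0 * np.pi * Bn * s / (lam * R * np.tan(theta))

# Slant-range displacement per column of a small interferogram patch.
lam, Bn, R, theta = 0.055, 150.0, 850e3, 0.686
s = np.linspace(0.0, 500.0, 64)
interferogram_phase = np.random.default_rng(2).uniform(-np.pi, np.pi, 64)

# Flattening: subtract the predicted flat-earth ramp, re-wrap into (-pi, pi].
flattened = np.angle(np.exp(1j * (interferogram_phase
                                  - flat_earth_phase(s, lam, Bn, R, theta))))
```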

4.2.5 Phase unwrapping

The flattened interferogram delivers an ambiguous measurement of the relative terrain altitude because of the 2π-cyclic nature of the interferometric phase. The phase variation between two points in the flattened interferogram delivers the actual altitude variation only after the correct integer number of 2π phase cycles has been restored. This process of adding the correct integer multiple of 2π to the interferometric fringes is called phase unwrapping.

Figure 4.2 Phase unwrapping[17] : Representation of Absolute and wrapped phase.

Phase unwrapping is the process of resolving the 2π ambiguity of the interferometric phase. Several algorithms have been developed for unwrapping the phase of the flattened interferogram, such as branch cuts, region growing, minimum cost flow and minimum least squares. Phase unwrapping must be performed in order to remove the inherent ambiguity; it refers to the conversion from the measured phase to the absolute phase:

ɸabs = ɸ + 2πn

where n is an integer representing the number of cycles needed to unwrap the phase of a single pixel.
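The 2-D algorithms listed above are involved, but the basic idea of restoring the integer multiples of 2π can be illustrated in one dimension with NumPy's unwrap (a synthetic phase ramp, not the report's data):

```python
import numpy as np

# A smooth absolute phase ramp spanning three full fringes ...
phi_abs = np.linspace(0.0, 6.0 * np.pi, 200)
# ... observed only modulo 2*pi, as in a flattened interferogram:
phi_wrapped = np.angle(np.exp(1j * phi_abs))

# 1-D unwrapping: add a multiple n of 2*pi whenever a sample-to-sample jump exceeds pi.
phi_unwrapped = np.unwrap(phi_wrapped)

print(np.allclose(phi_unwrapped, phi_abs))  # True
```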

4.2.6 Phase to Height Coding and Geocoding

In this stage of the methodology the unwrapped interferometric phase is converted to height; it is sometimes also known as slant-to-height conversion. The unwrapped phase is combined with the synthesized phase, converted into height and geocoded into a specific map projection. This is achieved by applying a theoretical relationship between the interferometric phase and the height, which relies strongly on the imaging geometry. Therefore a precise baseline estimation is required if an absolute terrain height is to be obtained; otherwise, a larger number of ground control points (GCPs) is required to improve the height information.

After the height conversion, the DEM is still in the slant-range coordinate system. Since this geometry is different for each SAR image and is not related to any georeference system, geocoding is required. The steps followed in geocoding are as follows:

 Target location: translation of the image position of a pixel to the corresponding position in the reference system on the Earth, with the help of the satellite position and velocity.
 Transformation of the Earth location to geographic coordinates.
 Conversion of geographic coordinates to map grid coordinates.
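Under the flat-earth sensitivity derived in Section 4.2.4, the phase-to-height step can be sketched as a simple inversion (illustrative values; a real processor uses the full imaging geometry and the precise baseline):

```python
import numpy as np

def phase_to_height(phi_topo, lam, R, theta, Bn):
    """Invert the topographic term d_phi = -(4*pi*Bn*q)/(lam*R*sin(theta)) for q."""
    return -phi_topo * lam * R * np.sin(theta) / (4.0 * np.pi * Bn)

# Round trip: a 30 m relief maps to a phase and back (illustrative geometry).
lam, R, theta, Bn, q = 0.055, 850e3, 0.686, 150.0, 30.0
phi = -4.0 * np.pi * Bn * q / (lam * R * np.sin(theta))
print(round(phase_to_height(phi, lam, R, theta, Bn), 6))  # 30.0
```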

CHAPTER 5
RESULTS AND RESULT ANALYSIS

This chapter describes results obtained from the methodology discussed in chapter 4.

5.1 COHERENCE IMAGES

The bright patches of a coherence image indicate areas of high coherence between the two images, while the dark patches represent areas where the coherence between the two images is relatively low. The pixel values of a coherence image lie between 0 and 1: a value of 1 represents total correlation, while a value near 0 represents total decorrelation. In this work four different coherence images are generated from the same master and slave images.

Figure 5.1 Coherence image generated using non-filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.2 Statistics of the coherence image generated using non-filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.3 Coherence image generated using range filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.4 Statistics of the coherence image generated using range filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.5 Coherence image generated using azimuth filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.6 Statistics of the coherence image generated using azimuth filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.7 Coherence image generated using range and azimuth filtered master & slave images[7]: Result obtained after the co-registration phase.

Figure 5.8 Statistics of the coherence image generated using range and azimuth filtered master & slave images[7]: Result obtained after the co-registration phase.
Figures 5.1, 5.3, 5.5 and 5.7 are four coherence images generated using different inputs from the same image pair. Figure 5.1 is the coherence image generated using the non-filtered master and slave images, Figure 5.3 using the range filtered inputs, Figure 5.5 using the azimuth filtered inputs, and Figure 5.7 using inputs filtered with a combination of both the azimuth and the range filter. The coherence image generated from the non-filtered inputs has the highest loss of coherence, while the image generated using both filters has the least.

Figure 5.9.1 Histogram of the coherence image generated using non-filtered master & slave images[7]: Intensity distribution of the coherence image.

Figure 5.9.2 Histogram of the coherence image generated using range filtered master & slave images[7]: Intensity distribution of the coherence image.

Figure 5.9.3 Histogram of the coherence image generated using azimuth filtered master & slave images[7]: Intensity distribution of the coherence image.

Figure 5.9.4 Histogram of the coherence image generated using range and azimuth filtered master & slave images[7]: Intensity distribution of the coherence image.

The histograms in Figures 5.9.1, 5.9.2 and 5.9.3 are almost the same, which means that little effect of the filtering is reflected in the coherence image. But when a combination of both filters is applied to the master and slave images, it is capable of removing some amount of the decorrelation error, and a better coherence image is generated. It is therefore worthwhile to filter the input images before coherence estimation: the coherence image formed from these filtered images has the smallest coherence loss among the four, and the quality of the corresponding interferogram is better than that of the others.
5.2 INTERFEROGRAM

An interferogram is a phase interference image. The raw SAR images are in complex-valued format, containing both amplitude and phase information for each pixel. An interferogram is created by taking the pixel-to-pixel phase difference between two SAR images: one image is multiplied by the complex conjugate of the other, and the result is the phase difference image called the interferogram. In this work four different interferograms are generated from the same master and slave images.

Figure 5.10 An interferogram generated using non-filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.11 Statistics of an interferogram generated using non-filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.12 An interferogram generated using range filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.13 Statistics of an interferogram generated using range filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.14 An interferogram generated using azimuth filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.15 Statistics of an interferogram generated using azimuth filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.16 An interferogram generated using range & azimuth filtered inputs[7]: Result obtained after the interferogram generation phase.

Figure 5.17 Statistics of an interferogram generated using range & azimuth filtered inputs[7]: Result obtained after the interferogram generation phase.
Here four interferograms are generated using the same master and slave images. The interferogram in Figure 5.10 is generated using non-filtered input images and the one in Figure 5.12 using range filtered inputs. There is not much difference between these two; almost the same interferogram is obtained whether non-filtered or range filtered inputs are used. Hence the range filter alone has no visible positive effect on the generated interferogram.

The interferogram in Figure 5.14 is generated using azimuth filtered master and slave images, while the one in Figure 5.16 is generated using inputs filtered with both the range and the azimuth filter. The interferogram generated with both filters has the finest interferometric fringes among the four, which leads to quite accurate contours of the region. The filtering of the master and slave images increases the signal-to-noise ratio in the interferogram; this noise reduction results from filtering out the parts of the spectra that do not overlap.

5.3 PHASE FILTERED INTERFEROGRAM

Interferogram filtering is a critical technology for phase unwrapping: the noise should be filtered before phase unwrapping to obtain the best results. There are different interferogram filtering techniques, such as the median filter and the Baran filter. Here the Goldstein method is used to filter the interferogram; this filter is applied in the frequency domain of the complex data. The Baran filter is an enhanced Goldstein filter that is also used for interferogram filtering: it addresses the under-filtering over incoherent areas, where the filter parameter alpha is underestimated because of the biased coherence estimation. By correcting the overestimate of the sample coherence, the correct filter parameter alpha is derived and the performance of the filter is optimized; it minimizes the loss of phase while still reducing the noise level in the interferogram.

In the phase-filtered interferogram the interferometric fringes become sharper, because the filtering gives a higher weight to the peak of the spectrum. Phase filtering thus reduces the noise level in the interferogram and gives the best results in the subsequent stages of the interferometric process, owing to the lower loss of phase and the lower noise level.
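A bare-bones sketch of the Goldstein idea on a single patch: weight the patch spectrum by its own normalized magnitude raised to alpha, so dominant fringe frequencies are kept and broadband noise is damped. Real implementations additionally smooth the spectrum, process overlapping patches, and (in the Baran variant) derive alpha from the coherence:

```python
import numpy as np

def goldstein_filter_patch(patch, alpha=0.5):
    """Goldstein-style filtering of one complex interferogram patch (simplified)."""
    S = np.fft.fft2(patch)
    weight = np.abs(S)
    weight /= weight.max()            # normalize so the spectral peak gets weight 1
    return np.fft.ifft2(S * weight**alpha)

# A pure fringe pattern is a single spectral peak, so it passes through unchanged.
n = 32
x, y = np.meshgrid(np.arange(n), np.arange(n))
fringes = np.exp(2j * np.pi * (3 * x + 2 * y) / n)
filtered = goldstein_filter_patch(fringes, alpha=0.5)
print(np.allclose(filtered, fringes))  # True
```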

Figure 5.18 Phase filtered interferogram of non-filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.19 Statistics of the phase filtered interferogram of non-filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.20 Phase filtered interferogram of range filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.21 Statistics of the phase filtered interferogram of range filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.22 Phase filtered interferogram of azimuth filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.23 Statistics of the phase filtered interferogram of azimuth filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.24 Phase filtered interferogram of range & azimuth filtered inputs[7]: Result obtained after the phase filtering stage.

Figure 5.25 Statistics of the phase filtered interferogram of range & azimuth filtered inputs[7]: Result obtained after the phase filtering stage.
Figures 5.18, 5.20, 5.22 and 5.24 are the phase-filtered interferograms generated from images of the same region: Figure 5.18 from non-filtered inputs, Figure 5.20 from range filtered inputs, Figure 5.22 from azimuth filtered inputs, and Figure 5.24 from inputs filtered with both the range and the azimuth filter. Among them, the phase-filtered interferogram generated from the range and azimuth filtered inputs has the finest and best-separated interferometric fringes, and provides the least noisy data to the phase-unwrapping stage.

5.4 UNWRAPPED PHASE IMAGE

Noise increases the difficulty of phase unwrapping and can even cause it to fail. So, before unwrapping, it is necessary to filter the interferogram to reduce the noise, which ensures the accuracy and reliability of the unwrapped phase data. The phase noise level varies across the phase image: areas with a high noise level should be filtered more, to offer enough smoothing for phase unwrapping, while areas with a low noise level should be filtered less, to preserve detailed elevation information.


The results obtained from this stage produce the best result in the geocoding stage, which gives a good-quality Digital Elevation Model (DEM) that can be used for the extraction of terrain features.

Figure 5.26 Unwrapped phase image of non-filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.27 Statistics of the unwrapped phase image of non-filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.28 Unwrapped phase image of range filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.29 Statistics of the unwrapped phase image of range filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.30 Unwrapped phase image of azimuth filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.31 Statistics of the unwrapped phase image of azimuth filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.32 Unwrapped phase image of range & azimuth filtered inputs[7]: Result obtained after the phase unwrapping phase.

Figure 5.33 Statistics of the unwrapped phase image of range & azimuth filtered inputs[7]: Result obtained after the phase unwrapping phase.
Figures 5.26, 5.28, 5.30 and 5.32 are the unwrapped phase images generated from images of the same region: Figure 5.26 from non-filtered inputs, Figure 5.28 from range filtered inputs, Figure 5.30 from azimuth filtered inputs, and Figure 5.32 from inputs filtered with both the range and the azimuth filter. Among them, the unwrapped phase image generated from the range and azimuth filtered inputs provides the best DEM.

5.5 DIGITAL ELEVATION MODEL

InSAR is one of the best techniques to generate a digital elevation model (DEM). A DEM is defined as a digital model, or three-dimensional representation, of a terrain surface, commonly of a planet such as the Earth. Digital elevation models are broadly classified into two categories: the Digital Terrain Model (DTM) and the Digital Surface Model (DSM). A DTM depicts the bare ground surface without any features, while a DSM represents the surface of the Earth including all objects present on it. DEM data files comprise the elevation values of the terrain over a particular area at a fixed grid interval; the grid points are referenced to a geographical coordinate system. The more closely the grid points are located, the more detailed the information about the terrain.

In this project work four different DEMs of the same region are generated. The DEM generated using the range and azimuth filtered master and slave images is found to be closest to the terrain characteristics, whereas the DEM generated using the non-filtered inputs is found to be noisier than the others. The DEMs generated using only the range filter or only the azimuth filter separately show no significant effect of the filtering.

The DEM obtained from the range and azimuth filtered inputs gives the best result for extracting terrain characteristics: providing less noisy input to the phase-unwrapping stage yields the most accurate and reliable unwrapped phases, and in turn an accurate digital elevation model.

Figure 5.34 DEM generated from non-filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.35 Statistics of the DEM generated from non-filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.36 DEM generated from range filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.37 Statistics of the DEM generated from range filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.38 DEM generated from azimuth filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.39 Statistics of the DEM generated from azimuth filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.40 DEM generated from range & azimuth filtered inputs[7]: Result obtained after the geocoding phase.

Figure 5.41 Statistics of the DEM generated from range & azimuth filtered inputs[7]: Result obtained after the geocoding phase.
CHAPTER 6
CONCLUSION AND FUTURE SCOPE

6.1 CONCLUSION
The interferogram is generally noisy, which affects the accuracy of the DEM owing to de-correlation in SAR interferometry. It was observed that the terrain coherence effect can dramatically degrade the phase unwrapping, and in turn the DEM quality, especially for highly vegetated or mountainous regions. In this project a filtering algorithm was used to minimize the de-correlation effect, and also the intricacy of phase unwrapping, by enhancing the signal in the interferogram spectrum. The results obtained demonstrate the effectiveness of this filtering in producing a quality DEM.

The filtering significantly improves fringe visibility and reduces the noise introduced by temporal or baseline decorrelation. Radar interferograms have the property that the fringe spectrum is very narrow-band, except in regions of layover where the phase is no longer single-valued because of multiple scattering; this property accounts for the success of the filtering in suppressing noise. Here, before phase filtering of the interferogram, the master and slave images were already filtered by the range and azimuth spectral filters to obtain the best results. In this work four phase-filtered interferograms were generated using different inputs for the same region. The best result was obtained with input images that were both range- and azimuth-filtered: their interferometric fringes are the finest among all four interferograms. Hence, a less noisy input is provided to the phase unwrapping stage, where the most accurate and reliable unwrapped phases are generated in order to obtain an accurate Digital Elevation Model.
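The spectral filtering idea described above can be illustrated with a minimal Goldstein-style patch filter in the spirit of [16]. This is a sketch, not the exact implementation used in the project: the patch size, the filter exponent alpha, and the synthetic fringe pattern are illustrative assumptions, and operational filters additionally overlap and taper the patches.

```python
import numpy as np

def goldstein_filter(ifg, alpha=0.8, patch=32):
    """Goldstein-style spectral filter for a complex interferogram.

    Each patch is transformed to the frequency domain, the spectrum is
    weighted by its own magnitude raised to the power alpha (boosting
    the narrow fringe band over broadband decorrelation noise), and the
    result is transformed back. Patches are non-overlapping for brevity.
    """
    out = np.zeros_like(ifg)
    rows, cols = ifg.shape
    for r in range(0, rows - patch + 1, patch):
        for c in range(0, cols - patch + 1, patch):
            tile = ifg[r:r + patch, c:c + patch]
            spec = np.fft.fft2(tile)
            weight = np.abs(spec) ** alpha
            out[r:r + patch, c:c + patch] = np.fft.ifft2(spec * weight)
    return out

# Synthetic noisy fringe pattern as a stand-in for a real interferogram
rng = np.random.default_rng(0)
y, x = np.mgrid[0:128, 0:128]
phase = 2 * np.pi * x / 16                      # regular fringes, period 16 px
noisy = np.exp(1j * (phase + 0.8 * rng.standard_normal((128, 128))))
filtered = goldstein_filter(noisy)
```

Raising the spectrum to the power alpha amplifies the dominant fringe frequency relative to the noise floor, which is precisely the narrow-band property of the fringe spectrum that the conclusion relies on.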

6.2 FUTURE SCOPE

In future, many other filters could be tested for generating a Digital Elevation Model (DEM) from InSAR data; such filters may further improve the quality of the three-dimensional representation of a terrain. A combination of our approach with filtering applied at the phase unwrapping stage could also significantly improve the quality of the DEM.
APPENDIX A
LIST OF VARIABLES

Parameter | Symbol | Units
Azimuth resolution |  | m
Azimuth pixel spacing (sampling) |  | m
Baseline vector between two acquisition locations |  | m
Length of component of baseline vector parallel to line of sight |  | m
Length of component of baseline vector perpendicular to line of sight |  | m
Critical baseline |  | m
Cross-track component of baseline |  | m
Instantaneous cross-track component of baseline at azimuth position i |  | m
Normal component of baseline |  | m
Azimuth processed bandwidth |  | Hz
Range chirp bandwidth |  | MHz
Speed of light | c | m/s
Basis vector for "cross-track" axis of TCN coordinate system | ĉ | -
Coarse azimuth offset | ca | pixels
Coarse range offset | cr | pixels
Radar carrier frequency | f0 | GHz
Azimuth sampling rate (also PRF) | fAs | Hz
Range sampling rate | fRs | MHz
Shift in azimuth frequency | fa | Hz
Shift in range frequency | fr | MHz
Complex flattened interferogram | F | -
Complex normalized flattened interferogram | F̂ | -
Complex filtered flattened interferogram | F′ | -
Complex raw interferogram | G | -
Complex normalized raw interferogram | Ĝ | -
Measured height value | h | m
Ambiguity height: change in elevation required to cause a full fringe of phase rotation | h2π | m
Basis vector for "normal" axis of TCN coordinate system | n̂ | -
Reference height of satellite in simplified 2-D geometry | H | m
Position vector of point on Earth's surface | P | m
Position vector of point on DEM | Pd | m
Position vector of point on ellipsoid | Pe | m
Pulse Repetition Frequency | PRF | Hz
Local radius of Earth | rE | m
Range distance from spacecraft to point on Earth's surface | R | m
Range distance from antenna 1 to point on Earth's surface | R1 | m
Range distance from antenna 2 to point on Earth's surface | R2 | m
Slant range resolution | Rr | m
Slant range pixel spacing (sampling) | Rs | m
Slant range distance per nominal "flat Earth" fringe | R2π | m
Spacecraft position vector, antenna 1 | S1 | m
Spacecraft position vector, antenna 2 | S2 | m
Basis vector for "tangential" axis of TCN coordinate system | t̂ | -
Instantaneous spacecraft velocity vector | vs | m/s
Instantaneous velocity vector of point on Earth's surface | vp | m/s
Ground slope | α | radians
Change in cross-track baseline per unit time caused by orbital azimuth convergence | αc | m/s
Difference in range distances between two acquisitions | δr | m
Interferometric phase | φ | radians
Phase difference modeled using DEM | φd | radians
Phase difference modeled using ellipsoid | φe | radians
Phase difference modeled synthetically using either DEM or ellipsoid | φm | radians
Interferometric coherence | γ | -
Coherence caused by system response | γH | -
Radar wavelength | λ | cm
Incidence angle between line-of-sight vector and Earth surface normal | θ | radians
Nominal change in incidence angle between acquisitions | Δθ | radians
Backscatter coefficient | σ0 | -

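For reference, the ambiguity height h2π listed above follows from the imaging geometry through the standard repeat-pass InSAR relation found, for example, in [7]. Using the wavelength λ, slant range R, and incidence angle θ from the table, and writing B⊥ for the length of the baseline component perpendicular to the line of sight:

```latex
h_{2\pi} = \frac{\lambda \, R \sin\theta}{2 \, B_{\perp}}
```

A larger perpendicular baseline therefore yields a smaller ambiguity height, i.e. denser fringes for the same relief.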
APPENDIX B
GLOSSARY

This glossary contains short explanations of remote sensing terms and acronyms that
are used in this report.

COHERENCE
is a measure used in SAR interferometry to quantify the amount of phase noise present.
The quantity ranges between zero and one, with zero indicating a completely
uncorrelated random phase, and a maximum of one resulting when all phase
contributions are identical.
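In practice this quantity is estimated from the two co-registered complex (SLC) images over a small moving window. The following is a minimal NumPy sketch of such an estimator; the 5 x 5 boxcar window and the variable names are illustrative assumptions, not the processing chain used in this project.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def coherence(master, slave, win=5):
    """Sample coherence of two co-registered complex SAR images.

    gamma = |<m s*>| / sqrt(<|m|^2> <|s|^2>), averaged over a win x win
    boxcar window, giving values in [0, 1].
    """
    def boxcar(a):
        v = sliding_window_view(a, (win, win))
        return v.mean(axis=(-2, -1))

    num = np.abs(boxcar(master * np.conj(slave)))
    den = np.sqrt(boxcar(np.abs(master) ** 2) * boxcar(np.abs(slave) ** 2))
    return num / den

rng = np.random.default_rng(1)
m = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
identical = coherence(m, m)                    # identical signals
s = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
independent = coherence(m, s)                  # uncorrelated signals
```

Identical inputs give gamma equal to one everywhere, while statistically independent inputs give small values, matching the range described above.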

DEM
stands for Digital Elevation Model. Elevations are ground-level heights above a given reference surface, typically mean sea level.

D-PAF
is an acronym for the German (Deutsches) Processing and Archiving Facility, one of
the four major facilities constructed in Europe for the processing of ERS data.

DSM
is an acronym for Digital Surface Model. Surface height values are stored rather than
the ground-level elevation measurements found in a DEM. For example, tree heights
are considered in a DSM, but ignored in a DEM.

DTM
is an acronym for Digital Terrain Model. DTM and DEM are often used synonymously. DTM is the more general expression since, in addition to elevation information, it can also include descriptions of terrain break lines and other topographical features.

ECR
is an acronym for Earth Centred Rotating. The reference frame is not fixed to the stars,
but rather to the rotating Earth.

ERS
is an acronym for European Remote Sensing Satellite. At C-band, VV polarization, the
two ESA satellites ERS-1 (launched July 1991) and ERS-2 (launched April 1995) have
pioneered operational active microwave remote sensing from space.

ESA
is an acronym for European Space Agency, an organization with a mandate from a
group of European states to further the exploration and exploitation of space.

FIR
is an acronym for Finite Impulse Response. Within the field of signal processing, an
FIR filter has a kernel of finite length.

GEOS
is the name of the operational SAR geocoding software developed by a consortium of
the D-PAF, the University of Zürich’s Remote Sensing Laboratories, and the Joanneum
Research Centre in Graz, Austria. The software system has been used to operationally
produce terrain-geocoded ERS products since 1992.

GEC
is an acronym for Geocoded Ellipsoid Corrected. An ellipsoid model is used to
transform from range Doppler geometry into the chosen map reference system.

GIS
is an acronym for Geographic Information System. Spatial information is stored and
organized in a computer.

GTC
is an acronym for Geocoded Terrain Corrected. A digital elevation model is used to transform from range Doppler geometry into the chosen map reference system. Terrain correction enables overlay of multi-temporal SAR images acquired with heterogeneous geometries, and is also a prerequisite for combining SAR data with geographically tagged information from other sources.

HEIGHT FIELD
A height field is a matrix of numerical values representing the vertical height above
some reference surface.

HYPSOGRAPHY
is that branch of geography that deals with the measurement and mapping of the
topography of the Earth above sea level.

LAYOVER
In areas with steep slopes, a mountain peak may be closer to the radar satellite than a valley floor that lies nearer the radar sensor's ground track. In such cases, distance from the ground track does not increase with slant range as it does in flat or less steep terrain; instead it decreases with increasing slant range before reversing again, reassuming its positive correlation with slant range. The effect causes concentration of a large map-geometry area within a small slant range region, and is strongest at steep incidence angles.

RSAT-1(RADARSAT-1)
is the first Earth-observation satellite launched by Canada, in November 1995. At C-
band HH polarization, it uses electronic beam steering to acquire swaths from a variety
of incidence angles, and optionally interleaving (ScanSAR) to form a wider swath.

RGB
is an abbreviation for Red, Green and Blue, the additive primary colours.

SAR
is an acronym for Synthetic Aperture Radar. SAR is an active microwave remote
sensing system that makes use of signal processing techniques to “synthesize” a pseudo
antenna along the length of the radar platform's flight track. The generated images show the Earth's reflective properties at microwave wavelengths.

SHADOW
Radar shadow occurs when terrain obstructs a portion of the Earth’s surface from a line
of sight connection to the radar sensor’s flight track. Such areas are not illuminated by
the microwave pulses emitted by the radar, and therefore do not produce an echo. After
range and azimuth compression, their range-Doppler coordinates are empty. Note that
radar shadow differs from optical shadow in that the illumination originates from the sensor itself, and not from the sun as in electro-optical imagery.

SNR
is an abbreviation for signal to noise ratio, the ratio between the pure signal being
estimated and the noise in the channel being measured.

SRTM
is an acronym for Shuttle Radar Topography Mission. Working together with the German Space Agency, NASA flew this week-long shuttle mission in February 2000, using pairs of C-band and X-band antennae (one of each on a boom) to map the non-polar areas of the globe.

TANDEM MISSION
One often refers to the use of both the ERS-1 and ERS-2 satellites together in tandem
as a “tandem mission”. During 1995 and 1996 they acquired much of the globe with a
repeat-pass interval of just one day. In comparison to the previously sparsely scattered
set of available InSAR pairs (from earlier ERS-1 3-day repeat orbits), the tandem
mission significantly increased the available coverage.

REFERENCES

[1] Rui Song, Huadong Guo, Guang Liu, Zbigniew Perski, and Jinghui Fan. "Improved Goldstein SAR Interferogram Filter Based on Empirical Mode Decomposition". IEEE Geoscience and Remote Sensing Letters, vol. 11, no. 2, 2014.

[2] S. Chelbi, A. Khireddine, J. P. Charles. "Interferometry Process for Satellite Images SAR". ELECO 2011: 7th International Conference on Electrical and Electronics Engineering, Bursa, Turkey, 1-4 December 2011.

[3] A. Bin, L. Xia, X. Zheng. "Reconstructing High-Accuracy DEM with Precise Orbit Data and External DEM". Progress in Electromagnetics Research, vol. 14, pp. 15-32, 2010.

[4] Juan J. Martinez-Espla, Tomás Martinez-Marin, and Juan M. Lopez-Sanchez. "A Particle Filter Approach for InSAR Phase Filtering and Unwrapping". IEEE Transactions on Geoscience and Remote Sensing, vol. 47, no. 4, 2009.

[5] Fabio Rocca, Politecnico di Milano. "Spectral Shift and Differential Interferometry", 2008.

[6] Zhengxiao Li, James Bethel. "Image Coregistration in SAR Interferometry". The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. XXXVII, part B1, 2008.

[7] Alessandro Ferretti, Andrea Monti-Guarnieri, Claudio Prati, Fabio Rocca. "InSAR Principles: Guidelines for SAR Interferometry Processing and Interpretation", 2007.

[8] Mark A. Richards, Georgia Institute of Technology. "A Beginner's Guide to Interferometric SAR Concepts and Signal Processing". IEEE A&E Systems Magazine, vol. 22, no. 9, 2007.

[9] Okeke, F. I. "InSAR Operational and Processing Steps for DEM Generation, in Promoting Land Administration and Good Governance", pp. 1-13, 2006.

[10] Stephane Guillaso, Andreas Reigber, Laurent Ferro-Famil, Eric Pottier. "Range Resolution Improvement of Airborne SAR Images". IEEE Geoscience and Remote Sensing Letters, vol. 3, no. 1, 2006.

[11] C. Damerval, S. Meignen, V. Perrier. "A Fast Algorithm for Bidimensional EMD". IEEE Signal Processing Letters, vol. 12, no. 10, 2005.

[12] C. M. Han, H. D. Guo, C. L. Wang, D. Fan, H. Y. Sang. "Edge-Preserving Filter for SAR Images". Chinese High Technology Letters, vol. 7, pp. 11-15, 2003.

[13] S. Beucher. "Geodesic Reconstruction, Saddle Zones and Hierarchical Segmentation". Image Analysis & Stereology, vol. 20, no. 3, 2001.

[14] David Small. "Generation of Digital Elevation Models through Spaceborne SAR Interferometry", 1998.

[15] Richard Bamler and Philipp Hartl. "Synthetic Aperture Radar Interferometry", 1998.

[16] R. M. Goldstein, C. L. Werner. "Radar Interferogram Filtering for Geophysical Applications". Geophysical Research Letters, vol. 25, no. 21, 1998.

[17] Fabio Gatelli, Andrea Monti Guarnieri, Francesco Parizzi, Paolo Pasquali, Claudio Prati, and Fabio Rocca. "The Wavenumber Shift in SAR Interferometry". IEEE Transactions on Geoscience and Remote Sensing, vol. 32, no. 4, 1994.

[18] L. Vincent. "Morphological Grayscale Reconstruction in Image Analysis". Harvard Robotics Lab, USA, 1993.

[19] Madsen, S. N., Zebker, H. A., and Martin, J. "Topographic Mapping Using Radar Interferometry: Processing Techniques". IEEE Transactions on Geoscience and Remote Sensing, vol. 31, no. 1, pp. 246-256, 1993.

[20] C. Prati, F. Rocca. "Focusing SAR Data with Time-Varying Doppler Centroid". IEEE Transactions on Geoscience and Remote Sensing, vol. 30, pp. 550-559, 1992.

[21] Gabriel, A. K., and Goldstein, R. M. "Crossed Orbit Interferometry: Theory and Experimental Results from SIR-B". International Journal of Remote Sensing, vol. 9, no. 5, pp. 857-872, 1988.

[22] Li, F., and Goldstein, R. M. "Studies of Multi-Baseline Spaceborne Interferometric SAR". IEEE Transactions on Geoscience and Remote Sensing, vol. 28, no. 1, pp. 88-97, 1987.

[23] Zebker, H. A., and Goldstein, R. "Topographic Mapping from Interferometric Synthetic Aperture Radar Observations". Journal of Geophysical Research, vol. 91, no. 5, pp. 4993-4999, 1986.

[24] Graham, L. "Synthetic Interferometer Radar for Topographic Mapping". Proceedings of the IEEE, vol. 62, no. 6, pp. 763-768, 1974.

[25] Shashi Kumar. Indian Institute of Remote Sensing (Department of Space, Government of India).

LIST OF PUBLICATIONS

[1] Ram Shankar Modwel, Saurabh Agrawal, Shubham Srivastava, Tanu Shree. "Spectral Filter Testing for DEM Generation Using InSAR Data". Journal of Advance Research in Remote Sensing & GeoScience, vol. 2, no. 1, 2015.
