Image Restoration
CONTENT
• Introduction and Overview
– Noise Models, n(r,c) – the additive noise function
  • Gaussian
  • Salt-and-pepper
  • Uniform
  • Rayleigh
– Noise Removal using Spatial Filters
  • Order filters
  • Mean filters
– The Degradation Function, h(r,c)
– Geometric Transforms
Introduction
• IR is the process of finding an approximation to the degradation process and then finding the appropriate inverse process to estimate the original image
• IR differs from image enhancement (IE) in that it uses a mathematical model for the image degradation
Overview
• Types of degradation:
– Blurring caused by motion or atmospheric disturbance
– Geometric distortion caused by imperfect lenses
– Superimposed interference patterns caused by mechanical systems
– Spatial quantization
– Noise from electronic sources
• h(r,c) = the degradation function
• n(r,c) = the additive noise function
IR process (fig 9.1-1)
• We see that sample degraded images and knowledge of the image acquisition process are inputs to the development of a degradation model
• After the model has been developed, the next step is the
formulation of the inverse process
Gaussian
• Histogram model:
  H_Gaussian(g) = (1 / (σ√(2π))) · e^(−(g − m)² / (2σ²))
  where g = gray level, m = mean (average), σ = standard deviation, σ² = variance
• A bell-shaped distribution
• 70% of all values fall within the range from
one standard deviation (σ) below the mean
(m) to one above
• About 95% fall within two standard deviations
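• For illustration only (not part of the original slides), a minimal NumPy sketch of generating Gaussian noise with a chosen mean m and standard deviation σ and adding it to an image; the image and parameter values here are arbitrary assumptions:

```python
import numpy as np

def add_gaussian_noise(image, mean=0.0, sigma=20.0, rng=None):
    """Add Gaussian noise n(r,c) with the given mean and standard deviation."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(loc=mean, scale=sigma, size=image.shape)
    noisy = image.astype(np.float64) + noise
    return np.clip(noisy, 0, 255).astype(np.uint8)

# Example on a synthetic flat-gray image (values are illustrative only)
I = np.full((64, 64), 128, dtype=np.uint8)
d = add_gaussian_noise(I, mean=0.0, sigma=20.0)
```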
Uniform distribution
• The gray level values of the noise are evenly distributed across a specific range
• Histogram model:
  H_Uniform(g) = 1 / (b − a)   for a ≤ g ≤ b
               = 0             elsewhere
• mean = (a + b) / 2
• variance = (b − a)² / 12
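• A quick numerical check of the mean and variance formulas above (a sketch, not from the slides; the range [0, 255] is an arbitrary choice):

```python
import numpy as np

a, b = 0.0, 255.0
samples = np.random.default_rng(0).uniform(a, b, size=1_000_000)
print(samples.mean(), (a + b) / 2)        # both close to 127.5
print(samples.var(), (b - a) ** 2 / 12)   # both close to 5418.75
```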
Salt-and-pepper
• Histogram model:
  H_salt&pepper(g) = A   for g = a ("pepper")
                   = B   for g = b ("salt")
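• A hedged sketch (illustrative only) of adding salt-and-pepper noise by forcing a small fraction of pixels to the pepper value a and salt value b; the probability 0.03 matches the one quoted later in Figure 9.3-5:

```python
import numpy as np

def add_salt_and_pepper(image, prob=0.03, pepper=0, salt=255, rng=None):
    """Two-spike model: a fraction `prob` of pixels becomes pepper (g = a)
    and another fraction `prob` becomes salt (g = b)."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = image.copy()
    r = rng.random(image.shape)
    noisy[r < prob] = pepper          # "pepper" spikes at g = a
    noisy[r > 1.0 - prob] = salt      # "salt" spikes at g = b
    return noisy

I = np.full((64, 64), 128, dtype=np.uint8)
d = add_salt_and_pepper(I, prob=0.03)
```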
Order Filters
• Operate by ranking (ordering) the gray level values within the window, where I1, I2, I3, ..., IN² are the intensity (gray level) values
• Median filter example – for the 3×3 subimage
  110 110 114
  100 104 104
   95  88  85
  the ordered values are 85, 88, 95, 100, 104, 104, 110, 110, 114, so the median filter output is 104
[Figure: median filtering results – c) after median filtering with a 5×5 window, all the noise is removed, but the image is blurry, acquiring the "painted" effect.]
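• To verify the median value above, a two-line NumPy check (sketch only):

```python
import numpy as np

window = np.array([[110, 110, 114],
                   [100, 104, 104],
                   [ 95,  88,  85]])
print(np.sort(window.ravel()))   # [ 85  88  95 100 104 104 110 110 114]
print(np.median(window))         # 104.0 -> the median filter output
```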
• Two order filters are the midpoint and alpha-trimmed mean filters – both order and mean filters, since they rely on ordering the pixel values but are then calculated by an averaging process
• Midpoint filter – the average of the maximum and minimum within the window:
  Ordered set: I1 ≤ I2 ≤ I3 ≤ ... ≤ IN²
  Midpoint = (I1 + IN²) / 2
• Most useful for Gaussian and uniform noise
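• A minimal sketch of the midpoint filter using SciPy's max/min filters; the window size is an assumption:

```python
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter

def midpoint_filter(image, size=3):
    """Midpoint filter: (I1 + IN^2) / 2, i.e. (window min + window max) / 2."""
    img = image.astype(np.float64)
    return (maximum_filter(img, size=size) + minimum_filter(img, size=size)) / 2.0
```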
• Alpha-trimmed mean is the average of the pixel values within the window, but with some of the endpoint-ranked values excluded
• Useful for images containing multiple types of noise, e.g., Gaussian and salt-and-pepper noise
• Formula:
  Ordered set: I1 ≤ I2 ≤ I3 ≤ ... ≤ IN²
  Alpha-trimmed mean = (1 / (N² − 2T)) · Σ (i = T+1 to N² − T) Ii
  where T is the number of pixel values excluded at each end of the ordered set, and can range from 0 to (N² − 1)/2
• The alpha-trimmed mean filter ranges from a mean filter to a median filter, depending on the value selected for the T parameter
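• A hedged sketch of the alpha-trimmed mean filter (a straightforward loop implementation; the padding mode and default T are assumptions):

```python
import numpy as np

def alpha_trimmed_mean_filter(image, size=3, T=1):
    """Sort the N^2 window values, drop T from each end, average the rest
    (T = 0 gives the arithmetic mean, T = (N^2 - 1)//2 gives the median)."""
    pad = size // 2
    padded = np.pad(image.astype(np.float64), pad, mode="edge")
    out = np.empty(image.shape, dtype=np.float64)
    for r in range(image.shape[0]):
        for c in range(image.shape[1]):
            window = np.sort(padded[r:r + size, c:c + size], axis=None)
            out[r, c] = window[T:window.size - T].mean()
    return out
```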
Exercise
Apply the following filters to the 3 × 3 subimage below, and find the output for each: (a) median, (b) maximum, (c) minimum, (d) midpoint, (e) alpha-trimmed mean with T = 2.
10 11 10
12 12 11
9 10 9
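• As a self-check (not part of the original exercise), a short NumPy sketch that computes each requested filter output for this subimage:

```python
import numpy as np

sub = np.array([[10, 11, 10],
                [12, 12, 11],
                [ 9, 10,  9]], dtype=float)
vals = np.sort(sub.ravel())
T = 2
print("sorted  :", vals)                           # 9 9 10 10 10 11 11 12 12
print("median  :", np.median(vals))                # 10.0
print("maximum :", vals.max())                     # 12.0
print("minimum :", vals.min())                     # 9.0
print("midpoint:", (vals.min() + vals.max()) / 2)  # 10.5
print("alpha-trimmed mean (T=2):", vals[T:vals.size - T].mean())  # 10.4
```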
Figure 9.3-5 Alpha-Trimmed Mean. This filter can vary between a mean filter and a median
filter. a) Image with added noise: zero-mean Gaussian noise with a variance of 200, and salt-
and-pepper noise with probability of each = 0.03, b) result of alpha-trimmed mean filter, mask
size = 3x3, T = 1, c) result of alpha-trimmed mean filter, mask size = 3x3, T = 2, d) result of alpha-
trimmed mean filter, mask size = 3x3, T = 4. As the T parameter increases, the filter becomes
more like a median filter, and so becomes more effective at removing the salt-and-pepper noise.
Mean Filters
• Function by finding some form of an average within the N×N window, using the sliding window concept to process the entire image
• The most basic is the arithmetic mean filter, which finds the arithmetic average of the pixel values:
  Arithmetic mean = (1 / N²) · Σ (r,c)∈W d(r, c)
  where N² = the number of pixels in the N×N window, W
• Smooths out local variations and works best with Gaussian, gamma, and uniform noise
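• A minimal sketch of the arithmetic mean filter; scipy.ndimage.uniform_filter computes exactly the (1/N²)·Σ window average (the window size is an assumption):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def arithmetic_mean_filter(image, size=3):
    """Arithmetic mean of the size x size window around each pixel."""
    return uniform_filter(image.astype(np.float64), size=size)
```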
• The contra-harmonic mean filter works well for images containing salt OR pepper type noise, depending on the filter order, R:
  Contra-harmonic mean = Σ (r,c)∈W d(r, c)^(R+1) / Σ (r,c)∈W d(r, c)^R
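• A hedged sketch of the contra-harmonic mean filter; the window size and R are assumptions (positive R attacks pepper noise, negative R attacks salt noise):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def contraharmonic_mean_filter(image, size=3, R=1.5):
    """Ratio of the window means of d^(R+1) and d^R, which equals the
    ratio of the window sums in the formula above."""
    img = image.astype(np.float64) + 1e-12   # avoid 0 ** negative and 0/0
    num = uniform_filter(img ** (R + 1), size=size)
    den = uniform_filter(img ** R, size=size)
    return num / den
```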
The Degradation Function
• Either spatially-invariant or spatially-variant
• Spatially-invariant degradation affects all pixels in the image in the same way
– E.g., poor lens focus and camera motion
• Spatially-variant degradation depends on spatial location and is more difficult to model
– E.g., imperfections in a lens or object motion
Point Spread Function
• d(r,c) = I(r,c) * h(r,c) where * denotes the
convolution process
• h(r,c) is called the point spread function (PSF), or blur function
• The PSF of a linear, spatially-invariant (shift-invariant) system can be determined empirically by imaging a single point of light
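• A minimal sketch (illustrative only) of d(r,c) = I(r,c) * h(r,c) with an assumed 3×3 averaging PSF; convolving a single point of light reproduces the PSF, which is the empirical measurement idea above:

```python
import numpy as np
from scipy.signal import convolve2d

h = np.ones((3, 3)) / 9.0   # assumed blur PSF, normalized to sum to 1

I = np.zeros((15, 15))
I[7, 7] = 255.0             # a single point of light

# d(r,c) = I(r,c) * h(r,c): the output around (7, 7) is a scaled copy of h
d = convolve2d(I, h, mode="same", boundary="symm")
```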
Estimation of the Degradation
Function
• Estimated primarily by combinations of
1. Image analysis
2. Experimentation
3. Mathematical modeling
• Image analysis: examine a known point or line in an image, and estimate the PSF by measuring the width and distribution of the known feature in the blurred image
• Experimentation :
1. PSF can be found by imaging a point of light
2. A more reliable method is to use sinusoidal inputs at many different spatial frequencies to find H(u,v), the frequency-domain form of the degradation function
• Mathematical modeling examples:
1. The motion blur model
2. Atmospheric turbulence degradation model
used in astronomy and remote sensing
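• A hedged sketch of the motion blur model idea: an assumed PSF for uniform horizontal motion is a normalized line of equal weights (the length and test image are arbitrary):

```python
import numpy as np
from scipy.signal import convolve2d

def horizontal_motion_blur_psf(length=9):
    """Assumed PSF for uniform horizontal motion blur."""
    return np.ones((1, length)) / length

I = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(np.float64)
d = convolve2d(I, horizontal_motion_blur_psf(9), mode="same", boundary="symm")
```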
Geometric Transforms
• Spatially-variant
• Images that have been spatially, or geometrically,
distorted
• Used to modify the location of pixel values within
an image, typically to correct images that have
been spatially warped
• Often referred to as rubber-sheet transforms – the image is modeled as a sheet of rubber that can be stretched and shrunk
• Caused by defective optics in the image acquisition system, distortion in image display devices, or 2-D imaging of 3-D surfaces
• These methods are used in map making, image registration, image morphing, and other applications requiring spatial modification
• Simplest – translate, rotate, zoom, and shrink
• More sophisticated – 1) spatial transform and 2) gray level interpolation
Input Image → Spatial Transform → Gray Level Interpolation → Output Image
Spatial Transforms
• Used to map the input image location to a location in the output image; it defines how the pixel values in the output image are to be arranged
• Geometric distortion model:
  r̂ = R̂(r, c)
  ĉ = Ĉ(r, c)
  I(r, c) → d(r̂, ĉ)
• The original, undistorted image is I(r,c), and the distorted (or degraded) image is d(r̂, ĉ)
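• A hedged sketch of the two-step idea (spatial transform, then gray level interpolation); the mapping functions R̂ and Ĉ are assumed here to be a simple rotation about the image center, and nearest-neighbor interpolation is used:

```python
import numpy as np

def geometric_transform(d, angle_deg=5.0):
    """For each output pixel (r, c), compute the mapped coordinates
    (r_hat, c_hat) in the image d (assumed mapping: rotation about the
    center), then pick the gray level of the nearest neighbor."""
    rows, cols = d.shape
    out = np.zeros_like(d)
    theta = np.deg2rad(angle_deg)
    r0, c0 = (rows - 1) / 2.0, (cols - 1) / 2.0
    for r in range(rows):
        for c in range(cols):
            r_hat = r0 + (r - r0) * np.cos(theta) - (c - c0) * np.sin(theta)
            c_hat = c0 + (r - r0) * np.sin(theta) + (c - c0) * np.cos(theta)
            ri, ci = int(round(r_hat)), int(round(c_hat))
            if 0 <= ri < rows and 0 <= ci < cols:
                out[r, c] = d[ri, ci]   # nearest-neighbor gray level interpolation
    return out
```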