
3.3 Histogram Processing
- Histogram (直方圖)
• Histogram of a digital image is a distribution function
  h(rk) = nk
  – where rk is the kth gray level
    and nk is the number of pixels having gray level rk
• Normalized histogram: p(rk) = nk / n,
  where n is the total number of pixels

(Figure: histogram plot of nk versus rk)
credit of this slide: Y. P. Hung
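The definitions above map directly to code. A minimal sketch, assuming the image is a NumPy array of integer gray levels (the function names are illustrative):

```python
import numpy as np

def histogram(img, levels=256):
    """h(rk) = nk: the number of pixels at each gray level rk."""
    h = np.zeros(levels, dtype=np.int64)
    for v in img.ravel():
        h[v] += 1
    return h

def normalized_histogram(img, levels=256):
    """p(rk) = nk / n: an estimate of the gray-level PDF."""
    return histogram(img, levels) / img.size

img = np.array([[0, 1, 1],
                [2, 1, 0]], dtype=np.uint8)
print(histogram(img, levels=4).tolist())  # [2, 3, 1, 0]
```

In practice `np.bincount(img.ravel(), minlength=levels)` does the same counting in one call.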

3.3 Histogram Processing


- Histogram (直方圖)

credit of this slide: C. Nikou


3.3 Histogram Processing


- Histogram (直方圖)
• Histogram is widely used in computer vision,
  because it can provide a scale-, rotation-, and
  view-angle-invariant descriptor of an object

(Figures: single-camera mean-shift object tracking; multi-camera object tracking)

3.3 Histogram Processing


- Histogram (直方圖)
• Four image types and their corresponding histograms

Note that a high-contrast image usually has a flatter histogram

3.3 Histogram Processing


- Histogram (直方圖)
• Four image types and their corresponding histograms

=> Histogram Equalization

Note that a high-contrast image usually has a flatter histogram

3.3 Histogram Processing


- Histogram Equalization

Image
Enhancement

Histogram
Equalization

To make the histogram uniformly distributed


credit of this slide: Y. P. Hung

3.3 Histogram Processing


- Histogram Equalization
At first, the continuous case will be studied:
- r is the intensity of the image in [0, L-1].
- The transformation s = T(r):
  - T(r) is strictly monotonically increasing.
  - T(r) must satisfy:

    0 ≤ T(r) ≤ L - 1, for 0 ≤ r ≤ L - 1

monotonically increasing: guarantees that the ordering of
the output intensity values will be the same as that of
the input (avoids reversal of intensities)

3.3 Histogram Processing


- Histogram Equalization
At first, the continuous case will be studied:
- r is the intensity of the image in [0, L-1].
- The transformation s = T(r):
  - T(r) is strictly monotonically increasing.
  - T(r) must satisfy:

    0 ≤ T(r) ≤ L - 1, for 0 ≤ r ≤ L - 1

strictly monotonically increasing: guarantees that the
mapping from s back to r will be one-to-one

3.3 Histogram Processing


- Histogram Equalization
strictly monotonically increasing: guarantees that the
mapping from s back to r will be one-to-one

(Figure: monotonically increasing vs. strictly monotonically increasing transformations)

3.3 Histogram Processing


- Histogram Equalization
• We then can view intensities r and s as random
variables and their histograms as probability density
functions (PDFs) pr(r) and ps(s).

• A fundamental result from probability theory:


- If pr(r) and T(r) are known, and T(r) is continuous
and differentiable, then

ps(s) = pr(r) |dr/ds| = pr(r) · 1 / |ds/dr|
https://www.pbr-book.org/3ed-2018/Monte_Carlo_Integration/Transforming_between_Distributions

3.3 Histogram Processing


- Histogram Equalization

ps(s) = pr(r) |dr/ds| = pr(r) · 1 / |ds/dr|
• The PDF of the output s is determined by the PDF of
the input r and the transformation T(r), which means
we can determine the histogram of the output image

3.3 Histogram Processing


- Histogram Equalization
• A transformation of particular importance in image
processing is cumulative distribution function (CDF)
of a random variable:
s = T(r) = (L-1) ∫₀^r pr(w) dw

• It satisfies the first condition, as the area under the
  curve increases as r increases.
• It satisfies the second condition: for r = L-1, the
  integral evaluates to 1, thus the maximum is s = L-1.

3.3 Histogram Processing


- Histogram Equalization
• To find ps(s), we can compute ds/dr from

  s = T(r) = (L-1) ∫₀^r pr(w) dw

  ds/dr = d/dr [ (L-1) ∫₀^r pr(w) dw ] = (L-1) pr(r)

  and substitute it into ps(s) = pr(r) |dr/ds|.

3.3 Histogram Processing


- Histogram Equalization
Substituting this result: ds/dr = (L-1) pr(r)

into ps(s) = pr(r) |dr/ds|

we have

  ps(s) = pr(r) · 1 / ((L-1) pr(r)) = 1/(L-1), for 0 ≤ s ≤ L-1

a Uniform PDF

3.3 Histogram Processing


- Histogram Equalization

Uniform PDF
s = T(r) = (L-1) ∫₀^r pr(w) dw

3.3 Histogram Processing


- Histogram Equalization
s = T(r) = (L-1) ∫₀^r pr(w) dw

• For the discrete case, the formula of histogram
  equalization is

  sk = T(rk) = (L-1) Σ_{j=0}^{k} pr(rj)

3.3 Histogram Processing


- Algorithm of Histogram Equalization
1. Compute the histogram of the input image:
h(k) = #{(x,y)|f(x,y)=k}, where k = 0 to 255.

2. Compute the transformation function: Cumulative normalized histogram


T(k) = 255 * Σ_{j=0}^{k} h(j)/n

3. Transform the value of each pixel by


g(x,y)=T(f(x,y))

credit of this slide: Y. P. Hung
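The three steps above can be sketched as follows; `equalize` and the test image are illustrative names, not part of the slides:

```python
import numpy as np

def equalize(img, levels=256):
    """Histogram equalization: T(k) = (L-1) * sum_{j<=k} h(j)/n,
    then g(x,y) = T(f(x,y))."""
    h = np.bincount(img.ravel(), minlength=levels)   # 1. histogram of the input
    cdf = np.cumsum(h) / img.size                    # 2. cumulative normalized histogram
    T = np.round((levels - 1) * cdf).astype(np.uint8)
    return T[img]                                    # 3. per-pixel lookup

# A low-contrast image confined to levels 100..103 spreads over the full range.
img = np.tile(np.array([100, 101, 102, 103], dtype=np.uint8), (4, 1))
out = equalize(img)
print(out.min(), out.max())  # 64 255
```

The lookup-table form `T[img]` applies the transformation to every pixel at once, which is how step 3 is usually implemented.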


3.3 Histogram Processing


- Example of Histogram Equalization

sk = T(rk) = (L-1) Σ_{j=0}^{k} pr(rj)

credit of this slide: C. Nikou


3.3 Histogram Processing


- Example of Histogram Equalization
Rounding to the nearest integer (四捨五入):
s0 = 1.33 → 1   s1 = 3.08 → 3   s2 = 4.55 → 5   s3 = 5.67 → 6
s4 = 6.23 → 6   s5 = 6.65 → 7   s6 = 6.86 → 7   s7 = 7.00 → 7

credit of this slide: C. Nikou
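These sk values can be reproduced numerically. The per-level pixel counts nk below do not appear on the slide; they are the standard counts for this classic 3-bit 64x64 example (from Gonzalez & Woods):

```python
import numpy as np

# Pixel counts n_k for the 3-bit 64x64 (4096-pixel) example; taken from
# Gonzalez & Woods -- the slide shows only the resulting s_k values.
nk = np.array([790, 1023, 850, 656, 329, 245, 122, 81])
L = 8
pr = nk / nk.sum()               # p_r(r_k) = n_k / n
sk = (L - 1) * np.cumsum(pr)     # s_k = (L-1) * sum_{j<=k} p_r(r_j)
# The slide's 1.33, 3.08, ... were computed from two-decimal probabilities,
# so exact intermediates differ slightly, but the rounded mapping agrees:
print(np.round(sk).astype(int).tolist())  # [1, 3, 5, 6, 6, 7, 7, 7]
```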


3.3 Histogram Processing


- Example of Histogram Equalization

For the discrete case, the resulting histogram will rarely be
perfectly uniform, but the net result is contrast enhancement.

3.3 Histogram Processing


- Examples of Histogram Equalization

3.3 Histogram Processing


- Examples of Histogram Equalization

(Figure: example images (1)-(4) and their equalization results)

3.3 Histogram Processing


- Examples of Histogram Equalization
For the discrete case, histogram bins are never reduced in
amplitude, although they may grow when multiple gray levels
map to the same output value (thus destroying information)

Histogram equalization
does not always provide
the desirable results

credit of this slide: Y. P. Hung


3.3 Histogram Processing


- Histogram Specification
Histogram equalization
does not always provide
the desirable results

Original image Histogram equalization

Want to transform an image into one
that has a specific histogram
=> Histogram Specification

3.3 Histogram Processing


- Histogram Specification
Histogram Specification (Histogram Matching)
• Problem statement:
  – Given pr(r) from the image and the target
    histogram pz(z), estimate the transformation z = T(r).

The solution exploits histogram equalization

credit of this slide: C. Nikou

3.3 Histogram Processing


- Histogram Specification
• Equalize the initial histogram of the image:

  s = T(r) = (L-1) ∫₀^r pr(w) dw

• Equalize the target histogram:

  s = G(z) = (L-1) ∫₀^z pz(w) dw

• Obtain the inverse transform: z = G⁻¹(s)

credit of this slide: C. Nikou

3.3 Histogram Processing


- Histogram Specification
• Obtain the inverse transform: z = G⁻¹(s)
• In practice, for every value of r in the image:
  - get its equalized transformation s = T(r).
  - perform the inverse mapping z = G⁻¹(s), where
    s = G(z) is the equalized target histogram

(Figure: mapping rk → sk → zk, from the transformation (1) to its inverse (2))

credit of this slide: C. Nikou

3.3 Histogram Processing


- Histogram Specification
The discrete case:

• Equalize the initial histogram of the image:

  sk = T(rk) = (L-1) Σ_{j=0}^{k} pr(rj)

• Equalize the target histogram:

  sk = G(zq) = (L-1) Σ_{i=0}^{q} pz(zi)

• Obtain the inverse transform: zq = G⁻¹(sk)

credit of this slide: C. Nikou
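The whole discrete procedure can be sketched as below. The function name `match_histogram` is illustrative, and the test data re-uses the textbook counts for the 3-bit example (the counts are from Gonzalez & Woods, not the slides):

```python
import numpy as np

def match_histogram(img, target_pdf, levels=256):
    """Histogram specification: r -> s = T(r), then s -> z = G^{-1}(s).
    G may not be strictly monotonic, so the inverse picks the smallest z
    whose G(z) is closest to s."""
    L = levels
    pr = np.bincount(img.ravel(), minlength=L) / img.size
    T = np.round((L - 1) * np.cumsum(pr)).astype(int)          # s_k = T(r_k)
    G = np.round((L - 1) * np.cumsum(target_pdf)).astype(int)  # G(z_q)
    Ginv = np.array([np.argmin(np.abs(G - s)) for s in range(L)])
    return Ginv[T[img]].astype(img.dtype)

# 3-bit example: counts n_k as in Gonzalez & Woods, target p_z from the slides.
img = np.repeat(np.arange(8, dtype=np.uint8),
                [790, 1023, 850, 656, 329, 245, 122, 81])
pz = np.array([0, 0, 0, 0.15, 0.20, 0.30, 0.20, 0.15])
out = match_histogram(img, pz, levels=8)
print([int(out[img == r][0]) for r in range(8)])  # [3, 4, 5, 6, 6, 7, 7, 7]
```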

3.3 Histogram Processing


- Example of Histogram Specification
Consider again the 3-bit 64x64 image:

Histogram Equalization:
s0 = 1, s1 = 3, s2 = 5, s3 = 6, s4 = 6, s5 = 7, s6 = 7, s7 = 7

credit of this slide: C. Nikou
3.3 Histogram Processing
- Example of Histogram Specification
It is desired to transform this histogram to:

pz(z0) = 0.00  pz(z1) = 0.00  pz(z2) = 0.00  pz(z3) = 0.15
pz(z4) = 0.20  pz(z5) = 0.30  pz(z6) = 0.20  pz(z7) = 0.15

with z0 = 0, z1 = 1, z2 = 2, z3 = 3, z4 = 4, z5 = 5, z6 = 6, z7 = 7.
credit of this slide: C. Nikou
3.3 Histogram Processing
- Example of Histogram Specification
It is desired to transform this histogram to:

Equalizing the target histogram:
G(z0) = 0  G(z1) = 0  G(z2) = 0  G(z3) = 1
G(z4) = 2  G(z5) = 5  G(z6) = 6  G(z7) = 7
credit of this slide: C. Nikou
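These G(zq) values can be checked numerically with a quick sketch:

```python
import numpy as np

pz = np.array([0, 0, 0, 0.15, 0.20, 0.30, 0.20, 0.15])
G = np.round(7 * np.cumsum(pz)).astype(int)  # G(z_q) = (L-1) * cumulative p_z, L = 8
print(G.tolist())  # [0, 0, 0, 1, 2, 5, 6, 7]
```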
3.3 Histogram Processing
- Example of Histogram Specification
Notice that G(z) may not be strictly monotonic. We must resolve this
ambiguity by choosing, e.g., the smallest value for the inverse mapping.

credit of this slide: C. Nikou
3.3 Histogram Processing
- Example of Histogram Specification
Notice that due to discretization, the resulting histogram
will rarely be exactly the same as the desired histogram.

credit of this slide: C. Nikou
3.3 Histogram Processing
- Example of Histogram Specification

(Figure: original image vs. histogram equalization result)
3.3 Histogram Processing
- Example of Histogram Specification

(Figure: specified histogram; transformation function (1) and its
inverse (2); result image; resulting histogram)
3.3 Histogram Processing
- Histogram Specification
• When multiple images of the same scene, but taken
  under slightly different lighting conditions, are to be
  compared
  -- e.g., visual surveillance, image stitching, stereo, etc.

• Get high-contrast images by using a specified
  V-shaped histogram.

• Usually, histogram specification is a trial-and-error
  process.

credit of this slide: Y. P. Hung
3.3 Histogram Processing
- Local Histogram Processing

(Figure: original image; global histogram equalization; local histogram
equalization with a neighborhood of size 3x3)

• Overlapping windows – smooth, time consuming
• Non-overlapping windows – “blocky” effect, faster
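The overlapping-window variant can be sketched as below (a direct, unoptimized loop; `local_equalize` is an illustrative name). The nested per-pixel histograms are exactly why this approach is time consuming:

```python
import numpy as np

def local_equalize(img, size=3, levels=256):
    """Local histogram equalization with an overlapping size x size window:
    each pixel is replaced by its equalized value within its own window."""
    pad = size // 2
    padded = np.pad(img, pad, mode='symmetric')   # mirror padding at the borders
    out = np.empty_like(img)
    H, W = img.shape
    for y in range(H):
        for x in range(W):
            win = padded[y:y + size, x:x + size]
            h = np.bincount(win.ravel(), minlength=levels)
            cdf = np.cumsum(h) / win.size
            out[y, x] = np.round((levels - 1) * cdf[img[y, x]])
    return out
```

The non-overlapping variant would instead equalize each tile once and reuse the mapping for every pixel in the tile, trading smoothness for speed.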

3.3 Histogram Processing


- Local Histogram Processing

Local Enhancement based on Local Histogram Statistics

(Figure: original image; result of local enhancement using local
histogram statistics with a neighborhood of size 3x3; parameters
k0 = 0, k1 = 0.1, k2 = 0, k3 = 0.1, C = 22.8, defined relative to
the global mean and global std.)

3.4 Fundamentals of Spatial Filtering

• Filter, Mask, Kernel,
  Template, Window

• Coefficients

• Linear Filtering vs
  Nonlinear Filtering
  (e.g., median filtering)


3.4 Fundamentals of Spatial Filtering


- Linear Spatial Filtering
g(x, y) = Σ_{s=-a}^{a} Σ_{t=-b}^{b} w(s, t) f(x+s, y+t)

• The center coefficient of the kernel, w(0, 0), aligns
  with the pixel at location (x, y).

• For a kernel of size m x n, we assume that m = 2a + 1
  and n = 2b + 1, where a and b are nonnegative
  integers.
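A direct sketch of this formula with zero padding (the names are illustrative, and real implementations vectorize the inner sum):

```python
import numpy as np

def correlate(f, w):
    """g(x,y) = sum_{s=-a..a} sum_{t=-b..b} w(s,t) f(x+s, y+t),
    with zero padding so that w(0,0) aligns with the pixel at (x,y)."""
    m, n = w.shape                                   # m = 2a+1, n = 2b+1 (odd sizes)
    a, b = m // 2, n // 2
    fp = np.pad(f.astype(float), ((a, a), (b, b)))   # zero padding
    g = np.zeros(f.shape)
    for x in range(f.shape[0]):
        for y in range(f.shape[1]):
            g[x, y] = np.sum(w * fp[x:x + m, y:y + n])
    return g

f = np.zeros((5, 5)); f[2, 2] = 1.0   # unit impulse image
box = np.ones((3, 3)) / 9             # 3x3 box (averaging) kernel
g = correlate(f, box)                 # impulse spreads into a 3x3 block of 1/9
```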

3.4 Fundamentals of Spatial Filtering


- Linear Spatial Filtering
One of the simplest spatial filters for smoothing - the
Averaging Filter

(Figure: box filter kernel vs. weighted-average kernel)

3.4 Fundamentals of Spatial Filtering
- Linear Spatial Filtering
One of the simplest spatial filters for smoothing - the
Averaging Filter

• original image of size
  500*500 pixels
• filtering with an averaging
  filter of increasing sizes
  – 3, 5, 9, 15 and 35

Zero padding for edges

3.4 Fundamentals of Spatial Filtering
- Spatial Filtering at the Edges
At the edges of an image we are missing pixels to
form a neighbourhood

credit of this slide: C. Nikou


3.4 Fundamentals of Spatial Filtering
- Spatial Filtering at the Edges
Zero Padding

Replicate Padding

Mirror Padding

credit of this slide: C. Nikou


3.4 Fundamentals of Spatial Filtering
- Spatial Filtering at the Edges

(Figure: zero padding vs. mirror padding vs. replicate padding)

Mirror padding is more applicable when the areas near the border
contain image details; replicate padding is useful when the areas
near the border of the image are constant.
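With NumPy, the three padding schemes correspond to `np.pad` modes (`constant`, `edge`, `symmetric`); a one-row example makes the difference visible:

```python
import numpy as np

row = np.array([[1, 2, 3]])
# Zero padding (mode='constant', the default, fills with 0):
print(np.pad(row, ((0, 0), (2, 2)))[0].tolist())                    # [0, 0, 1, 2, 3, 0, 0]
# Replicate padding (mode='edge' repeats the border value):
print(np.pad(row, ((0, 0), (2, 2)), mode='edge')[0].tolist())       # [1, 1, 1, 2, 3, 3, 3]
# Mirror padding (mode='symmetric' reflects about the border):
print(np.pad(row, ((0, 0), (2, 2)), mode='symmetric')[0].tolist())  # [2, 1, 1, 2, 3, 3, 2]
```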
