
2006-33: PROBABILITY AND IMAGE ENHANCEMENT

Maurice Aburdene, Bucknell University


MAURICE F. ABURDENE is the T. Jefferson Miers Professor of Electrical Engineering and
Professor of Computer Science at Bucknell University. He has taught at Swarthmore College, the
State University of New York at Oswego, and the University of Connecticut. His research areas
include parallel algorithms, simulation of dynamic systems, distributed algorithms, computer
communication networks, control systems, computer-assisted laboratories, and signal processing.

Thomas Goodman, Bucknell University


THOMAS J. GOODMAN earned his B.S. degree in electrical engineering from Bucknell
University and is currently pursuing a Master's degree at Bucknell, also in electrical engineering.
His research interests include discrete transforms and efficient hardware implementation of
transform algorithms and other operations used in digital signal processing. He will be graduating
from Bucknell in May 2006 and plans to begin work as a hardware design engineer shortly
thereafter. He grew up in Rochester, NY.

Page 11.1023.1

© American Society for Engineering Education, 2006


Probability and Image Enhancement
Abstract

We present one of five projects used in our course, Probability with Applications in Electrical
Engineering. The course is required for all electrical engineering students and is open to third
and fourth year students. The project focuses on the applications of probability to image
enhancement using histogram equalization and histogram specification methods. These
techniques demonstrate applications of functions of random variables, transformations of random
variables, and the generation of random variables from specified distributions. We begin by
introducing the continuous random variable transformation and demonstrating the process of
transforming any random variable distribution to a uniform distribution through the use of the
cumulative density function. We then explore the concept of histogram equalization: how it
works, its effects on image contrast, and its applications in image processing and image
enhancement. Finally, we generalize the histogram equalization problem by showing how the
cumulative density function can be used to specify an arbitrary probability distribution and to
transform the image accordingly.

Introduction

ABET evaluation criteria for electrical engineering programs state “The program must
demonstrate that graduates have: knowledge of probability and statistics, including applications
appropriate to the program name and objectives; and knowledge of mathematics through
differential and integral calculus, basic sciences, computer science, and engineering sciences
necessary to analyze and design complex electrical and electronic devices, software, and systems
containing hardware and software components, as appropriate to program objectives”
(see http://www.abet.org/criteria.html).

We present one of five projects used in our course, Probability with Applications in Electrical
Engineering. The course is required for all electrical engineering students and is open to third
and fourth year students. We introduce a way to make this topic more appealing to students. In
the latest offering, the four other projects included linear averaging [1], computer networks and
simulation [2], frequency response and least-squares estimation [1], and conditional probability and
receivers in communication systems [1].

The project focuses on the applications of probability to image enhancement using both
histogram equalization and histogram specification methods. The histogram equalization
technique directly uses the original image pixel values to compute the enhanced image’s pixel
values. Histogram equalization is widely used in medical image processing, facial recognition,
radar, and photo processing software. Image enhancement techniques demonstrate applications
of functions of random variables (transformations of random variables, derived random
variables) and the generation of random variables from specified distributions. Image processing
examples are appealing because they yield immediate visual feedback; in addition, students may
already have had experience with image editing software.


The project requires students to apply their knowledge of probability concepts, including
probability density functions (pdf), probability mass functions (pmf), and transformations of
random variables, to image processing [3-7]. In addition to gaining a deeper understanding of
probability, students discover two important points: first, that there is no general theory of image
enhancement, and second, that the quality and method of the image enhancement depend on both the
original image and the viewer [8].

In class and homework, we begin by introducing the continuous random variable transformation and
demonstrating the process of transforming any random variable distribution to a uniform
distribution through the use of the cumulative distribution function (cdf) [5-9]. We then explore the
concept of image histogram equalization: how it works, its effects on image contrast, and its
applications in image processing and image enhancement. Finally, we generalize the histogram
equalization problem by showing how the cumulative distribution function can be used to specify an
arbitrary probability distribution and to transform the image accordingly.

Functions of Continuous Random Variables

Given a random variable X, its pdf f_X(x), and a derived random variable Y = g(X), students
are asked to find the pdf and cdf of Y and compare them with the pdf and cdf of X. In addition,
we look at the mean values and standard deviations of both X and Y. Students are asked to
explore the following, both analytically and using MATLAB:

1. Let Y = X + 3.

   a. Determine the pdf of Y if X is a uniformly distributed random variable with

      f_X(x) = 1 for 0 ≤ x ≤ 1, and 0 otherwise.

      Sketch f_X(x) and f_Y(y).

   b. What is E(X), the expected value of X? What is the variance of X?

   c. What is E(Y)? What is the variance of Y?
2. Let Y = 5X + 1.

   a. Determine the pdf of Y if X is a uniformly distributed random variable with

      f_X(x) = 2x for 0 ≤ x ≤ 1, and 0 otherwise.

      Sketch f_X(x) and f_Y(y).

   b. What is E(X)? What is the variance of X?

   c. What is E(Y)? What is the variance of Y?
3. Let Y = 3X + 5.

   a. Determine the pdf of Y if X is a normally (Gaussian) distributed random variable
      with E(X) = 0 and Var(X) = 2.
   b. Sketch f_X(x) and f_Y(y).
   c. What is E(Y)?
   d. What is the probability that Y ≤ 5?

4. Let Y = X/10.

   a. Determine the pdf of Y if X is a uniformly distributed random variable with

      f_X(x) = 1/10 for 0 ≤ x ≤ 10, and 0 otherwise.

   b. Sketch f_X(x) and f_Y(y).
   c. What is E(X)? What is the variance of X?
   d. What is E(Y)? What is the variance of Y?

5. Let Y = 1 − e^(−X).

   a. Determine the pdf of Y if X is an exponentially distributed random variable with

      f_X(x) = e^(−x) for x ≥ 0, and 0 otherwise.

   b. Sketch f_X(x) and f_Y(y).

Examples 4 and 5 show that if g(X) is the cdf of X, then Y is a uniformly distributed random
variable on the range [0, 1].

Our students use MATLAB in an earlier course, Linear Systems. Students are asked to check their
analytical results by using a package such as MATLAB [11] or Mathematica [12]. We then ask the
students to use the transformation y = F_X(x) and observe the distribution of Y; as mentioned
earlier, they note that Y has a uniform distribution over the range 0 ≤ y ≤ 1. We call
this process histogram equalization. Here we ask students to look critically at the characteristics
of F_X(x). We then ask: what are the ranges of values for X and Y?
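This observation can also be checked numerically. The course projects use MATLAB; the following is an equivalent NumPy sketch, where the choice of an exponential X is our own illustrative assumption: applying a random variable's own cdf to its samples yields values that are uniform on [0, 1].

```python
import numpy as np

# Sketch: apply a random variable's own cdf to its samples.
# Here X ~ Exponential(1), so F_X(x) = 1 - exp(-x)  (illustrative choice).
rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=100_000)
y = 1.0 - np.exp(-x)  # y = F_X(x)

# A uniform [0, 1] variable has mean 1/2 and variance 1/12 (~0.0833);
# the sample mean and variance of y should be close to these values.
print(y.min() >= 0.0 and y.max() <= 1.0)  # True
```

A histogram of `y` with a modest number of bins looks flat, which is exactly the behavior students are asked to observe.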
Random number generation

Often we are asked to generate random numbers with a specific distribution. Suppose we would
like to generate random numbers with the exponential probability density function

f_X(x) = λ e^(−λx) for x ≥ 0, and 0 otherwise,

as shown in Figure 1 with λ = 1.

The cumulative distribution function for an exponential distribution is

F_X(x) = 1 − e^(−λx) for x ≥ 0, and 0 otherwise.

Since 0 ≤ F_X(x) ≤ 1, if we let Z be a uniformly distributed random variable with 0 ≤ z ≤ 1, then
P(Z ≤ z) = P(X ≤ x) and

x = F_X^(−1)(z) = −(1/λ) log(1 − z).

Figure 1. Desired probability density function
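The derivation above translates directly into a few lines of code. A NumPy sketch of inverse-transform sampling (λ = 1, matching Figure 1; the sample size is our own choice):

```python
import numpy as np

# Inverse-transform sampling: x = F^{-1}(z) = -(1/lambda) * log(1 - z),
# with Z uniformly distributed on [0, 1].
lam = 1.0
rng = np.random.default_rng(1)
z = rng.uniform(0.0, 1.0, size=100_000)   # uniform samples Z
x = -(1.0 / lam) * np.log(1.0 - z)        # exponential samples X

# For Exponential(1), the mean and standard deviation are both 1/lam = 1,
# so the sample statistics of x should be close to 1.
```

The same recipe works for any distribution whose cdf can be inverted in closed form.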

Application to images

Given a two-dimensional image {x}_jk, shown in Figure 2a, we define the random variable X as
the pixel values of the image, with the sample space of X = {x_0 = 0, x_1 = 1, x_2 = 2, ..., x_255 = 255}.
The image has a probability density function f_X(x), and a histogram of the pixel values is
shown in Figure 2c. We would like to transform the image to have the probability density
function f_Y(y) = (1/64) e^(−y/64), as shown in Figure 3.

As we have seen, the transform is performed by replacing each {x}_jk with a new value {y}_jk,
which depends on both f_X(x) and f_Y(y). The sample space of
Y = {y_0 = 0, y_1 = 1, y_2 = 2, ..., y_255 = 255}.

Assuming a continuous random variable Y, the desired cumulative distribution function is given by

F_Y(y) = ∫_0^y f_Y(ω) dω = 1 − e^(−y/64).

It follows that

F_Y^(−1)(z) = −64 log(1 − 0.989 z),

where z is treated as a uniformly distributed random number with 0 ≤ z ≤ 1. Thus the image
transformation we require is given by

{y}_jk = −64 log(1 − 0.989 F_X({x}_jk)).


The image that results from applying this operation to the sample image in Figure 2a is shown in
Figure 2b, along with the histogram of pixel values in Figure 2d.
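In code, this histogram-specification transformation reduces to a 256-entry lookup table built from the empirical cdf. A NumPy sketch follows; the image here is a synthetic low-contrast array of our own (the paper's sample image is not reproduced), and any 8-bit grayscale array would work the same way.

```python
import numpy as np

# Histogram specification: map pixel values so the result approximates
# f_Y(y) = (1/64) exp(-y/64), via {y}_jk = -64 * log(1 - 0.989 * F_X({x}_jk)).
rng = np.random.default_rng(2)
img = rng.integers(80, 180, size=(323, 250)).astype(np.uint8)  # synthetic image

# Empirical pmf and cdf of the original pixel values.
pmf = np.bincount(img.ravel(), minlength=256) / img.size
cdf = np.cumsum(pmf)

# Build the lookup table and apply it; clip because the formula can
# slightly exceed 255 when F_X reaches 1.
lut = -64.0 * np.log(1.0 - 0.989 * cdf)
out = lut[img].round().clip(0, 255).astype(np.uint8)
```

A histogram of `out` decays roughly exponentially, as in Figure 2d.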


Figure 2. (a) Sample image; (b) transformed image; (c) histogram of pixel values from sample
image; (d) histogram of pixel values from transformed image.

Figure 3. Desired pmf of the transformed image

Histogram Equalization

As mentioned earlier, histogram equalization is a transformation g of a random variable X such
that Y = g(X) has a uniform distribution f_Y(y) [6]. Given a random variable X representing the
pixel values of a grayscale image {x}_jk for 1 ≤ j ≤ N and 1 ≤ k ≤ M, we can examine the
distribution of X and perform a transformation on the pixel values to produce an enhanced
image Y = g(X) = {y}_jk. If we let f_X(x) and f_Y(y) be the pmfs of X and Y respectively, then

Σ_{i=0}^{255} f_X(x_i) = Σ_{i=0}^{255} f_Y(y_i) = 1.

If we assume that Y is uniformly distributed, then its cdf is

F_Y(y) = Σ_{i=0}^{y} f_Y(y_i) = Σ_{i=0}^{y} 1/256 = (y + 1)/256

and

y_k = 255 Σ_{i=0}^{k} f_X(x_i) = 255 F_X(x_k).

In the context of image enhancement, histogram equalization is a process by which each
grayscale pixel value x_i is replaced by 255 F_X(x_i), where F_X(x) is the cdf of the original
image pixel values x. An algorithm for histogram equalization is:

1. Find the pmf of the image by counting the number of times each grayscale value
   occurs in the image and dividing by the number of pixels.
2. Form the cdf of the image. If the pixel values of the original image range from 0 to 255
   in one-unit increments, the cdf is given explicitly by

   F_X(x) = Σ_{k=0}^{x} f_X(k).

3. Replace all pixels {x}_jk in the image by 255 F_X({x}_jk).
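The three steps translate directly into array code. A NumPy sketch (the course itself uses MATLAB; the low-contrast test image below is a synthetic stand-in of our own):

```python
import numpy as np

# Histogram equalization following the three steps above.
rng = np.random.default_rng(3)
img = rng.integers(60, 160, size=(323, 250)).astype(np.uint8)  # low-contrast image

# Step 1: pmf -- count each grayscale value, divide by the pixel count.
pmf = np.bincount(img.ravel(), minlength=256) / img.size

# Step 2: cdf -- running sum of the pmf, F_X(x) = sum_{k=0..x} f_X(k).
cdf = np.cumsum(pmf)

# Step 3: replace every pixel {x}_jk by 255 * F_X({x}_jk), via a lookup table.
equalized = (255.0 * cdf)[img].round().astype(np.uint8)
```

Because the lookup table is monotone, pixel ordering is preserved while the values are spread across the full 0-255 range.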

Usually, histogram equalization transforms the distribution of the pixels in an image to increase
overall contrast. In this project, we ask students to look at the pmfs of both X and Y and note that
the values the two functions take on are identical, but are distributed differently. Note that
the range of the {y}_jk is spread out to enhance image contrast. Moreover, if we have a large image
and plot a histogram of the {y}_jk values with a small number of bins, it resembles a
uniform distribution.

When using this project, the teacher should be careful when extending the transformation of a
random variable from the continuous case to the discrete case. We recognize that applying
continuous histogram equalization methods to image processing might be challenging as an
application of probability density transformation at the undergraduate level. In fact, students
discover that image histogram equalization does not lead to uniform distributions, due to the
discrete nature of the problem.
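The discrete caveat is easy to demonstrate: equalization can relocate grayscale levels but cannot split them, so an image with only a few distinct values keeps exactly those per-level counts. A sketch with a synthetic four-level image (an assumption purely for illustration):

```python
import numpy as np

# An image with only four grayscale levels, one of them dominant.
rng = np.random.default_rng(5)
img = rng.choice([100, 101, 102, 103], p=[0.7, 0.1, 0.1, 0.1],
                 size=(100, 100)).astype(np.uint8)

# Equalize via 255 * F_X (same algorithm as in the text).
cdf = np.cumsum(np.bincount(img.ravel(), minlength=256) / img.size)
eq = (255.0 * cdf)[img].round().astype(np.uint8)

# The result still has only four distinct values -- they are spread apart,
# but the count at each value is unchanged, so the fine-grained histogram
# is far from uniform.
```

This is exactly what students observe on real images: the equalized histogram is flat only when viewed with coarse bins.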

Image Enhancement

Here we note that uniformly distributed pixel values can greatly enhance image quality. Let us
demonstrate by performing histogram equalization on the 323×250 image shown in Figure 4a.
The image has low contrast and appears “washed out,” and a histogram of its grayscale pixel
values, shown in Figure 4c, confirms this observation. The distribution has a mean of 109 and a
variance of 622; clearly it is not a uniform distribution.

We can take a cumulative sum of the pmf of X to obtain the cdf of the image. The
accumulation can be done recursively, i.e.

F_X(0) = f_X(0)
F_X(n) = F_X(n − 1) + f_X(n),  1 ≤ n ≤ 255,

where f_X(n) is the pmf evaluated at n. Figure 5 shows the cdf for this image.

Next we will perform histogram equalization, replacing each pixel {x}_jk in the image by
255 F_X({x}_jk). The resulting image is depicted in Figure 4b, and a histogram of its pixel values
is shown in Figure 4d. This distribution more closely resembles a uniform one (and, based on the
theoretical framework of the technique, it is as close to a uniform distribution as one can get).

The mean of the new distribution is 129 and the variance is 5340. The mean has moved towards
the theoretical mean of a uniform distribution of values between 0 and 255. The variance has
also increased, which we expect due to the spread of pixel values.

How do the original and equalized images compare? Figure 4 shows a side-by-side comparison.
The equalized image, shown in 4b, appears brighter and has better contrast. From a visual
standpoint, image quality has improved significantly.


Figure 4. (a) Original image; (b) equalized image; (c) histogram of pixel values from original
image; (d) histogram of pixel values from equalized image.
Figure 5. Cumulative distribution function (cdf) of the image in Fig. 4a

Once we notice that applying histogram equalization to this image has greatly improved overall
quality, we may be tempted to apply it a second time. However, image quality will not improve
again, because the distribution is already uniform. In other words, the cdf of the
equalized image should be the function F_Y(y) = (y + 1)/256. We can verify this by computing
the cdf of the equalized image, shown in Figure 6, which closely approximates
F_Y(y) = (y + 1)/256, as predicted. As a result, we predict that performing histogram
equalization on this new image will have no effect, but we will perform the experiment to
confirm. Figure 7 shows the resulting histogram, which is identical to the one we obtained for
the first equalized image, Y.

Figure 6. Cumulative distribution function (cdf) of the equalized image.
Figure 7. Histogram of the image Z, the result of applying histogram equalization to X twice.
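This idempotence can also be verified numerically. A sketch on a synthetic image (the helper name `equalize` is ours, not from the paper):

```python
import numpy as np

def equalize(a):
    """Histogram-equalize an 8-bit grayscale array via 255 * F_X."""
    cdf = np.cumsum(np.bincount(a.ravel(), minlength=256) / a.size)
    return (255.0 * cdf)[a].round().astype(np.uint8)

# Synthetic low-contrast test image (assumption: the paper's image
# is not available here).
rng = np.random.default_rng(4)
img = rng.integers(60, 160, size=(323, 250)).astype(np.uint8)

once = equalize(img)
twice = equalize(once)
# As the text predicts, the second pass leaves the image essentially
# unchanged (identical up to rounding), since the cdf is already
# approximately (y + 1)/256.
```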

Project Assessment

This project has been used for three years. Table I shows the project report grades for the three
years. Student interest is demonstrated by the range of grades.

Table I: Project report grades

Year    Average    Standard Deviation    High    Low
2003    8.8        0.8                   10      7
2004    8.86       0.81                  10      7
2005    9.92       0.28                  10      9

After the project reports are submitted and returned, the students are examined on the
fundamental concepts as well as on their ability to perform histogram equalization on a small
image. Table II shows the project exam grades.

Table II: Project exam grades

Year    Possible Points    Average    Standard Deviation    High    Low
2003    40                 27.6       7.06                  40      14
2004    50                 39.5       9.22                  50      13
2005    50                 43.17      9.17                  50      20

The course evaluation form asks the question, “What aspects of the course did you like most?”
Some student comments that relate to this project include:

• Projects and HW were from real life examples.
• Relating it to electrical concepts.
• The project format and the sometimes unguided approach to problem solving.
• Image enhancement.
• Projects and final project.
• The projects were good, varied and interesting with applicable (to EE) topics.
• I liked the projects because they were very diverse and interesting.
• The material is linked to real life application. Learning about image equalization,
network, queuing was interesting.
• I liked the Matlab projects. They really helped gain an understanding of the material.
• I think how the course related everyday technology to the probability we learned about.
• I liked the image processing project.
• It is related to real world problems. Matlab made it more hands on and easy to
understand.
• The projects because they gave me a workable physical example of problems that I could
understand and be interested in.
• I enjoyed the material on image processing and decisions rules.
• I liked the projects, especially the term project because we can pick a topic that is
interesting to us.
• Simulating everything in Matlab. Doing hands on work in class.
• The projects helped me apply the concepts.
• Application of probability to real life.
• Projects were generally interesting to do.
• Applying Matlab really helped to enforce principles taught in class.

Summary

We presented one project that we use to demonstrate the applications of probability in electrical
engineering. The application of both histogram specification and histogram equalization to
images appeals to undergraduates. Students realize that histogram equalization is a powerful tool
for enhancing image quality.

Acknowledgments

The authors would like to acknowledge the help of Professor Richard Kozick.

References

[1] Maurice F. Aburdene and Richard J. Kozick, “A project-oriented course in probability and
statistics for undergraduate electrical engineering students,” Proceedings of the Frontiers in
Education Conference, Vol. 2, 1997, pp. 598-603.

[2] Maurice F. Aburdene and Thomas J. Goodman, “Probability, Computer Networks, and
Simulation,” Proceedings of the 2005 American Society for Engineering Education Annual
Conference & Exposition, Portland, Oregon, June 12-15, 2005, Session 1432.

[3] Roy D. Yates and David J. Goodman, Probability and Stochastic Processes: A Friendly
Introduction for Electrical & Computer Engineers, John Wiley & Sons, 2005. Chapter 3,
section 3.7.

[4] Rodger E. Ziemer, Elements of Engineering Probability and Statistics, Prentice-Hall, 1997.
Chapter 3, section 3-4.

[5] Roy D. Yates and David J. Goodman, Probability and Stochastic Processes: A Friendly
Introduction for Electrical & Computer Engineers, John Wiley & Sons, 2005. Chapter 3,
section 3.7.

[6] Rodger E. Ziemer, Elements of Engineering Probability and Statistics, Prentice-Hall, 1997.
Chapter 3, section 3-4.

[7] T.T. Soong, Fundamentals of Probability and Statistics for Engineers, John Wiley & Sons,
2004. Chapter 5, pp. 119-134.

[8] Charles W. Therrien and Murali Tummala, Probability for Electrical and Computer
Engineers, CRC Press, 2004.

[9] A. H. Haddad, Probabilistic Systems and Random Signals, Pearson Education, Inc., 2006.

[10] Rafael C. Gonzalez and Richard E. Woods, Digital Image Processing, Addison-Wesley,
1992.

[11] Duane C. Hanselman and Bruce L. Littlefield, Mastering MATLAB 7, Prentice Hall, 2004.

[12] Stephen Wolfram, The Mathematica Book, Fifth Edition, Wolfram Media, 2003.
