
What is Digital Image Processing (DIP)?

Digital Image Processing (DIP) is the manipulation of digital images by means of a computer system. It is used to enhance images and to extract important information from them.

It is also used in the conversion of signals from an image sensor into digital images.

o Digital Image Processing provides a platform to perform various operations such as image enhancement and the processing of analog and digital signals, image signals, voice signals, etc.

Applications of Digital Image Processing:

o Image sharpening and restoration: The common applications of image sharpening and restoration are zooming, blurring, sharpening, grayscale conversion, edge detection, image recognition, image retrieval, etc.
o Medical field: The common applications in the medical field are gamma-ray imaging, PET scans, X-ray imaging, medical CT, UV imaging, etc.
o Remote sensing: The earth is scanned by satellites, and the resulting images are analysed to monitor activity on the ground and in space.
o Machine/Robot vision: It provides vision to robots so that they can see things, identify them, etc.

Describe Components of Image Processing System


Digital image processing is the processing of an image by means of a digital computer.

An image processing system consists of the following components:


 Image Sensors:
Image sensors sense the intensity, amplitude, coordinates and other features of the image and pass the result to the image processing hardware. The problem domain is also part of this stage.
 Image Processing Hardware:
Image processing hardware is the dedicated hardware used to process the data obtained from the image sensors. It passes the result to a general-purpose computer.
 Computer:
The computer used in the image processing system is a general-purpose computer of the kind we use in daily life.
 Image Processing Software:
Image processing software is the software that includes all the mechanisms and algorithms used in the image processing system.
 Mass Storage:
Mass storage stores the pixels of the images during processing.
 Hard Copy Device:
Once the image has been processed, it is stored on a hard-copy device, which can be a pen drive or any external ROM device.
 Image Display:
This includes the monitor or display screen that displays the processed images.
 Network:
The network is the connection between all of the above elements of the image processing system.

Describe fundamental steps in digital image processing:

 1. Image Acquisition:
 This is the first of the fundamental steps of digital image processing. Image acquisition could be as simple as being given an image that is already in digital form. Generally, the image acquisition stage involves pre-processing, such as scaling.

 2. Image Enhancement:
 Image enhancement is among the simplest and most appealing areas of digital image processing. Basically, the idea behind enhancement techniques is to bring out detail that is obscured, or simply to highlight certain features of interest in an image, such as by changing brightness and contrast.

 3. Image Restoration:
 Image restoration is an area that also deals with improving the
appearance of an image. However, unlike enhancement, which
is subjective, image restoration is objective, in the sense that
restoration techniques tend to be based on mathematical or
probabilistic models of image degradation.

 4. Color Image Processing:

 Color image processing is an area that has been gaining importance because of the significant increase in the use of digital images over the Internet. This may include color modeling and processing in a digital domain, etc.

 5. Wavelets and Multi-Resolution Processing:

 Wavelets are the foundation for representing images in various degrees of resolution. Images are subdivided successively into smaller regions for data compression and for pyramidal representation.

 6. Compression:
 Compression deals with techniques for reducing the storage required to save an image or the bandwidth required to transmit it. Compression is particularly necessary for data transmitted over the internet.
7. Morphological Processing:
Morphological processing deals with tools for extracting image
components that are useful in the representation and description of
shape.

8. Segmentation:
Segmentation procedures partition an image into its constituent parts
or objects. In general, autonomous segmentation is one of the most
difficult tasks in digital image processing. A rugged segmentation
procedure brings the process a long way toward successful solution of
imaging problems that require objects to be identified individually.

9. Representation and Description:


Representation and description almost always follow the output of a
segmentation stage, which usually is raw pixel data, constituting either
the boundary of a region or all the points in the region itself. Choosing
a representation is only part of the solution for transforming raw data
into a form suitable for subsequent computer processing. Description
deals with extracting attributes that result in some quantitative
information of interest or are basic for differentiating one class of
objects from another.

10. Object recognition:


Recognition is the process that assigns a label, such as, “vehicle” to
an object based on its descriptors.

11. Knowledge Base:


Knowledge may be as simple as detailing regions of an image where
the information of interest is known to be located, thus limiting the
search that has to be conducted in seeking that information. The
knowledge base also can be quite complex, such as an interrelated
list of all major possible defects in a materials inspection problem or
an image database containing high-resolution satellite images of a
region in connection with change-detection applications.

How to store digital images in a computer? Or how to convert an analog image into a digital image?
An image function f(x,y) must be digitized both spatially and in amplitude in order to
become suitable for digital processing. Typically, a frame grabber or digitizer is used to
sample and quantize the analogue video signal. Therefore, in order to create an image
which is digital, we need to convert continuous data into digital form. This conversion
from analog to digital involves two processes:

 Sampling (digitization of coordinate values).


 Quantization (digitization of amplitude values).

To convert an analog signal into a digital signal, both of its axes (x and y) are converted into digital format.

Sampling
In digital image processing, sampling is the reduction of a continuous-time signal to a
discrete-time signal. Sampling can be done for functions varying in space, time or any
other dimension and similar results are obtained in two or more dimensions. Sampling
takes two forms: Spatial and temporal. Spatial sampling is essentially the choice of 2D
resolution of an image whereas temporal sampling is the adjustment of the exposure
time of the CCD. Sampling is done along the x-axis, whereby an infinite set of values is converted into a finite set of digital values.
What You Need To Know About Sampling
1. Sampling is the reduction of a continuous-time signal to a discrete-time signal.
2. In sampling, the values on the y-axis, usually amplitude, are continuous but
the time or x-axis is discretized.
3. Sampling is done prior to the quantization process.
4. The sampling rate determines the spatial resolution of the digitized image.

Quantization
Quantization is the process of mapping input values from a large set to output values in
a smaller set, often with a finite number of elements. Quantization is the counterpart of sampling: it is done on the y-axis. When you quantize an image, you are actually dividing the signal into quanta (partitions). The x-axis of the signal carries the coordinate values and the y-axis the amplitudes; digitizing the amplitudes is what is referred to as quantization.
What You Need To Know About Quantization
1. The transition between continuous values of the image function and its digital
equivalent is referred to as quantization.
2. Quantization makes a sampled signal truly digital and ready for processing by
a computer.
3. In quantization, time or x-axis is continuous and the y-axis or amplitude is
discretized.
4. Quantization is done after the sampling process.
5. The quantization level determines the number of grey levels in the digitized
image.
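
To make the two steps concrete, here is a minimal Python sketch (NumPy assumed; the sine signal, the 64 sample positions and the 8-bit depth are purely illustrative choices): sampling keeps the amplitude only at discrete x-axis positions, and quantization then rounds each kept amplitude onto a finite set of y-axis levels.

import numpy as np

# A densely evaluated 1-D "analog" signal with amplitudes in [0, 1].
t = np.linspace(0, 1, 10000)
analog = 0.5 + 0.5 * np.sin(2 * np.pi * 5 * t)

# Sampling: keep the amplitude only at discrete positions on the x-axis.
num_samples = 64
positions = np.linspace(0, len(t) - 1, num_samples).astype(int)
sampled = analog[positions]            # still continuous in amplitude

# Quantization: map each amplitude onto a finite set of levels (y-axis).
levels = 256                           # 8 bpp gives 256 grey levels
quantized = np.round(sampled * (levels - 1)).astype(np.uint8)

print(sampled[:5])    # real-valued amplitudes
print(quantized[:5])  # integers in 0..255, ready for digital processing

The number of kept positions corresponds to the sampling rate (and thus the spatial resolution), while the number of levels corresponds to the quantization level (and thus the number of grey levels).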

Difference Between Sampling And Quantization

Basis of comparison: X and Y axis
Sampling: The values on the y-axis, usually amplitude, are continuous, but the time or x-axis is discretized.
Quantization: Time or x-axis is continuous, and the y-axis or amplitude is discretized.

Basis of comparison: When it is done
Sampling: Sampling is done prior to the quantization process.
Quantization: Quantization is done after the sampling process.

Basis of comparison: Resolution
Sampling: The sampling rate determines the spatial resolution of the digitized image.
Quantization: The quantization level determines the number of grey levels in the digitized image.

Basis of comparison: Effect on a continuous curve
Sampling: Sampling reduces a continuous curve (time-amplitude graph) to a series of "tent poles" over time.
Quantization: Quantization reduces a continuous curve to a series of "stair steps" that exist at regular time intervals.

Basis of comparison: Values representing the time intervals
Sampling: In the sampling process, a single amplitude value is selected from the different values of the time interval to represent it.
Quantization: In the quantization process, the values representing the time intervals are rounded off to create a defined set of possible amplitude values.
What is the Histogram of an image?
The histogram of an image, like other histograms, shows frequency; an image histogram shows the frequency of pixel intensity values. In an image histogram, the x-axis shows the gray level intensities and the y-axis shows the frequency of these intensities.
For example

The histogram of a picture of Einstein would be something like this:

The x-axis of the histogram shows the range of pixel values. Since it is an 8 bpp image, it has 256 levels (shades) of gray, which is why the x-axis ranges from 0 to 255, with tick marks every 50. The y-axis shows the count of these intensities.
As you can see from the graph, most of the bars with high frequency lie in the first half, which is the darker portion. That means the image we have is dark.
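
As a small sketch of how such a histogram is computed (Python with NumPy assumed; the synthetic random image and the function name image_histogram are only illustrative):

import numpy as np

def image_histogram(img, levels=256):
    # Count how many pixels take each gray level 0 .. levels-1.
    hist = np.zeros(levels, dtype=np.int64)
    for value in img.ravel():
        hist[value] += 1
    return hist

# Example with a synthetic 8 bpp image (values 0..255).
img = np.random.randint(0, 256, size=(100, 100), dtype=np.uint8)
hist = image_histogram(img)
print(hist.sum())     # equals the number of pixels, here 10000
print(hist.argmax())  # the most frequent gray level

In practice the same counts can be obtained with np.bincount(img.ravel(), minlength=256). A histogram whose tall bars sit at low gray levels indicates a dark image, as in the Einstein example.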

Applications of Histograms
1. In digital image processing, histograms are used for simple calculations in software.
2. They are used to analyze an image: properties of an image can be predicted by a detailed study of its histogram.
3. The brightness of the image can be adjusted using the details of its histogram (see the sketch after this list).
4. The contrast of the image can be adjusted according to need using the details of the x-axis of its histogram.
5. Histograms are used for image equalization: gray level intensities are expanded along the x-axis to produce a high-contrast image.
6. Histograms are used in thresholding, since the histogram helps in choosing a suitable threshold value.
7. If we have the input and output histograms of an image, we can determine which type of transformation was applied by the algorithm.
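
As an illustration of point 3, here is a hedged Python sketch (NumPy assumed; the function name adjust_brightness_to_target and the target mean of 128 are illustrative choices): it reads the mean gray level off the histogram and shifts all pixels so the image reaches that target brightness.

import numpy as np

def adjust_brightness_to_target(img, target_mean=128.0):
    # Mean gray level computed from the histogram of an 8 bpp image.
    hist = np.bincount(img.ravel(), minlength=256)
    levels = np.arange(256)
    current_mean = (hist * levels).sum() / hist.sum()
    shift = int(round(target_mean - current_mean))
    # Shift every pixel, clipping so values stay in the valid 0..255 range.
    return np.clip(img.astype(np.int16) + shift, 0, 255).astype(np.uint8)

# Example: a dark image (mean around 60) brightened toward mid-gray.
dark = np.random.randint(0, 121, size=(50, 50), dtype=np.uint8)
bright = adjust_brightness_to_target(dark)
print(dark.mean(), bright.mean())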

Contrast
Contrast can be defined as the difference between the maximum and minimum pixel intensity in an image.
There are two methods of enhancing contrast.
The first is called histogram stretching, which increases contrast.
The second is called histogram equalization, which enhances contrast and is discussed further below.

Histogram Stretching
In histogram stretching, the contrast of an image is increased. The contrast of an image is defined by the difference between the maximum and minimum pixel intensity values.

If we want to increase the contrast of an image, its histogram is stretched so that it covers the full dynamic range.

From the histogram of an image, we can check whether the image has low or high contrast.
Example of histogram stretching:

Consider this image.

The histogram of this image is shown below.

Now we calculate the contrast of this image:

Contrast = 225.
Now we will increase the contrast of the image.

Increasing the contrast of the image


The formula for stretching the histogram of an image to increase its contrast is

g(x,y) = ((f(x,y) - min) / (max - min)) x (L - 1)

The formula requires finding the minimum and maximum pixel intensities and multiplying by the number of gray levels L minus 1. In our case the image is 8 bpp, so the number of gray levels is 256 and L - 1 = 255.
The minimum value is 0 and the maximum value is 225, so the formula in our case becomes

g(x,y) = ((f(x,y) - 0) / (225 - 0)) x 255

where f(x,y) denotes the intensity value of each pixel. We apply this formula to every f(x,y) in the image.
After doing this, we will be able to enhance the contrast.
The following image appears after applying histogram stretching.

The stretched histogram of this image is shown below.

Note the shape and symmetry of the histogram: it is now stretched, in other words expanded.
In this case the contrast of the image can be calculated as
Contrast = 240
Hence we can say that the contrast of the image is increased.
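
A minimal Python sketch of this stretching formula (NumPy assumed; the function name stretch_histogram is illustrative, and the random test image simply has its intensities drawn from 0..225 as in the example above):

import numpy as np

def stretch_histogram(img, levels=256):
    # Linearly map [min, max] pixel intensities onto [0, levels-1].
    f_min = int(img.min())
    f_max = int(img.max())
    if f_max == f_min:               # a flat image cannot be stretched
        return img.copy()
    g = (img.astype(np.float64) - f_min) / (f_max - f_min) * (levels - 1)
    return np.round(g).astype(np.uint8)

# Example: an 8 bpp image whose intensities are drawn from 0..225.
img = np.random.randint(0, 226, size=(64, 64), dtype=np.uint8)
out = stretch_histogram(img)
print(int(img.max()) - int(img.min()))  # contrast before stretching
print(int(out.max()) - int(out.min()))  # contrast after stretching (full range)

Each output pixel is exactly g(x,y) = ((f(x,y) - min) / (max - min)) x (L - 1), so after stretching the histogram covers the whole dynamic range.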

What is Histogram Equalization?


Histogram equalization is used for equalizing all the pixel values of an image. The transformation is done in such a way that a uniform, flattened histogram is produced.

Histogram equalization increases the dynamic range of pixel values and aims for an approximately equal count of pixels at each level, which produces a flat histogram and a high-contrast image.

When stretching a histogram, its shape remains the same, whereas in histogram equalization the shape of the histogram changes, and it generates only one image.

Histogram equalization is used to enhance contrast, but the contrast is not necessarily always increased: in some cases histogram equalization can make the image worse, and in those cases the contrast is decreased.
Example:
Let's start histogram equalization by taking the simple image below.
Image

Histogram of this image

Now we will perform histogram equalization on it.
PMF
First we have to calculate the PMF (probability mass function) of all the pixels in this image. (If you do not know how to calculate a PMF, refer to a tutorial on PMF calculation.)
CDF
Our next step involves calculating the CDF (cumulative distribution function). (Again, if you do not know how to calculate a CDF, refer to a tutorial on CDF calculation.)
Calculate CDF according to gray levels
For instance, suppose that the CDF calculated in the second step looks like this.

Gray Level Value    CDF
0                   0.11
1                   0.22
2                   0.55
3                   0.66
4                   0.77
5                   0.88
6                   0.99
7                   1.00

Then, in this step, we multiply each CDF value by (gray levels - 1). Since we have a 3 bpp image, the number of levels is 8, and 8 minus 1 is 7. So we multiply each CDF value by 7 and round the result down to the nearest integer. Here is what we get after multiplying.

Gray Level Value    CDF     CDF * (Levels-1)
0                   0.11    0
1                   0.22    1
2                   0.55    3
3                   0.66    4
4                   0.77    5
5                   0.88    6
6                   0.99    6
7                   1.00    7

Now we come to the last step, in which we map the new gray level values to the number of pixels.
Let's assume our old gray level values have the following numbers of pixels.

Gray Level Value    Frequency
0                   2
1                   4
2                   6
3                   8
4                   10
5                   12
6                   14
7                   16

Now, if we map the old gray level values to the new ones, this is what we get.

Gray Level Value    New Gray Level Value    Frequency
0                   0                       2
1                   1                       4
2                   3                       6
3                   4                       8
4                   5                       10
5                   6                       12
6                   6                       14
7                   7                       16

Now map these new values onto the histogram, and you are done.
Let's apply this technique to our original image. After applying it, we get the following image and its histogram.
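
Here is a minimal Python sketch of the mapping steps just worked through (NumPy assumed; the CDF values and frequencies are the ones assumed in the tables above, and 3 bpp gives 8 levels):

import numpy as np

# CDF values assumed in the example, one per gray level 0..7 (3 bpp image).
cdf = np.array([0.11, 0.22, 0.55, 0.66, 0.77, 0.88, 0.99, 1.00])
levels = 8

# Multiply by (levels - 1) and round down: the CDF * (Levels-1) column.
new_levels = np.floor(cdf * (levels - 1)).astype(int)
print(new_levels)            # [0 1 3 4 5 6 6 7]

# Old frequencies per gray level, as assumed above.
freq = np.array([2, 4, 6, 8, 10, 12, 14, 16])

# Remap: each old gray level keeps its pixel count but moves to its new level.
equalized_freq = np.zeros(levels, dtype=int)
for old_level, new_level in enumerate(new_levels):
    equalized_freq[new_level] += freq[old_level]
print(equalized_freq)        # [ 2  4  0  6  8 10 26 16]; old levels 5 and 6 merge into 6

In a full implementation the CDF would itself be computed from the image's own histogram (hist / hist.sum() followed by a cumulative sum) rather than assumed, but the multiply-by-(L-1), floor, and remap steps are exactly the ones shown here.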
Histogram-equalized image and its histogram (compare with the original image and histogram above).

Conclusion
As you can clearly see from the images, the contrast of the new image has been enhanced and its histogram has also been equalized. One important thing to note here is that during histogram equalization the overall shape of the histogram changes, whereas in histogram stretching the overall shape of the histogram remains the same.
