
UNIT-I: IMAGE PROCESSING

INTRODUCTION:
1. Introduction to image processing
2. Fundamental steps in digital image processing
3. Components of an image processing system
4. Image sensing and acquisition
5. Image sampling and quantization
6. Some basic relationships between pixels
7. Introduction to the mathematical tools used in digital image processing

IMAGE TRANSFORMS:
1. Need for image transforms
2. Discrete Fourier transform (DFT) of one variable
3. Extension to functions of two variables
4. Some properties of the 2-D discrete Fourier transform

INTRODUCTION TO DIGITAL IMAGE PROCESSING:

An image is defined as a two-dimensional function f(x, y), where x and y are spatial coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the image at that point. When x, y, and the intensity values of f are all finite, discrete quantities, we call the image a digital image. A digital image is composed of a finite number of elements, each of which has a particular location and value. These elements are called picture elements, image elements, pixels, or pels.
Digital image processing refers to the processing of images that are digital in nature by means of a digital computer.
 Processing represents a series of actions that lead to some result.
 Images means pictures or frames of video.
 Digital describes electronic technology that stores and processes data in two states (positive and non-positive), and therefore requires a digital computer.
Necessity of IMAGE PROCESSING
It improves the pictorial information for human perception, i.e., it enhances the quality of an image.
Applications:
 Content enhancement
o Noise filtering
o Contrast enhancement
o De-blurring
 Remote sensing: aerial images taken from satellites
 Autonomous machine applications in industry, especially quality control, assembly, automation, etc.
 Efficient storage and transmission

Applications of Digital Image Processing


 Acoustics
 Geological explorations: mineral and oil
 Industry
 Medicine: CT scan, X-ray imaging, ultrasound imaging, MRI, etc.
 Office automation:
o Optical character recognition (e.g., in banks)
o Document processing
o Cursive script recognition
o Logo and icon recognition
o Identification of the address area on envelopes to sort mail in post offices
 Industrial automation:
o Automatic inspection systems: non-destructive testing
o Automatic assembly: processes related to VLSI manufacturing
o Robotics; oil and natural gas exploration: processes related to PCB checking
o Seismography: process-control applications
 Bio-medical applications:
o Electrocardiography (ECG): heart
o Electroencephalography (EEG): electrical activity of the brain
o Electromyography (EMG): skeletal muscles
o Computed axial tomography (CAT): details of organs, bones, and tissues (X-rays)
o Magnetic resonance imaging (MRI): nervous system and soft tissues
o Positron emission tomography (PET): cancer
o Cytology: study of microscopic cells
o Histology: anatomical study of the microscopic structure of animal and plant tissues
o Stereology: determination of the three-dimensional structure of objects based on two-dimensional views of them
o Automated radiology
o Pathology
o X-ray image analysis
o Mammograms
o Cancer smears
o Screening of plant samples
o 3-D reconstruction and analysis
o Single-photon emission computed tomography (SPECT): working of organs
o Mass screening of medical images for detection of various diseases
 Meteorology:
o Short-term weather forecasting
o Long-term climatic change detection from satellite remote-sensing data
o Cloud pattern analysis
 Information technology:
o Facsimile image transmission
o Videotex
o Video conferencing
o Video phones, etc.
 Entertainment and consumer electronics:
o HDTV
o Multimedia
o Video editing
 Printing and graphic arts:
o Colour fidelity in desktop publishing
o Art conservation and distribution
 Military applications:
o Missile guidance and detection
o Target identification
o Navigation of pilotless vehicles
o Reconnaissance (exploration or investigation)
o Range finding, etc.

Types of energy sources that can be used to capture images


1. Gamma-Ray Imaging
 Nuclear medicine: bone scans and PET scans
 Astronomical observations: the Cygnus Loop
2. X-Ray Imaging
 Medical diagnostics, industry, astronomy
 Chest X-rays
 Digital radiography: uses X-ray sensors to obtain images; highly efficient
 Angiography: imaging of blood vessels
 Computerized axial tomography (CAT scan)
 Industry: inspection of electronic circuit boards
 Astronomical observations: the Cygnus Loop

3. Imaging in the Ultraviolet Band


 Lithography
 Industrial inspection
 Lasers
 Biological imaging
 Astronomical observations
 Microscopy
 Fluorescence microscopy

4. Imaging in the Infrared Band


 Light microscopy
 Astronomy
 Remote sensing
 Industry
 Law enforcement

5. Imaging in the Visible Band

 Thumbprints: enhancement of ridge features and searching a database for potential matches
 Paper currency: automated counting; law-enforcement tracking and identification of bills based on serial numbers
 Traffic monitoring: number-plate identification

6. Imaging in the Microwave Band - Radar imaging


7. Imaging in the Radio Band
 Medicine
o Magnetic resonance imaging (MRI)
o Electrocardiography (ECG)
o Electroencephalography (EEG)
o Electromyography (EMG)
o Computer-aided imaging
 Astronomy

8. Examples in which other imaging modalities are used


a) Acoustic
b) Ultrasonic
c) Electron beams (electron microscopy)

Fundamental steps in digital image processing.


The figure represents the steps involved in processing a digital image, starting with image acquisition and ending with making sense of an image.
Problem Domain: This represents the image and its type that has to be acquired. Ex: black-and-white, colour, medical, satellite images.
Image Acquisition: The acquisition of an image is a simple procedure. This may be the capturing of images using a digital camera or a scanner. It involves some pre-processing techniques such as scaling and conversion from grayscale to RGB or RGB to grayscale.
Image Enhancement: This process enhances the quality of an image for human perception only and hence is called a subjective process. It filters noise and increases the content as well as the contrast of an image using simple point-processing and masking techniques. The techniques employed for enhancement are not the same for all images and vary based on the type of image. Image enhancement can be performed in either the spatial or the frequency domain. Both domains involve techniques for either smoothing or sharpening the image.
Image Restoration: This step appears to be similar to enhancement but differs in a few ways. Enhancement of an image is a purely subjective process, whereas restoration is an objective process that uses mathematical and probabilistic models to estimate the amount of degradation in an image and to remove it.
Colour Image Processing: Colour is used as a potent descriptor. It helps in the identification of objects and in the modelling and processing of colour images. Though there are thousands of colours in nature, only a few are chosen for analysis and description. This process also covers converting black-and-white images to colour and colour images to black-and-white.
Wavelets and Multiresolution Processing: Wavelets provide the basis for representing an image at multiple resolutions. They represent and analyze signals or images at one or more resolutions.

Compression: This step is mainly used to save the storage space required for text, images, and videos, and to reduce the bandwidth required to transmit images. Compression uses an encoder to compress images or videos and a decoder to recreate the original images or videos. Compression techniques are of two types: lossy and lossless.
Morphological Processing: In this step, we learn about various components and tools that are used to represent and describe an image. The inputs to this step are images and the outputs obtained are features of the image.
Segmentation: In this step, an image is split into small regions or objects. The region of interest can be extracted from an image using this step. The inputs at this step are images, while the outputs are attributes of the segmented regions. Two types of segmentation techniques are used, namely discontinuity-based and similarity-based segmentation. These techniques help in detecting points, lines, and edges, linking edges, thresholding, and region splitting, merging, and growing.
Representation and Description: A segmented portion of an image can best be described using either boundary or region information. Boundary description centres on the representation of shape, whereas regional description focuses on colour or texture. In either case, the chosen representation should be insensitive to rotation, scale, and translation.
Object Recognition: This step makes sense of the boundary and regional descriptors and assigns meaning (a label) to the objects in the image.
Knowledge Base: This encodes prior knowledge about the problem domain and guides the interaction between the other processing blocks.
Image Processing to Computer Vision

Low Level Processing: At this level both the inputs and outputs are images. This level involves pre-
processing steps such as noise removal, content and contrast enhancement using enhancement techniques in
either spatial or frequency domain or both.
Mid-level Processing: At this level, the inputs are images and the outputs are features of images. This level involves operations such as segmentation of objects in an image and classification of those objects, followed by representation and description.
High Level Processing: This level shows a transition from Image Processing to Computer vision. Here the
inputs are features and output is making sense of an image. The role of computer vision is to make sense of
the image from the features. This level presents functions associated with computer vision.

Components used in Digital Image Processing:

Image sensors: This block uses digital cameras, scanners, optical sensors, X-ray scanners, MRI, radar, ultrasonic sensors, photodiodes, and charge-coupled devices to capture the image.
Digitizer: Sampling and quantization are performed using specialized image processing hardware to digitize the image and remove noise. These operations must be performed at high speed.
Networking: A high-speed internet facility is required; transmission bandwidth is limited, and hence compression techniques are used to transmit large data or images.
Computer: Anything from a general-purpose PC to a supercomputer can serve the purpose. Parallel processors are used for high speed.
Image processing software: All steps in image processing require software to process the image.
Mass storage: It is of different types:
– Short-term storage: for processing of images that are used repeatedly
– Online storage: for fast retrieval
– Archival (mass) storage: secondary storage with slow access
Image displays: TVs, monitors, CRTs
Hard copy: line printers, dot-matrix printers, laser printers, transparencies

Structure of Human Eye:


Image Formation in the Eye

IMAGE SENSING AND ACQUISITION

Images are generated by a combination of "illumination" from a source and the reflection or absorption of energy from that source by the elements of the scene being imaged. Illumination sources include radar, infrared, ultrasound, and X-ray energy.
Reflected image: light reflected from a surface, e.g., an ordinary photograph.
Transmitted or absorbed image: e.g., X-rays passing through a patient.
• Single imaging sensors, Line sensors & Array sensors
Image Sensors:
 Image sensors convert optical images into digital signals.
 Types: CCD (high-quality, low noise) and CMOS (cost-effective, faster processing).
 CCDs offer better image quality, while CMOS sensors are widely used in consumer devices.
 The sensor type affects image clarity, noise, and speed.
 Relation: Sensor type directly impacts the quality and cost of image acquisition.

Image Acquisition Process:


 Involves capturing light through a lens and converting it into an electrical signal.
 Analog signals are converted into digital data for processing.
 A camera's resolution and exposure settings affect the final image quality.
 Factors like light conditions and sensor type influence the outcome.
 Relation: Image acquisition quality is determined by the sensor and camera settings.

Lens and Sensor Role:


 The lens focuses light on the sensor to form an image.
 Sensor captures light and converts it into electrical signals.
 The quality of the lens and sensor together determines the image sharpness.
 Larger sensors capture more light, enhancing quality in low-light conditions.
 Relation: Lens and sensor quality directly impact image resolution and clarity.

Resolution and Image Quality:


 Resolution refers to the pixel count in an image, affecting its detail level.
 Higher resolution means more detail but requires more storage and processing.
 Sensor size also affects how much light is captured, influencing quality.

Image sampling and quantization:

 In digital image processing, two fundamental concepts are image sampling and quantization. These
processes are crucial for converting an analog image into a digital form that can be stored,
manipulated, and displayed by computers. Despite being closely related, sampling and quantization
serve distinct purposes and involve different techniques.

What is Image Sampling?


 Image sampling is the process of converting a continuous image (analog) into a discrete image
(digital) by selecting specific points from the continuous image. This involves measuring the image
at regular intervals and recording the intensity (brightness) values at those points.

How Image Sampling Works?

 Grid Overlay: A grid is placed over the continuous image, dividing it into small, regular sections.
 Pixel Selection: At each intersection of the grid lines, a sample point (pixel) is chosen.

Examples of Sampling

 High Sampling Rate: A digital camera with a high megapixel count captures more details because it
samples the image at more points.
 Low Sampling Rate: An old VGA camera with a lower resolution captures less detail because it
samples the image at fewer points.
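The effect of the sampling rate can be sketched in a few lines of Python. The example below is illustrative only (a synthetic sinusoidal pattern stands in for the continuous scene, and the function names are hypothetical):

```python
import numpy as np

def scene(x, y):
    # A smooth sinusoidal pattern standing in for a continuous scene.
    return 0.5 + 0.5 * np.sin(2 * np.pi * 4 * x) * np.cos(2 * np.pi * 4 * y)

def sample_image(f, width, height):
    # Evaluate the continuous intensity function f(x, y) on a regular grid.
    xs = np.linspace(0.0, 1.0, width)
    ys = np.linspace(0.0, 1.0, height)
    X, Y = np.meshgrid(xs, ys)
    return f(X, Y)

low_res = sample_image(scene, 16, 16)      # few sample points: coarse detail
high_res = sample_image(scene, 256, 256)   # many sample points: fine detail
print(low_res.shape, high_res.shape)       # (16, 16) (256, 256)
```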

What is Image Quantization?

 Image quantization is the process of converting the continuous range of pixel values (intensities)
into a limited set of discrete values. This step follows sampling and reduces the precision of the
sampled values to a manageable level for digital representation.
How Image Quantization Works?
 Value Range Definition: The continuous range of pixel values is divided into a finite number of
intervals or levels.
 Mapping Intensities: Each sampled pixel intensity is mapped to the nearest interval value.
 Assigning Discrete Values: The original continuous intensity values are replaced by the discrete
values corresponding to the intervals.
Examples of Quantization
 High Quantization Levels: An image with 256 levels (8 bits per pixel) can represent shades of gray
more accurately.
 Low Quantization Levels: An image with only 4 levels (2 bits per pixel) has much less detail and
appears more posterized.
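Uniform quantization is easy to make concrete. The sketch below is illustrative (it assumes intensities normalized to [0, 1]) and maps continuous values onto 2^bits discrete gray levels:

```python
import numpy as np

def quantize(image, bits):
    # Map float intensities in [0, 1] onto 2**bits equally spaced levels.
    levels = 2 ** bits
    indices = np.round(image * (levels - 1))   # nearest level index
    return indices / (levels - 1)              # back to [0, 1]

ramp = np.linspace(0.0, 1.0, 9)   # a smooth ramp of intensities
print(quantize(ramp, 8))          # 256 levels: ramp is nearly unchanged
print(quantize(ramp, 2))          # 4 levels: visibly posterized steps
```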

Some basic relationships between pixels:

In image processing, understanding the relationships between pixels is crucial for various operations such as
filtering, segmentation, and object recognition.
Adjacency of Pixels is a fundamental concept where pixels are considered adjacent if they are close to each
other in the image grid. In a 4-adjacency scheme, a pixel is adjacent to its vertical and horizontal neighbors,
while in 8-adjacency, it is adjacent to its vertical, horizontal, and diagonal neighbors. These relationships
are important for operations like image dilation, edge detection, and flood fill, where the connectivity
between neighboring pixels defines how an operation spreads across an image.
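A short sketch (illustrative Python; the helper name is hypothetical) that enumerates the in-bounds neighbors of a pixel under each adjacency scheme:

```python
def neighbors(r, c, h, w, connectivity=4):
    # Offsets for the 4 vertical/horizontal neighbors (N, S, W, E).
    offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    if connectivity == 8:
        # 8-adjacency adds the four diagonal neighbors.
        offsets += [(-1, -1), (-1, 1), (1, -1), (1, 1)]
    return [(r + dr, c + dc) for dr, dc in offsets
            if 0 <= r + dr < h and 0 <= c + dc < w]

print(neighbors(0, 0, 5, 5, connectivity=4))  # corner pixel: only 2 neighbors
print(neighbors(2, 2, 5, 5, connectivity=8))  # interior pixel: all 8 neighbors
```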

Pixel Intensity refers to the brightness or color value of a pixel. In grayscale images, pixel intensity is
typically represented by values between 0 (black) and 255 (white), while in color images, each pixel has
multiple intensity values corresponding to different color channels (such as RGB). Pixel intensity plays a
key role in image processing tasks such as enhancement, thresholding, and segmentation, as it directly
influences the visual appearance of the image. Changes in intensity values can highlight important features
such as edges, textures, or contrasts within the image.
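As a small illustration of intensity at work, the sketch below applies a global threshold to an 8-bit grayscale array (an illustrative NumPy example; the threshold value is arbitrary):

```python
import numpy as np

def threshold(image, t):
    # Binary image: 255 where intensity exceeds t, 0 elsewhere.
    return np.where(image > t, 255, 0).astype(np.uint8)

gray = np.array([[10, 120],
                 [200, 40]], dtype=np.uint8)
print(threshold(gray, 100))
# [[  0 255]
#  [255   0]]
```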

The Distance Between Pixels is another important relationship, which can be measured in terms of spatial
or color difference. The Euclidean distance is commonly used to measure the spatial proximity or the
similarity between the colors of two pixels, while the Manhattan distance sums the absolute differences in
pixel positions. These measurements are important in algorithms that involve clustering, segmentation, or
pattern recognition, where pixel relationships are used to group similar regions or detect patterns in the
image.
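Both spatial distance measures can be written directly, as in this illustrative sketch on pixel coordinates:

```python
import math

def euclidean(p, q):
    # Straight-line distance between pixel positions p and q.
    return math.hypot(p[0] - q[0], p[1] - q[1])

def manhattan(p, q):
    # Sum of absolute coordinate differences ("city-block" distance).
    return abs(p[0] - q[0]) + abs(p[1] - q[1])

p, q = (1, 2), (4, 6)
print(euclidean(p, q))  # 5.0
print(manhattan(p, q))  # 7
```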

Pixel Connectivity defines how pixels are connected to each other in terms of adjacency. In 4-connectivity,
a pixel is connected to its immediate vertical and horizontal neighbors, while in 8-connectivity, it is
connected to its vertical, horizontal, and diagonal neighbors. Connectivity is crucial for image segmentation,
region growing, and morphological operations, as it helps in determining the boundaries of objects and
regions of interest in an image.
Finally, a Pixel’s Neighborhood refers to the surrounding pixels that are used for various image processing
techniques, particularly those that involve local operations. In convolution or filtering operations, a pixel's
neighborhood is often defined by a small region around the pixel that influences the operation’s outcome.
The neighborhood relationships play a significant role in defining local image characteristics, such as
texture, gradients, and features that contribute to higher-level tasks like object recognition and scene
interpretation.
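A minimal sketch of such a local operation is a 3×3 mean filter, computed here directly over each interior pixel (illustrative only; the one-pixel border is left unfiltered for brevity):

```python
import numpy as np

def mean_filter_3x3(image):
    # Replace each interior pixel by the average of its 3x3 neighborhood.
    out = image.astype(float)
    h, w = image.shape
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            out[r, c] = image[r - 1:r + 2, c - 1:c + 2].mean()
    return out

img = np.arange(25, dtype=float).reshape(5, 5)
print(mean_filter_3x3(img)[1:-1, 1:-1])  # each value is its 3x3 average
```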

Introduction to the mathematical tools used in digital image processing:


The 2D Discrete Fourier Transform (DFT) is an extension of the 1D DFT to two dimensions, commonly used for analyzing spatial frequencies in images and other 2D signals. For an M×N function f(x, y), the 2D DFT is defined as

F(u, v) = Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} f(x, y) · e^{−j2π(ux/M + vy/N)}.

The main properties of the 2D DFT are as follows:

1. Linearity

The 2D DFT is a linear transformation. If f(x, y) and g(x, y) are two functions, and a and b are scalars, then:

DFT[a·f(x, y) + b·g(x, y)] = a·DFT[f(x, y)] + b·DFT[g(x, y)].
2. Shift Theorem

Shifting the spatial domain (image) causes a phase shift in the frequency domain. If f(x, y) is shifted by (x0, y0), then:

DFT[f(x − x0, y − y0)] = F(u, v) · e^{−j2π(ux0/M + vy0/N)},

where M and N are the dimensions of the 2D signal.
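A quick numerical check of this property (an illustrative NumPy sketch; np.roll implements the circular shift that the DFT assumes):

```python
import numpy as np

M, N = 8, 8
f = np.random.rand(M, N)
x0, y0 = 2, 3

F = np.fft.fft2(f)
F_shifted = np.fft.fft2(np.roll(f, shift=(x0, y0), axis=(0, 1)))

# Expected phase factor e^{-j 2 pi (u*x0/M + v*y0/N)}.
u = np.arange(M).reshape(-1, 1)
v = np.arange(N).reshape(1, -1)
phase = np.exp(-2j * np.pi * (u * x0 / M + v * y0 / N))

print(np.allclose(F_shifted, F * phase))  # True
```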

3. Periodicity

The 2D DFT is periodic in both dimensions. If F(u, v) is the DFT of f(x, y), then:

F(u, v) = F(u + M, v) = F(u, v + N),

where M and N are the dimensions of the original function.

4. Symmetry

For a real-valued function f(x, y), the 2D DFT exhibits conjugate symmetry:

F(−u, −v) = F*(u, v),

where F* denotes the complex conjugate. This means the magnitude spectrum is symmetric about the origin.

5. Energy Conservation (Parseval’s Theorem)

The total energy in the spatial domain is equal to the total energy in the frequency domain:

Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} |f(x, y)|² = (1/MN) · Σ_{u=0}^{M−1} Σ_{v=0}^{N−1} |F(u, v)|².
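This is easy to verify numerically (an illustrative sketch; np.fft.fft2 computes the unnormalized forward transform, which matches the 1/MN factor above):

```python
import numpy as np

M, N = 8, 8
f = np.random.rand(M, N)
F = np.fft.fft2(f)

spatial_energy = np.sum(np.abs(f) ** 2)
frequency_energy = np.sum(np.abs(F) ** 2) / (M * N)
print(np.allclose(spatial_energy, frequency_energy))  # True
```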

6. Convolution Theorem

The 2D DFT of a (circular) convolution in the spatial domain is equivalent to pointwise multiplication in the frequency domain:

DFT[f(x, y) ∗ g(x, y)] = F(u, v) · G(u, v).
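A numerical check (an illustrative sketch; note that the DFT identity holds for circular convolution, so the convolution below is built directly with wrap-around indexing rather than a linear-convolution routine):

```python
import numpy as np

M, N = 8, 8
f = np.random.rand(M, N)
g = np.random.rand(M, N)

# Circular convolution by direct summation with wrap-around indices.
conv = np.zeros((M, N))
for x in range(M):
    for y in range(N):
        for m in range(M):
            for n in range(N):
                conv[x, y] += f[m, n] * g[(x - m) % M, (y - n) % N]

print(np.allclose(np.fft.fft2(conv),
                  np.fft.fft2(f) * np.fft.fft2(g)))  # True
```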

7. Correlation

The 2D DFT of a correlation in the spatial domain corresponds to pointwise multiplication in the frequency domain, with one of the functions conjugated:

DFT[f(x, y) ∘ g(x, y)] = F*(u, v) · G(u, v).

8. Scaling and Resolution

 Scaling the image in the spatial domain (e.g., shrinking or enlarging) inversely affects the resolution in
the frequency domain.
 A larger spatial extent corresponds to finer resolution in the frequency domain and vice versa.

9. Separable Computation

The 2D DFT can be computed as two sequential 1D DFTs, as sketched below:

 First, perform the 1D DFT along the rows.
 Then, perform the 1D DFT along the columns.
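An illustrative NumPy sketch of the two-pass computation:

```python
import numpy as np

f = np.random.rand(8, 8)
rows_done = np.fft.fft(f, axis=1)         # 1D DFT along each row
two_pass = np.fft.fft(rows_done, axis=0)  # then 1D DFT along each column

print(np.allclose(two_pass, np.fft.fft2(f)))  # True
```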

10. Duality

The 2D DFT and its inverse (IDFT) exhibit duality, meaning that the roles of the spatial and frequency
domains can be interchanged.
