Unit I DIP PPT Final Subathra
UNIT- I
DIGITAL IMAGE FUNDAMENTALS
Topics to be covered in Unit-I
Learning Outcome:
• Summarize the fundamentals of DIP
• List the various applications of DIP
• Summarize the concept of Image Representation
• Illustrate and describe the fundamental steps and
components in DIP
What is an Image?
An image is a spatial representation of a two-
dimensional or three-dimensional scene.
An image is an array, or a matrix, of pixels (picture
elements) arranged in columns and rows.
What is Image Processing?
Why digital image processing?
Interest in digital image processing methods stems from
two principal application areas: improvement of pictorial
information for human interpretation, and processing of
scene data for autonomous machine perception.
Digital Image:
When x, y and the intensity values of f are all finite, discrete
quantities, we call the image a digital image.
Color Image:
g(x, y) → Discretization → g(i, j) → Quantization → f(i, j) (digital image)
Image Processing → Image Analysis → Vision
• Low-level process (image processing): reduce noise, contrast enhancement, image sharpening
• Mid-level process (image analysis): segmentation, classification
• High-level process (vision): making sense of an ensemble of recognized objects
Origins of Digital Image Processing
One of the first applications of digital images was in the newspaper
industry, when pictures were first sent by submarine cable between
London and New York.
Gamma-Ray Imaging
X-Ray Imaging
Imaging in the Ultraviolet Band
Imaging in the Visible and Infrared Bands
Imaging in the Microwave Band
Imaging in the Radio Band
Gamma-Ray Imaging
Major uses of imaging based on gamma rays include nuclear medicine.
[Figures: bone scan, PET image, angiogram]
Learning Outcome:
• List the fundamental steps of DIP
• Define different processes in DIP
• Illustrate the block diagram of fundamental steps of DIP
Fundamental Steps in Digital Image Processing
Essential steps when processing digital images:
Outputs of these steps are digital images:
• Acquisition
• Enhancement
• Restoration
• Color image processing
• Wavelets
• Morphological processing
Outputs of these steps are attributes of the image:
• Segmentation
• Representation
• Recognition
Fundamental Steps in Digital Image Processing
https://directpoll.com/c?XDVhEtaJSgYCRlFBwNAW3VgGY7t8bwQ6
Session 3
Elements of a Digital Image Processing System
Objective:
• To understand the fundamental elements of digital image
processing
Learning Outcome:
• Identify the fundamental elements/components of DIP
• Illustrate the block diagram of fundamental elements of DIP
System
General Purpose Image Processing System
Components of an Image Processing
System
Image Sensors
• Two elements are required to acquire digital images: a physical device that is
sensitive to the energy radiated by the object we wish to image, and a digitizer
that converts the output of the sensing device into digital form.
Specialized Image Processing Hardware
• Usually consists of the digitizer plus hardware that performs primitive
operations, such as an arithmetic logic unit (ALU).
• In other words, this unit performs functions that require fast data
throughputs that the typical main computer cannot handle.
Computer
• The computer in an image processing system is a general-purpose
computer and can range from a PC to a supercomputer.
Image Processing Software
• Software for image processing consists of specialized modules that perform
specific tasks.
• A well-designed package also includes the capability for the user to write
code that, as a minimum, utilizes the specialized modules.
Mass Storage Capability
• Mass storage capability is a must in image processing applications.
• An image of size 1024 × 1024 pixels, in which the intensity of each pixel is an
8-bit quantity, requires one megabyte of storage space if the image is not
compressed (1024 × 1024 pixels × 1 byte per pixel = 2^20 bytes ≈ 1 MB).
• Digital storage for image processing applications falls into three principal
categories: (1) short-term storage for use during processing, (2) on-line
storage for relatively fast recall, and (3) archival storage, characterized by
infrequent access.
Hardcopy Devices
• Devices used for recording images include laser printers, film cameras,
heat-sensitive devices, inkjet units, and digital units such as optical and
CD-ROM disks.
Networking
• Networking is almost a default function in any computer system in use today.
• Because of the large amount of data inherent in image processing
applications, the key consideration in image transmission is
bandwidth.
• In dedicated networks, this typically is not a problem, but
communications with remote sites via the internet are not always as
efficient.
Learning Assessment!!! (Direct Poll)
http://etc.ch/kZQF
Session 4 & 5
Elements of Visual Perception and Image Formation
Objective:
• To understand the mechanism of Visual Perception & Image
Formation in Eye
Learning Outcome:
• Draw and label the anatomy of Eye
• Summarize the mechanism behind Image Formation in eye
• State Brightness Adaptation & Discrimination
• State Mach Band Effect
Preview
• Structure of human eye
• Brightness adaptation and Discrimination
• Image formation in human eye and Image formation model
• Basics of exposure
• Resolution
– Sampling and quantization
• Research issues
Objective:
• To understand the concept of visual perception and the various
elements of the eye that contribute to perception.
Learning Outcome:
• Draw the structure of Human eye
• Relate each part of the eye to image formation and visual
perception
Structure of the human eye
Cones vs. Rods
Image Formation in the Eye
• Example: calculation of the retinal image of an object
• An observer looks at a tree 15 m high from a distance of 100 m. Taking the
distance between the centre of the lens and the retina as 17 mm, the height x
of the retinal image follows from similar triangles:
$$\frac{15}{100} = \frac{x}{17} \;\Rightarrow\; x \approx 2.55\ \text{mm}$$
Image Formation in the Eye
• Dynamic range of
human visual system
– about $10^{-6}$ to $10^{4}$
• Cannot accomplish this
range simultaneously
• The current sensitivity
level of the visual
system is called the
brightness adaptation
level
Brightness discrimination
• Weber ratio (measured experimentally): $\Delta I_c / I$
– $I$: the background illumination
– $\Delta I_c$: the increment of illumination that is just noticeable against the background
– Small Weber ratio indicates good discrimination
– Larger Weber ratio indicates poor discrimination
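As a rough illustrative calculation (the numbers are assumed for the example, not taken from the slides): if the background illumination is $I = 100$ units and the smallest increment an observer can just detect is $\Delta I_c = 2$ units, then
$$\frac{\Delta I_c}{I} = \frac{2}{100} = 0.02,$$
which would indicate good brightness discrimination.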
Brightness Adaptation
• Subjective brightness: Intensity as perceived
by the human visual system which is a
logarithmic function of the light intensity
incident on the eye
• Brightness adaptation: For any given set of
conditions, the current sensitivity level of the
visual system is called the brightness
adaptation.
Two phenomena clearly demonstrate that
perceived brightness is not a simple function of
intensity.
• Mach Band Effect
• Simultaneous contrast
MACH BAND EFFECT
• The first is based on the fact that the visual
system tends to undershoot or overshoot around
the boundary of regions of different intensities.
• Figure 2.7(a) shows a striking example of this
phenomenon.
• Although the intensity of the stripes is constant,
we actually perceive a brightness pattern that is
strongly scalloped near the boundaries [Fig.
2.7(c)].
• These seemingly scalloped bands are called Mach
bands after Ernst Mach, who first described the
phenomenon in 1865.
Brightness Adaptation of Human Eye (cont.)
[Figure: intensity plotted against position for two adjacent regions A and B]
• For digital images the minimum gray level is usually 0, but the maximum
depends on number of quantization levels used to digitize an image. The
most common is 256 levels, so that the maximum level is 255.
Luminance
• Intensity per unit area
• Measured in lumens (lm); gives a measure of the
amount of energy an observer perceives from a
light source
• Refers to light emitted or reflected from the object
The research
• Artificial retina
• Artificial vision
• 3-D interpretation of line drawings
• Compressive sensing
Summary
• Structure of human eye
– Photo-receptors on retina (cones vs. rods)
• Brightness adaptation
• Brightness discrimination (Weber ratio)
• Be aware of psychovisual effects
• Image formation models
Learning Assessment!!! (Direct Poll)
http://etc.ch/kZQF
Session 6
Image Sensing & Acquisition
Image Sensing and Acquisition
• Images are generated by the combination of an
“illumination” source and the reflection or absorption of
energy from that source by the elements of the “scene”
being imaged.
• Illumination may originate from a source of electromagnetic
energy such as radar, infrared, or X-ray energy.
• Depending on the nature of the source, illumination energy
is reflected from, or transmitted through, objects.
• Examples: light reflected from a planar surface; X-rays transmitted
through a patient’s body for the purpose of generating a
diagnostic X-ray film.
• In some applications, the reflected or transmitted energy is
focused onto a photo converter (e.g., a phosphor screen),
which converts the energy into visible light.
Image Sensing and Acquisition
• The imaging strip gives one line of an image at a time, and the motion
of the strip completes the other dimension of a two-dimensional
image.
Image Acquisition Using Sensor Strips
• Sensor strips mounted in a ring configuration are
used in medical and industrial imaging to obtain
cross-sectional (“slice”) images of 3-D objects, as
Figure shows.
Image Acquisition Using Sensor Strips:
• A rotating X-ray source provides illumination, and the portion of
the sensors opposite the source collects the X-ray energy that
passes through the object (the sensors obviously have to be
sensitive to X-ray energy).
• This is the basis for medical and industrial computerized axial
tomography (CAT).
• Output of the sensors must be processed by reconstruction
algorithms whose objective is to transform the sensed data into
meaningful cross-sectional images.
• In other words, images are not obtained directly from the
sensors by motion alone; they require extensive processing.
• A 3-D digital volume consisting of stacked images is generated as
the object is moved in a direction perpendicular to the sensor
ring.
• Other modalities of imaging based on the CAT principle include
magnetic resonance imaging (MRI) and positron emission
tomography (PET).
Image Acquisition Using Sensor Arrays:
• Figure shows individual sensors arranged in the form of a 2-D array.
• Nature of f(x,y):
– Illumination: i(x,y)
– Reflectance: r(x,y)
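For reference, the product form of this model, as it is usually written (the bounds are standard assumptions rather than values stated on the slide):
$$f(x, y) = i(x, y)\, r(x, y), \qquad 0 < i(x, y) < \infty, \qquad 0 < r(x, y) < 1,$$
where $r(x, y) = 0$ corresponds to total absorption and $r(x, y) = 1$ to total reflectance.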
Learning Outcome:
• Differentiate Sampling and Quantization
• Elucidate the concept of Spatial Resolution and Intensity
Resolution
• Define Image Interpolation
Sampling & Quantization
• There are numerous ways to acquire images, but our
objective in all is the same: to generate digital images from
sensed data.
The effects of digitization can be studied by:
– varying the numbers of samples N, M (spatial resolution)
– varying k, the number of bits per pixel (intensity resolution)
– varying both
Sampling & Quantization
• Conclusions (illustrated in the sketch below):
– The quality of images increases as N and k increase
– Sometimes, for fixed N, quality improves when k is
decreased (because of the increased contrast)
– For images with large amounts of
detail, few gray levels may be needed
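A minimal Python sketch of these two effects (the random test image and the helper functions are assumptions for illustration, not from the slides): reducing the sampling grid lowers spatial resolution, and reducing k to a few bits leaves only a handful of gray levels.

```python
# Minimal sketch: vary spatial resolution (N) and intensity resolution (k bits).
import numpy as np

def resample(img, factor):
    """Subsample by an integer factor, then replicate pixels back up
    (nearest-neighbour zoom) so the loss of spatial resolution is visible."""
    small = img[::factor, ::factor]
    return np.repeat(np.repeat(small, factor, axis=0), factor, axis=1)

def quantize(img, k):
    """Requantize an 8-bit image to 2**k intensity levels."""
    levels = 2 ** k
    step = 256 // levels
    return (img // step) * step

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in image
    coarse = resample(img, 4)      # fewer samples: N effectively reduced
    banded = quantize(img, 3)      # only 8 gray levels remain
    print(coarse.shape, np.unique(banded).size)
```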
Dithering
Definitions
Some Basic Matrix Operations
Review: Matrices and Vectors
Some Basic Matrix Operations (Con't)
ORTHOGONAL AND UNITARY MATRICES
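For reference (standard definitions, stated here because the slide relies on them): a real square matrix $A$ is orthogonal when its transpose is its inverse, and a complex square matrix is unitary when its conjugate transpose $A^{*}$ is its inverse:
$$A^{T}A = AA^{T} = I \quad \text{(orthogonal)}, \qquad A^{*}A = AA^{*} = I \quad \text{(unitary)}.$$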
BLOCK MATRIX
• Any matrix whose elements are matrices
themselves is called a block matrix.
Learning Assessment!!! (Direct Poll)
http://etc.ch/kZQF
Session 8
Color Models
Objective:
• To know about the concept of color models
• To know the application of Color Model Transformations
Learning Outcome:
• Differentiate and list the applications of Color Models
• Reproduce the basic concept of mathematical tools used in
DIP
• Explain Image Operations on Pixel Basis in the context of color
models
Color Fundamentals
• Color Image Processing is divided into two major areas:
• 1) Full-color processing
• Images are acquired with a full-color sensor, such as a
color TV camera or color scanner
• Used in publishing, visualization, and the Internet
• 2) Pseudo color processing
• Assigning a color to a particular monochrome intensity
or range of intensities
Color Fundamentals
• In 1666, Sir Isaac Newton discovered that when a beam of
sunlight passes through a glass prism, the emerging beam
of light is split into a spectrum of colors ranging from violet
at one end to red at the other
Color Fundamentals
• Visible light occupies a narrow band of frequencies in the EM spectrum
• A body that reflects light that is balanced in all visible
wavelengths appears white
• However, a body that favors reflectance in a limited range
of the visible spectrum exhibits some shades of color
• Green objects reflect wavelength in the 500 nm to 570 nm
range while absorbing most of the energy at other
wavelengths
Color Fundamentals
• If the light is achromatic (void of color), its only attribute is
its intensity, or amount
• Chromatic light spans EM from 380 to 780 nm
• Three basic quantities to describe the quality:
• 1) Radiance is the total amount of energy that flows from
the light source, and it is usually measured in watts (W)
• 2) Luminance, measured in lumens (lm), gives a
measure of the amount of energy an observer
perceives from a light source
• For example, light emitted from a source operating in the
far infrared region of the spectrum could have significant
energy (radiance), but an observer would hardly perceive
it; its luminance would be almost zero
Color Fundamentals
• 3) Brightness is a subjective descriptor that is practically
impossible to measure. It embodies the achromatic notion
of intensity and is one of the key factors in describing color
sensation
Color Fundamentals
(R = 0) (G = 0) (B = 0)
RGB Color Models
• Many systems in use today are limited to 256 colors, and many
applications require only a few colors
• Given the variety of systems in current use, it is of
considerable interest to have a subset of colors that are likely
to be reproduced faithfully; this subset is called the set of safe RGB
colors, or the set of all-systems-safe colors
• In Internet applications, they are called safe Web colors or
safe browser colors
• It is assumed that 256 colors is the minimum number
of colors that can be reproduced faithfully by any system
• Forty of these 256 colors are known to be processed differently by
various operating systems, leaving 216 colors that are common to most systems
RGB Color Models
TABLE 6.1 Valid values of each RGB component in a safe color.

Number system | Color equivalents
Hex           | 00 | 33 | 66  | 99  | CC  | FF
Decimal       | 0  | 51 | 102 | 153 | 204 | 255

FIGURE 6.10 (a) The 216 safe RGB colors. (b) All the grays in the 256-color RGB
system (grays that are part of the safe color group are shown underlined).
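Since each of the three RGB components of a safe color may take only the six values in Table 6.1, the number of safe colors is $6^3 = 216$.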
CMY and CMYK Color Models
• CMY are the secondary colors of light, or, alternatively, the
primary colors of pigments
• For example, when a surface coated with cyan pigment is
illuminated with white light, no red light is reflected from
the surface because cyan subtracts red light from reflected
white light
• Color printers and copiers require CMY data input or
perform RGB to CMY conversion internally
$$\begin{bmatrix} C \\ M \\ Y \end{bmatrix} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} - \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$
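A minimal Python sketch of this relation (the function name and the NumPy-based implementation are illustrative assumptions, not from the slides); RGB values are assumed to be normalized to [0, 1].

```python
# Minimal sketch of the CMY relation C = 1 - R, M = 1 - G, Y = 1 - B.
import numpy as np

def rgb_to_cmy(rgb):
    """rgb: float array in [0, 1] with shape (..., 3). Returns CMY in [0, 1]."""
    return 1.0 - np.asarray(rgb, dtype=float)

# The RGB color cyan maps to pure cyan pigment: C = 1, M = 0, Y = 0
print(rgb_to_cmy([0.0, 1.0, 1.0]))   # -> [1. 0. 0.]
```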
CMY and CMYK Color Models
• Equal amounts of the pigment primaries, cyan, magenta,
and yellow should produce black
• In practice, combining these colors for printing produces a
muddy-looking black
• So, in order to produce true black, a fourth color, black, is
added, giving rise to the CMYK color model
HSI Color Model
https://www.lightingschool.eu/portfolio/understanding-the-light/#41-hue-saturation-brightness
[Figure: the HSI color model, showing hue, saturation and intensity relative to the RGB primaries]

$$I = \frac{1}{3}(R + G + B)$$
Converting from HSI to RGB
• RG sector ($0° \le H < 120°$):
$$B = I(1 - S), \qquad R = I\left[1 + \frac{S\cos H}{\cos(60° - H)}\right], \qquad G = 3I - (R + B)$$
Converting from HSI to RGB
• GB sector ($120° \le H < 240°$): first let $H = H - 120°$, then
$$R = I(1 - S), \qquad G = I\left[1 + \frac{S\cos H}{\cos(60° - H)}\right], \qquad B = 3I - (R + G)$$
Converting from HSI to RGB
• BR sector ($240° \le H \le 360°$): first let $H = H - 240°$, then
$$G = I(1 - S), \qquad B = I\left[1 + \frac{S\cos H}{\cos(60° - H)}\right], \qquad R = 3I - (G + B)$$
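A minimal Python sketch of the three sector equations above (the function name and the degree-based interface are assumptions for illustration, not from the slides).

```python
# Minimal sketch: HSI -> RGB, with H in degrees [0, 360), S and I in [0, 1].
import math

def hsi_to_rgb(h, s, i):
    h = h % 360.0
    if h < 120.0:                      # RG sector
        b = i * (1.0 - s)
        r = i * (1.0 + s * math.cos(math.radians(h)) /
                 math.cos(math.radians(60.0 - h)))
        g = 3.0 * i - (r + b)
    elif h < 240.0:                    # GB sector
        h -= 120.0
        r = i * (1.0 - s)
        g = i * (1.0 + s * math.cos(math.radians(h)) /
                 math.cos(math.radians(60.0 - h)))
        b = 3.0 * i - (r + g)
    else:                              # BR sector
        h -= 240.0
        g = i * (1.0 - s)
        b = i * (1.0 + s * math.cos(math.radians(h)) /
                 math.cos(math.radians(60.0 - h)))
        r = 3.0 * i - (g + b)
    return r, g, b

print(hsi_to_rgb(0.0, 1.0, 1.0 / 3.0))   # pure red: approximately (1, 0, 0)
```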
HSI Color Model
• HSI (Hue, saturation, intensity) color model, decouples the
intensity component from the color-carrying information
(hue and saturation) in a color image
Intensity to Color Transformations
• Achieves a wider range of pseudocolor enhancement
results than the simple intensity-slicing technique
• Idea underlying this approach is to perform three
independent transformations on the intensity of any input
pixel
• Three results are then fed separately into the red, green,
and blue channels of a color television monitor
• This produces a composite image whose color content is
modulated by the nature of the transformation functions
• The transformations are functions of intensity only, not of position
• The transformation functions are typically nonlinear (see the sketch below)
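A minimal Python sketch of this idea (the sinusoidal transformation functions and their frequencies/phases are assumptions chosen for illustration; the slides do not specify particular functions): three independent transformations of the input intensity feed the R, G and B channels.

```python
# Minimal sketch: intensity-to-color (pseudocolor) transformation.
import numpy as np

def intensity_to_color(gray, freq=(1.0, 1.5, 2.0), phase=(0.0, 0.5, 1.0)):
    """gray: uint8 array. Returns an (H, W, 3) float RGB image in [0, 1]."""
    z = gray.astype(float) / 255.0 * np.pi           # map intensities to [0, pi]
    channels = [0.5 * (1.0 + np.sin(f * z + p)) for f, p in zip(freq, phase)]
    return np.stack(channels, axis=-1)               # R, G, B planes

gray = np.tile(np.arange(256, dtype=np.uint8), (64, 1))  # intensity ramp
rgb = intensity_to_color(gray)
print(rgb.shape, rgb.min(), rgb.max())
```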
Introduction to Mathematical Operations in DIP
$$H[f(x, y)] = g(x, y)$$
If
$$H[a_i f_i(x, y) + a_j f_j(x, y)] = H[a_i f_i(x, y)] + H[a_j f_j(x, y)] \quad \text{(additivity)}$$
$$= a_i H[f_i(x, y)] + a_j H[f_j(x, y)] \quad \text{(homogeneity)}$$
$$= a_i g_i(x, y) + a_j g_j(x, y),$$
then
H is said to be a linear operator;
H is said to be a nonlinear operator if it does not meet the above
qualification.
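A quick numerical check of this definition (the 2 × 2 test arrays and coefficients are assumed for illustration): the sum operator satisfies the additivity/homogeneity condition, while the max operator does not.

```python
# Minimal sketch: sum is a linear operator, max is not.
import numpy as np

f1 = np.array([[0, 2], [2, 3]])
f2 = np.array([[6, 4], [6, 1]])
a1, a2 = 1, -1

sum_of_op = a1 * np.sum(f1) + a2 * np.sum(f2)   # a1*H[f1] + a2*H[f2]
op_of_sum = np.sum(a1 * f1 + a2 * f2)           # H[a1*f1 + a2*f2]
print(sum_of_op == op_of_sum)                   # True: sum is linear

max_lhs = np.max(a1 * f1 + a2 * f2)             # H[a1*f1 + a2*f2]
max_rhs = a1 * np.max(f1) + a2 * np.max(f2)     # a1*H[f1] + a2*H[f2]
print(max_lhs == max_rhs)                       # False: max is nonlinear
```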
Image averaging: a set of $K$ noisy images $g_i(x, y) = f(x, y) + n_i(x, y)$ is averaged to form
$$\bar{g}(x, y) = \frac{1}{K} \sum_{i=1}^{K} g_i(x, y)$$
$$E\{\bar{g}(x, y)\} = E\left\{\frac{1}{K} \sum_{i=1}^{K} g_i(x, y)\right\} = E\left\{f(x, y) + \frac{1}{K} \sum_{i=1}^{K} n_i(x, y)\right\} = f(x, y)$$
$$\sigma^2_{\bar{g}(x, y)} = \frac{1}{K}\, \sigma^2_{n(x, y)}$$
As $K$ increases, the variability of the pixel values at each location $(x, y)$ decreases, so $\bar{g}(x, y)$ approaches $f(x, y)$.
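A minimal Python sketch of this result on synthetic data (the constant test image and the noise parameters are assumptions for illustration, not from the slides).

```python
# Minimal sketch: averaging K noisy observations g_i = f + n_i
# reduces the noise variance by roughly a factor of K.
import numpy as np

rng = np.random.default_rng(1)
f = np.full((128, 128), 100.0)                 # ideal (noise-free) image
sigma = 20.0                                   # std. dev. of the additive noise
K = 64

g = f + rng.normal(0.0, sigma, size=(K,) + f.shape)   # K noisy images
g_bar = g.mean(axis=0)                                 # averaged image

print(np.var(g[0] - f))     # ~ sigma**2        (single noisy image)
print(np.var(g_bar - f))    # ~ sigma**2 / K    (averaged image)
```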
Live images f(x,y): X-ray images captured at TV rates after injection of the
contrast medium
The procedure gives a movie showing how the contrast medium propagates
through the various arteries in the area being observed.
The set corresponding to a gray-scale image $f$:
$$A = \{(x, y, z) \mid z = f(x, y)\}$$
• The complement of $A$ is denoted $A^c$:
$$A^c = \{(x, y, K - z) \mid (x, y, z) \in A\}, \qquad K = 2^k - 1,$$
where $k$ is the number of intensity bits used to represent $z$.
• The union of two gray-scale sets $A$ and $B$:
$$A \cup B = \{\max_z(a, b) \mid a \in A, b \in B\}$$
• Single-pixel operations
Alter the values of an image’s pixels based on the intensity:
$$s = T(z)$$
• Geometric (spatial) transformations consist of two operations:
— a spatial transformation of coordinates:
$$(x, y) = T\{(v, w)\}$$
— intensity interpolation that assigns intensity values to the spatially transformed
pixels.
• Affine transform
$$\begin{bmatrix} x & y & 1 \end{bmatrix} = \begin{bmatrix} v & w & 1 \end{bmatrix} \mathbf{T} = \begin{bmatrix} v & w & 1 \end{bmatrix} \begin{bmatrix} t_{11} & t_{12} & 0 \\ t_{21} & t_{22} & 0 \\ t_{31} & t_{32} & 1 \end{bmatrix}$$
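A minimal Python sketch (assuming SciPy is available; the rotation angle and test image are illustrative, not from the slides) of applying an affine rotation with nearest-neighbour interpolation. Note that scipy.ndimage.affine_transform expects the mapping from output coordinates back to input coordinates, so the inverse of the rotation matrix is passed.

```python
# Minimal sketch: affine (rotation) transform with nearest-neighbour interpolation.
import numpy as np
from scipy import ndimage

theta = np.deg2rad(30.0)
# 2x2 part of the affine matrix (rotation about the array origin)
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

img = np.zeros((100, 100))
img[40:60, 20:80] = 1.0                     # a bright rectangle

# order=0 selects nearest-neighbour intensity interpolation
rotated = ndimage.affine_transform(img, np.linalg.inv(A), order=0)
print(rotated.shape)
```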
Image Registration
$$x = c_1 v + c_2 w + c_3 v w + c_4$$
$$y = c_5 v + c_6 w + c_7 v w + c_8$$
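A minimal Python sketch of estimating the eight coefficients from four tie points (the tie-point coordinates are made up for illustration): each equation is linear in its unknowns, so two 4 × 4 systems can be solved directly.

```python
# Minimal sketch: solve the bilinear registration model from four tie points.
import numpy as np

# Tie points: (v, w) in one image and the corresponding (x, y) in the other
vw = np.array([[10.0, 10.0], [10.0, 200.0], [200.0, 10.0], [200.0, 200.0]])
xy = np.array([[12.0, 15.0], [11.0, 208.0], [205.0, 13.0], [207.0, 211.0]])

# Each row of the design matrix is [v, w, v*w, 1]
M = np.column_stack([vw[:, 0], vw[:, 1], vw[:, 0] * vw[:, 1], np.ones(4)])
cx = np.linalg.solve(M, xy[:, 0])   # c1..c4 for the x equation
cy = np.linalg.solve(M, xy[:, 1])   # c5..c8 for the y equation
print(cx, cy)
```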
• Given T(u, v), the original image f(x, y) can be recovered using
the inverse transformation of T(u, v).
$$f(x, y) = \sum_{u=0}^{M-1} \sum_{v=0}^{N-1} T(u, v)\, s(x, y, u, v)$$
For the 2-D discrete Fourier transform pair:
$$T(u, v) = \sum_{x=0}^{M-1} \sum_{y=0}^{N-1} f(x, y)\, e^{-j 2\pi (ux/M + vy/N)}$$
$$f(x, y) = \frac{1}{MN} \sum_{u=0}^{M-1} \sum_{v=0}^{N-1} T(u, v)\, e^{\,j 2\pi (ux/M + vy/N)}$$
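A minimal Python sketch (not from the slides) showing that NumPy's fft2/ifft2 realize this transform pair; NumPy places the 1/MN factor in the inverse transform, matching the equations above.

```python
# Minimal sketch: 2-D DFT pair with NumPy.
import numpy as np

f = np.random.default_rng(2).random((8, 8))
T = np.fft.fft2(f)                  # forward 2-D DFT, T(u, v)
f_rec = np.fft.ifft2(T)             # inverse 2-D DFT

print(np.allclose(f, f_rec.real))   # True: f is recovered from T(u, v)
```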
$$\sum_{k=0}^{L-1} p(z_k) = 1$$
where $p(z_k)$ denotes the probability (normalized histogram value) of intensity level $z_k$.
Properties of the DFT:
• Periodic
• Has a fast algorithm (the FFT)
• Conjugate symmetric about N/2
• Circular convolution theorem
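A minimal 1-D Python sketch of the circular convolution theorem (the test sequences are assumed for illustration): multiplying DFTs and inverse-transforming gives the circular, not linear, convolution of the two sequences.

```python
# Minimal sketch: product of DFTs <-> circular convolution.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([1.0, 1.0, 0.0, 0.0])

via_dft = np.fft.ifft(np.fft.fft(x) * np.fft.fft(h)).real

# Direct circular convolution for comparison
N = len(x)
direct = np.array([sum(x[m] * h[(n - m) % N] for m in range(N)) for n in range(N)])

print(np.allclose(via_dft, direct))   # True
```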