Digital Image Processing - Lecture Weeks 1&2
Contents
• Introduction to images and image processing
• Electromagnetic radiation and image formation
• Image acquisition and sensing
• Use of Matlab
• Images and Matlab
• Image display
• Point operations and histograms
• Spatial filtering
• Noises and noise cleaning
• Differences and edges
• Fourier transforms and frequency filtering
• Color models and processing
Introduction to Images and Image Processing
Example figures: image sharpening (original image vs. enhanced original) and a processed thumb print image.
• Suppose we take an image, e.g. a photo.
• Suppose the photo is black and white (that is, lots of
shades of gray), so no color.
• We may consider this image as a two-dimensional
function f(x,y), where the function value gives the
brightness of the image at any given point (x,y).
• In other words, there is a function f that depends on
the spatial coordinates x and y.
• We may assume that in such an image brightness
values can be any real numbers in the range 0 (black)
to 1 (white).
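A minimal MATLAB sketch of this view, assuming the file (a placeholder name) stores a grayscale photo:

>> f = im2double(imread('photo.png'));  % 2D array of brightness values in the range [0,1]
>> f(120, 80)                           % brightness of the image at row 120, column 80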
Digital image
• A digital image differs from a photo in that the x, y and f(x,y)
values are all discrete.
• Usually they take on only integer values, e.g. x and y
ranging from 1 to 256 each, and the brightness values
ranging from 0 (black) to 255 (white).
• A digital image can be considered as a large array of discrete
dots, each of which has a brightness associated with it.
• These dots are picture elements called pixels.
• Surroundings of a given pixel form a neighborhood.
• A neighborhood can be characterized by its shape in the same
way as a matrix.
– E.g. we can speak of a 5*5 neighborhood, or of a 7*9 neighborhood.
Note: Except in very special circumstances, neighborhoods have odd numbers of rows and
columns, because this ensures that the current pixel is in the centre of the neighborhood.
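A small MATLAB sketch of picking out a 5*5 neighborhood around a current pixel (the image array I and the pixel location are assumed here):

>> r = 100; c = 120;             % current pixel, assumed to lie well inside the image
>> nbhd = I(r-2:r+2, c-2:c+2);   % 5*5 neighborhood centred on pixel (r,c)
>> size(nbhd)                    % 5 5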
Zoom
Figure: zooming into the original image (500x340 pixels) until individual pixels are visible.
Pixel location: p = (r, c) = (row #, column #)
Pixel value: I(p) = I(r, c)
Pixel: [p, I(p)]
For a color image, the pixel value I(p) is a vector of red, green and blue components.
Example
Figure: pixels and a 3*5 neighborhood around the current pixel.
Note: If a neighborhood has an even number of rows or columns (or both), it may
be necessary to specify which pixel in the neighborhood is the current pixel.
Some applications
• Image processing has an enormous range of applications in almost
every area of science and technology, e.g.:
• Industry
– Automatic inspection of items on a production line
• Agriculture
– Satellite/aerial views of land, for example to determine how much
land is being used for different purposes, or to investigate the
suitability of different regions for different crops, or inspection of fruit
and vegetables - distinguishing good and fresh produce from old.
• Medicine
– Inspection and interpretation of images obtained from X-rays, MRI
(magnetic resonance imaging) or CT scans.
• Law enforcement:
– Fingerprint analysis,
– Sharpening or de-blurring of speed-camera images,
– All kinds of surveillance camera applications
Digital image
• A digital image is a grid of squares, each of which contains a single color.
• Each square is called a pixel (for picture element).
Binary image
• Each pixel is just black or white.
• Since there are only two possible values for
each pixel, we only need one bit per pixel.
• Such images can therefore be very efficient in
terms of storage.
• Images for which a binary representation may
be suitable include text (printed or
handwriting), fingerprints, blueprints,
architectural plans etc.
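As a rough MATLAB illustration, assuming a grayscale array I with values 0..255 and a threshold chosen just for the example:

>> BW = I > 128;          % logical array: each pixel is either true (white) or false (black)
>> class(BW)              % logical, i.e. one bit of information per pixel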
Grayscale
• Each pixel is a shade of gray, normally from 0 (black) to
255 (white).
• This range means that each pixel can be represented by
eight bits i.e. exactly one byte.
• This is a very natural range for image file handling.
• Other grayscale ranges are used, but generally they are
a power of 2.
• Such images arise in medicine (X-rays), images of
printed works, and indeed 256 different gray levels is
sufficient for the recognition of most natural objects.
Example figure: a grayscale image C and its subimage C(150:170,210:235).
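A quick MATLAB check of this, with a placeholder file name:

>> G = imread('scan.png');     % grayscale image
>> class(G)                    % typically uint8: one byte per pixel, 256 possible gray levels
>> [min(G(:)) max(G(:))]       % values lie between 0 (black) and 255 (white)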
Color images
• Are constructed from
three intensity maps.
• Each intensity map is
projected through a color
filter (e.g., red, green, or
blue, or cyan, magenta, or
yellow) to create a
monochrome image.
• The intensity maps are
overlaid to create a color
image.
• Each pixel in a color image
is a three element vector.
Figure examples: CRT, LCD and plasma displays.
Color images
• Formation of a vector from corresponding pixel
values in three RGB component images
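A MATLAB sketch of this structure, using peppers.png, a sample image that ships with the Image Processing Toolbox:

>> RGB = imread('peppers.png');     % true-color image: an M x N x 3 array
>> size(RGB)                        % rows x columns x 3
>> squeeze(RGB(100, 200, :))        % the [red; green; blue] vector at pixel (100,200)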
Indexed image
• Most color images use only a small subset of the
more than sixteen million possible colors.
• For convenience of storage and file handling, the image
has an associated color map, or color palette, which is
simply a list of all the colors used in that image.
• Each pixel has a value which does not give its color (as
for an RGB image), but an index to the color in the
map.
• It is convenient if an image has 256 colors or less, for
then the index values will only require one byte each to
store.
• In this image the indices, rather than being the gray values
of the pixels, are simply indices into the color map.
• Without the color map, the image would be very dark and
colorless.
Example
A small index sample from the previous image:

>> ind(117:123,347:352)
ans =
     2     1     2     1     1     4
     1     4     4     1     1     1
     2     1     4     4     1     4
     5     5     2     2     1     1
     5     5     5     5     5     1
     5     5     5     5     5     5
     5     5     5     5     5     5

Color map (each row is an [R G B] triple):
    0.1451    0.1176    0.1412
    0.7451    0.2627    0.2314
    0.7412    0.5451    0.6392
    0.2392    0.2588    0.4000
    0.8706    0.6941    0.1490
    0.8706    0.8431    0.8196

Note: Many images are of course much larger than this. E.g. satellite images may be of
the order of ten thousand pixels in each direction.
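A minimal MATLAB sketch of reading and handling such an image (the file name is a placeholder):

>> [ind, map] = imread('indexed_image.png');  % index array and its color map
>> size(map)                                  % (number of colors) x 3, one [R G B] row per color
>> RGB = ind2rgb(ind, map);                   % look every index up in the map -> true-color image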
Image perception
• Much of image processing is concerned with
making an image appear better to human eyes.
• We should therefore be aware of the limitations
of the human visual system.
• Image perception consists of two basic steps:
– capturing the image with the eye,
– recognizing and interpreting the image with the visual
cortex in the brain.
• The combination and immense variability of
these steps influence the ways in which we perceive
the world around us.
Figure: image formation in the eye (the image formed is upside down; focal length).
Color vision
• The human eye has cones which are sensitive to
different wavelength bands
Figure: absorption of light by the cones in the human eye.
Figure: a grayscale image, its actual intensity and its perceived intensity.
Intensity resolution
• We can only resolve 2^6 = 64 or at most 2^7 = 128 intensity levels
on an ordinary computer screen.
• Based on hardware considerations, grayscale images are
usually stored with 8 bits per pixel, i.e. 2^8 = 256 intensity
levels.
• Some images are stored with more than 8 bits per pixel.
• E.g., CT images are stored with 12 bits per pixel, i.e. 2^12 = 4096
intensity levels.
• During image processing, pixels can preferably be stored with
more than 8 bits or as floating point numbers.
• Color images are usually stored with 3x8 bits per pixel: 2^8 red
intensity levels, 2^8 green intensity levels and 2^8 blue intensity
levels, giving 2^24 > 16 million different colors.
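A quick MATLAB check of these level counts:

>> 2^8                    % 256 gray levels from 8 bits per pixel
>> 2^12                   % 4096 levels for 12-bit CT images
>> 2^24                   % 16777216, i.e. more than 16 million colors from 3x8 bits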
Spatial resolution
• Spatial resolution is a measure of the smallest
discernible detail in the image.
• Two common measures:
– lp/mm (line pairs per mm)
– dpi (dots per inch)
• Typical print resolutions:
– Newspaper: 75 dpi
– Magazine: 133 dpi
– Glossy brochures: 175 dpi
– Gonzalez & Woods (textbook): 2400 dpi
Figure: 5 discernible line pairs per mm (5 lp/mm); 6 dots per inch (6 dpi); a newspaper image.
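A rough MATLAB conversion between the two measures, assuming at least two dots are needed per line pair (this assumption is not from the slides):

>> dpi = 75;                      % newspaper print resolution
>> dots_per_mm = dpi / 25.4       % about 3 dots per mm (1 inch = 25.4 mm)
>> lp_per_mm = dots_per_mm / 2    % at most about 1.5 line pairs per mm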
Figure: an image after sampling and after quantization (typically done in the camera).
Figure: a sensor element and a linear transformation, whose output goes to a D/A-converter
and further to the screen.
A D/A-converter converts a digital value to an analog value, an electrical voltage.
Electromagnetic Radiation
and Image Formation
EM radiation
• Electromagnetic radiation is energy which
propagates through space as electromagnetic
waves
• The waves consist of transverse electric and
magnetic fields that alternate with a temporal
frequency ν (hertz) and spatial wavelength λ
(meters)
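Frequency and wavelength are linked by the speed of light, c = νλ. A quick MATLAB example (the wavelength is chosen for illustration):

>> c = 3e8;                 % speed of light, approximately 3*10^8 m/s
>> lambda = 500e-9;         % green light, wavelength 500 nm
>> nu = c / lambda          % about 6*10^14 Hz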
Spectrum
• In practice, light consists of
– photons with a range of energies, or
– waves with a range of frequencies
• This mix of frequencies/wavelengths/energies is
called the spectrum of the light.
• The spectrum gives the total amount of energy
for each frequency/wavelength/energy.
• Monochromatic light consists of only one
frequency/wavelength
– Can be produced by special light sources, e.g., lasers
Spectrum
Color spectrum
• In 1666, Newton discovered that sunlight (white light) passing
through a glass prism splits up into a color spectrum of
wavelengths in the interval 400-700 nm.
Color wavelength
Classification of EM spectrum
Polarization
• The electromagnetic field has a direction
– Perpendicular to the direction of motion
• The polarization of the light is defined as the
direction of the electric field.
• Natural light is a mix of waves with polarization
in all possible directions: it is unpolarized
light.
• Special light sources or filters can produce
polarized light of well-defined polarization.
Polarization
• Plane polarization
– The electric field varies only in a single plane
Polarization
• Circular/elliptical polarization
– The electric field vector rotates
– Can be constructed as the sum of two plane polarized
waves with a 90° phase shift
Coherence
• The phase of the light waves can either be
– random: incoherent light (natural light)
– in a systematic relation: coherent light
• Coherent light is related to monochromatic
light sources
• Compare a red LED and a red laser
– Both produce light within a narrow range
– The LED light is incoherent
– The laser light is coherent
Radiation energy
• Light radiation has energy
– Each photon has a particular energy related to its
frequency (E = h ν)
– The number of photons of a particular frequency
gives the amount of energy for this frequency
– Described by the spectrum
– Unit: Joule (J = Ws = Watt second)
– Is usually not measured directly.
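A rough worked example of E = hν in MATLAB (the wavelength and constants are chosen here for illustration):

>> h = 6.626e-34;            % Planck's constant, J*s
>> c = 3e8;                  % speed of light, m/s (approximate)
>> lambda = 700e-9;          % red light, 700 nm
>> E = h * c / lambda        % E = h*nu = h*c/lambda, about 2.8*10^-19 J per photon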
Radiation power
• The power of the radiation, i.e., the energy
per unit time, is the radiant flux
– Since the energy depends on the frequency, so
does the radiant flux
– Unit: Watt (W = J/s = Joule per second)
– Is usually not measured directly.
Radiant intensity
• For point sources, or distant sources of small
extent, the flux density can also be measured per
unit solid angle
Basic principle
• Conservation of energy ⇒ A constant light
source must produce the same amount of
energy through a solid angle regardless of
distance to the source
– The radiant intensity is constant
– The radiant flux density decreases with the square
of the distance to the source
Radiometric chain
Basic principle
• Based on conservation of energy:
E0 = E1 + E2 + E3
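A plausible reading of these terms (an assumption here, since the slide figure defining them is not reproduced): E0 is the energy arriving at a surface, and E1, E2 and E3 are its reflected, absorbed and transmitted parts.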
Refraction
• The light that is transmitted into the new
medium is refracted due to the change in light
speed
• Snell’s law of refraction:
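– n1 sin(θ1) = n2 sin(θ2), where n1 and n2 are the refractive indices of the two media
and θ1, θ2 are the angles of incidence and refraction measured from the surface normal.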
Absorption
• Absorption implies attenuation of transmitted
or reflected light
• Materials get their colors as a result of
different amounts of absorption at different
wavelengths
– E.g., A red object attenuates wavelengths in the
red band less than in other bands.
Absorption
• The absorption of light in matter depends on the
length that the light travels through the material
Absorption spectrum
• The spectrum of the reflected/transmitted
light is given by
• s1 = incident spectrum
• s2 = reflected/transmitted spectrum
• a = absorption spectrum ( 0 ≤ a(ν) ≤ 1)
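• A natural reading of these definitions: s2(ν) = (1 − a(ν))·s1(ν), i.e. at each frequency
the fraction a(ν) is absorbed and the remaining fraction 1 − a(ν) is reflected or transmitted.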
Reflection
• Highly dependent on the surface type
Emission
• Any object, even one that is not considered a light
source, emits electromagnetic radiation, almost
independently of its interaction with incident light
– Primarily in the IR band, depending on its
temperature.
Scattering
• All media (except vacuum) scatter light
– E.g., air, water, glass etc.
• We can think of the medium as consisting of
small particles and with some probability they
reflect the light
– In any possible direction
– Different probability for different directions
– A weak effect, roughly proportional to λ^-4
– In general, the probability depends also on the
distribution of particle sizes
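A rough MATLAB illustration of the λ^-4 dependence (the wavelengths are chosen for illustration):

>> lambda_blue = 450e-9;  lambda_red = 650e-9;
>> (lambda_red / lambda_blue)^4      % about 4.4: blue light is scattered ~4.4x more than red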
Scattering
• Scattering means that the
light ray does not travel
along a straight line
through the medium
– There is a probability that a
certain photon exits the
medium in another
direction than it entered.
• Examples:
– The sky is blue because of
scattering of sunlight
– A strong laser beam
becomes visible in air