CH1 - Introduction To Digital Image Processing

Digital Image Processing

• Digital image processing


– Image defined by a 2D function f (x, y)
∗ x and y are spatial coordinates
∗ f (x, y) gives the amplitude of intensity at the spatial coordinates
∗ All quantities are discrete
– Processing of digital images on a computer
∗ Algorithms operate on input images to produce output images
∗ Improvement for human interpretation
∗ Processing for storage, transmission, and representation
– Not just limited to the visual band of the electromagnetic spectrum
• Image analysis
– Field of study in which algorithms operate on images to extract high-level information

• Image enhancement
– Transforming an input image into another image to improve its visual appearance
• Image restoration
– Restore an image that may have been corrupted by some type of noise

• Image compression
– Manipulating an image so that the storage requires fewer bits than the original signal, while preserving the visual
quality of the image
– May be applied to still images or video

• Image segmentation
– Analyzing an image to determine the pixels in an image that belong together, or that are part of the same object in
a scene
– Bottom-up process based on examining neighborhoods of pixels

• Pixel classification
– Analyzing an image to determine the pixels that belong to a predefined model
– Top-down process that relies on predefined criteria or prior knowledge to create the model
• Shape from X

– Recover the 3D structure of a scene using stereo, video, shading, or texture


– Depends on linear algebra, projective geometry, and function optimization
• Machine vision
– Systems in an industrial setting in which placement of the sensor and light source can be controlled

• Computer vision
– Characterized by unstructured setting where placement of sensor and light source may not be controlled

History and related fields

• Newspaper industry
– Bartlane cable picture transmission system across Atlantic (1920s)
– Superseded by photographic reproduction from tapes using telegraph terminals
– Early systems could code images in five distinct levels of gray, increased to 15 levels in 1929
• Figure 1.1–1.3
• Image analysis and computer vision
– Areas based on image processing
– Image processing outputs an image while image analysis and computer vision use image processing techniques to
reason on images
– Low-level processing
∗ Both input and output are images
∗ Image preprocessing operations such as noise reduction, contrast enhancement, and image sharpening
– Mid-level processing
∗ Inputs are images but outputs are characteristics extracted from those images, such as edges, contours, and
identity of individual objects
∗ Processing images to render them useful for further computer processing
∗ Segmentation for object recognition and classification
– High-level processing
∗ Performing cognitive functions typically associated with human vision
∗ Tracking or identifying objects in an image

Sample applications

• Space
– Correction of distortion inherent in the onboard television camera on spacecraft
– Remote earth observation and astronomy
• Medicine
– Computerized axial tomography (CAT scan)
– A ring of detectors circles the patient, and an X-ray source, concentric with the detector ring, rotates about the patient
– The sensed data is used to build a slice through the object
∗ Numerous slices of the patient’s body are generated as the patient is moved in the longitudinal direction
∗ The slices are then combined to create a 3D rendering of the inside of the patient’s body
• Robotics, including industrial inspection
• Document image analysis
• Transportation
• Homeland security, security, and surveillance
• Remote sensing
• Scientific imaging, plants and insects

• Entertainment

Examples of fields that use image processing

• Classification of images based on the source of energy, ranging from gamma rays at one end to radio waves at the other
(Figure 1.5)
• Viewing images in non-visible bands of the electromagnetic spectrum, as well as in other energy sources such as acoustic,
ultrasonic, and electronic
• Gamma-ray imaging
– Nuclear medicine
∗ Inject a patient with a radioactive isotope that emits gamma rays as it decays
∗ Used to locate sites of bone pathology such as infection or tumors
∗ Figure 1.6a
– Positron emission tomography (PET scan) to detect tumors (Figure 1.6b)
∗ Similar to CAT
∗ Patient is given a radioactive isotope that emits positrons as it decays
∗ When a positron meets an electron, both are annihilated giving off two gamma rays
– Astrophysics
∗ Studying images of stars that glow in gamma rays as natural radiation (Figure 1.6c)
– Nuclear reactors
∗ Looking for gamma radiation from valves (Figure 1.6d)
• X-ray imaging
– Medical and industrial applications
– Generated using an X-ray tube – a vacuum tube with a cathode and an anode
∗ Cathode is heated causing free electrons to be released
∗ Electrons flow at high speed to positively charged anode
∗ Upon an electron’s impact with a nucleus, energy is released in the form of X-ray radiation
∗ Energy captured by a sensor sensitive to X-rays
∗ Figure 1.7a
– Angiography or contrast-enhanced radiography
∗ Used to obtain images or angiograms of blood vessels
∗ A catheter is inserted into an artery or vein in the groin
∗ Catheter threaded into the blood vessel and guided to the area to be studied
∗ An X-ray contrast medium is injected into the catheter tube
∗ Enhances the contrast of blood vessels and enables radiologists to see any irregularities or blockages
∗ Figure 1.7b
• Imaging in ultraviolet band
– Lithography, industrial inspection, microscopy, lasers, biological imaging
– Fluorescence microscopy
∗ The mineral fluorspar fluoresces when UV light is directed upon it
∗ UV light by itself is not visible but when a photon of UV radiation collides with an electron in an atom of a
fluorescent material, it elevates the electron to a higher energy level

∗ The excited electron relaxes and emits light in the form of a lower energy photon in the visible light region
∗ A fluorescence microscope uses excitation light to irradiate a prepared specimen and then separates the much
weaker emitted fluorescent light from the brighter excitation light
∗ Only the emission light reaches the sensor
∗ Resulting fluorescing areas shine against a dark background with sufficient contrast to permit detection
– Astronomy
• Visible and IR band
– Remote sensing, law enforcement
– Thematic bands in satellite imagery, NASA’s LANDSAT satellites (Table 1.1)
– Multispectral and hyperspectral imagery (Fig. 1.10; Washington, DC)
– Weather observation and monitoring (Figure 1.11; Hurricane Katrina)
– Figures 1.12, 1.13 – Lights of the World dataset
– Target detection
– Law enforcement
– Military applications
• Imaging in microwave band

– Radar
– Collect data regardless of weather or ambient lighting conditions
– Figure 1.16 – Spaceborne radar imagery of mountains near Lhasa, Tibet
• Imaging in radio band

– Medicine (MRI) and astronomy


• Other imaging modalities
– Acoustic imaging (ultrasound), electron microscopy
– Geological exploration with sound in the low end of the sound spectrum

Image basics

• Image
– A discrete 2D array of values, like a matrix
∗ Width of image is the number of columns in the image
∗ Height of image is the number of rows in the image
∗ Aspect ratio is width divided by height
– A 2D function f (x, y)
– x and y are spatial coordinates
– Amplitude of f at a point is intensity or gray level of image at that point
– Digital image
∗ x, y, and f (x, y) are all discrete and finite
∗ Finite number of elements with a given value at a location
· Elements are called picture elements or pixels
– Pixel coordinates may be represented using a vector notation

∗ By convention, each vector is vertically oriented while its transpose is horizontally oriented:

$$\mathbf{x} = \begin{bmatrix} x \\ y \end{bmatrix}, \qquad \mathbf{x}^{T} = [\, x \;\; y \,] = (x, y)$$

– Image storage into memory


∗ Column major order
∗ Row major order
– Accessing image data – origin at the top left corner
∗ Scanline
∗ Raster scan order
∗ Image accessed as a 1D array of pixels, with indices in the range i = 0, 1, . . . , n − 1 where n = width · height
∗ Relationship between 1D and 2D arrays

i = y · width + x
x = i%width
y = i/width
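For concreteness, a minimal Python/NumPy sketch of the raster-scan index mapping above; the image size and function names are illustrative, not part of the notes.

```python
import numpy as np

def to_1d(x, y, width):
    """Map 2D pixel coordinates (x, y) to a raster-scan index i."""
    return y * width + x

def to_2d(i, width):
    """Map a raster-scan index i back to 2D pixel coordinates (x, y)."""
    return i % width, i // width

# Hypothetical 4x3 grayscale image (3 rows, 4 columns)
img = np.arange(12, dtype=np.uint8).reshape(3, 4)
flat = img.ravel()                  # row-major (raster-scan) order

x, y = 2, 1
i = to_1d(x, y, img.shape[1])
assert flat[i] == img[y, x]         # same pixel through either view
assert to_2d(i, img.shape[1]) == (x, y)
```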

• Image types
– Grayscale image
∗ Pixel values quantized into finite number of discrete gray levels
∗ Number of bits used to store each gray level known as bit depth
· b bits imply $2^b$ gray levels
· 8 bits per pixel gives 256 gray levels
· Hexadecimal notation
· Specialized applications may use more quantization levels to increase the dynamic range
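A small sketch (Python/NumPy assumed; the values are illustrative) relating bit depth to the number of gray levels and quantizing a real-valued image to 8 bits:

```python
import numpy as np

def gray_levels(bit_depth: int) -> int:
    """Number of representable gray levels for b bits: 2**b."""
    return 2 ** bit_depth

print(gray_levels(8))                      # 256 levels for 8 bits per pixel

# Quantize a hypothetical real-valued image in [0, 1] to 8-bit gray levels
f = np.random.rand(4, 4)                   # continuous intensities
g = np.round(f * (gray_levels(8) - 1)).astype(np.uint8)
```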
– RGB color image
∗ Each pixel is a vector of three integers, representing three color channels
∗ 24 bpp
∗ Pixel vector stored as RGB or BGR
∗ Values of the different colors stored as interleaved channels: $B_0 G_0 R_0 \; B_1 G_1 R_1 \; B_2 G_2 R_2 \cdots B_{n-1} G_{n-1} R_{n-1}$
∗ The other storage method is a planar layout, with each color channel stored separately:

$$B_0 B_1 B_2 \cdots B_{n-1} \;\; G_0 G_1 G_2 \cdots G_{n-1} \;\; R_0 R_1 R_2 \cdots R_{n-1}$$
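A small NumPy sketch contrasting the two layouts; the array names and BGR ordering are illustrative assumptions, not prescribed by the notes.

```python
import numpy as np

h, w = 2, 3                                        # tiny stand-in image
rgb = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)

# Interleaved layout: B0 G0 R0 B1 G1 R1 ... (BGR order, as in OpenCV)
interleaved = rgb[..., ::-1].reshape(-1)           # flip RGB -> BGR, then flatten

# Planar layout: all B values, then all G values, then all R values
planar = np.concatenate([rgb[..., 2].ravel(),      # B plane
                         rgb[..., 1].ravel(),      # G plane
                         rgb[..., 0].ravel()])     # R plane

assert interleaved.size == planar.size == h * w * 3
```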

– Alpha value or opacity


∗ 0x00 indicates transparent while 0xFF indicates opaque
– Binary image
∗ Each pixel is either black or white
∗ 1 bpp, but typically displayed with 8 bpp
∗ Useful for building masks to separate areas of image
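A minimal sketch of building such a mask by thresholding a grayscale image (the threshold value and array names are illustrative):

```python
import numpy as np

gray = np.random.randint(0, 256, (4, 4), dtype=np.uint8)   # stand-in grayscale image

mask = gray > 128                         # boolean mask: conceptually 1 bpp
display = mask.astype(np.uint8) * 255     # expanded to 8 bpp (0 or 255) for display

foreground = gray[mask]                   # mask used to separate an area of the image
```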
– Real-valued image, or floating point image
∗ 32-bit floating point number; 64-bit double precision values, 16-bit half-precision values
– Complex-valued images
∗ Output from computing the Fourier transform of an image
• Conceptualizing images
– Brightness of each pixel proportional to its value

– Raw pixels as a height map or 3D surface plot


– I(x, y) as the value of the function at position (x, y)
– Grayscale image as a matrix of pixel values
– Color image as a matrix of 3-tuples
– Binary image as the set of pixels with value 1, e.g.,

$$\begin{bmatrix} 1 & 0 & 1 \\ 1 & 1 & 1 \\ 1 & 0 & 1 \end{bmatrix}$$

can be represented as

$$\{(0, 0),\ (2, 0),\ (0, 1),\ (1, 1),\ (2, 1),\ (0, 2),\ (2, 2)\}$$
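A short sketch of deriving that set representation from the matrix above (Python/NumPy assumed):

```python
import numpy as np

binary = np.array([[1, 0, 1],
                   [1, 1, 1],
                   [1, 0, 1]], dtype=np.uint8)

# Collect (x, y) coordinates of all pixels with value 1
ys, xs = np.nonzero(binary)
points = {(int(x), int(y)) for x, y in zip(xs, ys)}
# -> {(0, 0), (2, 0), (0, 1), (1, 1), (2, 1), (0, 2), (2, 2)}
```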

Steps in digital image processing

• Two main types of image processing processes


1. Both input and output of processing are images
2. Inputs are images but outputs are some attributes of those images
• Image acquisition
– Acquiring an image in a digital form
– Could be acquired from a sensor or from a storage medium
– May involve preprocessing such as scaling
• Image enhancement
– Bring out obscured detail
– Subjective method, depending on application
– Contrast enhancement
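As one illustrative enhancement, a minimal linear contrast-stretch sketch in Python/NumPy (one technique among many; not prescribed by the notes):

```python
import numpy as np

def stretch_contrast(gray: np.ndarray) -> np.ndarray:
    """Linearly map the image's own min..max range onto the full 0..255 range."""
    lo, hi = float(gray.min()), float(gray.max())
    if hi == lo:                          # flat image: nothing to stretch
        return gray.copy()
    out = (gray.astype(np.float32) - lo) * (255.0 / (hi - lo))
    return out.astype(np.uint8)
```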
• Image restoration
– Objective method
– Based on mathematical or probabilistic models of image degradation
– Filling in the details, making the picture sharper
• Color image processing
– Different color models for representation and processing
• Wavelets
– Provide a foundation to represent images in multiple resolution levels
– Useful for pyramidal representation and compression
• Compression
– Techniques to reduce the storage required to save an image, or to conserve bandwidth required for transmission
– Most common method of compression based on JPEG specification
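A sketch of JPEG encoding in memory using OpenCV's imencode (assuming OpenCV is installed; the quality setting and stand-in image are illustrative):

```python
import cv2
import numpy as np

img = np.random.randint(0, 256, (512, 512, 3), dtype=np.uint8)  # stand-in image

# Encode in memory as JPEG at quality 90 (lossy); a natural image would
# compress far better than this random stand-in
ok, jpeg_bytes = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, 90])
assert ok
print(f"raw: {img.nbytes} bytes   jpeg: {jpeg_bytes.size} bytes")
```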
• Morphological processing
– Extracting image components useful in the representation and description of shape

• Segmentation
– Partitioning an image into components, such as objects in the image
– One of the most difficult tasks in image processing
– Required for object recognition
• Feature extraction
– Typically applied after segmentation
– Boundary or region-based
– Boundary representation good for external shape characteristics such as corners and inflections
– Region representation appropriate for texture or skeletal shapes
– Description, or feature selection, deals with extracting attributes that yield quantitative information of interest
and that differentiate one object class from another
– Feature descriptors should be insensitive to variations in parameters such as scale, translation, rotation, illumination,
and viewpoint
• Image pattern classification or Recognition
– Assigning a label to an object based on its description
– Knowledge about the problem domain
– Building models of objects to be identified/recognized
– Recent advances in classification are based on deep convolutional neural networks
• Image display
– Not a real concern for computer vision
– You may want to display intermediate images in some cases, primarily for debugging

Components of an image processing system

• Sensor/digitizer
– Sensor senses the energy radiated by the object to be captured
– Sensor produces an electrical output proportional to EM waveform intensity
– Digitizer converts the sensor’s electrical output to digital form
• Specialized image processing hardware
– Also called digital signal processor (DSP)
– Characterized by small form factor and low power consumption
– Used to achieve real-time frame processing (30 frames per second)
– Older examples include Texas Instruments C80
– Newer systems replace the specialized DSP with a general-purpose CPU, such as a PowerPC used for its vector
processing capabilities
– Use of GPUs
• Computer
– Acts as mother ship for the specialized hardware such as DSP
• Software

– Specialized modules to perform specific tasks


– Normally available as a set of library functions to take advantage of specialized hardware, or the vector processing
capabilities of the computer’s CPU
– OpenCV
• Mass storage

– Images take up a lot of space


– Consider storage requirements for 512 × 512 pixel color image
∗ Assume 8 bits per color channel per pixel (typical)
∗ Total memory needed: 512 × 512 × 3 = 786432 bytes
∗ On my machine, it gives me a 5.7” × 5.7” image
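The same arithmetic as a small helper (assumed names; uncompressed storage only):

```python
def image_bytes(width: int, height: int, channels: int = 3, bits_per_channel: int = 8) -> int:
    """Uncompressed storage requirement in bytes."""
    return width * height * channels * bits_per_channel // 8

print(image_bytes(512, 512))   # 786432, matching the figure above
```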
– Short-term storage used during processing
∗ Computer memory
∗ Frame buffers
· Allow access at video rates (30 fps)
· Processed images are visible right away
– On-line storage for relatively fast recall
∗ Magnetic disk
– Archival storage characterized by infrequent access
∗ Magnetic tapes, CD-ROMs, jukeboxes
• Image displays
– Monitors (CRT, plasma)
– Stereo displays (require goggles)
• Hardcopy devices
– Laser printers, film, inkjet printers
• Networking

– Image transmission bandwidth


– Good with broadband but consider data coming from Mars
