VT07
CHAPTER 7
Machine vision is the application of computer vision to industry and manufacturing. It is a specialization within system engineering, which encompasses computer science, optics, mechanical engineering and industrial automation. One definition of machine vision is “the use of devices for optical, noncontact sensing to automatically receive and interpret an image of a real scene in order to obtain information and/or control machines or processes.”1

For nondestructive inspection, visual inspection is usually performed by an experienced inspector. However, a tedious and difficult task may cause the inspector to tire prematurely and degrade the quality of inspection. Repetitive and dangerous inspection demands machine vision to replace human inspection so that precise information can be extracted and interpreted consistently. With technological advances in computers, cameras, illumination and communication, widespread application of machine vision systems to nondestructive testing is foreseen.

The basic architecture of a personal computer based machine vision system is given in Fig. 1. The main components include light source, detector, optics, frame grabber and computer. Machine vision does not necessarily mean the use of a computer: specialized image processing hardware can achieve a higher processing speed and can replace the computer.1 Modern approaches may use a camera with the capability to interface directly with a personal computer, a system designed on an image processing board, or a vision engine that plugs into a personal computer.2

A smart camera is a self-contained, standalone unit with communication interfaces. A typical smart camera may consist of the following components: image sensor, image digitization circuit, memory, digital signal processor, communication interface, input/output ports and a built-in illumination device. An embedded vision computer, which is a standalone box with frame storage and intelligence, is intermediate between the personal computer based vision system and the smart camera.2 It differs from a smart camera in that the camera is tethered to the unit rather than self-contained. Different system configurations have their own advantages for different applications. A personal computer based machine vision system has the greatest flexibility and can handle the widest range of applications.
FIGURE 1. Typical architecture of machine vision system: light source, cameras, specimen, frame grabber and computer.

FIGURE 2. Four basic parameters for optics: working distance, field of view, depth of view and resolution.
Illumination
Illumination can be provided by one or more of the following techniques: front lighting, backlighting, coaxial lighting, structured illumination, strobed illumination or polarized light.

As illustrated in Fig. 3a, the bright field mode for front lighting uses any light source in the line of sight of the camera upon direct reflection from the test surface. Matte surfaces will appear darker than specular surfaces because scattering from the matte surface returns less light to the camera, whereas sharp reflection returns more light. Dark field mode uses any light source that is outside the line of sight of the camera upon direct reflection. In a dark field, light scattered from a matte surface will reach the camera and create a bright region. Similarly, a bright field for backlighting is any light source in the line of sight of the camera upon direct transmission through the test object.
FIGURE 5. Illumination: (a) backlighting with light box; (b) coaxial illumination.
Fluorescent lighting: illumination using electricity to excite mercury vapor to produce short wave ultraviolet radiation, which causes a phosphor to fluoresce, producing visible light.

Quartz halogen lamp: incandescent light bulb with an envelope made of quartz and a filament surrounded by halogen gas.

Light emitting diode (LED): semiconductor diode that emits light when electric current is applied.

Metal halide lamp: lamp that produces light by passing an electric arc through a high pressure mixture of argon, mercury and various metal halides.

Xenon: element used in arc and flash lamps. Xenon arc lamps use ionized xenon gas to produce bright white light; xenon flash lamps are electric glow discharge lamps that produce flashes of very intense, incoherent, full spectrum white light.

Sodium: element used in some vapor lamps. Sodium gas discharge lamps use sodium in an excited state to produce light. There are two types: low pressure and high pressure lamps.
[Figure: color imaging. Panels show a color wheel; a Bayer mosaic of green, red and blue filter elements over a single charge coupled device (CCD); and a beam splitter directing the red, green and blue channels of the image scene to three CCDs.]

FIGURE 7. Pinhole camera: (a) model, with optical center O, focal length f, optical axis, image plane and projected point p of scene point P; (b) transformation between camera frame coordinates (Xc, Yc, Zc), world coordinates (Xw, Yw, Zw) and pixel coordinates.14
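The pinhole model of Fig. 7 projects a point in camera frame coordinates onto the image plane at distance f from the optical center, and the image plane coordinates are then converted to pixel coordinates. A minimal sketch under that model; the parameter names (f, cx, cy, sx, sy) and the function itself are illustrative, not from the text:

```python
def project_pinhole(P_c, f, cx, cy, sx=1.0, sy=1.0):
    """Project a 3D point P_c = (X, Y, Z), given in camera frame
    coordinates, to pixel coordinates (u, v) with a pinhole model.
    f: focal length; (cx, cy): principal point in pixels;
    (sx, sy): pixel sizes along the two image axes."""
    X, Y, Z = P_c
    # Perspective projection onto the image plane at distance f
    x = f * X / Z
    y = f * Y / Z
    # Convert image plane coordinates to pixel coordinates
    u = x / sx + cx
    v = y / sy + cy
    return u, v

# A point on the optical axis projects to the principal point
print(project_pinhole((0.0, 0.0, 2.0), f=0.01, cx=320, cy=240))  # (320.0, 240.0)
```

The division by Z is what makes distant objects appear smaller; camera calibration (reference 15) estimates f, cx, cy and the distortion terms this sketch omits.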
[Table: camera interface standards.]

Camera Link®: Automated Imaging Association, Ann Arbor, MI. Serial communication protocol that extends the base technology of Channel Link® for vision applications.19

USB: USB Implementers Forum. Universal serial bus.

GigE Vision®: Automated Imaging Association, Ann Arbor, MI. Based on the gigabit ethernet standard, with fast data transfer allowing standard, long, low cost cables.21

IEEE 1394: IEEE, New York, NY. Interface standard for high speed communication and isochronous (real time) data transfer for high performance and time sensitive applications.22

                            Camera Link   USBa           GigE Vision    IEEE 1394b
Topology                    master and    master and     networked,     peer to peer
                            slave         slave          peer to peer
                                          ("on the fly")
Maximum bit ratec           2380 Mbps     480 Mbps       1000 Mbps      ~400 to ~800 Mbps
Isochronous mode            yes           yes            no             yes
Maximum sustained bit rate  2380 Mbps     432 Mbps       930 Mbps       ~320 to ~640 Mbps
Cable distance (copper)     10 m          5 m            25 m           ~4.5 to ~100 m
Bus power                   none          up to 0.5 A    none           up to 1.5 A

a. USB = universal serial bus.
b. IEEE = IEEE [formerly Institute of Electrical and Electronics Engineers], New York, NY.
c. Mbps = 10^6 bits per second.
[Figure: spatial filtering applies an operator w(i,j) to the image I(x,y). Laplacian operators shown: (a) four neighbor,

0  1  0
1 −4  1
0  1  0

(b) eight neighbor,

1  1  1
1 −8  1
1  1  1

(c) variant,

−1  2 −1
 2 −4  2
−1  2 −1.]

Noise Reduction

In image data, various kinds of noise occur, of which salt and pepper noise is typical.

FIGURE 10. Operators for edge detection: (a) Roberts operators,

0  0  0     0  0  0
0  1  0     0  0  1
0  0 −1     0 −1  0

(b) Prewitt operators,

−1  0  1    −1 −1 −1
−1  0  1     0  0  0
−1  0  1     1  1  1

(c) Sobel operators,

−1  0  1    −1 −2 −1
−2  0  2     0  0  0
−1  0  1     1  2  1

[Figure: 3 × 3 averaging operator with all coefficients 1/9; median filtering example in which the neighborhood

4  4  3
2 10  3
5  2  4

is sorted to 2 2 3 3 4 4 4 5 10, giving the median 4; histogram of frequency versus value with a P percent cutoff.]
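The median filtering example in the figure (sort the 3 × 3 neighborhood and take the middle value) is effective against salt and pepper noise. A pure Python sketch; the function name and the choice to leave border pixels unchanged are illustrative:

```python
def median3x3(image):
    """Apply a 3 x 3 median filter; border pixels are left unchanged."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Collect the 3 x 3 neighborhood, sort it, take the middle value
            window = sorted(image[y + j][x + i]
                            for j in (-1, 0, 1) for i in (-1, 0, 1))
            out[y][x] = window[4]
    return out

# The figure's example neighborhood sorts to 2 2 3 3 4 4 4 5 10, median 4
img = [[4, 4, 3],
       [2, 10, 3],
       [5, 2, 4]]
print(median3x3(img)[1][1])  # 4
```

Unlike averaging, the median discards the outlying value 10 entirely instead of smearing it into neighboring pixels.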
(10)  σ²bc / σ²wc = σ²bc / (σ² − σ²bc)

Because σ² is fixed, the separation metric is maximized when σ²bc reaches its maximum value. Therefore, in the discriminant analysis technique, σ²bc is calculated while the value of t is changed, and the threshold at which σ²bc is maximum is selected.

By using discriminant analysis, the threshold is uniquely estimated for any monochrome image. Although only the case of binarization (two classes, black and white) is demonstrated, this technique can also be applied to estimate multiple thresholds.

FIGURE 15. Discriminant analysis technique. [Histogram of frequency versus intensity: class 1 with parameters ω1, m1, σ1 and class 2 with parameters ω2, m2, σ2, separated at threshold t.]

(11)  X ⊕ Y = {z | z = x + y for x ∈ X, y ∈ Y}

FIGURE 16. Examples of dilation with different structuring elements: (a) 2 × 2 set; (b) four connected sets. [Panels show image X, the origin and structuring element Y.]
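The discriminant analysis search described above (sweep t and keep the value that maximizes σ²bc) can be sketched directly. This sketch computes the between-class variance in the equivalent form ω1 ω2 (m1 − m2)², consistent with the class parameters of Fig. 15; the function name and histogram bookkeeping are illustrative:

```python
def otsu_threshold(pixels, levels=256):
    """Return the threshold t maximizing between-class variance sigma_bc^2.
    Class 1: pixels <= t; class 2: pixels > t."""
    n = len(pixels)
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total_mean = sum(i * hist[i] for i in range(levels)) / n
    best_t, best_sigma_bc = 0, -1.0
    w1 = 0    # class 1 pixel count
    s1 = 0.0  # class 1 intensity sum
    for t in range(levels - 1):
        w1 += hist[t]
        s1 += t * hist[t]
        w2 = n - w1
        if w1 == 0 or w2 == 0:
            continue  # all pixels in one class: separation undefined
        m1, m2 = s1 / w1, (n * total_mean - s1) / w2
        # Between-class variance: omega1 * omega2 * (m1 - m2)^2
        sigma_bc = w1 * w2 * (m1 - m2) ** 2 / (n * n)
        if sigma_bc > best_sigma_bc:
            best_t, best_sigma_bc = t, sigma_bc
    return best_t

# A bimodal intensity set: the threshold falls between the two clusters
print(otsu_threshold([10, 12, 11, 10, 200, 205, 198, 202]))  # 12
```

Only class counts and sums are updated per step, so the full sweep over t costs one pass through the histogram, which is why the technique is practical at video rates.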
Equation 17 gives translation by tx and ty:

(17)  ⎡x′⎤   ⎡1  0  tx⎤ ⎡x⎤
      ⎢y′⎥ = ⎢0  1  ty⎥ ⎢y⎥
      ⎣1 ⎦   ⎣0  0  1 ⎦ ⎣1⎦

Equation 18 gives rotation by an angle θ from the X axis around the origin (Fig. 18d):

(18)  ⎡x′⎤   ⎡cos θ  −sin θ  0⎤ ⎡x⎤
      ⎢y′⎥ = ⎢sin θ   cos θ  0⎥ ⎢y⎥
      ⎣1 ⎦   ⎣  0       0    1⎦ ⎣1⎦

Equation 19 gives shearing with shear factors p and q:

(19)  ⎡x′⎤   ⎡1  p  0⎤ ⎡x⎤
      ⎢y′⎥ = ⎢q  1  0⎥ ⎢y⎥
      ⎣1 ⎦   ⎣0  0  1⎦ ⎣1⎦

[Figure: shear angles θp and θq.]
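Equations 17 to 19 are applied by multiplying the 3 × 3 homogeneous matrix with the column vector (x, y, 1). A minimal sketch; the function names are illustrative:

```python
import math

def translate(tx, ty):
    # Homogeneous translation matrix (Eq. 17)
    return [[1, 0, tx], [0, 1, ty], [0, 0, 1]]

def rotate(theta):
    # Homogeneous rotation matrix (Eq. 18)
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def shear(p, q):
    # Homogeneous shear matrix (Eq. 19)
    return [[1, p, 0], [q, 1, 0], [0, 0, 1]]

def apply(m, point):
    """Multiply the 3 x 3 homogeneous matrix m by (x, y, 1);
    return the transformed (x', y')."""
    x, y = point
    v = (x, y, 1)
    return tuple(sum(m[r][k] * v[k] for k in range(3)) for r in range(2))

# Rotation by 90 degrees maps (1, 0) to (0, 1)
x2, y2 = apply(rotate(math.pi / 2), (1.0, 0.0))
print(round(x2, 9), round(y2, 9))  # 0.0 1.0
```

Because all three operations share the homogeneous form, a sequence of them can be composed into a single matrix product before any pixel coordinates are touched.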
(25)  R_NCC(x, y) = [Σi Σj I(x + i, y + j) T(i, j)] ÷ √{[Σi Σj I(x + i, y + j)²] × [Σi Σj T(i, j)²]}

where the sums run over i = 0 to tw − 1 and j = 0 to th − 1.

[Figure: template matching, with template T(i,j) of width tw and height th scanned over image I(x,y).]
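Equation 25's normalized cross correlation can be sketched in a few lines of NumPy; the score approaches 1 where the image patch matches the template. The function name and offset convention are illustrative:

```python
import numpy as np

def ncc(image, template, x, y):
    """Normalized cross correlation of template T against the patch of
    image I at offset (x, y); x indexes columns, y indexes rows."""
    th, tw = template.shape
    patch = image[y:y + th, x:x + tw].astype(float)
    t = template.astype(float)
    num = np.sum(patch * t)                             # correlation term
    den = np.sqrt(np.sum(patch ** 2) * np.sum(t ** 2))  # energy normalization
    return num / den

image = np.array([[1, 2, 3, 4],
                  [5, 6, 7, 8],
                  [9, 8, 7, 6]])
template = np.array([[6, 7],
                     [8, 7]])
# At the true location the patch equals the template, so R_NCC = 1
print(round(ncc(image, template, 1, 1), 6))  # 1.0
```

The normalization makes the score insensitive to overall brightness scaling, which is the reason this form is preferred over plain correlation for inspection images.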
(27)  b = (−x) a + y

This formulation gives another straight line, in a-b space; the a-b space is called the parameter space. As shown in Fig. 22a, when a line crosses two points (x1, y1) and (x2, y2) in x-y space, the corresponding lines in the parameter space, such as b = (−x2) a + y2, intersect at a point (â, b̂).

(32)  −√(w² + h²) ≤ ρ ≤ √(w² + h²)

(33)  0 ≤ θ ≤ π

(36)  α = tan⁻¹ (x1 / y1)

[Figure 22: (a) lines in x-y space and the corresponding lines in the a-b parameter space, intersecting at (â, b̂); (b) ρ-θ parameterization with peak at (ρ̂, θ̂).]
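In the ρ-θ form of the Hough transform, each edge point (x, y) votes for every cell satisfying ρ = x cos θ + y sin θ over the range of Eq. 33, and collinear points accumulate at a shared cell (ρ̂, θ̂). A pure Python sketch; the accumulator discretization (one degree, unit ρ bins) is an illustrative choice:

```python
import math

def hough_peak(points, n_theta=180, rho_step=1.0):
    """Vote in the (rho, theta) parameter space with
    rho = x cos(theta) + y sin(theta), 0 <= theta < pi (Eq. 33);
    return the accumulator cell (rho bin, theta index) with most votes."""
    acc = {}
    for x, y in points:
        for k in range(n_theta):
            theta = math.pi * k / n_theta  # one step per degree here
            rho = x * math.cos(theta) + y * math.sin(theta)
            cell = (round(rho / rho_step), k)
            acc[cell] = acc.get(cell, 0) + 1
    return max(acc, key=acc.get)

# Points on the horizontal line y = 5 vote together near theta = 90 degrees,
# rho = 5; isolated noise points cannot build a comparable peak
rho_cell, theta_deg = hough_peak([(0, 5), (2, 5), (4, 5), (9, 5)])
print(rho_cell, theta_deg)
```

Because each point votes independently, the peak survives gaps in the line and outlier points, which is the transform's main advantage over fitting.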
[Figure: high frequency emphasis: (a) the live image is passed through a high pass filter and a low pass filter, and a fraction of the high frequency content is added back to form the emphasized image; (b) input/output mapping.]

[Figure: wavelet subbands after one-level decomposition: LL = low low; LH = low high; HL = high low; HH = high high.]

[Figure: image classification pipelines: (a) image → image preprocessing (enhancement, denoising, segmentation, others) → feature extraction (spatial domain; transform domain) → classification → postprocessing → result; (b) image → wavelet transform → feature extraction → classification → postprocessing → result.]
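The LL, LH, HL and HH subbands named in the figure come from one level of a two dimensional wavelet transform. A minimal Haar sketch in NumPy; the unnormalized averaging and differencing filters, and the convention that the first letter names the row filter, are illustrative choices:

```python
import numpy as np

def haar_level1(img):
    """One-level 2D Haar decomposition into LL, LH, HL, HH subbands.
    img: 2D array with even height and width."""
    a = img.astype(float)
    # Low pass = average of adjacent rows; high pass = half their difference
    row_lo = (a[0::2, :] + a[1::2, :]) / 2
    row_hi = (a[0::2, :] - a[1::2, :]) / 2
    # Repeat along columns to obtain the four subbands
    LL = (row_lo[:, 0::2] + row_lo[:, 1::2]) / 2
    LH = (row_lo[:, 0::2] - row_lo[:, 1::2]) / 2
    HL = (row_hi[:, 0::2] + row_hi[:, 1::2]) / 2
    HH = (row_hi[:, 0::2] - row_hi[:, 1::2]) / 2
    return LL, LH, HL, HH

# A constant image has all of its content in the LL subband
LL, LH, HL, HH = haar_level1(np.full((4, 4), 8))
print(LL[0, 0], LH[0, 0], HL[0, 0], HH[0, 0])  # 8.0 0.0 0.0 0.0
```

Feature vectors for classification are often built from subband statistics (for example, the energy of LH, HL and HH), since defects show up as localized high frequency content.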
1. Batchelor, B.G. and P.F. Whelan. Intelligent Vision Systems for Industry. Bruce G. Batchelor, Cardiff, United Kingdom; Paul F. Whelan, Dublin, Republic of Ireland (2002).
2. Zuech, N. “Smart Cameras vs. PC-Based Machine Vision Systems.” Machine Vision Online. Ann Arbor, MI: Automated Imaging Association (October 2002).
3. Zuech, N. “Optics in Machine Vision Applications.” Machine Vision Online. Ann Arbor, MI: Automated Imaging Association (August 2005).
4. Fales, G. “Ten Lens Specifications You Must Know for Machine-Vision Optics.” Test and Measurement World. Web page. Waltham, MA: Reed Elsevier (27 October 2003).
5. “What is Structured Light?” Web page. Salem, NH: StockerYale (2009).
6. Casasent, D.[P.], Y.F. Cheu and D. Clark. Chapter 4: Part 4, “Machine Vision Technology.” Nondestructive Testing Handbook, second edition: Vol. 8, Visual and Optical Testing. Columbus, OH: American Society for Nondestructive Testing (1993): p 92-107.
7. Forsyth, D.A. and J. Ponce. Computer Vision: A Modern Approach. Upper Saddle River, NJ: Prentice Hall (2002).
8. Martin, D. Practical Guide to Machine Vision Lighting. Web pages. Austin, TX: National Instruments Corporation (November 2008).
9. Hainaut, O.R. “Basic Image Processing.” Web pages. Santiago, Chile: European Organisation for Astronomical Research in the Southern Hemisphere, European Southern Observatory (December 1996).
10. Users Manual MTD/PS-0218, Kodak Image Sensors. Revision 2.0. Rochester, NY: Eastman Kodak (July 2008).
11. Peterson, C. “How It Works: The Charged-Coupled Device, or CCD.” Journal of Young Investigators. Vol. 3, No. 1. Durham, NC: Journal of Young Investigators, Incorporated (March 2001).
12. Litwiller, D. “CCD vs. CMOS: Facts and Fiction.” Photonics Spectra. Vol. 35, No. 1. Pittsfield, MA: Laurin Publishing (January 2001): p 154-158.
13. Charge Injection Device Research at RIT. Web site. Rochester, NY: Rochester Institute of Technology, Center for Imaging Science (2009).
14. Trucco, E. and A. Verri. Introductory Techniques for 3-D Computer Vision. Upper Saddle River, NJ: Prentice Hall (1998).
15. Bouguet, J.-Y. Camera Calibration Toolbox for Matlab. Web pages. Pasadena, CA: California Institute of Technology (2009).
16. Wang, J., F. Shi, J. Zhang and Y. Liu. “A New Calibration Model of Camera Lens Distortion.” Pattern Recognition. Vol. 41, No. 2. Amsterdam, Netherlands: Elsevier, for Pattern Recognition Society (February 2008): p 607-615.
17. IEEE 1394, High-Performance Serial Bus. New York, NY: IEEE (2008).
18. Wilson, A. “Camera Connections.” Vision Systems Design. Tulsa, OK: PennWell Corporation (April 2008).
19. Specifications of the Camera Link Interface Standard for Digital Cameras and Frame Grabbers. Ann Arbor, MI: Automated Imaging Association (Annex D, 2007).
20. “Universal Serial Bus.” Web site. Beaverton, OR: USB Implementers Forum (2008).
21. “GigE Vision®.” Web page. Machine Vision Online. Ann Arbor, MI: Automated Imaging Association (2009).
22. 1394 Technology. Web page. Southlake, TX: 1394 Trade Association (2008).
23. Sgro, J. “USB Advantages Offset Other Interfaces.” Vision Systems Design. Tulsa, OK: PennWell Corporation (September 2003).
24. Jain, R., R. Kasturi and B.G. Schunck. Machine Vision. New York, NY: McGraw-Hill (1995).
25. Gonzalez, R.C. and R.E. Woods. Digital Image Processing. Upper Saddle River, NJ: Prentice Hall (2002).
26. Otsu, N. “A Threshold Selection Method from Gray-Level Histograms.” IEEE Transactions on Systems, Man, and Cybernetics. Vol. SMC-9, No. 1. New York, NY: IEEE (January 1979): p 62-66.