Chapter 1 - Introduction and Basic Concepts
[Figure: The role of measurement in manufacturing: customer requirement/specification, product design, manufacture and quality inspection, with information from measurement fed back into the cycle.]
[Figure: Examples of measuring instruments: a digital micrometer and a profile projector.]
1.3 Calibration
Calibration is the comparison of a measuring instrument's reading with the reading of a standard instrument when the same input is given to both. Calibration is necessary after an instrument has been used for a long time, because its readings can change over time due to several factors, such as deterioration of the instrument's internal components. Calibration can be carried out in-house, or the instruments can be sent to specialized metrology laboratories that offer such services.
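As a rough illustration of the comparison involved, the short Python sketch below computes the deviation of an instrument's readings from those of a reference standard at several calibration points. The calibration points, readings and tolerance value are invented for the example and are not taken from this chapter.

# Minimal sketch of an in-house calibration check (illustrative values only).
# Each calibration point pairs the standard's reading with the reading of the
# instrument under test for the same input.

calibration_points = [
    # (nominal input, standard reading in mm, instrument reading in mm)
    ("10 mm block", 10.000, 10.004),
    ("20 mm block", 20.000, 20.007),
    ("50 mm block", 50.000, 50.012),
]

TOLERANCE_MM = 0.010  # assumed acceptance limit for this example

for name, standard, instrument in calibration_points:
    deviation = instrument - standard   # calibration error at this point
    status = "OK" if abs(deviation) <= TOLERANCE_MM else "out of tolerance"
    print(f"{name}: deviation = {deviation:+.3f} mm ({status})")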
1.4 Precision, accuracy and error
Two very important terms in the science of measurement are precision and accuracy. The precision of a measuring instrument is the degree of repeatability of its readings when measurement of the same standard is repeated. Accuracy, on the other hand, is the degree of closeness of the measured values to the actual dimension.
The difference between precision and accuracy becomes clear when we refer to the example
shown in Figure 1.3(a)-(c). Assume that we take six measurements on a standard block of dimension
20.00 mm using three different instruments. In Figure 1.3(a) the readings taken have small deviation
from one another, but the mean reading deviates greatly from the actual dimension of 20.00 mm.
This illustrates an instrument having high precision but low accuracy. Figure 1.3(b) shows the
characteristics of an instrument having low precision but high accuracy. The precision is low
because the readings deviate greatly from one another, but the accuracy is high because the mean
value is close to the actual value. Figure 1.3(c) shows a case where the instrument has both high precision and high accuracy.
[Table: Characteristics of accuracy: it is associated with the instrument, known before the measurement, has both positive and negative signs, and is an intrinsic part of the instrument.]
[Figure 1.3: Six repeated readings of a 20.00 mm standard plotted against reading number for three instruments, each plot marking the mean value, the actual value and the resulting error: (a) high precision, low accuracy (readings 22.02, 21.12, 22.32, 22.95, 22.41, 21.02 mm); (b) low precision, high accuracy (readings 22.02, 19.32, 22.12, 23.95, 19.20, 18.90 mm); (c) high precision, high accuracy (readings 21.15, 19.30, 20.22, 20.85, 21.05, 19.90 mm).]
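To make the distinction concrete, the short Python sketch below estimates precision as the sample standard deviation of the repeated readings and accuracy as the offset of their mean from the actual value, using the readings of Figure 1.3(a). The function name is only illustrative.

import statistics

def precision_and_accuracy(readings, actual_value):
    """Return (sample standard deviation, mean offset from the actual value)."""
    spread = statistics.stdev(readings)                # smaller spread -> higher precision
    offset = statistics.mean(readings) - actual_value  # smaller |offset| -> higher accuracy
    return spread, offset

# Readings from Figure 1.3(a): small spread, but the mean sits well above 20.00 mm.
readings_a = [22.02, 21.12, 22.32, 22.95, 22.41, 21.02]
spread, offset = precision_and_accuracy(readings_a, actual_value=20.00)
print(f"spread = {spread:.3f} mm, mean offset = {offset:+.3f} mm")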
[Figure: The relationship between instrument accuracy and cost.]
High instrument accuracy can be obtained if the sources of error associated with the following elements can be reduced or eliminated:
(a) Calibration standard
Examples of factors affecting calibration standards are environmental effects, stability with respect to time and the elastic properties of materials. The most serious environmental effect is a difference in temperature between the environment in which the instrument is used and the standard temperature at which it was calibrated. The calibration standard should also be stable with respect to time, i.e. its dimensions should not change even after the standard has been used for a long time. Elastic properties, especially of length standards that can deflect under their own weight, must be considered in accurate measurement.
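As a rough illustration of the temperature effect, the sketch below estimates the length change of a steel length standard used away from the 20 degC reference temperature, using the linear expansion model delta_L = alpha * L0 * delta_T. The expansion coefficient and the temperatures are assumed nominal values, not figures from this chapter.

# Illustrative estimate of thermal expansion of a length standard.

ALPHA_STEEL = 11.5e-6       # per degC, assumed nominal coefficient for steel
REFERENCE_TEMP_C = 20.0     # standard reference temperature for length metrology

def thermal_length_change(length_mm, temperature_c, alpha=ALPHA_STEEL):
    """Length change (mm) of a standard at temperature_c relative to 20 degC."""
    return alpha * length_mm * (temperature_c - REFERENCE_TEMP_C)

# A 100 mm steel standard used in a 25 degC workshop grows by a few micrometres.
print(f"{thermal_length_change(100.0, 25.0) * 1000:.2f} um")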
(b) Workpiece being measured
Among the most important factors that influence the workpiece are environmental factors such as temperature, the condition of the workpiece surface and the elastic properties of the workpiece material. A workpiece surface that is not perfectly flat, or that is dirty, will influence the measurement taken. If the workpiece is made from a material of high elasticity, there is a possibility that the workpiece will deflect under the pressure applied by the measuring instrument. This deflection will affect the readings taken and is known as loading error.
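As a very simple illustration of loading error, the sketch below estimates the elastic compression of a slender part squeezed along its axis by the measuring force, assuming uniform axial compression (delta = F*L/(A*E)). The force, dimensions and modulus are assumed values chosen only for the example; real contact deformation is more complex than this model.

import math

def axial_compression_mm(force_n, length_mm, diameter_mm, youngs_modulus_gpa):
    """Elastic shortening (mm) of a cylindrical part loaded along its axis."""
    area_mm2 = math.pi * (diameter_mm / 2.0) ** 2
    e_n_per_mm2 = youngs_modulus_gpa * 1000.0   # 1 GPa = 1000 N/mm^2
    return force_n * length_mm / (area_mm2 * e_n_per_mm2)

# A 10 N measuring force on a 50 mm long, 3 mm diameter aluminium pin
# (E ~ 70 GPa) compresses it by roughly a micrometre.
print(f"{axial_compression_mm(10.0, 50.0, 3.0, 70.0) * 1000:.2f} um")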
(c) Measuring instrument
The measuring instrument can be influenced by hysteresis effects, backlash, friction, zero drift error, inadequate amplification and calibration errors. A hysteresis effect is observed when the reading shown by the instrument for a given input differs depending on whether that input was approached in increasing or decreasing order. The result is a hysteresis loop that represents measurement error. The hysteresis error is the maximum separation between the increasing and decreasing readings, as illustrated in Figure 1.6.
Backlash error usually occurs in instruments that use a screw mechanism to move a contacting point (stylus), such as a micrometer. Because of play in the threads, the screw rotates slightly before the internal parts and the stylus begin to move, and this small rotation causes an error in the measurement. Zero drift error refers to the zero error that develops when an instrument has been used for a long time.
[Figure 1.6: Hysteresis loop formed by the readings taken with increasing and decreasing input; the hysteresis error is the maximum separation between the two curves.]
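The sketch below shows one way the hysteresis error could be quantified from paired readings taken at the same inputs, first in increasing and then in decreasing order. The readings are invented for illustration only.

# Hysteresis error as the maximum separation between readings taken while
# increasing the input and readings taken at the same inputs while decreasing it.

increasing = [0.00, 5.02, 10.05, 15.06, 20.04]   # illustrative readings (mm)
decreasing = [0.03, 5.08, 10.11, 15.10, 20.04]   # same inputs, approached from above

hysteresis_error = max(abs(up - down) for up, down in zip(increasing, decreasing))
print(f"hysteresis error = {hysteresis_error:.3f} mm")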
Question 1.2
The precision and accuracy of three digital calipers A, B and C were compared by calculating the mean and standard deviation of a set of readings taken with each caliper. Each set of readings was recorded by repeating the measurement on a 20 mm gage block five times. The readings are tabulated as follows:
Caliper    Readings (mm)
A          20.005, 20.080, 19.995, 20.016, 20.055
B          20.151, 20.155, 20.149, 20.145, 20.160
C          19.233, 19.440, 20.002, 21.024, 19.028
Sample standard deviation: s = \sqrt{\dfrac{\sum_{i=1}^{n} (x_i - \bar{x})^2}{n - 1}}
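A minimal sketch of the calculation the question asks for, using Python's statistics module to obtain the mean and the sample standard deviation (the formula above) of each set of readings. The dictionary layout is just one convenient way to hold the data.

import statistics

readings = {
    "A": [20.005, 20.080, 19.995, 20.016, 20.055],
    "B": [20.151, 20.155, 20.149, 20.145, 20.160],
    "C": [19.233, 19.440, 20.002, 21.024, 19.028],
}

ACTUAL_MM = 20.000  # dimension of the gage block

for caliper, values in readings.items():
    mean = statistics.mean(values)
    stdev = statistics.stdev(values)   # sample standard deviation (n - 1 denominator)
    print(f"Caliper {caliper}: mean = {mean:.4f} mm "
          f"(offset {mean - ACTUAL_MM:+.4f} mm), s = {stdev:.4f} mm")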