
ELE8313

Test and calibration standards


Terms and Definitions
• ISO (International Organization for Standardization): A Geneva-based
international agency that develops and publishes worldwide industrial
and commercial standards.
• BIPM (International Bureau of Weights and Measures): The international
body that maintains the SI system of weights and measures; it is based
in Sèvres, near Paris, France.
• SI (International System of Units): The modern metric system of units
used worldwide, officially abbreviated SI (from the French Système
International d'Unités).
• IEC (International Electrotechnical Commission): An international
commission that sets and disseminates working electrical standards from
its headquarters in Geneva, Switzerland.

Terms and Definitions
• IEEE (Institute of Electrical and Electronics Engineers): An
American technical society that works with the national standards body
and industry to establish electrical/electronic standards.
• NBS (National Bureau of Standards): The official United States
agency that establishes and maintains standards, including the primary
standards (or replicas) used in America; since 1988 it has been known
as NIST (National Institute of Standards and Technology).

Standards of Measurement
• A standard of measurement is a physical representation of a
unit of measurement. A unit is realized by reference to an
arbitrary material standard or to natural phenomena, including
physical and atomic constants.

Standards of Measurement
• Standards of measurement are categorized as follows:
  • International standards
  • Primary standards
  • Secondary standards
  • Working standards

Standards of Measurement
• International standards are defined by international
agreement.
• They represent certain units of measurement to the highest
accuracy attainable within the limits of production and
measurement technology.
• They are periodically evaluated and checked by absolute
measurements in terms of the fundamental units.
• These standards are maintained at the International Bureau of
Weights and Measures and are not available to the ordinary
user of measuring instruments for purposes of comparison or
calibration.

Standards of Measurement
• Primary standards are maintained by national standards laboratories
in different parts of the world.
• The primary standards, representing fundamental units and some
of the derived mechanical and electrical units, are independently
calibrated by absolute measurements at each of the national
laboratories.
• The results of these measurements are compared against each
other, leading to a world average figure for the primary standard.
• Primary standards are not available for use outside the national
laboratories.
• Primary standards are used to verify and calibrate secondary
standards.

Standards of Measurement
• Secondary standards are the basic reference standards used in
industrial measurement laboratories.
• These standards are maintained by the particular industry
involved and are checked locally against other reference
standards in the area.
• Secondary standards are periodically sent to the national
laboratories for calibration and comparison against the
primary standards.

Standards of Measurement
• Working standards are the principal tools of a measurement
laboratory.
• They are used to check and calibrate general laboratory
instruments for accuracy and performance, or to perform
comparison measurements in industrial applications.

Calibration of measuring instruments
• Calibration consists of comparing the output of the
instrument or sensor under test against the output of an
instrument of known accuracy when the same input (the
measured quantity) is applied to both instruments.
• Instruments used as a standard in calibration procedures are
usually chosen to be of greater inherent accuracy than the
process instruments that they are used to calibrate.
• Instrument calibration has to be repeated at prescribed
intervals because the characteristics of any instrument change
over time.

Control of Calibration Environment
• Any instrument that is used as a standard in calibration
procedures must
  • be kept solely for calibration duties and never be used for
other purposes;
  • not be regarded as a spare instrument that can be used for
process measurements if the instrument normally used for that
purpose breaks down.

Control of Calibration Environment
• A room should always be set aside for calibration.
• Calibration should be assigned to just one professional, who
shall have total control over the calibration function.
• Calibration procedures that relate in any way to
measurements used for quality control functions are governed
by the international standard ISO 9000.
• One of the clauses in ISO 9000 requires that all persons using
calibration equipment be adequately trained and certified.

Calibration chain and traceability
• When the working standard instrument has been calibrated
by an authorized standards laboratory, a calibration certificate
containing at least the following information will be issued:
  • the identification of the equipment calibrated
  • the calibration results obtained
  • the measurement uncertainty
  • any use limitations on the equipment calibrated
  • the date of calibration
  • the authority under which the certificate is issued.

Calibration chain and traceability
• The establishment of a company standards laboratory is only
viable for large companies.
• All of the elements in the calibration chain must be known so
that the calibration of process instruments at the bottom of
the chain is traceable to the fundamental measurement
standards.
• This knowledge of the full chain of instruments involved in
the calibration procedure is known as traceability, and is
specified as a mandatory requirement for satisfying the ISO
9000 standard.

Calibration records

(Figure: example calibration record.)
Calibration and re-ranging
• To calibrate an instrument means to check and adjust (if
necessary) its response so the output accurately corresponds to its
input throughout a specified range.
• In order to do this, one must expose the instrument to an actual
input stimulus of precisely known quantity.
• To range an instrument means to set the lower and upper range
values so it responds with the desired sensitivity to changes in
input.
• For example, a pressure transmitter set to a range of 0 to 200 PSI
(0 PSI = 4 mA output; 200 PSI = 20 mA output) could be re-ranged
to respond on a scale of 0 to 150 PSI (0 PSI = 4 mA; 150 PSI = 20 mA).
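
The 4-20 mA output in this example is simply a straight line between the
lower and upper range values. A minimal sketch of that arithmetic, with
illustrative names (re-ranging is just a change of the urv parameter):

```python
def psi_to_ma(pressure, lrv=0.0, urv=200.0):
    """Map a pressure (PSI) to the 4-20 mA current for a given range."""
    return 4.0 + 16.0 * (pressure - lrv) / (urv - lrv)

print(psi_to_ma(100.0))             # 12.0 mA at mid-scale of 0-200 PSI
print(psi_to_ma(100.0, urv=150.0))  # ~14.67 mA after re-ranging to 0-150 PSI
```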

Calibration and re-ranging
• In analog instruments, re-ranging could (usually) only be
accomplished by re-calibration, since the same adjustments
were used to achieve both purposes.
• In digital instruments, calibration and ranging are typically
separate adjustments.

Zero and span adjustments (analog transmitters)

(Figure: this graph shows how any given percentage of input should
correspond to the same percentage of output, all the way from 0% to
100%.)
Zero and span adjustments (analog transmitters)
• The ideal response is a line, y = mx + b. For a 0-100 PSI,
4-20 mA transmitter, m = (20 - 4)/(100 - 0) = 0.16 mA/PSI and
b = 4 mA, giving y = 0.16x + 4.
• On the actual instrument (the pressure transmitter), there
are two adjustments which let us match the instrument's
behavior to the ideal equation.
• The zero adjustment (b) shifts the instrument's function
vertically on the graph, while the span adjustment (m)
changes the slope of the function on the graph.
• For most analog instruments, these two adjustments are
interactive.

Zero and span adjustments (analog transmitters)

(Figure: although the graph is still linear, zero pressure does not
equate to zero current. This is called a live zero.)
Damping adjustments
• The vast majority of modern process transmitters (both
analog and digital) come equipped with a feature known as
damping.
• Damping is essentially a low-pass filter function placed in-line
with the signal, reducing the amount of process "noise"
reported by the transmitter.

The flow of water exiting a pump tends to be extremely turbulent,
and any pressure-sensing device connected to the immediate
discharge port of a pump will interpret this turbulence as violent
fluctuations in pressure.
Damping adjustments

If successfully applied to a process transmitter, such low-pass
filtering has the effect of "quieting" an otherwise noisy signal so
only the real process pressure changes are seen, while the effect of
turbulence (or whatever else was causing the noise) becomes minimal.
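
Transmitter damping is typically equivalent to a first-order low-pass
filter. A minimal sketch of that idea (the filter form and settings are
generic assumptions, not a particular manufacturer's algorithm):

```python
import random

def damp(samples, alpha=0.05):
    """First-order low-pass filter: each output moves a fraction
    alpha of the way toward the newest sample."""
    y = samples[0]
    out = []
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out

# Turbulent discharge pressure: 50 PSI average with +/-5 PSI of noise
raw = [50.0 + random.uniform(-5.0, 5.0) for _ in range(200)]
smooth = damp(raw)
print(max(raw) - min(raw))                  # close to 10 PSI of apparent swing
print(max(smooth[50:]) - min(smooth[50:]))  # much smaller once settled
```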
LRV and URV Settings, Digital Trim
• The advent of "smart" field instruments containing
microprocessors has been a great advance for industrial
instrumentation.
• These devices have
  i. built-in diagnostic ability;
  ii. greater accuracy (due to digital compensation of sensor
nonlinearities);
  iii. the ability to communicate digitally with host devices for
reporting of various parameters.

LRV and URV Settings, Digital Trim
The only way anyone would ever know this transmitter was inaccurate
at 100 PSI is to actually apply a known value of 100 PSI fluid
pressure to the sensor and note the incorrect response.
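
This is the distinction between re-ranging and trimming: changing the
LRV/URV settings of a smart transmitter merely rescales its output,
while a sensor trim corrects the measurement itself against a known
reference. A minimal sketch of that distinction (the toy transmitter
model and its 2% sensor gain error are illustrative assumptions, not
any vendor's implementation):

```python
class SmartTransmitter:
    """Toy model: a sensor with a hidden gain error, plus digital LRV/URV scaling."""
    def __init__(self, lrv=0.0, urv=100.0):
        self.lrv, self.urv = lrv, urv
        self.trim_gain = 1.0       # corrected by a sensor trim
        self._sensor_error = 1.02  # hidden 2% gain error in the sensor

    def sense(self, true_pressure):
        return true_pressure * self._sensor_error * self.trim_gain

    def output_ma(self, true_pressure):
        pct = (self.sense(true_pressure) - self.lrv) / (self.urv - self.lrv)
        return 4.0 + 16.0 * pct

    def trim(self, applied_pressure):
        """Trim the sensor against a known reference pressure."""
        self.trim_gain *= applied_pressure / self.sense(applied_pressure)

t = SmartTransmitter()
print(t.output_ma(100.0))  # 20.32 mA -- wrong, and re-ranging cannot fix it
t.trim(100.0)              # apply a known 100 PSI reference and trim
print(t.output_ma(100.0))  # 20.0 mA
```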
Calibration Procedures: Linear Instruments
• Zero-and-span method:
  1. Apply the lower-range value stimulus to the instrument and
wait for it to stabilize.
  2. Move the "zero" adjustment until the instrument registers
accurately at this point.
  3. Apply the upper-range value stimulus to the instrument and
wait for it to stabilize.
  4. Move the "span" adjustment until the instrument registers
accurately at this point.
  5. Repeat steps 1 through 4 as necessary to achieve good
accuracy at both ends of the range (the iteration is simulated in
the sketch below).
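
Step 5 is needed because, on an analog instrument, the zero and span
adjustments interact: correcting one disturbs the other, so the
procedure converges only over repeated passes. A minimal simulation of
that convergence, assuming a toy transmitter whose span screw scales
the whole output, offset included:

```python
class AnalogTransmitter:
    """Toy analog 0-100 PSI / 4-20 mA transmitter. The span screw scales
    the whole output, zero offset included, so the two screws interact."""
    def __init__(self):
        self.span_gain = 0.95  # mis-adjusted span screw (ideal: 1.0)
        self.zero_bias = 4.5   # mis-adjusted zero screw (ideal: 4.0 mA)

    def read(self, pressure):  # sensor sensitivity fixed at 0.16 mA/PSI
        return self.span_gain * (0.16 * pressure + self.zero_bias)

t = AnalogTransmitter()
for step in range(1, 6):
    # Steps 1-2: adjust zero so the reading at 0 PSI is exactly 4 mA
    t.zero_bias = 4.0 / t.span_gain
    # Steps 3-4: adjust span so the reading at 100 PSI is exactly 20 mA
    t.span_gain = 20.0 / (0.16 * 100.0 + t.zero_bias)
    print(step, round(t.read(0.0), 4), round(t.read(100.0), 4))
# The span correction disturbs the zero, so read(0) only creeps toward
# 4.0000 mA over repeated passes -- exactly why step 5 exists.
```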

Calibration Procedures: Linear Instruments
• An improvement over this crude procedure is to check the
instrument's response at several points between the lower-
and upper-range values.
• Yet another improvement over the basic five-point test is to
check the instrument's response at five calibration points
decreasing as well as increasing (an up-down calibration).
• Check for hysteresis (see the sketch below).
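
A hedged sketch of how the readings from such an up-down test might be
tabulated, with the difference between upward and downward readings at
each point taken as the hysteresis (all numbers are illustrative):

```python
points    = [0, 25, 50, 75, 100]            # test points, % of span
up_read   = [0.1, 25.3, 50.4, 75.4, 100.2]  # as-found readings going up
down_read = [0.6, 25.9, 51.0, 75.9, 100.2]  # as-found readings coming back down

for p, u, d in zip(points, up_read, down_read):
    print(f"{p:>3}%  up {u:6.1f}  down {d:6.1f}  hysteresis {d - u:+5.1f}")

worst = max(abs(d - u) for u, d in zip(up_read, down_read))
print("max hysteresis:", round(worst, 2), "% of span")  # compare to tolerance
```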

Calibration Procedures: Nonlinear Instruments
• Every nonlinear instrument will have its own recommended
calibration procedure. Refer to the manufacturer's literature
for your specific instrument.

Typical calibration errors
• A zero shift calibration error shifts the function vertically on
the graph.
• This error affects all calibration points equally, creating the
same error (as a percentage of span) across the entire range:

Typical calibration errors
• A span shift calibration error shifts the slope of the function.
• This error's effect is unequal at different points throughout
the range:

Typical calibration errors
• A linearity calibration error causes the function to deviate
from a straight line.

Typical calibration errors
• A hysteresis calibration error occurs when the instrument
responds differently to an increasing input compared to a
decreasing input.

Typical calibration errors
• Hysteresis errors are almost always caused by mechanical
friction on some moving element (and/or a loose coupling
between mechanical elements) such as Bourdon tubes,
bellows, diaphragms, pivots, levers, or gear sets.
• In practice, most calibration errors are some combination of
zero, span, linearity, and hysteresis problems; a straight-line
fit of the as-found data, as sketched below, helps separate them.
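
Because the components mix, a least-squares straight-line fit of
as-found error data is a convenient way to separate them: the fitted
intercept estimates the zero shift, the fitted slope estimates the span
shift, and the residuals bound the linearity error. A minimal sketch
with illustrative data:

```python
inputs = [0.0, 25.0, 50.0, 75.0, 100.0]  # applied input, % of span
errors = [1.0, 1.6, 2.1, 2.4, 3.0]       # hypothetical as-found errors, % of span

n = len(inputs)
mean_x = sum(inputs) / n
mean_e = sum(errors) / n
# Ordinary least-squares slope and intercept of error vs. input
slope = (sum((x - mean_x) * (e - mean_e) for x, e in zip(inputs, errors))
         / sum((x - mean_x) ** 2 for x in inputs))
intercept = mean_e - slope * mean_x

print(f"zero shift : {intercept:.2f} % of span (fitted error at 0% input)")
print(f"span shift : {slope * 100:.2f} % of span (extra error gained over range)")
residuals = [e - (intercept + slope * x) for x, e in zip(inputs, errors)]
print(f"linearity  : {max(abs(r) for r in residuals):.2f} % of span (max residual)")
```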

Further Reading
• Modern Electronic Instrumentation and Measurement
Techniques by Helfrick and Cooper
• Measurement and Instrumentation Principles by A. S. Morris
• Lessons in Industrial Instrumentation by Tony R. Kuphaldt
