Uc11 Level IV
LEARNING OUTCOMES:
SHOCK. Electric shock occurs when the human body becomes part of a path through which electrons can flow.
The resulting effect on the body can be either direct or indirect.
Direct. Injury or death can occur whenever electric current flows through the human body.
Currents of less than 30 mA can result in death.
A thorough coverage of the effects of electricity on the human body is
contained in the section of this module entitled Effects of Electricity on the Human Body.
Indirect. Although the electric current through the human body may be well below the values required to cause
noticeable injury, human reaction can result in falls from ladders or scaffolds, or movement into operating
machinery. Such reaction can result in serious injury or death.
BURNS. Burns can result when a person touches electrical wiring or equipment that is improperly used or
maintained. Typically, such burn injuries occur on the hands.
Thermal Radiation. In most cases, the radiated thermal energy is only part of the total energy available from the
arc. Numerous factors, including skin color, the area of skin exposed, and the type of clothing worn, have an effect on the degree of injury.
Proper clothing, working distances, and overcurrent protection can improve the chances that burns will be curable.
WHAT IS CALIBRATION?
There are as many definitions of calibration as there are methods.
According to ISA’s The Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as “a test during which known values of the measurand are applied to the transducer and corresponding output readings are recorded under specified conditions.” The definition includes the capability to adjust the instrument to zero and to set the desired span.
An interpretation of the definition would say that a calibration is a comparison of measuring equipment against a
standard instrument of higher accuracy to detect, correlate, adjust, rectify and document the accuracy of the
instrument being compared.
Typically, calibration of an instrument is checked at several points throughout the calibration range of the
instrument.
The calibration range is defined as “the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values.”
For example, an electronic pressure transmitter may have a nameplate instrument range of 0–750 pounds per
square inch, gauge (psig) and output of 4-to-20 milliamps (mA).
However, the engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA.
Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA.
In this example, the zero input value is 0 psig and zero output value is 4 mA.
The input span is 300 psig and the output span is 16 mA.
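To make the span arithmetic concrete, here is a minimal Python sketch of this linear input-to-output scaling, assuming the 0-to-300 psig = 4-to-20 mA calibration range above (the function name and test value are ours, purely for illustration):

    def expected_output_mA(pressure_psig,
                           lrv_psig=0.0, urv_psig=300.0,
                           lrv_mA=4.0, urv_mA=20.0):
        """Ideal transmitter output for a given input pressure.
        Output = zero output + (fraction of input span) * output span."""
        input_span = urv_psig - lrv_psig      # 300 psig
        output_span = urv_mA - lrv_mA         # 16 mA
        fraction = (pressure_psig - lrv_psig) / input_span
        return lrv_mA + fraction * output_span

    # At 50% of the input span (150 psig), the output should be 50% of
    # the output span above 4 mA, i.e. 12 mA.
    print(expected_output_mA(150.0))   # -> 12.0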
Different terms may be used at your facility.
Just be careful not to confuse the range the instrument is capable of with the range for which the instrument has
been calibrated.
Calibration
Calibration affords the opportunity to check the instrument against a known standard and subsequently to reduce
errors in accuracy.
Calibration, in its most basic form, is the measuring of an instrument against a standard.
As instruments become more complicated, successfully identifying and applying best practices can reduce business
expenses and improve organizational capabilities.
What is Calibration?
Calibration is the comparison of a measurement device (an unknown) against an equal or better standard.
A standard in a measurement is considered the reference; it is the one in the comparison taken to be the more
correct of the two.
Calibration finds out how far the unknown is from the standard.
A “typical” commercial calibration uses the manufacturer’s calibration procedure and is performed with a
reference standard at least four times more accurate than the instrument under test.
Why Calibrate?
Calibration can be an insurance policy because out-of-tolerance (OOT) instruments may give false information
leading to unreliable products, customer dissatisfaction and increased warranty costs.
In addition, OOT conditions may cause good products to fail tests, which ultimately results in unnecessary rework
costs and production delays.
Calibration Terms
As-found data—The reading of the instrument before it is adjusted.
As-left data—The reading of the instrument after adjustment, or “same as found” if no adjustment was made.
Optimization—Adjusting a measuring instrument to make it more accurate is NOT part of a typical calibration and is frequently referred to as “optimizing” or “nominalizing” an instrument.
Limited calibration—It may be more cost effective to have a limited calibration when certain functions of an instrument are not used, so that only the functions in use are calibrated.
Test uncertainty ratio (TUR)—This is the ratio of the accuracy of the instrument under test compared to the
accuracy of the reference standard.
Without data—Most calibration labs charge more to provide the certificate with data and will offer a “no-data” option.
ISO 9001:2008 Calibration (International Organization for Standardization)—This type of calibration is crucial for many industries. Here are some of its requirements (in alphabetical order):
Accredited calibration lab—The calibration laboratory must be ISO 9001:2008 accredited or be the original
equipment manufacturer.
Comprehensive equipment list—To pass the ISO audit, the company must demonstrate that it has a
comprehensive equipment list with controls in place for additions, subtractions and custodianship of equipment.
Calibrated and “no calibration required” items properly identified—The equipment list must identify any units that do not require calibration, and controls must be in place to ensure that these units are not used in an application that requires calibration.
An OOT investigation log—For any instrument found OOT, an investigation must be performed and recorded.
Proper documentation—All critical aspects of the calibration must be properly documented for the certificate to
be recognized by an ISO auditor.
Proper recall system—A procedure should be established that includes timeframes for recall notification, an
escalation procedure and provisions for due-date extension.
Instrument error can occur due to a variety of factors: drift, environment, electrical supply, addition of components to the output loop, process changes, etc.
Since a calibration is performed by comparing or applying a known signal to the instrument under test, errors are
detected by performing a calibration.
An error is the algebraic difference between the indication and the actual value of the measured variable.
Typical errors include zero and span errors, which are corrected by performing a calibration.
Most instruments are provided with a means of adjusting the zero and span of the instrument, along with
instructions for performing this adjustment.
The zero adjustment is used to produce a parallel shift of the input-output curve.
The span adjustment is used to change the slope of the input-output curve.
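As an illustration of these two adjustments, the following Python sketch models the instrument’s input-output line; the zero term shifts the whole curve in parallel, while the span gain changes its slope. All numbers are hypothetical:

    def instrument_output(x_psig, zero_mA=4.0, span_gain=16.0 / 300.0):
        """Modeled linear response in mA for an input in psig.
        zero_mA   - output at 0 psig (zero adjustment: parallel shift)
        span_gain - slope of the line (span adjustment)."""
        return zero_mA + span_gain * x_psig

    print(instrument_output(0.0), instrument_output(300.0))   # ~4.0 and ~20.0 (ideal endpoints)
    # A zero error shifts every reading by the same amount...
    print(instrument_output(0.0, zero_mA=4.5))                # 4.5
    # ...while a span error grows with the input (wrong slope).
    print(instrument_output(300.0, span_gain=15.0 / 300.0))   # 19.0 instead of 20.0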
Linearization error may be corrected if the instrument has a linearization adjustment. If the magnitude of the nonlinear error is unacceptable and it cannot be adjusted, the instrument must be replaced.
Even if a periodic calibration reveals the instrument is perfect and no adjustment is required, we would not have
known that unless we performed the calibration.
And even if adjustments are not required for several consecutive calibrations, we will still perform the calibration
check at the next scheduled due date.
Periodic calibrations to specified tolerances using approved procedures are an important element of any quality
system.
Measuring instruments must be traceable to national standards at regular intervals by means of calibration, adjusted if necessary, and plainly labeled with their calibration status.
If it is determined during calibration that the measuring instrument does not fulfill the specified requirements, the
operating company must evaluate the validity of previously obtained measurement results and implement
appropriate measures with regard to the measuring instrument itself, as well as all affected products.
A CST (Certified Control Systems Technician) performs calibration, documentation, loop checks, troubleshooting, and repair or replacement of instrumentation.
These tasks relate to systems that measure and control level, temperature, pressure, flow, force, power, position, motion, physical properties, chemical composition and other process variables.
Calibration Intervals
The time between any two calibrations of measuring and test instruments is known as the calibration interval; it must be established and monitored by the user in accordance with the user’s own requirements.
Every instrument has at least one input and one output. For a loop indicator, the input would be a 4-20 mA current signal and the output would be a human-readable display.
For a variable-speed motor drive, the input would be an electronic signal and the output would be electric power to
the motor.
Calibration and ranging are two tasks associated with establishing an accurate correspondence between any
instrument’s input signal and its output signal.
In order to do this, one must expose the instrument to an actual input stimulus of precisely known quantity.
For a pressure gauge, indicator, or transmitter, this would mean subjecting the pressure instrument to known fluid
pressures and comparing the instrument response against those known pressure quantities.
One cannot perform a true calibration without comparing an instrument’s response to known, physical stimuli.
To range an instrument means to set the lower and upper range values so it responds with the desired sensitivity to
changes in input.
For example, a pressure transmitter set to a range of 0 to 200 PSI (0 PSI = 4 mA output; 200 PSI = 20 mA output) could be re-ranged to respond on a scale of 0 to 150 PSI (0 PSI = 4 mA; 150 PSI = 20 mA).
In analog instruments, re-ranging could (usually) only be accomplished by re-calibration, since the same
adjustments were used to achieve both purposes.
In digital instruments, calibration and ranging are typically separate adjustments (i.e. it is possible to re-range a digital transmitter without having to perform a complete recalibration), so it is important to understand the difference.
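The separation of ranging from calibration in a digital transmitter can be sketched as below. This is a toy model, not any vendor’s API: set_range simply rewrites the lower and upper range values used for scaling, and no sensor recalibration is involved:

    class DigitalTransmitter:
        """Toy model in which range settings are independent of calibration."""

        def __init__(self, lrv=0.0, urv=200.0):
            self.lrv, self.urv = lrv, urv   # ranging parameters only

        def set_range(self, lrv, urv):
            """Re-range the transmitter; the sensor trim is untouched."""
            self.lrv, self.urv = lrv, urv

        def output_mA(self, measured_psi):
            fraction = (measured_psi - self.lrv) / (self.urv - self.lrv)
            return 4.0 + 16.0 * fraction

    tx = DigitalTransmitter(0.0, 200.0)
    print(tx.output_mA(100.0))   # 12.0 mA at mid-range of 0-200 PSI
    tx.set_range(0.0, 150.0)     # re-ranged to 0-150 PSI, no recalibration
    print(tx.output_mA(100.0))   # ~14.67 mA on the new scale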
Calibration of Pressure
Pressure-sensing devices are calibrated at the factory.
In cases where a sensor is suspect and needs to be recalibrated, it can be returned to the factory for recalibration, or it can be compared to a known reference.
Calibration of Flow
Flow meters need periodic calibration.
This can be done by using another calibrated meter as a reference or by using a known flow rate.
Accuracy can vary over the range of the instrument and with temperature and specific weight changes in the fluid,
which may all have to be taken into account.
Thus, the meter should be calibrated over temperature as well as range, so that the appropriate corrections can be
made to the readings.
A spot check of the readings should be made periodically to check for instrument drift that may be caused by the
instrument going out of calibration, particulate build up, or erosion.
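One simple way to work up such a comparison against a reference is a meter factor, i.e. the ratio of the reference flow rate to the meter’s reading. The data below are made up for illustration:

    # Hypothetical comparison of a flow meter against a calibrated reference.
    reference_rates = [10.0, 50.0, 100.0]   # known flow rates (L/min)
    meter_readings  = [10.4, 51.8, 103.9]   # device-under-test readings

    # Per-point meter factors: multiply a reading by this to correct it.
    factors = [ref / ind for ref, ind in zip(reference_rates, meter_readings)]
    print([round(f, 4) for f in factors])   # [0.9615, 0.9653, 0.9625]

    # A single average factor may be applied if it is consistent over the range.
    avg_factor = sum(factors) / len(factors)
    print(round(avg_factor, 4))             # 0.9631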
Calibration of Temperature
Temperature calibration can be performed on most temperature-sensing devices by immersing them in known temperature standards: the equilibrium points of solid/liquid or liquid/gas mixtures (fixed points such as the triple point of water).
Most temperature-sensing devices are rugged and reliable, but they can go out of calibration due to leakage during use or contamination during manufacture, and should therefore be checked on a regular basis.
The measured value obtained from a measuring instrument is thus compared with the known value of the test
standard under specified reference conditions using reproducible measuring procedures.
Calibration does not involve any manipulation of the measuring instrument, which remains entirely unchanged.
Adjustment involves the correction or balancing of a measuring instrument in order to eliminate systematic
measurement deviation.
The measured value obtained from a measuring instrument is thus adjusted to match the known value of the test
standard under specified reference conditions.
Adjustment always involves manipulation, which permanently changes the measuring instrument.
Traceability of a calibration procedure means that the calibration sequence is reproducibly documented from the individual device under test all the way up to the national standard for the respective measured quantity.
Traceability of measurement results is assured by a country’s metrological infrastructure.
Consequently, calibration at regular intervals assures the quality of the respective product or service on the basis of
internationally comparable measurement results.
Due to the assured traceability to national test standards, this provides for legal security with respect to product liability, as well as for approval tests and audits.
Purpose of a calibration
There are three main reasons for having instruments calibrated:
1. To ensure readings from an instrument are consistent with other measurements.
2. To determine the accuracy of the instrument readings.
3. To establish the reliability of the instrument i.e. that it can be trusted.
Such measurements allow manufacturing processes to be kept in control from one day to the next and from one
factory to another.
Manufacturers and exporters require such measurements to know that they will satisfy their clients’ specifications.
For example we may want to determine whether the diameter of a lawn mower shaft is too big, too small or just
right.
Our aim is to balance the cost of rejecting good shafts, and of customer complaints if we were to accept faulty shafts, against the cost of an accurate but over-engineered measurement system.
When making these decisions the uncertainty in the measurement is as important as the measurement itself.
The uncertainty reported on your certificate is information necessary for you to calculate the uncertainty in your measurements.
With such an instrument, where corrections and uncertainties are negligible, the user simply wants to know that the
instrument is reliable.
Unfortunately, a large number of instruments are not.
For these reasons the international measurement community establishes documentary standards (procedures) that
define how such quantities are to be measured so as to provide the means for comparing the quality of goods or
ensuring that safety and health requirements are satisfied.
Traceability is ensured only if these three factors are present in the measurement process.
Calibration Principles
Calibration is the activity of checking, by comparison with a standard, the accuracy of a measuring instrument
of any type. It may also include adjustment of the instrument to bring it into alignment with the standard.
Even the most precise measurement instrument is of no use if you cannot be sure that it is reading accurately – or,
more realistically, that you know what the error of measurement is.
Let’s begin with a few definitions:
Calibration range – the region between the limits within which a quantity is measured, received or transmitted, expressed by stating the lower and upper range values.
For example, an electronic pressure transmitter may have an instrument range of 0–750 psig and output of 4-to-20
milliamps (mA).
However, the engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA.
Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA. In this example, the zero input
value is 0 psig and zero output value is 4 mA.
The input span is 300 psig and the output span is 16 mA.
Be careful not to confuse the range the instrument is capable of with the range for which the instrument has been
calibrated.
Ideally, a product would produce test results that exactly match the sample value, with no error at any point within the calibrated range. On a graph of test result versus sample value, this ideal relationship appears as a straight line, labeled “Ideal Results”. However, without calibration, an actual product may produce test results different from the sample value, with a potentially large error.
Calibrating the product can improve this situation significantly.
During calibration, the product is “taught” using the known values of Calibrators 1 and 2 what result it should
provide.
The process eliminates the errors at these two points, in effect moving the “Before Calibration” curve closer to the “Ideal Results” line, as shown by the “After Calibration” curve.
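In code, the two-point correction just described amounts to solving for a gain and an offset from the instrument’s readings at Calibrators 1 and 2. The calibrator values and as-found readings below are hypothetical:

    # (true value, as-found reading) at the two calibration points
    (t1, r1), (t2, r2) = (0.0, 0.8), (100.0, 102.4)

    gain = (t2 - t1) / (r2 - r1)   # corrects the slope (span) error
    offset = t1 - gain * r1        # corrects the zero error

    def corrected(reading):
        """Map a raw 'Before Calibration' reading onto the ideal line."""
        return gain * reading + offset

    print(corrected(0.8))     # -> 0.0   (error removed at Calibrator 1)
    print(corrected(102.4))   # -> 100.0, within floating-point rounding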
Accuracy - the ratio of the error to the full scale output or the ratio of the error to the output, expressed in percent
span or percent reading, respectively.
Tolerance - permissible deviation from a specified value; may be expressed in measurement units, percent of
span, or percent of reading.
It is recommended that the tolerance, specified in measurement units, be used for the calibration requirements performed at your facility.
By specifying an actual value, mistakes caused by calculating percentages of span or reading are eliminated.
Also, tolerances should be specified in the units measured for the calibration.
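For example, a percent-of-span tolerance can be converted into measurement units like this (the numbers are hypothetical, reusing the 300 psig span from earlier):

    span_psig = 300.0            # input span of the calibration range
    tolerance_pct_span = 0.5     # +/-0.5% of span
    tolerance_psig = span_psig * tolerance_pct_span / 100.0
    print(tolerance_psig)        # +/-1.5 psig, the value to state in the procedure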
Calibration tolerances should be determined from a combination of factors.
These factors include:
Requirements of the process
Capability of available test equipment
Consistency with similar instruments at your facility
Manufacturer’s specified tolerance
The term Accuracy Ratio was used in the past to describe the relationship between the accuracy of the test
standard and the accuracy of the instrument under test.
A good rule of thumb is to ensure an accuracy ratio of 4:1 when performing calibrations.
This means the instrument or standard used should be four times more accurate than the instrument being checked.
In other words, the test equipment (such as a field standard) used to calibrate the process instrument should be four
times more accurate than the process instrument.
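A small sketch of the 4:1 rule of thumb, with both accuracies expressed as +/- tolerances in the same units (the numbers are made up):

    def meets_accuracy_ratio(uut_tolerance, standard_accuracy, required=4.0):
        """True if the standard is at least `required` times more accurate
        than the unit under test (UUT)."""
        return uut_tolerance / standard_accuracy >= required

    # A +/-1.0 psi UUT checked against a +/-0.2 psi standard: 5:1, acceptable.
    print(meets_accuracy_ratio(1.0, 0.2))   # True
    # Against a +/-0.5 psi standard the ratio is only 2:1.
    print(meets_accuracy_ratio(1.0, 0.5))   # False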
Standard Addition
•Known amounts of analyte are added to aliquots of sample
•Signals are measured as a function of concentration added
•Accounts for sample matrix, but not for instrumental drift (see the sketch below)
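A sketch of how standard-addition data are commonly worked up: fit a least-squares line to signal versus added concentration, and read the unknown concentration from the magnitude of the x-intercept. The data here are invented for illustration:

    # Signal measured on sample aliquots spiked with known analyte amounts.
    added_ppm = [0.0, 1.0, 2.0, 3.0]
    signal    = [0.52, 0.78, 1.03, 1.29]

    n = len(added_ppm)
    mean_x = sum(added_ppm) / n
    mean_y = sum(signal) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(added_ppm, signal))
             / sum((x - mean_x) ** 2 for x in added_ppm))
    intercept = mean_y - slope * mean_x

    # Unknown concentration = x-intercept magnitude = intercept / slope.
    print(round(intercept / slope, 2))   # ~2.04 ppm in the original aliquot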
Internal Standard
•A substance known as an “internal standard” is added to samples and standards (chemically similar to the analyte)
•Used to correct for drift (changes in sensitivity over time) and matrix effects (sample-related changes in sensitivity); see the sketch below
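A minimal sketch of the internal-standard calculation (all values are hypothetical): because drift and matrix effects scale the analyte and internal-standard signals alike, working with their ratio cancels those effects:

    # Response factor F from a standard containing both substances.
    std_analyte_signal, std_is_signal = 250.0, 500.0
    std_analyte_conc,   std_is_conc   = 5.0,   10.0    # ppm

    F = (std_analyte_signal / std_analyte_conc) / (std_is_signal / std_is_conc)

    # Sample spiked with the same internal-standard concentration.
    sample_analyte_signal, sample_is_signal = 180.0, 450.0
    analyte_conc = (sample_analyte_signal / sample_is_signal) * std_is_conc / F
    print(analyte_conc)   # -> 4.0 ppm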
Exercise 1: Determine the average peak area and the standard deviation of measurement (from 3 data points), and graph concentration (ppm) vs. the average peak area, including error bars (std. dev.), for sulfate data collected by ion chromatography. A starting script is sketched below.
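A hedged scaffold for the exercise (the actual sulfate data are not reproduced here; the placeholder numbers below must be replaced with your own ion chromatography measurements):

    import statistics
    import matplotlib.pyplot as plt

    # Placeholder: three replicate peak areas at each concentration (ppm).
    data = {1.0: [10.1, 10.4, 10.2],
            2.0: [20.3, 19.8, 20.5],
            5.0: [50.9, 50.1, 50.6]}

    conc = sorted(data)
    means = [statistics.mean(data[c]) for c in conc]
    stdevs = [statistics.stdev(data[c]) for c in conc]  # from 3 data points

    plt.errorbar(conc, means, yerr=stdevs, fmt='o', capsize=4)
    plt.xlabel('Sulfate concentration (ppm)')
    plt.ylabel('Average peak area')
    plt.title('Exercise 1: calibration curve with error bars')
    plt.show()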
Model Calibration
1. What is a model?
2. What is calibration?
3. Why calibrate?
4. Decision-making and calibration
What is a Model?
“Anything used in any way to represent something else.”
That is why “fashion models” are called “models”.
The models are being used to represent something else – specifically you, the buyer of clothes.
Applications
All fields of endeavor – Engineering, Economics, Biology, Geology, Physics, Psychology – have models that must be calibrated.
Transportation
Travel Demand models
Emissions models
Capacity models
Safety models
…
What is “calibration”?
Reducing model error is a process
Validation is a process to determine that a model is an accurate representation of the real system.
Validation is usually achieved through the calibration of the model, an iterative process of comparing the model to
actual system behavior and using the discrepancies between the two, and the insights gained, to improve the
model. This process is repeated until model accuracy is judged to be acceptable.
Calibration and validation – the process that modifies the simplified, abstract model to best match complex reality such that error is minimized.
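The iterative loop just described can be sketched for a toy one-parameter model; the error measure, tolerance, and adjustment rule below are stand-ins for whatever a real model would use:

    # Observed behavior of the real system (made-up numbers).
    observed_inputs  = [10.0, 20.0, 30.0]
    observed_outputs = [25.0, 50.0, 75.0]

    def model(x, k):
        return k * x   # the simplified, abstract model

    def mean_abs_error(k):
        return sum(abs(model(x, k) - y)
                   for x, y in zip(observed_inputs, observed_outputs)) / 3.0

    # Compare, adjust, and repeat until accuracy is judged acceptable.
    k, tolerance = 1.0, 0.01
    while mean_abs_error(k) > tolerance:
        k += 0.01 if model(observed_inputs[0], k) < observed_outputs[0] else -0.01

    print(round(k, 2))   # -> 2.5 (calibrated parameter)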
What is calibration?
Calibration activities vary by the model used and the user’s tolerance for error:
•selection and confirmation of field data
•application of a numerical constant
•statistical comparison of model to field data
•visual inspection
DATA COLLECTION
[Photo: a team member preparing to confirm the storage length of a lane.]
No single model can be expected to be equally accurate for all possible conditions
No single model can include the whole universe of variables
Models are developed with a subset of limited, real data
Models have default values for variables, i.e. models assume that users have varied amounts of data
All models have error that needs to be minimized.
What tools are available to manage “error” and the use of “judgment”?
Judgment is required – in other words, a decision to not use judgment is a judgment.
Variables accounted for in the calibration process may react unusually under different conditions!