
Notes

Information sheet UC11 Calibrating and testing measuring instruments Level IV


MODULE DESCRIPTION: This module covers the calibration, adjustment and testing of measuring instruments.
It encompasses working safely and to standards, following calibration and adjustment procedures, applying
knowledge of the parameters to be measured, and testing and reporting.

LEARNING OUTCOMES:

At the end of this module the trainee will be able to:

 Prepare to calibrate and test measuring instruments


 Calibrate and test measuring instruments
 Complete and report calibration and test activities

LO1: Prepare to calibrate and test measuring instruments

Identify OH&S procedures

OH&S Policies and procedures


HAZARDS OF ELECTRICITY
The primary hazards associated with electricity and its use are:

SHOCK. Electric shock occurs when the human body becomes part of a path through which electrons can flow.
The resulting effect on the body can be either direct or indirect.

Direct. Injury or death can occur whenever electric current flows through the human body.
Currents of less than 30 mA can result in death.
A thorough coverage of the effects of electricity on the human body is
contained in the section of this module entitled Effects of Electricity on the Human Body.

Indirect. Although the electric current through the human body may be well below the values required to cause
noticeable injury, human reaction can result in falls from ladders or scaffolds, or movement into operating
machinery. Such reaction can result in serious injury or death.

BURNS. Burns can result when a person touches electrical wiring or equipment that is improperly used or
maintained. Typically, such burn injuries occur on the hands.

ARC-BLAST. Arc-blasts occur from high-amperage currents arcing through air.


This abnormal current flow (arc-blast) is initiated by contact between two energized points.
This contact can be caused by persons who have an accident while working on energized components, or by
equipment failure due to fatigue or abuse.
Temperatures as high as 35,000 °F have been recorded in arc-blast research.

The three primary hazards associated with an arc-blast are:

Thermal Radiation. In most cases, the radiated thermal energy is only part of the total energy available from the
arc. Numerous factors, including skin color, the area of skin exposed, and the type of clothing worn, have an effect
on the degree of injury.
Proper clothing, working distances and overcurrent protection can improve the chances of curable burns.

TTLM Development Date: 2015


Page 1
BY: Gizaw Tadesse

Identify instrument parameter

Definition of calibration

WHAT IS CALIBRATION?
There are as many definitions of calibration as there are methods.
According to ISA’s The Automation, Systems, and Instrumentation Dictionary, the word calibration is defined as
“a test during which known values of measurand are applied to the transducer and corresponding output
readings are recorded under specified conditions.” The definition includes the capability to adjust the instrument
to zero and to set the desired span.

An interpretation of the definition would say that a calibration is a comparison of measuring equipment against a
standard instrument of higher accuracy to detect, correlate, adjust, rectify and document the accuracy of the
instrument being compared.

Typically, calibration of an instrument is checked at several points throughout the calibration range of the
instrument.
The calibration range is defined as “the region between the limits within which a quantity is measured, received or
transmitted, expressed by stating the lower and upper range values.” The limits are defined by the zero and span
values.


The zero value is the lower end of the range.
Span is defined as the algebraic difference between the upper and lower range values.
The calibration range may differ from the instrument range, which refers to the capability of the instrument.

For example, an electronic pressure transmitter may have a nameplate instrument range of 0–750 pounds per
square inch, gauge (psig) and output of 4-to-20 milliamps (mA).
However, the engineer has determined the instrument will be calibrated for 0-to-300 psig = 4-to-20 mA.

Therefore, the calibration range would be specified as 0-to-300 psig = 4-to-20 mA.
In this example, the zero input value is 0 psig and zero output value is 4 mA.
The input span is 300 psig and the output span is 16 mA.
Different terms may be used at your facility.
Just be careful not to confuse the range the instrument is capable of with the range for which the instrument has
been calibrated.
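The zero/span arithmetic of this example can be sketched as a small function (using the text's 0-to-300 psig = 4-to-20 mA numbers; the function name is illustrative):

```python
def expected_output_mA(pressure_psig, lrv=0.0, urv=300.0, out_lo=4.0, out_hi=20.0):
    """Ideal output of a linear transmitter calibrated for lrv-urv psig = 4-20 mA."""
    input_span = urv - lrv          # 300 psig
    output_span = out_hi - out_lo   # 16 mA
    return out_lo + (pressure_psig - lrv) / input_span * output_span

print(expected_output_mA(0))     # zero input -> 4.0 mA (the zero output value)
print(expected_output_mA(150))   # mid-span -> 12.0 mA
print(expected_output_mA(300))   # full span -> 20.0 mA
```
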

Calibration
Calibration affords the opportunity to check the instrument against a known standard and subsequently to reduce
errors in accuracy.

Example: a flow meter may be calibrated by:

 Comparison with a standard flow-measurement facility
 Comparison with a flow meter of known accuracy, higher than that of the instrument to be calibrated
 Using indirect measurements, e.g. weighing a certain amount of water in a tank and recording the time elapsed
for this quantity to flow

Calibration Basics and Best Practices
Individuals who view calibration as a necessary evil are not taking into account the bigger picture.
Calibration is essential to improving a company’s bottom line: it minimizes the risk of product defects and recalls,
and enhances a reputation for consistent quality.

Calibration, in its most basic form, is the measuring of an instrument against a standard.
As instruments become more complicated, successfully identifying and applying best practices can reduce business
expenses and improve organizational capabilities.

What is Calibration?
Calibration is the comparison of a measurement device (an unknown) against an equal or better standard.
A standard in a measurement is considered the reference; it is the one in the comparison taken to be the more
correct of the two.

Calibration finds out how far the unknown is from the standard.
A “typical” commercial calibration uses the manufacturer’s calibration procedure and is performed with a
reference standard at least four times more accurate than the instrument under test.

Why Calibrate?
Calibration can be an insurance policy because out-of-tolerance (OOT) instruments may give false information
leading to unreliable products, customer dissatisfaction and increased warranty costs.
In addition, OOT conditions may cause good products to fail tests, which ultimately results in unnecessary rework
costs and production delays.

Calibration Terms
As found data—The reading of the instrument before it is adjusted.

As left data—The reading of the instrument after adjustment or “same as found,” if no adjustment was made.

Optimization—Adjusting a measuring instrument to make it more accurate is NOT part of a typical calibration
and is frequently referred to as “optimizing” or “nominalizing” an instrument.

Out-of-tolerance (OOT) condition—When an instrument’s performance is outside its specifications, it is
considered an out-of-tolerance (OOT) condition, resulting in the need to adjust the instrument back into
specification.

Limited calibration—It may be more cost-effective to have a limited calibration when certain functions of an
instrument are not utilized by the user.

Test uncertainty ratio (TUR)—This is the ratio of the accuracy of the instrument under test compared to the
accuracy of the reference standard.

Without data—Most calibration labs charge more to provide the certificate with data and will offer a “no-data”
option.

Calibration Quality Management Systems


Calibration is the key to quality control.
In order to meet calibration standards, a good quality system needs to be in place.

Here are some of the requirements. ISO 9001:2008 calibration (International Organization for Standardization):
this type of calibration is crucial for many industries and has the following requirements (in alphabetical order):

Accredited calibration lab—The calibration laboratory must be ISO 9001:2008 accredited or be the original
equipment manufacturer.

Comprehensive equipment list—To pass the ISO audit, the company must demonstrate that it has a
comprehensive equipment list with controls in place for additions, subtractions and custodianship of equipment.

Calibrated and no calibration required items properly identified— The equipment list must identify any units
that do not require calibration, and controls must be in place to ensure that these units are not used in an
application that will require calibration.

Documented calibration procedures—The valid calibration procedure is based on the manufacturer’s
recommendations and covers all aspects of the instrument under test.

Equipment custodianship—There is an assignment of responsibility for ensuring equipment is returned to the
calibration lab.

An OOT investigation log—For any instrument found OOT, an investigation must be performed and recorded.

Proper documentation—All critical aspects of the calibration must be properly documented for the certificate to
be recognized by an ISO auditor.

Proper recall system—A procedure should be established that includes timeframes for recall notification, an
escalation procedure and provisions for due-date extension.

Why Is Calibration Required?


It makes sense that calibration is required for a new instrument.
We want to make sure the instrument is providing an accurate indication or output signal when it is installed.
But why can’t we just leave it alone as long as the instrument is operating properly and continues to provide the
indication we expect?

Instrument error can occur due to a variety of factors: drift, environment, electrical supply, addition of components
to the output loop, process changes, etc.
Since a calibration is performed by comparing or applying a known signal to the instrument under test, errors are
detected by performing a calibration.

An error is the algebraic difference between the indication and the actual value of the measured variable.
Typical errors include zero and span errors, which are corrected by performing a calibration.

Most instruments are provided with a means of adjusting the zero and span of the instrument, along with
instructions for performing this adjustment.

The zero adjustment is used to produce a parallel shift of the input-output curve.
The span adjustment is used to change the slope of the input-output curve.
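A minimal numeric sketch of how zero and span errors show up at the range limits (the gain and offset values below are hypothetical, not from the text):

```python
def actual_output(pressure_psig, gain=1.02, offset=-0.3):
    """Hypothetical miscalibrated transmitter: an ideal 0-300 psig = 4-20 mA
    response distorted by a slope (span) error and a parallel-shift (zero) error."""
    ideal = 4.0 + pressure_psig / 300.0 * 16.0
    return ideal * gain + offset

# Zero error: visible as an offset at the lower range value (parallel shift).
zero_error = actual_output(0) - 4.0
# Span error: visible as a change in slope across the full range.
span_error = (actual_output(300) - actual_output(0)) - 16.0

print(round(zero_error, 2))   # the offset a zero adjustment would remove
print(round(span_error, 2))   # the slope error a span adjustment would remove
```
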
Linearization error may be corrected if the instrument has a linearization adjustment. If the magnitude of the
nonlinear error is unacceptable and it cannot be adjusted, the instrument must be replaced.

To detect and correct instrument error, periodic calibrations are performed.

Even if a periodic calibration reveals the instrument is perfect and no adjustment is required, we would not have
known that unless we performed the calibration.

And even if adjustments are not required for several consecutive calibrations, we will still perform the calibration
check at the next scheduled due date.

Periodic calibrations to specified tolerances using approved procedures are an important element of any quality
system.


Who Performs Calibrations?


A control system technician (CST) is a skilled craftsperson who knows pneumatic, mechanical, and electrical
instrumentation.
He or she understands process control loops and process control systems, including those that are computer-based.
Typically, he or she has received training in such specialized subjects as control theory, analog and/or digital
electronics, microprocessors and/or computers, and the operation and maintenance of particular lines of field
instrumentation.

A CST performs calibration, documentation, loop checks, troubleshooting, and repair or replacement of
instrumentation.

These tasks relate to systems that measure and control level, temperature, pressure, flow, force, power, position,
motion, physical properties, chemical composition and other process variables.
Calibration Intervals
The time between any two calibrations of measuring and test instruments is known as the calibration interval, and
it must be established and monitored by the user in accordance with their own requirements.

Essential criteria for determining the calibration interval include:


 Measured quantity and permissible tolerance
 The extent to which the measuring and test equipment is subject to stressing
 Frequency of use
 Ambient conditions
 Stability of previous calibrations
 Required measuring accuracy
 Company-specific requirements specified by the quality assurance system
Instrument calibration
Every instrument has at least one input and one output.
For a pressure sensor, the input would be some fluid pressure and the output would (most likely) be an electronic
signal.

For a loop indicator, the input would be a 4-20 mA current signal and the output would be a human-readable
display.
For a variable-speed motor drive, the input would be an electronic signal and the output would be electric power to
the motor.

Calibration and ranging are two tasks associated with establishing an accurate correspondence between any
instrument’s input signal and its output signal.

Calibration versus re-ranging


To calibrate an instrument means to check and adjust (if necessary) its response so the output accurately
corresponds to its input throughout a specified range.

In order to do this, one must expose the instrument to an actual input stimulus of precisely known quantity.
For a pressure gauge, indicator, or transmitter, this would mean subjecting the pressure instrument to known fluid
pressures and comparing the instrument response against those known pressure quantities.

One cannot perform a true calibration without comparing an instrument’s response to known, physical stimuli.
To range an instrument means to set the lower and upper range values so it responds with the desired sensitivity to
changes in input.

For example, a pressure transmitter set to a range of 0 to 200 PSI (0 PSI = 4 mA output ; 200 PSI = 20 mA output)
could be re-ranged to respond on a scale of 0 to 150 PSI (0 PSI = 4 mA ; 150 PSI = 20 mA).
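The effect of re-ranging can be sketched as follows (assuming a perfectly linear 4-20 mA response; the function name is illustrative):

```python
def transmitter_output(pressure_psi, lrv, urv):
    """4-20 mA output of a linear transmitter ranged from lrv to urv PSI."""
    return 4.0 + (pressure_psi - lrv) / (urv - lrv) * 16.0

# The same 100 PSI process pressure gives different outputs under the two ranges:
print(transmitter_output(100, 0, 200))  # 12.0 mA when ranged 0-200 PSI
print(transmitter_output(100, 0, 150))  # about 14.67 mA when ranged 0-150 PSI
```
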

In analog instruments, re-ranging could (usually) only be accomplished by re-calibration, since the same
adjustments were used to achieve both purposes.

In digital instruments, calibration and ranging are typically separate adjustments (i.e. it is possible to re-range a
digital transmitter without having to perform a complete recalibration), so it is important to understand the
difference.

Calibration Of Pressure
Pressure-sensing devices are calibrated at the factory.
In cases where a sensor is suspect and needs to be recalibrated, the sensor can be returned to the factory for
recalibration, or it can be compared to a known reference.

Low-pressure devices can be calibrated against a liquid manometer.


High-pressure devices can be calibrated with a dead-weight tester.
In a dead-weight tester, the pressure applied to the device under test is created by weights on a piston, so high
pressures can be accurately reproduced.

Calibration Of flow
Flow meters need periodic calibration.
This can be done by using another calibrated meter as a reference or by using a known flow rate.

Accuracy can vary over the range of the instrument and with temperature and specific weight changes in the fluid,
which may all have to be taken into account.

Thus, the meter should be calibrated over temperature as well as range, so that the appropriate corrections can be
made to the readings.
A spot check of the readings should be made periodically to check for instrument drift that may be caused by the
instrument going out of calibration, particulate build up, or erosion.

Calibration of temperature
Temperature calibration can be performed on most temperature-sensing devices by immersing them in known
temperature standards, i.e. the equilibrium points of solid/liquid or liquid/gas mixtures (fixed points; the point at
which solid, liquid and gas phases coexist is known as the triple point).
Most temperature-sensing devices are rugged and reliable, but they can go out of calibration due to leakage during
use or contamination during manufacture, and should therefore be checked on a regular basis.

Calibration of Measuring Instruments


In daily usage, consistent differentiation between the terms calibration and adjustment is frequently neglected.
Calibration involves ascertaining and documenting the deviation of the measured value from a traceable, highly
accurate test standard.

The measured value obtained from a measuring instrument is thus compared with the known value of the test
standard under specified reference conditions using reproducible measuring procedures.

Calibration does not involve any manipulation of the measuring instrument, which remains entirely unchanged.
Adjustment involves the correction or balancing of a measuring instrument in order to eliminate systematic
measurement deviation.

The measured value obtained from a measuring instrument is thus adjusted to match the known value of the test
standard under specified reference conditions.
Adjustment always involves manipulation, which permanently changes the measuring instrument.

Traceability of a calibration procedure means that the calibration sequence is reproducibly documented from
the individual device under test all the way up to the national standard for the respective measured quantity.
Traceability of measurement results is assured by a country’s metrological infrastructure.

Why do measuring instruments have to be calibrated?


Measuring instruments must be traced back to national standards at regular intervals by means of calibration,
adjusted if necessary, and plainly labeled with their calibration status.
If it is determined during calibration that a measuring instrument does not fulfill the specified requirements, the
operating company must evaluate the validity of previously obtained measurement results and implement
appropriate measures with regard to the measuring instrument itself, as well as all affected products.

Consequently, calibration at regular intervals assures the quality of the respective product or service on the basis of
internationally comparable measurement results. Due to its assured traceability to national test standards, it also
provides legal security with respect to product liability, as well as for approval tests and audits.
Purpose of a calibration
There are three main reasons for having instruments calibrated:
1. To ensure readings from an instrument are consistent with other measurements.
2. To determine the accuracy of the instrument readings.
3. To establish the reliability of the instrument i.e. that it can be trusted.

Traceability: relating your measurements to others


The results of measurements are most useful if they relate to similar measurements, perhaps made at a different
time, at a different place, or by a different person with a different instrument.

Such measurements allow manufacturing processes to be kept in control from one day to the next and from one
factory to another.
Manufacturers and exporters require such measurements to know that they will satisfy their clients’ specifications.

Most countries have a system of accreditation for calibration laboratories.


Accreditation is the recognition by an official accreditation body of a laboratory’s competence to calibrate, test, or
measure an instrument or product.

The assessment is made against criteria laid down by international standards.


Accreditation ensures that the links back to the national standard are based on sound procedures.

For measurements to be compared they must share a common measurement system.

Uncertainty: how accurate are your measurements?


Ultimately all measurements are used to help make decisions, and poor quality measurements result in poor quality
decisions.
The uncertainty in a measurement is a numerical estimate of the spread of values that could reasonably be
attributed to the quantity.
It is a measure of the quality of a measurement and provides the means to assess and minimize the risk and
possible consequences of poor decisions.

For example we may want to determine whether the diameter of a lawn mower shaft is too big, too small or just
right.
Our aim is to balance the cost of rejecting good shafts, and of customer complaints if we were to accept faulty
shafts, against the cost of an accurate but over-engineered measurement system.
When making these decisions, the uncertainty in the measurement is as important as the measurement itself.
The uncertainty reported on your certificate is the information you need in order to calculate the uncertainty in
your own measurements.
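One common way to use the reported uncertainty in such accept/reject decisions is a guard-banded test: accept only if the whole interval measured ± uncertainty lies inside the tolerance. A sketch with hypothetical shaft dimensions (not from the text):

```python
def conforms(measured, uncertainty, lower, upper):
    """Accept only if the entire interval measured +/- uncertainty is in tolerance."""
    return (measured - uncertainty) >= lower and (measured + uncertainty) <= upper

# Hypothetical lawn-mower shaft with a 24.95-25.05 mm tolerance:
print(conforms(25.02, 0.01, 24.95, 25.05))  # interval 25.01-25.03 -> accept
print(conforms(25.04, 0.02, 24.95, 25.05))  # interval reaches 25.06 -> reject
```
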

Reliability: can I trust the instrument?


Many measuring instruments read directly in terms of the SI units, and have a specified accuracy greater than
needed for most tasks.

With such an instrument, where corrections and uncertainties are negligible, the user simply wants to know that the
instrument is reliable.
Unfortunately, a large number of instruments are not so straightforward.

Achieving Traceability in your measurements
Many quantities of practical interest, such as color, loudness and comfort, are difficult to define because they relate
to human attributes.
Others, such as viscosity, flammability and thermal conductivity, are sensitive to the conditions under which the
measurement is made, and it may not be possible to trace these measurements to the SI units.

For these reasons the international measurement community establishes documentary standards (procedures) that
define how such quantities are to be measured so as to provide the means for comparing the quality of goods or
ensuring that safety and health requirements are satisfied.

To make a traceable measurement three elements are required:


1. An appropriate and recognized definition of how the quantity should be measured,
2. A calibrated measuring instrument, and
3. Competent staff able to interpret the standard or procedure, and use the instrument.

Traceability is ensured only if these three factors are present in the measurement process

Adjustment: what a calibration is not


Calibration does not usually involve the adjustment of an instrument so that it reads ‘true’.
Indeed adjustments made as a part of a calibration often detract from the reliability of an instrument because they
may destroy or weaken the instrument’s history of stability.
The adjustment may also prevent the calibration from being used retrospectively.
When MSL adjusts an instrument it normally issues a calibration report with both the ‘as received’ and ‘after
adjustment’ values.
Adjustments may completely invalidate an earlier calibration.

What a calibration certificate contains


Your calibration certificate must contain certain information if it is to fulfill its purpose of supporting traceable
measurements.
This information can be divided into several categories:
it establishes the identity and credibility of the calibrating laboratory;
it uniquely identifies the instrument and its owner;
it identifies the measurements made; and
it is an unambiguous statement of the results, including an uncertainty statement.

Calibration Principles

Calibration is the activity of checking, by comparison with a standard, the accuracy of a measuring instrument
of any type. It may also include adjustment of the instrument to bring it into alignment with the standard.

Even the most precise measurement instrument is of no use if you cannot be sure that it is reading accurately – or,
more realistically, that you know what the error of measurement is.
Let’s begin with a few definitions:
Calibration range – the region between the limits within which a quantity is measured, received or transmitted,
expressed by stating the lower and upper range values.

Zero value – the lower end of the calibration range


Span – the difference between the upper and lower range values
Instrument range – the capability of the instrument; may be different than the calibration range


Ideally, a product would produce test results that exactly match the sample value, with no error at any point within
the calibrated range; on a graph of result versus sample value, this is the line labeled “Ideal Results”.
However, without calibration, an actual product may produce test results that differ from the sample value, with a
potentially large error.
Calibrating the product can improve this situation significantly.
During calibration, the product is “taught”, using the known values of Calibrators 1 and 2, what result it should
provide.
The process eliminates the errors at these two points, in effect moving the “Before Calibration” curve closer to
the “Ideal Results” line, as shown by the “After Calibration” curve.
Accuracy - the ratio of the error to the full scale output or the ratio of the error to the output, expressed in percent
span or percent reading, respectively.
Tolerance - permissible deviation from a specified value; may be expressed in measurement units, percent of
span, or percent of reading.
It is recommended that the tolerance be specified in measurement units, i.e. the units measured, for the calibrations
performed at your facility.
By specifying an actual value, mistakes caused by calculating percentages of span or reading are eliminated.
Calibration tolerances should be determined from a combination of factors.
These factors include:
 Requirements of the process
 Capability of available test equipment
 Consistency with similar instruments at your facility
 Manufacturer’s specified tolerance
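Converting a tolerance stated as percent of span into measurement units is simple arithmetic; a sketch using the 300 psig span from the earlier example (the 0.5% figure is hypothetical):

```python
def tolerance_in_units(percent_of_span, span):
    """Express a tolerance given as percent of span directly in measurement units."""
    return percent_of_span / 100.0 * span

print(tolerance_in_units(0.5, 300.0))  # 0.5% of a 300 psig span = 1.5 psig
```
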


The term Accuracy Ratio was used in the past to describe the relationship between the accuracy of the test
standard and the accuracy of the instrument under test.
A good rule of thumb is to ensure an accuracy ratio of 4:1 when performing calibrations.
This means the instrument or standard used should be four times more accurate than the instrument being checked.
In other words, the test equipment (such as a field standard) used to calibrate the process instrument should be four
times more accurate than the process instrument.
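The 4:1 rule of thumb can be expressed as a simple check (the accuracy figures below are hypothetical):

```python
def accuracy_ratio(instrument_accuracy, standard_accuracy):
    """Ratio of the unit-under-test accuracy to the reference-standard accuracy."""
    return instrument_accuracy / standard_accuracy

# Process instrument accurate to +/-1.0 psig, field standard good to +/-0.25 psig:
ratio = accuracy_ratio(1.0, 0.25)
print(ratio)           # 4.0
print(ratio >= 4.0)    # True: meets the 4:1 rule of thumb
```
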

Types of calibration methods


External Calibration
Signal is proportional to concentration, established using externally prepared standards.
•Assumes that the sensitivity (signal/concentration) is the same for samples and standards
•Assumes that the signal arises only from the analyte, in most cases
•Does not account for sample matrix or instrumental drift

Standard Addition
•Known amounts of analyte are added to aliquots of sample
•Signals are measured as a function of concentration added
•Accounts for sample matrix, but not for instrumental drift
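The standard-addition steps above amount to fitting a line through (added concentration, signal) points and reading the original analyte concentration off the x-intercept; a sketch with made-up numbers:

```python
def standard_addition_conc(added, signal):
    """Least-squares line through (added conc, signal); the analyte concentration
    originally present is the magnitude of the x-intercept."""
    n = len(added)
    mean_x = sum(added) / n
    mean_y = sum(signal) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal))
             / sum((x - mean_x) ** 2 for x in added))
    intercept = mean_y - slope * mean_x
    return intercept / slope

# Hypothetical spikes of 0, 5 and 10 ppm giving a perfectly linear response:
print(standard_addition_conc([0.0, 5.0, 10.0], [2.0, 4.0, 6.0]))  # 5.0 ppm in the sample
```
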

Internal Standard
•A substance known as an “internal standard” (chemically similar to the analyte) is added to samples and standards
•Used to correct for drift (changes in sensitivity over time) and matrix effects (sample-related changes in sensitivity)

External Calibration
Standards of sulfate used at 10, 20, and 40 ppm.

Exercise 1: For sulfate data collected by ion chromatography, determine the average peak area and the standard
deviation of the measurement (from 3 data points), and graph concentration (ppm) vs. average peak area, including
error bars (std dev).
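Since the chromatography data table itself is not reproduced here, the arithmetic of the exercise can be sketched with hypothetical replicate peak areas:

```python
from statistics import mean, stdev

# Hypothetical replicate peak areas (3 data points per sulfate standard):
peak_areas = {10: [101.2, 99.8, 100.5],
              20: [201.0, 198.7, 200.3],
              40: [399.5, 401.2, 400.1]}

for conc_ppm, areas in peak_areas.items():
    # Average and sample standard deviation; the std dev gives the error bars
    # for the concentration-vs-average-peak-area graph.
    print(conc_ppm, "ppm:", round(mean(areas), 2), "+/-", round(stdev(areas), 2))
```
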

Model Calibration
1. What is a model?
2. What is calibration?
3. Why calibrate?
4. Decision-making and calibration

What is a Model?
“Anything used in any way to represent something else.”
That’s why “fashion models” are called “models”: they are being used to represent something else – specifically
you, the buyer of clothes.

That’s why models tend to have very attractive features: people generally don’t like to consider themselves
unpleasant to look at. This is actually a critical point: the sellers of clothes are intentionally providing an
inaccurate – not false, but inaccurate – model, in this case using exceedingly attractive models drawn from a small
minority of the population, certainly not a true representation of physical attractiveness, in order to lure an
unsuspecting buyer into making a purchase.
I want to leave you with this question: what would be an accurate model to represent “you”, with “you” being the
wide distribution of people who wear clothes?

What is a scientific model?


“A simplified abstract view of the complex reality”

Why do we develop models?


“To predict the future”

Difference between “Complex Reality” and the “Simplified Abstract”


Complex vs. simplified → linear approximation

Applications
 All fields of endeavor: Engineering, Economics, Biology, Geology, Physics, Psychology have models that
must be calibrated.
 Transportation
 Travel Demand models
 Emissions models
 Capacity models
 Safety models

What is “calibration”?
Reducing model error is a process


Validation is the process of determining that a model is an accurate representation of the real system. Validation is usually achieved through calibration of the model: an iterative process of comparing the model to actual system behavior and using the discrepancies between the two, and the insights gained, to improve the model. This process is repeated until model accuracy is judged acceptable.

Calibration and validation are the processes that modify the simplified abstract model to best match the complex reality so that error is minimized.
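The iterative compare-and-adjust cycle described above can be sketched in code. Everything here is an illustrative assumption: the toy "reality", the one-parameter model, the step size, and the tolerance stand in for whatever the real calibration procedure specifies.

```python
# A toy "reality" and a one-parameter model, for illustration only.
def observed(x):
    return 2.0 * x            # stand-in for field measurements

def model(x, k):
    return k * x              # simplified abstract model, parameter k

def calibrate(xs, k=1.0, tol=1e-6, step=0.5, max_iter=100):
    """Iteratively nudge k until the model matches the observations."""
    mean_x = sum(xs) / len(xs)
    for _ in range(max_iter):
        # mean discrepancy between model output and field behavior
        err = sum(model(x, k) - observed(x) for x in xs) / len(xs)
        if abs(err) < tol:            # accuracy judged acceptable -> stop
            break
        k -= step * err / mean_x      # use the discrepancy to improve the model
    return k

k = calibrate([1.0, 2.0, 3.0])
print(round(k, 4))  # converges toward the "true" parameter value 2.0
```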

What is calibration?
Calibration activities vary by the model used and the user’s tolerance for error:
 selection and confirmation of field data
 application of a numerical constant
 statistical comparison of model to field data
 visual inspection
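One simple form of the "statistical comparison of model to field data" listed above is root-mean-square error (RMSE). The numbers below are illustrative placeholders, not real field measurements.

```python
import math

field   = [102.0, 98.0, 110.0, 95.0]   # observed field values (placeholders)
modeled = [100.0, 97.0, 113.0, 92.0]   # corresponding model predictions

# RMSE: square the discrepancies, average them, take the square root
rmse = math.sqrt(sum((m - f) ** 2 for m, f in zip(modeled, field)) / len(field))
print(f"RMSE = {rmse:.2f}")  # -> RMSE = 2.40
```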

DATA COLLECTION

[Photo: one of the team members preparing to confirm the storage length of a turn lane.]

3. Why calibrate?

 No single model can be expected to be equally accurate for all possible conditions
 No single model can include the whole universe of variables
 Models are developed with a subset of limited, real data
 Models have default values for variables, i.e. models assume that users have varied amounts of data
All models have error that needs to be minimized.

Complex vs. simplified → linear approximation

If all models have error, how do I know:

 if the error has been sufficiently minimized?


 if the results reflect reality?
 if the prediction of the model is valid?
Judgment

What tools are available to manage “error” and the use of “judgment”?
A decision to not use judgment is a judgment.

Judgment required

In other words:

 Has the base model I am using been sufficiently verified?
 Has the base model I have modified been sufficiently calibrated? (This is a management decision, since it involves resources and work effort.)
 Has the calibrated model been sufficiently validated?

4. Decision-making and calibration
A comparison of six different software programs found that calibration differences of 13 percent in predicted freeway speeds for existing conditions grew to differences of 69 percent in forecasted freeway speeds for future conditions.


Known limitations of popular traffic analysis models


 Roads with driveways
 Over-saturated conditions
 Roads that have more than zero crashes
 Roads that have two-way left-turn lanes
 Tight diamond interchanges
 Roads with bicycles
 Roads with on-street parking
 Roads with commercial vehicle loading
 Roads that function in inclement weather
 Roads with a roadside environment that may impact drivers in any way

Sensitivity to changes in parameters is often unknown and could result in very large changes in model output!

Variables accounted for in the calibration process may react unusually under different conditions!

Decision-making and Model Calibration


 When attempting to model the “real world”, an uncalibrated model is usually meaningless
 How a model is calibrated should be agreed upon before work is started.
 Calibration and validation should be extensively documented
 Properly calibrating a poor model doesn’t make the results acceptable
 Poorly calibrating a superior model doesn’t make the results acceptable
 Embrace, manage and disclose uncertainty
 Endeavor to convey uncertainty to decision makers
 Engineering principles and judgment
 Accuracy versus Precision
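The accuracy-versus-precision distinction in the last bullet can be made concrete: accuracy is how close the mean reading is to the true value (bias), while precision is how tightly repeated readings cluster (spread). The readings below are made-up illustrative values.

```python
import statistics

true_value = 50.0
readings = [50.8, 50.9, 51.1, 51.0, 50.7]   # precise but not accurate

bias = statistics.mean(readings) - true_value   # accuracy error (offset)
spread = statistics.stdev(readings)             # precision (repeatability)

print(f"bias = {bias:.2f}, spread = {spread:.2f}")
```

An instrument can show a small spread yet a large bias, which is exactly the case calibration against a known standard is meant to detect and correct.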

