Calibration Curves Are Used To Determine The Concentration of Unknown Substances Based On Previous Measurements of Solutions of Known Concentrations
Graph the absorbance versus the calculated known concentrations for all samples.
The known concentration goes on the X axis and the absorbance on the Y axis. It is
best to create the graph in a computer graphing program.
Use the graphing program to calculate the regression line for the graphed points. You
may delete one of the two points for each dilution to get the best regression
line; this is the point of doing each dilution in duplicate. The closer the R^2 value is to
one, the better the regression line. Make note of the regression line equation.
Calculate the concentration of the unknown solution using the regression line
equation. Substitute the unknown's absorbance as Y in the equation and solve
for X, the concentration.
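The procedure above can be sketched in code. This is a minimal example; the concentrations, absorbances, and the unknown's absorbance are all invented for illustration:

```python
import numpy as np

# Hypothetical duplicate-dilution data: known concentrations (e.g. mg/L)
# and their measured absorbances. Values are illustrative only.
conc = np.array([2.0, 2.0, 4.0, 4.0, 8.0, 8.0, 16.0, 16.0])
absorbance = np.array([0.11, 0.10, 0.21, 0.20, 0.40, 0.41, 0.79, 0.81])

# Fit the regression line y = m*x + b (absorbance vs concentration).
m, b = np.polyfit(conc, absorbance, 1)

# R^2 indicates how well the line fits (closer to 1 is better).
pred = m * conc + b
r2 = 1 - np.sum((absorbance - pred) ** 2) / np.sum((absorbance - absorbance.mean()) ** 2)

# Substitute the unknown's absorbance as Y and solve the line for X.
unknown_abs = 0.30
unknown_conc = (unknown_abs - b) / m

print(f"y = {m:.4f}x + {b:.4f}, R^2 = {r2:.4f}")
print(f"Unknown concentration: {unknown_conc:.2f}")
```

With a real data set you would first inspect R^2 and, if needed, drop the worse of each duplicate pair before refitting, exactly as described above.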
The calibration laboratory techniques established for Temprecord's ISO 17025 accreditation make
3-point calibration a standard process when manufacturing all General Use, Scientific and RH data
loggers.
After initial assembly, all General Use, Scientific and RH data loggers are calibrated at -15 °C, 0 °C and
+40 °C, or at -38 °C, 0 °C and +40 °C for low temperature probes, with the option of specified points
such as -80 °C. This process provides an accuracy of within ±0.2 °C (±0.36 °F) across a range of -20
°C to +50 °C (-4 °F to +122 °F), or -40 °C to +50 °C (-40 °F to +122 °F) for low temperature probes.
The graph to the left shows how a typical accuracy curve from an un-calibrated data logger
would look. The straight blue line through the middle represents what true accuracy would
look like. For a data logger showing results like this, we would re-calibrate the data logger's
programming at the three points: -15 °C, 0 °C and +40 °C, or -38 °C, 0 °C and +40 °C for low
temperature probes.
The second graph shows the accuracy curve after the computer has applied the correction values at
each of the three temperature points. As can be seen, this process straightens out the accuracy
curve dramatically, giving the end-user an accurate product across the stated range.
This level of accuracy in the data logger can only be achieved when the data logger’s software has
the ability to have these correction factors applied to it.
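A correction of this kind can be sketched as piecewise-linear interpolation between the three calibration points. The correction offsets below are illustrative assumptions, not Temprecord's actual values:

```python
import numpy as np

# Hypothetical correction offsets measured at the three calibration
# points (reference minus logger reading), in °C. Illustrative only.
cal_points = np.array([-15.0, 0.0, 40.0])
corrections = np.array([0.8, -0.3, 0.5])

def corrected(reading_c: float) -> float:
    """Apply a piecewise-linear correction interpolated between the
    three calibration points (flat extrapolation beyond them)."""
    return reading_c + float(np.interp(reading_c, cal_points, corrections))

for r in (-15.0, 0.0, 20.0):
    print(f"raw {r:+6.1f} °C -> corrected {corrected(r):+6.2f} °C")
```

At the calibration points the stored offsets apply exactly; between them the correction is interpolated, which is what straightens out the accuracy curve.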
Currently, Temprecord is one of the few manufacturers in the world that applies the correction data to
each individual data logger economically.
Some manufacturers do provide calibration certificates, but these certificates only report the
correction factors at the nominated points; they cannot apply the corrections to the instrument itself.
Without this, a calibration certificate only tells you how inaccurate your data logger really is.
What Is Calibration?
Depending on whom you ask, there are many different answers to the
definition/meaning of calibration, but the basic principle remains the same
throughout.
Over time, the quality of measurements of just about every tool will deteriorate
to some extent. Companies need to ensure that these shifts in accuracy are
tracked, and measures are taken to prevent them from affecting final product
quality.
Equipment that operates on certain technologies, or measures shifting
parameters like humidity, temperature and pressure, is more likely to be
affected by a ‘drift’ in accuracy.
In situations where the quality of the measurements is imperative for
maintaining quality, you need to ensure that the instrument is operating within
an acceptable range of error, and calibration is essential for this.
In order to ensure that you can enjoy complete confidence in the
measurements and the output of any piece of equipment, calibration of
instruments needs to be performed on a periodic basis.
Though the exact method used for calibration of equipment depends on the
instrument in question, the procedure typically involves most or all of the
following steps:
One or more test samples or standards with known values, often called
‘calibrators’, are measured using the test instrument.
The results obtained are compared with the actual values, thus establishing a
relationship between known values and the measurement technique.
Using this process, the instrument is, in essence, ‘taught’ to produce more
accurate results than it would otherwise.
After calibration, the instrument can measure unknown samples with greater
precision and accuracy.
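The steps above can be sketched as a simple two-calibrator linear correction. The known and measured values below are illustrative, reusing the mOsm/kg scale from the example later in this article:

```python
# Two calibrators with known reference values are measured, a linear
# correction is derived, and later raw readings are mapped back onto
# the known scale. All numbers are illustrative.
known = [50.0, 800.0]     # calibrator reference values (mOsm/kg)
measured = [52.0, 806.0]  # what the instrument actually read

# Linear map from raw reading back to the known scale.
gain = (known[1] - known[0]) / (measured[1] - measured[0])
offset = known[0] - gain * measured[0]

def calibrate(raw: float) -> float:
    """Convert a raw instrument reading to the corrected value."""
    return gain * raw + offset

for raw in (52.0, 429.0, 806.0):
    print(f"raw {raw:6.1f} -> corrected {calibrate(raw):6.1f}")
```

The instrument has in effect been 'taught' that a raw reading of 52.0 means 50.0; readings between the calibrators are corrected by the same linear map.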
Costs associated with labor and time can rack up quickly, so you need to limit
the number of calibration points accordingly.
The relation between number of calibration points and the resulting
performance might not be linear.
Practically speaking, you’ll need to make tradeoffs between the effort and cost
of calibrations, and the desired performance levels.
Several common mistakes can undermine a calibration. These include:
Using the Wrong Values
The instructions for calibration need to be followed very closely. The calibrator
mentioned in the instructions is the one that the instrument will ‘learn’ from.
Disregarding the documentation and choosing a different one or the wrong
values changes the way the test instrument behaves. This can produce
significant errors within part or all of the instrument's operating range.
Some newer instruments have a built-in software diagnostic
system that can alert operators when the calibrators are tested in the wrong
order (Calibrator-B used before Calibrator-A). However, they may not be
able to detect calibrators that use the wrong values.
Calibrator Formulation Tolerance
Just like your equipment, the quality of the calibrator you use can affect the
results of the calibration. Using calibrators manufactured by reputable and
trustworthy manufacturers or calibration labs, which are built to precise
specifications and tolerances, is essential for obtaining repeatable
performance and dependable results. There is another tolerance
associated with the design and formulation of a control or calibrator, due
to normal inaccuracies and variations in quality control processes and the
instrument itself.
For example, if you're using calibrators whose nominal values
are 50 and 800 mOsm/kg (H2O), and both are manufactured to perform
at the lower end of the required range, the net effect of calibration may be to
lower the accuracy or precision. This would introduce additional errors of
several mOsm/kg over the entire range the calibration was
performed on.
Here’s why:
Suppose one calibrator is assumed to be at its nominal value (800 mOsm/kg),
but its true value is off by just a little, say 796 mOsm/kg. The calibration
process will ‘teach’ the instrument to read 796 incorrectly as 800, so the
actual results curve will sit higher than if the instrument had been calibrated
with the correct value of 796, or if the calibrator had actually been formulated
at 800 mOsm/kg. The resulting curve is well off the assumed one.
Ideally, the resultant calibration curve should be linear, but even small errors
can have a drastic effect.
If the instrument is calibrated in this situation, any measurements made with it
will be inaccurate, until it has been recalibrated with the correct values.
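The 796-vs-800 scenario can be worked through numerically. The sketch below assumes a simple two-point linear calibration with the low calibrator correct and the high calibrator 4 mOsm/kg below its nominal value:

```python
# The low calibrator is correct, but the high calibrator's true value is
# 796 mOsm/kg while the calibration assumes 800. Illustrative numbers.
low_true, high_true = 50.0, 796.0        # what the calibrators really are
low_assumed, high_assumed = 50.0, 800.0  # what the calibration assumes

# The instrument is "taught" to map the true values onto the assumed
# scale, so every later reading is scaled by this erroneous gain.
bad_gain = (high_assumed - low_assumed) / (high_true - low_true)

for true_value in (100.0, 400.0, 796.0):
    reported = low_assumed + bad_gain * (true_value - low_true)
    print(f"true {true_value:6.1f} -> reported {reported:6.1f} "
          f"(error {reported - true_value:+.1f} mOsm/kg)")
```

The error grows across the range and reaches several mOsm/kg near the top, matching the magnitude described above.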
Sample Preparation Technique
As with normal testing, you should always use good techniques for sample
preparation. This is essential for optimizing the resultant performance
through calibration. A situation similar to the one discussed above can result
if the sample itself has not been prepared properly for the calibration.
All of these situations can cause more variations in the results obtained from
the equipment calibration process. The increase in number and scale of the
variations can cause the mean values obtained through calibration to vary
significantly. The result would be that the calibration curve would erroneously
shift and the errors in all the results would increase.
Ambient Temperature Effects
Even when you perform calibration of instruments using the correct values,
reliable calibrators with the correct tolerance and the correct sample
preparation technique, errors can still result from other factors. Environmental
factors, like the temperature of the surroundings, can have a huge impact on
the results of the calibration:
Instruments should be calibrated in an environment where factors that can
affect the performance, like temperature, pressure and humidity, are closest to
those of the surroundings it is operated in.
Variations and differences in operating temperature can affect the
performance of electrical and other components.
Instruments calibrated at one particular temperature, or in fluctuating
temperatures, may be prone to temperature-induced errors if operated in a
significantly different environment. This can degrade the accuracy of the
calibration results.
You can create a history for different instruments by tracking the changes in
measurements of a known value and by comparing the “as-found” and “as-
left” results of each calibration.
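Tracking "as-found" and "as-left" results can be sketched as a simple history table; the dates and readings below are invented for the example:

```python
# Each calibration records the "as-found" reading (before adjustment)
# and the "as-left" reading (after adjustment) for a known reference
# value. Illustrative data only.
reference = 100.0
history = [
    # (date, as_found, as_left)
    ("2023-01-10", 100.4, 100.0),
    ("2023-07-11", 100.9, 100.1),
    ("2024-01-09", 101.6, 100.0),
]

# Drift between calibrations: how far the instrument wandered from its
# previous as-left value by the time it was next checked.
for (_, _, prev_left), (date, found, _) in zip(history, history[1:]):
    print(f"{date}: drift since last calibration = {found - prev_left:+.1f}")
```

A growing drift per interval suggests the calibration interval should be shortened; a stable, small drift may justify lengthening it.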
Of all the things you should consider, the most weight will probably be
given to just how much of an effect the instrument in question has on the
overall quality.
A close second would be the manufacturer’s recommendations and the
instrument’s tendency to ‘drift’ out of calibration.
Recalibration may be warranted after any event that could throw off the
precision or accuracy, such as an electrical fault, a fall, or another impact.
Another time when you may need to perform an unscheduled calibration is
just before a particularly important measurement is made.
Calibrating an instrument each time you plan to use it, just to check its
performance isn’t always practical and it can get very expensive, very quickly.
Control solutions with known values can be tested every day or periodically,
which can provide an indication of the performance and establish a history.
If the results from the control data do not indicate any issues or inaccuracies
in the instrument’s performance, then you can continue using it until the next
scheduled calibration.
However, if it looks like the measurements are close to the limits of, or beyond
the acceptable performance criteria, it might be a good idea to calibrate it.
This is also true of significant short-term shifts (like while operating the
instrument in different environmental conditions).