Calibration Curves Are Used To Determine The Concentration of Unknown Substances Based On Previous Measurements of Solutions of Known Concentrations

Calibration curves are used to determine the concentration of unknown substances based on previous measurements of solutions of known concentrations. The precision and accuracy of the measurements depend on the quality of the calibration curve: the better the curve, the more accurate the result. This is a comparison method in which the unknown is compared to a known. Calibration curves are used for many types of measurements on many different instruments; this example uses a spectrophotometer.

Dilute the standard solution to different concentrations. It is typical to prepare a 10-times dilution, a 20-times dilution, a 30-times dilution, or some other stepwise series. Prepare each dilution twice so that all samples are in duplicate.

Calculate the concentrations of the diluted solutions. For example, the concentration of the 10-times dilution is the concentration of the original solution multiplied by 0.10.
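As a minimal sketch (assuming a hypothetical 100 mg/L stock standard, not a value from this procedure), the diluted concentrations can be calculated like this:

    # Sketch: computing diluted standard concentrations (stock value is hypothetical)
    stock_concentration = 100.0  # mg/L, assumed concentration of the original standard

    for factor in (10, 20, 30):
        diluted = stock_concentration / factor  # a 10-times dilution is stock * 0.10
        print(f"{factor}-times dilution: {diluted:.2f} mg/L")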

Read the absorbance of the diluted solutions on the spectrophotometer. Insert a cuvette into the spectrophotometer so that the triangle marking is lined up with the light path. Close the lid of the spectrophotometer and press the zero button. Zero the machine with distilled water every five samples. Once the machine is zeroed, read the samples in the same way; the only difference is that you press enter after closing the lid to get the absorbance. Record these values in a notebook.

Graph the absorbance versus the calculated known concentrations for all samples. The known concentration goes on the X axis and the absorbance on the Y axis. It is best to create the graph in a computer graphing program.

Use the graphing program to calculate the regression line for the graphed points. It is
possible to delete one of the two points for each dilution to get the best regression
line. This is the point of doing each dilution in duplicate. The closer the R^2 value is to
one, the better the regression line. Make note of the regression line equation.
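If a graphing program is not at hand, the same fit can be done in a few lines of Python; the sketch below uses NumPy with hypothetical absorbance readings.

    # Sketch: fitting a linear calibration curve (all values are hypothetical)
    import numpy as np

    concentration = np.array([2.0, 4.0, 6.0, 8.0, 10.0])   # known concentrations, mg/L
    absorbance = np.array([0.11, 0.20, 0.31, 0.39, 0.50])  # measured absorbances

    # Least-squares line: absorbance = slope * concentration + intercept
    slope, intercept = np.polyfit(concentration, absorbance, 1)

    # R^2 close to one indicates a good regression line
    r_squared = np.corrcoef(concentration, absorbance)[0, 1] ** 2

    print(f"y = {slope:.4f}x + {intercept:.4f}, R^2 = {r_squared:.4f}")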

Read the absorbance of the solution of unknown concentration on the spectrophotometer and record this value.

Calculate the concentration of the unknown solution using the regression line equation. Substitute the unknown's absorbance for Y in the equation and solve for X, the concentration.
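Continuing the hedged example above, rearranging y = slope * x + intercept gives x = (y - intercept) / slope:

    # Sketch: solving the calibration equation for the unknown concentration (hypothetical values)
    slope, intercept = 0.0487, 0.012   # from the fitted regression line
    unknown_absorbance = 0.27          # reading for the unknown sample

    unknown_concentration = (unknown_absorbance - intercept) / slope
    print(f"Unknown concentration: {unknown_concentration:.2f} mg/L")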

Master curve: One of a set of theoretical curves, calculated for known models, against which a field curve can be matched. If the two fit very closely, the model is considered to apply reasonably well to the field situation, and the curve is known as the master curve. Master curves were used extensively in electrical resistivity depth sounding, but they are being replaced by microcomputer curve-matching, which is much more accurate and more sensitive to real-life situations.

3-POINT CALIBRATION FOR ACCURACY

The calibration laboratory techniques established for Temprecord ISO17025 accreditation have been developed to make 3-point calibration a standard process when manufacturing all General Use, Scientific and RH data loggers.

After initial assembly, all General Use, Scientific & RH data loggers are calibrated at -15 °C, 0 °C and +40 °C, or at -38 °C, 0 °C and +40 °C for low temperature probes, with the option of specified points such as -80 °C. This process provides an accuracy of within 0.2 °C (0.36 °F) across a range of -20 °C to +50 °C (-4 °F to +122 °F), or -40 °C to +50 °C (-40 °F to +122 °F) for low temperature probes.

The graph to the left shows how a typical accuracy curve looks for an un-calibrated data logger. The straight blue line through the middle represents what true accuracy would be expected to look like.

From a data logger showing results like this, we would re-calibrate the data logger's programming at the three points: -15 °C, 0 °C and +40 °C, or -38 °C, 0 °C and +40 °C for low temperature probes.

The second graph shows the accuracy curve after the computer has applied the correction values at each of the three temperature points. As can be seen, this process straightens out the accuracy curve dramatically, giving the end-user an accurate product across the stated range.
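As an illustration only (not Temprecord's published algorithm), applying three-point correction factors can be sketched as piecewise-linear interpolation between the errors measured at the calibration points:

    # Sketch: three-point correction via piecewise-linear interpolation (errors are hypothetical)
    import numpy as np

    cal_points = np.array([-15.0, 0.0, 40.0])       # calibration temperatures, deg C
    measured_error = np.array([0.35, 0.10, -0.25])  # logger reading minus reference, deg C

    def corrected(reading):
        # Estimate the error at this reading and subtract it from the raw value
        return reading - np.interp(reading, cal_points, measured_error)

    print(corrected(25.0))  # corrected value for a hypothetical raw reading of 25 deg C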

This level of accuracy in the data logger can only be achieved when the data logger's software is able to have these correction factors applied to it.
Currently, Temprecord is one of the few manufacturers in the world that applies the correction data to the individual data logger economically.
Some manufacturers do provide calibration certificates, but these certificates only provide the correction factors at the nominated points; the corrections are not applied to the logger itself.
Without this, a calibration certificate is only telling you how inaccurate your data logger really is.

What Is Calibration?
Depending on whom you ask, there are many different answers to the
definition/meaning of calibration, but the basic principle remains the same
throughout.

To ensure that measurements being made or output provided by equipment are accurate, they need to be compared against a reference that is known to be accurate. This is exactly what the process of calibration achieves: a comparison between measurements of known and unknown accuracy or precision.
When calibration is being done, the equipment with unknown performance is
called the ‘unit under test’ or ‘test instrument’; and the other is called the
‘calibration standard’ or simply ‘standard’.

Why Is Calibration So Important?


Calibration is used to define the quality of measurement parameters, like accuracy, range or precision, recorded by a piece of equipment. It's a necessary part of processes like manufacturing, testing, and quality assurance, which form the backbone of a wide range of industries and sectors.

Here's an overview of why calibration is important for practically every industry (especially those that are heavily regulated by authorities like the FDA) at some point or another:

• Over time, the quality of measurements of just about every tool will deteriorate to some extent. Companies need to ensure that these shifts in accuracy are tracked, and that measures are taken to prevent them from affecting final product quality.
• Equipment that operates on certain technologies, or measures shifting parameters like humidity, temperature and pressure, is more likely to be affected by a 'drift' in accuracy.
• In situations where measurement quality is imperative, you need to ensure that the instrument is operating within an acceptable range of error, and calibration is essential for this.
• To ensure complete confidence in the measurements and the output of any piece of equipment, calibration of instruments needs to be performed on a periodic basis.

What is Equipment/Instrument Calibration & What Does It Do?
Instrument calibration is amongst the primary (and often the most crucial)
methods of checking and maintaining the quality of measurements made by
instruments. The measurements are checked and compared, and the
instrument is configured to provide results which are within an acceptable
accuracy, precision, and repeatability range.
Minimizing, or altogether eliminating, factors that could cause inconsistencies and errors is a fundamental part of instrument design philosophy. There are
various service providers you could turn to for your instrument calibration
requirements. These often offer specialized services like laboratory calibration
services or pipette calibration services, which are customized to the needs
and requirements of various industries.

Though the exact method used for calibration of equipment depends on the
instrument in question, the procedure typically involves most or all of the
following steps:

• One or more test samples or standards with known values, often called 'calibrators', are measured using the test instrument.
• The results obtained are compared with the actual values, thus establishing a relationship between the known values and the measurement technique.
• Using this process, the instrument is, in essence, 'taught' to produce more accurate results than it would otherwise.
• Post calibration, the instrument can measure unknown samples with higher reliability of precision and accuracy (a simplified sketch of this comparison-and-correction idea follows this list).
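As a rough, hedged sketch only (a hypothetical two-point gain and offset adjustment, not any particular instrument's procedure), the 'teaching' step can be pictured like this:

    # Sketch: a hypothetical two-point calibration (gain and offset correction)
    low_ref, high_ref = 10.0, 100.0     # known calibrator values
    low_raw, high_raw = 10.8, 98.5      # what the uncalibrated instrument reports

    # Map raw readings onto the reference values with a straight line
    gain = (high_ref - low_ref) / (high_raw - low_raw)
    offset = low_ref - gain * low_raw

    def calibrated(raw_reading):
        # Apply the learned correction to any later measurement
        return gain * raw_reading + offset

    print(calibrated(55.0))  # corrected value for a raw reading of 55.0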

Using known standards of different values, multiple calibrations are performed to establish a better correlation at different stages within the entire operating range of the instrument. While you might want to perform calibrations at many different points to plot an accuracy curve, this may not always be the best choice, since:

• Costs associated with labor and time can rack up quickly, so you need to limit the number of calibration points accordingly.
• The relation between the number of calibration points and the resulting performance might not be linear.
• Practically speaking, you'll need to make tradeoffs between the effort and cost of calibrations, and the desired performance levels.

Instruments tend to perform best when they've been calibrated according to the recommendations of the manufacturer. The performance specifications include intermediate points, which are used for calibrations. The process specified is designed to 'zero-out', i.e. basically eliminate, the inherent errors in the instrument at those points.
What Factors Affect Calibration?
Once you’ve understood the benefits of performing calibration and how critical
it can be for maintaining quality, it’s quite apparent that it should be dealt with
carefully. While designing the calibration procedures and during the actual
calibration, some steps must be taken to eliminate potential error sources that
can degrade the quality of the results.

For extremely sensitive instruments, you may need to take them to a calibration laboratory or other calibration service provider, where they can be calibrated under controlled conditions. There are a number of factors that can affect calibration results otherwise, both during the calibration procedure and afterwards.

These include:
• Using the Wrong Values

The instructions for calibration need to be followed very closely. The calibrator mentioned in the instructions is the one that the instrument will 'learn' from. Disregarding the documentation and choosing a different one, or the wrong values, changes the way the test instrument behaves. This can produce significant errors within some parts or the entire operating range of the instrument. Some of the newer instruments have a built-in software diagnostic system that can alert operators when the order in which calibrators are tested is wrong (Calibrator-B used before Calibrator-A). However, they may not be able to distinguish between calibrators that use the wrong values.
• Calibrator Formulation Tolerance

Just like your equipment, the quality of the calibrator you use can affect the results of the calibration. Using calibrators manufactured by reputable and trustworthy manufacturers or calibration labs, which are built to precise specifications and tolerances, is essential for obtaining repeatable performance and dependable results. There is another tolerance that is associated with the design and formulation of a control or calibrator; it is due to normal inaccuracies and variations in quality control processes and the instrument itself. For example, if you're using calibrators whose nominal values are 50 and 800 mOsm/kg (H2O), and if they're both manufactured to perform at the lower end of the required range, the net effect of calibration may be to lower the accuracy or precision. This would result in additional errors in the range of several mOsm/kg, over the entire range the calibration was performed on.

Here's why (a numeric sketch follows this list):
• If it is assumed that one calibrator is at its nominal value (800 mOsm/kg), but the true value is off by just a tiny bit, say at about 796 mOsm/kg, the resulting curve is well off the assumed result.
• The calibration process will 'teach' the instrument to read 800 incorrectly as 796, so the actual results curve will be higher than if the instrument had been calibrated against the correct value of 796, or if the calibrator had been at the actual required formulation of 800 mOsm/kg.
• Ideally, the resultant calibration curve should be linear, but even small errors can have a drastic effect.
• If the instrument is calibrated in this situation, any measurements made with it will be inaccurate until it has been recalibrated with the correct values.
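As a minimal, hypothetical sketch of that arithmetic (a simple two-point calibration with made-up numbers), a 4 mOsm/kg error in the high calibrator skews every subsequent reading:

    # Sketch: effect of a mislabeled high calibrator on a two-point calibration (hypothetical)
    low_nominal, high_nominal = 50.0, 800.0   # values the instrument is told
    low_true, high_true = 50.0, 796.0         # what the calibrators really are

    # The instrument maps its response to the true materials onto the nominal values
    gain = (high_nominal - low_nominal) / (high_true - low_true)
    offset = low_nominal - gain * low_true

    def reported(true_value):
        # What the mis-calibrated instrument reports for a perfectly measured sample
        return gain * true_value + offset

    print(reported(400.0))  # a true 400 mOsm/kg sample would read roughly 401.9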
• Sample Preparation Technique

As with normal testing, you should always use good techniques for sample preparation. This is essential for optimizing the resultant performance through calibration. A similar situation to the one discussed above can result if the sample itself has not been prepared properly for the calibration.

Good sample preparation techniques can help eliminate a number of sources of possible inaccuracies and contamination in the sample, some of which include:

• Pipetting different volumes of the sample.
• The presence of air bubbles in the sample.
• Inconsistencies resulting from evaporation (which is caused by preparing samples too early).

All of these situations can cause more variations in the results obtained from
the equipment calibration process. The increase in number and scale of the
variations can cause the mean values obtained through calibration to vary
significantly. The result would be that the calibration curve would erroneously
shift and the errors in all the results would increase.
• Ambient Temperature Effects

Even when you perform calibration of instruments using the correct values,
reliable calibrators with the correct tolerance and the correct sample
preparation technique, errors can still result from other factors. Environmental
factors, like the temperature of the surroundings, can have a huge impact on
the results of the calibration:
• Instruments should be calibrated in an environment where the factors that can affect performance, like temperature, pressure and humidity, are closest to those of the surroundings in which they are operated.
• Variations and differences in operating temperature can affect the performance of electrical and other components.
• Instruments calibrated at one particular temperature, or in fluctuating temperatures, may be prone to temperature-induced errors if they are operated in a significantly different environment. This can degrade the accuracy of the calibration results.

How Frequently Should An Instrument Be Calibrated?


There's rarely a set 'perfect' calibration frequency for any instrument, since there are a number of factors that need to be taken into account while designing a calibration regimen. This means that the correct frequency can often only be described as 'as and when it's needed'.

You can create a history for different instruments by tracking the changes in measurements of a known value and by comparing the "as-found" and "as-left" results of each calibration.

• Of all the things you should consider, the most weight will probably be given to just how much of an effect the instrument in question has on overall quality.
• A close second would be the manufacturer's recommendations and the instrument's tendency to 'drift' out of calibration.
• Recalibration may be warranted after any event that could throw off the precision or accuracy, such as an electrical fault, a fall, or other impact.
• Another time when you may need to perform an unscheduled calibration is just before a particularly important measurement is made.
• Calibrating an instrument each time you plan to use it, just to check its performance, isn't always practical, and it can get very expensive very quickly.
• Control solutions with known values can be tested every day or periodically, which can provide an indication of the performance and establish a history.
• If the results from the control data do not indicate any issues or inaccuracies in the instrument's performance, then you can continue using it until the next scheduled calibration.

Slight variations between measurements are to be expected, and will occur despite your best precautions. As long as they fall within the limits of the acceptable range for errors, there's no reason to perform an unscheduled re-calibration.

However, if it looks like the measurements are close to the limits of, or beyond
the acceptable performance criteria, it might be a good idea to calibrate it.
This is also true of significant short-term shifts (like while operating the
instrument in different environmental conditions).

When you're designing a calibration routine, it's extremely important to take into account any regulations governing your field of operations. Check the requirements of quality compliance organizations, as well as specific standard operating procedures for laboratories and government regulatory authorities.

These may require instruments to be re-calibrated even if there's no evidence that recalibration is needed. The requirements should be followed nonetheless and should always be given precedence over all else. Any issued guidelines can also serve as a useful reference, especially if you're unsure about whether an instrument needs to be calibrated to improve accuracy.
