Dealing With Errors

This document discusses how to estimate uncertainties in measurements properly and how to improve measurement quality. It emphasises identifying the largest source of uncertainty, and focusing on instrument properties such as resolution and sensitivity to reduce uncertainties. Graphing data is important for identifying patterns and outliers. Uncertainties express the imprecision of a measurement, while systematic errors are mistakes that need correcting. The document advocates estimating uncertainties conservatively and acknowledging that the estimates are themselves uncertain.


How to deal with uncertainty in measurements

Measurements matter
For a physicist, making good measurements matters. In a hospital, a measurement may detect a serious health problem. In industry, it may ensure that a component fits properly. In research, it may show that an accepted idea needs to be reconsidered. A physicist is always asking 'How might I do better?', and taking action to improve a measurement or to decide how far to trust it. So you should be aiming to:
- develop a sense of pride in measuring as well as possible, given the tools you have, and be clear about how well the job has been done
- become better able to experiment well, and to recognise the limitations of instruments
- become better at handling data, particularly in looking at uncertainty in measurement
- learn to look for important sources of uncertainty and attempt to reduce them
- consider possible systematic errors and try to remove them.

Focus on the instruments


There are two main ways to estimate the uncertainty of a measurement:
- repeat it many times and make an estimate from the variation you get
- look at the process of measurement used, and inspect and test the instruments used.
You should focus mainly on the second way: on the process of measuring and on the qualities of the instruments you have. This points the way to improving your measurement. The main reason for being interested in the quality of a measurement is to see how to do better.

Properties of instruments
The essential qualities and limitations of sensors and measuring instruments are:
- resolution: the smallest detectable change in input
- sensitivity: the ratio of output to input
- stability (repeatability, reproducibility): the extent to which repeated measurements give the same result, including gradual change with time (drift)
- response time: the time interval between a change in input and the corresponding change in output
- zero error: the output for zero input
- noise: variations, which may be random, superimposed on the signal
- calibration: determining the relation between output and true input value, including the linearity of that relationship.
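Two of the properties listed above are directly linked: if sensitivity is the ratio of output to input, then the smallest input change an instrument can detect (its resolution) is the smallest detectable output change divided by the sensitivity. A minimal sketch, with made-up numbers for a hypothetical thermocouple:

```python
def resolution_from_sensitivity(smallest_output_change, sensitivity):
    """Smallest detectable input change = smallest output step / sensitivity."""
    return smallest_output_change / sensitivity

# Hypothetical thermocouple: 40 microvolts of output per degree C of input,
# read by a voltmeter whose display changes in steps of 10 microvolts.
sensitivity = 40.0      # microvolts per degree C (invented value)
smallest_step = 10.0    # microvolts (invented value)

print(resolution_from_sensitivity(smallest_step, sensitivity))  # 0.25
```

So this (invented) combination could not detect a temperature change smaller than about a quarter of a degree, however many digits the display shows.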

Advancing Physics

Estimating uncertainty
The most important source of uncertainty
The best way to improve a measurement is to identify the largest source of uncertainty and take steps to reduce it. Thus the main focus in thinking about uncertainties is identifying and estimating the most important source of uncertainty in a measurement. This can be estimated in several ways:
- from the resolution of the instrument concerned; for example, the readout of a digital instrument ought not to be trusted to better than ±1 in the last digit
- from the stability of the instrument, or by making deliberate small changes in conditions (a tap on the bench, maybe) that might occur anyway, to see what difference they make
- by trying another instrument, even one supposedly identical, to see how the values it gives compare
- from the range of some repeated measurements.
When comparing uncertainties in different quantities, it is the percentage uncertainties that need to be compared, to identify the largest.
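The last point can be sketched in a few lines. The quantities and values below are invented for illustration; the point is that the largest absolute uncertainty is not necessarily the largest percentage uncertainty:

```python
def percent_uncertainty(value, uncertainty):
    """Uncertainty as a percentage of the measured value."""
    return 100.0 * uncertainty / abs(value)

# Invented measurements: (value, absolute uncertainty) in the stated units.
measurements = {
    "length / m": (1.200, 0.005),
    "time / s":   (0.80, 0.02),
    "mass / kg":  (0.2500, 0.0005),
}

for name, (value, u) in measurements.items():
    print(f"{name}: {percent_uncertainty(value, u):.2f}%")
```

Here the time dominates at 2.5%, even though its absolute uncertainty (0.02 s) looks small, so effort to improve the overall result should go into the timing.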

Why results vary


There are different kinds of variation, uncertainty or error:
- inherent variation in the measured quantity (for example, fluctuations in wind speed, or variation in value amongst a set of nominally identical commercial resistors)
- small (maybe random) uncontrollable variations in conditions, including noise, leading to uncertainty
- simple mistakes, for example misreading a scale, or 'one-off' accidental errors, which need to be detected and removed; 'outliers' often turn out to be due to such mistakes
- systematic error or bias: a problem with the design of the experiment, which can be removed only by improving the design or by calculating its likely magnitude and allowing for it; this may show up as an intercept on a suitable graph, prompting you to consider how it arose
- a genuine outlying value, whose departure from the overall variation has some physical cause, which may well be of interest in itself.

'Plot and look'


Given a batch of varying values, it is essential to look at them visually. A simple dot-plot will usually be best, though a histogram is valuable if there are many values to consider. This is the way to see how the values are distributed, to detect possible outlying values, and to estimate the range within which the 'true' value is likely to lie. For the purposes of the Advancing Physics course, an estimate of the uncertainty of a measurement can be given in terms of the range of typical values, excluding outliers. The spread is then:
spread = ½ × range

A good rule of thumb is that a value is likely to be an outlier if it lies more than 2 × spread from the average. However, you should not identify outliers simply by such a rule. There also needs to be a reason why a value departs a long way from the usual run of values. The reason can be anything from a mistake in reading or recording to a real physical difference with an identified physical cause. An outlier is not an anomaly to be got rid of, but a problem to be investigated (see the case study 'A natural nuclear fission reactor' in the Advancing Physics AS student's book).
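The half-range spread and the 2 × spread rule of thumb can be sketched as follows. The data are invented, and estimating each value's spread from the *other* values is one rough way of honouring the text's 'excluding outliers' caveat; remember that a flagged value is something to investigate, not to delete automatically:

```python
def spread(values):
    """Spread = half the range of a batch of values."""
    return (max(values) - min(values)) / 2.0

def flag_outliers(values):
    """Flag values lying more than 2 x spread from the average,
    with average and spread estimated from the remaining values."""
    flagged = []
    for i, v in enumerate(values):
        rest = values[:i] + values[i + 1:]
        mean = sum(rest) / len(rest)
        if abs(v - mean) > 2.0 * spread(rest):
            flagged.append(v)
    return flagged

# Invented batch of repeated readings; 12.5 is the suspicious one.
readings = [9.8, 9.9, 10.0, 10.1, 10.2, 12.5]
print(flag_outliers(readings))  # [12.5]
```

The 12.5 gets flagged because it sits far outside the run of the other five values; the next step is to find out *why* before deciding what to do with it.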

Working with graphs


A good graph lets you see patterns in data that you can't see just by looking at the numbers, so graphs are an essential working tool. A good graph also helps you to communicate your results quickly, effectively and visually, so graphs are an essential presentation tool as well. Always have at least two versions of every graph: one for working on and one for the final presentation. Your working graph needs finely spaced grid lines for accurate plotting and for reading off values (for example, slope and intercept). Your presentation graph needs just enough grid lines to be easy to read. Expect to try several versions of your presentation graph before you find the best form for it.

Working graphs
The job of your working graph is to store information, to let you see patterns in data, and to help you draw conclusions, for example about slope and intercept.
- Whenever possible, 'plot as you go'. This lets you quickly spot mistakes and decide at what intervals to take measurements.
- Plot points with vertical crosses ('plus sign' shape). This most easily lets you get the position on each axis correct.
- Choose a scale for each axis so that the points spread over at least 50% of the axis, but keep the scale simple, to avoid plotting errors. Think about whether or not you need to include the zero values on the axes.
- Label the axes clearly, with quantity and unit.
- Give each point 'uncertainty bars', indicating the range of values within which you believe the true value to lie.
- Obtain more points by making more measurements where a line curves sharply.
- Use the working graph to measure slopes or intercepts, taking account of uncertainties in the values.

Presentation graphs
The job of your presentation graph is to tell the story of the results as clearly and effectively as possible.
- When using a computer to generate graphs, always try several different formats and shapes. Choose the one that most vividly displays the story you want the graph to tell.
- Prefer graph areas that are wider than they are tall ('landscape' rather than 'portrait'). A width-to-height ratio of 3:2 is often good.
- Label each axis with the name and symbol of the relevant quantity, and state the appropriate unit of measurement (e.g. pressure p / kPa).


- Show 'uncertainty bars' for each point.
- Give every graph a caption that conveys the story it tells: for example, 'Spring obeys Hooke's law up to 20% strain', not 'Extension against load for a spring'.
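Reading a slope and an intercept off a working graph can also be checked numerically. A minimal least-squares sketch, with invented load–extension data (all names and values are illustrative, not from the course):

```python
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line y = m*x + c."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    c = mean_y - m * mean_x
    return m, c

loads = [0.0, 1.0, 2.0, 3.0, 4.0]        # N (invented data)
extensions = [0.1, 2.0, 4.1, 6.0, 8.1]   # cm (invented data)

m, c = fit_line(loads, extensions)
print(f"slope = {m:.2f} cm/N, intercept = {c:.2f} cm")
```

A nonzero intercept here (about 0.06 cm) is exactly the kind of thing the earlier section warned about: it may signal a systematic (zero) error worth investigating rather than a real physical effect.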

Uncertainty versus systematic error


Physicists from the National Physical Laboratory insist that it is a bad idea to use the word 'error' for every kind of departure from a 'true' value, as is done in the common expressions 'error analysis' or 'error bar'. Uncertainties are not 'errors'. They express how unsure you are of a result, not a mistake in it. No result can be known with zero uncertainty, even though a few physical quantities are known to many decimal places. So an uncertainty always has the form 'plus or minus something'. On the other hand, mistakes are indeed mistakes, getting the result wrong by some definite amount. Systematic errors also get the result wrong (plus something, or minus something). Errors like these need to be corrected, not to be included in an uncertainty. The problem with systematic error is that you may not know what it is, or even that it exists. The history of physics is littered with examples of undetected systematic errors. The only way to deal with a systematic error is to identify its cause and either calculate it and remove it, or do a better measurement which eliminates or reduces it.
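The distinction can be made concrete in a couple of lines. A systematic error, once identified, is subtracted out of the result; the uncertainty stays attached to the corrected value as a '± something'. The balance reading and values below are invented:

```python
zero_error = 0.03    # g: the balance reads 0.03 g with nothing on the pan
reading = 12.48      # g, as displayed (invented)
uncertainty = 0.01   # g, from the resolution of the balance (invented)

# Correct the systematic error; do NOT fold it into the uncertainty.
corrected = reading - zero_error

print(f"mass = {corrected:.2f} +/- {uncertainty:.2f} g")  # mass = 12.45 +/- 0.01 g
```

Note that the correction shifts the value by a definite amount in a definite direction, whereas the uncertainty is symmetric: that is the difference the NPL physicists are insisting on.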

Pessimism is the best policy


Look at any record of the best efforts physicists have made to measure a fundamental quantity - the speed of light, the charge on an electron, the 'age' of the universe. Time and again, history shows that they have tended to underestimate both uncertainties and systematic errors. Sometimes the uncertainty bars on different measurements do not even overlap. It is therefore wise to overestimate uncertainties rather than underestimate them. It is essential to realise that estimates of uncertainty are themselves very uncertain. So it is never worth giving an uncertainty estimate to more than one significant figure. And it is nearly always best to round up rather than round down.
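The one-significant-figure, round-up rule can be sketched as a small helper. This is an illustrative implementation, not part of the course materials, and like most floating-point code it can be sensitive at exact boundary values:

```python
import math

def round_uncertainty_up(u):
    """Round a positive uncertainty up to one significant figure."""
    if u <= 0:
        raise ValueError("uncertainty must be positive")
    exponent = math.floor(math.log10(u))   # position of the leading digit
    scale = 10.0 ** exponent
    return math.ceil(u / scale) * scale    # round the leading digit up

print(round_uncertainty_up(0.034))  # 0.04
print(round_uncertainty_up(2.3))    # 3.0
```

So an estimated uncertainty of 0.034 is quoted as ±0.04, not ±0.03: when the estimate is itself this uncertain, pessimism costs nothing.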

Case studies: quality of measurement


The Advancing Physics AS student's book ends with some case studies in quality of measurement. Each case study highlights a few key aspects of measurement. Look at them to get a clearer understanding of these key ideas.

The ocean from space
Describes how the height of the ocean surface can be measured to within a few centimetres by radar from a satellite, and how this requires corrections for many systematic errors. Key ideas: resolution; systematic errors; error correction.

Calibrating ultrasound transducers
Describes a new way to calibrate the ultrasound transducers used in body scanning and in physiotherapy, which is both cheaper and more sensitive than the orthodox method. Key ideas: calibration; sensitivity; noise.

The Hubble telescope: the most expensive zero error ever
Tells the story of how an accidental zero error of 1.3 mm in a length measurement cost hundreds of millions of dollars to correct the faulty mirror of the Hubble telescope.

Key ideas: systematic error; zero error.

A natural nuclear fission reactor
Tells the story of the discovery that, some 2 billion years ago, there was a natural nuclear fission reactor in uranium deposits in Western Africa. Key ideas: distribution of values; range; outlying values.

There is a further case study on the CD-ROM:

Replacing mercury thermometers in hospitals
Explains the advantages of new infrared ear thermometers, which respond in less than a second compared with some minutes for mercury thermometers, but which are still not officially sanctioned for use by nurses because of the need for calibration. Key ideas: response time; calibration; drift.
