Does accuracy really matter?
What do you think is more important: an instrument that records accurate measurements, or one that records precise measurements? Accuracy and precision both matter whenever you want to measure anything, and to make measurements you need some sort of instrument. Whether it's the balance in your laboratory or a data logger at work, it is crucial to know whether your readings are accurate, precise or, ideally, both, depending on what you need the data for. We all tend to use these two words interchangeably in everyday conversation, perhaps while watching football ('oh, that was an accurate shot') or a documentary about surgery ('that was a very precise incision'), yet they mean quite different things.
So what is the difference between precise and accurate?
Well, the ISO (International Organization for Standardization) uses two terms, "trueness" and "precision", to describe the accuracy of a measurement method: "trueness" refers to the closeness of agreement between the arithmetic mean of a large number of test results and the true or accepted reference value, while "precision" refers to the closeness of agreement between the test results themselves. In other words, precision is the agreement among several determinations of the same quantity: the better the precision, the smaller the differences among the values, showing that the results are highly reproducible. High precision is only achieved with high-quality instruments and careful work.
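To make the distinction concrete, here is a minimal sketch in Python (standard library only) that treats trueness as the gap between the mean of repeated readings and a reference value, and precision as the spread of those readings. The readings and the reference value are made up purely for illustration.

```python
# Trueness: closeness of the mean of repeated results to the reference value.
# Precision: spread (scatter) of the results themselves.
from statistics import mean, stdev

reference = 25.00  # accepted "true" temperature in degC (hypothetical)
readings = [25.04, 24.98, 25.02, 25.01, 24.99]  # repeated measurements (hypothetical)

trueness_error = mean(readings) - reference  # how far the mean sits from the true value
precision = stdev(readings)                  # sample standard deviation of the readings

print(f"mean reading:   {mean(readings):.3f} degC")
print(f"trueness error: {trueness_error:+.3f} degC")
print(f"precision (SD): {precision:.3f} degC")
```

A small trueness error with a large standard deviation would correspond to the "true but imprecise" target below, and vice versa.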
The diagram below shows the differences between precision, trueness and accuracy.
Left target: good precision but poor trueness; centre target: good trueness but poor precision; right target: good trueness and good precision, therefore good accuracy.
And what about accuracy?
Accuracy is the agreement between an experimental value, or the average of several determinations of the value, and an accepted or theoretical ("true") value for a quantity. Accuracy is usually expressed either as a percent difference or in the units of measurement, and it can be positive or negative; the sign shows whether the experimental value is higher or lower than the accepted or theoretical value. For example, a reading of 2.9 to 5.2°C with an accuracy of ±0.1°C. This sign convention does not apply to precision, since all the values being compared are experimental.
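As a quick illustration of the signed percent-difference idea, here is a small Python sketch; the experimental and accepted values are hypothetical.

```python
def percent_difference(experimental: float, accepted: float) -> float:
    """Signed percent difference of an experimental value from an accepted value."""
    return (experimental - accepted) / accepted * 100

print(percent_difference(5.2, 5.0))  # +4.0  -> instrument reads high
print(percent_difference(2.9, 3.0))  # -3.3  -> instrument reads low
```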
The diagram above illustrates the difference between accuracy and precision.
Interestingly, the ISO definitions mean that an accurate measurement has no systematic error and no random error. Essentially, ISO advises that the term "accurate" be used only when a measurement is both true and precise.
If you have high precision (good reproducibility) but poor accuracy, there is usually a systematic error of some sort. This can involve improper calibration or mishandling of a measuring device, albeit very consistent mishandling, which is why proper calibration of instrumentation is essential. During calibration, a record is made of how far the instrument's measurements are from known or true values, and this calibration record is kept so that subsequent readings can be corrected.
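The calibration record described above can be thought of as a correction applied to later readings. The sketch below assumes a simple single-offset model with made-up reference points; real calibration certificates often record multi-point corrections.

```python
# Calibration: compare instrument readings with known reference values,
# record the systematic offset, then use it to correct later readings.
reference_points = [0.0, 25.0, 50.0]     # known "true" values (hypothetical)
instrument_readings = [0.4, 25.5, 50.3]  # what the instrument reported (hypothetical)

# Average systematic offset recorded at calibration time
offset = sum(r - t for r, t in zip(instrument_readings, reference_points)) / len(reference_points)

def corrected(reading: float) -> float:
    """Apply the recorded calibration offset to a raw reading."""
    return reading - offset

print(f"recorded offset: {offset:+.2f}")              # about +0.40
print(f"raw 30.6 -> corrected {corrected(30.6):.2f}")  # about 30.20
```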
Good accuracy with poor precision can result from a combination of careless experimental procedure and luck. However, some quantities are inherently difficult to determine with high precision, such as those involving lab animals. In such cases, the experimenter must make many determinations of the value and average them. The goal of any experimenter should be to achieve both high precision and high accuracy in their work.
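To see why averaging helps, here is a brief simulation using made-up noisy data (Python standard library only): the scatter in the average shrinks roughly with the square root of the number of determinations.

```python
import random
from statistics import mean

random.seed(1)
TRUE_VALUE = 10.0  # the quantity we are trying to determine (hypothetical)

def one_determination() -> float:
    """Simulate one noisy determination of the quantity."""
    return TRUE_VALUE + random.gauss(0, 0.5)

for n in (5, 50, 500):
    avg = mean(one_determination() for _ in range(n))
    print(f"n={n:3d}: average = {avg:.3f}")
# The averages cluster closer and closer to 10.0 as n grows.
```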
Find out more
To discover how our data acquisition solutions can help you accurately monitor your data, visit Data Acquisition, email salesdesk@grantinstruments.com or call +44 (0) 1763 260 811.