1 Terminology
Accuracy is the maximum difference between a measured variable and its
true value. It is usually expressed as a percentage of full-scale output. In the strictest
sense, accuracy is never known, because the true value is never really known.
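The following is a minimal sketch of expressing accuracy as a percentage of full-scale output; the sensor range and readings are hypothetical, not taken from the handbook.

```python
# Hypothetical example: measurement error expressed as a percentage of
# full-scale output. The range and readings below are illustrative only.

FULL_SCALE = 100.0  # kPa, assumed full-scale output of a hypothetical sensor

def error_pct_full_scale(measured: float, true_value: float) -> float:
    """Error of one reading as a percentage of full-scale output."""
    return abs(measured - true_value) / FULL_SCALE * 100.0

# A reading of 50.4 kPa against a 50.0 kPa reference standard:
print(error_pct_full_scale(50.4, 50.0))  # 0.4 (% of full scale)
```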
Precision is the difference between a measured variable and the best estimate of its
true value (as obtained from the measured data). It is a
measure of repeatability: precise measurements have small dispersion but may have
poor accuracy if they are not close to the true value. Figure 1a shows the difference
between accuracy and precision.
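As one way to make the dispersion idea concrete, the sketch below quantifies the spread of repeated readings with the sample standard deviation; that choice of statistic is an assumption, since the handbook defines precision only as a measure of repeatability. The readings are hypothetical.

```python
# Illustrative sketch: dispersion of repeated readings about their mean.
# Using the sample standard deviation is an assumed convention here.
import statistics

readings = [50.1, 50.3, 49.9, 50.2, 50.0]  # hypothetical repeated readings, kPa

mean = statistics.mean(readings)
spread = statistics.stdev(readings)  # sample standard deviation

# The readings are precise (small spread), yet could still be inaccurate
# if `mean` lies far from the true value:
print(f"mean = {mean:.2f} kPa, dispersion = {spread:.2f} kPa")
```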
Linearity describes the maximum deviation of the output of an instrument from a best-
fitting straight line through the calibration data. Most instruments are designed so that
the output is a linear function of the input. Linearity depends on the type of straight
line fitted to the calibration data. For example, least-squares linearity is referenced to
the straight line for which the sum of the squares of the residuals is minimized (see
the sketch following Fig. 1). The
Figure 1 Schematics illustrating concepts of (a) accuracy and precision, (b) hysteresis, (c) a static error
band, and (d) fitting a curve to the calibration data.
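Below is a minimal sketch of computing least-squares linearity from calibration data: fit a straight line, take the largest residual, and report it relative to full scale. The calibration points and the full-scale normalization are illustrative assumptions.

```python
# Minimal sketch of least-squares linearity: fit a straight line to
# calibration data and report the maximum deviation from that line.
# The calibration points below are hypothetical.
import numpy as np

x = np.array([0.0, 20.0, 40.0, 60.0, 80.0, 100.0])   # input, e.g., applied kPa
y = np.array([0.02, 1.98, 4.05, 5.97, 8.02, 10.01])  # output, e.g., volts

# Least-squares line y = m*x + b minimizes the sum of squared residuals.
m, b = np.polyfit(x, y, 1)
residuals = y - (m * x + b)

# Maximum deviation from the fitted line, expressed as a percentage of
# full-scale output (an assumed convention for this example).
full_scale = y.max() - y.min()
linearity_pct_fs = np.abs(residuals).max() / full_scale * 100.0
print(f"least-squares linearity = {linearity_pct_fs:.2f}% of full scale")
```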