The permissible range of variation in a measured value or physical dimension, defining the acceptable limits within which a measurement or part is considered conforming.
Tolerance specifies the maximum allowable deviation from a nominal or target value. In manufacturing, tolerances define how much a part dimension can vary and still function correctly. In calibration, tolerances define the acceptable error range for an instrument's readings at each calibration point.
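To make the pass/fail logic concrete, here is a minimal sketch (not from the source; names and values are illustrative) of checking a single calibration point against a symmetric ± tolerance around its nominal value:

```python
# Illustrative pass/fail check for one calibration point.

def within_tolerance(reading: float, nominal: float, tolerance: float) -> bool:
    """Return True if the reading's deviation from nominal is within +/- tolerance."""
    return abs(reading - nominal) <= tolerance

# Example: a 100.00 degC calibration point with a +/-0.25 degC tolerance
print(within_tolerance(reading=100.18, nominal=100.00, tolerance=0.25))  # True  (in tolerance)
print(within_tolerance(reading=100.31, nominal=100.00, tolerance=0.25))  # False (out of tolerance)
```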
Tolerances are established based on the requirements of the end application. Tight tolerances demand more precise instruments, more frequent calibration, and often more controlled measurement environments. The relationship between instrument tolerance and the tolerance of the item being measured is governed by the test accuracy ratio (TAR) or test uncertainty ratio (TUR), which should typically be 4:1 or better to ensure reliable pass/fail decisions.
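The ratio check can be expressed in a few lines. This sketch assumes a common formulation in which TUR is the unit-under-test tolerance divided by the expanded uncertainty of the calibration measurement; exact definitions vary by standard, so treat the formula and values as illustrative:

```python
# Illustrative test uncertainty ratio (TUR) check against the 4:1 guideline.

def test_uncertainty_ratio(uut_tolerance: float, expanded_uncertainty: float) -> float:
    """Ratio of the unit-under-test tolerance to the calibration measurement uncertainty."""
    return uut_tolerance / expanded_uncertainty

tur = test_uncertainty_ratio(uut_tolerance=0.25, expanded_uncertainty=0.05)  # both in degC
print(f"TUR = {tur:.1f}:1 -> {'adequate' if tur >= 4 else 'marginal'}")  # TUR = 5.0:1 -> adequate
```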
In calibration management, tolerances are documented on calibration procedures, certificates, and work instructions. When an instrument's as-found reading falls outside its specified tolerance, it is flagged as out-of-tolerance (OOT), triggering an investigation into the potential impact on products or processes measured since the last known good calibration. Proper tolerance management is essential for ISO 17025 accreditation and regulatory compliance.
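As a rough illustration of how OOT flagging might look in practice, the sketch below (hypothetical data structure, not a prescribed method) scans a set of as-found results and surfaces any points that exceed their tolerance, which would then prompt an impact review:

```python
# Illustrative out-of-tolerance (OOT) screen over as-found calibration results.

calibration_points = [  # (nominal, as_found, tolerance) -- illustrative values
    (0.0, 0.02, 0.10),
    (50.0, 50.06, 0.15),
    (100.0, 100.31, 0.25),
]

oot_points = [
    (nominal, as_found)
    for nominal, as_found, tol in calibration_points
    if abs(as_found - nominal) > tol
]

if oot_points:
    # An OOT result triggers an investigation of products or processes measured
    # since the last known good calibration.
    print("Out-of-tolerance points:", oot_points)
```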
Tolerance in calibration is the permissible range of error for an instrument's readings. If a measurement falls within the tolerance, the instrument passes calibration; if it falls outside, the instrument is out-of-tolerance.
Calibration tolerances are typically set by the instrument manufacturer, industry standards, or the requirements of the end-use application. They must be tight enough, relative to the process tolerance, to support reliable measurement decisions.