The permissible range of variation in a measured value or physical dimension, defining the acceptable limits within which a measurement or part is considered conforming.
Tolerance specifies the maximum allowable deviation from a nominal or target value. In manufacturing, tolerances define how much a part dimension can vary and still function correctly. In calibration, tolerances define the acceptable error range for an instrument's readings at each calibration point.
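To make that pass/fail logic concrete, here is a minimal sketch in Python of checking one calibration point against a symmetric tolerance band. The `CalPoint` and `evaluate_point` names are illustrative, not part of any standard or product.

```python
# Minimal sketch: evaluating one calibration point against a tolerance band.
from dataclasses import dataclass

@dataclass
class CalPoint:
    nominal: float    # target value applied by the reference standard
    measured: float   # as-found reading of the unit under test
    tolerance: float  # maximum allowable deviation (± about nominal)

def evaluate_point(p: CalPoint) -> bool:
    """Return True if the reading is within nominal ± tolerance."""
    return abs(p.measured - p.nominal) <= p.tolerance

# Example: a 100 ft-lb point with a ±4 ft-lb tolerance
point = CalPoint(nominal=100.0, measured=102.5, tolerance=4.0)
print("PASS" if evaluate_point(point) else "FAIL (out-of-tolerance)")  # PASS
```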
Tolerances are established based on the requirements of the end application. Tight tolerances demand more precise instruments, more frequent calibration, and often more controlled measurement environments. The relationship between instrument tolerance and the tolerance of the item being measured is governed by the test accuracy ratio (TAR) or test uncertainty ratio (TUR), which should typically be 4:1 or better to ensure reliable pass/fail decisions.
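The ratio check itself is simple arithmetic. The sketch below uses one common reading of the TUR: the tolerance span of the unit under test divided by twice the expanded (k=2) uncertainty of the calibration process. Definitions vary between labs and standards, and the function name is hypothetical.

```python
# Illustrative sketch of a test uncertainty ratio (TUR) check.

def tur(tolerance_lower: float, tolerance_upper: float,
        expanded_uncertainty: float) -> float:
    """Tolerance span divided by the k=2 expanded-uncertainty span."""
    return (tolerance_upper - tolerance_lower) / (2.0 * expanded_uncertainty)

# Example: a ±4 ft-lb tolerance measured with U = 0.5 ft-lb (k=2)
ratio = tur(-4.0, 4.0, 0.5)  # 8.0 / 1.0 = 8.0
print(f"TUR = {ratio:.1f}:1 ->", "adequate" if ratio >= 4.0 else "too low")
```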
In calibration management, tolerances are documented on calibration procedures, certificates, and work instructions. When an instrument's as-found reading falls outside its specified tolerance, it is flagged as out-of-tolerance (OOT), triggering an investigation into the potential impact on products or processes measured since the last known good calibration. Proper tolerance management is essential for ISO/IEC 17025 accreditation and regulatory compliance.
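The impact review behind an OOT flag is essentially a reverse-traceability query: find every measurement the suspect instrument made between its last known good calibration and the date the condition was detected. A hypothetical sketch, with assumed record fields:

```python
# Hypothetical OOT impact review over assumed measurement records.
from datetime import date

measurements = [
    {"product": "SN-1001", "instrument": "TW-042", "date": date(2024, 3, 2)},
    {"product": "SN-1002", "instrument": "TW-042", "date": date(2024, 5, 14)},
    {"product": "SN-1003", "instrument": "TW-099", "date": date(2024, 6, 1)},
]

last_good_cal = date(2024, 4, 1)  # last calibration with in-tolerance as-found
oot_found = date(2024, 7, 1)      # date the OOT condition was detected

suspect = [m for m in measurements
           if m["instrument"] == "TW-042"
           and last_good_cal <= m["date"] <= oot_found]
print([m["product"] for m in suspect])  # ['SN-1002'] -> needs impact review
```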
In aerospace calibration labs, tolerance specifications are critical when calibrating torque wrenches used for aircraft fastener installation. Under an AS9100 quality system, a 100 ft-lb torque wrench might carry a ±4% tolerance, meaning readings between 96 and 104 ft-lb are acceptable; the lab must verify the wrench performs within these limits at multiple points across its range.

In medical device manufacturing, blood pressure monitor calibrators require extremely tight tolerances, typically ±0.3% of reading or ±0.8 mmHg, whichever is greater, to satisfy FDA 21 CFR 820.72. At 200 mmHg, 0.3% of reading is only ±0.6 mmHg, so the greater ±0.8 mmHg limit applies and the reference standard must hold its accuracy within that band.

Getting tolerance specifications wrong leads to serious consequences. One aerospace client received a major audit finding when their calibration lab incorrectly applied a ±2% tolerance to precision pressure transducers that actually required ±0.1% per the manufacturer's specifications, resulting in the acceptance of out-of-tolerance instruments that could have compromised flight safety measurements. Similarly, a medical device company faced FDA observations when their calibration certificates showed 'Pass' results for instruments that exceeded the actual device tolerance requirements, even though they met the calibration standard's broader tolerances.
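Both tolerance forms above reduce to one line of arithmetic: a fixed percent of nominal for the torque wrench, and percent-of-reading with a fixed floor (whichever is greater) for the pressure reference. A small illustrative sketch, with values mirroring the examples and hypothetical helper names:

```python
# Worked sketch of the two tolerance forms from the examples above.

def percent_tolerance(nominal: float, percent: float) -> float:
    """Fixed percent-of-nominal tolerance, e.g. ±4% of 100 ft-lb."""
    return nominal * percent / 100.0

def reading_or_floor(reading: float, percent: float, floor: float) -> float:
    """Percent-of-reading with a fixed floor, whichever is greater."""
    return max(reading * percent / 100.0, floor)

print(percent_tolerance(100.0, 4.0))      # 4.0 -> 96 to 104 ft-lb acceptable
print(reading_or_floor(200.0, 0.3, 0.8))  # 0.8 -> ±0.8 mmHg applies at 200 mmHg
```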
ISO/IEC 17025:2017 addresses tolerance in Section 7.8.6.1, which requires that when a laboratory reports a statement of conformity to a specification, it document the decision rule applied and take measurement uncertainty into account. AS9100D Section 7.1.5.2 mandates that measurement equipment accuracy be consistent with measurement requirements and specified tolerances. ISO 13485:2016 Section 7.6 requires medical device manufacturers to ensure measuring equipment is capable of providing valid results within specified tolerances. The GUM (ISO/IEC Guide 98-3) establishes that measurement uncertainty must be evaluated against tolerance limits to determine conformity. ANSI/NCSL Z540.3-2006 Section 4.2 specifies that calibration intervals must consider instrument tolerance and drift characteristics.

Auditors specifically examine whether labs correctly identify and apply customer-specified tolerances rather than calibration standard tolerances, verify that measurement uncertainty is appropriately small relative to the tolerance band (typically a 4:1 or 10:1 ratio), and confirm that calibration certificates clearly state conformity to tolerance requirements. Non-compliance often occurs when labs apply generic tolerance values instead of customer-specific requirements, or fail to distinguish between calibration accuracy and end-use tolerance specifications.
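One way certificates can state conformity while accounting for uncertainty is a guard-banded decision rule in the spirit of ILAC-G8: the acceptance zone is shrunk by the expanded uncertainty U, and results that straddle a limit are reported as indeterminate rather than passed. A hedged sketch, with illustrative names:

```python
# Sketch of a simple guard-banded decision rule (acceptance zone reduced by U).

def conformity(measured: float, lower: float, upper: float, U: float) -> str:
    """Classify a result against tolerance limits with a guard band of U."""
    if lower + U <= measured <= upper - U:
        return "pass"           # inside the guarded acceptance zone
    if measured < lower - U or measured > upper + U:
        return "fail"           # clearly outside even allowing for uncertainty
    return "indeterminate"      # result ± U straddles a tolerance limit

print(conformity(102.5, 96.0, 104.0, U=1.0))  # 'pass'
print(conformity(103.8, 96.0, 104.0, U=1.0))  # 'indeterminate'
```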
CalibrationOS handles tolerance management through its Specification Management module, which stores manufacturer tolerances, customer-specific tolerance requirements, and applicable standards for each instrument type. The system automatically compares measured values against defined tolerance bands during calibration data entry, flagging out-of-tolerance conditions with visual indicators and mandatory corrective action prompts. The Certificate Generation engine incorporates tolerance specifications directly into calibration certificates, clearly stating whether instruments conform to specified limits and displaying measurement uncertainty relative to tolerance bands. During audit preparation, the Compliance Dashboard provides tolerance trend analysis and identifies instruments approaching tolerance limits, enabling proactive maintenance scheduling. The system maintains a tolerance specification library linked to industry standards (AS9100, ISO 13485, IATF 16949), ensuring consistent application across different customer requirements. Reports can be generated showing tolerance compliance statistics and uncertainty-to-tolerance ratios, providing auditors with clear evidence of proper tolerance management and helping labs demonstrate conformity to regulatory requirements during assessments.
Tolerance in calibration is the permissible range of error for an instrument's readings. If a measurement falls within the tolerance, the instrument passes calibration; if it falls outside, the instrument is out-of-tolerance.
Calibration tolerances are typically set by the instrument manufacturer, industry standards, or the end-use application requirements. They must account for the accuracy needed to make reliable measurements within the process tolerance.
This article is licensed CC BY-SA 4.0. Share, adapt, and reuse with attribution to calibrationos.com/glossary/tolerance.