Accuracy

The closeness of agreement between a measured value and the true or accepted reference value. Accuracy reflects how correctly an instrument reports the actual quantity being measured.

Accuracy is one of the most fundamental concepts in metrology and calibration. It describes how close a measurement result is to the true value of the quantity being measured. An instrument with high accuracy produces readings that are very near the accepted reference value, while an instrument with poor accuracy consistently deviates from it.

Accuracy is often confused with precision, but the two are distinct. A measurement system can be precise (producing tightly clustered results) without being accurate (those clustered results may be far from the true value). In calibration, accuracy is evaluated by comparing the instrument's readings against a higher-accuracy reference standard with known traceability to national or international standards such as NIST.
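The accuracy-versus-precision distinction can be made concrete with a small numeric sketch. The gauge names and readings below are hypothetical, chosen so that one instrument is precise but biased and the other is unbiased on average but noisier:

```python
from statistics import mean, stdev

# Hypothetical readings from two gauges measuring a 100.0-unit reference.
reference = 100.0
gauge_a = [104.9, 105.1, 105.0, 104.8, 105.2]  # precise but inaccurate
gauge_b = [99.1, 100.8, 99.6, 100.5, 100.0]    # accurate on average, less precise

def bias(readings, true_value):
    """Systematic error: mean reading minus the reference value (accuracy)."""
    return mean(readings) - true_value

def spread(readings):
    """Random error: standard deviation of repeated readings (precision)."""
    return stdev(readings)

print(f"Gauge A bias: {bias(gauge_a, reference):+.2f}, spread: {spread(gauge_a):.2f}")
print(f"Gauge B bias: {bias(gauge_b, reference):+.2f}, spread: {spread(gauge_b):.2f}")
```

Gauge A's small spread but large bias is exactly the "precise but inaccurate" case described above; calibration against a traceable reference is what reveals the bias.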

In calibration management, maintaining accuracy is critical because measurement errors propagate through manufacturing, quality control, and compliance processes. Organizations must define acceptable accuracy limits for each instrument, calibrate at appropriate intervals, and document results. When an instrument falls outside its specified accuracy range, it is declared out-of-tolerance and may require adjustment or replacement to prevent defective products or invalid test results.

In Practice

In aerospace calibration labs, accuracy is critical when calibrating torque wrenches used on aircraft fasteners. A torque wrench reading 100 N·m when the true value is 105 N·m demonstrates poor accuracy, potentially leading to under-torqued bolts and safety failures. The calibration certificate must report the actual measured values against NIST-traceable standards.

In medical device manufacturing, temperature controllers for sterilization autoclaves require exceptional accuracy. If a controller reads 121°C when the actual temperature is 118°C, sterilization effectiveness is compromised, potentially causing patient infections.

Accuracy errors commonly cause audit findings when calibration data shows systematic bias but labs fail to apply appropriate correction factors or declare instruments out-of-tolerance. For example, a pressure gauge consistently reading 2% high across its range indicates poor accuracy; continuing to use it without correction or replacement violates measurement requirements.

Quality engineers must distinguish accuracy from precision: an instrument can be precise (repeatable) but inaccurate (biased from the true value). Calibration laboratories demonstrate accuracy through comparison with higher-accuracy reference standards, typically maintaining 4:1 or 10:1 test uncertainty ratios per ANSI/NCSL Z540.3 requirements.

Regulatory Context

ISO/IEC 17025:2017 addresses accuracy in sections 6.4.6 and 6.5.2, requiring laboratories to determine measurement uncertainty and ensure equipment accuracy meets intended use. Section 7.8.6.1 mandates that calibration certificates include measurement results and associated uncertainties. AS9100D references accuracy through measurement system requirements in section 7.1.5, demanding documented evidence of measurement accuracy for aerospace applications. ISO 13485:2016 section 7.6 requires medical device manufacturers to demonstrate measurement equipment accuracy through calibration against traceable standards.

The GUM (ISO/IEC Guide 98-3) provides the framework for evaluating accuracy through uncertainty budgets, distinguishing systematic errors (accuracy) from random errors (precision). ANSI/NCSL Z540.3 section 9.2.1 specifies accuracy requirements for calibration standards, typically requiring reference standards to be at least four times more accurate than items under calibration.

During audits, assessors verify accuracy claims by reviewing calibration certificates, uncertainty calculations, and measurement traceability chains. Auditors specifically examine whether laboratories properly account for systematic biases and apply appropriate correction factors when accuracy specifications are not met.

How CalibrationOS Handles This

CalibrationOS captures accuracy data through its Measurement Module, which records actual measured values alongside nominal or target values for each calibration point. The software automatically calculates accuracy metrics including percent error, absolute error, and bias trending across calibration cycles.

The Certificate Generation Module produces calibration certificates that clearly distinguish between accuracy (systematic deviation from true value) and precision (repeatability), ensuring compliance with ISO/IEC 17025 reporting requirements. During audits, the Audit Trail feature provides documented evidence of accuracy assessments, including how instruments performing outside accuracy specifications were handled, whether through correction factors, restricted use, or removal from service.

The Uncertainty Calculator integrates accuracy contributions into overall measurement uncertainty budgets per GUM requirements. CalibrationOS trending algorithms identify systematic accuracy drift over time, enabling predictive maintenance and optimal calibration intervals. The software's Compliance Dashboard flags instruments approaching accuracy limits, ensuring proactive management before specifications are exceeded.
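The per-point error metrics described here (absolute error and percent error) are simple calculations. The sketch below is a generic illustration of them, not CalibrationOS's actual code; the five-point pressure run is hypothetical, constructed to match the "2% high across its range" gauge example given earlier:

```python
def absolute_error(measured: float, nominal: float) -> float:
    """Signed deviation of the reading from the nominal value."""
    return measured - nominal

def percent_error(measured: float, nominal: float) -> float:
    """Signed deviation as a percentage of the nominal value."""
    return (measured - nominal) / nominal * 100.0

# Hypothetical five-point calibration run: (nominal, measured) in psi.
points = [(20.0, 20.4), (40.0, 40.8), (60.0, 61.2), (80.0, 81.6), (100.0, 102.0)]
for nominal, measured in points:
    print(f"{nominal:6.1f} psi: error {absolute_error(measured, nominal):+.2f} "
          f"({percent_error(measured, nominal):+.1f}%)")
```

A constant +2.0% error at every point is the signature of systematic bias rather than random scatter, which is what bias trending across calibration cycles is designed to surface.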

Frequently Asked Questions

What is accuracy in calibration?

Accuracy in calibration refers to how close a measured value is to the true or accepted reference value. It is determined by comparing an instrument's readings against a traceable reference standard during calibration.

How is accuracy different from precision?

Accuracy describes closeness to the true value, while precision describes the consistency of repeated measurements. An instrument can be precise but inaccurate if it consistently reads the same wrong value.

How do you improve measurement accuracy?

Measurement accuracy is improved through regular calibration against traceable standards, environmental controls, proper instrument handling, and applying correction factors when systematic errors are identified.
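One common way to apply a correction factor once a systematic error is identified is to offset future readings by the negative of the mean bias. A minimal sketch, using hypothetical as-found readings against a 121.0 °C reference:

```python
from statistics import mean

def correction_offset(as_found: list[float], reference: float) -> float:
    """Additive correction for future readings: reference minus mean as-found."""
    return reference - mean(as_found)

# Hypothetical as-found readings for a 121.0 °C reference bath.
as_found = [118.2, 117.9, 118.0, 118.1, 117.8]
offset = correction_offset(as_found, 121.0)
corrected = [reading + offset for reading in as_found]

print(f"Correction offset: {offset:+.2f} °C")
print(f"Corrected mean:    {mean(corrected):.2f} °C")
```

Note that an additive offset only removes constant bias; a gain error that grows with the reading (like the 2%-high gauge above) needs a multiplicative or per-point correction instead.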

This article is licensed CC BY-SA 4.0. Share, adapt, and reuse with attribution to calibrationos.com/glossary/accuracy.