60 terms defined for calibration, metrology, and quality professionals.
Accuracy: The closeness of agreement between a measured value and the true or accepted reference value. Accuracy reflects how correctly an instrument reports the actual quantity being measured.
Adjustment: The operation of bringing a measuring instrument into a state of performance suitable for its use, typically by correcting bias, zero offset, or span errors identified during calibration.
Bias: The systematic error in a measurement system, representing the consistent difference between the average of measured values and the accepted reference value.
Calibration Interval: The time period or usage interval between scheduled calibrations of a measurement instrument. Intervals balance the risk of out-of-tolerance operation against calibration cost and instrument downtime.
Drift: The gradual, systematic change in an instrument's measurement characteristics over time, separate from any sudden shifts. Drift causes readings to slowly deviate from the calibrated state.
Linearity: The consistency of bias across the full operating range of a measurement instrument. An instrument with good linearity has the same amount of error at every point in its range.
Measurement Uncertainty: A parameter associated with the result of a measurement that characterizes the dispersion of values that could reasonably be attributed to the measurand.
Metrological Traceability: The property of a measurement result whereby it can be related to a stated reference through an unbroken chain of calibrations, each contributing to the measurement uncertainty.
Precision: The closeness of agreement between independent measurements obtained under stipulated conditions. Precision quantifies the spread or dispersion of repeated measurement results.
Repeatability: The closeness of agreement between results of successive measurements of the same measurand carried out under identical conditions (same operator, same instrument, same location, short time period).
Reproducibility: The closeness of agreement between results of measurements of the same measurand carried out under changed conditions, such as different operators, instruments, locations, or time periods.
Resolution: The smallest change in a quantity being measured that causes a perceptible change in the instrument's indication. Resolution represents the finest increment an instrument can detect or display.
Stability: The ability of a measuring instrument to maintain its metrological characteristics (accuracy, bias, precision) constant over time. Stable instruments retain their calibration longer.
Tolerance: The permissible range of variation in a measured value or physical dimension, defining the acceptable limits within which a measurement or part is considered conforming.
Verification: The process of confirming, through objective evidence, that specified requirements have been fulfilled. In metrology, verification checks whether an instrument meets its stated accuracy specifications.
Accreditation: Formal recognition by an authoritative body that a calibration or testing laboratory is competent to perform specific types of measurements, typically assessed against ISO/IEC 17025.
Calibration and Measurement Capability (CMC): The smallest measurement uncertainty that a laboratory can achieve within its scope of accreditation when performing routine calibrations of nearly ideal measurement standards.
Decision Rules: Documented rules that describe how measurement uncertainty is accounted for when making conformity statements (pass/fail decisions) about calibration results.
Guard Banding: The practice of tightening acceptance limits relative to the specified tolerance to account for measurement uncertainty and reduce the risk of false accept decisions.
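As a sketch, the simplest guard-banding rule (shrinking each tolerance limit by the expanded uncertainty U) can be written in a few lines. The tolerance values and U below are invented for illustration; real decision rules vary and should be taken from the applicable procedure.

```python
# Simple guard-banding sketch: shrink the acceptance limits by the
# expanded uncertainty U so a passing reading is unlikely to be a
# false accept. All numbers are illustrative, not from any standard.

def guard_banded_limits(lower_tol, upper_tol, expanded_uncertainty):
    """Tighten each tolerance limit inward by U (simple subtraction rule)."""
    return (lower_tol + expanded_uncertainty,
            upper_tol - expanded_uncertainty)

# A 100.0 +/- 0.5 unit tolerance with U = 0.1 gives acceptance limits
# of roughly 99.6 to 100.4; readings between those limits and the
# tolerance limits fall in the guard band and are flagged for review.
accept_lo, accept_hi = guard_banded_limits(99.5, 100.5, 0.1)
```

A reading of 100.45 would pass against the bare tolerance but fail against the guard-banded limits, which is exactly the false-accept risk the practice addresses.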
GUM (Guide to the Expression of Uncertainty in Measurement): The internationally recognized guide (JCGM 100) that establishes the framework and rules for evaluating and expressing measurement uncertainty.
ISO/IEC 17025: The international standard specifying general requirements for the competence, impartiality, and consistent operation of testing and calibration laboratories.
Measurement Uncertainty Budget: A systematic accounting of all sources of uncertainty in a measurement process, combining individual contributions to determine the overall measurement uncertainty.
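A minimal budget can be sketched as a root-sum-of-squares (RSS) combination of standard uncertainties, in the GUM style. The component names and values below are invented for illustration only.

```python
import math

# Minimal uncertainty-budget sketch: combine standard uncertainties
# by root-sum-of-squares, then expand with k = 2. Values are invented.
components = {
    "repeatability (Type A)": 0.012,       # std. deviation of the mean
    "reference standard (Type B)": 0.010,  # from its calibration certificate
    "resolution (Type B)": 0.003,          # half a digit / sqrt(3)
}

# Combined standard uncertainty: RSS of the individual contributions.
u_c = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (~95 % confidence).
k = 2
U = k * u_c
```

In a real budget each row would also record its distribution, sensitivity coefficient, and degrees of freedom; the RSS step above is the core of the calculation.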
NIST Traceability: The documented chain of calibrations linking a measurement result to standards maintained by the National Institute of Standards and Technology (NIST), the United States' national measurement institute.
Test Accuracy Ratio (TAR): The ratio of the accuracy (or tolerance) of the unit under test to the accuracy of the reference standard used to calibrate it. TAR is the predecessor to TUR and does not account for measurement uncertainty.
Test Uncertainty Ratio (TUR): The ratio of the tolerance of the unit under test to the expanded uncertainty of the measurement process used to test it. A TUR of 4:1 or higher is generally recommended.
Analytical Balance: A highly sensitive laboratory weighing instrument capable of measuring mass to a resolution of 0.1 mg (0.0001 g) or better, used in chemistry, pharmaceutical, and precision weighing applications.
Caliper: A versatile dimensional measuring instrument used to measure internal and external dimensions, depth, and step features with typical resolutions of 0.01 mm (0.0005 in) to 0.02 mm (0.001 in).
Coordinate Measuring Machine (CMM): A sophisticated metrology system that measures the geometry of physical objects by sensing discrete points on their surfaces using a contact probe or non-contact sensor, computing dimensions in three-dimensional space.
Force Gauge: A handheld or mounted instrument that measures push and pull forces, typically using a strain gauge load cell or spring mechanism, with digital or analog display in units of pounds, newtons, or kilograms-force.
Gage Block: A precision-ground block of metal or ceramic with two parallel, flat measurement surfaces at a precisely known distance apart, used as a reference standard for dimensional calibration.
Hygrometer: An instrument that measures the moisture content or humidity of air or other gases, typically reporting relative humidity (RH), dew point, or absolute humidity.
Load Cell: A force transducer that converts mechanical force or weight into an electrical signal, used in scales, testing machines, and process monitoring applications.
Micrometer: A precision measuring instrument that uses a calibrated screw mechanism to measure dimensions with typical resolutions of 0.001 mm (0.00005 in) and accuracies of ±0.002 mm (±0.0001 in).
Multimeter: An electronic test instrument that measures multiple electrical quantities, typically including voltage (AC/DC), current (AC/DC), and resistance, with optional functions such as frequency, capacitance, and temperature.
Oscilloscope: An electronic test instrument that displays electrical signal waveforms as a function of time, enabling measurement of voltage amplitude, frequency, rise time, and other signal characteristics.
Pressure Gauge: An instrument that measures the pressure of a fluid (gas or liquid) and displays it on a dial, digital readout, or transmits it as an electrical signal. Common types include Bourdon tube, diaphragm, and digital pressure gauges.
Pyrometer: A non-contact temperature measurement instrument that determines the temperature of an object by measuring the thermal radiation it emits, used for high temperatures or moving/inaccessible objects.
Ring Gage: A cylindrical reference standard with a precisely machined bore, used to verify the outside diameter of cylindrical parts, shafts, and plug gages by go/no-go or setting master methods.
Thermocouple: A temperature sensor consisting of two dissimilar metal wires joined at one end, which generates a small voltage proportional to the temperature difference between the junction and the reference point.
Torque Wrench: A tool designed to apply a specific amount of torque to a fastener, with a built-in measurement mechanism that indicates when the preset torque value has been reached.
As-Found Data: The measurement readings recorded during calibration that reflect the instrument's condition as received from service, before any adjustments or corrections are made.
As-Left Data: The measurement readings recorded after calibration adjustments have been completed, documenting the instrument's condition as it is returned to service.
Calibration Adjustment: The process of modifying an instrument's settings, offset, gain, or mechanical configuration to correct errors identified during calibration and bring readings within specified tolerance.
Calibration Certificate: A formal document issued upon completion of calibration that records the instrument identification, calibration results, reference standards used, measurement uncertainty, and the traceability chain.
Calibration Due Date: The date by which an instrument must be recalibrated to maintain its valid calibration status, calculated by adding the calibration interval to the date of the last calibration.
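The due-date arithmetic described above is simple date addition; a short sketch with invented dates and a hypothetical 365-day interval:

```python
from datetime import date, timedelta

# Due date = last calibration date + calibration interval.
# The date and interval below are invented for illustration.
last_cal = date(2024, 3, 15)
interval_days = 365
due_date = last_cal + timedelta(days=interval_days)
```

Calibration-management systems typically also apply a recall lead time ahead of this date so the instrument can be scheduled before it lapses.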
Calibration Label: A physical or electronic tag attached to a calibrated instrument that displays its calibration status, including the calibration date, due date, and identification of the calibrating organization.
Calibration Recall: The process of identifying instruments approaching their calibration due date and initiating actions to ensure they are recalibrated on time, including notifications, scheduling, and logistics.
Out of Tolerance (OOT): The condition where an instrument's as-found calibration readings exceed the specified tolerance limits, indicating it was not performing within acceptable accuracy while in service.
Pass/Fail Criteria: The defined acceptance limits and rules used to determine whether an instrument meets its required specifications during calibration, resulting in a pass (in tolerance) or fail (out of tolerance) determination.
Reverse Traceability: The ability to identify all instruments and measurements affected when a reference standard is found to be out of tolerance, enabling assessment of the downstream impact on calibrated instruments and their measurements.
Control Chart: A graphical tool used in statistical process control that plots data over time with a center line (mean) and upper and lower control limits, used to detect non-random variation in a process.
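A minimal sketch of the control-limit calculation, using invented readings. Real individuals charts usually estimate sigma from the average moving range rather than the plain standard deviation used here to keep the example short.

```python
import statistics

# Control-chart sketch: center line at the mean, control limits at
# +/- 3 sigma. Plain sample stdev is used instead of the usual
# moving-range estimate to keep the sketch short. Data are invented.
readings = [10.02, 9.98, 10.01, 10.03, 9.97, 10.00, 10.02, 9.99]

center = statistics.mean(readings)
sigma = statistics.stdev(readings)
ucl = center + 3 * sigma   # upper control limit
lcl = center - 3 * sigma   # lower control limit

# Points outside the limits signal special-cause variation.
out_of_control = [x for x in readings if not (lcl <= x <= ucl)]
```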
Coverage Factor: A numerical factor (k) by which the combined standard uncertainty is multiplied to obtain the expanded uncertainty, chosen to provide a specified level of confidence (typically k=2 for approximately 95% confidence).
Effective Degrees of Freedom: A calculated value that represents the reliability of the combined standard uncertainty estimate, accounting for the degrees of freedom of each individual uncertainty component, used to determine the appropriate coverage factor.
Expanded Uncertainty: The measurement uncertainty expressed as an interval around the measured value within which the true value is expected to lie with a stated level of confidence, calculated by multiplying the combined standard uncertainty by the coverage factor.
Gage R&R (Gage Repeatability and Reproducibility): A statistical study that quantifies the amount of measurement variation attributable to the measurement system itself, separating it into repeatability (equipment variation) and reproducibility (operator variation) components.
Measurement System Analysis (MSA): A comprehensive evaluation of a measurement process that assesses all sources of variation, including bias, linearity, stability, repeatability, and reproducibility, to determine whether the system is adequate for its intended use.
Process Capability: A statistical measure of how well a process can produce output within specified tolerance limits, expressed as indices such as Cp (potential capability) and Cpk (actual capability considering centering).
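The Cp and Cpk indices mentioned above have simple closed forms; a sketch with invented measurements and hypothetical spec limits:

```python
import statistics

# Process-capability sketch:
#   Cp  = (USL - LSL) / (6 * sigma)            potential capability
#   Cpk = min(USL - mean, mean - LSL) / (3 * sigma)   accounts for centering
# Data and spec limits are invented for illustration.
data = [4.98, 5.01, 5.00, 5.02, 4.99, 5.01, 5.00, 4.99]
lsl, usl = 4.90, 5.10   # hypothetical lower/upper spec limits

mu = statistics.mean(data)
sigma = statistics.stdev(data)

cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
```

Since this invented data set is centered between the limits, Cp and Cpk come out essentially equal; an off-center process would show Cpk < Cp.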
Statistical Process Control (SPC): A quality control methodology that uses statistical methods and control charts to monitor and control a process, distinguishing between normal variation and special-cause variation that requires investigation.
Type A Uncertainty: A method of evaluating measurement uncertainty by statistical analysis of a series of observations, typically calculated as the standard deviation of the mean from repeated measurements.
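The standard-deviation-of-the-mean calculation behind a Type A evaluation is short enough to sketch directly; the readings below are invented.

```python
import math
import statistics

# Type A evaluation sketch: standard uncertainty of the mean from
# repeated observations. Readings are invented for illustration.
readings = [100.02, 100.05, 99.98, 100.01, 100.03, 99.99, 100.02, 100.00]

n = len(readings)
s = statistics.stdev(readings)      # sample standard deviation
u_type_a = s / math.sqrt(n)         # standard deviation of the mean
```

Averaging more readings shrinks this component by 1/sqrt(n), which is why repeat counts appear in calibration procedures.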
Type B Uncertainty: A method of evaluating measurement uncertainty using means other than statistical analysis of observations, including manufacturer specifications, calibration certificates, published data, and scientific judgment.
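A common Type B conversion treats a manufacturer's "±a" accuracy specification as a rectangular (uniform) distribution and divides by sqrt(3) to get a standard uncertainty, per the GUM. The spec value below is invented.

```python
import math

# Type B evaluation sketch: a manufacturer's +/- spec treated as a
# rectangular distribution of half-width a gives u = a / sqrt(3).
# The spec value is invented for illustration.
spec_half_width = 0.05          # e.g. a "+/- 0.05 unit" accuracy claim
u_type_b = spec_half_width / math.sqrt(3)
```

Other distribution assumptions (triangular, normal at a stated confidence) use different divisors, so the certificate or specification wording matters.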