
Multimeter

An electronic test instrument that measures multiple electrical quantities, typically including voltage (AC/DC), current (AC/DC), and resistance, with optional functions such as frequency, capacitance, and temperature.

Multimeters are the most widely used electronic test instruments, found in every electrical and electronics workshop, laboratory, and field service kit. Digital multimeters (DMMs) range from basic handheld units with 3.5-digit resolution for general troubleshooting to high-precision bench instruments with 8.5-digit resolution for metrology applications. Key specifications include measurement ranges, accuracy (often expressed as ±(% of reading + % of range)), resolution, input impedance, and frequency response.
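
As an illustration, the worst-case error bound implied by a ±(% of reading + % of range) accuracy spec can be computed directly. The 0.05%/0.01% figures below are hypothetical, not taken from any particular meter's datasheet:

```python
def dmm_tolerance(reading, range_full_scale, pct_of_reading, pct_of_range):
    """Worst-case error bound for a spec of ±(% of reading + % of range)."""
    return reading * pct_of_reading / 100 + range_full_scale * pct_of_range / 100

# Hypothetical spec: ±(0.05% of reading + 0.01% of range) on the 10 V range
err = dmm_tolerance(5.0, 10.0, 0.05, 0.01)
print(f"Allowed error at 5 V on the 10 V range: ±{err * 1000:.2f} mV")  # ±3.50 mV
```

Note that the "% of range" term dominates near the bottom of a range, which is why readings should be taken on the lowest range that accommodates the signal.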

Calibration of multimeters involves applying known electrical stimuli (voltage, current, resistance) and comparing the meter's readings to the applied values. Reference standards include precision voltage sources, current sources, and standard resistors. Calibration covers all functions and ranges, at multiple points within each range. For AC measurements, calibration at multiple frequencies is necessary because accuracy varies with frequency. High-performance DMMs may also require calibration of auxiliary functions like frequency measurement, capacitance, and temperature.
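
The comparison step can be sketched as checking each as-found reading against the applied reference value and an accuracy spec. The tolerance figures and calibration points below are invented for illustration:

```python
def within_spec(applied, reading, full_scale, pct_rdg=0.05, pct_rng=0.01):
    """Pass/fail against an illustrative ±(% of reading + % of range) spec."""
    limit = applied * pct_rdg / 100 + full_scale * pct_rng / 100
    return abs(reading - applied) <= limit

points = [  # (applied V, as-found reading V, range full scale V) -- made up
    (1.000000, 1.000210, 10.0),
    (5.000000, 5.004900, 10.0),
    (10.00000, 10.00410, 10.0),
]
for applied, reading, fs in points:
    status = "PASS" if within_spec(applied, reading, fs) else "FAIL"
    print(f"{applied:>9.5f} V  error {reading - applied:+.6f} V  {status}")
```

A real procedure would repeat this for every function and range, and for AC functions at each specified frequency.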

In calibration management, multimeters span a wide range of calibration complexity. Basic handheld meters may have 20-30 calibration points, while a high-accuracy bench DMM can have hundreds of points across all functions, ranges, and frequencies. Calibration intervals are typically 12 months for production and laboratory use. Many modern DMMs support automated calibration, in which the calibration system communicates with the meter digitally to step through all test points and record the results. This automation is essential for efficiently calibrating large fleets of DMMs.
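
The automated sequence amounts to a loop over functions, ranges, and stimulus points. A real system would drive the instruments over GPIB or LAN (for example with SCPI commands via a VISA library); here a stub instrument stands in so the control flow is runnable, and all names and values are illustrative:

```python
class StubDMM:
    """Stand-in for a SCPI-controlled DMM; returns simulated readings."""
    def configure(self, function, rng):
        self.function, self.range = function, rng
    def read(self, applied):
        return applied * 1.0001  # simulated 0.01% gain error

# Illustrative test-point table: (function, range, stimulus values)
test_points = [("VOLT:DC", 10.0, [1.0, 5.0, 10.0]),
               ("VOLT:DC", 100.0, [50.0, 100.0])]

dmm = StubDMM()
results = []
for function, rng, stimuli in test_points:
    dmm.configure(function, rng)
    for applied in stimuli:
        reading = dmm.read(applied)
        results.append((function, rng, applied, reading - applied))

for func, rng, applied, err in results:
    print(f"{func} {rng:g} V range, {applied:g} V applied: error {err:+.5f} V")
```

The same loop structure scales from a 30-point handheld procedure to a several-hundred-point bench DMM procedure; only the test-point table grows.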

In Practice

In aerospace calibration labs, multimeters are essential for calibrating the power supplies used in avionics testing, where voltage measurements accurate to ±0.01% support flight-critical systems. A typical scenario involves verifying a Fluke 8588A reference multimeter's DC voltage accuracy from 100 mV to 1000 V using a Fluke 5730A multifunction calibrator. Medical device manufacturers rely on multimeters for calibrating patient monitoring equipment, such as verifying the 5.000 V reference in ECG machines, where measurement errors could affect cardiac diagnosis. A common application uses Keysight 3458A multimeters to calibrate defibrillator charge-voltage circuits, with traceability to NIST standards.

Getting multimeter calibration wrong creates cascading measurement errors. In one audit finding, a defense contractor's uncalibrated Fluke 87V introduced a 2% voltage error that propagated through the entire power supply calibration chain, affecting 47 instruments. The root cause was treating handheld multimeters as 'shop tools' rather than as precision instruments requiring formal calibration. In another case, an out-of-tolerance bench multimeter at a medical device lab caused systematic errors in implantable device testing that were discovered only during an FDA inspection. The result was product recalls, demonstrating why even secondary multimeters need documented calibration with appropriate measurement uncertainty analysis per GUM guidelines.
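
The GUM-style uncertainty analysis mentioned above combines independent standard uncertainty components in quadrature (root-sum-of-squares) and applies a coverage factor for the expanded uncertainty. The component values here are invented purely for illustration:

```python
import math

components_uV = {          # standard uncertainties in microvolts (all made up)
    "reference standard": 8.0,
    "DMM specification":  12.0,
    "temperature effect": 3.0,
    "resolution":         2.9,
}

# Combine independent components in quadrature per GUM, then expand
# with coverage factor k = 2 for approximately 95% coverage.
u_c = math.sqrt(sum(u ** 2 for u in components_uV.values()))
U = 2 * u_c
print(f"Combined standard uncertainty: {u_c:.1f} uV")
print(f"Expanded uncertainty (k=2):    {U:.1f} uV")
```

A full budget would also state each component's probability distribution and sensitivity coefficient; the quadrature sum above assumes the components are uncorrelated.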

Regulatory Context

ISO/IEC 17025:2017 Section 6.4.6 requires calibration of measuring equipment, explicitly including multimeters used as measurement standards in calibration activities; the standard demands documented measurement uncertainty and traceability for all calibration equipment. AS9100D Section 7.1.5.2 mandates calibration of test equipment used in aerospace manufacturing, with multimeters falling under its measuring and monitoring equipment requirements. ISO 13485:2016 Section 7.6 requires medical device manufacturers to calibrate measuring equipment, including multimeters used for electrical safety testing and performance verification.

ANSI/NCSL Z540.3-2006 Section 4.2 specifies calibration requirements for electrical measuring instruments, establishing Test Accuracy Ratio (TAR) requirements, typically 4:1 or 10:1, for multimeter calibrations. ILAC P14:01/2013 policy addresses measurement uncertainty in calibration, requiring multimeter uncertainty budgets to include manufacturer specifications, environmental effects, and long-term stability.

Auditors specifically verify that multimeter calibration certificates show appropriate measurement ranges, environmental conditions during calibration, and compliance with specified TAR requirements. They also examine whether handheld multimeters used only for troubleshooting are properly excluded from calibration requirements while bench instruments used for measurement are appropriately controlled.
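
The TAR check itself is a simple ratio of the unit-under-test tolerance to the calibration standard's uncertainty (or specification) at the same point. The figures below are illustrative:

```python
def tar(uut_tolerance, standard_uncertainty):
    """Test Accuracy Ratio: UUT tolerance over calibration standard uncertainty."""
    return uut_tolerance / standard_uncertainty

# Hypothetical case: handheld DMM with a ±50 mV tolerance at 10 V,
# calibrated against a standard with 2 mV uncertainty at that point.
ratio = tar(0.050, 0.002)
verdict = "meets" if ratio >= 4 else "fails"
print(f"TAR = {ratio:.0f}:1 -> {verdict} the 4:1 requirement")
```

When the ratio falls below the required threshold, labs must either use a better standard or apply guard-banding to manage the false-accept risk.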

How CalibrationOS Handles This

CalibrationOS handles multimeter management through its Electronic Test Equipment module, automatically tracking calibration schedules for bench multimeters (Keysight 3458A, Fluke 8588A) and handheld units (Fluke 87V, Keysight U1272A). The system captures essential calibration data including voltage/current/resistance measurements across all ranges, environmental conditions, and measurement uncertainty calculations. For multimeter calibrations, CalibrationOS automatically generates certificates showing 'As Found' and 'As Left' conditions with statistical analysis of measurement drift. The software integrates with automated calibration systems using SCPI commands to control calibrators like the Fluke 5730A, reducing manual data entry errors.

During audits, the system provides comprehensive reports showing multimeter usage history, calibration intervals, and measurement uncertainty propagation when multimeters are used as reference standards. CalibrationOS automatically flags multimeters approaching their calibration due dates and calculates appropriate Test Accuracy Ratios based on the application requirements, ensuring compliance with ANSI Z540.3 guidelines and supporting ISO 17025 documentation requirements.

Frequently Asked Questions

How is a multimeter calibrated?

A multimeter is calibrated by applying known electrical values (voltage, current, resistance) from traceable reference standards and comparing the meter's readings. All functions and ranges are tested at multiple points.

How often should a multimeter be calibrated?

Multimeters are typically calibrated annually (every 12 months). Critical or high-accuracy applications may warrant shorter intervals, while lightly used meters in stable environments may qualify for extended intervals after establishing a history.


This article is licensed CC BY-SA 4.0. Share, adapt, and reuse with attribution to calibrationos.com/glossary/multimeter.
