An electronic test instrument that measures multiple electrical quantities, typically including voltage (AC/DC), current (AC/DC), and resistance, with optional functions such as frequency, capacitance, and temperature.
Multimeters are the most widely used electronic test instruments, found in every electrical and electronics workshop, laboratory, and field service kit. Digital multimeters (DMMs) range from basic handheld units with 3.5-digit resolution for general troubleshooting to high-precision bench instruments with 8.5-digit resolution for metrology applications. Key specifications include measurement ranges, accuracy (often expressed as ±(% of reading + % of range)), resolution, input impedance, and frequency response.
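The ±(% of reading + % of range) accuracy expression can be turned into an absolute error band for any reading. A minimal sketch in Python, using illustrative specification figures rather than those of any particular meter:

```python
def dmm_tolerance(reading, pct_of_reading, pct_of_range, range_full_scale):
    """Absolute error band for a spec of +/-(% of reading + % of range)."""
    return reading * pct_of_reading / 100 + range_full_scale * pct_of_range / 100

# Example (hypothetical spec): a bench DMM rated +/-(0.0035% of reading
# + 0.0005% of range) on its 10 V DC range. At a 5 V reading the
# allowed error is 5 * 0.000035 + 10 * 0.000005 = 0.000225 V.
tol = dmm_tolerance(5.0, 0.0035, 0.0005, 10.0)
```

Note that the "% of range" term dominates at the bottom of a range, which is why readings are normally taken on the lowest range that accommodates the value.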
Calibration of multimeters involves applying known electrical stimuli (voltage, current, resistance) and comparing the meter's readings to the applied values. Reference standards include precision voltage sources, current sources, and standard resistors. Calibration covers all functions and ranges, with multiple test points within each range. For AC measurements, calibration at multiple frequencies is necessary because accuracy varies with frequency. High-performance DMMs may also require calibration of auxiliary functions such as frequency measurement, capacitance, and temperature.
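The comparison at each calibration point reduces to computing the meter's error against the applied reference value and checking it against the allowed tolerance. A minimal sketch (the function name and record layout are illustrative):

```python
def verify_point(applied, reading, tolerance):
    """Compare a DMM reading against the value applied by the reference
    standard and record whether it falls within the allowed tolerance."""
    error = reading - applied
    return {
        "applied": applied,
        "reading": reading,
        "error": error,
        "in_tolerance": abs(error) <= tolerance,
    }

# Example: 10 V DC applied from a reference source, meter reads 10.0004 V,
# allowed error band is +/-0.0006 V, so the point passes.
result = verify_point(10.0, 10.0004, 0.0006)
```

In practice the error at each point is also compared against the measurement uncertainty of the reference standard, not just the meter's specification.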
In calibration management, multimeters span a wide range of calibration complexity. Basic handheld meters may have 20-30 calibration points, while a high-accuracy bench DMM can have hundreds of points across all functions, ranges, and frequencies. Calibration intervals are typically 12 months for production and laboratory use. Many modern DMMs support automated calibration, in which the calibration system communicates with the meter digitally to step through all test points and record results. This automation is essential for efficiently calibrating large fleets of DMMs.
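An automated run of this kind amounts to iterating over a table of test points, commanding the stimulus source, querying the meter, and logging pass/fail results. The sketch below uses simulated stand-in objects for the calibrator and the meter under test; a real system would drive both instruments remotely (for example over SCPI):

```python
class SimulatedSource:
    """Stand-in for a programmable calibrator (hypothetical)."""
    def output(self, function, value):
        self.value = value  # pretend to apply the stimulus

class SimulatedMeter:
    """Stand-in for the DMM under test (hypothetical), with a
    small simulated gain error so readings differ from applied values."""
    def __init__(self, source):
        self.source = source
    def measure(self, function, rng):
        return self.source.value * 1.00001

def run_calibration(source, meter, points, tol_fn):
    """Step through all test points, recording reading, error, and verdict."""
    results = []
    for function, rng, applied in points:
        source.output(function, applied)
        reading = meter.measure(function, rng)
        error = reading - applied
        results.append((function, rng, applied, reading,
                        abs(error) <= tol_fn(applied, rng)))
    return results

# Illustrative test-point table: three DC-voltage points on the 10 V range.
points = [("VDC", 10.0, v) for v in (1.0, 5.0, 10.0)]
# Illustrative spec: +/-(0.005% of reading + 0.001% of range).
tol = lambda applied, rng: 0.00005 * applied + 0.00001 * rng

src = SimulatedSource()
report = run_calibration(src, SimulatedMeter(src), points, tol)
```

A production system would extend the table across every function, range, and (for AC) frequency, which is where the point counts in the hundreds come from.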
A multimeter is calibrated by applying known electrical values (voltage, current, resistance) from traceable reference standards and comparing the meter's readings. All functions and ranges are tested at multiple points.
Multimeters are typically calibrated annually (every 12 months). Critical or high-accuracy applications may warrant shorter intervals, while lightly used meters in stable environments may qualify for extended intervals after establishing a history.