How to Calibrate a Micrometer

Outside micrometers provide high-precision dimensional measurements, typically with 0.001 mm or 0.0001 in resolution. Calibration involves verifying accuracy using certified gage blocks at multiple points across the measuring range. Proper calibration ensures micrometers meet manufacturing tolerance requirements.

Required Reference Standards

  • ASME B89.1.13 - Micrometers
  • ISO 3611 - Micrometer specifications
  • Certified gage block set (Grade 0 or better)
  • Optical flat and monochromatic light source (for anvil flatness)

Calibration Procedure

  1. Visual and Functional Inspection

    Inspect the micrometer for damage, wear on anvil faces, and legibility of markings. Verify the thimble rotates smoothly, the lock nut functions, and the ratchet or friction stop engages properly.

  2. Zero Check

    Clean the measuring faces and close the micrometer using the ratchet stop. Record the zero reading. For micrometers larger than 0-25 mm, the anvils cannot be closed directly; use the supplied setting standard to verify the zero point.

  3. Anvil Flatness and Parallelism

    Using an optical flat and a monochromatic light source, check the flatness of the anvil and spindle faces. Count the interference fringes: no more than one fringe (approximately 0.3 µm, half the wavelength of the light) indicates acceptable flatness. Check parallelism with gage blocks at the minimum and maximum gap.

  4. Accuracy Verification

    Measure certified gage blocks at a minimum of five points evenly distributed across the range. Use the ratchet stop for consistent measuring force. Record the micrometer reading and the gage block certified value at each point.

  5. Repeatability Assessment

    Take ten consecutive measurements of a single gage block near midrange. Calculate the standard deviation. This assesses the combined repeatability of the instrument and operator technique.

  6. Documentation

    Record all as-found readings, errors, and as-left readings on the calibration certificate. Include uncertainty of measurement. Apply calibration label with date, due date, and technician identification.
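The accuracy and repeatability calculations in steps 4 and 5 can be sketched in Python. The gage block values and readings below are illustrative, not from a real certificate.

```python
import statistics

# Hypothetical as-found data for a 0-25 mm micrometer (values in mm).
# Five test points roughly evenly distributed across the range (step 4).
test_points = [
    # (gage block certified value, micrometer reading)
    (2.5,  2.500),
    (7.7,  7.701),
    (12.9, 12.899),
    (17.6, 17.601),
    (25.0, 25.002),
]

# Error at each point = micrometer reading - certified value.
errors = [round(reading - cert, 4) for cert, reading in test_points]
print("Errors (mm):", errors)

# Repeatability (step 5): ten consecutive readings of one block near midrange.
repeat_readings = [12.900, 12.901, 12.899, 12.900, 12.900,
                   12.901, 12.900, 12.899, 12.900, 12.900]
std_dev = statistics.stdev(repeat_readings)  # sample standard deviation
print(f"Repeatability std dev: {std_dev:.5f} mm")
```

Uneven spacing of the test points (as above) helps exercise different thread positions rather than repeating the same screw orientation at each point.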

Acceptance Criteria

Error at any test point must not exceed ±0.002 mm (±0.0001 in) for standard micrometers, or per manufacturer specification. Anvil flatness must not exceed 1 interference fringe. Parallelism error must not exceed 0.002 mm.
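A minimal sketch of this pass/fail check, assuming the ±0.002 mm limit above; the as-found errors are hypothetical examples.

```python
# Illustrative acceptance check for a standard metric micrometer.
TOLERANCE_MM = 0.002  # maximum permissible error at any test point

errors_mm = [0.000, 0.001, -0.001, 0.001, 0.002]  # hypothetical as-found errors

# Every test point must fall within +/- the tolerance.
passed = all(abs(e) <= TOLERANCE_MM for e in errors_mm)
worst = max(errors_mm, key=abs)
print(f"Worst-case error: {worst:+.3f} mm -> {'PASS' if passed else 'FAIL'}")
```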

Typical Calibration Interval

12 months

FAQ

What is the difference between calibrating a 0-25 mm and a 25-50 mm micrometer?

The procedure is the same, but micrometers above 25 mm use a setting standard (reference bar) to establish the zero point instead of closing the anvils directly. The setting standard itself must be calibrated and traceable.

Do I need an optical flat to calibrate a micrometer?

An optical flat check for anvil flatness and parallelism is part of a thorough calibration. If your quality system does not require it, you can perform a dimensional-only calibration using gage blocks, but accredited labs typically include the flatness check.

How do I know if my micrometer needs adjustment or replacement?

If errors are consistent (all readings high or low), the micrometer may need a zero adjustment. If errors are inconsistent or the repeatability is poor, the instrument may have worn threads or spindle issues and should be evaluated for repair or replacement.

Track Micrometer Calibrations Automatically

CalibrationOS tracks due dates, stores certificates, and generates audit-ready reports.
