How to Calibrate a Coordinate Measuring Machine

Coordinate measuring machines (CMMs) are high-accuracy 3D measurement systems used for dimensional inspection of manufactured parts. Calibration involves verifying volumetric accuracy with calibrated artifacts placed throughout the measurement volume. CMM performance is critical for first-article inspection and statistical process control (SPC).

Required Reference Standards

  • ISO 10360-2 - Acceptance and reverification tests for CMMs used for measuring linear dimensions
  • ASME B89.4.10360.2 - U.S. adoption of the ISO 10360-2 acceptance and reverification tests
  • Calibrated step gauge, ball bar, or ball plate
  • Calibrated test sphere for probing error verification

Calibration Procedure

  1. Environmental Verification

    Record the ambient temperature and verify it is within the manufacturer's specification (typically 20 °C ±1 °C). Check that temperature gradients across the CMM volume do not exceed limits. Verify the air supply pressure for air bearings.

  2. Probing Error Test (PFTU)

    Qualify the probe and measure a calibrated test sphere 25 times per ISO 10360-5. Calculate the form error (range of radial deviations) and the size error (measured diameter minus calibrated diameter). Both must meet manufacturer specifications.
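The form and size error calculations above can be sketched in code. A minimal illustration, assuming the 25 probed points are available as (x, y, z) coordinates; the algebraic least-squares sphere fit shown here is one common approach, not necessarily the association method your CMM software uses:

```python
import numpy as np

def sphere_fit(points):
    """Algebraic least-squares sphere fit: returns (center, radius).

    Solves |p|^2 = 2 c . p + (r^2 - |c|^2) as a linear system.
    """
    p = np.asarray(points, dtype=float)
    a = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(a, b, rcond=None)
    center, c4 = sol[:3], sol[3]
    radius = float(np.sqrt(c4 + center @ center))
    return center, radius

def probing_errors(points, calibrated_diameter):
    """Form error (range of radial deviations) and size error
    (fitted diameter minus calibrated diameter), same units as input."""
    center, radius = sphere_fit(points)
    radial = np.linalg.norm(np.asarray(points, dtype=float) - center, axis=1)
    form_error = float(radial.max() - radial.min())
    size_error = 2.0 * radius - calibrated_diameter
    return form_error, size_error
```

The form error is the spread of radial residuals about the fitted center; the size error compares the fitted diameter with the artifact's calibrated diameter.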

  3. Length Measurement Error Test (E_L)

    Measure a calibrated step gauge or gage block set in at least seven positions and orientations throughout the measurement volume (along each axis, face diagonals, and body diagonals). Record the error at each length.
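A minimal sketch of the bookkeeping for this step, assuming each measurement is recorded as (position label, calibrated length, measured length) in mm; the function names are illustrative:

```python
def length_errors_um(measurements):
    """Length measurement error E = measured - calibrated, in micrometers.

    measurements: iterable of (position_label, calibrated_mm, measured_mm).
    """
    return [(label, (measured - calibrated) * 1000.0)
            for label, calibrated, measured in measurements]

def worst_error_um(measurements):
    """Largest absolute error across all positions and lengths."""
    return max(abs(err) for _, err in length_errors_um(measurements))
```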

  4. Volumetric Performance Assessment

    If using a ball bar, measure the calibrated length in multiple orientations across the volume. Compare the measured length errors against the maximum permissible error (MPE) in the manufacturer's specification, typically expressed as E = (A + L/K) micrometers.
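A hedged sketch of the pass/fail comparison, assuming the specification form E = (A + L/K) µm with L in mm; the constants A = 1.5 µm and K = 333 below are illustrative examples, not values from any particular machine:

```python
def mpe_um(length_mm, a_um, k):
    """Maximum permissible length error E_L,MPE = A + L/K, in micrometers."""
    return a_um + length_mm / k

def within_mpe(measured_error_um, length_mm, a_um=1.5, k=333):
    """True if the observed error lies inside the MPE envelope for that length."""
    return abs(measured_error_um) <= mpe_um(length_mm, a_um, k)
```

With these example constants, a 500 mm length is allowed roughly ±3.0 µm of error.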

  5. Interim Check with Artifacts

    Measure a master part or ring gauge as a functional performance check. Compare results to previous calibrations to identify trends or sudden changes in performance.
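One way to automate the trend comparison is a simple control-limit check against prior interim results. A sketch assuming the history is a list of past measured values for the same master part or ring gauge; the 3-sigma limit is an illustrative choice, not a mandated criterion:

```python
from statistics import mean, stdev

def interim_ok(history, current, sigma_limit=3.0):
    """Flag sudden changes: fail if the current result deviates from the
    historical mean by more than sigma_limit standard deviations."""
    mu, sd = mean(history), stdev(history)
    return abs(current - mu) <= sigma_limit * sd
```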

  6. Documentation and Reporting

    Record all data, environmental conditions, probe configuration, and artifact IDs. Generate the calibration report with a pass/fail assessment against the CMM's MPE specification. Apply a calibration label to the CMM.

Acceptance Criteria

Per ISO 10360-2, the length measurement error must fall within the maximum permissible error (E_L,MPE) stated by the manufacturer, typically expressed as E = (A + L/K) µm, where A and K are manufacturer-specified constants and L is the measured length in mm. The probing form error (PFTU) must likewise meet its specified value.

Typical Calibration Interval

12 months, with monthly interim checks

FAQ

What is the difference between CMM calibration and verification?

Calibration involves determining and correcting the CMM's geometric error map (typically performed by the manufacturer). Verification (per ISO 10360-2) is a performance test that confirms the CMM meets its specified accuracy. Most users perform periodic verification; full recalibration is done less frequently.

How does temperature affect CMM accuracy?

CMMs are specified at 20 °C. Thermal expansion of the scales, the workpiece, and the machine structure all contribute to measurement error at other temperatures. A 1 °C temperature error on a 500 mm steel part introduces approximately 6 µm of error. Temperature compensation can reduce but not eliminate this effect.
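The figure above comes from the linear expansion relation ΔL = α·L·ΔT. A quick sketch; the coefficient α ≈ 11.5 × 10⁻⁶ /°C is an assumed typical value for common steels:

```python
def thermal_error_um(length_mm, delta_t_c, alpha_per_c=11.5e-6):
    """Linear thermal expansion dL = alpha * L * dT, returned in micrometers.

    alpha_per_c: coefficient of thermal expansion (~11.5e-6 /degC for steel).
    """
    return alpha_per_c * length_mm * delta_t_c * 1000.0
```

For a 500 mm steel part with a 1 °C temperature error, this gives about 5.75 µm.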

Can I use a CMM to calibrate other gages?

A CMM can be used to calibrate other dimensional artifacts (ring gages, plug gages, etc.) if its measurement uncertainty for that task is sufficiently low relative to the gage tolerance — typically a 4:1 test uncertainty ratio. The CMM must itself be verified and the measurement procedure validated.
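A minimal sketch of the 4:1 check, using one common TUR convention (tolerance span divided by twice the expanded uncertainty U at k = 2); conventions differ between quality systems, so confirm the definition that applies to you:

```python
def tur(tolerance_span, expanded_uncertainty_k2):
    """Test uncertainty ratio: tolerance span / (2 * U), U expanded at k = 2."""
    return tolerance_span / (2.0 * expanded_uncertainty_k2)

def cmm_adequate(tolerance_span, expanded_uncertainty_k2, required_ratio=4.0):
    """True if the CMM meets the required ratio (commonly 4:1) for this task."""
    return tur(tolerance_span, expanded_uncertainty_k2) >= required_ratio
```

For example, a 0.02 mm tolerance span measured with U = 0.0012 mm gives a TUR of about 8.3, comfortably above 4:1.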

Track Coordinate Measuring Machine Calibrations Automatically

CalibrationOS tracks due dates, stores certificates, and generates audit-ready reports.

Get Started Free