Why calibrate test equipment?

3 May 2006

You are serious about your electrical test instruments. You buy top brands, and you expect them to be accurate. You know some people send their digital instruments to a metrology lab for calibration, and you wonder why.

After all, these are all electronic - there is no meter movement to go out of balance. What do those calibration folks do, anyhow - just change the battery?

These are valid concerns, especially since you cannot use your instrument while it is out for calibration. But, let us consider some other valid concerns. For example, what if an event rendered your instrument less accurate, or maybe even unsafe? What if you are working with tight tolerances and accurate measurement is key to proper operation of expensive processes or safety systems? What if you are trending data for maintenance purposes, and two meters used for the same measurement significantly disagree?

What is calibration?

Many people do a field comparison check of two meters, and call them 'calibrated' if they give the same reading. This is not calibration. It is simply a field check. It can show you if there is a problem, but it cannot show you which meter is right. If both meters are out of calibration by the same amount and in the same direction, it will not show you anything. Nor will it show you any trending - you will not know your instrument is headed for an 'out of cal' condition.

For an effective calibration, the calibration standard must be more accurate than the instrument under test. Most of us have a microwave oven or other appliance that displays the time in hours and minutes. Most of us live in places where we change the clocks at least twice a year, plus again after a power outage. When you set the time on that appliance, what do you use as your reference timepiece? Do you use a clock that displays seconds? You probably set the time on the 'digits-challenged' appliance when the reference clock is at the 'top' of a minute (eg, zero seconds). A metrology lab follows the same philosophy. They see how closely your 'whole minutes' track the correct number of seconds. And they do this at multiple points on the measurement scales.

Calibration typically requires a standard that has at least 10 times the accuracy of the instrument under test. Otherwise, you are calibrating within overlapping tolerances, and the tolerance of your standard can render an 'in cal' instrument 'out of cal,' or vice versa. Let us look at how that works.

Suppose two instruments, A and B, are each specified to measure 100 V to within 1%. At a 100 V input, A reads 99,1 V and B reads 100,9 V; both are within tolerance. But if you use B as your standard and compare A against it, A will appear to be out of tolerance. However, if B is instead accurate to 0,1%, the most B will read at 100 V is 100,1 V. Now if you compare A to B, A is in tolerance. You can also see that A is at the low end of its tolerance range. Adjusting A to bring that reading up will presumably keep A from giving false readings as it experiences normal drift between calibrations.
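The arithmetic behind that example is easy to check. Below is a minimal sketch in Python, using the hypothetical readings from the example above (not figures from any particular meter), showing how A looks when judged against a 1% 'standard' versus a 0,1% standard.

```python
# A minimal sketch of the arithmetic above (hypothetical readings, as in the text).
# Instrument A is specified to +/-1% and reads 99,1 V with a true 100 V applied.
# We judge A first against instrument B (also a +/-1% meter, reading 100,9 V),
# then against a 0,1%-class standard (reading at most 100,1 V).

A_SPEC_PCT = 1.0      # instrument A's accuracy specification
A_READING = 99.1      # what A reads at a true 100 V input

def apparent_error_pct(reading, reference_reading):
    """Error of A as it appears when the reference reading is taken as 'truth'."""
    return 100.0 * (reading - reference_reading) / reference_reading

for label, reference in (("B, a +/-1% meter", 100.9), ("a 0,1% standard", 100.1)):
    err = apparent_error_pct(A_READING, reference)
    verdict = "in tolerance" if abs(err) <= A_SPEC_PCT else "appears out of tolerance"
    print(f"Against {label} ({reference} V): apparent error {err:+.2f}% -> {verdict}")
```

Only against the 0,1% standard does A's apparent error stay inside its 1% specification, which is the practical argument for the 10:1 rule.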

Calibration, in its purest sense, is the comparison of an instrument to a known standard. Proper calibration involves use of a NIST-traceable standard - one that has paperwork showing it compares correctly to a chain of standards going back to a master standard maintained by the National Institute of Standards and Technology.

In practice, calibration includes correction. Usually, when you send an instrument for calibration, you authorise repair to bring the instrument back into calibration if it was 'out of cal.' You will get a report showing how far out of calibration the instrument was before, and how far out it is after. In the minutes and seconds scenario, you would find the calibration error required a correction to keep the device 'dead on,' but the error was well within the tolerances required for the measurements you made since the last calibration.

If the report shows gross calibration errors, you may need to go back to the work you did with that instrument and take new measurements until no errors are evident. You would start with the latest measurements and work your way toward the earliest ones. In nuclear safety-related work, you would have to redo all the measurements made since the previous calibration.
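As a rough illustration only (hypothetical names and values, and ignoring the stricter nuclear safety-related rule where everything is redone), the sketch below captures that decision: compare the as-found error from the report against the tolerance each past measurement actually needed, working from the most recent measurement backwards.

```python
# Hypothetical sketch of the re-check described above: given the as-found error
# from the calibration report, walk back through the measurements made since the
# last calibration (newest first) and flag those whose required tolerance the
# error could have violated.

from dataclasses import dataclass

@dataclass
class Measurement:
    description: str
    required_tolerance_pct: float   # accuracy that particular job needed

def measurements_to_repeat(as_found_error_pct, history_newest_first):
    """Return the past measurements that should be taken again."""
    return [m for m in history_newest_first
            if abs(as_found_error_pct) > m.required_tolerance_pct]

# Example: the report shows the meter came in 1,5% out of calibration.
history = [
    Measurement("protective relay pickup check (last week)", 0.5),
    Measurement("panel feeder voltage survey (two months ago)", 5.0),
]
for m in measurements_to_repeat(1.5, history):
    print("repeat:", m.description)
```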

Causes of calibration problems

What knocks a digital instrument 'out of cal'? First, the major components of test instruments (eg, voltage references, input dividers, current shunts) can simply shift over time. This drift is minor and usually harmless if you keep a good calibration schedule, and it is exactly what routine calibration finds and corrects.

But suppose you drop a current clamp - hard. How do you know that clamp will still measure accurately? You do not. It may well have gross calibration errors. Similarly, exposing a DMM to an overload can throw it off. Some people think this has little effect because the inputs are fused or breaker-protected. But those protection devices may not trip on a transient. Also, a large enough voltage can jump across the input protection device entirely. This is far less likely with higher-quality DMMs, which is one reason they are more cost-effective than less expensive imports.

Calibration frequency

The question is not whether to calibrate - we can see that is a given. The question is when to calibrate. There is no 'one size fits all' answer. Consider these calibration frequencies:

* Manufacturer-recommended calibration interval: Manufacturers' specifications will indicate how often to calibrate their tools, but critical measurements may require different intervals.

* Before a major critical measuring project: Suppose you are taking a plant down for testing that requires highly accurate measurements. Decide which instruments you will use for that testing. Send them out for calibration, then 'lock them down' in storage so they are unused before that test.

* After a major critical measuring project: If you reserved calibrated test instruments for a particular testing operation, send that same equipment for calibration after the testing. When the calibration results come back, you will know whether you can consider that testing complete and reliable.

* After an event: If your instrument took a hit - something knocked out the internal overload or the unit absorbed a particularly sharp impact - send it out for calibration and have the safety integrity checked, as well.

* Per requirements: Some measurement jobs require calibrated, certified test equipment - regardless of the project size. Note that this requirement may not be explicitly stated but simply expected - review the specs before the test.

* Monthly, quarterly, or semi-annually: If you do mostly critical measurements and do them often, a shorter time span between calibrations means less chance of questionable test results.

* Annually: If you do a mix of critical and non-critical measurements, annual calibration tends to strike the right balance between prudence and cost.

* Every two years: If you seldom do critical measurements and have not exposed your meter to an event, calibrating at these longer intervals can be cost-effective.

* Never: If your work requires just gross voltage checks (eg, 'Yep, that is 480 V'), calibration seems like overkill. But what if your instrument is exposed to an event? Calibration allows you to use the instrument with confidence.

One final note

While this article focuses on calibrating DMMs, the same reasoning applies to your other handheld test tools, including process calibrators. Calibration is not a matter of 'fine-tuning' your test instruments. Rather, it ensures you can safely and reliably use instruments to get the accurate test results you need. It is a form of quality assurance. You know the value of testing electrical equipment, or you would not have test instrumentation to begin with. Just as electrical equipment needs testing, so do your test instruments.
