Calibration vs. decay measurements in pressure decay leak testers


The question has been raised as to the relationship between “calibration” (the accuracy of the pressure decay leak tester) and “resolution” (the ability of the tester to detect a pressure difference, or “decay”, that occurs during the leak test). Specifically, confusion exists regarding how the “resolution” of the leak test system can be 0.0001 psi even though the instrument’s “calibration” tolerance may be significantly coarser, such as +/- 0.075 psi. The answer lies in the definitions of these two parameters of pressure decay leak testers and an understanding of what happens during a pressure decay leak test.

Determination of leaks in a device using the “pressure decay” method involves pressurizing the device to a predetermined pressure, locking out the pressure supply source, and sensing a pressure change (“decay”) during a predetermined test time. Modern pressure decay measuring instruments such as the TM Electronics Solution Leak Tester are capable of detecting pressure changes as small as 0.0001 psi. This ability to detect a pressure change during the test time is defined as the instrument’s “resolution”. It is not related to the starting pressure of the test (except for the special circumstance in which a specified leakage rate is used as the test criterion). For more detailed information on how the test instrument detects this pressure change, why resolution is limited by external and environmental factors, and the basic physics of the pressure decay leak test, see Appendix A below.
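
To make the sequence above concrete, here is a minimal Python sketch of the pass/fail logic of a pressure decay test, assuming a hypothetical 0.0050 psi reject limit and the 0.0001 psi resolution discussed above; it is not TM Electronics firmware, and real instruments also manage fill, stabilization, and temperature effects that are omitted here.

```python
# Minimal sketch of a pressure decay pass/fail decision (hypothetical numbers).
RESOLUTION_PSI = 0.0001      # smallest decay the instrument can report
REJECT_LIMIT_PSI = 0.0050    # assumed maximum allowable decay for this part

def decay_test(p_start_psi: float, p_end_psi: float) -> str:
    """Compare the pressure drop measured over the test time to the reject limit."""
    decay = p_start_psi - p_end_psi
    # Quantize the decay to the instrument's least significant digit (its resolution).
    decay = round(decay / RESOLUTION_PSI) * RESOLUTION_PSI
    return "REJECT" if decay > REJECT_LIMIT_PSI else "ACCEPT"

# Example: a test that starts at 50.0000 psi and ends at 49.9977 psi.
print(decay_test(50.0000, 49.9977))   # ACCEPT: 0.0023 psi decay is under the limit
```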

Calibration, on the other hand, refers to the accuracy with which the leak tester reproduces the desired test pressure for the leak test. For example, if you are performing a pressure decay leak test using 50 psi as your test pressure, accurate calibration ensures that your test instrument will consistently reproduce that test pressure, in a specific set of units of measure and under a standard set of conditions, every time you test, within a range of variation defined by the instrument’s accuracy. TM Electronics leak test instruments are calibrated against a known “standard” gauge traceable to a standards body such as the National Institute of Standards and Technology (NIST).
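
As a simple numeric illustration (not an actual calibration procedure), the sketch below checks whether the instrument’s reading at the test pressure agrees with a reference gauge within the +/- 0.075 psi accuracy figure used as the example above; the function and values are hypothetical.

```python
# Hypothetical calibration check: compare the instrument's reading at the test
# pressure against a reading from a traceable reference gauge.
CAL_TOLERANCE_PSI = 0.075    # example accuracy band from the text

def within_calibration(instrument_psi: float, reference_psi: float) -> bool:
    """True if the instrument agrees with the reference within the tolerance."""
    return abs(instrument_psi - reference_psi) <= CAL_TOLERANCE_PSI

print(within_calibration(50.03, 50.00))   # True:  0.03 psi error is inside +/- 0.075 psi
print(within_calibration(50.10, 50.00))   # False: 0.10 psi error is outside the band
```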

To summarize, the resolution of the leak test system is defined by the smallest pressure difference (decay) the instrument can consistently detect during a predetermined test time, regardless of the starting pressure of the leak test. The calibration of the instrument is defined as the accuracy of the test pressure applied at the beginning of the leak test. For all practical purposes, these two items are not related; they are separate characteristics of the leak test instrument.

APPENDIX A: Discussion
The determination of leaks in a device using the “pressure decay” method involves pressurizing the device to a predetermined pressure (usually in pounds per square inch, gauge, or psig), locking out the pressure supply source, and waiting some period of time to sense a pressure change. The ability to detect the pressure change is determined by the test instrument’s “resolution”, that is, the least significant measuring digit of the instrument. The resolution of a modern electronic pressure decay tester is a function of the electronic system’s amplification of the basic pressure transducer (the device that converts pressure to an electronic signal), the conversion of the electronic voltage (or current) to a digital signal, and the input/output software, as well as the basic physics of the gas laws related to temperature and the mechanical stability of the system.

Modern pressure decay measuring instruments routinely resolve pressure changes to 0.0001 psi [0.01 mbar]. However, even though greater electronic resolution is achievable, the physics of the actual pressure change limits the repeatability and reliability of the measurement. External factors include mechanical volume change when a load is applied to a device, adiabatic temperature changes in short test cycles, environmental temperature changes over longer test times, and whether the flow regime is laminar or turbulent.
The general gas law, PV = nRT, indicates that both volume and temperature are factors in the measurement.

When time is introduced, the volume leak rate [Q] can be calculated by relating the change of pressure [dP] to a time change [dt]. At constant volume and temperature, the amount of gas lost is proportional to the pressure drop, so referencing that pressure change to a standard (atmospheric) pressure [Pa] reduces the gas law equation to
Q = (dP / Pa) * (V / dt)

The units of measure chosen will then determine the appropriate leakage rate output [sccm,
sccs, Pa-m3/sec].
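
As a hedged illustration of the equation above, the following Python sketch evaluates Q = (dP / Pa) * (V / dt) for assumed values of the test volume and pressure change and converts the result to sccm; the numbers are examples, not measured data.

```python
# Sketch of the leak rate calculation Q = (dP / Pa) * (V / dt).
# V is the total pressurized volume (part plus tubing and fittings) and Pa is
# the reference (atmospheric) pressure used to express the rate at standard
# conditions. All values below are assumed for illustration.
P_ATM_PSIA = 14.7   # reference pressure, psi absolute

def leak_rate_sccm(dp_psi: float, volume_cc: float, dt_s: float) -> float:
    """Return the standard volumetric leak rate in sccm."""
    q_sccs = (dp_psi / P_ATM_PSIA) * (volume_cc / dt_s)   # standard cm^3 per second
    return q_sccs * 60.0                                  # convert to per minute

# Example: a 0.0023 psi decay over a 10 s test in a 50 cm^3 total test volume.
print(round(leak_rate_sccm(0.0023, 50.0, 10.0), 5))       # ~0.04694 sccm
```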

From this equation one observes that the variable measured to determine a leak rate is the pressure difference over a fixed time. The volume of the system and the reference pressure used to reconcile to standard conditions are assumed to be constant during the test time. Since dP is the difference between the starting and ending pressures, the actual leakage rate determination is a function of the test instrument’s ability to differentiate small changes in the electrical signal, not necessarily the absolute value of the signal. The absolute value of the signal is a function of the pressure transducer’s output per unit of pressure, for example 1 volt/psi. The actual pressure being measured is part of the transducer’s specification and its “calibration” to a known pressure standard. Most transducers have a voltage output related to their maximum range, for example 10 volts/10 psi (either gauge pressure or absolute pressure).
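
The scale of the electrical signal involved can be made concrete with the 10 V / 10 psi transducer example above: resolving a 0.0001 psi decay means detecting roughly a 0.1 mV change riding on a signal of several volts. The short sketch below simply restates that arithmetic; the transducer span is the example value from the text, not a specification.

```python
# Arithmetic behind the 10 V / 10 psi transducer example: the decay is a tiny
# *difference* in voltage, independent of the absolute signal level.
SPAN_V = 10.0                                # full-scale transducer output, volts
SPAN_PSI = 10.0                              # full-scale pressure, psi
SENSITIVITY_V_PER_PSI = SPAN_V / SPAN_PSI    # 1 V per psi

decay_psi = 0.0001                           # smallest decay the instrument resolves
print(f"{SENSITIVITY_V_PER_PSI * decay_psi * 1e3:.1f} mV change to resolve")   # 0.1 mV
```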

The “calibration” of an instrument’s primary transducer is related to a known “standard” gauge. These gauges, or comparative measurements, are controlled by a standards body such as the National Institute of Standards and Technology (NIST). The calibration of an instrument therefore refers to the instrument’s ability to reproduce a known pressure, in a specific set of units of measure, relative to a known standard under a standard set of conditions.

The ability to determine a given leakage rate depends on many factors. In the pressure decay instrument, the resolution of the “pressure difference” is the primary factor. Because the measurement of pressure difference combines the transducer’s output voltage, its ability to reproduce that value, and the ability of the instrument’s electronics to consistently reproduce the signal, a series of systematic and random errors are at work. For this reason, a gauge R&R study is usually performed on the instrument and the test system to which it is attached.

The gauge R&R study is best performed on the actual test system with which the instrument will be used. The typical gauge study is performed with a known or predictable leakage device, such as an orifice or micro-adjustable valve, that can be used to challenge the entire test system. The so-called “leak standard” is defined by its operating pressure and its flow value as measured against a standard source. When applied to a leak test system comprised of the instrument, attachment tubes and fittings, and a known non-leaking test part, the standard leak can be repeatedly measured to identify the variation of the system. The system “precision” can then be determined statistically so that performance standards can be established for the test method applied. Good test method validation procedures should also include inter-operator variation and environmental variation. If multiple labs will conduct the test, an inter-lab reproducibility test protocol will capture that variation.
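
A formal gauge R&R study follows a defined protocol with multiple operators and parts, but the core precision calculation can be sketched as follows: repeated readings of a leak standard are reduced to a mean and a standard deviation, from which a repeatability spread can be estimated. The readings below are made-up values used only to show the arithmetic.

```python
# Simplified precision estimate from repeated readings of a leak standard.
# A full gauge R&R study also partitions variation by operator, part, and
# (for multiple labs) site; this only shows the basic repeatability statistics.
from statistics import mean, stdev

readings_sccm = [0.92, 0.95, 0.93, 0.94, 0.96, 0.93, 0.95, 0.94]   # made-up data

avg = mean(readings_sccm)
sd = stdev(readings_sccm)          # sample standard deviation
spread = 6 * sd                    # a common "6 sigma" repeatability spread

print(f"mean = {avg:.3f} sccm, standard deviation = {sd:.4f} sccm")
print(f"approximate repeatability spread (6 * sd) = {spread:.3f} sccm")
```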

Thus, “calibration” and “precision” [repeatability] of a leakage rate or pressure change are two separate and distinct measures in pressure decay leak testing. Calibration refers to the ability of an instrument to read a known pressure relative to a standard measure. The reproduction of a known or estimated leakage rate depends on the physics of the media and the system structure, including the test part. The ability to consistently reproduce a given leak is a function of the system’s precision, based on the instrument and the other parts of the system.