Due to the high-spec nature of many vacuum applications, there are several aspects of such systems which are beyond compromise. These include the requirements for highly engineered pumping units and vessels; ultra-accurate means of measuring and controlling flows and pressures; and tightly sealed systems which do not leak.
What’s a leak between friends?
The detection of leaks in both pressurised and vacuum systems, as well as their elimination, management and/or accountability, is a serious business, yet it is often treated as a trivial matter, which it most certainly is not.
But what exactly is a leak? A leak is a small hole in one or more parts of the system that allows the uncontrolled entry or exit of gas. The leak rate depends on several factors, including the size of the hole or holes, the gas type, and the pressure differential between the inside of the system and the outside.
The leak rate describes the magnitude of the leak in terms of the amount of gas that passes out of the system per unit time.
There are several reasons why a system may fail to maintain its vacuum levels, including outgassing or contamination. Furthermore, different vacuum processes and applications call for different leak rate requirements: what is acceptable at a lower vacuum would be considered utterly unacceptable (and possibly dangerous) at a higher vacuum level.
Reducing or eliminating leaks is important for a number of reasons, including:
- operator safety (e.g. preventing the escape of toxic gases/fluids)
- product safety (e.g. to stop air from entering a system where it may significantly contribute to the formation of an explosive mixture)
- to ensure and maintain the pressure/vacuum
- to ensure a long service life for products
- for environmental and quality standards
- and for process efficiency
Despite such reasons, it must be accepted that no system can ever be absolutely vacuum-tight, and indeed it need not be; the leakage just needs to be manageable, or at least low enough, that the operating pressure, gas balance and the ability to reach and maintain a final/ultimate pressure are not overly affected.
With regard to leaks, one needs to distinguish between the two types: (i) where the direction of gas/fluid flow is into the vessel (known as the “outside-in leak”) and (ii) where the gas/fluid passes from inside the test specimen outwards (known as the “inside-out leak”). Furthermore, there are two aspects of leak technology worth examining: leak detection (i.e. locating the leak) and measuring the integral leak rate of the complete device.
Leaks in a vacuum system are one of the key causes of slow evacuation time. Discover how to identify the root cause of malfunctions and improve the operational capability of your vacuum system in our free eBook:
As with almost every facet of vacuum systems, there is no single method which fulfils every situation and every criterion. This is certainly the case with leak detection, with four main methods being employed: the bubble test; pressure decay test; pressure rise test; and helium sniffer mode/helium vacuum mode tests. These four tests roughly correspond to the “simplistic” bubble test (for low-vacuum pressures), through to the “high-tech” helium tests (for high-vacuum pressures).
The bubble test is best illustrated by placing a punctured bicycle tube under water and marking where the bubbles come from, or by placing washing-up liquid around the joint of an active water/gas pipe and observing whether the liquid froths. Both are reliable ways of detecting a low-pressure leak. The bubble test is employed at vacuum levels down to 10⁻⁴ mbar.
The pump-down test is conducted by evacuating a closed vacuum vessel until a certain pressure is obtained, then closing the pump’s inlet valve. After a pre-determined period of time, the inlet valve is again opened, and the time is recorded for the pump to return the vacuum to the original evacuated level. This process is repeated a number of times. If the time to return the vacuum to the original level remains constant, then a leak is present. If this time period decreases, this indicates reduced gas liberation (outgassing) on the inside of the system (i.e. a “virtual” leak); however, it does not exclude a real leak from also being present.
Alternatively, the pressure rise test is performed by isolating the system once a vacuum level has been achieved and then plotting the vacuum level against time: if a leak is present, the curve will be a straight line. However, if the pressure rise is due to gas liberation from the system walls, the rise will gradually taper off to reach a final, stable value.
In most instances both phenomena occur simultaneously, which makes separating one from the other almost impossible. If the volume of the chamber or item under test is known, then the leak rate can be calculated as the volume multiplied by the measured rise in pressure, divided by the time taken.
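The volume × pressure-rise ÷ time calculation above can be sketched as follows; the function name and the example figures are purely illustrative, not taken from any particular instrument:

```python
def leak_rate(volume_l, pressure_rise_mbar, time_s):
    """Leak rate in mbar*l/s, calculated from a pressure rise test:
    Q = V * delta_p / t."""
    return volume_l * pressure_rise_mbar / time_s

# Example: a 50 litre chamber whose pressure rises by 0.02 mbar in 100 s
q = leak_rate(50.0, 0.02, 100.0)  # 0.01 mbar*l/s
```

Note that this figure is only meaningful once the outgassing contribution has tapered off, as described above.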
The pressure drop test is not dissimilar to the pressure rise test. It is only rarely used to check leaks in vacuum systems, and only when the (positive) gauge pressure does not exceed 1 bar, since the flange connections used in vacuum technology will not tolerate higher pressures.
However, the pressure drop test is frequently employed in tank engineering. Pressure drop tests allow leak rate measurements down to 10⁻⁴ mbar*l/s, but results can be distorted if condensation occurs. As one can see, the pressure drop test is fraught with caveats, but if employed under laboratory conditions it is a good tool for determining both leaks and leak rates.
It must be noted that the only credible method of detecting leaks smaller than 1×10⁻⁶ mbar*l/s is a helium leak detector. A leak rate of 1×10⁻¹² mbar*l/s corresponds to a hole diameter of about 1 Å, which is also the diameter of a helium atom; this is therefore the smallest detectable leak rate. (N.B. A leak rate of 1 mbar*l/s means that the pressure in a 1 litre vessel rises by 1 mbar in one second. To put this into context: a leak rate of < 1×10⁻² mbar*l/s would be classified as “water tight”; < 1×10⁻³ mbar*l/s “vapour tight”; < 1×10⁻⁵ mbar*l/s “oil tight”; < 1×10⁻⁶ mbar*l/s “virus tight”; < 1×10⁻⁷ mbar*l/s “gas tight”; whilst < 1×10⁻¹⁰ mbar*l/s would be classified as “absolute tight”.)
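The tightness classes quoted above can be expressed as a simple lookup, checking the strictest threshold first; the function and its name are a sketch for illustration only, using exactly the thresholds listed in the text:

```python
def tightness_class(leak_rate_mbar_l_s):
    """Map a leak rate (mbar*l/s) to the tightness classes quoted above,
    from strictest to loosest threshold."""
    classes = [
        (1e-10, "absolute tight"),
        (1e-7, "gas tight"),
        (1e-6, "virus tight"),
        (1e-5, "oil tight"),
        (1e-3, "vapour tight"),
        (1e-2, "water tight"),
    ]
    for threshold, label in classes:
        if leak_rate_mbar_l_s < threshold:
            return label
    return "not tight"

print(tightness_class(5e-8))  # "gas tight"
```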
Fig 1: Leak rate of 1 mbar l/s
Besides its small atomic diameter, there are other reasons why helium is employed in leak detection:
- it constitutes only about 5 ppm in air, so background levels are very low
- its relatively low mass means that it is very “mobile” (i.e. it mixes very quickly with other gases)
- it is completely inert/non-reactive, non-flammable and harmless
- and is widely available at relatively low cost
There are several ways to leak-test vacuum vessels and components using helium, but all employ the same principle. The unit being checked is either pressurised with helium from within or surrounded by helium from without. The gas from any potential leaks is collected and ‘pumped’ into a mass spectrometer for analysis, and any value above the background level is evidence of a leak.
The spectrometer itself works in the following way: any helium molecules flowing into the spectrometer will be ionized, and these helium ions will then “fly” into the ion detector, where the ion current is analysed and recorded. Before reaching the detector, the ions have to pass a magnetic field which deflects all ions other than helium ones. Based on the ionization current, the leak rate can then be calculated.
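The mass-selective deflection described above follows from basic sector-field physics: a singly charged ion accelerated through a voltage U follows a circular path of radius r = √(2mU/q)/B in a magnetic field B, so ions heavier or lighter than helium miss the detector. The sketch below illustrates this with assumed, not vendor-specific, voltage and field values:

```python
import math

# Physical constants
ELEM_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27         # atomic mass unit, kg

def sector_radius(mass_amu, accel_voltage_v, b_field_t):
    """Radius of the circular flight path of a singly charged ion
    in a magnetic sector field: r = sqrt(2*m*U/q) / B."""
    m = mass_amu * AMU
    return math.sqrt(2.0 * m * accel_voltage_v / ELEM_CHARGE) / b_field_t

# Illustrative values: 300 V acceleration, 0.3 T field (assumptions)
r_he = sector_radius(4.0026, 300.0, 0.3)   # He+ path, ~1.7 cm
r_n2 = sector_radius(28.014, 300.0, 0.3)   # N2+ path, a wider arc
```

Because the radius scales with the square root of the ion mass, only helium ions pass through the exit slit to the detector; heavier residual-gas ions such as N₂⁺ follow wider arcs and are filtered out.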
These helium tests, known as “vacuum” and “sniffer” tests, can detect leaks with both precision and certainty. Here, the term “certainty” means that there is no other method with which one can, with greater reliability and better stability, locate leaks (even small ones) and measure them quantitatively. For this reason, helium leak detectors, even though relatively costly, are often far more economical in the long run since considerably less time is required for the actual leak detection procedure to be concluded.
Helium leak detection falls into two basic methods: “integral” testing and “local” testing.
The choice of which method to use depends on the application, as well as on what the final product will be used for. The “integral” method shows whether there is a leak (but not how many leaks there are); the “local” method shows where a leak is (but exact determination of the leak rate or leak size is difficult). Each of these detection methods can be further sub-divided into two variants: “sample under pressure” and “sample under vacuum”.
(i) Integral testing occurs where the sample, either under pressure or under vacuum, is contained in a vessel. These two “integral” methods are frequently referred to as “helium vacuum tests”, since the sample is either itself evacuated or placed in a vacuum, with helium gas leaking into or out of the sample and then being detected as it flows through a mass spectrometer. The major disadvantage (though not the only one) is that the unit needs to be placed within a vessel of a suitable size. Furthermore, the helium “vacuum” test is usually only employed on units subjected to high or ultra-high vacuums.
Fig 2: Integral testing with helium (sample under pressure).
Fig 3: Integral testing with helium (sample under vacuum).
(ii) Local testing occurs where (again) the sample itself is under pressure or under vacuum. These two “local” methods are frequently referred to as the “sniffer” test, since a “sniffer” probe is used.
In the “local-sniffing (sample under pressure)” method, the chamber is pressurised with helium and a sniffer device is passed around the chamber’s likely leak points (i.e. welds, flanges, ports, instrument ducts etc.) to capture any escaping gas. This “sniffed” gas is passed to a mass spectrometer to record any elevated (i.e. above background) helium levels.
Fig 4: Local testing with helium (sample under pressure).
In the “local-spraying (sample under vacuum)” method, the chamber is evacuated and helium gas is liberally sprayed/directed towards likely leak points, with the intention that some of this pure helium will be drawn into the chamber through any leaks. The gas from within the chamber is then passed into a spectrometer to record any elevated helium levels.
Fig 5: Local testing with helium (sample under vacuum).
The sniffer test has the advantage that it shows where leaks actually occur. However, the helium concentration of around 5 ppm in air limits the minimum detectable leak rate, and ambient background signals can also hinder the detection of minor leaks.
However, before a helium reading is accepted as “fact”, reference (or background) readings for helium—which are an important part of the process—must be taken and accounted for. Such reference readings provide the “background noise” for helium, which can be thought of as the ambient level of helium.
The majority of background helium is contained within the 100 to 150 mono-layers of gas molecules adsorbed on surfaces, and in the air permanently present in the leak detector, pumps, valves, flanges, pipework etc. The removal of this surface helium is called “degassing”: once the bulk gas has been pumped out, the molecules begin desorbing from the inside surface of the metal. This desorption starts at a pressure of about 10⁻¹ mbar.
Such degassing, whether by lowering the pressure or by heating the chamber surface, is not unusual, but even this does not totally eliminate all the gas at the surfaces. In addition to surface helium, “reservoir” helium is also contained in O-rings (which act like sponges for such gases). N.B. Vacuum levels after degassing also provide a good indication of how clean the unit’s elements are. Modern helium leak detectors are able to constantly measure and calculate this internal (background) level and automatically subtract it from the leak rate measurement.
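The automatic background subtraction that modern detectors perform amounts, in its simplest form, to the following; the function name is illustrative and real instruments track the background continuously rather than as a single fixed value:

```python
def net_leak_rate(measured, background):
    """Subtract the helium background signal from the measured signal
    (both in mbar*l/s), clamping at zero so that a reading below
    background is reported as no detectable leak."""
    return max(measured - background, 0.0)

# Example: a signal of 3.2e-9 over a background of 2.5e-9
q_net = net_leak_rate(3.2e-9, 2.5e-9)  # ~7e-10 mbar*l/s
```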
To summarise and simplify the differences between these two types of helium leak detection procedure: the “integral” method requires the chamber to be placed inside a gas-proof unit (although this is not always possible). In the “local” testing method, the chamber is either internally pressurised with helium, or internally evacuated, with helium then sparingly sprayed onto the surface of the chamber at likely leak-prone points. In both tests, helium enters the leak detector via possible leaks and passes to the mass spectrometer for analysis.
Before moving on from helium leak detection, it is worth covering the subject of residual gas analysers (RGAs), which are small and rugged field mass spectrometers which use quadrupole technology. RGAs use either an open ion source or a closed ion source. RGAs are frequently employed in high-vacuum applications in research chambers, accelerators, scanning microscopes, etc. where they monitor the quality of the vacuum by detecting minute traces of impurities in low-pressure gas environments.
RGAs are also used as sensitive in-situ leak detectors, commonly using helium or other tracer molecules. With vacuum systems (especially in the XHV and UHV range), checking the integrity of the vacuum at low levels can be important (and safer) before a more serious leak-detection process is initiated.
As gas is compressible, the pressure (or vacuum) influences the extent of the leak, so leak rates are quoted in mbar*l/s, the “leak rate” being the amount of gas that flows through a leak at a given pressure differential per unit time.
Leak rate calculations rest on two assumptions: that the leak is circular in cross-section, and that the length of the leak channel is equivalent to the thickness of the material through which the leak passes.
There are several standards relating to leak detectors and leak detection. One of these, DIN EN 1330-8, designates the “helium standard leak-rate” for use where a leak-test is carried out with helium at a pressure differential of 1 bar external atmospheric pressure to < 1 mbar internal pressure (which in practice are common conditions).
Environmental and safety standards require manufacturers to guarantee leak tightness of their products by carrying out leak testing as part of the production/quality approval process. In order to indicate the rejection rate for a test using helium under standard helium conditions, it is necessary to convert the actual test conditions used to standard helium conditions; there are standard formulae available for such conversions.
When a vacuum system is connected to a leak detector, standard helium conditions must be present during helium leak detection. Using helium to carry out leak tests guarantees reliable and repeatable results, that can be quantified and constantly monitored.
Vacuum is an essential part of today’s modern life. From its humble beginnings several centuries ago, there are now few parts of our technologically-driven existence and well-being that are not impacted, bettered, perfected or made possible by vacuums.
From freeze-dried and vacuum packed foods, refrigerators and air-conditioning, placing micro coatings on surgical instruments to exploring the hidden secrets of physics and outer-space, these and hundreds of other applications are only possible through the much-unappreciated—but hugely important—vacuum. And as man pushes the boundaries of applicability, technology and scientific discovery, the shift to even lower-pressure vacuums i.e. into the realms of ultra and extreme vacuum ranges, has multiplied the current—and indeed future applications—still further.
It is one of life’s illogical truths that every vacuum system comes with its own variation of “tightness”, with none being “truly” leak-free. Different vacuum processes and applications call for different leak rate requirements. Indeed, what is acceptable at a lower vacuum would be considered utterly unacceptable (and potentially dangerous) at a higher vacuum level. The detection, location-finding, evaluation and measurement of leaks are all part of the eclectic and fascinating world of vacuums.
Want to find out more about leak detection? Click the link below to read our eBook: