Saturday, November 5, 2011

AUTOMATED CALIBRATION AND BUMP TESTING

Unfortunately, there is no industry consensus on what constitutes a successful bump test. With docking stations there is often a trade-off between completing the bump test quickly and assessing performance to a higher level of accuracy. The emphasis is frequently on getting through the test as quickly as possible while using as little gas as possible per test.

During a bump test the calibration of the instrument is assessed by exposing the sensors to a known concentration of test gas and verifying that the readings are accurate. The response of the sensors is not adjusted during a bump test; a bump test is simply a yes/no verification of response. During a full calibration, by contrast, the sensor outputs are adjusted to match the concentration of the applied gas. Because the sensors have to stabilize completely before they can be adjusted, a full calibration generally takes much more time and gas than a bump test.
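
The distinction can be sketched in a few lines of Python. The function names, the linear span model, and the ±10% tolerance below are illustrative assumptions, not any manufacturer's actual firmware logic:

```python
# Sketch contrasting a bump test (verify only) with a span calibration
# (adjust). Names, the linear model, and the 10% tolerance are assumptions.

def bump_test(reading_ppm: float, applied_ppm: float,
              tolerance: float = 0.10) -> bool:
    """Yes/no verification: is the reading within tolerance of the
    applied test-gas concentration? Nothing is adjusted."""
    return abs(reading_ppm - applied_ppm) <= tolerance * applied_ppm

def span_factor(raw_at_t100: float, applied_ppm: float) -> float:
    """Full calibration: derive the factor that maps the fully
    stabilized (T100) raw signal onto the known concentration."""
    return applied_ppm / raw_at_t100

print(bump_test(23.1, 25.0))     # True: within 10% of 25 ppm, pass
print(span_factor(21.8, 25.0))   # ~1.147: correction stored at calibration
```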

To save time and gas, many docking stations provide only a qualitative assessment of performance during a “bump test.” In many cases the docking station flows gas only until the instrument alarms are activated. You know that the sensors respond to gas and that the alarms function, but you don’t really know whether the readings are accurate. To make matters worse, customers often use less expensive “bump gas” rather than “calibration gas” when performing functional bump tests. “Bump gas” is often packaged in cylinders (or aerosol cans) with poor stability and shelf life. The reactive components in the mixture (such as hydrogen sulfide) generally deteriorate more rapidly than when the same mixture is packaged as “calibration gas” in a more expensive, fully passivated cylinder.
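
A rough sketch of this “alarm-only” style of bump test makes the limitation plain: the test passes as soon as the alarm trips, without ever comparing the reading to the applied concentration. The setpoint, timeout, and polling interval here are assumed values:

```python
# Sketch of an "alarm-only" bump test: gas flows just until the alarm
# trips. The setpoint, timeout, and polling interval are assumed values.

import time

def alarm_only_bump(read_sensor, alarm_setpoint_ppm: float = 10.0,
                    timeout_s: float = 30.0) -> bool:
    """Pass as soon as the reading crosses the alarm setpoint.
    Confirms the sensor responds and the alarm fires, but says nothing
    about whether the stabilized reading would be accurate."""
    start = time.monotonic()
    while time.monotonic() - start < timeout_s:
        if read_sensor() >= alarm_setpoint_ppm:
            return True      # alarm activated: test "passes"
        time.sleep(0.5)      # poll the sensor twice a second
    return False             # no alarm within the timeout: test fails
```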

At least some docking stations allow the user to define the level of accuracy desired. While the default settings of the docking station may emphasize speed and minimal gas use, the user can optionally specify more stringent pass/fail criteria. One way to do this is to wait longer after the docking station begins to flow gas before deciding whether or not the instrument is in calibration. However, the higher the level of accuracy specified, the longer the bump test takes.
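
As a hypothetical illustration, this trade-off might be expressed as a set of selectable test profiles, where a longer wait permits a tighter pass/fail tolerance. The profile names, wait times, and tolerances below are invented for the example:

```python
# Hypothetical pass/fail profiles: a longer wait lets the reading
# stabilize further, so a tighter tolerance can be enforced. The wait
# times and tolerances are invented for illustration.

BUMP_PROFILES = {
    "fast":      {"wait_s": 12, "tolerance": 0.50},  # quick, loose check
    "default":   {"wait_s": 15, "tolerance": 0.20},
    "stringent": {"wait_s": 25, "tolerance": 0.10},  # near-stabilized reading
}

def passes(reading_ppm: float, applied_ppm: float, profile: str) -> bool:
    """Apply the selected profile's tolerance to the reading taken
    after that profile's wait time."""
    tol = BUMP_PROFILES[profile]["tolerance"]
    return abs(reading_ppm - applied_ppm) <= tol * applied_ppm
```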

In the case of the GfG DS-400 Docking Station, users have three setup choices when specifying how long the docking station waits before verifying the response of the sensors. The default setting for the bump test duration is no specified time. In this case the docking station determines, from the shape of the sensor response curve, when it has enough information to assess whether or not the instrument is in calibration, and whether or not the alarms are properly activated when exposed to gas. With this setting a bump test typically takes about 15 seconds, which is enough time for all sensors to reach their high alarm setting and for the docking station to extrapolate their final (“T100”) readings. On an optional basis users can select “T50” rather than the default setting. In this case the docking station waits only until the readings from all sensors have reached 50% of the concentration of the applied gas before verifying the calibration of the sensors (any alarms set higher than 50% of the applied gas value will not be tested). Choosing “T50” speeds up the test a little; it takes about 12 seconds to complete. The third choice is “T90.” In this case the docking station waits until the readings from all sensors have reached 90% of the concentration of the applied gas before verifying the calibration of the sensors. A “T90” bump test takes about 25 seconds to complete. Although it takes a little longer, it provides the most stringent evaluation of the calibration state of the sensors.
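
Assuming the common first-order (exponential) model of sensor step response, a short sketch shows why reaching T90 takes several times as long as reaching T50, and how a final T100 reading can be extrapolated from the early shape of the curve. The time constant is an assumed value, and the flow and purge overhead of a real test is not modeled:

```python
# First-order (exponential) model of sensor step response:
# reading(t) = C * (1 - exp(-t / tau)). The time constant tau is an
# assumed value; real tests also include flow and purge overhead.

import math

def time_to_fraction(tau_s: float, fraction: float) -> float:
    """Time for the model sensor to reach the given fraction of the
    applied concentration."""
    return -tau_s * math.log(1.0 - fraction)

tau = 10.0                             # assumed time constant, seconds
print(time_to_fraction(tau, 0.50))     # ~6.9 s of gas flow to reach T50
print(time_to_fraction(tau, 0.90))     # ~23.0 s of gas flow to reach T90

def extrapolate_t100(r1: float, r2: float, r3: float) -> float:
    """Estimate the final (T100) reading from three equally spaced early
    samples: for a first-order response the successive gaps shrink by a
    constant ratio k, which pins down the asymptote."""
    d1, d2 = r2 - r1, r3 - r2
    k = d2 / d1
    return r3 + d2 * k / (1.0 - k)

# Samples at t = 5, 10, 15 s of a sensor settling toward 25 ppm
print(extrapolate_t100(9.84, 15.80, 19.42))   # ~25.0 ppm, before stabilizing
```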

Of course, docking stations can just as easily be used to calibrate instruments as to bump test them. Calibration is a two-step process. In the first step the readings are “fresh air” adjusted in air that contains no measurable contaminants. In the second step the sensors are “span” adjusted while exposed to a known concentration of calibration gas. Because the readings must be allowed to stabilize completely (i.e., reach “T100”) before being span adjusted, the complete calibration process takes substantially longer than a “bump test.” In the case of the GfG DS-400 Docking Station, a complete two-step calibration (including fresh air adjustment, span calibration, and a purging interval) takes about 2.5 minutes. Some users prefer to calibrate their instruments on a daily basis rather than perform a daily bump test. This is a completely valid approach, and it provides the highest level of accuracy possible.
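
A minimal sketch of the two-step adjustment, assuming a simple linear sensor model (the function names and raw-signal values are illustrative, not drawn from any actual instrument):

```python
# Sketch of the two-step calibration: a fresh-air ("zero") adjustment,
# then a span adjustment once the reading has stabilized at T100.
# The linear model and the raw-signal values are illustrative.

def span_from_cal(raw_at_t100: float, zero_offset: float,
                  cal_gas_ppm: float) -> float:
    """Step 2: map the offset-corrected, fully stabilized signal onto
    the known calibration-gas concentration."""
    return cal_gas_ppm / (raw_at_t100 - zero_offset)

def to_ppm(raw: float, zero_offset: float, span: float) -> float:
    """Convert a raw signal to a gas reading using both corrections."""
    return (raw - zero_offset) * span

zero = 0.4                            # Step 1: raw signal in fresh air
span = span_from_cal(21.2, zero, 25.0)
print(to_ppm(13.0, zero, span))       # a later field reading, in ppm (~15.1)
```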

A WORD ON GAS MONITOR MANUFACTURERS

From time to time the name of a specific manufacturer may appear in this Blog. Any such mention is for illustrative purposes only and is not an endorsement, nor is it an indictment of the particular item mentioned. The purpose of this Blog is to address the technical and safety issues of Gas Detection Instrumentation.