I’ve talked about bump testing your instrument, so it seems natural that we now cover the importance of calibrating.
There are two main reasons for calibration. Firstly, gas detectors often operate in harsh environments: high and low temperatures and humidities, exposure to contaminants such as solvents and silicones, repeated gas exposure, and simple sensor ageing. Any of these can change the degree to which the detector responds to a given gas concentration; for example, the detector may read 46% LEL when the true level is 50% LEL.
Secondly, most site managers, safety officers and users require a record showing that their instrument has been calibrated and will do the job they need it to do (i.e. respond to gas). As well as adjusting the sensor to the correct response, calibration sets a calibration date on the instrument, so you can see when calibration was performed and when it is next due, and produces a calibration certificate as a record.
Bump testing vs. calibration
The difference between bump testing and calibrating is that a bump test is a brief exposure to gas to verify that the sensors respond within a specified tolerance and that the detector’s alarms function properly.
A calibration is a “resetting” of the detector’s response against a known concentration of target gas in a balance of synthetic air or nitrogen. It determines the relationship between the detector’s reading and the actual concentration of the gas of interest. Adjustment then modifies the detector’s response to bring the reading into line with the expected value while the instrument is exposed to the known source.
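The adjustment described above can be sketched as a simple gain correction. This is a minimal illustration using the hypothetical figures from earlier (a reading of 46% LEL against a true 50% LEL); real instruments perform this internally during calibration.

```python
# Hypothetical sketch of a span adjustment: a sensor that reads
# 46 %LEL when exposed to 50 %LEL calibration gas has its gain
# corrected so future readings line up with the true concentration.

def span_factor(true_conc: float, measured: float) -> float:
    """Gain correction derived from a known calibration-gas exposure."""
    return true_conc / measured

def adjusted_reading(raw: float, factor: float) -> float:
    """Apply the stored correction to a raw sensor reading."""
    return raw * factor

factor = span_factor(true_conc=50.0, measured=46.0)   # ~1.087
print(round(adjusted_reading(46.0, factor), 1))       # 50.0
```

The same factor is then applied to every subsequent reading until the next calibration, which is why uncorrected drift compounds over time.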
Carrying out a calibration
Calibration must be performed in a controlled way, taking into account flow paths and flow rates, pressure, temperature, humidity, the gases used, cross-sensitivities, the time sensors take to respond, and the exhaustion of waste gas, as well as any additional requirements specified by the detector manufacturer. Calibration is usually a two-step procedure. First, the instrument is zeroed in a fresh-air, synthetic-air or nitrogen background, so that its readings equal those expected in clean air. Second, the detector is exposed to calibration gas containing a known concentration of the gas the sensor is designed to measure, and any deviation is adjusted to the correct reading. Alternatively, you may cross-calibrate: you use a different gas type and apply a cross-calibration factor to obtain the required response to the target gas.
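The two-step procedure above amounts to a two-point (zero and span) correction. The sketch below uses hypothetical readings and a hypothetical cross-calibration factor purely for illustration; real factors come from the detector manufacturer’s documentation.

```python
# Hedged sketch of the two-step calibration described above:
# step 1 records a zero offset in clean air, step 2 derives a span
# gain from a known calibration-gas concentration. All numbers are
# hypothetical examples, not values for any real instrument.

def calibrate(zero_reading: float, span_reading: float, span_conc: float):
    """Return (offset, gain) from a two-point calibration."""
    offset = zero_reading                      # clean air should read 0
    gain = span_conc / (span_reading - offset) # scale to the known concentration
    return offset, gain

def corrected(raw: float, offset: float, gain: float) -> float:
    """Apply the stored zero/span correction to a raw reading."""
    return (raw - offset) * gain

offset, gain = calibrate(zero_reading=1.5, span_reading=47.5, span_conc=50.0)
print(round(corrected(47.5, offset, gain), 1))   # 50.0

# Cross-calibration: calibrated on one gas, reading another by applying
# a manufacturer-supplied cross-calibration factor (value hypothetical).
CROSS_CAL_FACTOR = 2.0
target_gas_reading = corrected(26.0, offset, gain) * CROSS_CAL_FACTOR
```

The design point is that zero and span are independent corrections: zeroing in contaminated “fresh” air shifts the offset and skews every subsequent reading, which is why the background gas matters as much as the calibration gas.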
How frequently should you calibrate your gas detector?
The frequency at which the instrument should be calibrated can vary; it is recommended that you combine information from the application and environment with guidance from the user, the manufacturer and the service provider. A risk assessment is generally required to confirm that the chosen calibration period is adequate. And remember, regular bump testing is recommended between calibrations.