How to Validate the Readings on Your Infrared Thermometer
The third instalment in our infrared thermometer series covers how to correctly validate the accuracy of an IR device in the field. Our first post looked at emissivity and how to get an accurate reading, while the second focused on how to clean and store an IR thermometer. If you haven’t already, we recommend reading these posts in order to fully understand how infrared works before attempting validation.
Calibration vs validation
Calibration of a thermometer can only be carried out in a controlled laboratory environment. Validation, where an instrument is comparison-checked for accuracy against a known reference, is what is described here. If an instrument’s reading is found to be inaccurate when validated against a calibrated thermometer, it must then be sent to a laboratory to be repaired or recalibrated.
Why validating a temperature on an IR instrument is different to calibrating a penetration probe
Infrared thermometers only measure surface temperatures and should therefore only be used as a quick guide. This is because the accuracy of the measurement is affected by many factors, such as the emissivity of the surface, the type of material, transparency, colour and reflectivity (read our full guide to getting accurate IR readings here). An infrared thermometer must be validated against a laboratory-calibrated ‘master’ thermometer on a known temperature source. The best way to control the emissivity and temperature of a surface, ensuring a true reading from an infrared thermometer, is to use a solid black body. This minimises most external factors and prevents the temperature from changing too quickly.
As seen in our previous blog post on the accuracies and limitations of infrared, emissivity plays a huge role when calibrating IR thermometers.
Depending on what you point your infrared thermometer at, you will get a variation in emitted infrared energy. Emissivity is a measure of a material’s ability to emit infrared energy, measured on a scale from 0.00 to 1.00. Generally, the closer a material’s emissivity is to 1.00, the more it absorbs ambient infrared energy rather than reflecting it, and the more the energy it gives off is its own emitted radiation. Click here to learn more about emissivity.
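As an illustrative sketch (not part of ETI’s guidance), the link between emissivity and emitted energy can be shown with the Stefan–Boltzmann relation, where radiated power scales linearly with emissivity. The material values below are hypothetical examples:

```python
# Illustrative sketch: how emissivity scales the infrared energy a surface
# emits, using the Stefan-Boltzmann law (P = emissivity * sigma * T^4).
# The emissivity figures below are hypothetical examples, not ETI data.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiated_power(emissivity: float, temp_c: float) -> float:
    """Radiant power per square metre emitted by a surface at temp_c."""
    temp_k = temp_c + 273.15
    return emissivity * SIGMA * temp_k ** 4

# Two surfaces at the same 22 C: a matte black surface (emissivity ~0.95)
# emits far more of its own infrared energy than shiny metal (~0.10),
# which is why the two read so differently to an IR thermometer.
matte_black = radiated_power(0.95, 22.0)
shiny_metal = radiated_power(0.10, 22.0)
print(f"matte black: {matte_black:.0f} W/m^2, shiny metal: {shiny_metal:.0f} W/m^2")
```

This is also why a matte black comparator base, with its high and known emissivity, makes such a convenient validation target.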
What equipment is required to validate the accuracy of an IR instrument
Here at ETI we have laboratories specifically designated for the calibration of infrared thermometers. We have put a lot of time and resources into ensuring that the temperature and humidity are exactly right before each process begins. We also have controlled hot and cold black body sources in order to achieve the accuracies stated in each product’s specification. We are able to provide a traceable certificate of calibration for all ETI-manufactured infrared thermometers.
To check the accuracy of an IR thermometer out in the field, a Thermometer Comparator and a high-accuracy, calibrated ‘master’ thermometer, such as a Reference Thermometer, are required. The Thermometer Comparator consists of an aluminium cup with a solid matte black base. The base incorporates two holes for taking its internal temperature with the ‘master’ thermometer. An infrared thermometer can then be held above the entrance of the cup to take the temperature of the surface of the base.
How to validate a temperature on an IR instrument
1. Ensure the comparator and infrared thermometer are clean and free of any debris or substances that could affect the reading (read our full guide to cleaning and storing your IR device here).
2. Place the Thermometer Comparator on a flat surface.
3. Insert the Reference Thermometer probe into one of the base test holes and allow it to stabilise. The stabilisation time depends on the response time of the inserted probe.
4. If the IR device has adjustable emissivity, ensure it is set to 0.95, the correct setting for the matte black surface of the Thermometer Comparator.
5. Point the thermometer straight down into the bottom of the comparator and take a measurement. Depending on the accuracy of the instrument, it should read within 1°C of the Reference Thermometer at an ambient room temperature of 22°C.
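The final comparison can be sketched as a simple check. The ±1°C tolerance comes from the text above; the function name and example readings are our own illustration:

```python
# Minimal sketch of the final comparison step: does the IR reading fall
# within the stated tolerance of the calibrated reference thermometer?
# The function name and sample readings are illustrative, not ETI's.

def ir_reading_valid(ir_temp_c: float, reference_temp_c: float,
                     tolerance_c: float = 1.0) -> bool:
    """Return True if the IR reading agrees with the reference
    thermometer to within tolerance_c degrees Celsius."""
    return abs(ir_temp_c - reference_temp_c) <= tolerance_c

# Example: reference reads 22.1 C inside the comparator base.
print(ir_reading_valid(22.6, 22.1))  # difference of 0.5 C -> within tolerance
print(ir_reading_valid(24.0, 22.1))  # difference of 1.9 C -> out of tolerance
```

An instrument that fails this check should be sent to a laboratory for repair or recalibration, as described above.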
What temperature can an IR instrument be validated at
The accuracy of an infrared thermometer can be checked using a comparator at any stable temperature. However, to reduce the possibility of a difference in temperature between the inside surface and the base test hole, validation is most accurate at an ambient room temperature of 22°C.
Using an IR thermometer at hot or cold temperatures will increase the possibility of thermal instability.
For every 1°C the ambient temperature is above or below 22°C, an adjustment factor should be added to the instrument’s stated accuracy to allow for the thermal instability. Typically this is 0.05°C per degree for RayTemp thermometers; other infrared thermometers may have a different value. Here is a table showing the values that need to be considered when using a RayTemp 2 thermometer in cold or hot environments.
*accuracies and thermal stability for other instruments can vary.
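The adjustment described above is simple arithmetic. A hedged sketch, assuming a ±1°C base accuracy and the 0.05°C-per-degree factor quoted for RayTemp thermometers (other instruments vary, so check your specification):

```python
# Sketch of widening the accuracy tolerance away from 22 C ambient.
# Base accuracy and adjustment factor are assumed example values;
# check your instrument's specification for the real figures.

def adjusted_accuracy(ambient_c: float,
                      base_accuracy_c: float = 1.0,
                      factor_per_degree_c: float = 0.05,
                      reference_ambient_c: float = 22.0) -> float:
    """Accuracy tolerance after adding the thermal-instability allowance:
    factor_per_degree_c for every degree away from reference_ambient_c."""
    deviation = abs(ambient_c - reference_ambient_c)
    return base_accuracy_c + factor_per_degree_c * deviation

# At 22 C ambient no adjustment applies; at 2 C, twenty degrees below,
# the tolerance widens by 20 * 0.05 = 1.0 C.
print(adjusted_accuracy(22.0))  # 1.0
print(adjusted_accuracy(2.0))   # 2.0
```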
Dos and don’ts
Do validate at an ambient temperature of approximately 22°C if possible.
Don’t change the temperature surrounding the comparator before validation or the surface temperature may differ from the internal temperature.
Do be aware of the external factors that influence taking a correct IR reading from the comparator, such as moisture, frost and debris.
Don’t position the infrared thermometer too far away, or at an angle, when taking the temperature of the comparator as it may provide an inaccurate reading.
Do take the measurements as quickly as possible, to prevent the surface temperature from changing.
Don’t forget that the thermometers require time to acclimatise to a different environment.
Learn more about infrared thermometers: