Accurate temperature measurement is vital in production processes to ensure quality, efficiency and safety.
Noncontact temperature measurement is a key process metric in many industries, but many factors influence the accuracy of a pyrometer reading. Here we look at the most common sources of measurement error.
Pyrometer Choice
Many different pyrometers are on the market, but the key criteria for choosing the one that will measure your process most accurately are the same.
What wavelength of detector is used for a given temperature range?
The electromagnetic spectrum contains many different forms of electromagnetic emissions, including infrared, visible light, X-rays, radio waves and several others (Fig. 1). The only difference between these emissions is their wavelength (inversely related to frequency).
Radiation pyrometers are typically designed to respond to wavelengths within the infrared portion of the spectrum. In practice, temperature measurement is made using pyrometers operating over many different wavelength ranges, generally between 0.5 and 15 µm.
The intensity of the energy emitted by an object versus wavelength is given by Planck’s radiation law, which shows that the wavelength of the peak radiated energy decreases as the temperature increases. Plotting the Planck curves together with Wien’s displacement law and the Stefan-Boltzmann law (Fig. 2) shows clearly why the peak energy shifts to shorter wavelengths as the temperature rises.
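As an illustration of these laws, the short Python sketch below evaluates Planck’s law directly and uses Wien’s displacement constant to locate the emission peak. The physical constants are standard values; the example temperatures are illustrative and not taken from Figure 2.

```python
# A minimal sketch of Planck's law and Wien's displacement law.
# Temperatures below are illustrative, not taken from this article's figures.
import math

H = 6.626e-34      # Planck constant (J*s)
C = 2.998e8        # speed of light (m/s)
K_B = 1.381e-23    # Boltzmann constant (J/K)
WIEN_B = 2.898e-3  # Wien's displacement constant (m*K)

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance (W / m^2 / sr / m) of a blackbody."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = H * C / (wavelength_m * K_B * temp_k)
    return a / (math.exp(b) - 1.0)

for temp_k in (500.0, 1000.0, 2000.0):
    peak_um = WIEN_B / temp_k * 1e6  # peak shifts shorter as T rises
    print(f"T = {temp_k:6.0f} K: peak at {peak_um:5.2f} um, "
          f"radiance at 1 um = {planck_radiance(1e-6, temp_k):.3e}")
```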
The relative amount of energy emitted across the infrared spectrum by an object heated to different temperatures is shown in Figure 3. Again, it shows the characteristic peak emission moving to shorter wavelengths as temperature increases.
The implication of these laws is that short-wavelength pyrometers receive the most energy from a hot target. This gives the strongest detector signal, the best signal-to-noise ratio and therefore the most accurate temperature readings.
Typically, high-temperature processes in the steel and glass industries are measured using 1 µm pyrometers. With recent technical advances, short-wavelength pyrometers can even reach down to 50°C (122°F) with a 2.1 µm detector and still cover a wide temperature span up to 1100°C (2012°F). This allows them to be utilized across a wide range of industrial processes.
What detector wavelength should be used for an application?
The other factor to consider when selecting the detector wavelength is the type of application being measured, including the transmission path of the emitted energy and the properties of the material being measured. Figure 4 shows the transmission of infrared radiation through the atmosphere. It can be clearly seen that there are several absorption bands in which water vapor and CO2 absorb the energy; a pyrometer operating within one of these bands will read inaccurately.
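As a rough aid to this choice, the sketch below flags wavelengths that fall inside the main water-vapor and CO2 absorption bands. The band edges are approximate, commonly quoted figures assumed for illustration, not values read from Figure 4.

```python
# A minimal sketch flagging pyrometer wavelengths inside the main
# atmospheric absorption bands. Band edges are rough assumptions.
ABSORPTION_BANDS_UM = [
    (2.4, 2.9, "H2O"),  # water vapor
    (4.2, 4.5, "CO2"),  # carbon dioxide
    (5.5, 7.5, "H2O"),  # water vapor
]

def in_absorption_band(wavelength_um):
    """Return the absorbing species if the wavelength is in a band, else None."""
    for lo, hi, species in ABSORPTION_BANDS_UM:
        if lo <= wavelength_um <= hi:
            return species
    return None

for wl in (1.0, 2.7, 4.3, 5.0, 6.3):
    hit = in_absorption_band(wl)
    print(f"{wl} um: {'avoid - ' + hit + ' band' if hit else 'clear window'}")
```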
For semi-transparent products such as glass or plastic, careful consideration of the material’s transmission, absorption and reflection characteristics is required. The infrared energy received by the pyrometer from a heated target is the sum of three quantities: the radiation emitted by the target itself, the background radiation reflected from the target and the energy transmitted through the target. Figure 5 shows the cumulative effect of the atmospheric absorption bands together with the reflectivity and transmission of a typical uncoated glazing glass.
From the graph, it can be seen that the optimum wavelength to select is 5 µm. This is where the glass transmission has dropped to a very low level, the reflectivity has not yet started to increase and there is no atmospheric absorption band. A typical 8-14 µm pyrometer used to measure glass would see a large reflective component in the received energy, making the measured temperature inaccurate. For very thin glass, such as that used in mobile telecommunication devices, the transmission characteristics in Figure 5 are different, and a 7.9-µm pyrometer is typically selected.
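The three-component sum described above can be checked with a simple energy balance. The following sketch assumes ε + ρ + τ = 1 at the working wavelength; all coefficient and radiance values are hypothetical, chosen only to contrast the 5-µm and 8-14-µm cases for glass.

```python
# A minimal sketch of the three-component radiance balance for a
# semi-transparent target, assuming emissivity + reflectivity +
# transmissivity = 1 at the working wavelength. Values are illustrative.
def received_radiance(emissivity, reflectivity, transmissivity,
                      target_radiance, background_radiance, rear_radiance):
    """Radiance reaching the pyrometer from a semi-transparent target."""
    assert abs(emissivity + reflectivity + transmissivity - 1.0) < 1e-9
    return (emissivity * target_radiance          # emitted by the target itself
            + reflectivity * background_radiance  # background reflected off the front face
            + transmissivity * rear_radiance)     # energy passing through from behind

# Around 5 um, glazing glass is nearly opaque and weakly reflective,
# so the reading is dominated by the target's own emission:
print(received_radiance(0.95, 0.04, 0.01, 100.0, 40.0, 60.0))
# At 8-14 um the reflective component is larger, corrupting the reading:
print(received_radiance(0.75, 0.24, 0.01, 100.0, 40.0, 60.0))
```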
The best way to determine the correct wavelength pyrometer for new or novel semi-transparent materials is with a trial of the actual instrument being considered.
Emissivity
The emissivity of a material, usually denoted ε, is the ratio of the energy radiated by the material to the energy radiated by a blackbody (a perfect absorber and emitter) at the same temperature. It is a measure of a material’s ability to absorb and radiate energy. Emissivity is a numerical value between 0 and 1, with a true blackbody having ε = 1.
The emissivity value of a material is dependent on the molecular-level properties of the material and is therefore dependent on the exact material being measured and process conditions. For example, an unoxidized steel undergoing processing may have an emissivity value at 1 µm of around 0.35. When the processing is complete and the steel becomes oxidized, this can change to 0.85.
Emissivity is typically specified at a wavelength because it normally varies with wavelength. For example, the emissivity of a polished metal tends to decrease as the wavelength becomes longer. This, along with the energy dependence described above, is why a short-wavelength detector is typically the best choice, especially for metals processing.
Figure 6 shows the rapid rise in radiated energy with temperature at short wavelengths. The signal output from a short-wavelength thermometer viewing a target at 1000°C (1832°F) typically changes by around 1% for every 1°C change in target temperature. As a result, a 1% reduction in radiated energy, such as from a change in target emissivity, produces a fall in indicated temperature of only 1°C, an error of just 0.1% in temperature. Nevertheless, if the emissivity setting is far from the true value, a significant measurement error can still occur.
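This sensitivity can be reproduced with the Wien approximation to Planck’s law: a pyrometer with an incorrect emissivity setting reports the temperature that makes its assumed signal match the measured one. The sketch below recovers the roughly 1°C-per-1% figure quoted above for 1 µm at 1000°C and shows the much larger error at 5 µm; the emissivity values are illustrative.

```python
# A minimal sketch, using the Wien approximation, of how an
# emissivity-setting error maps to a temperature error.
import math

C2 = 1.4388e-2  # second radiation constant (m*K)

def indicated_temp(true_temp_k, wavelength_m, true_emissivity, set_emissivity):
    """Temperature a pyrometer reports when its emissivity setting is wrong."""
    inv_t = (1.0 / true_temp_k
             - (wavelength_m / C2) * math.log(true_emissivity / set_emissivity))
    return 1.0 / inv_t

true_t = 1273.15  # 1000 C
for wl_um in (1.0, 5.0):
    t_ind = indicated_temp(true_t, wl_um * 1e-6, 0.99, 1.00)  # 1% emissivity error
    print(f"{wl_um} um: 1% emissivity error -> {true_t - t_ind:.1f} K error")
```

At 1 µm the 1% emissivity error costs about 1.1°C; at 5 µm the same error costs roughly five times more, which is the quantitative case for short-wavelength detectors.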
Determination of the material’s emissivity is therefore as critical as choosing the correct pyrometer wavelength. Emissivity can be determined on the production line with advanced sampling techniques using a Gold Cup emissivity-enhancer pyrometer, which integrates the emitted and reflected radiation to produce near-blackbody conditions. Alternatively, a sample of the material, with the process temperature and atmospheric conditions specified, can be sent to a pyrometer manufacturer’s calibration laboratory, where an emissivity-determination service is typically offered.
Temperature Calibration
Regular, traceable calibration checks are essential for all temperature-measuring equipment to ensure accuracy.
Why calibrate pyrometers?
The first point to make is that wherever an industry relies on accurate, stable temperature measurement, instruments need calibrating. While industry leaders have developed extremely stable detectors over 65 years of experience, many detectors on the market drift with time. Calibration is essential to reference the instrument back to a known temperature and to account for any drift offset.
Why do pyrometers drift?
A manufacturer will typically publish the pyrometer’s temperature dependency versus ambient temperature (e.g., 0.01 K/K for ambient temperatures above 23°C). Over time, instruments running in ambient conditions hotter than the laboratory will drift as ions in the detector physically migrate and diffuse, changing the electrical properties of the detector and, hence, the temperature reading.
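Taking such a published dependency at face value, the worst-case added error is simple to estimate. In this sketch the 0.01 K/K coefficient and 23°C reference come from the example above, while the ambient temperatures are illustrative.

```python
# A minimal sketch of the ambient-temperature dependency quoted above
# (0.01 K/K above a 23 C reference). Ambient values are illustrative.
AMBIENT_COEFF_K_PER_K = 0.01  # extra error per kelvin of ambient above reference
REFERENCE_AMBIENT_C = 23.0

def ambient_drift_error(ambient_c):
    """Worst-case extra reading error at the given ambient temperature."""
    return AMBIENT_COEFF_K_PER_K * max(0.0, ambient_c - REFERENCE_AMBIENT_C)

for ambient in (23.0, 40.0, 60.0):
    print(f"ambient {ambient:.0f} C -> up to "
          f"{ambient_drift_error(ambient):.2f} K added error")
```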
How often should pyrometers be calibrated?
There are a number of reasons why it is necessary to carry out periodic calibration. On-site conditions can have a significant influence on the accuracy and stability of the temperature measurement. Dirt, dust, vibration, EMC and physical impacts or shocks (from dropping portable instruments) are all typical causes of drift in performance and accuracy while the instrument is in use.
Leading infrared products are designed to be inherently stable and have protective housings for harsh industrial environments, but the only way to ensure performance is to verify it in a calibration laboratory.
For industries operating outside a quality standard that defines the calibration period, the judgement must ultimately be made on a risk-assessment basis. The key variables to include in this assessment are:
- What are the accuracy requirements of the process?
- What is the criticality of the temperature measurement?
- What industrial environment is the instrument operating in?
- What is the frequency of use?
The operating environment is a dominant factor. For a process in heavy industry, calibration may have to be carried out every six months, especially if product quality is temperature-critical. Conversely, if the instrument operates in laboratory-type conditions, the interval may be longer. Ultimately, the operator should make the judgment because there are other factors to consider. For instance, a high defect-risk industry such as aerospace or defense needs to rely on precise and consistent temperature measurements.
Whatever frequency is chosen, it allows you to track the instrument’s “calibration history.” By doing this, it is possible to see how quickly instruments drift from the manufacturer’s specification. If the graphed data shows steady, stable measurements over an extended period, calibration may not need to be carried out as frequently (depending on industry and customer needs). If the measurements drift off trend relatively quickly, that is a sign that further investigation and more frequent calibration are needed.
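A calibration history need not be elaborate. The sketch below stores the offset found at each calibration and flags entries that approach an assumed tolerance band or jump off trend; the tolerance, dates and offsets are all hypothetical.

```python
# A minimal sketch of tracking a pyrometer's calibration history: each
# entry is the offset (instrument reading minus reference source) found
# at calibration. Tolerance and readings below are hypothetical.
TOLERANCE_K = 2.0  # assumed acceptance band from the manufacturer's spec

history = [  # (date, offset in K) - illustrative values
    ("2014-05", 0.2), ("2014-11", 0.3), ("2015-05", 0.4), ("2015-11", 1.6),
]

# Flag when the offset nears the tolerance or the drift rate jumps:
for (d0, x0), (d1, x1) in zip(history, history[1:]):
    step = x1 - x0
    off_trend = abs(x1) > 0.75 * TOLERANCE_K or abs(step) > 2 * abs(x0)
    flag = " <- investigate" if off_trend else ""
    print(f"{d1}: offset {x1:+.1f} K (change {step:+.1f} K){flag}")
```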
Frequency of calibration is a balance between risk, the cost of defective material and other factors specific to each business. Too often, pyrometers are returned for calibration only after someone notices a difference in reading between the infrared pyrometer and a fixed thermocouple.
The general rule to follow: just because a piece of electrical equipment is working doesn’t necessarily mean it is accurate.
Once a decision has been made to calibrate, you need to choose between the certificate-of-conformity route (factory calibration) and full calibration with verification to national standards. The difference is that the former meets a minimum set of regulatory, technical and safety requirements, whereas the latter goes further, ensuring calibration is carried out in accordance with the internationally recognized ISO/IEC 17025:2005 standard.
Why calibrate to a traceable national standard?
Any calibration laboratory accredited to a national standard is routinely and independently audited by its national standards body, and the accuracy of the calibration laboratory is published in the public domain. Customers can then very quickly determine which laboratory will provide the best accuracy and lowest uncertainty in reported measurements.
Summary
Many different factors affect the accuracy of a noncontact pyrometer measurement. The choice of wavelength, the determination of emissivity and routine calibration checks are key factors in improving measurement accuracy. Several other factors, from background temperature and reflected energy to measuring inside a hotter enclosure, also affect accuracy. Partnering with a pyrometer manufacturer with a wealth of application and industrial experience brings not just instrumentation knowledge but expertise in measuring temperature accurately and repeatably.
For more information: Contact Iain Scott at AMETEK Land, Stubley Ln., Dronfield, Derbyshire, England S18 1DJ; tel: +44 (0)1246 417691; e-mail: iain.scott@ametek.co.uk or land.enquiry@ametek.co.uk; web: www.landinst.com
Article originally published in the May 2016 issue of Industrial Heating as "Are You Measuring the Correct Temperature?"