The desired metallurgical properties of forgings are derived from having the right billet temperature when they are forged. When using IR temperature sensors, operators may mistake the laser pointer for the temperature-measuring probe, when in fact it only designates the area from which a body's radiosity is being measured. The radiosity is converted by the IR sensor's circuitry into a temperature reading. This article discusses sources of uncertainty in these readings and how to minimize them.

Temperature is one of the most frequently measured physical parameters. The only parameter we measure more often than temperature is time. And while our measurement of time is often accurate to within a few minutes, our measurement of temperature is much less certain. The problem starts with our perception. For example, take two disks, one of steel and one of wood. Both have been stored in the same room-temperature cabinet (i.e., identical ambient conditions). Place the wood disk in one hand and the steel disk in the other. Most people observe that the steel feels colder than the wood.

Why is this? Wood is a very poor conductor of heat, while steel is a very good one. The steel disk quickly conducts heat away from your hand, while the wood does so very slowly. What you are feeling is the movement of heat via conduction, not temperature. If we verify this with a thermocouple probe, we find that the actual temperatures of the two disks are the same.

The thermocouple makes a contact measurement. This is a convenient technique that can provide an accurate reading with little uncertainty. The most critical variable is the physical contact between the thermocouple and the material surface, and the probe must be held in contact long enough for the thermocouple to reach steady state with the surface. The uncertainty of the measurement depends largely on these two factors: the quality of the contact and the time allowed to reach steady state.


The Uncertainty of IR Temperature Measurement

Alternatively, we could use an infrared (IR) temperature sensor, which is quicker and easier to use. IR temperature sensors do not require physical contact, and the reading is nearly instantaneous. When we measure the wood, the infrared sensor displays the same measurement as the thermocouple probe. However, this IR measurement carries much greater uncertainty for the inexperienced user. The uncertainty comes from critical variables that directly affect the temperature reading, including the emissivity (and, conversely, the reflectivity) of the material surface, the background radiosity, the angle of view, the spot size (field of view) and more.

What is "radiosity"? Radiosity is the thermal radiation that the infrared detector observes: a combination of radiation emitted by the surface, reflected off the surface from background sources or, in some cases, even transmitted through the material. A perfect emitter (a blackbody) has an emissivity of 1.0 and no reflectivity or transmissivity; for an opaque surface, emissivity and reflectivity sum to 1.0.

The wood surface has an emissivity of approximately 0.95, meaning the wood is not very reflective and is minimally affected by background radiation sources. The steel disk is highly reflective, with an emissivity of about 0.30, meaning it is highly subject to background influences (reflected background radiosity). The IR sensor indicates the wood's temperature as about 20°C (68°F). When focused on the steel disk, it also produces an instant reading of about 20°C (68°F). The reading matches the thermocouple because the ambient background of the room is at the same temperature as the wood and steel.

In fact, the IR gun is observing 30% of the radiance being emitted by the steel, plus 70% of the radiance being emitted by the background via reflectance off the steel surface. The perception is that the reading is accurate, but the reading is actually a false indication of the steel temperature. Keep in mind, many of these simple IR sensors have a fixed emissivity value of 0.95, while others may have an adjustable emissivity for the more knowledgeable user. For this illustration, we are maintaining the 0.95 emissivity value.
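The cancellation at work here can be sketched numerically. The code below is a minimal gray-body model using the full-spectrum Stefan-Boltzmann law; a real 8-14 µm sensor integrates only its band, so absolute numbers for mixed-temperature scenes differ, but the key effect holds regardless: when target and background sit at the same temperature, emissivity drops out and the reading looks correct.

```python
# Sketch: gray-body radiosity seen by an IR sensor (full-spectrum
# Stefan-Boltzmann model; a real 8-14 um sensor integrates only its band).
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4

def radiosity(t_surface_k, t_background_k, emissivity):
    """Emitted fraction plus reflected background for an opaque surface."""
    reflectivity = 1.0 - emissivity
    return (emissivity * SIGMA * t_surface_k**4
            + reflectivity * SIGMA * t_background_k**4)

def apparent_temp_c(radiosity_w, assumed_emissivity=1.0):
    """Temperature the sensor would report for a given radiosity."""
    return (radiosity_w / (assumed_emissivity * SIGMA)) ** 0.25 - 273.15

# Room-temperature case: steel (emissivity 0.30) and background both at 20 C.
t_room_k = 293.15
j = radiosity(t_room_k, t_room_k, 0.30)
print(round(apparent_temp_c(j), 1))  # 20.0 -- the reading looks "right"
```

Because 30% of the blackbody radiance plus 70% of an identical background radiance sums to exactly the blackbody radiance, the false reading happens to agree with the thermocouple, which is precisely the trap described above.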

If we heat the wood and steel disks side by side on a hot plate to a uniform 100°C (212°F) and repeat the experiment, the thermocouple probe in proper contact accurately reads 100°C on both the wood and the steel, while the IR sensor reports the wood at 100°C and the steel at only 44°C (111°F).

The infrared measurement is technically accurate in its measurement of total radiosity. The infrared sensor "sees" radiosity equating to 44°C from the steel surface, with 30% of the radiosity emitted by the steel and 70% reflected from the cooler walls and ceiling onto the steel surface. So while the infrared sensor is quick and easy to use, the result carries significant uncertainty. The key contributing factors are the surface emissivity and background radiosity, along with the field of view and the angle of view. The emissivity of a surface varies with the angle of view, and shallow (grazing) angles result in increased reflectivity. The field of view determines the spot size of the measurement area.


Using the Shortest Wavelength on Your IR Sensor Improves Accuracy

Taking a closer look at our example with the wood and steel disks, the IR temperature sensor used an 8-14 µm IR detector. At this wavelength and at 100°C, the temperature error is about 0.8°C per percentage point of error in the emissivity. The IR gun was set to an emissivity of 1.0, while the actual emissivity was 0.3, an error of 70 percentage points. Thus, 70 x 0.8°C = 56°C of error (44°C measurement + 56°C error = 100°C actual temperature).
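This rule-of-thumb arithmetic is simple enough to sketch directly. The sensitivity figure (0.8°C per percentage point for an 8-14 µm detector at 100°C) is taken from the example above; the helper function name is illustrative.

```python
# Sketch of the rule-of-thumb error arithmetic for an 8-14 um sensor
# at 100 C: ~0.8 C of reading error per percentage point of emissivity error.
def reading_error_c(set_emissivity, actual_emissivity, c_per_point):
    """Reading error (C) implied by an emissivity mismatch."""
    points_off = (set_emissivity - actual_emissivity) * 100
    return points_off * c_per_point

error = round(reading_error_c(1.0, 0.30, 0.8), 1)  # 70 points x 0.8 C/point
print(error)         # 56.0
print(100.0 - error)  # 44.0 -- the reading observed on the steel disk
```

The same function applies at any temperature once you know the sensor's per-point sensitivity at that temperature.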

Let's consider how this affects measurements of hot-metal surfaces, such as one at 1280°C, comparing an 8-14 µm sensor with a 0.55 µm sensor. The most readily apparent difference is range: the 8-14 µm sensor can be selected with a temperature range of 0-1500°C, while the 0.55 µm sensor has a measurement range of 1000-3000°C and cannot measure anything cooler than 1000°C.

If you used the 8-14 µm sensor for the 1280°C measurement, you would find that each 1% error in emissivity produces roughly a 10°C error in the reading. Keeping the emissivity set at 1.0 when the actual emissivity is 0.3 would therefore result in an error of approximately 743°C. The 0.55 µm sensor, by contrast, would err by only about 104°C for the same 1280°C measurement. The shorter the wavelength, the smaller the error due to emissivity.
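The comparison can be sketched with the same linear rule of thumb. The ~10°C-per-point figure for the 8-14 µm sensor comes from the text; the 1.5°C-per-point value for the 0.55 µm sensor is an assumption back-calculated from the ~104°C total quoted above. The exact totals (~743°C and ~104°C) include wavelength-band effects this linear sketch omits, so it gives the right order of magnitude, not the precise figures.

```python
# Sketch: linear emissivity-error comparison at 1280 C for two sensors.
# Per-point sensitivities are approximate; the 0.55 um value is an
# assumption inferred from the ~104 C total error quoted in the text.
SENSITIVITY_C_PER_POINT = {
    "8-14 um": 10.0,  # ~10 C per 1% emissivity error at 1280 C
    "0.55 um": 1.5,   # assumed, back-calculated from ~104 C / 70 points
}

POINTS_OFF = 70  # emissivity set to 1.0, actual 0.30

for band, sensitivity in SENSITIVITY_C_PER_POINT.items():
    print(band, POINTS_OFF * sensitivity)  # linear error estimate in C
```

Even at this rough level, the short-wavelength sensor's error is smaller by roughly a factor of seven, which is the practical argument for choosing it.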

The typical user is not aware of the physics associated with these measurements, but the uncertainty is minimized by using the shortest wavelength available for the temperatures you need to measure.


Measuring Radiosity

IR temperature sensors are very common, but their misuse is even more common. Many users have the mistaken impression that the laser somehow makes the temperature measurement. It does not. The laser's only purpose is to show roughly where the sensor is pointed; the area the infrared sensor "sees" is far larger than the small red dot.

With a thermocouple probe, we can easily see the size of the sensing element, but the measurement area of an IR sensor is harder to visualize. This matters most when trying to measure the temperature of small targets. In this illustration, the IR temperature sensor has an 8:1 field of view, meaning the sensor observes a 1-inch-diameter spot at a distance of 8 inches, and the spot grows at that 8:1 ratio as the sensor moves farther away. The point of this exercise is to demonstrate that "infrared temperature sensors" do not actually measure temperature; they measure radiosity, the total radiation (emitted plus reflected) leaving a surface.
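The distance-to-spot geometry above reduces to one division. A brief sketch (the function name is illustrative):

```python
# Sketch: spot (measurement-area) diameter for a given distance-to-spot
# ratio, e.g. the 8:1 field of view described in the text.
def spot_diameter(distance, d_to_s_ratio=8.0):
    """Diameter of the measured spot at a given distance (same units)."""
    return distance / d_to_s_ratio

print(spot_diameter(8))   # 1.0 -- 1-inch spot at 8 inches
print(spot_diameter(24))  # 3.0 -- 3-inch spot at 24 inches
```

If the target is smaller than the computed spot, the sensor averages in background radiosity from around the target, which is exactly why small targets are so hard to measure accurately.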



Accurate temperature measurements are critical to most manufacturing processes. Under a given set of conditions, IR temperature sensors work reliably and produce repeatable temperature measurements, but not necessarily accurate ones. Accuracy comes from properly calibrating sensors to the specific materials and applications. When properly applied to specific materials and processes, infrared temperature sensors provide fast, repeatable and accurate temperature measurements.

Part 2 of this article will explore ways to improve accuracy through proper calibration and to take temperature measurements from highly reflective bodies.