
Why is it important to calibrate temperature instruments

The most commonly measured variable in industry is temperature. Temperature strongly influences many physical properties of matter, and its effect on quality, energy consumption and environmental emissions is significant.

Temperature, being a state of equilibrium, differs from other measurable quantities. A temperature measurement involves several time constants, and it is crucial to wait until thermal equilibrium is reached before measuring. Metrology provides mathematical formulas for calculating the uncertainty.

The reference polynomials are specified in the ITS-90 (International Temperature Scale of 1990) tables. For each measurement, a model that includes all influencing factors must be created.

Determine accuracy

Every temperature measurement is different, which makes the temperature calibration process slow and expensive. While standards specify the accuracy with which manufacturers must comply, they do not guarantee that this accuracy is permanent.

The user must therefore verify that the accuracy is maintained over time. If temperature is a significant measured variable from the point of view of the process, both the instrument and the temperature sensor must be calibrated.

It is worth keeping in mind an old saying: all meters, including sensors, read incorrectly; calibration shows by how much.

Temperature sensors

Industrial temperature measurement relies most often on sensors that either convert temperature into resistance (Resistance Temperature Detectors, RTD) or into a low voltage (thermocouples, T/C). RTDs are based on the fact that the resistance of a metal changes with temperature. The Pt100 is a common RTD type made of platinum; its resistance at 0 °C is 100 ohms.
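The Pt100 relationship between temperature and resistance can be sketched with the standard Callendar-Van Dusen equation from IEC 60751 (for temperatures at or above 0 °C). This is the common industrial model rather than the ITS-90 reference functions; the coefficients below are the nominal standard values.

```python
# Sketch: ideal Pt100 resistance from temperature, using the
# IEC 60751 Callendar-Van Dusen equation for t >= 0 degC.
A = 3.9083e-3   # 1/degC, standard coefficient
B = -5.775e-7   # 1/degC^2, standard coefficient
R0 = 100.0      # ohms at 0 degC, which is what "Pt100" means

def pt100_resistance(t_c: float) -> float:
    """Resistance in ohms of a nominal Pt100 at t_c degC (0..850 degC)."""
    return R0 * (1.0 + A * t_c + B * t_c ** 2)

print(round(pt100_resistance(100.0), 2))  # ~138.51 ohm at 100 degC
```

A real sensor deviates slightly from these nominal coefficients, which is exactly what calibration quantifies.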

A thermocouple consists of two wires of different metals joined together. If the junctions (hot junction and cold junction) are at different temperatures, a small temperature-dependent voltage can be detected.

This means that the thermocouple is not measuring the temperature, but the difference in temperature.
The most common T/C type is the K-type (NiCr/NiAl).
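Because the thermocouple reports only a temperature difference, the instrument must add the cold-junction temperature back in (cold-junction compensation). The sketch below uses a crude linear model with an approximate type-K sensitivity of about 41 µV/°C; real instruments use the NIST ITS-90 polynomials, which are strongly non-linear.

```python
# Sketch: cold-junction compensation for a thermocouple.
# Assumption: a linear type-K model (~41 uV/degC); real practice
# uses the NIST ITS-90 reference polynomials instead.
SEEBECK_K = 41e-6  # V/degC, approximate type-K sensitivity

def measured_emf(t_hot_c: float, t_cold_c: float) -> float:
    # The thermocouple produces an EMF proportional to the
    # DIFFERENCE between hot- and cold-junction temperatures.
    return SEEBECK_K * (t_hot_c - t_cold_c)

def hot_junction_temp(emf_v: float, t_cold_c: float) -> float:
    # Compensation: convert EMF back to a difference, then add
    # the separately measured cold-junction temperature.
    return emf_v / SEEBECK_K + t_cold_c

emf = measured_emf(500.0, 25.0)
print(hot_junction_temp(emf, 25.0))  # recovers 500.0 degC
```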

Despite their lower sensitivity (low Seebeck coefficient), the noble-metal thermocouple types S, R and B (PtRh/Pt, PtRh/PtRh) are used especially at high temperatures for better accuracy and stability.

Temperature transmitters

The signal from a temperature sensor cannot be transmitted reliably over the long distances found in a plant. Temperature transmitters were therefore developed to convert the sensor signal into a format that is easier to transmit.

Most commonly, the transmitter converts the signal from the temperature sensor into a standard 4-20 mA current signal.
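The 4-20 mA scaling is a simple linear mapping of the configured temperature span onto the current range. A minimal sketch, assuming a hypothetical 0-200 °C span:

```python
# Sketch: linear 4-20 mA scaling for a temperature transmitter.
# The 0..200 degC span is an illustrative assumption, not a standard.
def temp_to_ma(t_c: float, lo: float = 0.0, hi: float = 200.0) -> float:
    """Map a temperature within the span [lo, hi] degC onto 4..20 mA."""
    return 4.0 + 16.0 * (t_c - lo) / (hi - lo)

def ma_to_temp(i_ma: float, lo: float = 0.0, hi: float = 200.0) -> float:
    """Inverse mapping: recover temperature from the loop current."""
    return lo + (hi - lo) * (i_ma - 4.0) / 16.0

print(temp_to_ma(100.0))        # 12.0 mA at mid-span
print(ma_to_temp(12.0))         # 100.0 degC back again
```

Using a live zero of 4 mA (rather than 0 mA) lets the receiving system distinguish a genuine low reading from a broken loop.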

Nowadays, transmitters with a digital output signal, such as fieldbus transmitters, are also being adopted. Because the transmitter converts the sensor signal, it also contributes to the total accuracy, and it must therefore be calibrated on a regular basis.

A temperature transmitter can be calibrated using a temperature calibrator.

Calibrating instruments

To calibrate a temperature sensor, it must be placed in a known temperature source. Sensors are calibrated either in temperature dry blocks (industrial field use) or in liquid baths (laboratory use). The calibration is a comparison between the sensor being calibrated and a reference sensor.

The most important criterion in the calibration of temperature sensors is that both sensors are at the same temperature during the comparison. The heat source may also have an internal temperature measurement that can be used as a reference, but for better accuracy and reliability an external reference temperature sensor is recommended.

The uncertainty of the calibration is not the same as the accuracy of the device. Many factors influence the total uncertainty, and the way the calibration is performed is not the least of them. All heat sources exhibit measurement errors due to their mechanical design and thermodynamic properties.

Measurement uncertainty

These effects can be quantified to determine the heat source’s contribution to the measurement uncertainty. The major sources of measurement uncertainty are axial homogeneity, radial homogeneity, loading effect, stability and immersion depth.

Guidelines for minimising measurement uncertainty should be applied according to EURAMET cg-13 (formerly EA-10/13).

The key parameters to understand are as follows:

  • Axial homogeneity is the temperature distribution in the measurement zone along the boring (axial temperature distribution).
  • Radial homogeneity is the difference in temperature occurring between the borings.
  • When several sensors are placed in the borings of the heat source, they affect the accuracy. This phenomenon is called the loading effect.
  • Stability is the variation of the temperature in the measurement zone over time once the system has reached equilibrium; a 30-minute observation period is commonly used.
  • To achieve a more stable calibration, the immersion depth must be sufficient for the sensor being calibrated. Stem conduction, the heat flux along the length of the thermometer stem, affects both the reference sensor and the unit under test.
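In an uncertainty budget, independent components such as those above are typically combined as a root sum of squares, GUM-style, and then multiplied by a coverage factor. A minimal sketch with purely illustrative component values (the numbers below are assumptions, not data from any real heat source):

```python
import math

# Illustrative standard uncertainties in degC for each component
# named in the list above; the values are made up for the example.
components = {
    "axial homogeneity": 0.010,
    "radial homogeneity": 0.005,
    "loading effect": 0.008,
    "stability": 0.004,
    "immersion depth / stem conduction": 0.012,
}

# Combined standard uncertainty: root sum of squares of the parts.
u_c = math.sqrt(sum(u ** 2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (~95 % confidence).
U = 2.0 * u_c
print(round(U, 4))  # ~0.0374 degC for these example values
```

The largest component dominates the result, which is why immersion depth and homogeneity usually deserve the most attention.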

Accuracy requirements

The calibration of instruments and sensors must be performed periodically. An ISO quality system presupposes quality control of calibration: calibration of instruments affecting production, regular calibration of sensors, traceable calibration, and calibration documentation.

The level of performance a calibration device needs to have depends on the accuracy requirements determined by each company. However, the calibration device must always be more accurate than the instrument or sensor being calibrated.

Calibration of instruments and sensors can be carried out either on site or in a laboratory.

[Dirk Kuiper is General Manager, AMS Instrumentation & Calibration.]
