Thermometer


A thermometer is a device that measures temperature or a temperature gradient. A thermometer has two important elements: a temperature sensor in which some change occurs with a change in temperature; and some means of converting this change into a numerical value. Thermometers are widely used in technology and industry to monitor processes, in meteorology, in medicine, and in scientific research. During the COVID-19 pandemic they were used by businesses to detect the fever brought on by the virus.
Some of the principles of the thermometer were known to Greek philosophers of two thousand years ago. The Italian physician Santorio Santorio is commonly credited with the invention of the first thermometer, but its standardisation was completed through the 17th and 18th centuries.
There are many types of thermometer, used in fields such as science, medicine, industry, and geography.

History

While an individual thermometer is able to measure degrees of hotness, the readings on two thermometers cannot be compared unless they conform to an agreed scale. Today there is an absolute thermodynamic temperature scale. Internationally agreed temperature scales are designed to approximate this closely, based on fixed points and interpolating thermometers. The most recent official temperature scale is the International Temperature Scale of 1990. It extends from 0.65 K (−272.5 °C) to approximately 1,358 K (1,085 °C).

Early developments

Various authors have credited the invention of the thermometer to Hero of Alexandria. The thermometer was not a single invention, however, but a development.
Hero of Alexandria knew of the principle that certain substances, notably air, expand and contract and described a demonstration in which a closed tube partially filled with air had its end in a container of water. The expansion and contraction of the air caused the position of the water/air interface to move along the tube.
Such a mechanism was later used to show the hotness and coldness of the air with a tube in which the water level is controlled by the expansion and contraction of the gas. These devices were developed by several European scientists in the 16th and 17th centuries, notably Galileo Galilei and Santorio Santorio. As a result, devices were shown to produce this effect reliably, and the term thermoscope was adopted because it reflected the changes in sensible heat. The difference between a thermoscope and a thermometer is that the latter has a scale. Though Galileo is often said to be the inventor of the thermometer, there is no surviving document showing that he actually produced any such instrument.
The first clear diagram of a thermoscope was published in 1617 by Giuseppe Biancani; the first diagram showing a scale, and thus constituting a thermometer, was published by Santorio Santorio in 1625. This was a vertical tube, closed by a bulb of air at the top, with the lower end opening into a vessel of water. The water level in the tube is controlled by the expansion and contraction of the air, so it is what we would now call an air thermometer.
The word thermometer first appeared in 1624 in La Récréation Mathématique by J. Leurechon, who describes one with a scale of 8 degrees. The word comes from the Greek words θερμός, thermos, meaning "hot", and μέτρον, metron, meaning "measure".
The above instruments suffered from the disadvantage that they were also barometers, i.e. sensitive to air pressure. In 1629, Joseph Solomon Delmedigo, a student of Galileo and Santorio in Padua, published what is apparently the first description and illustration of a sealed liquid-in-glass thermometer. It is described as having a bulb at the bottom of a sealed tube partially filled with brandy. The tube had a numbered scale. Delmedigo did not claim to have invented this instrument. Nor did he name anyone else as its inventor. In about 1654, Ferdinando II de' Medici, Grand Duke of Tuscany did produce such an instrument, the first modern-style thermometer, dependent on the expansion of a liquid and independent of air pressure. Many other scientists experimented with various liquids and designs of thermometer.
However, each inventor and each thermometer was unique — there was no standard scale. In 1665, Christiaan Huygens suggested using the melting and boiling points of water as standards and, in 1694, Carlo Renaldini proposed using them as fixed points on a universal scale. In 1701, Isaac Newton proposed a scale of 12 degrees between the melting point of ice and body temperature.

Era of precision thermometry

In 1714, Dutch scientist and inventor Daniel Gabriel Fahrenheit invented the first reliable thermometer, using mercury instead of alcohol and water mixtures. In 1724, he proposed a temperature scale which now bears his name. He could do this because he manufactured thermometers, using mercury for the first time, and the quality of his production could provide a finer scale and greater reproducibility, leading to its general adoption. In 1742, Anders Celsius proposed a scale with zero at the boiling point and 100 degrees at the freezing point of water, though the scale which now bears his name has them the other way around. French entomologist René Antoine Ferchault de Réaumur invented an alcohol thermometer and temperature scale in 1730 that ultimately proved to be less reliable than Fahrenheit's mercury thermometer.
The first physician to use thermometer measurements in clinical practice was Herman Boerhaave. In 1866, Sir Thomas Clifford Allbutt invented a clinical thermometer that produced a body temperature reading in five minutes as opposed to twenty. In 1999, Dr. Francesco Pompei of the Exergen Corporation introduced the world's first temporal artery thermometer, a non-invasive temperature sensor which scans the forehead in about two seconds and provides a medically accurate body temperature.

Registering

Traditional thermometers were all non-registering thermometers. That is, the thermometer did not hold the temperature reading after it was moved to a place with a different temperature. Determining the temperature of a pot of hot liquid required the user to leave the thermometer in the hot liquid until after reading it. If the non-registering thermometer was removed from the hot liquid, then the temperature indicated on the thermometer would immediately begin changing to reflect the temperature of its new conditions. Registering thermometers are designed to hold the temperature indefinitely, so that the thermometer can be removed and read at a later time or in a more convenient place. Mechanical registering thermometers hold either the highest or lowest temperature recorded, until manually re-set, e.g., by shaking down a mercury-in-glass thermometer, or until an even more extreme temperature is experienced. Electronic registering thermometers may be designed to remember the highest or lowest temperature, or to remember whatever temperature was present at a specified point in time.
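As a rough sketch of the electronic registering idea described above, the following Python snippet (the class and method names are purely illustrative, not any real device's API) holds the highest and lowest readings seen until it is reset, analogous to a mechanical maximum/minimum thermometer.

```python
class RegisteringThermometer:
    """Minimal sketch of an electronic max/min registering thermometer.

    Names here are illustrative only, not a real instrument interface.
    """

    def __init__(self):
        self.max_reading = None  # highest temperature seen since last reset
        self.min_reading = None  # lowest temperature seen since last reset

    def record(self, temperature_c):
        """Register a new reading, updating the held extremes."""
        if self.max_reading is None or temperature_c > self.max_reading:
            self.max_reading = temperature_c
        if self.min_reading is None or temperature_c < self.min_reading:
            self.min_reading = temperature_c

    def reset(self):
        """Clear the held extremes, analogous to shaking down a mercury max/min thermometer."""
        self.max_reading = None
        self.min_reading = None


thermo = RegisteringThermometer()
for reading in [18.2, 23.7, 21.0, 19.5]:
    thermo.record(reading)
print(thermo.max_reading, thermo.min_reading)  # 23.7 18.2
```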
Thermometers increasingly use electronic means to provide a digital display or input to a computer.

Physical principles of thermometry

Thermometers may be described as empirical or absolute. Absolute thermometers are calibrated numerically by the thermodynamic absolute temperature scale. Empirical thermometers are not in general necessarily in exact agreement with absolute thermometers as to their numerical scale readings, but to qualify as thermometers at all they must agree with absolute thermometers and with each other in the following way: given any two bodies isolated in their separate respective thermodynamic equilibrium states, all thermometers agree as to which of the two has the higher temperature, or that the two have equal temperatures. For any two empirical thermometers, this does not require that the relation between their numerical scale readings be linear, but it does require that relation to be strictly monotonic. This is a fundamental character of temperature and thermometers.
As it is customarily stated in textbooks, taken alone, the so-called "zeroth law of thermodynamics" fails to deliver this information, but the statement of the zeroth law of thermodynamics by James Serrin in 1977, though rather mathematically abstract, is more informative for thermometry: "Zeroth Law – There exists a topological line M which serves as a coordinate manifold of material behaviour. The points of the manifold are called 'hotness levels', and M is called the 'universal hotness manifold'." To this information there needs to be added a sense of greater hotness; this sense can be had, independently of calorimetry, of thermodynamics, and of properties of particular materials, from Wien's displacement law of thermal radiation: the temperature of a bath of thermal radiation is proportional, by a universal constant, to the frequency of the maximum of its frequency spectrum; this frequency is always positive, but can have values that tend to zero. Another way of identifying hotter as opposed to colder conditions is supplied by Planck's principle: when a process of isochoric adiabatic work is the sole means of change of internal energy of a closed system, the final state of the system is never colder than the initial state; except for phase changes with latent heat, it is hotter than the initial state.
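As a numerical illustration of the Wien's-law point above, the frequency form of the displacement law makes the peak frequency directly proportional to absolute temperature, nu_max ≈ (5.879 × 10^10 Hz/K) × T, so locating the spectral peak identifies the temperature. The short sketch below simply applies that proportionality; the constant is the standard frequency-form displacement constant.

```python
# Frequency form of Wien's displacement law: nu_max = B_FREQ * T,
# with B_FREQ ≈ 5.879e10 Hz per kelvin (a standard physical constant).
B_FREQ = 5.879e10  # Hz / K

def peak_frequency(temperature_k):
    """Peak frequency (Hz) of blackbody radiation at the given temperature (K)."""
    return B_FREQ * temperature_k

def temperature_from_peak(nu_max_hz):
    """Invert the proportionality: infer temperature (K) from the spectral peak (Hz)."""
    return nu_max_hz / B_FREQ

print(peak_frequency(300))            # ~1.76e13 Hz for a room-temperature bath
print(temperature_from_peak(1.6e11))  # ~2.7 K, roughly the cosmic microwave background
```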
There are several principles on which empirical thermometers are built, as listed in the section of this article entitled "Primary and secondary thermometers". Several such principles are essentially based on the constitutive relation between the state of a suitably selected particular material and its temperature. Only some materials are suitable for this purpose, and they may be considered as "thermometric materials". Radiometric thermometry, in contrast, can be only slightly dependent on the constitutive relations of materials. In a sense then, radiometric thermometry might be thought of as "universal". This is because it rests mainly on a universality character of thermodynamic equilibrium, that it has the universal property of producing blackbody radiation.

Thermometric materials

There are various kinds of empirical thermometer based on material properties.
Many empirical thermometers rely on the constitutive relation between pressure, volume and temperature of their thermometric material. For example, mercury expands when heated.
If it is used for its relation between pressure, volume and temperature, a thermometric material must have three properties:
  1. Its heating and cooling must be rapid. That is to say, when a quantity of heat enters or leaves a body of the material, the material must expand or contract to its final volume or reach its final pressure and must reach its final temperature with practically no delay; some of the heat that enters can be considered to change the volume of the body at constant temperature, and is called the latent heat of expansion at constant temperature, and the rest of it can be considered to change the temperature of the body at constant volume, and is called the specific heat at constant volume. Some materials do not have this property, and take some time to distribute the heat between temperature and volume change.
  2. Its heating and cooling must be reversible. That is to say, the material must be able to be heated and cooled indefinitely often by the same increment and decrement of heat, and still return to its original pressure, volume and temperature every time. Some plastics do not have this property.
  3. Its heating and cooling must be monotonic. That is to say, throughout the range of temperatures over which it is intended to work, its volume at fixed pressure, or its pressure at fixed volume, must increase strictly with temperature.
At temperatures around 4 °C, water does not have this property, and is said to behave anomalously in this respect; thus water cannot be used as a material for this kind of thermometry for temperature ranges near 4 °C.
Gases, on the other hand, have all three of these properties. Consequently, they are suitable thermometric materials, and that is why they were important in the development of thermometry.
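As a minimal sketch of how a constitutive relation is used in practice, suppose (as an assumption for illustration) that the thermometric liquid expands linearly with temperature, V(T) = V0(1 + β(T − T0)); because the relation is strictly monotonic it can be inverted to read a temperature from an observed volume. The expansion coefficient below is illustrative rather than a measured property of any particular liquid.

```python
# Minimal sketch: a liquid-in-glass style thermometer modelled by a linear
# volumetric expansion law V(T) = V0 * (1 + BETA * (T - T0)).
# BETA is an illustrative expansion coefficient; real liquids are only
# approximately linear over limited temperature ranges.

V0 = 1.000      # reference volume at T0 (arbitrary units)
T0 = 0.0        # reference temperature, degrees Celsius
BETA = 1.8e-4   # illustrative volumetric expansion coefficient per kelvin

def volume_at(temperature_c):
    """Volume of the thermometric liquid at a given temperature."""
    return V0 * (1 + BETA * (temperature_c - T0))

def temperature_from_volume(volume):
    """Invert the monotonic volume-temperature relation to read off a temperature."""
    return T0 + (volume / V0 - 1) / BETA

v = volume_at(37.0)
print(v)                           # slightly above V0
print(temperature_from_volume(v))  # recovers 37.0
```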

Constant volume thermometry

According to Preston, Regnault found constant pressure air thermometers unsatisfactory, because they needed troublesome corrections. He therefore built a constant volume air thermometer. Constant volume thermometers do not provide a way to avoid the problem of anomalous behaviour like that of water at approximately 4 °C.

Radiometric thermometry

Planck's law very accurately and quantitatively describes the power spectral density of electromagnetic radiation, inside a rigid-walled cavity in a body made of material that is completely opaque and poorly reflective, when it has reached thermodynamic equilibrium, as a function of absolute thermodynamic temperature alone. A small enough hole in the wall of the cavity emits near enough blackbody radiation of which the spectral radiance can be precisely measured. The walls of the cavity, provided they are completely opaque and poorly reflective, can be of any material indifferently. This provides a well-reproducible absolute thermometer over a very wide range of temperatures, able to measure the absolute temperature of a body inside the cavity.
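A minimal sketch of how such a radiometric measurement could be turned into a temperature, assuming an ideal blackbody and a single-wavelength radiance measurement: Planck's law gives the spectral radiance at a given wavelength and temperature, and since that radiance rises monotonically with temperature the relation can be inverted numerically. The function names and bisection bounds are illustrative choices, not a standard instrument interface.

```python
import math

# Physical constants (SI)
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m / s
KB = 1.380649e-23    # Boltzmann constant, J / K

def planck_radiance(wavelength_m, temperature_k):
    """Spectral radiance B_lambda of a blackbody (W sr^-1 m^-3), from Planck's law."""
    a = 2.0 * H * C ** 2 / wavelength_m ** 5
    x = H * C / (wavelength_m * KB * temperature_k)
    return a / math.expm1(x)

def temperature_from_radiance(wavelength_m, radiance, lo=100.0, hi=20000.0):
    """Invert Planck's law at one wavelength by bisection (radiance rises with T)."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if planck_radiance(wavelength_m, mid) < radiance:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

b = planck_radiance(650e-9, 3000.0)          # radiance at 650 nm from a 3000 K cavity
print(temperature_from_radiance(650e-9, b))  # ~3000 K recovered
```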

Primary and secondary thermometers

A thermometer is called primary or secondary based on how the raw physical quantity it measures is mapped to a temperature. As summarized by Kauppinen et al., "For primary thermometers the measured property of matter is known so well that temperature can be calculated without any unknown quantities. Examples of these are thermometers based on the equation of state of a gas, on the velocity of sound in a gas, on the thermal noise voltage or current of an electrical resistor, and on the angular anisotropy of gamma ray emission of certain radioactive nuclei in a magnetic field."
In contrast, "Secondary thermometers are most widely used because of their convenience. Also, they are often much more sensitive than primary ones. For secondary thermometers knowledge of the measured property is not sufficient to allow direct calculation of temperature. They have to be calibrated against a primary thermometer at least at one temperature or at a number of fixed temperatures. Such fixed points, for example, triple points and superconducting transitions, occur reproducibly at the same temperature."
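As a simple sketch of the primary-thermometer idea quoted above, an idealized constant-volume gas thermometer yields temperature directly from the equation of state PV = nRT, using only measured quantities and a universal constant; real gas thermometry additionally corrects for the non-ideality of the gas.

```python
# Idealized constant-volume gas thermometer: T = P V / (n R).
# This is a sketch of the primary-thermometer idea only; practical gas
# thermometry applies corrections for non-ideal gas behaviour.

R = 8.314462618  # molar gas constant, J / (mol K)

def temperature_from_pressure(pressure_pa, volume_m3, amount_mol):
    """Temperature (K) from the ideal-gas equation of state."""
    return pressure_pa * volume_m3 / (amount_mol * R)

# Example: 0.0100 mol of gas in a 250 mL bulb reading 101325 Pa
print(temperature_from_pressure(101325, 250e-6, 0.0100))  # ~304.7 K
```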

Calibration

Thermometers can be calibrated either by comparing them with other calibrated thermometers or by checking them against known fixed points on the temperature scale. The best known of these fixed points are the melting and boiling points of pure water.
The traditional way of putting a scale on a liquid-in-glass or liquid-in-metal thermometer was in three stages (a small numerical sketch follows the list):
  1. Immerse the sensing portion in a stirred mixture of pure ice and water at atmospheric pressure and mark the point indicated when it had come to thermal equilibrium.
  2. Immerse the sensing portion in a steam bath at Standard atmospheric pressure and again mark the point indicated.
  3. Divide the distance between these marks into equal portions according to the temperature scale being used.
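A small numerical sketch of this two-fixed-point procedure, using hypothetical column lengths for the ice-point and steam-point marks:

```python
# Sketch of the traditional two-fixed-point calibration: mark the ice point and
# the steam point, then divide the interval linearly. The column lengths below
# are hypothetical readings from a liquid-in-glass thermometer (millimetres).

ICE_POINT_MM = 12.0     # column length in the ice/water mixture (defines 0 degrees C)
STEAM_POINT_MM = 188.0  # column length in the steam bath (defines 100 degrees C)

def reading_to_celsius(length_mm):
    """Linearly interpolate (or extrapolate) between the two fixed points."""
    return 100.0 * (length_mm - ICE_POINT_MM) / (STEAM_POINT_MM - ICE_POINT_MM)

print(reading_to_celsius(12.0))   # 0.0
print(reading_to_celsius(100.0))  # 50.0
print(reading_to_celsius(188.0))  # 100.0
```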
Other fixed points used in the past include body temperature, which was originally used by Fahrenheit as his upper fixed point, and the lowest temperature given by a mixture of salt and ice, which was originally the definition of 0 °F. As body temperature varies, the Fahrenheit scale was later changed to use an upper fixed point of boiling water at 212 °F.
These have now been replaced by the defining points in the International Temperature Scale of 1990, though in practice the melting point of water is more commonly used than its triple point, the latter being more difficult to manage and thus restricted to critical standard measurement. Nowadays manufacturers will often use a thermostat bath or solid block where the temperature is held constant relative to a calibrated thermometer. Other thermometers to be calibrated are put into the same bath or block and allowed to come to equilibrium, then the scale marked, or any deviation from the instrument scale recorded. For many modern devices, calibration means stating some value to be used in processing an electronic signal to convert it to a temperature.

Precision, accuracy, and reproducibility

The precision or resolution of a thermometer is simply to what fraction of a degree it is possible to make a reading. For high-temperature work it may only be possible to measure to the nearest 10 °C or more. Clinical thermometers and many electronic thermometers are usually readable to 0.1 °C. Special instruments can give readings to one thousandth of a degree. However, this precision does not mean the reading is true or accurate; it only means that very small changes can be observed.
A thermometer calibrated to a known fixed point is accurate at that point. Most thermometers are originally calibrated to a constant-volume gas thermometer. In between fixed calibration points, interpolation is used, usually linear. This may give significant differences between different types of thermometer at points far away from the fixed points. For example, the expansion of mercury in a glass thermometer is slightly different from the change in resistance of a platinum resistance thermometer, so these two will disagree slightly at around 50 °C. There may be other causes due to imperfections in the instrument, e.g. in a liquid-in-glass thermometer if the capillary tube varies in diameter.
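As a toy illustration of this kind of mid-range disagreement, the two response curves below are invented, slightly non-linear raw readings that both pass exactly through the 0 °C and 100 °C fixed points, so a two-point calibration cannot distinguish them, yet they differ by a couple of degrees near 50 °C.

```python
# Toy illustration of interpolation error: two instruments agree exactly at the
# fixed points 0 and 100 degrees C but respond slightly non-linearly in between,
# so their linearly interpolated scales disagree mid-range. The quadratic
# coefficients are invented for illustration, not real material data.

def raw_mercury(true_t):
    """Hypothetical raw reading of a mercury-in-glass thermometer, in scale degrees."""
    return true_t + 0.0004 * true_t * (100.0 - true_t)

def raw_platinum(true_t):
    """Hypothetical raw reading of a platinum resistance thermometer, in scale degrees."""
    return true_t - 0.0006 * true_t * (100.0 - true_t)

# Both raw responses give 0 at 0 degrees C and 100 at 100 degrees C, so a
# two-point linear calibration maps them straight through -- yet at a true
# 50 degrees C they disagree:
print(raw_mercury(50.0))   # 51.0
print(raw_platinum(50.0))  # 48.5
```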
For many purposes reproducibility is important. That is, does the same thermometer give the same reading for the same temperature? Reproducible temperature measurement means that comparisons in scientific experiments are valid and that industrial processes remain consistent. Thus, if the same type of thermometer is calibrated in the same way, its readings will be valid even if it is slightly inaccurate compared to the absolute scale.
An example of a reference thermometer used to check others to industrial standards would be a platinum resistance thermometer with a digital display to 0.1 °C which has been calibrated at 5 points against national standards and which is certified to an accuracy of ±0.2 °C.
According to British Standards, correctly calibrated, used and maintained liquid-in-glass thermometers can achieve a measurement uncertainty of ±0.01 °C in the range 0 to 100 °C, and a larger uncertainty outside this range: ±0.05 °C up to 200 or down to −40 °C, ±0.2 °C up to 450 or down to −80 °C.

Indirect methods of temperature measurement

Thermal expansion
Pressure
Density
Thermochromism
Band edge thermometry
Fluorescence
Optical absorbance spectra
Electrical resistance
Electrical potential
Electrical resonance
Nuclear magnetic resonance
Magnetic susceptibility

Applications

Thermometers utilize a range of physical effects to measure temperature. Temperature sensors are used in a wide variety of scientific and engineering applications, especially measurement systems. Temperature systems are primarily either electrical or mechanical, occasionally inseparable from the system which they control. Thermometers are used in roadways in cold weather climates to help determine if icing conditions exist. Indoors, thermistors are used in climate control systems such as air conditioners, freezers, heaters, refrigerators, and water heaters. Galileo thermometers are used to measure indoor air temperature, due to their limited measurement range.
Liquid crystal thermometers are also used in mood rings and to measure the temperature of water in fish tanks.
Fiber Bragg grating temperature sensors are used in nuclear power facilities to monitor reactor core temperatures and avoid the possibility of nuclear meltdowns.

Nanothermometry

Nanothermometry is an emergent research field dealing with the knowledge of temperature on the sub-micrometric scale. Conventional thermometers cannot measure the temperature of an object smaller than a micrometre, so new methods and materials have to be used. Nanothermometry is used in such cases. Nanothermometers are classified as luminescent or non-luminescent thermometers.

Cryometer

Cryometers are thermometers used specifically for low temperatures.

Medical

Various thermometric techniques have been used throughout history, ranging from the Galileo thermometer to thermal imaging.
Medical thermometers such as mercury-in-glass thermometers, infrared thermometers, pill thermometers, and liquid crystal thermometers are used in health care settings to determine if individuals have a fever or are hypothermic.

Food and food safety

Thermometers are important in food safety, where food held at temperatures in the bacterial-growth danger zone can be prone to potentially harmful levels of bacterial growth after several hours, which could lead to foodborne illness. This includes monitoring refrigeration temperatures and maintaining temperatures in foods being served under heat lamps or hot water baths.
Cooking thermometers are important for determining if a food is properly cooked. In particular, meat thermometers are used to aid in cooking meat to a safe internal temperature while preventing overcooking. They commonly use either a bimetallic coil, or a thermocouple or thermistor with a digital readout.
Candy thermometers are used to aid in achieving a specific water content in a sugar solution based on its boiling temperature.

Environmental

Alcohol thermometers, infrared thermometers, mercury-in-glass thermometers, recording thermometers, thermistors, and Six's thermometers are used in meteorology and climatology in various levels of the atmosphere and oceans. Aircraft use thermometers and hygrometers to determine if atmospheric icing conditions exist along their flight path. These measurements are used to initialize weather forecast models. Thermometers are used in roadways in cold weather climates to help determine if icing conditions exist and indoors in climate control systems.