Temperature Scales
In its simplest definition, a temperature scale uses numbers to classify how hot or cold
something is. The Celsius scale is the most commonly used scale of temperature around the
world, with units of degrees Celsius. At 1 atmosphere of pressure, 0 °C is the temperature at
which water freezes, and 100 °C is the temperature at which it boils. A nerd will tell you that’s
technically not true, but we’ll get to that later.
The history of the Celsius scale begins all the way back in the year 1742, with Swedish physicist
and astronomer Anders Celsius. Though he invented the scale that would eventually become
Celsius as we know it, it involved a choice that would seem bizarre to us today: 0 was boiling
and 100 was freezing, so higher numbers represented colder temperatures. Indeed, since
temperature is a kind of energy measurement, this would mean that higher numbers
counterintuitively correspond to lower energy.
However, Celsius himself was well aware of this; you will recall that he lived in Sweden, where it gets exceptionally cold, so he chose this orientation to avoid dealing with negative numbers too much. Additionally, his choice was based on the earlier Delisle scale, which follows a similar orientation. Celsius’s water-based approach sparked a revolution in temperature scale standardization. His work was highly respected in the scientific community, though pretty much everyone agreed on making one change: flip 0 and 100. And with that, the modern Celsius scale was invented.
There is still one remaining convention to discuss: the name of the scale. The standard name
was centigrade beginning in the 1800s. However, this could also mean 1/100 of a gradian,
where the gradian is a unit of angle measurement, as covered previously. Due to this ambiguity,
the name Celsius was adopted for the scale in honor of Anders Celsius, even though he
technically didn’t invent it. This is the standard name of the scale today in scientific
communities, though centigrade persists in some more colloquial contexts.
Fahrenheit
The Fahrenheit scale, with units of degrees Fahrenheit, is likely familiar to those living in the
United States of America, as well as places with heavy US influence. Its story begins in 1724,
18 years before that of Celsius, with Polish–Lithuanian physicist and inventor Daniel Gabriel
Fahrenheit.
It is believed that the Fahrenheit scale was initially defined using two fixed points at 0 and 90. 0
was chosen as the stable temperature of a brine made of ice, water, and ammonium chloride;
90 was based on an estimate of average human body temperature.
The scale was based on earlier work by Danish astronomer Ole Christensen Rømer, inventor of
the Rømer scale. Unfortunately, we’re not discussing these temperature scales in chronological
order, so we’ll get to that later, too. A later adjustment put 32 as water’s freezing point and 96 as
human body temperature, leaving 64 degrees between; Daniel Fahrenheit chose a power of two
so that degrees could be marked just by repeatedly splitting intervals in half.
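To see why the power of two helps: 96 - 32 = 64 = 2^6, so starting from the two fixed points and halving the interval six times in a row marks off individual degrees.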
However, soon after, Anders Celsius’s work meant that using water to measure temperature
was the hot new thing. So the Fahrenheit scale underwent one final major revision: 32 and 212
became the freezing and boiling points of water, respectively. This puts the brine’s stable
temperature at about 4 degrees and the average human body temperature at about 98.6
degrees—a modest difference from Daniel Fahrenheit’s original vision. This final definition gives us these formulas for Celsius–Fahrenheit conversion:
f = (9/5)c + 32
c = (5/9)(f - 32)
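If you’d rather let a computer do the arithmetic, here’s a minimal Python sketch of those two formulas (the function names are just for illustration):

def celsius_to_fahrenheit(c):
    # f = (9/5)c + 32
    return (9 / 5) * c + 32

def fahrenheit_to_celsius(f):
    # c = (5/9)(f - 32)
    return (5 / 9) * (f - 32)

# Sanity checks against the fixed points: water freezes at 32 °F and boils at 212 °F
assert celsius_to_fahrenheit(0) == 32
assert celsius_to_fahrenheit(100) == 212
assert fahrenheit_to_celsius(32) == 0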
Though Fahrenheit was in popular use among English-speaking countries for a while, it largely
fell out of favor throughout the 20th century as Celsius took over—except in the US, as
previously mentioned. Fahrenheit and Celsius each have their supporters—one might claim that Fahrenheit’s 0 to 100 range closely matches the temperatures humans actually live in and that rounding in Fahrenheit is more useful, while another might point out that Celsius’s 0
to 100 range was designed to work nicely with water, which humans find themselves using
extremely frequently. Feel free to discuss while respecting each other’s humanity and not
sparking massive flame wars. Thank you.
Kelvin
The Kelvin scale is the temperature scale used in the International System of Units, or SI. The
International System of Units is exactly what it says on the tin: a standardized measurement
system to be used by everyone. In order to understand the Kelvin scale, let’s start really thinking
about what temperature actually is. It begins with a concept called kinetic energy, which is the
energy of an object that comes from its motion. This is determined by the amount of work
required to accelerate an object to a given speed. More mass means more kinetic energy
because heavier objects need more work to be moved; similarly, more speed means more
kinetic energy because you need more work to make it go faster.
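For the record, the standard formula: an object of mass m moving at speed v has kinetic energy

KE = (1/2)mv²

so doubling the mass doubles the kinetic energy, while doubling the speed quadruples it.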
Now, temperature is simply a measure of the average kinetic energy of the particles in a
material. Ultimately, these particles are jiggling around at the atomic level, and temperature just
tells us how much they jiggle. So what if there’s no jiggling at all in the material? That would
mean it has the lowest possible temperature, which has a special name: absolute zero, or
-273.15 degrees Celsius. This is where 0 is on the Kelvin scale, making it something called an
absolute temperature scale. This is the least arbitrary possible choice for 0 in a temperature
scale, so if meaningfulness of point choices is something you value, then Kelvin is the scale for
you. A kelvin itself has the same magnitude as a degree Celsius: a change by 1 kelvin equates
to a change by 1 degree Celsius.
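Since the two scales differ only in where they put 0, converting between them is just a shift:

k = c + 273.15
c = k - 273.15

where k is a temperature in kelvins and c is the same temperature in degrees Celsius.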
Quick note on terminology: the Kelvin scale, as defined in the SI, does not use degrees. The
units are kelvins, not degrees Kelvin. As for capitalization, the K is uppercase for the scale but
lowercase for the unit. Of course, this only matters for pedantry.
The history of the Kelvin scale begins in 1848 with British physicist and engineer William
Thomson, 1st Baron Kelvin. At the time, Kelvin calculated absolute zero as being about -273 degrees Celsius, remarkably close to the value we use today. This was accompanied by a proposal for an
“absolute Celsius” scale, the predecessor of the modern Kelvin scale. However, the initial
system was flawed and had to be redefined in 1854, and then the 10th General Conference on
Weights and Measures re-redefined it in 1954.
The unit name changed from degrees absolute Celsius to degrees Kelvin, then to just kelvins
between 1967 and 1968. The story concludes with the 2019 SI revision, which centered on
redefining units based on universal constants; the kelvin in particular was redefined in terms of
the Boltzmann constant, and that definition stands today. All other temperature scales are now
defined in terms of the Kelvin scale, so 0 and 100 degrees Celsius aren’t exactly the freezing
and boiling points of water anymore, as previously mentioned.
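For the record, that redefinition works by fixing the Boltzmann constant at exactly

k_B = 1.380649 × 10⁻²³ joules per kelvin

which ties the kelvin to energy rather than to any particular substance.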
Lightning round
With the main three temperature scales done, here are a bunch of weird and obscure
temperature scales in quick succession:
Rankine
This one’s unit is the degree Rankine, written °R or °Ra. It’s Fahrenheit, but 0 degrees Rankine
is absolute zero. This was proposed in 1859 by Scottish mathematician and physicist William
John Macquorn Rankine. Some call the units “rankines” instead, like with kelvins. This one is
not used much.
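Since a degree Rankine is the same size as a degree Fahrenheit and 0 °R sits at absolute zero (which works out to -459.67 °F using the conversion formula from earlier), converting is just a shift:

r = f + 459.67

where r is a temperature in degrees Rankine and f is the same temperature in degrees Fahrenheit.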
Rømer
This is the Fahrenheit forerunner from before, invented by Ole Christensen Rømer in 1702; its unit is written °Rø. It was defined so that 7.5 and 60 degrees are water’s freezing and boiling points, respectively. Historians hypothesize that 0 degrees was based on the brine
temperature from before. One version of Fahrenheit took Rømer and multiplied everything by 4
to eliminate fractions.
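As a quick check on the multiply-by-4 story: water’s freezing point at 7.5 degrees Rømer times 4 gives 30, close to the eventual 32, and dividing the original Fahrenheit body-temperature point of 90 by 4 puts body temperature at about 22.5 degrees Rømer.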
Delisle
This is the Celsius forerunner from before, invented by French cartographer and astronomer Joseph Nicolas Delisle in 1732; its unit is written °D. Again, higher numbers are colder on this scale. It was used in Russia for a while.
Newton
Invented by Sir Isaac Newton in 1701. Newton was a brilliant mathematician and physicist, but
he was not a good temperature scale inventor: his scale uses numerous reference points, many of which are completely subjective, such as “the greatest heat of a bath which one
can endure for some time when the hand is dipped in and is kept still”. The objective parts taken
together make for a completely inconsistent and incoherent system. However, this scale was
likely only intended for personal use, so we can’t judge it too harshly.
Réaumur
Based on a 1730 scale by French entomologist René Antoine Ferchault de Réaumur. 0 and 80
degrees are the melting and boiling points of water. You’re probably starting to see a pattern
here. This one was widespread throughout Europe up to the 19th century.
Wedgwood
An 18th-century scale by English potter and abolitionist Josiah Wedgwood, intended to be used
for metals. 0 degrees Wedgwood was 580.8 degrees Celsius, and step sizes were 72 degrees
Celsius. Unfortunately, in creating this system, Wedgwood overshot the melting points of
copper, silver, and gold each by at least 1,400 degrees Celsius. A later correction revealed that
the starting point was about 300 degrees Celsius too high and the steps nearly twice as big as they should have been, but the melting points of those metals were still overshot even then.
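Taking the stated figures at face value, a nominal reading of w degrees Wedgwood would correspond to roughly

c = 580.8 + 72w

degrees Celsius, so even 20 degrees Wedgwood already lands past 2,000 degrees Celsius, which gives a sense of how the metal melting points ended up so wildly overshot.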