Is there any resource out there that actually teaches the metric system?

In fact, you could say Celsius is already too fiddly, but then, 100 points between freezing and boiling water, nicely decimal, makes sense.

I mean, not as much sense as Fahrenheit, right?
212-32 = 180 = 60 * 3 = 12 * 15
So those make even more sense in a strict mathematical sense; we just use base-10 instead of the more sensible base-12 (or even base-8) because of the arbitrary evolutionary coincidence that we happen to have ten fingers total.
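To make the divisibility point concrete, here's a throwaway sanity check (mine, not anything from the thread) counting how many ways each span splits evenly:

```python
# Quick check of the divisibility claim: 180 (the Fahrenheit span between
# freezing and boiling) versus 100 (the Celsius span).
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(len(divisors(180)), divisors(180))  # 18 even splits
print(len(divisors(100)), divisors(100))  # 9 even splits
```

180 = 2² · 3² · 5 gives twice as many even subdivisions as 100 = 2² · 5², which is the whole base-12-adjacent argument in one line.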

Or conversely, Fahrenheit gets about 100 degrees of useful range: most places in Europe don't tend to go below 0 F or above 100 F, so the part of the scale people use for daily things like meteorology sits in that "nicely decimal" range. Nobody really needs the second half of the Celsius scale that goes from, like, 50 C to 100 C, right?

I mean these "freezing / boiling temp of water" definitional points are arbitrary to begin with! Especially because they are phase-change events that are also dependent on atmospheric pressure, which in turn is temperature-dependent per the ideal gas law. So there is a "fun" "riddle" -- do the definitions of the "freezing" and "boiling" points assume that you're pumping air in / out to maintain some particular constant air pressure? Or do you assume that you've got some normal air pressure at some normal air temperature sealed in a can, so that the air pressure also goes up/down with temperature inside your sealed can, and the phenomenological events of freezing and boiling move up and down with the air pressure?

(The answer is that it's a trick question: these days temperature is defined off the Boltzmann constant.)

Like, fundamentally, the Fahrenheit and Celsius scales are basically fungible, you can't really make a convincing argument that one is definitely better than the other, because they're just different. I mean from my selfish perspective it's almost best that there are two in competition like this, because it forces us to consider these things instead of just taking them as "just so" stories that people never even think to question, never even think to think about.
 
if we're gonna be real grognards about it, then it's gotta be the Pellicer Scale

It's logarithmic, so it treats "absolute zero" as an asymptote the way it actually seems to work in real life
If the asymptotic representation is a concern, one might as well use coldness β = 1/kT and be done with it. Since it is related to the rate of change of entropy with respect to internal energy at constant volume, it is rather fundamental, and it makes absolute zero at +∞ and absolute hot at -∞. One could have some additional factor to flip the sign and/or normalise to the triple point or something else. (But the original Celsius scale was reversed anyway, with higher numbers representing colder rather than hotter temperatures.)
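For the curious, here's a quick sketch (my own, in plain SI units) of coldness β = 1/(kT), showing how it treats absolute zero as the +∞ asymptote described above:

```python
K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)

def coldness(temp_kelvin):
    """Thermodynamic beta = 1/(kT), in 1/J; blows up toward +inf as T -> 0+."""
    return 1.0 / (K_B * temp_kelvin)

for t in (1e-6, 273.16, 300.0, 1e9):
    print(f"T = {t:>10g} K  ->  beta = {coldness(t):.3e} 1/J")
```

A microkelvin gives an astronomically large β, a gigakelvin a tiny one, and negative-temperature systems (discussed further down the thread) would land below zero.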

I don't know why everyone just doesn't use inverse electron-volts (heh) or perhaps information per energy, since E/kT is essentially change in bits per 2-fold change of thermal energy.

More seriously, a logarithmic scale for positive absolute temperatures would be interesting and would force people to be more intuitively familiar with log scales (logarithmic graphing paper seems to be a lost art nowadays), and though here inverted, the general utility of being able to read a log scale makes it not too weird to retrofit such a scale onto the common bulb thermometer design. Maybe essentially something like centiBels.
 
I think there's a good case for a scientific temperature scale that's logarithmic, but log scales aren't intuitive without a decade of math first. And anyway, on the scientific scale that goes from the concept of absolute zero all the way to stars exploding, there's only a tiny little temperature band that's not instantly lethal to humans, where we live our whole lives, so we might as well have a linear temperature scale that spans a couple dozen degrees across that tiny little band, for all the normies that just want to know whether they should wear a coat today, or what. And pushing the degree sign the other direction, so it's a measure of "increasing entropy" rather than "increasing phonon density", would encourage thinking about it in a different way, but again that's... arcane, compared to the tiny little band of habitability for our soft wet mammal bodies.

Like, the purpose of measurements is to be useful, so our units of measure should be scaled to be useful too. Like, the reason the Fahrenheit and Celsius scales are basically the same (linear, about a hundred degrees across (instead of, like, six or a thousand), with zero at "obnoxiously but not lethally cold" and going up to "cooks-food-hot") is that they both converged on design parameters that made them maximally useful to the maximum number of people.
 
More seriously, a logarithmic scale for positive absolute temperatures would be interesting and would force people to be more intuitively familiar with log scales (logarithmic graphing paper seems to be a lost art nowadays), and though here inverted, the general utility of being able to read a log scale makes it not too weird to retrofit such a scale onto the common bulb thermometer design. Maybe essentially something like centiBels.
How would that work, though? I thought that most fluids expand at least approximately linearly with temperature.
 
How would that work, though? I thought that most fluids expand at least approximately linearly with temperature.
I was thinking of simply changing the graduation marks, which in the generic case, over a wide enough range, would be basically analogous but opposite to regular semi-log graph paper, and so isn't anything particularly hairy; logs are more useful than arbitrary smooth monotonic functions anyway.

However, if one wishes to keep the zero point the same, it's not practical, because the scale is simultaneously close to linear over ±50℃ yet not quite, which is probably more difficult to deal with than if it were simply far from linear. Specifically, if the zero point stays the same as Celsius, log(T/273.15) = C/273.15 - (1/2)(C/273.15)² + ..., where C = T - 273.15 is the Celsius reading, so the deviation from linearity over that range is small but not insignificant. One could shift things again to deal with that, but at that point, why bother.
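A quick numeric check of that near-linearity (my own sketch, keeping the normalisation above where the log scale matches Celsius to first order at C = 0):

```python
import math

T0 = 273.15  # zero point pinned to the Celsius zero, in kelvin

def log_scale(c):
    """Log-temperature reading T0 * ln(T/T0), scaled to match Celsius near 0."""
    return T0 * math.log((T0 + c) / T0)

for c in (-50.0, -10.0, 10.0, 50.0):
    print(f"C = {c:+5.0f}  ->  {log_scale(c):+7.2f}  (off by {log_scale(c) - c:+5.2f})")
```

At ±10 the two scales agree to a fraction of a degree, but by ±50 the log reading is off by around 4-5 degrees: exactly the "small but not insignificant" regime that would make the graduation marks awkward.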

You're right to be concerned, because a linear scale has the important advantage of corresponding more closely to thermal expansion, which is a useful trait when dealing with materials over a not-too-large temperature range.
 
So those make even more sense in a strict mathematical sense; we just use base-10 instead of the more sensible base-12 (or even base-8) because of the arbitrary evolutionary coincidence that we happen to have ten fingers total.

You can do base-12 on your hands too. You just count finger bones using your thumb. Four fingers with 3 phalanges each mean you can count to 12 on one hand. And if you use the phalanges on your other hand to count 12s (which makes sense since base is the same as the number you can reach on one hand), then you can count all the way up to 156 on your fingers (12 on one hand and 12 squared on the other).
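The scheme in words above, as a toy function (the name and interface are mine):

```python
def phalange_count(n):
    """Split n into (twelves held on one hand, units on the other), units 1..12."""
    if not 1 <= n <= 156:
        raise ValueError("out of finger range")
    twelves, units = divmod(n - 1, 12)
    return twelves, units + 1

print(phalange_count(37))   # (3, 1): three full dozens plus one bone
print(phalange_count(156))  # (12, 12): both hands maxed out
```

The off-by-one shuffle is because each hand shows 1-12 rather than 0-11, which is what pushes the maximum to 12 + 12 × 12 = 156 instead of 143.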

If metric was base-12 I'd be on it like a fly on shit, but it isn't and probably never will be so switching over is pretty pointless.
 
I suppose the disconnect for me with regard to people claiming Fahrenheit is more suitable for "daily use" is that I honestly can't quite wrap my head around the notion that the weather is the only (or even the most common) use of temperature scales for "daily use".

Instead, the temperature scale use I think about most often (as opposed to "use this number to set the air-conditioning temperature", which does not require thought or calculation) is in cooking and food preparation. At which point the boiling point of water is very useful, since I love to make tea.

I keep hearing that Fahrenheit is especially useful in European climates, but does that apply to places outside of Europe, where most of the world's population lives?
 
Instead, the temperature scale use I think about most often (as opposed to "use this number to set the air-conditioning temperature", which does not require thought or calculation) is in cooking and food preparation. At which point the boiling point of water is very useful, since I love to make tea.

Do you often need to use a thermometer for that? Most of the time you just wait for steam.


I keep hearing that Fahrenheit is especially useful in European climates, but does that apply to places outside of Europe, where most of the world's population lives?

Let's just say the idea that you rarely have to reach for negative Fahrenheit is very much not true in the Midwest.
 
Giga whats per nanoJoule?

Upper-case "B" is bytes, especially of computer memory
(Lower-case "b" is bar of pressure or less commonly barns; somewhat ironically in this context, both of those are metric units that are not part of the formal SI system)

I don't know if this is the joke,
but bytes/Joule is a real thing -- it's used to express efficacy of wireless networks like Bluetooth.

And I really don't know if this is the joke here,
But I am mildly familiar with this thing called Landauer's Principle, which is basically the theoretical maximum efficiency of a computer at a particular temperature I guess? And GB/nJ actually.... plugs into that. I am using this right and definitely not misunderstanding it in any way.

Limit per byte = ( k * T * ln(2) )

Multiplying out, units in {brackets} for keeping track --

Limit >= 40 G {B} / n {J} * ( 1.38E-23 {J/K} * T {K} * 0.69315 {1/B} )
1/T {1/K} >= 40 * 1E9 / 1E-9 * 1.38E-23 * 0.69315 {1/K} = 3.83E-4 {1/K}
T {K} >= 2613.6 {K}

So it's at least 2,340 degrees Celsius outside wherever @Baughn is.
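If anyone wants to replay that arithmetic, here it is as a sketch, reproducing the post's numbers exactly as written (ln 2 applied per byte and all; see the correction downthread for the 8-bits-per-byte version):

```python
import math

K_B = 1.38e-23       # J/K, as rounded in the post
BETA = 40e9 / 1e-9   # "40 GB/nJ" read as bytes per joule

inv_T = BETA * K_B * math.log(2)  # the post's 3.83E-4 {1/K}
T = 1.0 / inv_T
print(f"T >= {T:.1f} K = {T - 273.15:.1f} C")  # ~2613.6 K, ~2340.4 C
```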

This is perfectly correct and not wrong-headed at all
 
And I really don't know if this is the joke here,
But I am mildly familiar with this thing called Landauer's Principle, which is basically the theoretical maximum efficiency of a computer at a particular temperature I guess? And GB/nJ actually...
...
So it's at least 2,340 degrees Celsius outside wherever @Baughn is.
It's not a joke. It's basically equivalent to coldness β = 1/(kT), except using units of GB/nJ is more amusing, and it also makes typical terrestrial weather temperatures have decently sized numbers.

Gibbs (or von Neumann) entropy and information entropy are in direct correspondence. Since 1/T = ∂S/∂E with some work parameters held constant, inverse temperature essentially has units of information per energy, up to a Boltzmann constant. Thus assuming a convention of 8 bits per byte, which is common but not universal, and converting bit = log(2) nat, that's 222E18 nats/Joule, which using Boltzmann's constant gives 53℃. That's still pretty hot, though.
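Spelling that conversion out as a sketch (8 bits per byte assumed, per the convention above):

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
BITS_PER_BYTE = 8         # common but not universal

beta_bytes = 40e9 / 1e-9                              # 40 GB/nJ in bytes per joule
beta_nats = beta_bytes * BITS_PER_BYTE * math.log(2)  # ~222E18 nats/J
T = 1.0 / (K_B * beta_nats)
print(f"T = {T:.1f} K = {T - 273.15:.1f} C")          # ~53 C: still pretty hot
```

The factor of 8 × ln 2 between bytes and nats is exactly what drops the earlier joke answer of ~2600 K down to a merely unpleasant summer day.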
 
I rounded down, of course.

There are worse things than centigrades. Don't make us force you to use them. :V
 
At which point the boiling point of water is very useful, since I love to make tea.

The act of boiling water is useful in everyday life, yes. The numerical value at which that happens is not. You don't boil water by throwing it in the oven and setting it to 100/212. You boil water by putting it on a hot thing until it boils, at which point the water's temperature stays where it is because a boiling liquid self-regulates.
 
The act of boiling water is useful in everyday life, yes. The numerical value at which that happens is not. You don't boil water by throwing it in the oven and setting it to 100/212. You boil water by putting it on a hot thing until it boils, at which point the water's temperature stays where it is because a boiling liquid self-regulates.

For making tea, I don't want boiling water, though. I want water that is just below boiling, so about 90-95 degrees Celsius.
 
So B actually is referring to bytes?
Yes. There's a relation between temperature (in its entropic formulation), and information. To summarize it very briefly... Imagine a system that's at absolute zero. Could be anything. A perfect iron crystal, maybe.

It's at absolute zero, and so none of its atoms are moving in the slightest[1]. The positions of the atoms are also known; iron has a well-known lattice structure, and we're positing that this particular piece is monocrystalline.

If you heat it up, then the atoms will start moving. Since the atoms are moving chaotically, their positions and momentums both become uncertain.[2]

Uncertainty, like all forms of information, is measured in bits: The number of bits of information you do not know. This is also called entropy, and applies to all forms of uncertainty. Anything we already know about the crystal doesn't count.[3]

So, to revise:

- State at 0K: Completely certain. 0 bits of entropy.

- State at 1K: Far, far less certain. There is an enormous amount of uncertainty about the internal state of the crystal, which can be measured (estimated) in bits.

- State at 2K: Less certain yet! But this is adding less than 2x the bits. And so it goes.

Thermodynamic beta, measured as bits/joule, basically measures how much more random the microstate of a system becomes when you add some amount of energy. There are lots of atoms in a gram of water, so although adding 4.186 J will only increase its temperature by 1 C, it implies adding a really, humongously, ludicrously huge number of bits.
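To put a rough number on "ludicrously huge" (a sketch of my own, using dS = dE/T for a small energy change at room temperature):

```python
import math

K_B = 1.380649e-23   # J/K
E = 4.186            # J: warms a gram of water by about 1 C
T = 293.15           # K: room-temperature water

delta_S_nats = E / (K_B * T)           # entropy change in units of k
delta_S_bits = delta_S_nats / math.log(2)
print(f"~{delta_S_bits:.2e} bits of extra microstate uncertainty")
```

That comes out around 10²¹ bits: a zettabyte-scale jump in uncertainty for one calorie of heat.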

So it's a derivative, but a monotonically decreasing one[5]. This makes it straightforward to calculate one from the other: The hotter it gets, the less uncertainty you 'gain' from adding a given amount of energy.

Why GB/nJ, then?

The factor of 8*10^18 just happens to make normal everyday temperatures into reasonably small numbers. It's a unitless, arbitrary, yet amusing scaling factor.

= = =

Still with me? Here are some footnotes, because I'm sure you aren't screaming at the walls yet.

1: Careful readers will note that this violates the uncertainty principle; specifically, the product of the uncertainty of momentum and position must be over a small, nonzero constant. There are many other such relations. Indeed, this implies that absolute zero is unreachable, which is to say that the equations sketched above are an approximation that's only valid at (very slightly) higher temperatures.

2: That's "uncertainty" in the classic statistical sense, not any form of quantum mechanics. Quantum mechanics certainly gets involved when you want the exact equations, but nothing of what I've described depends on the universe following it. This formulation of temperature would be just as valid in a purely classical world.

(Their positions also become 'uncertain' in the QM sense, though 'uncertainty' is a poor way to describe a smeared-out but still coherent wavefunction, and I prefer not to use that word. I only mention it because not everyone is so principled.)

3: Does that mean that if we knew the positions and velocities of every atom in a boiling glass of water, then the water is at absolute zero?

In a spherical-cow-in-a-vacuum sense, yes. Having that information would, in principle, allow you to turn the water into a block of ice while emitting no heat and extracting electricity equivalent to the heat energy that was in it. You'd have to use a mechanism similar to Maxwell's demon, which in this case would actually work.

In reality? Obviously that's completely impossible. But so is learning the complete microstate of a glass of boiling water, so it balances out.

An example from computer science might make this more obvious. If you're trying to transfer a file, then the informational entropy tells you the absolute best possible compression you can achieve.

For a file that you know is all-zero, the entropy is zero[4]. This is true even if it's several gigabytes on disk, and you can usually get a decent estimate of entropy by running files through a good compression algorithm. (This is an incredibly useful trick!)

But what about a file that's full entirely of random numbers?

Well, you can't compress it at all! So the entropy is equal to the size of the file... unless it turns out that the "random" numbers are from a pseudo-random number generator, and you know the seed. In that case the actual entropy of a terabyte-sized file might only be a few bytes!

You won't be able to tell without knowing the trick, though.

This situation is roughly equivalent to the glass of water, except without the numbers changing while you're trying to read it. (Water atoms like to move! That's why it wasn't already ice...)
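The compression trick is easy to play with; here's a sketch using Python's zlib (a serious entropy estimate would want a stronger compressor, but the contrast is already stark):

```python
import os
import zlib

all_zero = bytes(1_000_000)           # a megabyte of zeroes: almost no entropy
random_bytes = os.urandom(1_000_000)  # a megabyte of OS randomness: incompressible

print(len(zlib.compress(all_zero, 9)))      # roughly a kilobyte
print(len(zlib.compress(random_bytes, 9)))  # slightly *larger* than the input
```

The random megabyte actually grows slightly under compression, because an incompressible stream still pays the container's framing overhead.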

4: Actually it's log2(length of file). You still have to mention how many zeroes. Unless there's a pattern in that...

5: For completeness' sake: There are some rare systems (you won't find 'em outside a lab) which have an upper bound on how much energy you can add. (Without breaking the lab, anyway.) This means you'd eventually max it out, at which point entropy should be zero, right?

Yeah, and it is. Check the Wikipedia article on negative temperature if you're really curious, but just like you'd expect, thermodynamic beta eventually flips sign for such a system. Even so, it's still monotonically decreasing! It just, y'know, drops down below zero.

The precise point at which that happens corresponds to "infinite temperature", and if the system gets hotter than that then it would have a "negative temperature". Sounds weird? Not when put in terms of coldness, i.e. beta; there are no infinities there, it just crosses zero.

But regular temperature is the reciprocal. Therefore...
 
Measuring entropy in nats is more natural than in bits, and indeed that's what standard physical formulae implicitly do. Though other than the conversion 1 bit = log(2) nats, it's not particularly important.

Temperature can be thought of as being about sorting systems by their propensity to exchange thermal energy, so that if two systems are put in thermal contact and there is no net energy exchange between them, we say that they are at the same temperature. Thus, if you hypothetically know the exact microstate of a gas whose particles are very energetically bouncing around in a container, then it has zero entropy, but it does not follow that it is at absolute zero temperature. If that were the case, then on putting it into thermal contact with another gas at near absolute zero, there would be a net flow of thermal energy to the first from the second, whereas the exact opposite would actually happen.

Generally, at absolute zero, the system would be in the ground state of lowest energy, and would still satisfy the quantum uncertainty relations. Thus, those relations aren't the reason for the difficulty of reaching absolute zero. (Also, quantum uncertainty does not contribute to entropy at all, because pure quantum states have maximal information, i.e. they are not missing any information, which is what entropy counts.) Moreover, if the ground state is degenerate, i.e. there are multiple states with the same lowest energy, then even at absolute zero there would be positive residual entropy, as the condition of lowest energy would not fully determine which microstate the system is in.
 
All true, though bits is more intuitive if you happen to be a computer scientist who dabbles in networking. (Shannon.)

I guess the takeaway is, we should be putting disk labels on our thermometers. It's as hot as an Intel SSD today.
 