So B actually is referring to bytes?
Yes. There's a relation between temperature (in its entropic formulation) and information. To summarize it very briefly... Imagine a system that's at absolute zero. Could be anything. A perfect iron crystal, maybe.
It's at absolute zero, and so none of its atoms are moving in the slightest[1]. The positions of the atoms are also known; iron has a well-known lattice structure, and we're positing that this particular piece is monocrystalline.
If you heat it up, then the atoms will start moving. Since the atoms are moving chaotically, their positions and momenta both become uncertain.[2]
Uncertainty, like all forms of information, is measured in bits: the number of bits of information you do not know. This is also called entropy, and it applies to all forms of uncertainty. Anything we already know about the crystal doesn't count.[3]
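If code is more your thing, here's a tiny Python sketch (purely illustrative) of what "bits of uncertainty" means for a simple probability distribution:

    import math

    def entropy_bits(probs):
        # Shannon entropy: the expected number of bits you don't yet know
        return sum(-p * math.log2(p) for p in probs if p > 0)

    print(entropy_bits([1.0]))       # outcome already known: 0 bits
    print(entropy_bits([0.5, 0.5]))  # a fair coin flip: 1 bit
    print(entropy_bits([0.25] * 4))  # one of four equally likely states: 2 bits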
So, to revise:
- State at 0 K: Completely certain. 0 bits of entropy.
- State at 1 K: Far, far less certain. There is an enormous amount of uncertainty about the internal state of the crystal, which can be measured (estimated) in bits.
- State at 2 K: Even less certain! But a given amount of added energy now buys fewer new bits of uncertainty than it did at 1 K. And so it goes.
Thermodynamic beta, expressed in bits per joule, basically tells you how much more random the microstate of a system becomes when you add a given amount of energy. There are a lot of atoms in a gram of water, so although adding 4.186 J will certainly raise its temperature by 1 °C, it also means adding a really, humongously, ludicrously huge number of bits.
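Back-of-envelope, assuming the gram of water starts near room temperature (the exact count shifts a little with the starting point):

    import math

    k_B = 1.380649e-23        # Boltzmann constant, J/K
    c = 4.186                 # specific heat of water, J per gram per kelvin
    T1, T2 = 293.15, 294.15   # one gram heated from 20 C to 21 C (assumed)

    # entropy change: integral of dQ/T = c * ln(T2/T1), in J/K
    dS = c * math.log(T2 / T1)

    # convert to information: divide by k_B for nats, then by ln(2) for bits
    bits = dS / (k_B * math.log(2))
    print(f"{bits:.2e} bits")  # about 1.5e21 bits, i.e. roughly 200 exabytes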
So beta is a derivative (of entropy with respect to energy), but a monotonically decreasing one[5]: it keeps falling as the system heats up. That makes it straightforward to convert between beta and ordinary temperature: the hotter it gets, the less uncertainty you 'gain' from adding a given amount of energy.
Why GB/nJ, then?
The factor of 8*10^18 (8*10^9 bits per gigabyte, times 10^9 nanojoules per joule) just happens to make normal everyday temperatures into reasonably small numbers. It's a unitless, arbitrary, yet amusing scaling factor.
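Here's that conversion sketched in Python (the function names are mine; the constant is the standard one):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def beta_bits_per_joule(T_kelvin):
        # thermodynamic beta is 1/(k_B * T) in nats per joule; ln(2) turns nats into bits
        return 1.0 / (k_B * T_kelvin * math.log(2))

    def beta_GB_per_nJ(T_kelvin):
        # 1 GB = 8e9 bits and 1 nJ = 1e-9 J, hence the factor of 8e18
        return beta_bits_per_joule(T_kelvin) / 8e18

    def temperature_from_beta(GB_per_nJ):
        # the relation runs both ways: low beta means hot, high beta means cold
        return 1.0 / (k_B * math.log(2) * GB_per_nJ * 8e18)

    print(beta_GB_per_nJ(300.0))        # roughly 43 GB/nJ at room temperature
    print(temperature_from_beta(43.5))  # back to roughly 300 K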
= = =
Still with me? Here are some footnotes, because I'm sure you aren't screaming at the walls yet.
1: Careful readers will note that this violates the uncertainty principle; specifically, the product of the uncertainties in position and momentum must be at least a small, nonzero constant (ħ/2). There are many other such relations. Indeed, this implies that absolute zero is unreachable, which is to say that the equations sketched above are an approximation, one that's only valid at (very slightly) higher temperatures.
2: That's "uncertainty" in the classic statistical sense, not any form of quantum mechanics. Quantum mechanics certainly gets involved when you want the exact equations, but nothing of what I've described depends on the universe following it. This formulation of temperature would be just as valid in a purely classical world.
(Their positions also become 'uncertain' in the QM sense, though 'uncertainty' is a poor way to describe a smeared-out but still coherent wavefunction, and I prefer not to use that word. I only mention it because not everyone is so principled.)
3: Does that mean that if we knew the positions and velocities of every atom in a boiling glass of water, the water would be at absolute zero?
In a spherical-cow-in-a-vacuum sense, yes.
Having that information would, in principle, allow you to turn the water into a block of ice while emitting no heat and extracting electricity equivalent to the heat energy that was in it. You'd have to use a mechanism similar to Maxwell's demon, which in this case would actually work.
In reality? Obviously that's completely impossible. But so is learning the complete microstate of a glass of boiling water, so it balances out.
An example from computer science might make this more obvious. If you're trying to transfer a file, then the informational entropy tells you the absolute best possible compression you can achieve.
For a file that you know is all-zero, the entropy is zero[4]. This is true even if it's several gigabytes on disk, and you can usually get a decent estimate of entropy by running files through a good compression algorithm. (This is an incredibly useful trick!)
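For instance, with Python's built-in zlib (any decent compressor will do; keep in mind the compressed size is only an upper bound on the true entropy):

    import os
    import zlib

    all_zero = bytes(10_000_000)         # ten million zero bytes
    random_ish = os.urandom(10_000_000)  # bytes the OS promises are unpredictable

    for name, data in [("all zeros", all_zero), ("random", random_ish)]:
        packed = zlib.compress(data, 9)
        print(f"{name}: {len(data):,} bytes -> {len(packed):,} bytes compressed")
    # all zeros: shrinks to roughly 10 kB (near-zero entropy)
    # random:    doesn't shrink at all (entropy ~ the whole file)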
But what about a file that's entirely full of random numbers?
Well, you can't compress it at all! So the entropy is equal to the size of the file...
unless it turns out that the "random" numbers are from a pseudo-random number generator, and you know the seed. In that case the actual entropy of a terabyte-sized file might only be a few bytes!
You won't be able to tell without knowing the trick, though.
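A sketch of that situation (needs Python 3.9+ for randbytes; the seed is obviously made up):

    import random

    seed = 42  # the whole secret: a few bytes of real entropy
    fake_random = random.Random(seed).randbytes(1_000_000)

    # it looks incompressible, yet anyone holding the seed can
    # regenerate every single byte:
    assert random.Random(seed).randbytes(1_000_000) == fake_random

The compression trick from above will happily report about a megabyte of 'entropy' for fake_random; it has no way of seeing the seed.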
This situation is roughly equivalent to the glass of water, except without the numbers changing while you're trying to read it. (Water atoms like to move! That's why it wasn't already ice...)
4: Actually it's more like log2(length of the file): you still have to mention how many zeroes there are. Unless there's a pattern in that...
5: For completeness' sake: There are some rare systems (you won't find 'em outside a lab) which have an upper bound on how much energy you can add. (Without breaking the lab, anyway.) This means you'd eventually max it out, at which point entropy should be zero, right?
Yeah, and it is. Check the Wikipedia article on negative temperature if you're really curious, but just like you'd expect, thermodynamic beta eventually flips sign for such a system. Even so, it's still monotonically decreasing! It just, y'know, drops below zero.
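If you want to watch the sign flip without a lab, here's a toy model: N independent two-level 'spins', about the simplest system with an energy ceiling (the numbers are illustrative, not any real material):

    from math import comb, log2

    N = 100    # 100 two-level spins; energy maxes out when all are excited
    eps = 1.0  # energy per excited spin, arbitrary units

    def S_bits(n):
        # log2 of the number of microstates with exactly n excited spins
        return log2(comb(N, n))

    for n in (10, 49, 50, 90):
        beta = (S_bits(n + 1) - S_bits(n)) / eps  # discrete dS/dE
        print(n, round(beta, 3))
    # positive at low energy, crosses zero near n = N/2, negative above:
    # past the halfway point, adding energy makes the state *more* certain.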
The precise point at which that happens corresponds to "infinite temperature", and if the system gets any hotter than that, it has a "negative temperature". Sounds weird? Not when put in terms of coldness, i.e. beta; there are no infinities there, it just crosses zero.
But regular temperature is the reciprocal. Therefore...