A neat little corollary to all this comes from looking a little more closely at what temperature actually is. "Temperature" doesn't appear too often in the main explanation here, but it's all over the "student's explanation". So... what is it?
The most useful definition of temperature at the microscopic scale is probably this one:
1/T = dS / dU,
which I've simplified because math notation is hard, and because we're not going to need the full baggage here. (The whole thing with the curly-d's and the proper conditions imposed is around if you want it.) Okay, so what does that mean? (Let's not even think about where I dug it up from.)
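For the curious, the full-baggage version is short anyway; the curly-d's are partial derivatives, and the "proper conditions" amount to holding volume and particle number fixed while you wiggle the energy (in LaTeX):

    \frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,\,N}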
It's actually pretty simple: it says that the inverse of temperature is equal to the change in entropy per change in energy. That means temperature measures how much the entropy changes when we add or remove energy. And now we start to see why temperature is everywhere in these energy-entropy equations: it's the link between them! And we see why two things having the same temperature is so important: no entropy change results if energy flows between them. Or, in the language of the article, energy would not actually spread out any more if it were to flow between objects at the same temperature. So there's no flow!
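You can check the no-flow claim in one line: let a small parcel of energy dU leave object A and enter object B, and add up the entropy changes (a sketch in LaTeX):

    % Entropy change when dU flows from A to B:
    dS_{tot} = -\frac{dU}{T_A} + \frac{dU}{T_B}
             = \left(\frac{1}{T_B} - \frac{1}{T_A}\right)dU
    % Positive only when T_A > T_B; exactly zero when T_A = T_B: no flow.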
The whole 1/T bit, aside from being inconvenient to calculate with, also suggests a few opportunities to fuzz-test Nature. What happens at T=0, absolute zero? 1/T blows up, so dS/dU should blow up too. And indeed it does: near absolute zero, even a tiny bit of added energy causes a massive increase in entropy. So we're good. What about T -> infinity, so 1/T -> 0? Does additional energy then induce no more entropy? Well, that's real too: you see it in certain highly-constrained solid-state systems (probably among others), when certain bands fill. And you do indeed observe the weird behavior of "infinite temperature" when dS/dU is zero. Can you push further? Yes: dS/dU can go negative in those systems, making them "infinitely hot", so hot they overflow temperature itself and reach "negative temperature" (dS/dU < 0 implies absolute T < 0). Entropy actually decreases when you pump energy into these systems!
These sorts of systems usually involve population inversions (which might, correctly, make you think of lasers). For a 2-band system, the "absolute zero" state would have the lower band full and the upper band empty. Adding energy lifts some atoms to the upper band. When the upper and lower bands are equally full, that's maximum entropy: infinite temperature. Add a little more energy and the upper band is now fuller than the lower: this is the negative temperature regime. And, finally, when everything's in the upper band, that is the exact opposite of absolute zero: the system can absorb no more energy. Its temperature is maximal. What temperature is that? Well, if you look at how we got here and at our governing equation, we started at 0, went through normal temperatures +T, reached +infinity, crossed over to -infinity, went through negative temperatures -T, and finally reached... -0. Minus absolute zero!
(Suck on that, IEEE-754 signed zero critics?)
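If you'd rather fuzz-test this on a computer than on Nature, here's a minimal toy sketch in plain Python (the 100-atom two-level system and the finite-difference dS/dU are my own illustration, not anything from the article):

    from math import comb, log

    N = 100  # toy system: N two-level atoms, level gap = 1 energy unit

    # Entropy (in units of k_B) with n atoms in the upper level:
    # S = ln(number of microstates) = ln C(N, n)
    def S(n):
        return log(comb(N, n))

    # beta = 1/T = dS/dU, approximated by a central difference
    # (adding one excitation is dU = 1 in these units).
    for n in (1, 10, 49, 50, 51, 90, 99):
        beta = (S(n + 1) - S(n - 1)) / 2
        T = float("inf") if beta == 0 else 1 / beta
        print(f"n={n:3d}  beta={beta:+.4f}  T={T:+10.2f}")

    # beta is large and positive near the bottom (near absolute zero),
    # exactly 0 at half filling n = 50 (infinite temperature), and
    # negative past it (population inversion: negative temperature).

One run walks through the whole story above: positive T, then the dS/dU = 0 crossover, then the negative-temperature regime.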
And all that from our definition of temperature: how much entropy will we get by adding a little energy here?
Thermodynamics: it'll hurt your head even more than IEEE-754 debugging.
> Many people focus on the statistical definition of entropy and the fact that entropy increases for any spontaneous process. Fewer people are familiar with thinking about entropy as the conjugate thermodynamic variable to temperature. Just as volumes shift to equalize pressure, areas shift to equalize surface tension, and charges shift to equalize voltage, entropy is the "stuff" that shifts to equalize temperature. (Entropy is of course also unique in that it's generated in all four processes.) Entropy is thus in some ways the modern version of the debunked theory of caloric.
> Just as volumes shift to equalize pressure, areas shift to equalize surface tension, and charges shift to equalize voltage, entropy is the "stuff" that shifts to equalize temperature.
I remember watching videos of Leonard Susskind in which he talked about a similar phenomenon, where circuit complexity itself increases until it reaches a maximum. It behaves similarly to entropy.
Temperature stems from the observation that if you bring physical objects (and liquids, gases, etc.) into contact with each other, heat (i.e., molecular kinetic energy) can flow between them. You can order all objects such that:
- If Object A is ordered higher than Object B, heat will flow from A to B.
- If Object A is ordered the same as Object B, they are in thermal equilibrium: No heat flows between them.
Now, the position in such an order can be naturally quantified with a number, i.e., you can assign numbers to objects such that:
- If Object A is ordered higher than Object B, i.e., heat will flow from A to B, then the number assigned to A is higher than the number assigned to B.
- If Object A is ordered the same as Object B, i.e., they are in thermal equilibrium, then they will have the same number.
> Mind that all of this does not impose how we actually scale temperature.
> How we scale temperature comes from practical applications such as thermal expansion being linear with temperature on small scales.
An absolute scale for temperature is determined (up to proportionality) by the maximal efficiency of a heat engine operating between two reservoirs: e = 1 - T2/T1, where T1 is the hotter reservoir and T2 the colder.
This might seem like a practical application, but intellectually, it’s an important abstraction away from the properties of any particular system to a constraint on all possible physical systems. This was an important step on the historical path to a modern conception of entropy and the second law of thermodynamics [2].
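Spelled out (standard Carnot/Kelvin reasoning, nothing beyond what's in any thermo text): for a reversible engine, the ratio of heats exchanged with the two reservoirs depends only on the reservoirs, and that ratio is what defines absolute temperature:

    % Reversible engine between hot reservoir 1 and cold reservoir 2:
    \frac{Q_2}{Q_1} = \frac{T_2}{T_1}
    \qquad\Rightarrow\qquad
    e = 1 - \frac{Q_2}{Q_1} = 1 - \frac{T_2}{T_1}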
Yes, but this still allows infinitely many "temperature" scales: take the current definition of temperature and apply any strictly increasing function to it.
More intuitively: that TdS has the same "units" as -PdV suggests that temperature [difference] is a "pressure" (thermodynamic potential) that drives entropy increase.
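For reference, the differential being alluded to is the standard fundamental relation, extended with the surface and charge terms from the list upthread (γ for surface tension and φ for voltage are my choice of symbols):

    % Each intensive variable is paired with the extensive "stuff"
    % that shifts to equalize it:
    dU = T\,dS - P\,dV + \gamma\,dA + \phi\,dq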
It's also precisely what will show up if you use Lagrange multipliers to maximize entropy given a fixed energy. (though for that to make sense you're no longer looking at a single state, you're optimizing the probability distribution itself)
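Sketching that out (standard maximum-entropy statistical mechanics; α and β are the multipliers for normalization and energy):

    % Maximize S = -\sum_i p_i \ln p_i  subject to
    % \sum_i p_i = 1  and  \sum_i p_i E_i = U, via
    % L = S - \alpha (\sum_i p_i - 1) - \beta (\sum_i p_i E_i - U):
    \frac{\partial L}{\partial p_i} = -\ln p_i - 1 - \alpha - \beta E_i = 0
    \quad\Rightarrow\quad
    p_i = \frac{e^{-\beta E_i}}{Z}, \qquad \beta = \frac{1}{T}

The Boltzmann distribution falls out, and the energy constraint's multiplier β is exactly the inverse temperature from the 1/T = dS/dU definition above.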
It has the same “units” (if you mean “energy”) as mc^2 as well, and that doesn’t suggest anything to me… Your intuition is much better than mine - or it’s informed by what you know about temperature.
Sorry! I meant that they have the same form as used in the energy differential (1-form), but I had thought "units" would make more sense. In fact, this comparison was how I came to the intuition, although, as you coyly suggested, I did check it against my earlier intuitions.
I agree that thermodynamic relations - and Legendre transformations - are fascinating. I don’t think I ever fully understood them though - at least not to the point where they became “intuitive” :-)
Erm, sorry again to have implied they were intuitive; all I meant was that it was relatively intuitive (maybe I should have said "retrievable in a high-pressure concept-doodling game") compared to a wall of text.
I'm pretty sure I don't understand the possible meanings of what you said there either so let's try :)
<layman-ish op-research lingo>
I meant that the tangent to the convex conjugate ("momentum") provides bounds on what the values returned by the dual step in a primal-dual algo should be. I don't know which meaning of "exponential" I should focus on here (the action perhaps? A power set? A probability distribution?), but "implications" seem to refer to a constraint on outputs contingent on the inputs so I will go with that. Delimited continuations seem to be the closest thing I found in the PL lit, aka wikipedia, feel free to suggest something less kooky :)
That makes much more sense than my flash, which had been following a spark in the other direction:
Delimited continuations are functions, and as such (in a world of algebraic types where we can take sums and products of types) exponentials of types, ran^dom.
[in particular, with the substitution of isomorphism for equality they follow the normal K-12 rules: C^(A+B) ~= C^A * C^B, etc.]
I'd just been glancing at https://en.wikipedia.org/wiki/Convex_conjugate#Examples and the pattern by which a single-branched f(x) often becomes a multibranch f*(x) reminded me of how logic-reversing functions in general, and logical implication in particular, "add branching": if we wish to establish x' <= 5, then if x is already <= 5 we may `skip`, but otherwise we must calculate x - 5 (and then subtract it off); similarly, an implication x -> y may be satisfied on one branch by not-x, but on the other it requires y.
[and on the general topic: I like to think of temperature as tying together energy and entropy, where positive temperatures yield the familiar relationships but negative temperatures "unintuitive" ones]
This reverse flash might be what could motivate me to make the connection useful.. an exercise in geometric vengeance (and intuition building) for me to use backtracking DCs in optimization problems (engineering => SDP/IPs)? now to find a plug-and-chug example..
Besides negative T occurring in situations where the arrow of t appears reversed..., PG13 "exponentials turn sums into products"
There is also the pun where S stands for both "action" and "entropy" so that's another direction in which to hunt for the Lagrange multiplier/Lagrangian-Hamiltonian connecting unicorn e.g. picking the "most representative", not necessarily the most optimal path.
I don't know if this counts as an indecent flash, because.. well it is a half-formed opinion (malformed intuition?) born of recent experience..
It's hard to describe this personal experience succinctly; nevertheless, I can relate it to wizened physicists often marvelling at having terms miraculously cancel when they engage in the voudou popularly known as path integration.
Does the temperature actually change discontinuously in a physical system from -infty to +infty, or is it a theoretical artifact that does not show up experimentally?
Depending on what you mean by “discontinuously” it always does: the microscopic world is “discrete”.
Instead of thinking of “temperature” you may think of “inverse of temperature”, and then there is no issue with that number going continuously from very negative to very positive.
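One compact way to write that (matching the 2-band story upthread):

    % As energy is added, beta falls smoothly through zero, while
    % its reciprocal T appears to jump from +infinity to -infinity:
    \beta:\ +\infty \to 0^{+} \to 0^{-} \to -\infty
    \qquad
    T:\ 0^{+} \to +\infty \ \big|\ -\infty \to 0^{-}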