Posted by jfantl 4/14/2025

What Is Entropy? (jasonfantl.com)
288 points | 113 comments
tsimionescu 4/15/2025|
This goes through all definitions of entropy, except the very first one, which is also the one that is in fact measurable and objective: the variation in entropy is the heat that the system exchanges with the environment during a reversible process, divided by the temperature at which the exchange takes place. While tedious, this can be measured, and it doesn't depend on any subjective knowledge about the system. Any two observers will agree on this value, even if one knows all of the details of every single microstate.
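For reference, that is the Clausius definition; written out in standard notation (my paraphrase into a formula, not a quote from the article):

    \Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}
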
anon84873628 4/14/2025||
Nitpick in the article conclusion:

>Heat flows from hot to cold because the number of ways in which the system can be non-uniform in temperature is much lower than the number of ways it can be uniform in temperature ...

Should probably say "thermal energy" instead of "temperature" if we want to be really precise with our thermodynamics terms. Temperature is not a direct measure of energy; rather, it is an extensive property describing the relationship between change in energy and change in entropy.
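The relationship being referred to is usually written as (standard notation, with volume and particle number held fixed; added here for concreteness):

    \frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N}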

johan_felisaz 4/14/2025||
Nitpick of the nitpick... Temperature is actually an intensive quantity, i.e. combining two subsystems with the same temperature yields a bigger system at the same temperature, not one at twice the temperature.
timewizard 4/15/2025|||
This is why the "thermodynamic beta" is really useful.

https://en.wikipedia.org/wiki/Thermodynamic_beta
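For context, the linked beta packages that same derivative directly (standard definitions, with \Omega the number of microstates and k_B Boltzmann's constant):

    \beta = \frac{1}{k_B T} = \frac{\partial \ln \Omega}{\partial E}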

anon84873628 4/15/2025|||
D'oh! Thanks for the correction
kgwgk 4/15/2025||
I think you used “extensive” in the sense of “defined for the whole system and not locally”. It’s true that thermodynamics is about systems at equilibrium.
anon84873628 4/15/2025||
I meant to say "intensive" in the physics sense but just brain farted while typing.
kgwgk 4/15/2025||
Ah, then I don’t see what’s wrong with “the number of ways in which the system can be non-uniform in temperature is much lower than the number of ways it can be uniform in temperature”. In equilibrium one doesn’t have a gradient of temperature because “…” indeed.
anon84873628 4/15/2025||
If you take "temperature" to mean "average kinetic energy of molecules" then it's fine. But that's sort of the same class of simplification as saying "entropy is the amount of disorder".
kgwgk 4/15/2025||
I don't follow you. Whatever you take temperature to mean, for an isolated system in equilibrium that intensive thermodynamic property will have the same value everywhere and the entropy of the system will thus be maximized given the constraints.

If you put two subsystems at different temperatures in thermal contact the combined system will be in equilibrium only when the cold one warms up and the hot one cools down. The increase in the entropy of the first is larger than the decrease in the entropy of the second (because ΔQ/T1 > ΔQ/T2 when T1<T2) and the total entropy increases.

No kinetic energies of molecules are involved in that phenomenological description of heat flowing from hot to cold.
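A minimal numeric sketch of that bookkeeping (the temperatures and the amount of heat are arbitrary illustrative values, not from the comment):

    # Entropy change when a small amount of heat dQ flows from a hot
    # reservoir at T_hot to a cold reservoir at T_cold.
    T_hot, T_cold = 400.0, 300.0   # kelvin (illustrative values)
    dQ = 1.0                       # joules of heat flowing from hot to cold
    dS_hot = -dQ / T_hot           # entropy lost by the hot reservoir
    dS_cold = dQ / T_cold          # entropy gained by the cold reservoir
    print(dS_hot + dS_cold)        # positive: total entropy increases

The sum is positive for any T_hot > T_cold, which is the phenomenological statement that heat flows from hot to cold.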

brummm 4/14/2025||
I love that the author clearly describes why saying entropy measures disorder is misleading.
bargava 4/14/2025||
Here is a good overview of entropy [1]

[1] https://arxiv.org/abs/2409.09232

perihelions 4/14/2025|
Here's the HN thread about that overview on Entropy,

https://news.ycombinator.com/item?id=41037981 ("What Is Entropy? (johncarlosbaez.wordpress.com)" — 209 comments)

marojejian 4/15/2025||
This is the best description of entropy and information I've read: https://arxiv.org/abs/1601.06176

Most of all, it highlights the subjective / relative foundations of these concepts.

Entropy and Information only exist relative to a decision about the set of states an observer cares to distinguish.

It also caused me to change my informal definition of entropy from a negative ("disorder") to a more positive one ("the number of things I might care to know")

The Second Law now tells me that the number of interesting things I don't know about is always increasing!
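A small sketch of that observer-relativity, using a made-up die example rather than anything from the paper: the same system has different entropies depending on which outcomes the observer distinguishes.

    import math

    def entropy_bits(probs):
        # Shannon entropy, in bits, of a probability distribution
        return sum(p * math.log2(1 / p) for p in probs if p > 0)

    # The same fair six-sided die, seen by two observers:
    every_face = [1 / 6] * 6          # distinguishes all six outcomes
    parity_only = [1 / 2, 1 / 2]      # only distinguishes odd from even
    print(entropy_bits(every_face))   # ~2.58 bits
    print(entropy_bits(parity_only))  # 1.0 bit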

This thread inspired me to post it here: https://news.ycombinator.com/item?id=43695358

dswilkerson 4/15/2025||
Entropy is expected information. That is, given a random variable, if you compute the expected value (the sum of the values weighted by their probability) of the information of an event (the log base 2 of the multiplicative inverse of the probability of the event), you get the formula for entropy.
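A short sketch of that computation in code (the distribution is made up for illustration):

    import math

    def information_bits(p):
        # information of an event: log base 2 of the multiplicative
        # inverse of its probability
        return math.log2(1 / p)

    def entropy_bits(probs):
        # expected information: each event's information weighted by its probability
        return sum(p * information_bits(p) for p in probs if p > 0)

    print(entropy_bits([0.5, 0.25, 0.25]))  # 1.5 bits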

Here it is explained at length: "An Intuitive Explanation of the Information Entropy of a Random Variable, Or: How to Play Twenty Questions": http://danielwilkerson.com/entropy.html

bowsamic 4/15/2025||
I didn’t read in depth, but it seems to me on first glance (please correct me if I’m wrong) that, as with all articles on entropy, this explains everything except the classical thermodynamic quantity called entropy, which is 1. the quantity to which all these others are chosen to be related, and 2. the one that is by far the most difficult to explain intuitively

Information and statistical explanations of entropy are very easy. The real question is, what does entropy mean in the original context that it was introduced in, before those later explanations?

im3w1l 4/15/2025||
So here is an amusing thought experiment I thought of at one point.

Imagine a very high resolution screen. Say a billion by a billion pixels. Each of them can be white, gray or black. What is the lowest entropy possible? Each of the pixels has the same color. How does the screen look? Gray. What is the highest entropy possible? Each pixel has a random color. How does it look from a distance? Gray again.

What does this mean? I have no idea. Maybe nothing.
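A rough sketch of the "looks gray from a distance" point, with a much smaller pixel count so it actually runs (my own illustration, not from the comment):

    import random

    n = 1_000_000                # stand-in for the billion-by-a-billion screen
    shades = [0.0, 0.5, 1.0]     # black, gray, white

    uniform_screen = [0.5] * n                                  # the all-gray, lowest-entropy case
    random_screen = [random.choice(shades) for _ in range(n)]   # the highest-entropy case

    # "Looking from a distance" treated as averaging over many pixels:
    print(sum(uniform_screen) / n)   # exactly 0.5: gray
    print(sum(random_screen) / n)    # about 0.5: also gray

The two screens differ enormously in how many microstates are consistent with their description, but the coarse-grained average can't tell them apart.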

Also sorry for writing two top level comments, but I just really care about this topic

flanked-evergl 4/15/2025||
Not sure what the point of this article is; it seems to focus on confusion which could be cleared up with a simple visit to Wikipedia.

> But I have no idea what entropy is, and from what I find, neither do most other people.

The article does not go on to explain what entropy is; it just tries to explain away some hypothetical claims about entropy which, as far as we can tell, do hold, and it does not explain why, if they were wrong, they do in fact hold.

im3w1l 4/15/2025|
As a kid I wanted to invent a perpetuum mobile. From that perspective, entropy is that troublesome property that prevents a perpetuum mobile of the second kind. And any fuzziness or ambiguity in its definition is a glimmer of hope that we may yet find a loophole.