
Posted by jfantl 4/14/2025

What Is Entropy? (jasonfantl.com)
288 points | 113 comments
jwilber 4/14/2025|
There’s an interactive visual of entropy here, in the “Where To Partition” section (midway through the article): https://mlu-explain.github.io/decision-tree/
jwarden 4/15/2025||
Here's my own approach to explaining entropy as a measure of uncertainty: https://jonathanwarden.com/entropy-as-uncertainty
FilosofumRex 4/15/2025||
Boltzmann and Gibbs turn in their graves every time some information theorist mutilates their beloved entropy. Shanon & Von Neumann were hacking a new theory of communication, not doing real physics, and they never meant to equate thermodynamic concepts with encoding techniques - but alas, dissertations are now written on it.

Entropy can't be a measure of uncertainty, because all the uncertainty is already in the probability distribution p(x) - multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it would violate quantum physics principles, including the Bell inequalities and the Heisenberg uncertainty principle.

The article never mentions the simplest and most basic definition of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.

“Every physicist knows what entropy is. Not one can write it down in words.” - Clifford Truesdell

kgwgk 4/15/2025||
> Entropy can't be a measure of uncertainty

Gibbs’ entropy is derived from “the probability that an unspecified system of the ensemble (i.e. one of which we only know that it belongs to the ensemble) will lie within the given limits” in phase space. That’s the “coefficient of probability” of the phase; its logarithm is the “index of probability” of the phase; and the average of that is the entropy.

Of course the probability distribution corresponds to the uncertainty. That’s why the entropy is defined from the probability distribution.

Your claim sounds like saying that the area of a polygon cannot be a measure of its extension because the extension is given by the shape and calculating the area doesn’t tell us anything new.
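
For concreteness, a minimal sketch of that “average of the index of probability” for a discrete distribution (plain Python; the example distributions below are made up):

    import math

    def entropy(p):
        """Entropy as the expected 'index of probability': E[-log p(x)]."""
        return -sum(px * math.log(px) for px in p if px > 0)

    # A spread-out distribution carries more uncertainty than a peaked one.
    print(entropy([0.25, 0.25, 0.25, 0.25]))  # ~1.386 nats (= ln 4)
    print(entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.168 nats

Same p(x) in, a single number out, which is exactly the sense in which entropy summarizes the uncertainty already sitting in p(x).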

kgwgk 4/15/2025||
> Shanon & Von Neumann were hacking a new theory of communication, not doing real physics

Maybe I’m misunderstanding the reference to von Neumann but his work on entropy was about physics, not about communication.

nanna 4/15/2025|||
I think the parent has confused Von Neumann with Wiener. They've also misspelled Shannon.
FilosofumRex 4/15/2025|||
More precisely, Von Neumann was extending Shannon's information theoretic entropy to quantum channels, which he restated as S(p) = -Tr(p ln p) - again showing that information theoretic entropy reveals nothing more about a system than its density matrix p (the quantum analogue of a probability distribution).
kgwgk 4/16/2025||
It’s quite remarkable that in his 1927 paper “The thermodynamics of quantum-mechanical ensembles” von Neumann was extending the mathematical theory of communication that Shannon - who was 11 at the time - would only publish decades later.
FilosofumRex 4/17/2025||
Dude, read your own reference... There is no mention of information or communication theory anywhere in his 1927 paper or 1932 book. Young Von Neumann was doing real physics, extending and updating Gibbs' entropy.

OTOH, old Von Neumann was wealthy, hobnobbing with politicians and glitterati, musing about life, biology, economics, and anything else that would amuse his social circles. "Entropy", as he's alleged to have told Shannon, was his ace in the hole for winning arguments.

Formal similarity with Shannon's entropy is superfluous and conveys no new information about any system, quantum or otherwise. But it does make for lots of PhD dissertations, for exactly the same reason Von Neumann stated.

kgwgk 4/17/2025||
> There is no mention of information or communication theory anywhere in his 1927 paper or 1932 book. Young Von Neumann was doing real physics, extending and updating Gibbs' entropy.

We agree then! John von Neumann’s work on entropy was about physics, not about communication theory. S(p) = -Tr(p ln p) is physics. If you still claim that he “was extending Shannon's information theoretic entropy to quantum channels” at some point, could you maybe give a reference?

> Formal similarity with Shannon's entropy is superfluous and conveys no new information about any system, quantum or otherwise

What I still don’t understand is your fixation with that.

“Entropy can't be a measure of uncertainty, because all the uncertainty is in the probability distribution p(x)” makes zero sense given that the entropy is a property of the probability distribution. (Any measure of “all the uncertainty” which is “in the probability distribution p(x)” will be a property of p(x). The entropy checks that box, so why can’t it be a measure of uncertainty?)

It is a measure of the uncertainty in the probability distribution that describes a physical system in statistical mechanics. It is a measure of the lack of knowledge about the system. For a quantum system, von Neumann’s entropy becomes zero when the density matrix corresponds to a pure state and there is nothing left to know.
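
A quick numerical sketch of that last point, assuming numpy and two illustrative 2x2 density matrices (none of this is from the article):

    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]             # drop numerical zeros
        return float(-np.sum(evals * np.log(evals)))

    pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|, a pure state
    mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit

    print(von_neumann_entropy(pure))    # ~0.0    -- nothing left to know
    print(von_neumann_entropy(mixed))   # ~0.693  -- ln 2, maximal for a qubit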

Ono-Sendai 4/15/2025||
Anyone else notice how the entropy in the 1000 bouncing balls simulation goes down at some point, thereby violating the second law of thermodynamics? :)
thowawatp302 4/15/2025|
Over long enough scales there is no conservation of energy, because the universe does not have time-translation symmetry.
kgwgk 4/15/2025||
Cosmologists are not serious people.

https://math.ucr.edu/home/baez/physics/Relativity/GR/energy_...

“Is Energy Conserved in General Relativity?”

“In special cases, yes. In general, it depends on what you mean by "energy", and what you mean by "conserved".”

gozzoo 4/14/2025||
The visualisation is great, and the topic is interesting and very well explained. Can somebody recommend some other blogs with a similar style of presentation?
floxy 4/14/2025|
If you haven't seen it, you'll probably like:

https://ciechanow.ski/archives/

fedeb95 4/15/2025||
Given all the comments, it turns out that a post on entropy has high entropy.
vitus 4/14/2025||
The problem with this explanation (and with many others) is that it misses why we should care about "disorder" or "uncertainty", whether in information theory or statistical mechanics. Yes, we have the arrow-of-time argument (second law of thermodynamics, etc.), and entropy breaks time-symmetry. So what?

The article hints very briefly at this with the discussion of an unequally-weighted die, and how by encoding the most common outcome with a single bit, you can achieve some amount of compression. That's a start, and we've now rediscovered the idea behind Huffman coding. What information theory tells us is that if you consider a sequence of two dice rolls, you can then use even fewer bits on average to describe that outcome, and so on; as you take your block length to infinity, your average number of bits for each roll in the sequence approaches the entropy of the source. (This is Shannon's source coding theorem, and while entropy plays a far greater role in information theory, this is at least a starting point.)
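
As a toy illustration (a rough sketch, not from the article; the die probabilities below are made up), one can compute the entropy of a biased four-sided die and check Shannon's guarantee that an optimal Huffman code over blocks of n rolls spends at most 1/n extra bit per roll above that entropy:

    import heapq, itertools, math

    def entropy_bits(p):
        """Shannon entropy in bits per symbol."""
        return -sum(px * math.log2(px) for px in p.values() if px > 0)

    def huffman_avg_length(p):
        """Average codeword length (in bits) of an optimal Huffman code for p."""
        # Heap entries: (probability, tie-break, {symbol: code length so far}).
        heap = [(px, i, {s: 0}) for i, (s, px) in enumerate(p.items())]
        heapq.heapify(heap)
        counter = itertools.count(len(heap))
        while len(heap) > 1:
            p1, _, d1 = heapq.heappop(heap)
            p2, _, d2 = heapq.heappop(heap)
            merged = {s: l + 1 for s, l in {**d1, **d2}.items()}  # one level deeper
            heapq.heappush(heap, (p1 + p2, next(counter), merged))
        lengths = heap[0][2]
        return sum(p[s] * lengths[s] for s in p)

    die = {'a': 0.7, 'b': 0.1, 'c': 0.1, 'd': 0.1}   # a biased four-sided die
    H = entropy_bits(die)
    print(H)                                         # ~1.357 bits/roll, the floor

    for n in (1, 2, 3):                              # encode blocks of n rolls at a time
        block = {seq: math.prod(die[s] for s in seq)
                 for seq in itertools.product(die, repeat=n)}
        L = huffman_avg_length(block) / n
        print(n, L, L <= H + 1 / n)                  # per-roll cost sits within 1/n of H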

There's something magical about statistical mechanics where various quantities (e.g. energy, temperature, pressure) emerge as a result of taking partial derivatives of this "partition function", and that they turn out to be the same quantities that we've known all along (up to a scaling factor -- in my stat mech class, I recall using k_B * T for temperature, such that we brought everything back to units of energy).

https://en.wikipedia.org/wiki/Partition_function_(statistica...

https://en.wikipedia.org/wiki/Fundamental_thermodynamic_rela...
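
A crude numerical sketch of where one of those quantities comes from (a hedged example, not tied to the article: a made-up two-level system, numpy, and beta = 1/(k_B T) with k_B T set to 1): build Z, differentiate ln Z with respect to beta, and the average energy drops out.

    import numpy as np

    energies = np.array([0.0, 1.0])   # toy two-level system, energies in units of k_B*T

    def log_Z(beta):
        """ln Z(beta) = ln sum_i exp(-beta * E_i), the log partition function."""
        return np.log(np.sum(np.exp(-beta * energies)))

    beta, h = 1.0, 1e-6
    # <E> = -d(ln Z)/d(beta), approximated here with a central finite difference.
    avg_E = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)

    # Cross-check against the direct Boltzmann average sum_i E_i e^(-beta E_i) / Z.
    w = np.exp(-beta * energies)
    print(avg_E, np.sum(energies * w) / np.sum(w))   # both ~0.269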

If you're dealing with a sea of electrons, you might apply the Pauli exclusion principle to derive the Fermi-Dirac statistics that underpin all of semiconductor physics; if instead you're dealing with photons, which can occupy the same energy state, the same statistical principles lead to Bose-Einstein statistics.
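
For reference, the two occupancy laws differ only by a sign in the denominator. A minimal sketch (the chemical potential mu and the k_B*T value below are purely illustrative):

    import math

    def fermi_dirac(E, mu, kT):
        """Mean occupancy of a fermion state: never exceeds 1 (Pauli exclusion)."""
        return 1.0 / (math.exp((E - mu) / kT) + 1.0)

    def bose_einstein(E, mu, kT):
        """Mean occupancy of a boson state: can pile up well above 1 near E = mu."""
        return 1.0 / (math.exp((E - mu) / kT) - 1.0)

    mu, kT = 0.0, 0.025              # illustrative values, roughly room temperature in eV
    for E in (0.05, 0.10, 0.20):
        print(E, fermi_dirac(E, mu, kT), bose_einstein(E, mu, kT))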

Statistical mechanics is ultimately about taking certain assumptions about how particles interact with each other, scaling up the quantities beyond our ability to model all of the individual particles, and applying statistical approximations to consider the average behavior of the ensemble. The various forms of entropy are building blocks to that end.

alex5207 4/15/2025||
Super read! Thanks for sharing
sysrestartusr 4/15/2025||
At some point my take became: if nothing orders the stuff that lies and flies around, any emergent structures that follow the laws of nature eventually break down.

Organisms started putting things in places to increase their own "survivability" and thriving until their offspring were ready for the job, at which point the offspring additionally started putting things in place for the sake of the "survivability" and thriving of their ancestors (mostly overlooking their nagging and shortcomings, because "love", and because over time the lessons learned made everything better for all generations)...

So entropy is only relevant if all the organisms that can put some things in some place for some reason disappear, and the laws of nature run until new organisms emerge. (Which is why I'm always disappointed at leadership and all the fraudulent shit going on... more pointlessly dead organisms means fewer heads that can come up with ways to put things together in fun and useful ways. It's 2025, to whomever it applies: stop clinging to your sabotage-based wannabe supremacy and please stop corrupting the law, for fuck's sake, you rich fucking losers.)

nanna 4/15/2025|
Yet another take on entropy and information focused on Claude Shannon and lacking even a single mention of Norbert Wiener, even though they invented it simultaneously and evidence suggests Shannon learned the idea from Wiener.