Entropy can't be a measure of uncertainty, because all the uncertainty is already in the probability distribution p(x); multiplying it by its own logarithm and summing doesn't tell us anything new. If it did, it would violate quantum-mechanical principles, including the Bell inequalities and the Heisenberg uncertainty principle.
The article never mentions the simplest and most basic characterization of entropy, i.e. its units (kJ/K), nor the third law of thermodynamics, which is the basis for its measurement.
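For the record, the measurement works roughly like this (a standard-textbook sketch, not anything the article spells out): the third law pins S(0) = 0, so the absolute entropy comes from integrating measured heat capacities up from absolute zero,

$$ S(T) = \int_0^T \frac{C_p(T')}{T'}\, dT' + \sum_i \frac{\Delta H_i}{T_i}, $$

where the sum adds the latent-heat contribution ΔH_i / T_i of each phase transition at temperature T_i along the way.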
“Every physicist knows what entropy is. Not one can write it down in words.” Clifford Truesdell
Gibbs’ entropy is derived from “the probability that an unspecified system of the ensemble (i.e. one of which we only know that it belongs to the ensemble) will lie within the given limits” in phase space. That’s the “coefficient of probability” of the phase, its logarithm is the “index of probability” of the phase, and minus the average of that is the entropy.
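In modern notation (my gloss, not Gibbs’ own symbols): writing P for the coefficient of probability and η = ln P for the index, the entropy is minus the ensemble average of the index,

$$ S = -\langle \eta \rangle = -\int P \ln P \; dq\, dp, $$

which is just the continuous counterpart of $-\sum_x p(x) \ln p(x)$.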
Of course the probability distribution corresponds to the uncertainty. That’s why the entropy is defined from the probability distribution.
Your claim sounds like saying that the area of a polygon cannot be a measure of its extension because the extension is given by the shape and calculating the area doesn’t tell us anything new.
Maybe I’m misunderstanding the reference to von Neumann but his work on entropy was about physics, not about communication.
OTOH, old von Neumann was wealthy, hobnobbing with politicians and glitterati, musing about life, biology, econ and anything else that would amuse his social circles. "Entropy", as he's alleged to have told Shannon, was the ace up his sleeve for winning arguments.
Formal similarity with Shannon's entropy is superfluous and conveys no new information about any system, quantum or otherwise. But it does make for lots of PhD dissertations, for exactly the same reason von Neumann stated.
We agree then! John von Neumann’s work on entropy was about physics, not about communication theory. S(ρ) = -Tr(ρ ln ρ) is physics. If you still claim that he “was extending Shannon's information theoretic entropy to quantum channels”, could you maybe give a reference at some point?
> Formal similarity with Shannon's entropy is superfluous and conveys no new information about any system, quantum or otherwise
What I still don’t understand is your fixation with that.
“Entropy can't be a measure of uncertainty, because all the uncertainty is in the probability distribution p(x)” makes zero sense given that the entropy is a property of the probability distribution. (Any measure of “all the uncertainty” which is “in the probability distribution p(x)” will be a property of p(x). The entropy checks that box so why can’t it be a measure of uncertainty?)
It is a measure of the uncertainty in the probability distribution that describes a physical system in statistical mechanics. It is a measure of the lack of knowledge about the system. For a quantum system, von Neumann’s entropy becomes zero when the density matrix corresponds to a pure state and there is nothing left to know.
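A tiny numerical illustration of that last point (my own sketch using numpy; the density matrices are made-up examples):

    import numpy as np

    def von_neumann_entropy(rho):
        """S(rho) = -Tr(rho ln rho), computed via the eigenvalues of rho."""
        evals = np.linalg.eigvalsh(rho)   # rho is Hermitian
        evals = evals[evals > 1e-12]      # drop zeros: 0 ln 0 = 0 by convention
        return -np.sum(evals * np.log(evals))

    pure = np.array([[1.0, 0.0],
                     [0.0, 0.0]])         # pure state |0><0|
    print(von_neumann_entropy(pure))      # 0.0: nothing left to know

    mixed = np.eye(2) / 2                 # maximally mixed qubit
    print(von_neumann_entropy(mixed))     # ~0.693 = ln 2: maximal uncertainty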
https://math.ucr.edu/home/baez/physics/Relativity/GR/energy_...
“Is Energy Conserved in General Relativity?”
“In special cases, yes. In general, it depends on what you mean by "energy", and what you mean by "conserved".”
The article hints very briefly at this with the discussion of an unequally-weighted die, and how by encoding the most common outcome with a single bit, you can achieve some amount of compression. That's a start, and we've now rediscovered the idea behind Huffman coding. What information theory tells us is that if you consider a sequence of two dice rolls, you can then use even fewer bits on average to describe that outcome, and so on; as you take your block length to infinity, your average number of bits for each roll in the sequence approaches the entropy of the source. (This is Shannon's source coding theorem, and while entropy plays a far greater role in information theory, this is at least a starting point.)
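To see the source coding theorem in action, here's a small sketch (my own toy numbers, not the article's exact die) comparing the entropy to Huffman's average code length for single rolls and for pairs:

    import heapq
    import math
    from itertools import count, product

    def entropy_bits(probs):
        """Shannon entropy in bits: H = -sum p log2 p."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def huffman_avg_bits(probs):
        """Expected Huffman code length in bits per symbol. Each greedy
        merge adds one bit to every symbol beneath the merged node, so
        summing the merged probabilities gives the average length."""
        tie = count()  # tiebreaker so equal probabilities compare cleanly
        heap = [(p, next(tie)) for p in probs]
        heapq.heapify(heap)
        total = 0.0
        while len(heap) > 1:
            p1, _ = heapq.heappop(heap)
            p2, _ = heapq.heappop(heap)
            total += p1 + p2
            heapq.heappush(heap, (p1 + p2, next(tie)))
        return total

    # A made-up unequally-weighted die: one face comes up half the time.
    die = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]
    print(entropy_bits(die))            # ~2.16 bits/roll (the floor)
    print(huffman_avg_bits(die))        # 2.2 bits/roll, one roll at a time

    # Coding pairs of rolls does better per roll, approaching the entropy.
    pairs = [p * q for p, q in product(die, repeat=2)]
    print(huffman_avg_bits(pairs) / 2)  # closer to 2.16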
There's something magical about statistical mechanics where various quantities (e.g. energy, temperature, pressure) emerge as a result of taking partial derivatives of this "partition function", and they turn out to be the same quantities that we've known all along (up to a scaling factor -- in my stat mech class, I recall using k_B * T for temperature, such that we brought everything back to units of energy).
https://en.wikipedia.org/wiki/Partition_function_(statistica...
https://en.wikipedia.org/wiki/Fundamental_thermodynamic_rela...
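For anyone who hasn't seen that machinery, here it is in one place (standard canonical-ensemble formulas, with β = 1/(k_B T)):

$$ Z = \sum_i e^{-\beta E_i}, \qquad \langle E \rangle = -\frac{\partial \ln Z}{\partial \beta}, \qquad F = -k_B T \ln Z, \qquad S = -\frac{\partial F}{\partial T}\bigg|_V, \qquad P = -\frac{\partial F}{\partial V}\bigg|_T. $$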
If you're dealing with a sea of electrons, you might apply the Pauli exclusion principle to derive the Fermi-Dirac statistics that underpin all of semiconductor physics; if instead you're dealing with photons, which can occupy the same energy state, the same statistical principles lead to Bose-Einstein statistics.
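Concretely, the two occupancy functions look like this (a minimal sketch; the energies, chemical potential mu, and temperature below are arbitrary example values):

    import numpy as np

    K_B = 8.617e-5  # Boltzmann constant in eV/K

    def fermi_dirac(E, mu, T):
        """Mean occupancy of a fermion state: at most one particle per state."""
        return 1.0 / (np.exp((E - mu) / (K_B * T)) + 1.0)

    def bose_einstein(E, mu, T):
        """Mean occupancy of a boson state: any number of particles allowed.
        Requires E > mu, otherwise the occupancy diverges."""
        return 1.0 / (np.exp((E - mu) / (K_B * T)) - 1.0)

    # A state right at the Fermi level is half-filled, at any temperature.
    print(fermi_dirac(E=0.5, mu=0.5, T=300))    # 0.5

    # For photons (mu = 0) this reduces to the Planck occupancy.
    print(bose_einstein(E=0.1, mu=0.0, T=300))  # ~0.021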
Statistical mechanics is ultimately about taking certain assumptions about how particles interact with each other, scaling up the quantities beyond our ability to model all of the individual particles, and applying statistical approximations to consider the average behavior of the ensemble. The various forms of entropy are building blocks to that end.
organisms started putting things in places to increase the "survivability" and thriving of themselves, until the offspring was ready for the job, at which point the offspring additionally started putting things in place for the sake of the "survivability" and thriving of their ancestors (mostly overlooking their nagging and shortcomings, because "love", and because over time the lessons learned made everything better for all generations)...

so entropy is only relevant if all the organisms that can put some things in some place for some reason disappear, and the laws of nature run until new organisms emerge. (Which is why I'm always disappointed at leadership and all the fraudulent shit going on... more pointlessly dead organisms means fewer heads that can come up with ways to put things together in fun and useful ways... it's 2025; to whomever it applies: stop clinging to your sabotage-based wannabe supremacy, please stop corrupting the law, for fuck's sake, you rich fucking losers.)