>Heat flows from hot to cold because the number of ways in which the system can be non-uniform in temperature is much lower than the number of ways it can be uniform in temperature ...
Should probably say "thermal energy" instead of "temperature" if we want to be really precise with our thermodynamics terms. Temperature is not a direct measure of energy; rather, it is an intensive property describing the relationship between change in energy and change in entropy.
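For reference, the standard textbook definition makes that relationship precise (this formula is not from the comment above, just the usual thermodynamic one): at fixed volume and particle number,

    1/T = (∂S/∂U)_{V,N}

so temperature measures how much internal energy you must add per unit increase in entropy.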
If you put two subsystems at different temperatures in thermal contact, the combined system will be in equilibrium only after the cold one warms up and the hot one cools down. The entropy gained by the cold subsystem is larger than the entropy lost by the hot one (because ΔQ/T1 > ΔQ/T2 when T1 < T2), so the total entropy increases.
No kinetic energies of molecules are involved in that phenomenological description of heat flowing from hot to cold.
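A minimal numerical sketch of that entropy bookkeeping (the temperatures and heat amount are illustrative values of my own, not from the thread):

```python
# Entropy bookkeeping for a small heat transfer dQ from a hot body (T2)
# to a cold body (T1). Illustrative numbers; temperatures in kelvin, heat in joules.
T1, T2 = 280.0, 320.0   # cold and hot subsystem temperatures, T1 < T2
dQ = 1.0                # small amount of heat flowing from hot to cold

dS_cold = dQ / T1       # entropy gained by the cold subsystem
dS_hot = -dQ / T2       # entropy lost by the hot subsystem
dS_total = dS_cold + dS_hot

print(f"dS_cold  = {dS_cold:+.6f} J/K")
print(f"dS_hot   = {dS_hot:+.6f} J/K")
print(f"dS_total = {dS_total:+.6f} J/K")  # positive, as the second law requires
```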
https://news.ycombinator.com/item?id=41037981 ("What Is Entropy? (johncarlosbaez.wordpress.com)" — 209 comments)
Most of all, it highlights the subjective / relative foundations of these concepts.
Entropy and Information only exist relative to a decision about the set of states an observer cares to distinguish.
It also caused me to change my informal definition of entropy from a negative one ("disorder") to a more positive one ("the number of things I might care to know").
The Second Law now tells me that the number of interesting things I don't know about is always increasing!
This thread inspired me to post it here: https://news.ycombinator.com/item?id=43695358
Here it is explained at length: "An Intuitive Explanation of the Information Entropy of a Random Variable, Or: How to Play Twenty Questions": http://danielwilkerson.com/entropy.html
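The "number of yes/no questions" framing is easy to check numerically. A small sketch (mine, not code from the linked article) computing Shannon entropy in bits, i.e. the average number of optimally chosen yes/no questions needed to identify the outcome:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 16 states takes log2(16) = 4 questions:
print(shannon_entropy([1 / 16] * 16))              # 4.0
# A skewed distribution takes fewer questions on average:
print(shannon_entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75
```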
Imagine a very high resolution screen, say a billion by a billion pixels, where each pixel can be white, gray, or black. What is the lowest entropy possible? Every pixel has the same color - say, gray. How does the screen look? Gray. What is the highest entropy possible? Every pixel has a random color. How does it look from a distance? Gray again.
What does this mean? I have no idea. Maybe nothing.
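One way to see why both extremes "look gray" is to coarse-grain, i.e. average over blocks of pixels, which is roughly what viewing from a distance does. A quick sketch (using numpy, with a much smaller grid standing in for the billion-by-billion screen):

```python
import numpy as np

rng = np.random.default_rng(0)
n, b = 1024, 32  # 1024x1024 grid, coarse-grained into 32x32 blocks

uniform = np.full((n, n), 0.5)                      # lowest entropy: all mid-gray
random_ = rng.choice([0.0, 0.5, 1.0], size=(n, n))  # highest entropy: random pixels

def coarse_grain(img):
    """Average each b x b block -- a crude stand-in for viewing from a distance."""
    return img.reshape(n // b, b, n // b, b).mean(axis=(1, 3))

print(coarse_grain(uniform).mean())  # exactly 0.5
print(coarse_grain(random_).std())   # ~0.01: every block averages out near 0.5
```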
Also, sorry for writing two top-level comments, but I just really care about this topic.
The information-theoretic and statistical explanations of entropy are the easy part. The real question is what entropy means in the original context it was introduced in, before those later explanations existed.
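In that original (Clausius) context, entropy is defined through reversible heat transfer, dS = δQ_rev/T. A worked example with illustrative numbers of my own, heating a kilogram of water:

```python
import math

# Clausius entropy change for reversibly heating water from T1 to T2:
# dS = dQ_rev / T with dQ = m * c * dT  =>  ΔS = m * c * ln(T2 / T1)
m = 1.0                  # kg of water (illustrative)
c = 4186.0               # J/(kg*K), approximate specific heat of water
T1, T2 = 293.15, 353.15  # 20 °C -> 80 °C, in kelvin

dS = m * c * math.log(T2 / T1)
print(f"ΔS ≈ {dS:.1f} J/K")  # about 780 J/K
```

The definition gives you a number; the comment's point is that it doesn't, by itself, tell you what that number means.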
> But I have no idea what entropy is, and from what I find, neither do most other people.
The article does not go on to explain what entropy is; it just tries to explain away some claims about entropy which, as far as we can tell, do hold, and it never explains why, if those claims were wrong, they nevertheless hold in practice.