Posted by konradx 4/14/2025

Why Everything in the Universe Turns More Complex (www.quantamagazine.org)
149 points | 98 comments
hliyan 4/14/2025|
Tried reading the paper [1]. I understand the authors are academics, which is why I'm surprised the paper reads like a layman's attempt at contributing to a "theory of everything", or at best, an inquiry written by an 18th-century European philosopher of science.

- "identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" - what exactly is a "conceptual equivalence"? You mean models? Unifying disparate observations into models is basic science. Not sure why it is highlighted here as some important insight.

- "The laws of classical physics emerged as efforts to provide comprehensive, predictive explanations of phenomena in the macroscopic world" - followed by a laymen's listing of physical laws, then goes on to claim "conspicuously absent is a law of increasing “complexity.”"

- then a jumble of examples including gravitation, stellar evolution, mineral evolution and biological evolution

- this just feels like a slight generalization of evolution: "Systems of many interacting agents display an increase in diversity, distribution, and/or patterned behavior when numerous configurations of the system are subject to selective pressure."

At this point, I gave up.

[1] https://www.pnas.org/doi/10.1073/pnas.2310223120

bubblyworld 4/14/2025||
I think speculative science always starts out as philosophy. This is as true now as it was in the 18th century. If you look at any thinker on the edge of human understanding you'll find something similar (e.g. I was reading Michael Levin's stuff on bioelectricity recently and it also has a heavy dose of philosophy).

I don't really have an issue with any of the points you raised - why do they bother you?

The interesting stuff is the discussion about "functional information" later in the paper, which is their proposed quantitative measure for understanding the evolution of complexity (although it seems like early stages for the theory).

It's "just" a slight generalisation of the ideas of evolution but it applies to nonbiological systems and they can make quantitative predictions. If it turns out to be true then (for me) that is a pretty radical discovery.

I'm looking forward to seeing what can be demonstrated experimentally (the quanta article suggests there is some evidence now, but I haven't yet dug into it).

haswell 4/14/2025|||
> I think speculative science always starts out as philosophy. This is as true now as it was in the 18th century.

Indeed, and Natural Philosophy was the precursor to what we now call Science.

I still think the old name better fit what we’re doing because it admits that the work is still a philosophical endeavor.

This is not to question the validity of what we now call science. But it's common these days to believe in the ultimate supremacy of science as the answer to questions that are best explored both philosophically and scientifically, even though pure science still can't answer important philosophical questions that the entire scientific discipline rests upon.

analog31 4/14/2025||
Tell me about the supremacy of science after the government restores the NIH, NOAA, etc. In fact most people in the world believe in the supremacy of their religious faiths.
haswell 4/14/2025|||
You're describing anti-science sentiment, which is problematic and dangerous. But this is also whataboutism.

I'm describing unfounded beliefs many people hold about science based mostly on a lack of philosophical understanding, which is orthogonal to anti-science sentiment and still important to examine.

I don't see a reason for there to be tension between the two.

analog31 4/14/2025||
I may have overreacted. I'm a scientist, and I'm surrounded by scientists. When I hear about "supremacy of science" it's usually being presented as a straw man. I don't know any scientists who believe it, beyond the temporary phase in everybody's education where they get caught up in the "master of the universe" feeling.
haswell 4/15/2025||
That’s understandable, especially in the current climate. I’ve definitely encountered the straw men from the anti-science types too, and it’s incredibly frustrating.

As a layperson, I often come in contact with people who believe in science but fall into what is essentially scientific absolutism and see philosophy as irrelevant. I was one of those people in my 20s before I went down some rabbit holes that set me straight. Many of the people around me did not.

The scientists I know are not the absolutist types. I sometimes forget there are more scientists here than in the average internet community.

ysofunny 4/14/2025|||
My religious faith is science

now what?

analog31 4/14/2025||
I'm open minded about religion. It can be whatever you want.
ysofunny 4/14/2025||||
> I think speculative science always starts out as philosophy

or in my words: "the first approximation is poetic. the last one is mathematical"

from philosophy to hard science, engineered tooling, and other products (and/or services)

similarly to

from poetry as dubious, cloudy, and vague ideas all the way to crystal clear, fixed and unmoving (dead) formalizations

cryptonector 4/15/2025|||
> I don't really have an issue with any of the points you raised - why do they bother you?

Idk about GP, but bad science writing ("identification of conceptual equivalencies ...") does bother me. It's sloppy, and tends to hide possibly invalid shortcuts taken by the authors by being an impenetrable fog of words. That sort of thing is a very good indicator of bunk, and it tends to peg my BS meter. Which isn't to say that there is no place for that sort of language in a scientific paper, but that one should preface the use of it with an admission of hand-waving for some purpose.

bubblyworld 4/16/2025||
Right, I agree with you in general, but in this particular case it seems fine to me. OP refers to the first two paragraphs of the introduction, so the authors clearly aren't hiding anything. It's a very far cry from an actual crank paper like the sort you see on vixra. But yeah, this stuff is subjective, I probably just have a higher tolerance for fluff.
visarga 4/16/2025|||
I think you should try to get the intent instead of stumbling at the surface level. The core idea is that recursion explains emergence.

A distributed system can still achieve centralized outcomes as a result of centralizing constraints acting on it. For example, matter under gravitational forces leads to celestial bodies, particles under EM forces lead to stable chemical molecules, genes and species under the constraint of replication lead to evolution, language under the constraint of usage leads to the evolution of culture, and brains under the constraint of serial action lead to centralized semantics and behavior. In neural nets we have the loss function as a centralizing constraint, moving weights towards achieving a certain functional outcome.
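
A minimal numerical sketch of that last point (my own illustration with made-up numbers, not anything from the paper): several independently initialized weight vectors all drift to the same functional outcome once a single shared loss acts as the centralizing constraint.

    # Minimal sketch (my own illustration): five independently initialized
    # "agents" (weight vectors) are pushed toward the same functional outcome
    # by one shared loss - a centralizing constraint acting on a distributed system.
    import numpy as np

    rng = np.random.default_rng(0)
    target = np.array([1.0, -2.0, 0.5])   # the function the loss selects for

    def grad(w):
        return 2.0 * (w - target)         # gradient of the squared-error loss

    agents = rng.normal(size=(5, 3))      # distributed, uncoordinated starting points
    for _ in range(1000):
        agents -= 0.01 * grad(agents)     # the same constraint acts on every agent

    print(agents.round(3))                # every row has converged on `target`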

Ok, so what is the relation between centralizing constraints and recursion? Recursion is how distributed activity generates constraints. Every action becomes a future constraint. I think this approach shows great promise. We can link recursive incompressibility and undecidability to explanatory gaps. You can't know a recursive process unless you walk the full path of recursion; you have to be it to know it. There is no shorter description of a recursive process than its full history.

So what looks like constraints when seen top-down looks like search when seen bottom-up. Particles search for minimal energy, genes for survival, markets for profit, and our actions for goal maximization. Search acts on all levels, but since constraints are emergent, search is also open-ended.

Mzxr 4/18/2025||
Your comment reminded me of this article [1]. It suggested that we experience time because the underlying processes of the universe are recursive. And since our computational capacity is limited, we as observers can only perceive the future by progressively unfolding it. If recursion explains emergence, could it follow that everything tends to grow more complex over time?

[1] https://news.ycombinator.com/item?id=41782534

raxxorraxor 4/14/2025|||
I believe a model and a concept can be equivalent; I'm not sure about the required formal terminology in English.

Complexity is probably most formally modeled by entropy in thermodynamics, although entropy behaves in the opposite direction to the one these ideas and observations suggest it should.

It still raises questions about the reason for this complexity, and there is no scientific answer aside from "probably accidental complexity".

Science is curious, so it probably shouldn't be dismissed over unmet formal requirements that aren't specified. "Layman" is unspecific, so what would your requirements be, exactly?

coldtea 4/14/2025||
>- "identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" - what exactly is a "conceptual equivalence"? You mean models?

No, a model is not an "identification of conceptual equivalencies among disparate phenomena". It's a simplified representation of a system.

"identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" could be called an analogy, an isomorphism, a unifying framework, etc.

>Unifying disparate observations into models is basic science. Not sure why it is highlighted here as some important insight.

Perhaps because the most important insights are the most basic ones - it's upon those that everything else sits.

>At this point, I gave up

If you can't be bothered to read beyond the abstract or the first paragraph, or are perplexed that the abstract has a 10,000-foot simplistic introduction to the basics, then it's better that you did :)

EncomLab 4/14/2025||
"Complexity" is a hugely problematic term when used in this way - remember that entropy and complexity are related, but they are not interchangeable. A complex system can have lower entropy than a simpler system, and conversely, a system can have high entropy but be relatively simple. By mingling these terms without specifying objective reference points, it all just comes out as word salad.

This paper just reads like an attempt at sounding smart while actually saying little.

titzer 4/14/2025||
> a system can have high entropy but be relatively simple.

Good examples are things that Kolmogorov-compress well. By almost any measure, the output of a pseudorandom number generator has high entropy; yet it has low information density (low complexity), because the program that generates the sequence, plus its state, is really small.
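
A rough sketch of what I mean (my own toy example, nothing from the paper): a few lines of generator code produce output whose empirical entropy is near the 8-bit maximum, yet the program plus its 32-bit state is tiny, so the Kolmogorov complexity is low.

    # Toy illustration: a short program (low Kolmogorov complexity) generates
    # a byte stream that looks statistically high-entropy.
    import math
    from collections import Counter

    def lcg_bytes(seed, n):
        """Tiny linear congruential generator (constants from Numerical Recipes)."""
        state, out = seed, []
        for _ in range(n):
            state = (1664525 * state + 1013904223) % 2**32
            out.append(state >> 24)       # take the high byte
        return out

    data = lcg_bytes(seed=42, n=100_000)

    # First-order (byte-frequency) entropy: close to the 8-bit maximum...
    counts = Counter(data)
    H = -sum(c / len(data) * math.log2(c / len(data)) for c in counts.values())
    print(f"empirical entropy: {H:.2f} bits/byte (max 8)")
    # ...yet the generator plus its state fits in a few dozen bytes, so the
    # sequence has very low information density.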

andrewflnr 4/15/2025||
I think a better example is just hot gas. Heat up a tube of gas, and its entropy will increase, with no effect on its complexity. Still not terribly compressible either though.
pyfon 4/14/2025|||
Yes indeed. As I understand it, entropy is about states that are more likely.

I wonder if it always increases though? Eventually there will be enough entropy that any change may cause it to reduce or oscillate? (At universe / reachable universe scale).

kergonath 4/14/2025|||
> I wonder if it always increases though?

It always increases in an isolated system. That caveat is almost always missing in pop-sci level of discussions about entropy, but it is crucial.

> Eventually there will be enough entropy that any change may cause it to reduce or oscillate?

Assuming that the universe is actually an isolated system, entropy will reach a maximum (it cannot oscillate). It is interesting to speculate, and of course our theories are imperfect and we are certainly missing something. In particular, the relationship between time and entropy is not straightforward. Very roughly: is the entropy a function of time, which we could define otherwise, or is time a consequence of entropy changes?

In the first case, we can suppose that if the universe reaches an entropy maximum we’d be far enough outside the conditions under which our theories work that we’d just have entropy decrease with time (i.e., the rule that entropy increases with time is only valid close to our usual conditions).

But in the second case, it would mean that the universe reached the end of time. It could evolve in any conceivable way (in terms of the fundamental laws of physics), and the arrow of time would always point to the same moment. "What comes after?" would be a question just as meaningless as "What came before the Big Bang?"

In any case, there are a lot of assumptions and uncertainty. The story does not do the subject any justice.

andrewflnr 4/14/2025|||
Yes, we call that state "heat death". Note that the second law is actually that entropy never decreases; it's allowed to stay constant for certain interactions (for instance I'm pretty sure an elastic collision preserves entropy).
ysofunny 4/14/2025||
that is why the complex is distinct from the complicated
kens 4/14/2025||
Coincidentally, I'm reading Walker's book "Life as No One Knows It: The Physics of Life's Emergence" on the same topic. (Walker is one of the researchers in the article.) Summary: I don't like the book. The book was motivating me to write an article "Books I don't like", but I'll comment here instead :-)

The book describes "Assembly Theory", a theory of how life can arise in the universe. The idea is that you can quantitatively measure the complexity of objects (especially chemicals) by the number of recursive steps needed to create them. (The molecule ATP is 21, for instance.) You need life to create anything over 15; the idea is that life contains information that can create structures more complex than what can be created randomly. The important thing about life is that it isn't spontaneous, but forms an unbroken chain through time. Explaining how it started may require new physics.
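
As far as I can tell, the counting works roughly like this (a toy sketch on strings, my own illustration rather than the book's molecular definition, in which ATP's 21 comes from joining chemical building blocks): the assembly index is the minimum number of join operations needed to build an object, where any previously built piece can be reused.

    # Toy "assembly index" on strings, assuming: single characters are free
    # building blocks, each join of two available pieces costs one step, and
    # previously built pieces can be reused.
    from collections import deque

    def assembly_index(target: str) -> int:
        if len(target) <= 1:
            return 0
        basics = set(target)                      # single characters, cost 0
        start = frozenset()
        queue = deque([(start, 0)])               # (built pieces, joins so far)
        seen = {start}
        while queue:
            pool, steps = queue.popleft()
            available = basics | set(pool)
            for a in available:
                for b in available:
                    piece = a + b
                    if piece not in target:       # prune pieces that can't appear
                        continue
                    if piece == target:
                        return steps + 1          # BFS depth = minimal join count
                    nxt = frozenset(pool | {piece})
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, steps + 1))
        return -1                                 # unreachable for non-empty input

    print(assembly_index("ABCD"))   # 3: A+B, AB+C, ABC+D
    print(assembly_index("ABAB"))   # 2: A+B, then AB+AB (reuse lowers the index)

Reuse of sub-assemblies is what keeps the index low, which, as I understand it, is what the "anything over 15 needs life" claim hinges on.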

If the above seems unclear, it's because it is unclear to me. The book doesn't do a good job of explaining things. It looks like a mass-market science book, but I found it very confusing. For instance, it's unclear where the number 21 for ATP comes from, although there's an analogy to LEGO. The book doesn't define things and goes into many, many tangents. The author is very, very enthusiastic about the ideas but reading the book is like looking at ideas through a cloud of vagueness.

The writing is also extremely quirky. Everyone is on a first-name basis, from Albert (Einstein) to Johnny (von Neumann) and Erwin (Schrödinger). One chapter is written in the second person, and "you" turn out to be "Albert." The book pushes the idea that physics is great and can solve everything, covering physics "greatest hits" from relativity and quantum mechanics to gravitational waves and the Higgs boson. (The underlying theme is: "Physics is great. This book is physics. Therefore, this book is great.") The book has a lot of discussion of how it is a new paradigm, Kuhn's paradigm shifts, how it will move astrobiology beyond the pre-paradigmatic phase and unify fields of research and so forth. It's not a crackpot book, but there are an uncomfortable number of crackpot red flags.

I'm not rejecting the idea of assembly theory. To be honest, after reading the book, I don't understand it well enough to say which parts seem good and which parts seem flawed. There seem to be interesting ideas struggling to get out but I'm not getting them. (I don't like to be negative about books, but there are a few that I regret reading and feel that I should warn people.)

roughly 4/14/2025||
Walker gave a talk recently at Long Now on Assembly Theory that sounds like it did a better job of getting the point across:

https://longnow.org/ideas/informational-theory-life/

aradox66 4/15/2025|||
I felt similar reading that book. She seems very clear that she wants to develop paradigmatic physics, and wants Assembly Theory to be paradigmatic, but there's not a lot of meat on the bone.
Viliam1234 4/14/2025||
> It's not a crackpot book, but there are an uncomfortable number of crackpot red flags.

How do you know it's not a crackpot book? All evidence you mentioned here seems to support that conclusion.

petre 4/14/2025||
Douglas Adams was right all along then.

"There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another theory which states that this has already happened."

gsf_emergency_2 4/14/2025|
Fancier but less humorous take by "experts" (including Goedel):

https://en.wikipedia.org/wiki/Modal_collapse

andrewflnr 4/14/2025||
Amateur speculation, but informed by professionals: I think this tendency toward complexity is situational, not fundamental. Specifically, it's a product of this stage of the universe having lots of available energy. More complex structures are favored when/because they can consume more energy and increase entropy more effectively. The complexity will probably start fading when the hydrogen-fusion party dies. The second law will continue on its way.
OgsyedIE 4/14/2025||
I'm fairly sure this is already in the usual canon of statistical mechanics.

"When one compares a hotplate with and without a Benard cell apparatus on top, there is an overall increase in entropy as energy passes through the system as required by the second law, because the increase in entropy in the environment (at the heat sink) is greater than the decreases in entropy that come about by maintaining gradients within the Benard cell system."

gsf_emergency 4/14/2025|
The abstract heresy innuendo'd here seems to be about an increase in global (aka universal) "complexity"*

(Think: no heat death!)

Related to another heresy understated by qmag just this week: https://news.ycombinator.com/item?id=43665831

In that case, qmag didn't (dare to?) shout loud enough that the para-particles are globally ?distinguishable..

That's like a very restricted version of TFA's claim though..

Another take on the issue:

https://scottaaronson.blog/?p=762

*I don't want to say "entropy" because it's not clear to many folks, including experts, whether entropy is uh, "correlated" or "anticorrelated" with complexity.

raxxorraxor 4/14/2025||
> "correlated" or "anticorrelated" with complexity.

Also, the value of entropy has different signs in thermodynamics and computer science, for example. Not helpful either...

gsf_emergency 4/15/2025||
It's because thermo counts states and CS uses probabilities... strange swap, I know, people assume the opposite..
Kungfuturtle 4/14/2025||
This reminds me of Teilhard de Chardin's take on complexification, as laid out in his seminal book Le Phénomène humain. See e.g., this article[0] for a simple overview of the hypothesis. For further reading, I recommend the excellent new translation by Sarah Appleton-Weber, The Human Phenomenon[1].

[0] <https://onlinelibrary.wiley.com/doi/pdf/10.1002/%28SICI%2910...>

[1] <https://www.liverpooluniversitypress.co.uk/doi/book/10.3828/...>

skywhopper 4/14/2025||
I never trust the sense of new scientific ideas I get from popular press articles. But this comes across as highly questionable, "Intelligent Design" redux stuff. Sure, there are some interesting points about information theory etc., but overall it sounds like a lot of scientists desperately cribbing concepts they don't actually understand from other fields and misapplying them to oversimplified computer simulations that someone who barely understands Python wrote 20 years ago, then assuming the simulation, which has built-in, accidentally hard-coded selection factors, is the same as reality.

Seriously, phrases like “selection for function”, unified theories of biology and physics, and big ideas about the second law of thermodynamics are major red flags.

adrian_b 4/14/2025||
Sentences like this, i.e. "everything turns more complex", must be formulated much more precisely in order to be true.

The article talks a lot about biological evolution, but in that case the only claim that is likely to be true is that the complexity of the entire biosphere increases continuously, unless a catastrophe resets the biosphere to a lower complexity.

If you look only at a small part of the biosphere, like one species of living beings, it is very common to see it evolve to become simpler, not more complex, because a simpler structure is usually optimal under constant environmental conditions; more complex structures are mainly beneficial for avoiding extinction when the environmental conditions change.

kdavis 4/14/2025|
“The law that entropy always increases holds, I think, the supreme position among the laws of Nature. If someone points out to you that your pet theory of the universe is in disagreement with Maxwell's equations - then so much the worse for Maxwell's equations. If it is found to be contradicted by observation - well, these experimentalists do bungle things sometimes. But if your theory is found to be against the Second Law of Thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” ― Arthur Eddington, New Pathways in Science
EVa5I7bHFq9mnYK 4/14/2025|
Entropy is always increasing in a closed system, but locally it can decrease if energy is supplied from the outside. Our evolution on Earth comes at the expense of entropy increasing elsewhere, ultimately driven by the Sun.
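
A back-of-the-envelope version of that bookkeeping (rough numbers of my own, ignoring the 4/3 factor for the entropy of blackbody radiation):

    # Earth absorbs sunlight carried at a high effective temperature and
    # re-radiates the same power at a much lower one, so it exports far more
    # entropy than it imports. Local decreases (life, structure) ride on that
    # net export.
    P       = 1.2e17   # W, solar power absorbed by Earth (approx.)
    T_sun   = 5800.0   # K, effective temperature of incoming sunlight
    T_earth = 255.0    # K, Earth's effective emission temperature

    S_in  = P / T_sun      # entropy flux received, W/K
    S_out = P / T_earth    # entropy flux radiated away, W/K
    print(f"in:  {S_in:.2e} W/K")
    print(f"out: {S_out:.2e} W/K")
    print(f"net export: {S_out - S_in:.2e} W/K  (positive, as the second law demands)")
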
mr_toad 4/14/2025|||
> Entropy is always increasing in a closed system

Only if that system isn’t already in thermodynamic equilibrium. A closed system that reaches thermodynamic equilibrium has maximum entropy.

Why the universe as a whole didn't start out in thermodynamic equilibrium, i.e., why it didn't have maximum entropy, is something we don't understand.

api 4/14/2025|||
Maybe it's not a closed system.

https://en.wikipedia.org/wiki/Black_hole_cosmology

I'm partial to the hypothesis that our universe is actually a giant black hole in some kind of larger universe. The Big Bang was really the formation of our universe's event horizon. Cosmic inflation is the result of stuff falling into our universe, adding to its mass-energy -- there is no dark energy, our universe is just accreting mass-energy from something larger.

As for what the larger universe looks like -- in this model it may be impossible to know because the event horizon is impenetrable. It could be a much larger universe or it could be something else, like a higher dimensional one.

EVa5I7bHFq9mnYK 4/14/2025|||
If it were so, there would be no one to ask that question.
ThrowawayTestr 4/14/2025|||
I read a theory that life in the universe might be thermodynamically favored because living things increase entropy so much.
pyfon 4/14/2025||
Life in the universe is pretty unfavourable! A rare thing indeed. Where it has evolved, I think it is less about entropy and more about the nature of the matter - atoms and molecules, particularly carbon and water - and the way they can replicate themselves through chemistry. That has to obey entropy but is not driven by it. Light scattering off the atmosphere will do the entropy trick well enough!
dr_dshiv 4/14/2025||
> A rare thing indeed

We can hardly know that, can we? Water and carbon are abundant.
