10 Comments

Any comment on syntropy?

Entropy is the tendency toward disorder, while syntropy is the tendency toward order and complexity. These two forces are complementary but opposing, and they govern how energy and matter flow and organize in the universe.

Tendencies are not forces, so they are not opposing. If you squeeze a balloon, one radius shrinks and another expands.

Syntropy is "repeated symmetry without reversion." But clearly you have a different definition. I thought the standard term was *synergy*? The distinction is useful to explore, since synergy is not just "negentropy".

Synergy is characteristic of nonlinear open systems and has no clear single definition, but is typically either gain in free energy or gain in information. That is related to *increase* in entropy paradoxically. But — no paradoxes — that tells you entropy is a blunt concept. For an open system it is not what closed cycle thermodynamics traditionally implies (disorder).

Here: *"The 'geometry' here is defined by the inner product, ⟨ψ | φ⟩. "* — is not the way I would say it. The (spacetime) geometry is stochastically compressed by the ψ.... is the more nuanced way of putting it. And the inner product merely projects out some partial information from that compression. It is lossy compression — so-to-speak — so it cannot be defining the geometry.

The actual geometry is defined by ... *nothing* (to be pedantic) since we do not know what it is, but since the ψ are our best stochastic account for the spacetime geometry, you should note the ψ is purely spacetime algebra valued. So Einstein (really) defines the geometry. It is pseudo-Riemannian, but with non-trivial Planck scale (or thereabouts) topology. I know what you are thinking: how the heck do I know? Well you caught me with my substack pants down, congrats. No one knows. But allow me to be bullish about spacetime realism, will ya. It does not detract one iota from a companion spiritual worldview.

All the so-called "fields" (of the Standard Model) can be accounted for in 4D spacetime algebra, so there is nothing else Nature is *forcing* us to accept as base marble geometry. You can add metaphysics on top as you please, like fibre bundles, e.g., Maxwell and Yang-Mills "fields", or strings and whatnot, but that's not good science (and sometimes is "not even physics"🤣), since it is not necessary: it is overloaded language/mathematical tooling that may or may not be useful.

How does this approach deal with the Pusey-Barrett-Rudolph theorem? I thought psi-epistemic theories like Gabriele's were ruled out by this theorem. The quantum world must be psi-ontic at best; it cannot be retrieved from probability alone or classical mechanics... hmmm...

This is way above my educational level at the moment.

What would be your advice on making such material more accessible? Would one need to have a degree in physics and mathematics? Or is there a shorter path to at least be able to understand this stuff on some level of abstraction?

It's all very interesting to me personally, but not very accessible.

Entropy, in its various formulations, captures different aspects of uncertainty or state counting, whether in thermodynamics, statistical mechanics, or information theory. It is traditionally defined in diverse ways, such as through the Boltzmann formula, the Gibbs-Shannon entropy in statistical contexts, or the von Neumann entropy for quantum systems. These approaches reflect the underlying probabilistic or combinatorial nature of the systems, rather than being confined to a single mathematical interpretation.
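
To make the parallel between these definitions concrete, here is a minimal sketch (using NumPy, with natural logarithms so entropy is in nats and k_B = 1, an illustrative choice not stated above): the Gibbs-Shannon formula applied to a uniform distribution over W states recovers Boltzmann's S = ln W.

```python
import numpy as np

def shannon_entropy(p):
    """Gibbs-Shannon entropy H = -sum_i p_i ln p_i (in nats)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # convention: 0 * ln 0 = 0
    return -np.sum(p * np.log(p))

# A uniform distribution over W states recovers Boltzmann's S = ln W
W = 8
uniform = np.full(W, 1.0 / W)
print(shannon_entropy(uniform), np.log(W))  # equal: ~2.079 nats

# A sharply peaked (certain) distribution has zero entropy
print(shannon_entropy([1.0, 0.0, 0.0]))    # 0.0
```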

In classical mechanics, the symplectic structure of phase space (given by forms like ω in dq∧dp) provides a geometric framework for measuring volumes that directly correspond to numbers of states.
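
As a sketch of that volume-to-state-count correspondence, consider a 1D harmonic oscillator (parameters m, w, E below are illustrative choices in units with ħ = 1): the orbit at energy E encloses a symplectic area 2πE/w, and dividing by Planck's h approximates the exact quantum level count.

```python
import numpy as np

# Semiclassical state counting for a 1D harmonic oscillator.
# The orbit H = p^2/(2m) + (1/2) m w^2 q^2 = E encloses an ellipse
# of symplectic area 2*pi*E/w; dividing by h gives the approximate
# number of quantum states with energy below E.
hbar = 1.0                       # illustrative unit choice
h = 2 * np.pi * hbar
m, w, E = 1.0, 1.0, 50.0

area = 2 * np.pi * E / w         # phase-space volume enclosed by the orbit
n_semiclassical = area / h       # ≈ number of states

# Exact quantum count: levels E_n = hbar*w*(n + 1/2) with E_n <= E
n_quantum = int(np.floor(E / (hbar * w) - 0.5)) + 1

print(n_semiclassical, n_quantum)  # 50.0 and 50
```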

In quantum mechanics, the geometry of Hilbert space (via the inner product) informs how states overlap and, through the Born rule, how uncertainty is quantified.
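
A two-line illustration of that overlap-to-probability step (the qubit states below are example choices): the Born rule turns the inner product ⟨ψ|φ⟩ into a measurement probability |⟨ψ|φ⟩|².

```python
import numpy as np

# Two normalized qubit states: |0> and |+> = (|0> + |1>)/sqrt(2)
psi = np.array([1.0, 0.0], dtype=complex)
phi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

inner = np.vdot(psi, phi)    # <psi|phi>; vdot conjugates its first argument
prob = np.abs(inner) ** 2    # Born rule
print(prob)                  # 0.5: |+> yields outcome |0> half the time
```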

In these contexts, the geometric structures serve as a backbone for the mathematical tools used to derive entropy. This perspective is elegant because it ties the abstract notion of "counting" states to tangible geometric measures.

However, to claim that "entropy is geometry and geometry is entropy" is an overstatement for several reasons:

Multiple Mathematical Descriptions:

Entropy can be, and is, formulated in many ways. Probability theory, combinatorics, and even pure information theory provide alternative descriptions that do not rely directly on geometric language. In these formulations, entropy arises from counting possibilities or quantifying uncertainty based purely on probability distributions (e.g., in the Gibbs-Shannon or von Neumann definitions).

Context-Specific Nature:

In a well-prepared quantum system, for instance, the coherent superposition (a pure quantum state) inherently has zero entropy under von Neumann’s measure because there is no statistical mixture involved. The entropy appears only when we describe our lack of complete knowledge, or when the system is in a mixed state. Thus, the relationship between geometry and entropy is contingent on how we set up and interpret the system rather than an intrinsic property of the system itself.
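
That contrast can be checked directly (a sketch with NumPy; the |+⟩ state and the maximally mixed qubit below are example choices): a pure-state density matrix has zero von Neumann entropy, while the maximally mixed qubit has entropy ln 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues (in nats)."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # convention: 0 * ln 0 = 0
    return -np.sum(evals * np.log(evals))

# Pure state |+><+|: a coherent superposition, yet zero entropy
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(plus, plus)
print(von_neumann_entropy(rho_pure))    # ~0.0

# Maximally mixed qubit (a statistical mixture): entropy ln 2
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))   # ~0.693 = ln 2
```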

While geometry provides a compelling and insightful framework for understanding the structure underlying entropy—especially in the context of state counting and invariant measures—it is only one of many ways to describe entropy. The concept of entropy is richer than any single mathematical formulation, and insisting on an exclusive equivalence between entropy and geometry overlooks the diverse methods by which we can and do capture uncertainty and information in physical systems.

I'm excited to read more about emergent gravity and the idea you mentioned in this article. It's interesting that gravity might be explained by the laws of thermodynamics.

I will be exploring that in the next couple months

100% Tiger Blood Entropy post. 🔥🔥🍾

This is above my head. But here is an answer from someone wiser:

https://www.youtube.com/watch?v=BmPcWuv6Mcw
