Any comment on syntropy?
Entropy is the tendency toward disorder, while syntropy is the tendency toward order and complexity. These two forces are complementary but opposing, and they govern how energy and matter flow and organize in the universe.
Tendencies are not forces, so they are not opposing. If you squeeze a balloon, one radius shrinks while another expands.
Syntropy is "repeated symmetry without reversion." But clearly you have a different definition. I thought the standard term was *synergy*? The distinction is useful to explore, since synergy is not just "negentropy".
Synergy is characteristic of nonlinear open systems and has no single clear definition, but it is typically either a gain in free energy or a gain in information. Paradoxically, that is related to an *increase* in entropy. But, no paradoxes: that tells you entropy is a blunt concept. For an open system it is not what closed-cycle thermodynamics traditionally implies (disorder).
That is a much better word than negentropy, which is what I've been using, although I have recently turned it into Evolution. I think syntropy is better though.
Syntropy is the geometry that creates the system that allows entropy to happen through geometry.
Here: *"The 'geometry' here is defined by the inner product, ⟨ψ | φ⟩. "* — is not the way I would say it. The (spacetime) geometry is stochastically compressed by the ψ.... is the more nuanced way of putting it. And the inner product merely projects out some partial information from that compression. It is lossy compression — so-to-speak — so it cannot be defining the geometry.
The actual geometry is defined by... *nothing* (to be pedantic), since we do not know what it is. But since the ψ are our best stochastic account of the spacetime geometry, you should note that the ψ is purely spacetime-algebra valued. So Einstein (really) defines the geometry. It is pseudo-Riemannian, but with non-trivial Planck-scale (or thereabouts) topology. I know what you are thinking: how the heck do I know? Well, you caught me with my substack pants down, congrats. No one knows. But allow me to be bullish about spacetime realism, will ya? It does not detract one iota from a companion spiritual worldview.
All the so-called "fields" (of the Standard Model) can be accounted for in 4D spacetime algebra, so there is nothing else Nature is *forcing* us to accept as base marble geometry. You can add metaphysics on top as you please, such as fibre bundles (e.g., the Maxwell and Yang-Mills "fields"), or strings and whatnot, but that's not good science (and sometimes is "not even physics" 🤣), since it is not necessary; it is overloaded language and mathematical tooling that may or may not be useful.
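For readers wondering what "4D spacetime algebra" refers to concretely, here is a minimal sketch (my own illustration, independent of the commenter's claims about the Standard Model): represent the four basis vectors γ_μ by Dirac matrices and verify the defining Clifford relation γ_μγ_ν + γ_νγ_μ = 2η_μν of Cl(1,3).

```python
import numpy as np

# Pauli matrices and 2x2 blocks used to build the Dirac representation.
I2 = np.eye(2, dtype=complex)
Z2 = np.zeros((2, 2), dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# The four spacetime-algebra basis vectors gamma_mu as 4x4 matrices.
gamma = [
    np.block([[I2, Z2], [Z2, -I2]]),   # gamma_0 (timelike)
    np.block([[Z2, sx], [-sx, Z2]]),   # gamma_1
    np.block([[Z2, sy], [-sy, Z2]]),   # gamma_2
    np.block([[Z2, sz], [-sz, Z2]]),   # gamma_3
]

eta = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

# The defining relation: gamma_mu gamma_nu + gamma_nu gamma_mu = 2 eta_{mu,nu} I.
for mu in range(4):
    for nu in range(4):
        anti = gamma[mu] @ gamma[nu] + gamma[nu] @ gamma[mu]
        assert np.allclose(anti, 2 * eta[mu, nu] * np.eye(4))

print("Clifford relations of Cl(1,3) verified")
```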
How does this approach deal with the Pusey-Barrett-Rudolph theorem? I thought psi-epistemic theories like Gabriele's were ruled out by this theorem. The quantum world should be psi-ontic at best; the quantum world cannot be recovered from probability alone or from classical mechanics... hmmm...
This is way above my educational level at the moment.
What would be your advice on making such material more accessible? Would one need to have a degree in physics and mathematics? Or is there a shorter path to at least be able to understand this stuff on some level of abstraction?
It's all very interesting to me personally, but not very accessible.
Again (not to be saucy though), check my book out, as I wrote it for teenagers like myself when I was 18 and curious. It is just 200 pages and written in plain language, or as plain as it gets for some of these subjects. It also addresses why QM is indeterministic.
https://www.find-your-map.com/__static/jdj5jdewjgvfy0ximkw5vtvxdxrtwfbw/Find-Your-Map-PDF-Download.pdf
Entropy, in its various formulations, captures different aspects of uncertainty or state counting, whether in thermodynamics, statistical mechanics, or information theory. It is traditionally defined in diverse ways, such as through the Boltzmann formula, the Gibbs-Shannon entropy in statistical contexts, or the von Neumann entropy for quantum systems. These approaches reflect the underlying probabilistic or combinatorial nature of the systems, rather than being confined to a single mathematical interpretation.
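A concrete cross-check of how these formulations relate (a toy sketch of my own, not from the comment above): for a uniform distribution over Ω microstates, the Boltzmann, Gibbs-Shannon, and von Neumann definitions all return the same number, k_B ln Ω.

```python
import numpy as np

kB = 1.380649e-23   # Boltzmann constant (J/K)
Omega = 8           # assumed number of equally likely microstates

# Boltzmann: S = kB ln(Omega), pure state counting.
S_boltzmann = kB * np.log(Omega)

# Gibbs-Shannon: S = -kB * sum_i p_i ln p_i, here with uniform p_i = 1/Omega.
p = np.full(Omega, 1.0 / Omega)
S_gibbs = -kB * np.sum(p * np.log(p))

# von Neumann: S = -kB * Tr(rho ln rho), here with maximally mixed rho = I/Omega.
rho = np.eye(Omega) / Omega
evals = np.linalg.eigvalsh(rho)
S_von_neumann = -kB * np.sum(evals * np.log(evals))

print(S_boltzmann, S_gibbs, S_von_neumann)  # all equal kB ln(8)
```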
In classical mechanics, the symplectic structure of phase space (given by the two-form ω = dq ∧ dp) provides a geometric framework for measuring volumes that directly correspond to numbers of states.
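As a minimal numerical sketch of that "volume counts states" idea (my own toy example with assumed parameters, not from the article): for one particle in a 1D box, the semiclassical state count is the accessible phase-space area divided by Planck's constant, and the Boltzmann entropy is k_B times its logarithm.

```python
import numpy as np

h  = 6.62607015e-34   # Planck constant (J s)
kB = 1.380649e-23     # Boltzmann constant (J/K)

L     = 1e-9          # box length (m), an assumed nanoscale example
p_max = 1e-24         # assumed momentum cutoff (kg m/s)

# Accessible phase-space region: q in [0, L], p in [-p_max, p_max].
# The symplectic form omega = dq ^ dp makes this area coordinate-independent.
area = L * (2 * p_max)

n_states = area / h          # semiclassical counting: one state per area h
S = kB * np.log(n_states)    # Boltzmann entropy S = kB ln(Omega)

print(f"states ~ {n_states:.3g}, S ~ {S:.3g} J/K")
```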
In quantum mechanics, the geometry of Hilbert space (via the inner product) informs how states overlap and, through the Born rule, how uncertainty is quantified.
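To see that chain from overlap to uncertainty in code (again a toy of my own, with an assumed qubit state): the squared overlaps |⟨k|ψ⟩|² are the Born-rule probabilities, and the Shannon entropy of that distribution quantifies the uncertainty of a measurement in the chosen basis.

```python
import numpy as np

# A qubit |psi> = cos(theta)|0> + sin(theta)|1>, measured in the {|0>, |1>} basis.
theta = np.pi / 5                    # assumed angle for this example
psi = np.array([np.cos(theta), np.sin(theta)])

basis = np.eye(2)                    # rows are <0| and <1|
probs = np.abs(basis @ psi) ** 2     # Born rule: p_k = |<k|psi>|^2

# Shannon entropy (in bits) of the measurement-outcome distribution.
H = -np.sum(probs * np.log2(probs))
print(probs, H)   # ~[0.65, 0.35], ~0.93 bits
```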
In these contexts, the geometric structures serve as a backbone for the mathematical tools used to derive entropy. This perspective is elegant because it ties the abstract notion of "counting" states to tangible geometric measures.
However, to claim that "entropy is geometry and geometry is entropy" is an overstatement for several reasons:
Multiple Mathematical Descriptions:
Entropy can be, and is, formulated in many ways. Probability theory, combinatorics, and even pure information theory provide alternative descriptions that do not rely directly on geometric language. In these formulations, entropy arises from counting possibilities or quantifying uncertainty based purely on probability distributions (e.g., in the Gibbs-Shannon or von Neumann definitions).
Context-Specific Nature:
In a well-prepared quantum system, for instance, the coherent superposition (a pure quantum state) inherently has zero entropy under von Neumann’s measure because there is no statistical mixture involved. The entropy appears only when we describe our lack of complete knowledge, or when the system is in a mixed state. Thus, the relationship between geometry and entropy is contingent on how we set up and interpret the system rather than an intrinsic property of the system itself.
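That pure-versus-mixed distinction is easy to verify numerically (a sketch with my own toy states): the von Neumann entropy S(ρ) = -Tr(ρ ln ρ) vanishes for a coherent superposition and reaches ln 2 for a 50/50 mixture of qubit states.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]   # discard numerical zeros (0 * log 0 -> 0)
    return -np.sum(evals * np.log(evals))

# Pure state: the coherent superposition (|0> + |1>) / sqrt(2).
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# Mixed state: a 50/50 statistical mixture of |0> and |1>.
rho_mixed = 0.5 * np.eye(2)

print(von_neumann_entropy(rho_pure))    # ~0: no statistical mixture involved
print(von_neumann_entropy(rho_mixed))   # ln 2 ~ 0.693: maximal qubit uncertainty
```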
While geometry provides a compelling and insightful framework for understanding the structure underlying entropy—especially in the context of state counting and invariant measures—it is only one of many ways to describe entropy. The concept of entropy is richer than any single mathematical formulation, and insisting on an exclusive equivalence between entropy and geometry overlooks the diverse methods by which we can and do capture uncertainty and information in physical systems.
I'm excited to read more about emergent gravity and the idea you mentioned in this article. It's interesting that gravity might be explained by the laws of thermodynamics.
I will be exploring that in the next couple of months.
How do you then link the emergence of time to geometry, given that the most common assumption is that the arrow of time emerges as a consequence of entropy change being greater than or equal to zero (dS ≥ 0)?
OK, I think that after going through Gabriele's video and Mathemaniac's explanation video about that, I'm getting closer to understanding the underlying train of thought. Fascinating indeed! The question now is: how come in symplectic space one can count a set of states as large as one wants, but in the "Lorentzian manifold" we live in there is an upper bound on the number of available states, given by the Bekenstein bound? Maybe if gravity is entropy and entropy is geometry, then gravity is geometry indeed, as stated by GR...
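For anyone who wants to see the Bekenstein bound in numbers, here is a hedged sketch (my own illustration; the mass and radius are arbitrary choices): the bound S ≤ 2π k_B R E / (ħ c) caps the entropy of any system of energy E that fits inside a sphere of radius R.

```python
import numpy as np

kB   = 1.380649e-23      # Boltzmann constant (J/K)
hbar = 1.054571817e-34   # reduced Planck constant (J s)
c    = 2.99792458e8      # speed of light (m/s)

def bekenstein_bound(R, E):
    """Maximum entropy (J/K) of a system with energy E inside radius R."""
    return 2 * np.pi * kB * R * E / (hbar * c)

# Assumed toy system: 1 kg of mass-energy (E = m c^2) inside a 0.1 m sphere.
m, R = 1.0, 0.1
S_max = bekenstein_bound(R, m * c**2)

bits = S_max / (kB * np.log(2))   # convert to an information bound in bits
print(f"S_max ~ {S_max:.3g} J/K, i.e. ~ {bits:.3g} bits")
```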
This week you giveth with one hand and take away with the other!
After dragging me kicking and screaming to give up on Gödel's "limits on what is knowable" - a favorite intellectualism in my mouth box - you have handed us something conceptually very useful to hang onto with entropy as geometry.
This formulation seems to me to be related to what Julian Barbour works with in his book *The Janus Point*.
QUESTION: is there a transcript available for the Consciousness Iceberg?
Thanks Liam. I took a quick look at your posts and enjoyed the following: spiral combination; understanding vs comprehension (which I, like McGilchrist, reword as knowledge (science) vs understanding (philosophy or full-body comprehension), and, yes, it all starts in the body); and resonance. I'm pretty sure no one understands time, certainly not me, which is partly why it is fundamental to my temporal hypothesis that explains how concept-formation, or thinking, works. This is the real contribution of my book that took 40 years to complete: we differentiate real or continuous time and thereby create discrete clock time, a human invention that supports concept-formation and explains why we are so different from other animals (who don't wear watches). Light is matter traveling at the speed of causality (with no mass); it is the key interface between spacetime and the quantum domain that underlies everything. Geometry (i.e., fractals) is no doubt involved in this interface (via the collapse of the wave function). In any event, thanks for the comment and GPT summary.
Apologies, I'm new to this platform and appear to be missing some features. Liam, I took a quick look at the GPT response that compared my book to some dynamic theory that I'm not familiar with. From what I can tell it got some things correct; others were off the mark. Best to read the book directly. I'll get this figured out, but it may take a bit. Thanks for understanding.
So syntropy describes life ... cool. You can read about it here:
https://www.find-your-map.com/__static/jdj5jdewjgvfy0ximkw5vtvxdxrtwfbw/Find-Your-Map-PDF-Download.pdf
Evolution is the geometry that creates the system that's capable of entropy through geometry 🌀
Respectfully, evolution, while playing a role in the development of life, does not appear to be fundamental. Rather, life appears to be a highly unusual state of matter that straddles the two domains of the universe (space and matter, and the quantum domain). You can read about it here:
https://www.find-your-map.com/__static/jdj5jdewjgvfy0ximkw5vtvxdxrtwfbw/Find-Your-Map-PDF-Download.pdf
Thanks Brad. https://chatgpt.com/share/67e65915-de84-800f-b0f6-36e8b103ac2e
Light creates time, which evolves into geometry, which creates the system that's capable of entropy through geometry. Evolution is time.
100% Tiger Blood Entropy post. 🔥🔥🍾