Discussion about this post

Elan Barenholtz, Ph.D.

As I think others have pointed out, this conjecture appears to be a reformulation of computationalism or functionalism. I remain an enthusiastic proponent of this perspective, but I also find the term “meaning” to be both rich and problematically vague, making it hard to pin down precisely.

In my view, there are two versions of computationalism. One version holds that the observable functions of cognition—thinking, reasoning, and responding—are inherently computational and substrate independent. The success of large language models provides strong evidence for this view. These models replicate core linguistic behaviors once thought uniquely human, even though they run on silicon rather than in biological neurons. Here, the emphasis is on the abstract patterns and algorithms that drive behavior. When we set aside concerns about the physical substrate, it becomes possible to consider that similar computational processes might be present in systems far removed from brains, such as machines, societies, ecosystems, or even the universe itself. In this light, terms like “spirit” could be reinterpreted to refer to the dynamic computational patterns that give rise to complex behavior rather than to some mysterious, non-material substance, although I suspect that both ancient conceptions and many contemporary users might still lean toward the latter.

The other, more ambitious stance asserts that if two systems are computationally equivalent, then they are not merely functionally similar but also share all non-observable, qualitative properties. In other words, if one system exhibits phenomenal consciousness—if it truly “feels” something—when it performs a certain computation, then any other system executing that same computation should, in principle, be conscious as well. This idea resonates with some of Chalmers’ claims regarding phenomenal equivalence. I think this is a reasonable view, and I believe it, but the challenge still lies in determining exactly what level of computational equivalence is necessary. Large language models might mirror human linguistic behavior and thus support the first version of computationalism, yet they clearly lack the type of computation that might be needed for sensory experience. Perhaps a robot equipped with a multimodal language model—integrating sensory, motor, and affective processing—would be computationally closer to a biological system. Still, it remains an open question whether such equivalence would be sufficient for genuine phenomenal experience, given that the continuous, analog nature of biological bodies can never be fully captured by digital models. Perhaps the right level of “computation” lies in the quarks or microtubules or whatever.

Rob M

Super interesting, Curt. I have been interested in the mechanics (meaning the software) of AI as possibly revelatory of human consciousness. I'm just a curious amateur, so I have nothing to contribute. However, I strongly believe in the concept of “emergence”.

To me, this is “the ghost in the machine”. Have you discussed this topic?

