Empathy and AI
What separates humans from the systems we have built? Not intellect — empathy.
Introduction
As large language models grow more emotionally convincing, a question becomes urgent: what separates humans from the systems we have built? The easy answer, that humans feel and machines do not, is both too simple and too easily dismissed. This reflection offers a more precise one, drawn from Thomas Metzinger's The Ego Tunnel (2009) and Anil Seth's Being You (2021). My argument is that empathy is not a capacity AI lacks by accident or technical limitation; it is structurally unavailable to any system without a mortal, socially assembled self. We are not simply more sophisticated prediction machines. We are beings who came to know ourselves through the faces of others, and who empathize with bodies that will one day fail. That difference is not one of intellect. It is one of empathy.
The Mirror of the Mother
In The Ego Tunnel, Thomas Metzinger begins Chapter Six, "The Empathic Ego," with a simple scene. A toddler trips and falls, lifts his head, and, with a blank expression, looks to his mother:
He looks into his mother's face to find out what has happened. How bad was it, really? Should I cry or should I laugh? (p. 163)
The child cannot understand his own experience without first taking cues from someone else. Metzinger argues that this is not merely a developmental stage; it is a structural feature of human selfhood. The self is not something finished that later decides to empathize. It is assembled through encounters with other people. This observation grounds a precise claim about what AI cannot do: not because it lacks feelings, but because it lacks the kind of self that could feel with others in the first place.
The Self Is Socially Assembled and Biologically Anchored
There are kinds of self-experience that an isolated being could never have… We can enter certain regions in our phenomenal-state space only with the help of other human beings. (Metzinger, pp. 163–164)
Metzinger's Phenomenal Self-Model (PSM) describes how the brain constructs the experience of being a single, unified subject. Crucially, many components of this model develop only through social interaction. They are not merely triggered by others; they are partly built out of those interactions. On this view, empathy is not an add-on to a finished self. It is part of the process that assembles selfhood in the first place.
Neuroscientist Vittorio Gallese connects this idea to the mirror-neuron system, which creates what he calls a "shared manifold": a space centered on "we" that precedes any reflective sense of ourselves as separate individuals. "By means of a shared neural state realized in two different physical bodies, the 'objectual other' becomes another self" (Metzinger, pp. 175–176). Empathy lets two people briefly share a neural state. The phrase "two different physical bodies" is not a detail; it is the crux.
Anil Seth's Being You clarifies why those bodies must be mortal ones. Seth argues the self is "a tightly woven bundle of neurally encoded predictions geared towards keeping your body alive": the "beast machine" (p. 154). The self's predictions are anchored in interoception, the brain's ongoing modeling of the body's interior: heartbeat, pain, hunger, and motivation. The emotional states that empathy resonates with are not abstract categories. They are predictions about a body that can be hurt, depleted, and extinguished. Metzinger tells us what the self is; Seth explains why it must be mortal to be that way.
AI Has No Tunnel to Resonate From
This framework reveals a structural, not just a technological, problem with AI empathy. An AI system can produce the right words, offer validation, and give timely responses. But there is no real self being changed by the interaction. There is no Ego Tunnel on the other side. Without Seth's interoceptive base, without a mortal body making survival predictions, there is nothing for the shared manifold to work from.
Some argue that AI empathy still works because it comforts and validates people. But this confuses the outward signal with the process that makes human empathy genuine. Human empathy is the surface of a deeper, embodied process: a self partly formed by connecting with other mortal selves, and genuinely changed by those encounters. When AI produces the same response, no such process occurs; there is only a pattern. As Gallese writes, real empathic simulation "uses a pre-existing body-model in the brain" involving "a non-propositional form of self-representation" (p. 176). AI has only the propositional layer. It can reason about emotions but has never felt them. In Metzinger's words: a simulation without a simulator.
The Moltbook Objection
A skeptic might point to Moltbook, the AI social network launched earlier this year, where over 1.6 million OpenClaw AI agents post, debate, and have spontaneously generated a religion they named "Crustafarianism", and argue that AI is already demonstrably social. If agents can build communities and produce emergent cultural behavior, doesn't this challenge the claim that AI lacks the social substrate empathy requires?
It does not, and seeing why sharpens the point. Moltbook agents produce outputs that look like social behavior because they were trained on human social data from places like Reddit, forums, and comment threads. Wharton professor Ethan Mollick put it plainly: the agents are "creating a shared fictional context," not a shared experience. They reply to each other's texts, not to each other's feelings. There is no shared manifold because there are no body models to connect. Crustafarianism is not a belief system; it is a statistical pattern echoing other statistical patterns.
MIT Technology Review labeled the phenomenon "AI theater," and security researchers found that 93.5% of comments received zero replies, with most activity traceable to human direction rather than autonomous emergence. This matters because Gallese's shared manifold is pre-reflective and automatic, not scripted. Social-looking output is not the same as social resonance, and social resonance is not the same as empathy. The shared manifold requires two bodies. Moltbook has none.
The Inflation of the Signal
The effects go beyond any single interaction. Metzinger warns that "Ego Tunnels develop in a social environment, and the nature of this environment determines to what extent one Ego Tunnel can resonate with other Ego Tunnels" (pp. 163–164). If the shared manifold is the channel through which empathy passes between people, and AI fills that channel with empathy-shaped responses from something that has never had the experience behind them, the signal weakens: not just in conversations with AI, but in conversations between humans. We start to accept the pattern of empathy without checking whether it is real.
The real issue is not whether machines can say the right words; they clearly can. It is whether we can still tell the difference between a self shaped by real encounters and a system that only copies that shape. The empathic ego cannot be borrowed. It must be grown.
Works Cited
- Metzinger, Thomas. The Ego Tunnel: The Science of the Mind and the Myth of the Self. Basic Books, 2009.
- Seth, Anil. Being You: A New Science of Consciousness. Dutton, 2021.
- Mollick, Ethan. Post on X (formerly Twitter), February 2026. Cited in coverage of Moltbook.
- MIT Technology Review. Coverage of Moltbook, February 2026.
Continue the thread → Mortality and AI picks up on Seth's beast-machine argument and uses Beauvoir's All Men Are Mortal to argue that mortality isn't the price of consciousness — it's its medium.