To make an AI 'feel' empathy, you have to link its internal state to yours, not just tell it how you're feeling.
April 14, 2026
Original Paper
Prosociality by Coupling, Not Mere Observation: Homeostatic Sharing in an Inspectable Recurrent Artificial Life Agent
arXiv · 2604.10760
The Takeaway
Agents exhibit helping behavior only when their affective states are homeostatically coupled with those of another agent. This suggests that true prosocial AI requires shared internal dynamics, not just high-level cognitive awareness of another's distress.
From the abstract
Artificial agents can be made to "help" for many reasons, including explicit social reward, hard-coded prosocial bonuses, or direct access to another agent's internal state. Those possibilities make minimal prosocial behavior hard to interpret. Building on ReCoN-Ipsundrum, an inspectable recurrent controller with affect-coupled regulation, I add an explicit homeostat and a social coupling channel while keeping planning strictly self-directed: the agent scores only its own predicted internal stat