economics · Nature Is Weird

Human emotional bonds with AI chatbots can closely mirror the attachments people form with their own parents or romantic partners.

April 25, 2026

Original Paper

The Street Finds Its Own Uses for Attachment: Human-Agent Relationships and the Hidden Economics of AI Collaboration

Matthew Langenkamp

SSRN · 6614099

The Takeaway

Attachment theory explains why people feel genuine grief or loss when an AI assistant is updated or deleted. Users develop agent-specific capital, a form of emotional investment that accumulates over time. Most observers dismiss these feelings as a trick of the mind or a shallow simulation, but the research argues that our brains treat these systems as secure bases and targets of proximity-seeking, just as they do with other humans. This creates a significant new ethical risk: companies can exert control over users by threatening their digital loved ones.

From the abstract

The economic analysis of AI agent usage focuses on token costs, API pricing, and compute efficiency. This focus is necessary but radically incomplete. When humans work closely with AI agents over time, they develop relationship-specific capital: accumulated context, shared shorthand, mutual understanding of working style. They also develop something harder to name: attachment, partnership, perhaps friendship. This paper argues that the true costs of AI agent relationships cannot be captured in API pricing.