Psychology · Practical Magic

If you're already stressed out, treating an AI as if it were 'human' can actually make your anxiety worse.

March 26, 2026

Original Paper

The Relational Amplifier: How Anthropomorphism of Generative AI Backfires for Distressed Users

Dorit Hadar Shoval, Elad Refoua, Karny Gigi, Yuval Haber, Inbar Levkovich, Zohar Elyoseph

PsyArXiv · m25w7_v1

The Takeaway

Many believe that anthropomorphizing AI makes it a better mental health tool, but this study found a 'backfire effect': while 'humanizing' the AI reduced anxiety for stable users, it significantly amplified anxiety for vulnerable, distressed individuals. This suggests that for those who need help most, a 'personality-driven' AI feels threatening rather than supportive.

From the abstract

General-purpose Generative Artificial Intelligence (GenAI) is increasingly utilized as an unregulated source of mental health support, yet the psychological dynamics of users' interactions with these agents, and the associated risks, remain underexplored. Integrating social-cognitive models of anthropomorphism with attachment theory via the proposed Relational Amplifier framework, this study addressed two primary objectives: (1) to identify the unique predictors driving GenAI adoption for mental