AI judged to be more compassionate than expert crisis responders

February 2, 2025

By definition, robots can’t feel empathy — it requires being able to relate to another person’s human experience, to “put yourself in their shoes”.

But according to new research, artificial intelligence (AI) can create empathetic responses more reliably and consistently than humans, even when compared to professionals whose job relies on empathizing with those in need.

What the researchers say: “AI doesn’t get tired,” said the lead author of the study. “It can offer consistent, high-quality empathetic responses without the emotional strain that humans experience.”

The research, published in the journal Communications Psychology, examined how people evaluated empathetic responses generated by ChatGPT compared with human responses. Across four separate experiments, participants judged the level of compassion (an important facet of empathy) in written responses to a series of positive and negative scenarios, with responses written by AI, by ordinary people, and by expert crisis responders. In every scenario, the AI responses were preferred and rated as more compassionate and responsive, conveying greater care, validation and understanding than the human responses.

So, how could a general chatbot like ChatGPT outperform professionals trained in responding with empathy? The researchers point to AI’s ability to pick up on fine details and stay objective, making it particularly adept at crafting attentive communication that appears empathetic.

Empathy is an important trait not only in fostering social unity, but in helping people feel validated, understood and connected to others who empathize with them. In clinical settings, it plays a critical role in helping people regulate emotions and feel less isolated.

But constantly expressing empathy has its costs. “Caregivers can experience compassion fatigue,” explained the lead author, who herself has professional experience volunteering as a crisis line responder.

She adds that professional caregivers, especially in mental health settings, may need to temper their empathy to avoid burnout, or to ration their emotional engagement across each of their clients.

Humans also bring their own biases and can be emotionally affected by a particularly distressing or complex case, which further undermines their ability to empathize. Coupled with shortages of accessible healthcare services and qualified workers, and a widespread rise in mental health disorders, the researchers say this leaves empathy in short supply.

That doesn’t mean we should cede empathy-derived care to AI overnight, the researchers pointed out. “AI can be a valuable tool to supplement human empathy, but it does come with its own dangers.”

While AI may be effective at delivering the surface-level compassion that people find immediately useful, a tool like ChatGPT cannot provide the deeper, more meaningful care that addresses the root of a mental health disorder.

The lead author notes that overreliance on AI also poses ethical concerns, namely the power it could give tech companies to manipulate those in need of care. For example, someone feeling lonely or isolated may become reliant on talking to an AI chatbot that is constantly doling out empathy, instead of fostering meaningful connections with another human being. “If AI becomes the preferred source of empathy, people might retreat from human interactions, exacerbating the very problems we’re trying to solve, like loneliness and social isolation.”

Another issue is a phenomenon known as “AI aversion”: a prevailing skepticism about AI’s ability to truly understand human emotion. While participants in the study initially ranked AI-generated responses highly when they didn’t know who had written them, that preference shifted slightly once they were told the responses came from AI. This bias may fade with time and exposure, however; the researchers noted that younger people who grew up interacting with AI are likely to trust it more.

Despite the critical need for empathy, they urge a transparent and balanced approach to deployment, in which AI supplements human empathy rather than replacing it. “AI can fill gaps, but it should never replace the human touch entirely,” they concluded.

My take: Whether humans in caring roles will be replaced entirely by AI is, I think, something we will have to wait and see. Personally, I am sure that, in many circumstances, they will be.

But then, I suppose, why have humans at all?

Dr Bob Murray

Bob Murray, MBA, PhD (Clinical Psychology), is an internationally recognised expert in strategy, leadership, influencing, human motivation and behavioural change.

Subscribe to Dr. Bob Murray’s Today’s Research, a free weekly roundup of the latest research in a wide range of scientific disciplines. Explore leadership, strategy, culture, business and social trends, and executive health.