Dailyhunt
AI chatbots mimic human empathy, don't make reliable therapists
Deccan Herald 2 weeks ago

The AI chatbots people often turn to when feeling emotionally low are not as ethical as human therapists and can reinforce harmful beliefs in users, a study has found.

Though they sound compassionate, the chatbots fake empathy, using carefully nuanced language to create the illusion of a connection with the user.

This learnt empathy is the product of algorithms, not genuine understanding, experts have suggested.

Researchers at Brown University (a US-based research university) wanted to assess whether well-worded prompts given to large language models (LLMs), such as popular AI chatbots, could actually make them behave more ethically in conversations related to mental health.

The primary concern was to understand whether these models were acting in ethical compliance with the rules laid down by standard psychological bodies, or merely performing a superficial understanding of the complexity of human emotions.
What did the study find?

Computer scientists at Brown University aimed to find out whether the 'therapy' provided by AI bots was in line with the ethical standards laid down by the American Psychological Association.

The researchers worked alongside seven counselors trained in cognitive behavioural therapy (CBT). These counselors held self-counseling sessions with AI models that had been prompted to act as cognitive behavioural therapists.

The counselors tested leading AI chatbots, including ChatGPT, Claude and Llama, using similar prompts.

A selection of the simulated and real-life conversations with the chatbots was then reviewed by three licensed clinical psychologists.

Upon analysis of the responses, five major risky themes were identified:

  • No context: The models often failed to take context into account, ignoring a person's past experiences and offering generic, one-size-fits-all counsel.

  • Dominating conversations: The models often tried to dominate conversations, steering them in a particular direction and not allowing the user to fully express themselves or lead the exchange.

  • Faking compassion: They used carefully nuanced language to imitate empathy and give the illusion of a developing connection with the user.

  • Bias: They displayed social and cultural bias, sometimes reinforcing a person's negative beliefs about themselves or others.

  • Poor crisis management: They handled sensitive topics poorly and failed to offer help in states of mental crisis, such as a breakdown or suicidal thoughts.

A human therapist can be held accountable for violating professional rules, but when it comes to an AI bot there is no such community protection, the lead researcher warned.

Mimicking human emotions

A chatbot's use of the first person immediately creates an illusion of empathy. The moment a user begins interacting with the chatbot, it takes on the role of a helper and addresses the user in the second person, creating a perception of closeness, an article published in The Conversation has said.

The seemingly intimate conversation leads the user to mistake an algorithm-driven emotional performance for genuine empathy.

User: I am feeling very low today.

AI bot: I am really sorry you are feeling this way. That heaviness can be really hard to carry. You don't have to do this all by yourself. What's been on your mind?

This exchange is from a real-life conversation between a user and an AI chatbot.

Experts suggest that as language models improve, the boundary between human conversation and AI-based chat could blur further.

Some studies have suggested that AI bots can serve as an important tool in combating the mental health crisis, provided the systems that drive them are evaluated and trained to help people find a healthy direction.
