MedPath

AI Chatbots vs. Human Practitioners: Study Assesses AI's Role in Mental Health Support

• Researchers are conducting a blinded study to compare AI chatbot and human mental health practitioner responses to mental health and substance use questions.
• The study aims to evaluate AI's ability to provide empathetic and appropriate responses in sensitive support scenarios, such as drug and alcohol or mental health issues.
• The AI model was trained using online treatment programs to generate responses similar to those received on the 'Breathing Space' social networking site.
• The ultimate goal is to explore how AI can augment human efforts to extend the reach and impact of mental health support, not to replace human interaction.

Researchers at the University of Newcastle and HMRI are investigating the potential of AI chatbots in providing mental health support. A blinded study is underway to compare the responses of an AI chatbot with those of experienced clinical psychologists and social workers to questions regarding mental health and drug and alcohol issues.
The study, led by Dr. Louise Thornton, Dr. Dara Sampson, and Dr. Jamin Day from the HMRI Healthy Minds Research Program, seeks to recruit 100 participants to rate the quality and appropriateness of responses from both AI and human practitioners. The core question is whether AI can provide empathetic and nuanced support in sensitive areas.

Evaluating AI's Empathy and Nuance

Dr. Thornton notes the increasing sophistication of AI in natural language communication, stating, "ChatGPT is getting really sophisticated at natural language communication. It can give well-written, grammatically correct answers to complex questions." However, she emphasizes the need to understand how well AI can navigate complex situations involving nuance and sarcasm, particularly in sensitive areas like mental health and substance use support.
"Drug and alcohol, as well as mental health support, is sensitive stuff. What we want to find out is if can AI provide empathetic and appropriate responses, like human mental health practitioners can?" Dr. Thornton explained.

Augmenting Human Support with AI

The research team has trained an AI model using online treatment programs for mental health and drug and alcohol concerns. This training allows the AI to generate responses to questions and comments commonly received on their 'Breathing Space' site, a moderated social networking platform.
Dr. Thornton clarified that the intention is not to replace human interaction but to enhance it: "We never want to replace humans but we do want to scale up. What we’re hoping to understand is how can AI be used to augment what we’re doing, in order to extend its impact."

Ensuring Safety and Reliability

Before any potential implementation, the researchers emphasize the importance of ensuring the AI's reliability and preventing it from generating inaccurate or inappropriate responses. "We wouldn’t dream of rolling this out until it’s sufficient. We want to be sure it won’t hallucinate or go off track. If we ever used it, there would be an acknowledgement that it’s an AI that’s responding."
The study also aims to determine if individuals' perceptions of AI's helpfulness change when they are aware they are interacting with a chatbot. The researchers recognize that providing facts is not enough; nuanced, personalized, and empathetic responses are crucial for building trust and rapport.

Study Details and Recruitment

The AI study is currently recruiting participants aged 18 and over. Participants are asked to rate messages generated by both the chatbot and human practitioners.


Reference News

[1] AI vs Humans: Are AI chatbots ready for real mental health conversations? - HMRI. hmri.org.au, Nov 11, 2024.


© Copyright 2025. All Rights Reserved by MedPath