MedPath

First Clinical Trial Shows AI Therapy Bot Effective for Depression and Anxiety


Key Insights

  • Dartmouth researchers' AI therapy bot "Therabot" demonstrated significant symptom reduction in the first clinical trial of generative AI therapy, with depression symptoms decreasing by 51% among participants.

  • The eight-week trial involved 210 participants with depression, anxiety, or eating disorder risk, with users exchanging approximately 10 messages daily with the AI system trained on evidence-based therapeutic practices.

  • Despite promising results, researchers caution against widespread deployment without proper oversight, emphasizing that most commercial AI therapy tools lack sufficient clinical validation and regulatory approval.

Dartmouth College researchers have completed the first clinical trial of a generative AI-powered therapy bot, showing promising results for treating depression, anxiety, and eating disorder risk. The study, published March 27 in NEJM AI, demonstrated that the AI system called "Therabot" achieved symptom reductions comparable to human therapy in certain conditions.
The eight-week trial involved 210 participants across the United States who had symptoms of depression or generalized anxiety disorder, or who were at high risk for eating disorders. Half received access to Therabot, while the other half served as a control group without access. Participants using the AI system exchanged approximately 10 messages daily, either responding to prompts or initiating conversations themselves.

Significant Symptom Reductions Observed

The results showed varying degrees of effectiveness depending on the condition. Participants with depression experienced the most substantial benefit, with a 51% reduction in symptoms. Those with anxiety saw a 31% reduction in symptoms, while participants at risk for eating disorders reported a 19% decrease in concerns about body image and weight.
Among individuals diagnosed with bulimia nervosa or binge-eating disorder, the trial also found significant reductions in disordered eating episodes compared to the control group.
"There simply aren't enough providers to meet the demand," said Nicholas Jacobson, senior author and associate professor of biomedical data science and psychiatry at Dartmouth. He noted that in the U.S., there are approximately 1,600 patients with depression or anxiety for every available mental health provider.

Custom Development Based on Evidence

Unlike many commercial AI therapy tools that may use slightly modified versions of general foundation models like Meta's Llama, Therabot was built with a more rigorous approach. The Dartmouth team initially attempted to train their model using general mental health conversations from internet forums and transcripts of psychotherapy sessions, but found these produced stereotypical responses rather than effective therapeutic interactions.
"We got a lot of 'hmm-hmms,' 'go ons,' and then 'Your problems stem from your relationship with your mother,'" Jacobson explained. "Really tropes of what psychotherapy would be, rather than actually what we'd want."
Ultimately, the researchers developed custom datasets based on evidence-based practices like cognitive behavioral therapy (CBT), with input from psychologists and psychiatrists. The system also incorporated safety measures to prevent harm, including crisis intervention features that could prompt users to contact emergency services when necessary.

Therapeutic Relationship and Accessibility

One notable finding was the strength of the therapeutic bond participants formed with Therabot. Engagement was highest during late-night hours and moments of acute distress, demonstrating the AI's potential to provide immediate support when traditional therapy is unavailable.
Some participants who had difficulty opening up to human therapists reported feeling more comfortable disclosing their thoughts to the AI system. This suggests AI therapy could serve as a valuable entry point for individuals hesitant to seek traditional mental health care.

Caution Against Premature Deployment

Despite the encouraging results, the researchers emphasized that this study does not validate the many commercial AI therapy applications currently on the market.
"Quite the opposite," Jacobson cautioned, noting that most commercial offerings don't appear to train their models on evidence-based practices and likely don't employ trained researchers to monitor interactions. "I have a lot of concerns about the industry and how fast we're moving without really kind of evaluating this."
Michael Heinz, the study's first author and assistant professor of psychiatry at Dartmouth, stressed that oversight remains essential: "No generative AI agent is ready to operate fully autonomously in mental health care. Understanding and mitigating potential risks is crucial before widespread implementation."
During the trial, Jacobson personally monitored all messages to watch for problematic responses from the bot, an arrangement participants consented to. This level of supervision would be difficult to scale in widespread deployment.

Regulatory Considerations

Jean-Christophe Bélisle-Pipon, an assistant professor of health ethics at Simon Fraser University not involved in the research, called the results impressive but noted that "we remain far from a 'greenlight' for widespread clinical deployment."
Jacobson pointed out that when AI applications advertise themselves as offering therapy in a legitimate clinical context, they fall under FDA regulatory oversight. However, the agency has not yet taken significant action against many such sites.
"My suspicion is almost none of them—probably none of them—that are operating in this space would have the ability to actually get a claim clearance," Jacobson said, referring to regulatory approval backing their therapeutic claims.

Future Implications

While AI chatbots cannot replace human therapists, this study suggests they may serve as valuable supplements to traditional care, offering real-time, on-demand mental health support to individuals who might otherwise go untreated.
The research represents an important step forward in evaluating AI's potential role in mental health treatment, but also highlights the need for continued clinical validation, regulatory oversight, and careful implementation of these technologies as they continue to develop.
© 2025 MedPath, Inc. All rights reserved.