AI, VR ‘Therapist’ Demonstrates Potential for Mental Health Support

The eXtended-Reality Artificially Intelligent Ally combines generative AI and virtual reality to provide patients with self-administered, conversational therapy.


By Shania Kennedy

Cedars-Sinai researchers have developed an artificial intelligence (AI)-driven virtual reality (VR) tool to provide mental health support for patients with mild to moderate anxiety or depression, according to a recent study published in npj Digital Medicine.

The eXtended-Reality Artificially Intelligent Ally (XAIA) is designed to leverage a trained digital avatar to guide users through an immersive conversational therapy session. The tool was created in response to the ongoing shortage of mental health professionals in the United States and persistent mental healthcare access challenges.

“The prevalence of mental health disorders is rising, yet there is a shortage of psychotherapists and a shortage of access for lower income, rural communities,” explained first and corresponding author Brennan Spiegel, MD, MSHS, a professor of Medicine and director of Health Services Research at Cedars-Sinai, in a news release. “While this technology is not intended to replace psychologists — but rather augment them — we created XAIA with access in mind, ensuring the technology can provide meaningful mental health support across communities.”

XAIA pairs a large language model (LLM) with a digital avatar designed to resemble a human therapist, and its VR component lets users hold sessions with the tool in relaxing environments, such as a beach retreat.

The tool’s LLM was trained on transcripts of mock therapy sessions conducted with a trained psychologist. These sessions were set up to demonstrate how a mental health professional can and should interact with patients.


Recurring exchanges within these sessions and over 70 additional psychotherapy best practices were then encoded into the LLM.
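The study describes the transcribed mock sessions and the 70-plus best practices only at a high level, so the snippet below is a minimal, hypothetical sketch of one common way such material can be encoded: folding the guidelines and an example exchange into a system prompt for a general-purpose chat-completion API. The model name, the BEST_PRACTICES list, the example transcript, and the respond() helper are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (not the study's actual implementation) of encoding
# psychotherapy best practices and example exchanges into an LLM via a
# system prompt, using the OpenAI chat completions API.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical subset of encoded best practices; the study reports 70+.
BEST_PRACTICES = [
    "Validate the user's feelings before offering guidance.",
    "Ask open-ended questions rather than giving direct advice.",
    "Never diagnose; encourage professional help for severe symptoms.",
]

# Hypothetical excerpt standing in for the transcribed mock sessions.
EXAMPLE_EXCHANGE = (
    "User: I feel like a failure.\n"
    "Therapist: There's no need to apologize for those feelings; they are "
    "valid. Can you tell me more about what's been weighing on you?"
)

SYSTEM_PROMPT = (
    "You are a supportive, empathetic conversational therapy assistant.\n"
    "Follow these best practices:\n- " + "\n- ".join(BEST_PRACTICES)
    + "\n\nModel your tone on this example exchange:\n" + EXAMPLE_EXCHANGE
)

def respond(user_message: str) -> str:
    """Return one empathetic reply to the user's message."""
    completion = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name, not the one used in the study
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return completion.choices[0].message.content

if __name__ == "__main__":
    print(respond("I look to other people to validate my self-worth."))
```

A prompt-based approach like this is distinct from fine-tuning the model's weights on session transcripts; the article's description of best practices being "encoded into the LLM" is compatible with either design.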

The research team then assessed XAIA’s ability to provide safe, effective mental health support by having 14 participants with mild to moderate anxiety or depression interact with the tool for 30 minutes.

“After teaching XAIA these skills, she offered therapy in a more empathetic, compassionate and meaningful way,” said co-author and Cedars-Sinai clinical psychologist Robert Chernoff, PhD. “She handled a range of situations very well and did not show evidence of safety concerns during our testing.”

Following their interactions with the tool, participants took part in debriefing interviews, which were transcribed and analyzed.

Participants discussed problems such as financial distress, loneliness, and family issues with XAIA, often responding to the tool as if it were a person.


One participant told the tool, “I feel like I’m a failure. The only thing I have to look forward to — I know you're going to laugh — is to go see Taylor Swift … and she's like the only thing that's keeping me alive. I'm sorry I'm crying.”

“Firstly, there’s no need to apologize for crying. These feelings are valid,” XAIA replied.

Another participant stated, “I think I look to other people to validate my self-worth and so rejection is especially painful. So, I am reluctant to put myself in situations where I could be rejected.”

“That's quite an insightful observation about yourself. It can be challenging when our self-worth is tied to the approval or validation of others. Fear of rejection can indeed make social interactions daunting. Could you share a little more about where you think this need for validation comes from?” the tool replied.

These interactions led participants to describe XAIA as approachable, friendly, empowering, unbiased, intelligent, and empathic.


As a result, all participants indicated that they would recommend XAIA to others, with some noting that the tool provides advantages over traditional talk therapy, such as its appeal for individuals who wish to remain anonymous or are averse to in-person therapy sessions.

Participants further described the tool as nonjudgmental and potentially beneficial for individuals who are homebound, lonely, or unable to access a mental health specialist.

Some participants also noted a heightened sense of relaxation and enhanced engagement due to the VR nature scenes, spatial audio, and calming visuals deployed during the interaction.

The researchers highlighted that, to their knowledge, their study is the first to use VR and generative AI successfully in the context of mental health therapy.

“These results provide initial evidence that VR and AI therapy has the potential to provide automated mental health support within immersive environments,” said Spiegel. “By harnessing the potential of technology in an evidence-based and safe manner, we can build a more accessible mental healthcare system.”

Tools like XAIA represent just one of the approaches behavioral and mental health stakeholders are exploring to address the US mental health crisis.

In a July interview with HealthITAnalytics, experts from Florida-based Futures Recovery Healthcare and Health Care Alliance North America (HCANA) detailed how AI and other data analytics tools can support the existing behavioral health workforce, helping to bridge care gaps and prevent the dips in care quality that result from mental healthcare provider shortages.