
Using VR to Create an Immersive Mental Health Support System

Cedars-Sinai investigators have created a virtual reality app driven by generative AI and spatial computing that aims to enhance the talk therapy experience.


Virtual reality (VR) technologies hold exciting promise for healthcare delivery, particularly in the mental health arena. With advances in generative artificial intelligence (AI), natural language processing, and computer vision, developers can create tools that put effective mental health therapies and treatments at people's fingertips.

One such tool is Xaia, or eXtended-Reality Artificially Intelligent Ally, developed by a team of Cedars-Sinai clinicians and AI experts. The VR application, which is currently available via the Apple Vision Pro headset, leverages generative AI and spatial computing to provide conversational therapy in relaxing environments.

The development of the Xaia app began as an internally funded research project in Cedars-Sinai's academic research laboratory. Though using VR to provide mental health support is not new, recent progress in generative AI has opened up a new world of possibilities.

“We have been using virtual reality for many years for cognitive behavioral therapy to help people manage their pain, anxiety, and depression,” said Brennan Spiegel, MD, professor of medicine and director of health services research at Cedars-Sinai Health System, in an interview with mHealthIntelligence. “And we've published many papers on that. But until recently, we didn't have generative AI, and we thought maybe we could help amplify the benefits of talk therapy by presenting it within a virtual reality or spatially computed environment.”

Spiegel co-created the Xaia technology with Omer Liran, MD, a psychiatrist at Cedars-Sinai. The two began developing the technology last April.

The Xaia app provides conversational therapy through a robot avatar. Once in the VR environment, the user can see the robot and converse with it as they would with a human.

“We spent a lot of time to create a robot that looks and feels very, very real,” said Spiegel. “She's not meant to be a human, but she'll look at you; her eyes will focus on you. They move around; they look away. She has over 150 different movement points on her face so that the AI can pick the right expression for the right moment.”

The technology leverages speech-to-text content analysis to select an appropriate form of therapy, a large language model (LLM) to craft responses, an "appropriateness classifier" to screen out dangerous, inappropriate, or unhelpful responses, and generated reality to create an audio-visual environment that can help augment the therapeutic experience.

For instance, if the user is talking about gender identity with Xaia, the technology might deploy rainbow-colored butterflies or bring up artwork that supports a sense of diversity, Spiegel explained. Or if a person is anxious, the technology may choose a forest environment, and the robot’s chest will start to glow and beat slowly to try to calm the user down as they are talking. These generated reality environments may include music created for the app by a video game music composer and designed to support mental health and well-being.
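
To make that pipeline concrete, the sketch below shows, in plain Python, one way a single conversational turn could flow from speech transcription to response drafting, appropriateness screening, and environment selection. Every function, class, and keyword rule here is a hypothetical stand-in invented for illustration; it is not the Xaia implementation, which relies on production speech-to-text, a full LLM, and a trained classifier rather than simple keyword lookups.

# Illustrative sketch only: all names and rules below are hypothetical
# and are not drawn from the Xaia codebase.
from dataclasses import dataclass

@dataclass
class TurnResult:
    reply: str
    environment: str

# Toy keyword cues standing in for the app's model-driven content analysis.
ENVIRONMENT_CUES = {
    "anxious": "forest scene with a slow, glowing breathing cue",
    "identity": "scene with rainbow-colored butterflies and affirming artwork",
}

def transcribe(audio_bytes: bytes) -> str:
    """Placeholder for a production speech-to-text service."""
    return audio_bytes.decode("utf-8", errors="ignore")

def draft_reply(transcript: str) -> str:
    """Placeholder for the large language model that crafts the response."""
    return "It sounds like a lot is on your mind. Can you tell me more about that?"

def is_appropriate(reply: str) -> bool:
    """Placeholder for the 'appropriateness classifier' described in the article."""
    banned_terms = ("diagnose", "prescribe")
    return not any(term in reply.lower() for term in banned_terms)

def choose_environment(transcript: str) -> str:
    """Pick a generated-reality environment from simple keyword cues."""
    lowered = transcript.lower()
    for cue, environment in ENVIRONMENT_CUES.items():
        if cue in lowered:
            return environment
    return "calm default scene with ambient music"

def run_turn(audio_bytes: bytes) -> TurnResult:
    """One conversational turn: transcribe, respond, screen, and set the scene."""
    transcript = transcribe(audio_bytes)
    reply = draft_reply(transcript)
    if not is_appropriate(reply):
        reply = "I want to be careful here. Can you tell me more about how you're feeling?"
    return TurnResult(reply=reply, environment=choose_environment(transcript))

if __name__ == "__main__":
    result = run_turn(b"I have been feeling anxious about work lately")
    print(result.environment)
    print(result.reply)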

The app currently includes 200 experiences, and the research team plans to keep adding more. All conversations are processed through a HIPAA-compliant server.

During app development, the research team worked closely with a psychotherapist and a psychiatrist to refine the system prompts and ensure AI responses are compassionate and non-judgmental, according to a study the team published in npj Digital Medicine. They also enlisted licensed mental health professionals to role-play as patients in different clinical scenarios and used their feedback to further improve the AI tool.

To examine the feasibility of app use among patients, the study enrolled 20 adults who each participated in a single therapy session at Cedars-Sinai. The study showed that the digital avatar applied essential psychotherapeutic techniques, making observations that reflected an understanding of the user's issues, normalizing feelings, expressing sympathy, and showing empathy.

“People said they felt like they were not judged by this robot,” Spiegel said. “They knew they were talking to a robot. They said the robot didn't care what color my skin is. She's so patient; she's always willing to listen. It doesn't matter what I tell her.”

Though the app's application of psychotherapy and introduction of cognitive behavioral therapy (CBT) techniques were sometimes suboptimal, the study showed that participants generally found Xaia "acceptable, helpful, and safe."

The app is intended as a supplement to traditional talk therapy rather than a replacement for it.

“For example, at three in the morning, you're having some rumination or anxiety,” Spiegel said. “You can't just call up your therapist typically, but here you have an opportunity to get some care [with Xaia].”

The app has also been trained to manage high-risk scenarios. If a user mentions suicidal ideation, for example, the technology will provide the national suicide hotline number. And while the technology cannot call the police in a domestic violence situation, Xaia will offer guidance.

“We practiced once with a domestic violence situation, and Xaia said, ‘I'm concerned for you, and I want you to know that if you're going on the internet and checking out for resources, be aware that your partner could monitor you. So please consider going into incognito mode,’” Spiegel said.

The AI can also discern when someone is trying to engage inappropriately. According to Spiegel, if a user tries to fool Xaia by saying something inappropriate, Xaia will not engage with the content and will instead respond empathetically.
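
A minimal sketch of how such high-risk routing could be organized is shown below; the categories, keyword triggers, and canned responses are illustrative assumptions rather than Xaia's actual classifier or scripts (988 is the US Suicide & Crisis Lifeline number).

# Illustrative sketch only: hypothetical simplification of the safety behavior
# described in the article, not Xaia's actual rules or classifier.
RISK_RESPONSES = {
    "self_harm": (
        "You deserve support right now. You can reach the national suicide "
        "hotline by calling or texting 988."
    ),
    "domestic_violence": (
        "I'm concerned for you. If you look up resources online, consider "
        "using incognito mode so your browsing cannot be monitored."
    ),
    "inappropriate": (
        "I'm not able to engage with that, but I'm here to listen if "
        "something is troubling you."
    ),
}

def classify_risk(message: str) -> str | None:
    """Toy keyword matcher standing in for the app's trained safety checks."""
    lowered = message.lower()
    if any(term in lowered for term in ("suicide", "end my life", "kill myself")):
        return "self_harm"
    if any(term in lowered for term in ("partner hits me", "afraid of my partner")):
        return "domestic_violence"
    if any(term in lowered for term in ("sext", "explicit")):
        return "inappropriate"
    return None

def safety_reply(message: str) -> str | None:
    """Return a canned safety response when a high-risk category is detected."""
    category = classify_risk(message)
    return RISK_RESPONSES.get(category) if category else None

if __name__ == "__main__":
    print(safety_reply("I'm afraid of my partner and don't know where to turn"))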

The research team will also soon submit a new paper examining evidence of implicit bias within Xaia. The potential for bias in AI systems is a widespread concern as the technology becomes more widely used. Spiegel noted that Xaia did not appear to change its tone or advice depending on the user's age, race, ethnicity, or sex.

“Now, anyone working in AI should always be careful to never promise that there can never be a problem,” he added. “There can always be a Black Swan event, so to speak. And the best that humanity can be right now, in general, is to be as careful and meticulous as possible. So that's been our approach.”  

The app is a direct-to-consumer wellness product offered by VRx Health, a company spun out of Cedars-Sinai in 2023. Liran and Spiegel, who are founding members of VRx Health, also enlisted Cedars-Sinai’s Technology Ventures arm to support the development of Xaia.

In the future, the Cedars-Sinai investigators are interested in using Xaia within the health system, beginning with testing whether it can write clinical notes in patient charts. This could help support psychology consultations, as Xaia could theoretically begin talking with a patient and drafting notes for the physician, giving the clinician a head start.

Though numerous mental health applications are on the market today, Spiegel emphasized that Xaia offers more than just good advice.

“The visuals and the audio and the generated reality environment and the music all come together to create a new form of therapy, [allowing you] to sort of see therapy differently — literally see it differently,” he said. “And that's what we are most excited about here. And it's just the beginning. We hope others can do similar kinds of work.”
