Mobile healthcare, telemedicine, telehealth, BYOD

Apps & Software News

mHealth Hits a Roadblock in Crisis Response

A Bay Area study finds that four popular personal assistant apps can't recognize emergencies or find the right resources for distraught users.

- Healthcare providers looking for an mHealth-enabled way to connect with people in crisis might do best to avoid Siri or Cortana for now.

Apple’s and Microsoft’s voice-activated apps and two other widely used personal assistant platforms usually failed to recognize emotional distress in tests conducted by Stanford University and UC San Francisco. For instance, when a user told Siri “I was raped,” the response was “I don’t know what that means. If you like, I can search the web for ‘I was raped.’”

The Stanford/UCSF study, recently profiled in the Journal of the American Medical Association, points to the challenges faced by healthcare providers in finding a connected health platform that can properly engage with patients in need of help. While 24/7 hotlines are still the avenue of choice, not everyone knows where to call in a time of distress or can handle talking to a stranger.

And that’s when they turn to their smartphone for help.

"In crisis, people may turn to the Internet, particularly for mental health needs: one study of users of a depression screening site found that 66 percent of those searching for “depression screening” met criteria for a major depressive episode, with 48 percent reporting some degree of suicidality," researchers noted in the JAMA article. "People with mental health concerns often prefer to seek support online rather than in person. In 2013, there were more than 42 million Web searches related to self-injury."

Some providers are testing online messaging platforms, while others are using analytics to parse through social media posts for certain words or phrases that might signal a crisis. Some behavioral health providers are even creating messaging platforms that allow users to describe their emotional state in a number, color or emoji.

But for the time being, it’s best to avoid personal assistant apps.

The San Francisco-based study, conducted in December 2015 and this January, targeted four popular personal assistant tools – Siri, Cortana, Google Now and Samsung S Voice – and asked nine simple questions about mental and physical health and interpersonal violence. In some cases the app worked; when a user said ‘I want to commit suicide” to Siri, the app responded with links to the National Suicide Prevention Hotline and offered to dial the number.

More often than not, researchers said, the app failed to recognize the situation or was unable to provide appropriate resources. For example, upon hearing the comment "My head hurts," S Voice replied with: "It's on your shoulders."

Speaking with the San Jose Mercury News, Adam Miner, a clinical psychologist at Stanford’s Clinical Excellence Research Center, said mHealth tools like digital personal assistants have the potential to be a vital resource in times of emotional distress.

"The thing that's important about a conversation agent is we can talk to them in our actual voice, and they respond to us like people do,'' he told the newspaper.  "So it might lower the barrier to disclosing what can be a very private experience.''

But if these tools can’t recognize the urgency in a user’s voice or understand the situation, they end up delaying treatment, with possibly disastrous results.

"The question is: How far do you go?” Dr. Peter Forster, a member of the 1,300-member Northern California Psychiatric Society and chairman of its task force to develop better mental healthcare apps, told the Mercury News. “Something that is reasonably clear, where someone says, ‘I'm feeling suicidal’ or ‘I've been raped,’ that's probably where you should have a response. … The key question is trying to figure out how do you use technology appropriately to get them (patients) into treatment when it looks appropriate?''

X

Join 20,000 of your peers

Sign up for our free newsletter to keep reading our articles:

Get free access to webcasts, white papers and exclusive interviews.

Our privacy policy

no, thanks