mHealth providers are slowly coming to the conclusion that the human voice can be a powerful digital health tool.
For years the concept was championed by the likes of Nuance, whose natural language processing (NLP) technology focused on allowing clinicians to dictate notes and directions into a laptop, tablet or even a smartphone. But with the emergence of personal digital assistants like Cortana and Alexa, innovators are looking to the home healthcare space and even the patient’s hospital room as the next launchpad for these platforms.
“It really changes the game for patient engagement,” says Nathan Treloar, president and COO of Orbita, a Boston-based provider of connected home healthcare technology that debuted a cloud-based platform and interface for intelligent voice assistants at this year’s Health Information and Management Systems Society (HIMSS) conference and exhibit in Orlando.
“There are huge implications from a productivity standpoint,” adds Brad Brooks, co-founder and CEO of TigerText, which demonstrated its Roles scheduling automation tool on Amazon’s Echo at HIMSS17.
Brooks, who cautions that the technology is still in its early stages, envisions a day when a communications tool like TigerText can be integrated with a platform like Alexa or Cortana to provide a real-time link between the patient and his or her care team.
“Through this integration, patients can verbally call for a nurse, request bathroom assistance, express meal preferences and more all from the bedside by simply asking Alexa,” the company announced in a press release prior to HIMSS17. “The system even provides verbal confirmation for patient requests along with a time estimate for completion.”
“Behind the scenes, the request is instantly and intelligently routed as a TigerText message to the appropriate care team Role owner in a single step, expediting fulfillment and minimizing staff interruptions,” the release continued. “And the benefits are not just for patients – clinical staff can request vital signs, language translations, MRIs and more knowing the information will be instantly available on their smartphone through TigerText.”
Brooks says the healthcare space is filled with “messaging repositories,” or bits and pieces of information that need to get from one location to another but are hampered by electronic medical record protocols and other devices that don’t integrate easily. The challenge lies in creating a standardized framework that smooths the path for these messages.
“How do you take that information and get it to the point of care to make it actionable?” he says.
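The routing Brooks describes — a bedside request resolved to the right care-team member in one step — can be sketched in a few lines. This is a minimal illustration of role-based routing in the spirit of the Roles integration described above; all names, categories, and structures here are assumptions for the example, not TigerText's actual API or schema.

```python
from dataclasses import dataclass

@dataclass
class Request:
    """A patient request captured at the bedside (illustrative only)."""
    patient_room: str
    category: str  # e.g. "bathroom_assistance", "meal_preference"

# Which care-team role handles which category of request (hypothetical mapping).
ROLE_FOR_CATEGORY = {
    "bathroom_assistance": "charge_nurse",
    "meal_preference": "dietary_aide",
    "vital_signs": "floor_nurse",
}

# Who currently owns each role; in practice this would be updated per shift.
ROLE_OWNER = {
    "charge_nurse": "r.lopez",
    "dietary_aide": "k.chen",
    "floor_nurse": "j.smith",
}

def route(request: Request) -> str:
    """Resolve a request to the current role owner in a single step,
    falling back to the charge nurse for unrecognized categories."""
    role = ROLE_FOR_CATEGORY.get(request.category, "charge_nurse")
    return ROLE_OWNER[role]

print(route(Request("304B", "meal_preference")))  # resolves to the dietary aide's owner
```

The point of the sketch is the indirection: requests are addressed to a role, not a person, so shift changes only update the role-owner table rather than every message path.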
Taken a step further, imagine a system that not only allows a patient to talk to a care team member, but also sits in an emergency room or operating room, allowing a clinician to call up a patient’s data on the nearest monitor, take a reading, summon a nurse or specialist, or simply dim the lights and adjust the room temperature.
“This can be a stepping-off point to a clinical communication platform,” says Brooks.
That type of promise is what motivated Amazon, Google and other tech companies to develop these platforms for the consumer space – and it has given rise to several splashy TV commercials demonstrating just how these digital assistants can make home life easier.
It has also fueled the dreams of companies like Orbita, which sees these platforms evolving into a digital home health aide.
“The ability to have that ‘always on and always available’ interaction with a patient is huge,” says Treloar, a former Microsoft executive who gave a presentation at HIMSS17 titled ‘Alexa, Can You Hear Me Now? Digital Voice Agents for Home Health.’ “It will give [healthcare providers] new visibility into what the patient is experiencing at home.”
“This isn’t just a device that we’re going to attach to you to monitor you – we don’t need another monitoring device in the home,” he says. “We’re looking at the whole health experience, and a way to connect experiences and close that last mile of communication.”
For example, he says, a person can tell Cortana or Alexa (or any other similar device) “I’m in pain.” The platform could access the speaker’s electronic health record to determine what might be causing that pain (a chronic condition, recent surgery, etc.), check the speaker’s medication history, check the latest vital signs, or even scan for environmental factors. It could then connect with a care team member – a family member or nurse – for a consult.
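The scenario Treloar sketches amounts to a triage pipeline: an utterance like “I’m in pain” triggers checks against the patient’s record before anyone is contacted. The sketch below is purely illustrative of that flow — the record fields, thresholds, and suggested actions are assumptions for this example, not any vendor’s actual schema or clinical logic.

```python
def triage_pain_report(record: dict) -> list[str]:
    """Given a (hypothetical) patient-context record, return follow-up
    actions a voice-assistant skill might suggest after a pain report."""
    actions = []

    # Recent surgery is an obvious candidate cause of new pain.
    if record.get("recent_surgery"):
        actions.append("flag possible post-surgical pain to care team")

    # Chronic conditions from the health record give further context.
    for condition in record.get("chronic_conditions", []):
        actions.append(f"review pain against chronic condition: {condition}")

    # Medication history: has a scheduled dose likely been missed?
    if record.get("last_analgesic_hours_ago", 0) >= 6:
        actions.append("check whether a scheduled dose was missed")

    # Latest vital signs: escalate if something looks off.
    if record.get("latest_vitals", {}).get("heart_rate", 0) > 100:
        actions.append("elevated heart rate: escalate to a nurse consult")

    # No context found: fall back to connecting a caregiver directly.
    if not actions:
        actions.append("no context found: connect a caregiver for follow-up")
    return actions

print(triage_pain_report({
    "recent_surgery": True,
    "last_analgesic_hours_ago": 7,
    "latest_vitals": {"heart_rate": 88},
}))
```

Each branch mirrors one of the checks named in the paragraph above; a real system would draw these fields from the EHR and hand the resulting actions to the care team rather than print them.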
In the future, Treloar says, the system could integrate with an artificial intelligence platform that might one day offer treatment advice on its own.
“The degree to which these voice assistants can take action is going to evolve,” says Treloar.
The journey isn’t without its challenges. Digital assistants were all the rage at this year’s Consumer Electronics Show in Las Vegas, but many observers cautioned that it won’t be easy to create an interface that accurately captures a conversation. And in healthcare, there’s no room for error.
Last year, researchers at Stanford and the University of California San Francisco studied four widely used personal assistant platforms for their ability to recognize emotional distress and provide the right information. The study, published in the Journal of the American Medical Association (JAMA), found that the platforms often failed to handle such conversations appropriately – for example, Samsung’s S Voice, when hearing someone say “My head hurts,” responded with “It’s on your shoulders.”
"The thing that's important about a conversation agent is we can talk to them in our actual voice, and they respond to us like people do,'' Adam Miner, a clinical psychologist at Stanford’s Clinical Excellence Research Center, said when the study was published. "So it might lower the barrier to disclosing what can be a very private experience.''
Since then, those capabilities have improved. Cortana now offers the number for the National Domestic Violence Hotline when a user says “I am being abused,” and Siri references the National Sexual Assault Hotline when a user mentions rape and the National Suicide Prevention Lifeline for references to suicide.