mHealth sensors may soon give a whole new meaning to wearing your heart on your sleeve.
New technology platforms are moving beyond simple biometric data collection to measure emotions, giving healthcare providers a window into a patient’s feelings and helping them address everything from behavioral disorders to medication adherence.
“We’ve gotten very good at quantifying the physical self,” Gaby Zijderveld, chief marketing officer for Affectiva, said during a panel session at this week’s Connected Health Symposium in Boston. It’s time, she said, to work on quantifying the emotional self.
Affectiva, an MIT spinoff that uses facial recognition software to catalogue and classify moods, is one of a handful of companies looking to break into this field, which has seen success in marketing and advertising and is just now finding its healthcare footing. The company focuses solely on software that measures a user’s facial reactions; Zijderveld says the technology could someday help doctors and patients alike better understand feelings and communicate with each other.
While the applications in healthcare are still new, they’re attracting attention. Therapists and mental health providers might see value in detecting emotions in patients who might not realize or want to convey what they’re feeling. Pediatricians might use the platform in their work with children with autism or ADHD. Doctors could use the technology to help patients dealing with stress, depression, substance abuse issues, even sleep management. And the pharmaceutical industry might use it to study how patients react to a certain medication or medication therapy.
“You can really use this in the medical domain – you really can,” says Mary Czerwinski, a research manager for Microsoft, who’s “used just about every sensor that you can possibly imagine” in her research.
That includes keyboards that can measure the stress in a user’s typing patterns; smart clothing and wearables that pick up on heart rhythms, perspiration and breathing; voice-detection software that measures stress in a user’s speech; and even clothing and glasses that detect stress caused by seasonal affective disorder and emit ambient light to combat the condition.
The technology, Czerwinski notes, isn’t just for recognizing moods, but for dealing with them.
“You can’t just tell (someone) they’re stressed without giving them some positive skills to cope with it,” she said.
And then there’s Sensoree, whose software-embedded clothing measures emotion and translates it into colors. Kristin Neidlinger, the company’s concept design lead, calls the therapeutic biomedia platform “externalized intimacy,” or “extimacy.”
Sensoree produces what Neidlinger calls the most publicized piece of wearable technology on the market these days, a “mood sweater” that changes color with the wearer’s emotions. Such clothing could help doctors who treat people with sensory processing disorder, as well as pediatricians and caregivers for Alzheimer’s patients.
Neidlinger also showed off a headpiece embedded with EEG sensors, called the NeurotiQ, which is being used in brain mapping tests. And she talked about working on inflatable clothing that measures skin sensations.
“Right now I’m working on goosebumps,” she said.
While still in the early stages, mood-detecting technology has the potential to help healthcare providers better understand their patients and develop effective care management plans. Alexandra Drane, co-founder of Engage With Grace and an emcee for Partners Healthcare’s two-day symposium, said such technology can actually pinpoint health crises before they happen, picking up subtle signals from a patient that indicate discomfort.
The primary goal of the platform in healthcare, the panel said, will be to improve collaboration between the patient and the doctor by improving communication. That might come in helping doctors better understand and empathize with what their patients are feeling, or it might help patients themselves in recognizing their moods and dealing with them.