The words are nice the way they sound

Affective computing and sentiment analysis can help clinicians read between the lines, allowing them to detect patients’ unexpressed feelings and subtle emotional cues that may signal subclinical disease—and much more.

By Paul Cerrato, MA, senior research analyst and communications specialist, Mayo Clinic Platform, and John Halamka, M.D., president, Mayo Clinic Platform

The lyrics of a James Taylor song were the last place we expected to find insights on the benefits of affective computing and sentiment analysis, technologies that have the potential to reinvent the way we address patients’ behavioral health issues. The words to Something in the Way She Moves speak for themselves:

"It isn't what she's got to say
But how she thinks and where she's been
To me, the words are nice the way they sound
I like to hear them best that way
It doesn't much matter what they mean
She says them mostly just to calm me down"

Why are her words nice the way they sound, regardless of their literal meaning? Because Taylor senses the feelings behind the words: positive emotions directed at him. Such subtle cues can also help clinicians identify negative emotions that increase patients’ risk of a variety of disorders. In fact, with the assistance of natural language processing and affective computing (sometimes called emotion AI), researchers are already learning to “read between the lines.”

Elad Maor, M.D., Ph.D., with the Mayo Clinic Department of Cardiovascular Medicine, and his colleagues analyzed voice samples from about 100 patients who underwent coronary angiograms, asking them to read text excerpts into their smartphones and respond to questions about positive and negative emotional experiences. Analysis of the recorded responses revealed subtle differences in vocal pitch and intensity between patients who were ultimately diagnosed with heart disease and normal controls. Dr. Maor and his associates concluded: “One possible explanation for our interesting finding is the documented association between mental stress, the adrenergic system, and voice.” They explain further: “Emotional stress conditions change the human voice, including an increase in fundamental frequency. . . . [O]ne possible hypothesis to interpret our findings is that the association between voice and atherosclerosis is mediated by hypersensitivity of the adrenergic system to stress. The association between stress, the adrenergic system, and atherosclerosis is well established on the basis of robust data.”
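To make the idea of a vocal biomarker concrete, here is a minimal sketch of how a system might estimate a speaker’s fundamental frequency, the pitch feature the researchers mention, from an audio sample using autocorrelation. This is an illustration only, not the method used in the Mayo study; the synthetic tone stands in for a real voice recording, and production systems use far more robust pitch trackers.

```python
import numpy as np

def estimate_f0(signal, sample_rate, fmin=60.0, fmax=400.0):
    """Estimate fundamental frequency (Hz) via autocorrelation.

    A simplified illustration of one pitch feature used in
    voice-analysis research; fmin/fmax bound the plausible
    range of human vocal pitch.
    """
    signal = signal - np.mean(signal)          # remove DC offset
    corr = np.correlate(signal, signal, mode="full")
    corr = corr[len(corr) // 2:]               # keep non-negative lags
    lag_min = int(sample_rate / fmax)          # shortest plausible period
    lag_max = int(sample_rate / fmin)          # longest plausible period
    best_lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sample_rate / best_lag

# Synthetic 220 Hz tone as a stand-in for a voice sample
sr = 8000
t = np.arange(0, 0.5, 1 / sr)
tone = np.sin(2 * np.pi * 220 * t)
print(estimate_f0(tone, sr))  # estimate close to 220 Hz
```

In a real study, features like this would be computed over short frames of speech and combined with intensity and other acoustic measures before any statistical comparison between patient groups.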

Dr. Maor is one of several pioneers in this work, which is also being explored by specialists from MIT. They describe emotion AI as a subset of AI that “measures, understands, simulates, and reacts to human emotions.” Also called artificial emotional intelligence, its responsible use has the potential to improve clinician/patient interactions and help spot mental health crises before they escalate out of control. The U.S. Department of Defense and the Department of Veterans Affairs, for example, have launched a study to evaluate the ability of an emotion AI program called CompanionMx to detect suicide risk among active duty naval personnel, using a smartphone app that combines voice analytics with passive monitoring of other smartphone metadata.

While affective computing focuses on the emotional content of people’s words, sentiment analysis more generally refers to opinion mining. The Google/Oxford Dictionary defines it as “the process of computationally identifying and categorizing opinions expressed in a piece of text, especially in order to determine whether the writer's attitude towards a particular topic, product, etc. is positive, negative, or neutral.” These analyses use a variety of digital tools to glean insights into public opinion and health problems from tweets, photos, videos, and a variety of other sources. One Australian study, for example, that looked at people’s tweets found that asthma outbreaks after a thunderstorm could be predicted up to nine hours before they appeared in official reports.
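At its simplest, the positive/negative/neutral categorization described in that definition can be done by counting sentiment-bearing words against a lexicon. The sketch below is a toy example of that lexicon-based approach; the word lists are illustrative stand-ins, not a validated or clinical-grade sentiment lexicon, and real systems add negation handling, weighting, and machine-learned models.

```python
# Toy lexicon-based sentiment scorer. The word sets are
# illustrative assumptions, not a real sentiment lexicon.
POSITIVE = {"nice", "calm", "good", "happy", "improved"}
NEGATIVE = {"bad", "stress", "pain", "sad", "worse"}

def sentiment(text):
    """Label text positive, negative, or neutral by word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("the words are nice the way they sound"))  # positive
print(sentiment("constant stress and pain"))               # negative
```

Even this crude approach conveys the core idea: the opinion signal lives in word choice, which is why large text streams like tweets can be mined for early public-health signals.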

As we point out above, it’s critically important to use affective computing and sentiment analysis responsibly. It's not hard to imagine these tools being misused to manipulate others. Equally important is the potential for many of these tools to put marginalized members of the community at a disadvantage. There’s already evidence that facial recognition systems are less accurate on non-white, non-male faces. A report in Scientific American called attention to one study in which emotion-recognition software ascribed more negative emotions to Black professional basketball players than to White players—one of many studies showing that AI-related data sets can be seriously flawed and biased.

We are living in stressful times. Reports of domestic violence, drug overdose deaths, and psychiatric illness have increased among both clinicians and patients. Affective computing and sentiment analysis may be developed to serve as additional tools to help us all make it through this difficult journey.
