We communicate in many ways, not just with our words, yet nonverbal signals are often missed in conversation. Now, social scientists have found a way to train machines to spot the linguistic tics that reveal psychological distress.
John Pestian, a psychiatrist and expert in biomedical informatics at Cincinnati Children’s Hospital Medical Center, developed an app that helps mental health experts detect depression and suicidal thoughts in teen speech. The technology relies on an understanding of distress patterns that Pestian’s team has gleaned through research on speech and psychology, looking for what he calls “thought markers.”
Thought markers indicate a person’s state of mind as expressed vocally and acoustically. For example, “vowel space” refers to how distinctly words are pronounced and articulated, which renders speech more or less garbled. Reduced vowel space, a thought marker signaling depression, makes speech sound less intelligible and less clearly articulated.
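In phonetics, vowel space is commonly quantified as the area of the polygon formed by the first two formant frequencies (F1, F2) of the corner vowels; a smaller area means more centralized, less distinct vowels. The sketch below is a toy illustration of that standard metric, not Pestian’s actual pipeline, and the formant values are made up for demonstration:

```python
def vowel_space_area(formants):
    """Area (in Hz^2) of the polygon traced by (F1, F2) points,
    computed with the shoelace formula. `formants` lists the
    corner-vowel formants in polygon order."""
    area = 0.0
    n = len(formants)
    for i in range(n):
        f1a, f2a = formants[i]
        f1b, f2b = formants[(i + 1) % n]
        area += f1a * f2b - f1b * f2a
    return abs(area) / 2.0

# Illustrative (not clinical) formant values for /i/, /a/, /u/:
typical = [(300, 2300), (750, 1200), (350, 800)]
# Centralized vowels, i.e. "reduced vowel space":
reduced = [(400, 1900), (650, 1250), (420, 1000)]

print(vowel_space_area(typical) > vowel_space_area(reduced))  # True
```

A classifier would compare such areas (among many other features) against patterns learned from recordings of people with known diagnoses.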
Because depression is known to influence motor control function and especially speech production, Pestian investigated whether a machine could spot distinct mental illnesses based on vowel space frequency patterns. He recorded over 300 individuals and had the technology classify them against known patterns; it found significantly reduced clarity in the speech of people with depression, PTSD, and suicidal tendencies.
Pestian’s app, called SAM, or Spreading Activation Mobile, scans for these nonverbal cues of depression, helping adults detect signs of serious psychological distress in teenagers. And data suggest that American teens need the help. Though teenage suicide rates have been declining in the US and worldwide, according to the most recent OECD data, the incidence of teen depression in the US has not. A recent study in the journal Pediatrics found a dramatic rise in depression among US teens, climbing by 31% between 2005 and 2014, based on data from the National Surveys on Drug Use and Health.
Pestian hopes to use computational analysis to prevent self-inflicted deaths in youths by detecting signs of psychological distress that go unnoticed in human exchanges. SAM’s analytic abilities were tested on the vocal characteristics of 379 subjects. The technology correctly classified subjects into one of three groups (suicidal, mentally ill but not suicidal, or controls) with up to 85% accuracy.
In a clinical setting, the app will work by recording a teen’s conversation during counseling sessions, then scanning for thought markers signaling suicide risk. It searches for the very things not commonly picked up on or understood in conversation, like vocal intensity, speech rates, and voice fluctuations.
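The article does not describe SAM’s internals, but features like those it names can be sketched from a raw waveform with simple signal processing. The toy example below, a minimal sketch and not Pestian’s method, computes frame-level RMS intensity (vocal intensity), its variation (a crude proxy for voice fluctuation), and the fraction of active frames (a crude proxy for speech rate) from a synthetic signal:

```python
import numpy as np

def frame_features(signal, sr, frame_ms=25):
    """Split a waveform into fixed-length frames and summarize
    per-frame RMS energy as intensity-style features."""
    frame_len = int(sr * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[:n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))    # intensity per frame
    voiced = rms > 0.1 * rms.max()               # crude activity gate
    return {
        "mean_intensity": float(rms.mean()),
        "intensity_variation": float(rms.std()),  # fluctuation proxy
        "voiced_fraction": float(voiced.mean()),  # speech-rate proxy
    }

# Synthetic 1-second "speech burst": a 150 Hz tone, then silence
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
signal = np.sin(2 * np.pi * 150 * t) * (t < 0.5)

feats = frame_features(signal, sr)
print(feats["voiced_fraction"])  # 0.5: sound only in the first half
```

A real system would use far richer features (pitch, formants, pause timing) and a trained classifier, but the principle, turning audio into numbers a model can compare against known distress patterns, is the same.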
The platform also detects the more subtle differences between indications of ordinary angst and genuine psychological distress. Pestian is now working on extending the algorithmic analysis to visual cues, collecting video data on gaze; a tendency to avert eye contact can indicate psychological distress, he says.
But machines can only go so far in addressing the problem, the psychiatrist notes, even if they pick up on subtle clues humans miss. “The technology is not going to stop the suicide, the technology can only say: We have an issue over here. Then we have to intervene and get a path to get to care.”