
New Tech Can Predict Depression in Conversation

By Amie Sparrow, updated on Aug 31, 2018


Researchers at the Massachusetts Institute of Technology have developed a model that can detect words and intonations related to depression

The machine-learning model can analyse text and audio data from interviews to look for speech patterns associated with depression. Previous models could only predict depression based on specific answers to specific questions; the new model needs no set questions and answers, and could instead potentially be used to detect signs of depression in natural conversation.

The technology could one day lead to a mobile app that monitors a person’s text and voice for signs of mental distress and potentially sends alerts if they are in crisis. Researchers think the tool could be especially useful for people who can’t get an initial diagnosis because of factors such as cost, distance or a lack of awareness that something may be wrong.

“The first hints we have that a person is happy, excited, sad, or has some serious cognitive condition, such as depression, is through their speech,” says the study’s first author, Tuka Alhanai, a researcher in MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL).

“If you want to deploy [depression-detection] models in [a] scalable way … you want to minimize the amount of constraints you have on the data you’re using. You want to deploy it in any regular conversation and have the model pick up, from the natural interaction, the state of the individual.”

Researchers fed the model text and audio data from people with and without depression, one sample at a time, and it extracted the speech patterns of each group. Words such as “sad”, “low” and “down” were paired with audio signals that were flatter and more monotone, and some of the people with depression also spoke more slowly and left longer pauses between words.
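To make the audio side of this concrete, here is a minimal Python sketch of the kind of cues the article describes: flatter, more monotone speech and longer pauses. It is an illustration only, not the researchers’ actual feature pipeline; the frame-level pitch track (`pitch_hz`, with zeros for unvoiced frames) and the 10-millisecond frame length are assumptions made for the example.

```python
import numpy as np

def simple_speech_cues(pitch_hz: np.ndarray, frame_s: float = 0.01) -> dict:
    """Illustrative only: two rough cues mentioned in the article.

    pitch_hz: frame-level pitch track, with 0 for frames where no voice
    was detected. frame_s: assumed duration of each frame in seconds.
    """
    voiced = pitch_hz[pitch_hz > 0]
    # Low pitch variability reads as "flatter, more monotone" speech
    pitch_variability = float(np.std(voiced)) if voiced.size else 0.0

    # Treat runs of unvoiced frames as pauses and measure their lengths
    pause_lengths, run = [], 0
    for unvoiced in (pitch_hz == 0):
        if unvoiced:
            run += 1
        elif run:
            pause_lengths.append(run * frame_s)
            run = 0
    if run:
        pause_lengths.append(run * frame_s)
    mean_pause_s = float(np.mean(pause_lengths)) if pause_lengths else 0.0

    return {"pitch_variability_hz": pitch_variability,
            "mean_pause_s": mean_pause_s}
```

Lower pitch variability and longer average pauses would, on the article’s description, be the sort of signals paired with words like “sad”, “low” and “down”.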

“The model sees sequences of words or speaking style, and determines that these patterns are more likely to be seen in people who are depressed or not depressed,” Alhanai says. “Then, if it sees the same sequences in new subjects, it can predict if they’re depressed too.”

Using a technique called sequence modelling, the model looked at the conversation as a whole and noted the differences between how people with and without depression speak over time.
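As an illustration of what sequence modelling over a whole conversation can look like, the sketch below runs an LSTM, a common choice of sequence model, over one feature vector per spoken segment and produces a single depressed/not-depressed score. The architecture, feature sizes and use of PyTorch are assumptions made for the example, not details taken from the study.

```python
import torch
import torch.nn as nn

class ConversationSequenceModel(nn.Module):
    """Hypothetical sketch: read a conversation as an ordered sequence of
    per-segment feature vectors (text + audio combined) and output one
    depressed / not-depressed score for the whole conversation."""

    def __init__(self, feature_dim: int = 128, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 1)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        # features: (batch, num_segments, feature_dim), one vector per
        # utterance in the order it was spoken
        _, (last_hidden, _) = self.lstm(features)
        # Use the final hidden state as a summary of the whole conversation
        return torch.sigmoid(self.classifier(last_hidden[-1]))

# Example: score one 20-segment conversation (random features for illustration)
model = ConversationSequenceModel()
conversation = torch.randn(1, 20, 128)
print(model(conversation))  # probability-like score between 0 and 1
```

Because the model sees the segments in order, it can pick up patterns that unfold over time, such as slowing speech or lengthening pauses, rather than judging each answer in isolation.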

In the future, researchers aim to test these methods with data from people with other cognitive conditions, such as dementia. “It’s not so much detecting depression, but it’s a similar concept of evaluating, from an everyday signal in speech, if someone has cognitive impairment or not,” Alhanai says.

Previous research from the University of Vermont and Harvard University showed that an algorithm flagging key signs in users’ Instagram posts could identify depression better than doctors. The algorithm accurately identified depression 70% of the time, compared with just 42% for US doctors.


Photo by rawpixel on Unsplash


By Amie Sparrow

Amie is a contributing writer for Happiful and PR Manager for Happiful and Memiah.
