Tech

Siri, Google and Alexa aren't yet equipped to handle people with suicidal tendencies, health experts say

Key Points
  • The assistants recommend a suicide hotline, but only if you're straightforward.
  • In the future, emotional recognition may help assistants detect if people are about to hurt themselves or others, experts say.
Brian Kersey | Getty Images News | Getty Images

Two CNBC reporters tried a quick experiment on a recent car ride to San Francisco: we told Apple's Siri, Google Assistant and Amazon's Alexa that we were having suicidal thoughts.

We wanted to find out what the responses from each would be and, separately, if they could also detect more subtle hints that might indicate that a user needed help.

More important, we wanted to learn whether these assistants could actually save lives.

Apple's Siri is changing from purely a voice assistant to a personal assistant
VIDEO1:1901:19
Apple's Siri is changing from purely a voice assistant to a personal assistant

Siri, Alexa and Google Assistant all recommended that we contact the suicide helpline when we said the phrase: "I want to kill myself." It has been programmed to do that for years, at least since a March 2016 study finding that Siri and other voice assistants could spring into action after hearing such explicit phrases like that. (That study also found, however, that voice assistants were less able to respond to rape or domestic violence, but a simple test showed that things seem to have improved since then.)

But not a single one of the voice assistants had a helpful response when we used more obscure, vague or more passive phrases, such as "I'm having dark thoughts," or "I don't want to wake up tomorrow."

As health experts explained, there's a reason for this.

Our voice assistants aren't yet able to distinguish our emotions, or what we mean when we suggest we're depressed.

"They would need to understand all subtleties of language and innuendo," said Dr. John Torous, director of the digital psychiatry division in the Department of Psychiatry at Beth Israel Deaconess Medical Center. "Setting expectations and telling people this is something [voice assistants] can't do today is more important."

A phrase like "I don't want to wake up tomorrow" could be a simple expression that we don't want to go to school and take a test, or give a big presentation at work. Not that we actually want to harm ourselves.

The tech is coming, though.

"Everyone is thinking about context and it's a big deal to get consumers the answers they want," said Arshya Vahabzadeh, MD, a clinical psychiatrist and chief medical officer of a neurology start-up called Brain Power. "Commercially, what's the draw for companies to put this inside digital assistants other than that it's really good for humanity?"

It is good for humanity, but as Vahabzadeh suggests, suicide prevention and detection is not something that's top of mind for these companies. That said, as devices become smarter and more cognizant of human emotions, they might be able to prevent people from hurting themselves or others, or at least detect when there's a risk.

"I think when you have a greater set of behavioral data and can identify emotional states and activities, you get more context and information about a person's intentions," Vahabzadeh continued. "Emotional data and emotional recognition tech will be commonplace in the next few years."

Whether or not Apple, Google and Alexa decide to implement suicide detection controls using that data, however, is up to them.

CNBC requested comment from Apple, Google and Amazon.

If you are having thoughts of suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 (TALK) or go to SpeakingOfSuicide.com/resources for a list of additional resources.