The S&P 500 is closing in on its all-time high, and is likely to sail past it, as long as the Fed promises lower interest rates and the trade war calms down.Market Insiderread more
In a tweet, Trump said that he and Xi "had a very good telephone conversation," and that "our respective teams will begin talks prior to our meeting."Politicsread more
A Bloomberg News report Tuesday morning said the White House had looked at such a move in February.Marketsread more
President Donald Trump on Tuesday announced that he will not nominate acting Defense Secretary Patrick Shanahan to hold the position in a permanent capacity. Army Secretary...Politicsread more
Stocks surged after President Donald Trump said he will be meeting with his Chinese counterpart, Xi Jinping, at the upcoming G-20 summit.US Marketsread more
The move is part of a larger trend that saw the survey's 179 participants move away from risk and toward positions that reflect fear of a coming economic slowdown spurred by a...Marketsread more
Democratic frontrunner Joe Biden on Monday appealed to a billionaire Republican donor for fundraising help in his presidential campaign. But the financier, Trump-supporting...Politicsread more
Facebook and other groups are behind a new programming language for working with the Libra blockchain.Technologyread more
Tesla investors are regaining confidence in a quieter Elon Musk — even as they question the company's ability to hit its production goals for the second quarter.Autosread more
Long-time blockchain technologists say Facebook's Libra digital currency will introduce billions to cryptocurrencies, but the company's problems with trust and privacy remain...Technologyread more
Valisure, an online pharmacy company, told the FDA that high levels of dimethylformamide were found in valsartan, a drug produced by Swiss drugmaker Novartis and other...Health and Scienceread more
We wanted to find out what the responses from each would be and, separately, if they could also detect more subtle hints that might indicate that a user needed help.
More important, we wanted to learn whether these assistants could actually save lives.
Siri, Alexa and Google Assistant all recommended that we contact the suicide helpline when we said the phrase: "I want to kill myself." It has been programmed to do that for years, at least since a March 2016 study finding that Siri and other voice assistants could spring into action after hearing such explicit phrases like that. (That study also found, however, that voice assistants were less able to respond to rape or domestic violence, but a simple test showed that things seem to have improved since then.)
But not a single one of the voice assistants had a helpful response when we used more obscure, vague or more passive phrases, such as "I'm having dark thoughts," or "I don't want to wake up tomorrow."
As health experts explained, there's a reason for this.
Our voice assistants aren't yet able to distinguish our emotions, or what we mean when we suggest we're depressed.
"They would need to understand all subtleties of language and innuendo," said Dr. John Torous, director of the digital psychiatry division in the Department of Psychiatry at Beth Israel Deaconess Medical Center. "Setting expectations and telling people this is something [voice assistants] can't do today is more important."
A phrase like "I don't want to wake up tomorrow" could be a simple expression that we don't want to go to school and take a test, or give a big presentation at work. Not that we actually want to harm ourselves.
The tech is coming, though.
"Everyone is thinking about context and it's a big deal to get consumers the answers they want," said Arshya Vahabzadeh, MD, a clinical psychiatrist and chief medical officer of a neurology start-up called Brain Power. "Commercially, what's the draw for companies to put this inside digital assistants other than that it's really good for humanity?"
It is good for humanity, but as Vahabzadeh suggests, suicide prevention and detection is not something that's top of mind for these companies. That said, as devices become smarter and more cognizant of human emotions, they might be able to prevent people from hurting themselves or others, or at least detect when there's a risk.
"I think when you have a greater set of behavioral data and can identify emotional states and activities, you get more context and information about a person's intentions," Vahabzadeh continued. "Emotional data and emotional recognition tech will be commonplace in the next few years."
Whether or not Apple, Google and Alexa decide to implement suicide detection controls using that data, however, is up to them.
CNBC requested comment from Apple, Google and Amazon.
If you are having thoughts of suicide, call the National Suicide Prevention Lifeline at 1-800-273-8255 (TALK) or go to SpeakingOfSuicide.com/resources for a list of additional resources.