
Facebook turns to A.I. to help prevent suicides

Key Points
  • Facebook is looking to use artificial intelligence (AI) to help detect suicidal thoughts and feelings.
  • The move is part of the social media giant's ongoing effort to "help build a safe community on and off Facebook."
  • It plans to use pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide.

Facebook is looking to use artificial intelligence (AI) to help detect suicidal thoughts and feelings.

The move is part of the social media giant's ongoing effort to "help build a safe community on and off Facebook."

Facebook said in an official blog post Monday that it was adopting "proactive detection efforts" to help people on the platform who are expressing thoughts of suicide, including using pattern recognition to flag posts or live videos containing such thoughts.

Facebook said that the additional work it was doing included:

  • Using pattern recognition to detect posts or live videos where someone might be expressing thoughts of suicide, and to help respond to reports faster
  • Improving how Facebook identifies appropriate first responders
  • Dedicating more reviewers from the platform's Community Operations team, which includes specialists with specific mental health training, to review reports of suicide and self-harm

The development comes after several high-profile suicides and suicide attempts have been broadcast on Facebook Live. Multiple other social media platforms have also been used to either livestream suicides or express suicidal thoughts.

Facebook has previously asked users to report any suicidal content they find, but the latest efforts go further by using AI to detect such content before users have to report their concerns to the company.

Commenting on the work, Guy Rosen, Facebook's vice president of product management, said that "when someone is expressing thoughts of suicide, it's important to get them help as quickly as possible.

"Facebook is a place where friends and family are already connected and we are able to help connect a person in distress with people who can support them. It's part of our ongoing effort to help build a safe community on and off Facebook."


Rosen said that the company was "starting to roll out artificial intelligence outside the U.S. to help identify when someone might be expressing thoughts of suicide, including on Facebook Live."

"This approach uses pattern recognition technology to help identify posts and live streams as likely to be expressing thoughts of suicide. We continue to work on this technology to increase accuracy and avoid false positives before our team reviews."

This will eventually be available worldwide, except in the EU, he said. CNBC asked Facebook why the EU will be excluded; there is some speculation that the decision relates to complexities surrounding the region's data protection and privacy laws. Facebook would not confirm this when contacted by CNBC, saying only that it regularly talks with experts and other stakeholders in Europe and understands that it is a sensitive issue.

Rosen said the technology also draws on signals from comments: for example, comments like "Are you OK?" and "Can I help?" can be strong indicators. In some instances, Facebook said, the technology has identified videos that may otherwise have gone unreported.
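Again as an illustration only: the two phrases are the ones Rosen cited, but the weights and the idea of summing them into a score are assumptions.

    # Concerned replies from friends add signal that could tip an otherwise
    # ambiguous post into the human-review queue. Weights are assumed.
    COMMENT_SIGNALS = {
        "are you ok": 1.0,
        "can i help": 1.0,
    }

    def comment_signal_score(comments: list[str]) -> float:
        """Sum the weights of concerned phrases found in the comments."""
        score = 0.0
        for comment in comments:
            lowered = comment.lower()
            score += sum(weight for phrase, weight in COMMENT_SIGNALS.items()
                         if phrase in lowered)
        return score

    # Two worried replies add 2.0 to whatever the post's own text scored.
    print(comment_signal_score(["Are you OK??", "Can I help with anything?"]))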

The developments were part of Facebook's "ongoing commitment to suicide prevention," Rosen said, adding that the network already had dedicated teams working 24/7 around the world "who review reports (of concerns of suicide and self-harm) that come in and prioritize the most serious reports."

He said the company had been working on suicide prevention tools for more than 10 years and that its latest approach was developed in collaboration with mental health organizations — such as Save.org, National Suicide Prevention Lifeline and Forefront Suicide Prevention — and with input from people who have had personal experience thinking about or attempting suicide.