Community Corner

Facebook Uses Artificial Intelligence To Spot Suicide, Refer Help

Facebook is flagging comments like "Are you OK?" and sending them to specially trained teams, who may decide to alert first responders.

Facebook said Monday it will use artificial intelligence to identify users of its online community who may be thinking about suicide. The technology will flag posts and live videos in which someone may be expressing thoughts of suicide, and specially trained Facebook employees will review them. In cases where the threat seems credible and harm appears imminent, local authorities will be notified, the social media giant said.

Facebook and other social media companies offering live-streaming services are under pressure to respond to a disturbing string of reports of people, both kids and adults, who have used their platforms to broadcast their suicides or other incidents in which they harm themselves or others.

Guy Rosen, Facebook’s vice president of product management, said in a statement that the company is already using pattern recognition technology to flag comments like “Are you OK?” and “Can I help?” Both can be strong indicators that someone is contemplating suicide, he said.
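Facebook has not published how its pattern recognition works, but the basic idea of flagging comments of concern can be pictured with a minimal, hypothetical sketch. The phrase list and the `flag_comment` helper below are assumptions made for illustration, not Facebook's actual signals or code:

```python
import re

# Hypothetical phrases of concern; Facebook's real signals are not public.
CONCERN_PATTERNS = [
    r"\bare you ok(ay)?\b",
    r"\bcan i help\b",
    r"\bplease don'?t do this\b",
]

def flag_comment(comment: str) -> bool:
    """Return True if a comment matches any pattern associated with concern."""
    text = comment.lower()
    return any(re.search(pattern, text) for pattern in CONCERN_PATTERNS)

# Comments that match would be routed to trained human reviewers.
print(flag_comment("Are you OK? I'm worried about you"))  # True
print(flag_comment("Congrats on the new job!"))           # False
```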


Over the past month, Facebook has made more than 100 wellness check referrals to first responders from its existing detection efforts, and that’s in addition to reports received from Facebook users who see posts and comments that alarm them, Rosen said.

“In some instances, we have found that the technology has identified videos that may have gone unreported,” Rosen said.


Facebook's new effort will focus on responding to reports faster, better identifying appropriate first responders and dedicating more reviewers from the Community Operations team to reports of suicide or self-harm.

The first rollout of the technology will be outside the United States, but it will eventually be available worldwide, except for the European Union, Rosen said.

Artificial intelligence is used to prioritize the order in which the Community Operations team members — thousands of people around the world with specific training in suicide and self-harm — receive reported posts, videos and live streams. A priority list helps the team get the right support resources to the person in distress and alert first responders when necessary.
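The prioritization Rosen describes amounts to a ranked review queue. As a rough sketch, assuming each report arrives with a hypothetical risk score from a classifier, the highest-scoring items could be surfaced to reviewers first; the report names and scores below are illustrative only:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ReviewItem:
    # Priority is stored as the negated score so the highest-risk
    # report comes off the heap first.
    priority: float
    report_id: str = field(compare=False)

def build_review_queue(reports):
    """reports: iterable of (report_id, risk_score) pairs, where risk_score
    is a hypothetical 0-1 output of a classifier (an assumption here)."""
    heap = []
    for report_id, risk_score in reports:
        heapq.heappush(heap, ReviewItem(priority=-risk_score, report_id=report_id))
    return heap

queue = build_review_queue([("post-1", 0.12), ("live-7", 0.94), ("post-3", 0.55)])
while queue:
    item = heapq.heappop(queue)
    print(item.report_id, -item.priority)  # live-7 first, then post-3, then post-1
```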

“Context is critical for our review teams, so we have developed ways to enhance our tools to get people help as quickly as possible,” Rosen said. “For example, our reviewers can quickly identify which points within a video receive increased levels of comments, reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.”
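One way to picture that kind of tool: bucket the timestamps of comments, reactions and reports on a video and surface the stretches where activity spikes well above the average. The sketch below is a hypothetical illustration, not Facebook's implementation; the `activity_spikes` function, bucket size and threshold are all assumptions:

```python
from collections import Counter

def activity_spikes(event_times, bucket_seconds=10, threshold=2.0):
    """Group comment/reaction/report timestamps (in seconds) into buckets and
    return the start times of buckets whose activity is at least `threshold`
    times the average bucket activity. Purely illustrative."""
    buckets = Counter(int(t // bucket_seconds) for t in event_times)
    if not buckets:
        return []
    average = sum(buckets.values()) / len(buckets)
    return sorted(b * bucket_seconds for b, n in buckets.items() if n >= threshold * average)

# A burst of activity around the one-minute mark stands out.
times = [3, 5, 58, 59, 60, 61, 61, 62, 63, 120]
print(activity_spikes(times))  # [60] -> the 10-second bucket starting at 60s
```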

Enhancements in automation will help team members more quickly access the appropriate first responders’ contact information, Rosen said.

If you see a Facebook post that raises alarm about a person’s well-being, either contact the person directly or report the post to Facebook, whose network of teams works worldwide around the clock. The team provides people with a number of support options, including phone numbers for help lines and other resources people can access at the moment they are considering suicide.

Facebook said it has been working on suicide prevention tools for more than a decade in collaboration with organizations like Save.org, the National Suicide Prevention Lifeline and Forefront Suicide Prevention.


