Facebook Turns to AI for Assistance in Suicide-Prevention Efforts

Facebook continues to ramp up its suicide-prevention efforts by both man and machine

Vice president of product management Guy Rosen announced in a Newsroom post that Facebook will:

  • Use pattern recognition to detect posts or Facebook Live videos in which people may be expressing suicidal thoughts, and to respond to reports more quickly.
  • Improve how it identifies appropriate first responders.
  • Dedicate more reviewers from its community operations team to review reports of suicide or self-harm.

Rosen said that over the past month, Facebook has worked with first responders on more than 100 wellness checks spurred by reports it received via its “proactive detection efforts,” adding, “We also use pattern recognition to help accelerate the most concerning reports. We’ve found that these accelerated reports—that we have signaled require immediate attention—are escalated to local authorities twice as quickly as other reports. We are committed to continuing to invest in pattern recognition technology to better serve our community.”

He wrote that Facebook’s pattern recognition technology uses signals such as comments like “Are you OK?” and “Can I help?” The social network is beginning to roll out these artificial intelligence capabilities outside of the U.S., and they will eventually be available globally, except in the European Union.
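Facebook has not published how its system works; as a purely illustrative sketch, a detector keyed to the kinds of concerned-comment signals Rosen cited might look like the toy scorer below. The phrase list, function names and threshold are all hypothetical.

```python
# Toy illustration only -- not Facebook's actual model. It flags a post
# when enough of its comments contain a known "concern" phrase of the
# sort Rosen mentioned ("Are you OK?", "Can I help?").

CONCERN_PHRASES = ("are you ok", "can i help")  # hypothetical signal list

def concern_score(comments):
    """Return the fraction of comments containing a concern phrase."""
    if not comments:
        return 0.0
    hits = sum(
        any(phrase in comment.lower() for phrase in CONCERN_PHRASES)
        for comment in comments
    )
    return hits / len(comments)

def flag_for_review(comments, threshold=0.3):
    """Flag the post for human review when the score passes a threshold."""
    return concern_score(comments) >= threshold
```

A production system would use a trained classifier over many more signals than comment text, but the shape is the same: score each post, then route high-scoring ones to human reviewers.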

AI is also being used to “prioritize the order in which our team reviews reported posts, videos and live streams,” enabling Facebook to get the right resources to those in distress and to more quickly alert first responders.
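The prioritization step Rosen describes amounts to ordering a review queue by urgency. A minimal sketch of that idea, using a standard max-heap (the class, scores and report IDs here are invented for illustration, not Facebook's implementation):

```python
# Hypothetical sketch of priority-ordered review: the most concerning
# reports are surfaced to reviewers first.
import heapq

class ReviewQueue:
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps insertion order stable

    def add(self, report_id, urgency):
        # heapq is a min-heap, so negate urgency to pop highest-first.
        heapq.heappush(self._heap, (-urgency, self._counter, report_id))
        self._counter += 1

    def next_report(self):
        """Return the ID of the most urgent pending report."""
        return heapq.heappop(self._heap)[2]

queue = ReviewQueue()
queue.add("post-123", urgency=0.2)
queue.add("live-456", urgency=0.9)   # most concerning, reviewed first
queue.add("video-789", urgency=0.5)
print(queue.next_report())  # → live-456
```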

Rosen wrote, “Context is critical for our review teams, so we have developed ways to enhance our tools to get people help as quickly as possible. For example, our reviewers can quickly identify which points within a video receive increased levels of comments, Reactions and reports from people on Facebook. Tools like these help reviewers understand whether someone may be in distress and get them help.”

Finally, he said of options available to concerned users, “Already on Facebook if someone posts something that makes you concerned about their well-being, you can reach out to them directly or report the post to us. We have teams working around the world, 24/7, who review reports that come in and prioritize the most serious reports. We provide people with a number of support options, such as the option to reach out to a friend, and we even offer suggested text templates. We also suggest contacting a help line and offer other tips and resources for people to help themselves in that moment.”

David Cohen (david.cohen@adweek.com) is editor of Adweek's Social Pro Daily.