Facebook Introduces New Suicide Prevention Tools

Facebook introduced new suicide prevention tools and resources, aimed at helping users who may be thinking about suicide, as well as their friends and family members.

Facebook already offers a number of suicide prevention tools, such as allowing users to report concerning posts. Now, the platform’s suicide prevention tools have been integrated into Facebook Live.

With this release, when viewers of a Facebook Live video believe the broadcaster needs help, they will have the option to report the video for its connection to “suicide or self-injury.” The reporter will receive a link to resources that can help them assist the broadcaster, and the broadcaster will see a prompt on their own screen encouraging them to reach out to a friend, contact a helpline or see tips.

Elsewhere, Facebook is now testing the ability for users to connect with its “crisis support partners” in real time through Facebook Messenger, either directly from each organization’s Facebook Page or through Facebook’s suicide prevention tools. These organizations include Crisis Text Line, the National Eating Disorders Association and the National Suicide Prevention Lifeline. In a blog post, Facebook said this test “will expand over the next few months.”

Finally, Facebook said it is testing a “streamlined” reporting process for posts related to suicide which, in part, will work to identify posts that are likely to include thoughts of suicide, even if they haven’t been reported by another user.

The blog post explained:

  • Based on feedback from experts, we are testing a streamlined reporting process using pattern recognition in posts previously reported for suicide. This artificial intelligence approach will make the option to report a post about “suicide or self injury” more prominent for potentially concerning posts like these.
  • We’re also testing pattern recognition to identify posts as very likely to include thoughts of suicide. Our Community Operations team will review these posts and, if appropriate, provide resources to the person who posted the content, even if someone on Facebook has not reported it yet.
  • We are starting this limited test in the US and will continue working closely with suicide prevention experts to understand other ways we can use technology to help provide support.

If you’d like to learn more about Facebook’s safety resources, visit Facebook’s Safety Center.