Facebook has announced changes to its policies that are geared toward improving how the social media giant handles suicide and self-injury content.
Updates include a new Suicide Prevention page that features resources for those who need them, or for those who have a friend who is going through a tough time. The latest additions to Facebook’s Safety Center include resources for the National Suicide Prevention Lifeline, the Crisis Text Line, the Veteran/Military Crisis Line, and The Trevor Project, which helps LGBTQ youth.
Facebook also said it would no longer allow self-harm images and would make it harder for people to search for suicide- and self-harm-related content on its platform and on Instagram. Changes also include the addition of a sensitivity screen on photos that depict healed self-harm cuts.
The updates come on the same day as World Suicide Prevention Day, which is dedicated to raising awareness of suicide prevention.
“Today, on World Suicide Prevention Day, we’re sharing an update on what we’ve learned and some of the steps we’ve taken in the past year, as well as additional actions we’re going to take, to keep people safe on our apps, especially those who are most vulnerable,” wrote Antigone Davis, the global head of safety at Facebook, in a blog post.
Facebook said it encourages people experiencing suicidal thoughts and feelings to connect with those they care about, and it acknowledged the platform’s role as a place where such difficult conversations happen.
“Experts have told us that one of the most effective ways to prevent suicide is for people to hear from friends and family who care about them,” the announcement reads. “To help young people safely discuss topics like suicide, we’re enhancing our online resources by including Orygen’s #chatsafe guidelines in Facebook’s Safety Center and in resources on Instagram when someone searches for suicide or self-injury content.”
The platform is also addressing eating disorders by prohibiting content on both Facebook and Instagram that glorifies them and by providing resources on that topic as well.
These changes are being implemented after Facebook hosted five meetings in 2019 with experts from around the globe to discuss issues like how to deal with suicide notes posted on the platform.
“The tools Facebook is rolling out aim both at people who are expressing suicidal thoughts and also guide concerned friends or family members to resources and alternatives and appropriate interventions,” wrote Anna Chandy, the chairperson of the Live Love Laugh Foundation, on Facebook’s Suicide Prevention Page. “People use