
Preventing Suicide and Self-Harm Content Spreading Online

September 12, 2024
By Antigone Davis

Takeaways

  • We've worked with the Mental Health Coalition to establish Thrive, a new program that allows tech companies to share signals about violating suicide or self-harm content and stop it spreading across different platforms.
  • We're providing the technical infrastructure behind Thrive, which allows signals to be shared securely.
  • We're proud to be a founding member of Thrive alongside Snap and TikTok, and encourage others in the industry to join this important work.

Suicide and self-harm are complex mental health issues that can have devastating consequences. At Meta, we've worked with experts for years - including our suicide and self-harm advisory group and members of Meta's own safety teams - to develop an informed and thoughtful approach to suicide and self-harm content shared on our apps.

Yet, like many other types of potentially problematic content, suicide and self-harm content is not limited to any one platform. To be truly effective in responding to this content, tech companies need to work together. That's why we've worked with the Mental Health Coalition to establish Thrive, the first signal-sharing program focused on violating suicide and self-harm content.

Through Thrive, participating tech companies will be able to share signals about violating suicide or self-harm content so that other companies can investigate and take action if the same or similar content is being shared on their platforms. Meta is providing the technical infrastructure that underpins Thrive - the same technology we provide to the Tech Coalition's Lantern program - which enables signals to be shared securely.

Participating companies will start by sharing hashes - numerical codes that correspond to violating content - of images and videos showing graphic suicide and self-harm, and of content depicting or encouraging viral suicide or self-harm challenges. We're prioritizing this content because of its propensity to spread across different platforms quickly. These initial signals represent content only, and will not include identifiable information about any accounts or individuals.
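To make the hash-sharing idea concrete, here is a minimal sketch. Thrive's actual implementation has not been published, and industry media matching typically uses perceptual hashes (such as PDQ for images) so that near-duplicates also match; the cryptographic hash and function names below are hypothetical stand-ins used only to show the principle that partners exchange fingerprints of media, never the media itself or any account information.

```python
import hashlib
from pathlib import Path

# Hypothetical sketch only: Thrive's real pipeline is not public.
# SHA-256 stands in for whatever hashing scheme the program uses;
# the point is that only a fingerprint of the content crosses
# platform boundaries, with no identifiable user information.

def compute_signal(media_path: str) -> str:
    """Return a hex digest that can be shared as a content signal."""
    return hashlib.sha256(Path(media_path).read_bytes()).hexdigest()

def match_incoming_signals(incoming: set[str], local_index: set[str]) -> set[str]:
    """Return shared signals that also appear in this platform's own
    index of hashed media, so matching content can be queued for review."""
    return incoming & local_index

# Illustrative usage: signals received from partners are checked against
# hashes of media this platform has already reviewed (directory is hypothetical).
received = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}
local = {compute_signal(str(p)) for p in Path("reviewed_media").glob("*") if p.is_file()}
for signal in match_incoming_signals(received, local):
    print(f"match: {signal[:12]} -> route to review")
```

Because only digests are exchanged, a receiving company can check incoming signals against its own index and apply its own policies without any personal data changing hands.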

Thrive builds on the work we already do at Meta to remove harmful content that shows graphic imagery or encourages suicide or self-harm, while still giving space for people to talk about their own experiences. We also provide support to those sharing and searching for content related to suicide or self-harm by connecting them to local organizations around the world, including the Suicide and Crisis Lifeline and Crisis Text Line in the US.

Between April and June this year, we took action on over 12 million pieces of suicide and self-harm content on Facebook and Instagram. While we allow people to discuss their experiences with suicide and self-harm - as long as it's not graphic or promotional - this year we've taken important steps to make this content harder to find in Search and to hide it completely from teens, even if it's shared by someone they follow.

Thrive will help keep people safe not just on Meta's apps, but across all the apps and services they use. We're proud to work with the Mental Health Coalition and our industry partners Snap and TikTok on this vital work.
