
Meta, Snap, TikTok Launch Initiative To Combat Suicide, Self-Harm Content

13/09/2024 06:04 PM

WASHINGTON, Sept 13 (Bernama-UPI) -- Three of the biggest social media platforms are teaming up to address online content that features suicide and self-harm, Meta announced on Thursday, according to a report by United Press International (UPI).

Meta, the owner of Facebook, Instagram, and WhatsApp, has teamed up with Snap (the company that developed and maintains Snapchat) and TikTok to form an initiative called Thrive. The programme aims to destigmatise mental health issues and work to slow the viral spread of online content featuring suicide or self-harm, Meta said in a blog post.

"Suicide and self-harm are complex mental health issues that can have devastating consequences," Meta stated in its release.

"We're prioritising this content because of its propensity to spread across different platforms quickly," Antigone Davis, Meta's global head of safety, wrote in the post.

"These initial signals represent content only, and will not include identifiable information about any accounts or individuals."

The initiative was formed in conjunction with The Mental Health Coalition, a group of mental health organisations working to destigmatise these issues.

Meta, Snap, and TikTok will share "signals" with each other, allowing them to compare notes, investigate, and take action if similar content appears on their apps. Thrive will serve as a database that all participating social media companies can access.

Meta is using technology developed by Lantern, a programme designed to make technology safe for minors. Amazon, Apple, Google, Discord, OpenAI and others are part of the coalition.

The social media companies will be responsible for reviewing and taking necessary action through Thrive, and for writing a yearly report to measure the programme's impact.

Meta said that when content featuring self-harm or suicide is identified, it will be assigned a number, or "hash", which the other social media companies can then cross-check to find and remove the same content on their own platforms.
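As a rough illustration of how such hash-based cross-checking works in general (this is not Meta's actual implementation, and the names and data here are hypothetical), a platform can fingerprint a piece of content and compare that fingerprint against a shared database of signals:

```python
import hashlib

def content_hash(data: bytes) -> str:
    # Derive a fixed-length fingerprint of the content. Real systems
    # often use perceptual hashes that tolerate small edits; plain
    # SHA-256 is used here purely for illustration.
    return hashlib.sha256(data).hexdigest()

# Hypothetical shared database of signals contributed by one platform.
shared_signals = {content_hash(b"example flagged post")}

def matches_shared_signal(post: bytes) -> bool:
    """Return True if the post's hash matches a shared signal,
    meaning another platform has already flagged identical content."""
    return content_hash(post) in shared_signals

# A second platform cross-checks incoming content against the database.
print(matches_shared_signal(b"example flagged post"))   # identical content
print(matches_shared_signal(b"unrelated content"))      # no match
```

Because only the hash is shared, platforms can recognise the same content without exchanging the content itself or any account information, consistent with Meta's statement that the signals "will not include identifiable information".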

Increased social media use by minors has caused a spike in depression and suicidal behaviour, the Mental Health Coalition said. Research also suggests that young people who self-harm are more active on social media.

Earlier this year, Meta announced that it would begin removing and limiting sensitive content deemed to be "age-inappropriate" from teenagers' feeds on its apps.

The company said it had plans to hide search results and terms relating to suicide, self-harm, and eating disorders for all users.

Meta, TikTok, Snapchat and other social media platforms have long been criticised for failing to remove content deemed harmful to teens, including videos and images of self-harm.

-- BERNAMA-UPI  

 


BERNAMA provides up-to-date authentic and comprehensive news and information which are disseminated via BERNAMA Wires; www.bernama.com; BERNAMA TV on Astro 502, unifi TV 631 and MYTV 121 channels and BERNAMA Radio on FM93.9 (Klang Valley), FM107.5 (Johor Bahru), FM107.9 (Kota Kinabalu) and FM100.9 (Kuching) frequencies.

© 2024 BERNAMA