Four of the biggest social media platforms have teamed up with British well-being charity Samaritans to combat harmful content posted online.
The new approach is designed to weed out and challenge harmful and abusive content shared across the sites. It will also serve as a push to hold platforms like Twitter, Instagram, Facebook, and Snapchat accountable for inappropriate content.
“There is no black and white solution that protects the public from content on self-harm and suicide, as they are such specific and complex issues," said Samaritans chief executive Ruth Sutherland.
"That is why we need to work together with tech platforms to identify and remove harmful content while being extremely mindful that sharing certain content can be an important source of support for some.”
The move comes after the death of 14-year-old Molly Russell, which was linked to content she had viewed on her Instagram page.
The UK government has called for platforms like these to be regulated, and platforms like Facebook have failed multiple times to protect their users' data and privacy, as well as to shield them from disinformation online.
The government has also sought to impose fines on companies that target children with adverts deemed unsuitable. According to a report from KLFM, the partnership between the platforms and Samaritans was initiated by health secretary Matt Hancock, who spoke about it before a closed-door meeting with the social media giants.
“I want the UK to be the safest place to be online and give parents the confidence to know their children are safe when they use social media," Hancock said.
"As set out in our Online Harms white paper, the government will legislate to tackle harmful content online, but we will also work with social media companies to act now.
"I was very encouraged at our last summit that social media companies agreed normalising or glamourising of eating disorders, suicide and self-harm on social media platforms is never acceptable and the proliferation of this material is causing real harm.”