Instagram is cracking down further on content that displays references to self-harm or suicide.
Following the suicide of 14-year-old Molly Russell in 2017, after she had viewed graphic material on Instagram, the platform has tried to censor and limit the circulation of harmful material.
Now, Instagram intends to ban graphic cartoons, drawings and memes that depict suicide or self-harm, along with any other content “promoting” such behaviour.
“It will take time to fully implement,” Instagram head Adam Mosseri told BBC News, “but it’s not going to be the last step we take. There is still very clearly more work to do. This work never ends.”
The changes follow campaigning by Molly’s father, Ian Russell, who blamed Instagram’s algorithms for repeatedly surfacing harmful content to his daughter.
In response, Instagram says it has doubled the amount of material removed related to self-harm and suicide since the first quarter of 2019. Between April and June of this year, it claims to have removed 834,000 pieces of content, 77 per cent of which had not been reported by users.
Earlier this year, Instagram also trialed the removal of likes on the platform, with Mosseri stating, “we want people to worry a little bit less about how many likes they’re getting on Instagram and spend a bit more time connecting with the people that they care about.”
Despite these changes, Andy Burrows, head of child safety online policy at the NSPCC, said the industry as a whole had been irresponsible and called on the government to progress legislation that would impose a duty of care on social media companies.
“The reality is while Instagram has taken positive steps the rest of the tech industry has been slow to respond - on self-harm, suicide and other online harm,” Burrows said.