Kid safety crackdowns coming to Facebook, Instagram and more

Social media platforms including Instagram, Facebook and Snapchat will be legally required to remove illegal content posted by users.

They must also sign a code of conduct designed to protect vulnerable users, including children. UK culture and digital minister Margot James announced the compulsory code on February 5th, following a BBC investigation into the death of teenager Molly Russell.

Russell took her own life after viewing content about depression and suicide on Instagram. No specific details of the new code have been revealed yet, but James is expected to launch the policy in a speech at the upcoming Safer Internet Day conference.

“We have heard calls for an internet regulator and to place a statutory ‘duty of care’ on platforms and are seriously considering all options,” a spokesman for the Department for Digital, Culture, Media and Sport said.

"Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”

Platform safeguarding

In response to the news, some platforms are already acting. Instagram has begun using sensitivity screens to blur potentially distressing images, such as those depicting self-harm. Facebook, meanwhile, plans to offer more support to people who may be struggling, but is not removing images of self-harm.

“We're already taking steps soon to blur images, block a number of hashtags that have come to light, and thirdly to continue to work... with the Samaritans and other organisations,” said Facebook head of communications Nick Clegg.


Staff Writer