YouTube will now ask its creators to clarify whether or not their videos are suitable for children.
The move comes as part of changes being made to how the platform collects and uses data pertaining to kids, following an investigation by the Federal Trade Commission.
Back in September, YouTube was fined $170 million by the FTC for failing to comply with the Children’s Online Privacy Protection Act (COPPA). As well as the fine, which is likely a drop in the ocean for the video giant, YouTube was ordered to create a system to better identify content that is directly aimed at children.
During the upload process, creators will be asked whether or not the video they're uploading is aimed at a young audience. YouTube will also use machine-learning to further identify content targeted at kids.
Content needs to be flagged as suitable for children if it features a child or has an emphasis on children's characters, popular children's programming or animated characters. The system will also consider protagonists, toys and music generally aimed at children.
While YouTube is trusting creators to mark their own content appropriately, there may be consequences if a creator fails to comply with the new rules. YouTube has yet to disclose what the punishment will entail.
YouTube will also stop serving personalised adverts alongside content that is aimed at children. The platform warns that this may result in a decrease in revenue for some creators, particularly those with a strong focus on content for young audiences.
The platform also states that some features will not be available on child-orientated content, such as comments. Both of these changes have been implemented to better protect children from harmful or offensive material.
These changes follow YouTube's child exploitation scandal, which saw parent company Google terminate over 400 channels after children were found to be targeted and exploited via the platform. Videos with hundreds of predatory comments were uncovered, leading YouTube to consider demonetising content on which inappropriate comments were left unattended.
YouTube isn't the first platform to face scrutiny from the FTC this year - video app TikTok was hit with an eye-watering $5.7 million fine after violating COPPA.