YouTube has said that videos accompanied by inappropriate comments from other users could be demonetised.
The move follows an uproar over hundreds of unmonitored predatory comments that the platform found and removed this week.
In a statement to creators, YouTube explained that in the previous 48 hours it had terminated more than 400 channels found to be posting inappropriate or harmful comments. It had also disabled comments on "tens of millions" of videos.
Comment management
YouTube creator Jessica Ballinger took to Twitter to ask why YouTube had deemed videos featuring her five-year-old son "not advertiser friendly".
The platform responded to her in a tweet, stating that "even if your video is suitable for advertisers, inappropriate comments could result in your video receiving limited or no ads (yellow icon)".
(2/2) With regard to the actions that we've taken, even if your video is suitable for advertisers, inappropriate comments could result in your video receiving limited or no ads (yellow icon). Let us know if you have any questions.
— TeamYouTube (@TeamYouTube) February 22, 2019
Who is at fault?
Creators are responsible for their comments sections, but monitoring every comment at all times is incredibly difficult, especially for larger channels that can receive hundreds of comments on a single video.
Other YouTubers and community members were quick to respond to the move, calling it "unfair" and "aggravating".
Under the policy, creators could face significant losses in ad revenue if their content is targeted by trolls or waves of negative commenters.
YouTube is yet to make an official announcement regarding these changes outside of Twitter replies, but that's not unusual for the platform.