YouTube removed 8.3m videos from its platform in the three-month period at the end of last year, the site has said.
The removals were part of an effort to curb the perceived rise of disturbing content on the platform, as detailed in a quarterly report designed to highlight the company's efforts to enforce its community guidelines.
“The majority of these 8m videos were spam or people attempting to upload adult content and represent a fraction of a percent of YouTube’s total views during this time period,” the company said.
Of these videos, 6.7m were flagged for review by automated systems, and 76 per cent of those were pulled before they were ever viewed. A further 402,000 were flagged by YouTube users, 64,000 by NGOs and 73 by government agencies.
Of the flagged content, 30.1 per cent involved a complaint of sexual content, 26.4 per cent spam or misinformation, and 15.6 per cent hate or abuse. YouTube claims that over half of the videos it reviews for violent extremism receive fewer than ten views prior to their removal.
The full report can be read here.
“This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons,” YouTube added.
“We’re also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they’ve flagged to us for review against our Community Guidelines.
“Last year we committed to bringing the total number of people working to address violative content to 10,000 across Google by the end of 2018. At YouTube, we've staffed the majority of additional roles needed to reach our contribution to meeting that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams.”