
YouTube needs to grow up


Accusations of antisemitic content from PewDiePie - and Disney-owned Maker Studios’ subsequent decision to drop him - as well as the so-called ‘adpocalypse’ were far from the first signs of trouble on the platform.

Then there’s the video of a man cementing a microwave to his head, only to nearly die, with his friends eventually forced to call the ambulance service - resources that could have been diverted from real emergencies. It was all uploaded to YouTube, of course, and remains one of the channel TGFbro’s most viewed videos to date.

YouTube has also had to crack down on the content of its kids’ channels. But the scattershot approach has led to innocent channels being taken offline, while disturbing content - such as a guide to sharpening knives, images of bloodied clowns and footage of popular cartoon characters in a burning plane - remains accessible on the supposedly safeguarded YouTube Kids app.

More recently there was Logan Paul’s infamous video showing the dead body of a man who had taken his own life in Japan’s Aokigahara Forest, also known as the suicide forest. Even the video’s thumbnail - which Paul crafted himself - drew attention to the body. The YouTuber has since apologised for his actions, in both a written statement and a video, and took a break from YouTube.

He’s now back - with what I think was a great piece recognising the poor taste of that previous video, showing him taking practical steps to learn from his mistakes and, most importantly, informing viewers about the serious issue of suicide.

Then, in one of his follow-up videos, he tasered a dead rat.

And then there’s the guy throwing water at random members of the public for a prank that, intentionally or not, echoed the rising number of acid attacks. YouTube closed the channel.

Disturbing content, under-age viewers

What’s most disturbing about all these videos is that many of the YouTubers behind them count large numbers of young fans among their audiences.

The simple reaction is to blame the influencers for being idiots. They are, of course, not blameless - PewDiePie gave a good, and opinionated, breakdown of his thoughts on Logan Paul’s video, explaining how easy it is to get caught up chasing hits with ever crazier videos.

When dealing with such controversies, there’s an appropriate response somewhere between the fire and fury of Twitter and, much further away, YouTube’s mostly hands-off approach. People don’t deserve to lose their jobs over every mistake - even some of the big ones. But there need to be consequences.

So what is YouTube doing about all this? After an overdue reaction to Logan Paul’s video (and, really, the ‘adpocalypse’ in general), the video platform has set out its stance on so-called ‘bad actors’.

In a move that hits smaller channels hardest, eligibility for YouTube monetisation now requires 4,000 hours of watch time within the past 12 months and 1,000 subscribers; previously, the YouTube Partner Programme required just 10,000 lifetime views. While the idea behind the change isn’t bad per se, 4,000 hours of watch time isn’t exactly supportive of channels that favour shorter videos - such as animators.

Two weeks later, YouTube CEO Susan Wojcicki shared the company’s five top priorities for the year ahead to tackle its biggest problems. These include prioritising transparency and communication on its own part, tightening and enforcing its policies, and more human review of its videos.

For the latter, YouTube has promised to “bring the total number of people across YouTube and Google working to address content that might violate our policies to over 10,000”. To this end, YouTube is hiring policy enforcement managers in areas such as child safety - though that doesn’t quite amount to 10,000 dedicated reviewers.

Enforcing a clear code of conduct

What YouTube really lacks, though, is a set of strict ethics guidelines and clear rules and regulations. It has community guidelines, but who’s paying attention? Are they applied consistently?

For example, does a well-viewed video of a dead body, or of someone electrocuting a dead rat, require much thought on whether it should be shared, and whether any penalties should result? Are these what YouTube considers issues that are “far more nuanced and unique” to the channel?

For me, the problem lies as much with YouTube as with any influencer. People will do crazy things to get noticed. But every one of them acts independently.

You can’t simply tell YouTubers to learn from these mistakes because they aren’t one homogeneous blob. Ultimately, they are chasing views within the rules of the platform. Finding and pushing the limits is part of the game in entertainment, and on YouTube, content creators are finding there aren’t many limits at all.

But you can ask YouTube to learn, if it listens.

It seems that money speaks louder than dead bodies shown to kids. It took Logan Paul himself to take the video down - not YouTube. For a video that reached number one on the site’s trending page, were YouTube employees simply asleep at the wheel?

How many people are really checking this stuff? Or has YouTube simply relied so heavily on algorithms that no one checks the front page?

Perhaps YouTube is getting the message - it’s just suspended ads on Logan Paul’s channels in response to his “recent pattern of behaviour”. 

In the past YouTube has only shown itself to be reactive, rather than proactive. The above still counts as the former - though at least it’s starting to take action.

Lessons from traditional broadcast media

All of this is in stark contrast to the more traditional TV broadcast media. Numerous strict rules and regulations, including watershed hours that cover what can and can’t be shown at certain times of day, help ensure what viewers, many of whom are children, engage with is appropriate and safe.

Here you have regulators like Ofcom, the UK’s communications regulator, which oversees the country’s TV, radio and video-on-demand sectors. Viewers can send complaints to Ofcom if something controversial slips through the cracks of pre-recorded television, or if something spontaneous and inappropriate occurs on a live broadcast. The channel will then be investigated and potentially disciplined: Ofcom has the legal power to impose substantial fines, and to shorten or even revoke a channel or station’s licence to broadcast.

All of this means TV feels like a safe environment, particularly for young viewers. Importantly, parents generally know what is on certain channels at set times of day, allowing them to quickly judge what’s going to be appropriate.

These channels have built a good reputation - as has TV as a whole. Here, you have an environment where you can simply leave the TV on in the background and not be too concerned about inappropriate content being featured.

With YouTube’s lax approach, it’s at risk of losing this.

It’s clear its own rules, and the way the platform is run, have not been up to scratch when it comes to protecting viewers. That lax approach has been great for profits, and has also allowed greater creative freedom for content creators.

YouTube’s chief business officer Robert Kyncl has previously said it shouldn’t be treated with the same “editorial hand” as traditional broadcasters, claiming it was being open and introducing measures without any government asking it to do so.

But it’s not been particularly consistent about how open it wants to be, or how strictly it enforces its guidelines. Beyond the controversies mentioned in this article, there are also still major issues over how sponsorships are disclosed on channels - though arguably this has improved over the years.

While the platform won’t die, of course, if it remains the same, YouTube risks losing trust - and with that, lots of money.

It needs to seriously rethink how it greenlights content - a process it currently lacks, but is reportedly considering for its premium channels - how that content is marked for appropriate audiences, and how those videos are accessed.

Is simply asking whether you’re old enough good enough? You can create an account in less than a minute with a fake age.

Perhaps YouTube needs its own regional watershed hours, along with clear penalties such as fines, ad suspension (as a defined punishment rather than an off-the-cuff one) or channel suspensions. These could prove bigger deterrents than the current three-strikes system.

Or it could simply wait for governments to wise up and create regulations for it.

Logan Paul appeared to do some soul searching about how he got to the point of filming and uploading a video that focused on a dead body for the clicks. But then he electrocuted a dead rat for his young fans.

Now YouTube needs to do some soul searching of its own, and be clear to the public and content creators about what's acceptable and what's not on its platform.

Head of Content

Craig Chapple is a freelance analyst, consultant and writer with specialist knowledge of the games industry. He has previously served as Senior Editor at, as well as holding roles at Sensor Tower, Nintendo and Develop.