YouTube is working to remove 'hateful and supremacist' channels, but creators are being caught in the crossfire

YouTube has unveiled a new plan to tackle hateful content on the platform.

This move follows the uproar caused after Vox producer Carlos Maza tweeted a string of complaints about abuse received from YouTube commentator Steven Crowder.

YouTube responded to the tweets, initially stating that it would demonetise Crowder until he removed links to his 'offensive' merchandise. It then dialled back and added that Crowder's channel would remain demonetised because his "pattern of egregious actions has harmed the broader community".

In a blog post shared on Wednesday, YouTube outlined further steps to control harmful content. Under the new rules, YouTube will prohibit videos promoting discrimination on the basis of age, gender, race, caste, religion, sexual orientation or veteran status.

It'll also take action against creators openly promoting Nazi ideologies, as well as channels that deny the existence of "well-documented" historic events, like the Holocaust.

YouTube also states that any partnered creators or channels that "brush up" against the platform's community guidelines will be suspended from the site's Partner Program, which means they won't be able to run ads or use monetisation tools. YouTube did not clarify whether this suspension would be permanent or temporary.

"The openness of YouTube’s platform has helped creativity and access to information thrive," YouTube concluded.

"It’s our responsibility to protect that, and prevent our platform from being used to incite hatred, harassment, discrimination and violence. We are committed to taking the steps needed to live up to this responsibility today, tomorrow and in the years to come."

Context matters

While YouTube is clamping down on discriminatory ideologies, a handful of other creators are being caught up in the purge.

Journalist Ford Fischer, who runs YouTube channel News2Share, tweeted that he had been demonetised "within minutes" of the platform's announcement.

Fischer's channel documents activism and extremism, and often uses footage from rallies, marches, and interviews displaying far-right ideologies. His videos are made to educate viewers, not promote the opinions within them. YouTube has said that it recognises the contextual difference between the two.

"We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future," YouTube added into the blog post.

"And as always, context matters, so some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events."

However, YouTube seems unable to distinguish channels that promote dangerous or discriminatory ideas from channels that document and combat them. As a result, news channels on both sides are being penalised for their coverage.

While Carlos Maza and Steven Crowder's initial spat seems to be the catalyst for what people are calling the 'VoxAdpocalypse', both parties agree that independent news channels are being caught in the middle.

"I want to point out that without any comment on the issue between @gaywonk and @scrowder that seems to be the backdrop for this whole issue, both seem to agree that I was a bystander caught in the crossfire," Fischer tweeted. 

Fake news

Independent creators with a focus on news are being let down by YouTube in other ways too. A recent study from YouTube channel Coffee Break highlighted that YouTube's current algorithm is set up to favour traditional media over independent creators. The results showed that established media outlets are having an easier time hitting YouTube's trending section than creators loyal to the platform.

YouTube hasn't been ignoring the spread of fake news, but a platform of its size is seemingly difficult to moderate consistently.

Back in March, the platform unveiled a tool that allows users to fact-check questionable topics covered in videos. YouTube acknowledged that it can't and won't be checking every video manually, so it encouraged viewers to fact-check the content they are seeing. However, giving users the ability to fact-check information is not the same as combatting the spread of misinformation.

It also stated in yesterday's blog post that recommendations of videos that contain misinformation - such as flat earth theories or phoney medical cures - have dropped by over 50 per cent in the US.

"Our systems are also getting smarter about what types of videos should get this treatment, and we’ll be able to apply it to even more borderline videos moving forward," YouTube said.

YouTube's new stance against fake news and discriminatory content may help, but its strict moderation of harmful or offensive videos is already harming independent creators and will continue to do so unless YouTube can manually check for the context it needs.


Danielle Partis is editor of and former editor of . She was named Journalist of the Year at the MCV Women in Games Awards 2019, as well as in the MCV 30 Under 30 2020. Prior to Steel Media, she wrote about music and games at Team Rock.