YouTube’s got a big problem right now, and I’m not talking about the overabundance of hyperbolic, overenthusiastic product shills. Many of its most prolific creators are finding their videos demonetised before they’ve even gone live. Creators – particularly those who rely on YouTube’s ad money to live – are understandably upset, and many are contemplating abandoning the platform.
It stems from YouTube’s crackdown on unsavoury content that either targeted children, or showed them in perilous or paedophilic situations. YouTube has seen another “adpocalypse” as a result, with big-name advertisers including Mars, Adidas, Hewlett-Packard, Deutsche Bank and more pulling ads until the problem could be remedied. Earlier this year, a similar chain of events saw advertisers pull out over fears their products would be associated with hateful content.
One of the problems with the flagging is that it all appears to be done algorithmically, with only sporadic human intervention. This means many creators have to manually appeal these flags, leaving their videos demonetised in the interim.
YouTube is aware of the problem, addressing it in a statement to Polygon:
Our community of creators are currently being hurt by bad actors who are spamming our systems with videos masquerading as family content. In order to protect creators and advertisers alike, we’re taking aggressive action using a combination of machine learning and people to take action on this content through age-gating, demonetization and even the removal of channels where necessary. As always, creators can appeal video-specific demonetizations, and our goal is to ultimately protect the revenue of creators across the platform by taking these necessary actions.
It’s something that’s affecting some of YouTube’s biggest voices, like Philip DeFranco, Jim Sterling and Casey Neistat – as well as a few of our local YouTubers.
So YouTube's new thing is now to auto-flag everything as unsuitable for advertisers regardless of content and force the manual review upfront. This is so beyond fucking stupid.
— Jim Sterling (@JimSterling) November 29, 2017
I understand that YouTube wants to make its platform a friendlier and safer place for children – because as it stands, it’s a cesspool of hatred and toxicity. It’s fantastic that the company is finally trying to do something about it – but if it’s done at the expense of its creators, we could see YouTube eat itself.
Last Updated: November 30, 2017