I love YouTube! As a service, it’s a great platform for watching and uploading video that sticks around in perpetuity. There’s a big problem with YouTube though, in that you’re only ever about 4 clicks away from content that’ll rot your brain. You can be watching an insightful and educational video on making the very best breakfast smoothie, and just a few clicks later you’re watching somebody debunking the moon landing using the most spurious, coincidental and fallacious – yet somehow convincing – evidence. Next thing you know, you’re a 9/11 “truther,” refusing to believe that the moon even exists as you sit with tinfoil wrapped around your head so that the lizard people can’t influence your brain waves.
While that content has every right to exist, it’s often pushed so hard by YouTube that people with too much time on their hands end up not just believing the stuff, but going down rabbit holes of similar content. Now YouTube is going to stop pushing it so much.
“When recommendations are at their best,” says YouTube, “they help users find a new song to fall in love with, discover their next favorite creator, or learn that great paella recipe. That’s why we update our recommendations system all the time—we want to make sure we’re suggesting videos that people actually want to watch.”
“We’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
They’re not going to be scrubbing this sort of content from the platform, but they will stop surfacing it in recommended videos. To identify it, they’ll be combining machine learning (which is what caused this nonsense to propagate in the first place) with actual human reviewers.
Last Updated: January 28, 2019