YouTube is, as far as I’m concerned, a garbage platform, particularly for children. While there is content of real worth on the platform, there seems to be orders of magnitude more asinine rubbish, polluting children’s brains with hogwash, vapid nonsense and outright lies. YouTube’s danger to young people runs deeper than that, though, as it’s also used by paedophiles to get their kicks and groom children for abuse.
In an investigation by Harvard’s Berkman Klein Center for Internet and Society, reported by the New York Times, it was found that YouTube’s algorithms happily recommend otherwise innocent family home videos featuring naked or near-naked children to those who’ve watched videos of that sort before; another example of YouTube’s “rabbit hole effect,” which sends people into a vortex of increasingly extreme content. Says the New York Times:
“So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable.”
YouTube hasn’t really done much about all of this, though it has automatically disabled comments on videos that feature children, and will be tweaking recommendations in the future. It has also now barred minors from live streaming without adult supervision.
Children will no longer be able to stream on the platform “unless they are clearly accompanied by an adult.”
They’ll also be limiting recommendations.
“We expanded our efforts from earlier this year around limiting recommendations of borderline content to include videos featuring minors in risky situations,” YouTube says in a blog post. “While the content itself does not violate our policies, we recognize the minors could be at risk of online or offline exploitation.”
“With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections” across more videos.
It’s a positive step, but I really think the only solution to this particular problem is for YouTube to just straight-up stop recommending videos that feature children in them.
Last Updated: June 4, 2019