Every time there’s another terrorist attack, people talk about extremists. Whether from the left or the right, religious or not, extremism and fundamentalism are scary stuff. While politicians might talk about rooting out terror at its source, ideological battles like these are rarely fought or won with weaponry. I’m honestly a huge believer in the power of content. It used to be called propaganda and was really overt; now it’s not always as easy to spot, which makes it far more subversive. YouTube is joining the battle against extremist content, though, and it’s pretty cool.
In an op-ed piece in the Financial Times (thanks, The Verge), YouTube outlined four new steps it’s taking to combat extremist content on its platform. The op-ed explains some of the difficulties involved and how YouTube plans to do more.
First, it’s increasing the use of technology to identify extremist and terrorism-related videos. This is tricky because a video of a terrorist attack could be informative if uploaded by the BBC, or a glorification of violence if uploaded in a different context by a radicalized user. While video analysis models have been used to find and assess more than 50% of the terrorism-related content removed over the past six months, the platform is devoting more engineering resources to applying advanced machine learning research to train new “content classifiers”, which should make the process a lot faster and more efficient.
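To give a rough sense of what a “content classifier” does, here’s a minimal sketch of a naive Bayes text classifier in plain Python. Everything here is invented for illustration: the labels, the toy training examples, and the features (word counts) are my assumptions. YouTube’s real systems analyze video and audio with far more sophisticated models; this only shows the basic idea of scoring new content against labeled examples.

```python
from collections import Counter
import math

def train(docs):
    """docs: list of (text, label) pairs (hypothetical examples).
    Returns per-label word counts and per-label document totals."""
    counts = {}          # label -> Counter of word occurrences
    totals = Counter()   # label -> number of training documents
    for text, label in docs:
        counts.setdefault(label, Counter()).update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    """Return the label with the highest log-probability,
    using Laplace (add-one) smoothing for unseen words."""
    vocab = {w for c in counts.values() for w in c}
    best_label, best_score = None, float("-inf")
    for label, c in counts.items():
        # log prior: how common this label is in the training set
        score = math.log(totals[label] / sum(totals.values()))
        n = sum(c.values())
        # log likelihood of each word under this label
        for word in text.lower().split():
            score += math.log((c[word] + 1) / (n + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Invented training data; real classifiers train on vastly more examples.
training = [
    ("news report on attack aftermath", "newsworthy"),
    ("documentary footage of conflict", "newsworthy"),
    ("join us glorious fight recruits", "violating"),
    ("propaganda calling for violence", "violating"),
]
counts, totals = train(training)
print(classify("breaking news report from the conflict zone", counts, totals))
# -> newsworthy
```

Even this toy version shows why context is hard for machines: the same words (“attack”, “conflict”) appear in both newsworthy and violating content, which is exactly why YouTube pairs the models with human review.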
Second is the human element:
Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech. While many user flags can be inaccurate, Trusted Flagger reports are accurate over 90 per cent of the time and help us scale our efforts and identify emerging areas of concern. We will expand this programme by adding 50 expert NGOs to the 63 organisations who are already part of the programme, and we will support them with operational grants.
Third, YouTube will take a tougher stance on videos that don’t clearly violate its policies. Videos that contain inflammatory religious or supremacist content will appear behind a warning screen and will not be monetized, recommended, or eligible for comments or user endorsements. As someone who knows how important those things can be for getting views and being found on YouTube, this should essentially banish such videos to YouTube’s wastelands.
Finally, the fourth point is something I thought was already being done to some extent, and which should definitely be done a whole lot more. I always figured Google knows who is susceptible to extremism: if you’re googling sites classified as hate speech along with instructions for making an explosive, surely that should get flagged by someone. Instead of just monitoring such people, overloading already understaffed law enforcement and blurring the lines of privacy, why not make sure they’re exposed to as much positive propaganda as possible? Well, YouTube is pretty much going to do that:
Building on our successful Creators for Change programme promoting YouTube voices against hate and radicalisation, we are working with Jigsaw to implement the “Redirect Method” more broadly across Europe. This promising approach harnesses the power of targeted online advertising to reach potential Isis recruits, and redirects them towards anti-terrorist videos that can change their minds about joining. In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages.
I’m really glad that they’re doing this. I just also hope they expand the debunking content. I know ISIS is a major threat around the world, but so is rising nationalism and other forms of extremism. I’d hope that over time YouTube elevates moderate content creators who share important stories and facts, to help undo some of the hate speech swirling around the interwebs. Of course, it remains to be seen whether these four approaches make a huge difference, but it’s definitely great to see such a powerful platform taking action to try to make the world a better place.
Last Updated: June 19, 2017