
The AI arms race could be the source of the next big war


The question of when the next world war will happen has been a subject of debate ever since World War II ended. Since then, people have speculated about everything from a nuclear holocaust to a massive chemical war or even a zombie outbreak as the catalyst, with China, Russia, North Korea, the Middle East or the US as the key instigators. If a new report published by the Center for a New American Security (CNAS) is anything to go by, though, it could well be those pesky robots after all. Perhaps James Cameron is a prophet and not just a movie visionary.

The report claims that Chinese officials see the global scramble to build AI as an arms race and a security risk that could lead to conflict – in particular, conflict sparked completely unintentionally by AI getting things wrong and escalating a situation beyond what either side had intended. It's a scenario that Gregory C. Allen, an adjunct senior fellow at CNAS and author of the report, described in a recent interview with The Verge:

The specific scenario described to me [by one anonymous Chinese official] is unintentional escalation related to the use of a drone… The point made to me was that it’s not clear how either side will interpret certain behaviours [involving autonomous equipment], the side sending out an autonomous drone will think it’s not a big deal because there’s no casualty risk, while the other side could shoot it down for the same reason. But there’s no agreed framework on what message is being sent by either sides’ behaviour.

In both the report and the article, Allen goes on to explain how militaries are using drones, large and small, that can not only fly themselves to specific targets and perform simple tasks, but are also being given the autonomy to use lethal force should they deem it necessary. China already has one such drone, the Blowfish A2.

It's frighteningly easy to see this happening. Yes, companies are going to great lengths to improve AI and machine learning capabilities, and elements of these have certainly proven more reliable than human minds. But arming machines with weapons and expecting them to reliably tell the difference between a real threat and an innocent mistake is asking a lot, especially considering that we humans fall short of this even at the best of times.

Scarier still, Allen's research found that even though a lot of progress has already been made, especially by the US and China – the only two countries believed to have autonomous drones – not nearly enough is understood about the technology, which only makes the likelihood of something going wrong even higher.

There are definitely pockets of real expertise on this issue (in the US) but there’s not the widespread comprehension there needs to be

Let's hope that governments come to their senses and keep some level of human decision-making in the process for the time being, before handing things over to the robots. Like waiting for the birth of John Connor.

However, as Allen explains further, it's not all down to the AI itself; perceived risk is also a factor. Because drones remove the risk to human life, militaries may be more inclined to push the boundaries with them, triggering further escalation as a result.

I understand that governments will always feel the need to arm themselves, but the thought of what AI-driven weaponry could mean is certainly a scary one. Of course, robots and AI could also solve all the world's problems and lead to world peace, but that's not profitable, so I unfortunately struggle to see arms companies even allowing for that possibility in their programming.

Last Updated: February 8, 2019

8 Comments

  1. Admiral Chief

    February 8, 2019 at 12:00

    Each nation should bring their “champion”

    Winner gets cake, loser GOES TO GULAG


  2. G8crasha

    February 8, 2019 at 08:00

I can believe this. Our ability to reason separates us from other natural creatures and AI robots. When you are separated from the real world via some proxy (i.e. drones/robots), do you still consider the actions you commit via these proxies in the same light as you would if you were performing them in person? Maybe you do, maybe you don't, but understandably, everyone has a different moral compass, so while one person wouldn't kill someone else via some proxy because it would feel like they were physically there doing it, another person would because it doesn't seem real to them.


  3. Guz

    February 8, 2019 at 08:19

    All hail our new AI overlords o/o/o/ !! Skynet is coming !! Jokes aside, scary stuff…..

