Terrorists could gain access to autonomous weapons tech.

Remote-controlled drones are now readily available to the public.

Autonomous weaponry, once developed, may be the next weapon in a terrorist's arsenal.

Autonomous weapons use artificial intelligence and would be capable of selecting and eliminating targets based on pre-determined criteria programmed by a human being. Unlike current technology such as drones, these machines would make judgements themselves, operating essentially without supervision once released into an appropriate situation.

The development of autonomous weapon technology is now public knowledge; the bigger issue is what that technology could be capable of if developed within non-state organisations.

The International Committee for Robot Arms Control (ICRAC) is an organisation committed to the peaceful use of robotics, and its members are strongly opposed to the use of autonomous weaponry. Steve Wright, a member of ICRAC, responded to a question about the possibility of terror organisations gaining and using these weapons on home soil saying: “There is a fermenting anger it’s too dangerous to jump you will end up with a paradigm shift.” He also said: “But I hope to God I am wrong, and you are just pontificating about possible technology.”

An open letter signed in 2015 by leading figures such as inventor Elon Musk and the British artificial intelligence company DeepMind warned the United Nations that starting a new AI arms race would be a poor idea and urged the UN to ban offensive autonomous weaponry. The dilemma is that an international ban binds only the high contracting parties, which are states; the technology could still find its way into non-state organisations such as security firms, border control or extremist groups.

Ariel Conn, Director of Media and Outreach at the Future of Life Institute, shared her organisation's thoughts on autonomous weaponry:

“If a country can send in autonomous weapons to fight, instead of humans, then it becomes easier to enter into war. But then we also have to worry about whether the autonomous weapon can recognize nuance and know whether someone is really a target or if they just look like a target.”

With the threat of terror already looming, is there a likelihood that terrorist organisations could gain access to this type of weapons technology?

“Yes. We’re already seeing instances of people taking drones that exist today and adding guns or bombs to them.”

A Washington Post article details the recent use of rudimentary autonomous weapons by terrorist organisations such as the prolific terrorist group ISIS:

https://www.washingtonpost.com/world/national-security/use-of-weaponized-drones-by-isis-spurs-terrorism-fears/2017/02/21/9d83d51e-f382-11e6-8d72-263470bf0401_story.html?utm_term=.ed57c2ad8d1c

Terrorists have in recent times used barbaric yet primitive techniques to inflict damage on property and, more importantly, people.

That such advanced technology could fall into the hands of non-state groups is a real possibility.

Guilluame Fournier, author of a Facebook page on autonomous weaponry, said:

“So far only states would have the ability to create such weapons. If such weapons are developed in the future they would be potentially exposed to hacking by other states or even non state actors such as terrorist groups.”

Mr Fournier's opinion was that the link between terror and autonomous weapons could be forged through hacking:

“These weapons could therefore if they are hacked fall in the hands of such groups. If these weapons are hacked it would be possible for states or non-state armed groups to turn them against soldiers or civilians.”

“From my point of view the only way terrorist groups could get their hands on autonomous weapons would be if they hack and take control of them.”

The arguments for and against autonomous weaponry may never reach a decisive conclusion, but limiting the scale of these weapons and restricting their use would reduce the threat of non-state groups gaining access to the technology.