War and Automated Weapons Systems


The eventual use of Autonomous Weapons Systems (AWS) would remove much of the immediate risk to troops on the battlefield.

However, the question of who would be liable for the actions of these systems was the subject of a talk by Max Weber Fellow Pablo Kalmanovitz, who was speaking at the Autonomous Weapons Systems – Law, Ethics and Policy Conference at the EUI on 25 April.

According to their supporters, such systems could isolate targets more precisely and minimise collateral damage and civilian casualties. However, their actions would still be initiated by human commanders. These individuals would be regarded as liable for the consequences of their actions had they themselves been on the battlefield. Are they similarly responsible for the weapons they deploy in battle?

“It would be reckless to deploy a weapon without knowing the consequences, therefore there can be liability,” said Kalmanovitz.

In criminal law an individual can be held accountable for reckless or negligent behaviour that leads to injury or death. Kalmanovitz suggested that a similar clarification of liability would ensure that due process was followed in the deployment of such weapons.

“If commanders know they would be liable, procedures and controls will be followed.”

Much of the debate is currently skewed by the uneven distribution of such weapons. With one side having externalised the risk to its troops and facing little fear of reprisals, there is little incentive for states to accept international controls and guidelines.

Such attitudes might change, Kalmanovitz suggested, if an AWS arms race extended their use to countries more likely to come into combat with each other. Another route to change would be a public naming and shaming campaign.

While the risk to troops is dramatically reduced through the use of AWS, that risk is not eliminated; it is merely shifted to other spheres. The use of AWS carries economic, moral, political and diplomatic costs, as the US government is currently discovering with its drone campaign in Pakistan and Afghanistan.

“Ultimately these are questions of information: what you know and what it is reasonable to know,” added Professor Nehal Bhuta. “Without this information, you can’t set international standards.”

Pablo Kalmanovitz and Nehal Bhuta were speaking at the Autonomous Weapons Systems – Law, Ethics and Policy Conference co-financed by the Fritz Thyssen Foundation.