Red Cross sounds alarm over use of 'killer robots' in future wars


File: The Red Cross is a humanitarian organization that provides emergency assistance, disaster relief, and disaster preparedness education in the United States.

NAIROBI - Countries must agree on strict rules on "killer robots" - autonomous weapons that can kill without human involvement - a top Red Cross official has said, amid growing ethical concerns over their use in future wars.

Semi-autonomous weapons systems, from drones to tanks, have for decades been used to eliminate targets in modern-day warfare - but they all have human control behind them.

With rapid advancements in artificial intelligence, humanitarians fear it could be used to develop machines that independently decide whom to kill.

Yves Daccord, director-general of the International Committee of the Red Cross (ICRC), said this would be a critical issue in the coming years, as it raises ethical questions about delegating lethal decisions to machines and about accountability.


"We will have weapons which fly without being remotely managed by a human and have enough intelligence to locate a target and decide whether it is the right person to take out," Daccord told the Thomson Reuters Foundation in an interview.

"There will be no human making that decision, it will be the machine deciding - the world will essentially be delegating responsibility to an algorithm to decide who is the enemy and who is not, and who gets to live and who gets to die."

In 1949, the ICRC initiated the international adoption of the four Geneva Conventions that lie at the core of international humanitarian law.

Since then, it has urged governments to adapt international humanitarian law to changing circumstances, in particular to modern developments in warfare, so as to provide more effective protection and assistance for conflict victims.

TO BAN OR NOT TO BAN

A global survey published on Tuesday by Human Rights Watch and the Campaign to Stop Killer Robots, a global coalition of NGOs, found that six out of ten people polled across 26 countries oppose the development of fully autonomous lethal weapons.

The study, conducted by Ipsos, surveyed 18,795 people in 26 countries including Brazil, India, the United States, Britain, China, South Africa, Japan and Israel.

Daccord said autonomous weapons crossed a moral threshold, as machines lack the human characteristics, such as compassion, necessary to make complex ethical decisions.

They lacked the human judgment to evaluate whether an attack was a proportional response, to distinguish civilians from combatants, and to abide by core principles of international humanitarian law, he added.

The issue of "killer robots" has divided humanitarians.


United Nations Secretary-General Antonio Guterres has called for a complete ban, while other organisations, such as the ICRC, are advocating strict regulation.

"We should not go for banning, but I am of the opinion that we have to keep a level of human control over such weapons. This means that, at any time of the operation, a human can intervene," said Daccord.

"There are no guidelines regarding their use and they have not even been defined yet, so we have to create a common grammar between states and develop guidelines, or treaty law."

The rules would address issues such as the definition of autonomous weapons, the level of human supervision over these weapons such as the ability to intervene and deactivate, as well as the operational conditions for their use, says the ICRC.

Supporters of autonomous weapons argue they will make war more humane: they will be more precise in identifying and eliminating targets, will not fall prey to human emotions such as fear or vengeance, and will minimise civilian deaths, they say.


But Daccord said such machines could malfunction, and this raised questions over who would be held responsible.

"You can hold people accountable under international humanitarian law with remotely managed weapons such as drones. With autonomous weapons, we are moving into new territory," he said.

"There is a process underway, but we have to get countries together to agree on a common text which is not easy. It's better they start to negotiate now and find an agreement than wait for a major disaster."

Source: Reuters