New Campaign Aims to Halt Rise of 'Killer Robots'

Published by Common Dreams

What was once science fiction is not as far off as people think, say Nobel laureates and scientists

by Jon Queally, staff writer

A scene from the 2003 film Terminator 3: Rise of the Machines. Science fiction? The Campaign to Stop Killer Robots is organizing a push to preemptively ban the deployment of 'fully autonomous and weaponized' machines by world governments. (Photograph: Observer)

Under the simple but provocative header of The Campaign to Stop Killer Robots, human rights advocates and concerned scientists are readying the launch of a new international effort to preemptively ban the deployment of 'fully autonomous and weaponized' machines by world governments.

Arguing that the science is not as far off as some people might suppose, the campaigners say they must build support for stronger legal mechanisms and raise awareness before, not after, such technologies are unleashed on battlefields or among civilian populations.

Following the widespread use of unmanned aerial drones by the US military in war zones across the globe, many critics see the deepening dependence on robotic weapons as a distressing sign of a future in which wars will be fought from a control booth and killing becomes more automated, perhaps even fully automated. Without a proper legal or moral framework, they say, the consequences would be disastrous.

The Guardian reports that the Campaign to Stop Killer Robots "will be launched in April at the House of Commons and includes many of the groups that successfully campaigned to have international action taken against cluster bombs and landmines. They hope to get a similar global treaty against autonomous weapons."

Among others, the campaign is being organized by Professor Noel Sharkey, an artificial intelligence and robotics expert at Sheffield University in the UK, and US Nobel laureate Jody Williams, who received the peace prize for her work to ban landmines.

"These things are not science fiction; they are well into development," Sharkey told the Guardian in an interview. "The research wing of the Pentagon in the US is working on the X47B [unmanned plane] which has supersonic twists and turns with a G-force that no human being could manage, a craft which would take autonomous armed combat anywhere in the planet.

"In America they are already training more drone pilots than real aircraft pilots, looking for young men who are very good at computer games. They are looking at swarms of robots, with perhaps one person watching what they do."

"This is going to be big, big money," Sharkey continued. "But actually there is no transparency, no legal process. The laws of war allow for rights of surrender, for prisoner of war rights, for a human face to take judgments on collateral damage. Humans are thinking, sentient beings. If a robot goes wrong, who is accountable? Certainly not the robot."

The call for a ban follows a report published last year by Human Rights Watch, "Losing Humanity: The Case Against Killer Robots," which described machines under development as being capable of choosing and firing "on targets without human intervention."

“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, Arms Division director at Human Rights Watch. “Human control of robotic warfare is essential to minimizing civilian deaths and injuries.”

And Jody Williams, who chairs the Nobel Women's Initiative, says that acting preemptively is key.

"Killer robots loom over our future if we do not take action to ban them now," she said. "The six Nobel peace laureates involved in the Nobel Women's Initiative fully support the call for an international treaty to ban fully autonomous weaponized robots."

"I know we can do the same thing with killer robots. I know we can stop them before they hit the battlefield," she said.

Video from Human Rights Watch, "Pull the Plug on Killer Robots":
