The growing use of unmanned aircraft in combat situations raises huge moral and legal issues, and threatens to make war more likely as armed robots take over from human beings, according to an internal study by the Ministry of Defence.
The report warns of the dangers of an "incremental and involuntary journey towards a Terminator-like reality", referring to James Cameron's 1984 movie, in which humans are hunted by robotic killing machines. It says the pace of technological development is accelerating at such a rate that Britain must quickly establish a policy on what will constitute "acceptable machine behaviour".
"It is essential that before unmanned systems become ubiquitous (if it is not already too late) ... we ensure that, by removing some of the horror, or at least keeping it at a distance, we do not risk losing our controlling humanity and make war more likely," warns the report, titled The UK Approach to Unmanned Aircraft Systems. MoD officials have never before grappled so frankly with the ethics of the use of drones. The report was ordered by Britain's defence chiefs, and coincides with continuing controversy about drones' use in Afghanistan, and growing Pakistani anger at CIA drone attacks against suspected insurgents on the Afghan borders.
It states that "the recent extensive use of unmanned aircraft over Pakistan and Yemen may already herald a new era". Referring to descriptions of "killer drones" in Afghanistan, it notes that "feelings are likely to run high as armed systems acquire more autonomy".
The insurgents "gain every time a mistake is made", enabling them to cast themselves "in the role of underdog and the west as a cowardly bully that is unwilling to risk his own troops, but is happy to kill remotely", the report adds.
Pakistan last week demanded that the US stop drone strikes and that the CIA drastically reduce the number of its officers in the country. David Cameron said in December that British drones had killed 124 insurgents in Afghanistan since June 2008, hailing them as a "classic example of a modern weapon which is necessary for today's war". The drones, known as Reapers, have to date fired 167 missiles and bombs in Afghanistan.
The report was drawn up last month by the ministry's internal thinktank, the Development, Concepts and Doctrine Centre (DCDC), based in Shrivenham, Wiltshire, which is part of MoD central staff. The centre's reports are sent to the most senior officers in all three branches of the armed forces and influence policy and strategy.
The concept of "fighting from barracks" or the "remote warrior" raises such questions as whether a person operating the drones - sometimes from thousands of miles away and "walking the streets of his home town after a shift" - is a legitimate target as a combatant. "Do we fully understand the psychological effects on remote operators of conducting war at a distance?" ask the officials. There is one school of thought, they note, that suggests that for war to be moral, as opposed to just legal, "it must link the killing of enemies with an element of self-sacrifice, or at least risk to oneself".
"The role of the human in the loop has, before now, been a legal requirement which we now see being eroded," the MoD report warns. It asks: "What is the role of the human from a moral and ethical standpoint in automatic systems? ... To a robotic system, a school bus and a tank are the same - merely algorithms in a programme ... the robot has no sense of ends, ways and means, no need to know why it is engaging a target." Chris Cole, a campaigner who runs the Drone Wars UK website, which monitors the development of unmanned weapons systems, welcomed the MoD study while calling for a halt to the use of drones by British forces.
"There needs to be an open and public discussion about the implications of remote warfare, and it may be that a parliamentary select committee inquiry would be the appropriate forum to begin this discussion," he said. The report notes that the MoD "currently has no intention to develop systems that operate without human intervention in the weapon command and control chain".
However, the MoD, like the Pentagon, is keen to develop more and more sophisticated "automated" weapons, it admits.
The report also identifies advantages of unmanned weapons systems, such as avoiding the potential loss of aircrew lives, which it says means their use "is thus in itself morally justified". It adds: "Robots cannot be emotive, cannot hate. A robot cannot be driven by anger to carry out illegal actions such as those at My Lai [the massacre by US troops of hundreds of unarmed civilians in South Vietnam in March 1968].
"In theory, therefore," says the MoD study, "autonomy should enable more ethical and legal warfare. However, we must be sure that clear accountability for robotic thought exists, and this raises a number of difficult debates. Is a programmer guilty of a war crime if a system error leads to an illegal act? Where is the intent required for an accident to become a crime?"
The technology
The US-manufactured General Atomics Reaper is currently the RAF's only armed unmanned aircraft. It can carry up to four Hellfire missiles and two 230kg (500lb) GBU-12 Paveway II laser-guided bombs. It can fly for more than 18 hours, has a range of 3,600 miles, and can operate at up to 15,000 metres (50,000ft).
The Reaper is operated by RAF personnel based at Creech in Nevada. It is controlled via a satellite datalink. Earlier this year, David Cameron promised to increase the number of RAF Reapers in Afghanistan from four to nine, at an estimated cost of £135m.
The MoD is also funding the development by BAE Systems of a long-range unmanned aircraft, called Taranis, designed to fly at "jet speeds" between continents while controlled from anywhere in the world using satellite communications.
Richard Norton-Taylor