Military Killer Robots 'Could Endanger Civilians'

Telegraph/UK

Action on a global scale must be taken to curb the development of military killer robots that think for themselves, a leading British expert said.

Terminator-style killer robots could lead to a major escalation in civilian deaths, warns Prof Noel Sharkey (Photo: Warner Bros. Pictures)

"Terminator"-style machines that decide how, when and who to kill
are just around the corner, warns Noel Sharkey, Professor of Artificial
Intelligence and Robotics at the University of Sheffield.

Far from helping to reduce casualties, their use is likely to make conflict and war more common and to lead to a major escalation in the number of civilian deaths, he believes.

"I do think there should be some
international discussion and arms control on these weapons but there's
absolutely none," said Prof Sharkey.

"The military have a strange
view of artificial intelligence based on science fiction. The nub of it
is that robots do not have the necessary discriminatory ability. They
can't distinguish between combatants and civilians. It's hard enough
for soldiers to do that."

Iraq and Afghanistan have both provided ideal "showcases" for robot weapons, said Prof Sharkey.

The "War on Terror" declared by President George Bush spurred on the development of pilotless drone aircraft deployed against insurgents.

Initially used for surveillance, drones such as the Predator and the larger Reaper were now armed with bombs and missiles.

The US currently has 200 Predators and 30 Reapers and next year alone will be spending $5.5 billion (£3.29 billion) on unmanned combat vehicles.

Britain had two Predators until one crashed in Iraq last year.

At present these weapons are still operated remotely by humans sitting in front of computer screens. RAF pilots on secondment were among the more experienced controllers used by the US military, while others had only six weeks' training, said Prof Sharkey. "If you're good at computer games, you're in," he added.

But rapid progress was being made towards robots which took virtually all their own decisions and were merely "supervised" by humans.

These would be fully autonomous killing machines reminiscent of those depicted in the "Terminator" films.

"The
next thing that's coming, and this is what really scares me, are armed
autonomous robots," said Prof Sharkey speaking to journalists in
London. "The robot will do the killing itself. This will make decision
making faster and allow one person to control many robots. A single
soldier could initiate a large scale attack from the air and the ground.

"It could happen now; the technology's there."

A step on the way had already been taken by Israel with "Harpy", a pilotless aircraft that flies around searching for an enemy radar signal. When it thinks one has been located and identified as hostile, the drone turns into a homing missile and launches an attack - all without human intervention.

Last year the British aerospace company BAE Systems completed a flying trial with a group of drones that could communicate with each other and select their own targets, said Prof Sharkey. The United States Air Force was looking at the concept of "swarm technology", which involved multiple drone aircraft operating together.

Flying drones were swiftly being joined by armed robot ground vehicles, such as the Talon SWORDS, which bristles with machine guns, grenade launchers, and anti-tank missiles.

However, it was likely to be decades before such robots possessed a human-like ability to tell friend from foe.

Even with human controllers, drones had already caused large numbers of civilian casualties.

As a result of 60 known drone attacks in Pakistan between January 2006 and April 2009, 14 al-Qaeda leaders had been killed, but so had 607 civilians, said Prof Sharkey.

The US was paying teenagers "thousands of dollars" to drop infrared tags at the homes of al-Qaeda suspects so that Predator drones could aim their weapons at them, he added. But often the tags were thrown down randomly, marking out completely innocent civilians for attack.

Prof Sharkey, who insists he is "not a pacifist" and has no anti-war agenda, said: "If we keep on using robot weapons we're going to put civilians at grave risk and it's going to be much easier to start wars. The main inhibitor of wars is body bags coming home.

"People talk about programming the 'laws of
war' into a computer to give robots a conscience, so that if the target
is a civilian you don't shoot. But for a robot to recognise a civilian
you need an exact specification, and one of the problems is there's no
specific definition of a civilian. Soldiers have to rely on common
sense.

"I'm not saying it will never happen, but I know what's out there and it's not going to happen for a long time."
