The Perverse Rise of Killer Robots
The development of "killer robots" is a perverse new application of human intelligence. Humans directing machines to kill and destroy on a scale not yet seen is a prospect that not even George Orwell imagined. In the meantime, the leading world powers continue their grim merry-go-round of destruction and death, mostly of innocent civilians, without stopping to consider the consequences of their actions.
Killer robots are fully autonomous weapons that can identify, select, and engage targets without meaningful human control. Although fully developed weapons of this kind do not yet exist, world leaders such as the U.S., the U.K., Israel, Russia, China, and South Korea are already working on creating their precursors.
The U.S. Government Accountability Office reports that in 2012, 76 countries had some drones, and 16 countries already possessed armed ones. The U.S. Department of Defense spends $6 billion every year on the research and development of better drones.
South Korea uses Samsung Techwin security surveillance guard robots in its demilitarized zone with North Korea. Although humans operate these units, the robots have an automatic feature that can detect body heat and fire a machine gun without human intervention.
Israel is developing an armed drone called Harop that can select targets with a special sensor. Northrop Grumman has also developed an autonomous drone called the X-47B. This drone can travel on a preprogrammed flight path while being monitored by a pilot on a ship. It is planned to enter active service by 2019. China is also moving rapidly in this area. In 2012, it already had 27 armed drone models, one of which is an autonomous air-to-air supersonic combat aircraft.
Killer robots are the next step beyond drones, and, as with drones, their potential use raises a host of human rights, legal, and ethical issues. Military officials claim that this hardware protects human life by taking soldiers and pilots out of harm's way. What they don't say, however, is that the lives protected are those of the attacking armies, not those of their mostly civilian targets, whose untimely deaths are euphemistically called collateral damage.
According to Denise Garcia, an expert in international law, four branches of international law have been used to limit violence in war: the law of state responsibility, the law on the use of force, international humanitarian law, and human rights law. As currently carried out, U.S. drone strikes violate all of them.
From an ethical point of view, using these machines presents a moral dilemma. Allowing machines to make life-and-death decisions removes people's responsibility for their actions and eliminates accountability. Lack of accountability almost ensures future human rights violations. In addition, many experts believe that the proliferation of autonomous weapons would make an arms race inevitable.
As the United Nations tries to negotiate the future use of autonomous weapons, U.S. and U.K. representatives are pushing for weaker rules that would prohibit only future technology, exempting killer robots developed during the negotiating period. That loophole would allow existing semi-autonomous prototypes to remain in use.
The need for a pre-emptive ban on developing and using this kind of weapon is urgent. As Christof Heyns, the UN Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, stated recently, "If there is not a pre-emptive ban on the high-level autonomous weapons, then once the genie is out of the bottle, it will be extremely difficult to get it back in."