Apr 09, 2015
Fully autonomous weapons, or "killer robots," present a legal and ethical quagmire and must be banned before they can be further developed, a new human rights report published Thursday urges ahead of next week's United Nations meeting on lethal weapons.
The report, titled Mind the Gap: The Lack of Accountability for Killer Robots, was jointly published by Human Rights Watch and Harvard Law School's International Human Rights Clinic and outlines the "serious moral and legal concerns" presented by the weapons, which would "possess the ability to select and engage their targets without meaningful human control."
Although fully autonomous weapons do not yet exist, their "precursors" are already in use, such as the Iron Dome in Israel and the Phalanx CIWS in the U.S., the report states.
Under current law, the makers and users of killer robots could escape liability for unlawful deaths and injuries if development of the weapons is allowed to proceed. Allowing weapons that operate without human control to make decisions about the use of lethal force could lead to violations of international law and make it difficult to hold anyone accountable for those crimes. Moreover, civil liability would be "virtually impossible, at least in the United States," the report found.
"No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party," lead author and HRW Arms Division researcher Bonnie Docherty said in a press release on Thursday. "The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons."
The UN will discuss killer robots and more conventional arms at its upcoming meeting on inhumane weapons in Geneva, Switzerland, from April 13-17. In the past, the UN has used the gathering to preemptively ban military tools such as blinding lasers.
The report urges the UN to take similar action on fully autonomous weapons, stating:
In order to preempt the accountability gap that would arise if fully autonomous weapons were manufactured and deployed, Human Rights Watch and Harvard Law School's International Human Rights Clinic (IHRC) recommend that states:
- Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument.
- Adopt national laws and policies that prohibit the development, production, and use of fully autonomous weapons.
Docherty concluded, "The lack of accountability adds to the legal, moral, and technological case against fully autonomous weapons and bolsters the call for a preemptive ban."
Nadia Prupis
Nadia Prupis is a former Common Dreams staff writer. She wrote on media policy for Truthout.org and has been published in New America Media and AlterNet. She graduated from UC Santa Barbara with a BA in English in 2008.