Fully autonomous weapons, or "killer robots," present a legal and ethical quagmire and must be banned before they can be further developed, a new human rights report published Thursday urges ahead of next week's United Nations meeting on lethal weapons.
The report, titled Mind the Gap: The Lack of Accountability for Killer Robots, was jointly published by Human Rights Watch and Harvard Law School's International Human Rights Clinic and outlines the "serious moral and legal concerns" presented by the weapons, which would "possess the ability to select and engage their targets without meaningful human control."
Although fully autonomous weapons do not yet exist, their "precursors" are already in use, such as the Iron Dome in Israel and the Phalanx CIWS in the U.S., the report states.
Under current law, the makers and users of killer robots could escape accountability for unlawful deaths and injuries if development of the weapons is allowed to continue. Allowing weapons that operate without human control to make decisions about the use of lethal force could lead to violations of international law and make it difficult to hold anyone accountable for those crimes. Moreover, civil liability would be "virtually impossible, at least in the United States," the report found.
"No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party," lead author and HRW Arms Division researcher Bonnie Docherty said in a press release on Thursday. "The many obstacles to justice for potential victims show why we urgently need to ban fully autonomous weapons."
The UN will discuss killer robots alongside more conventional arms at its upcoming meeting on inhumane weapons in Geneva, Switzerland, from April 13-17. In the past, the UN has used the gathering to preemptively ban military tools such as blinding lasers.
The report urges the UN to take similar action on fully autonomous weapons, stating:
In order to preempt the accountability gap that would arise if fully autonomous weapons were manufactured and deployed, Human Rights Watch and Harvard Law School's International Human Rights Clinic (IHRC) recommend that states:
- Prohibit the development, production, and use of fully autonomous weapons through an international legally binding instrument.
- Adopt national laws and policies that prohibit the development, production, and use of fully autonomous weapons.
Docherty concluded, "The lack of accountability adds to the legal, moral, and technological case against fully autonomous weapons and bolsters the call for a preemptive ban."