Experts in robotics and artificial intelligence, human rights advocates, lawyers, and even U.S. military leaders have recently expressed concerns about autonomous weapons. (Photo: Sharron Ward/Campaign to Stop Killer Robots/Facebook)
More than 100 robotics and artificial intelligence (AI) experts published on Monday an open letter urging the United Nations to ban the development and use of killer robots.
"Lethal autonomous weapons threaten to become the third revolution in warfare"
--letter from tech experts
"Lethal autonomous weapons threaten to become the third revolution in warfare," states the letter (pdf), signed by 116 tech experts from 26 countries, including SpaceX and Tesla Motors founder Elon Musk, and Google AI expert Mustafa Suleyman.
Emphasizing a need to act urgently, the letter states:
Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close.
More than a dozen countries--including the United States, China, Israel, South Korea, Russia, and the United Kingdom--are developing autonomous weapons, according to Human Rights Watch. And as Common Dreams reported last summer, concerns about autonomous weapons transcend warzones. Legal experts raised alarms in July 2016 when police in Dallas, Texas, used an armed robot to kill a suspected shooter.
"Unlike other potential manifestations of AI, which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability," signatory Ryan Gariepy, the founder of Clearpath Robotics, told CNN.
In response to rapidly advancing technology, the U.N. Conference of the Convention on Certain Conventional Weapons (UNCCW) had scheduled formal discussions about autonomous weapons--including drones, tanks, and automated machine guns--for its Group of Governmental Experts on Lethal Autonomous Weapons Systems. Though the group was set to meet Monday, the meeting was cancelled because some states did not make their financial contributions to the U.N. The next meeting is currently scheduled for November.
Frustrated by the cancellation, the tech experts implore the UNCCW to "work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies."
The letter was released at the opening of the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne on Monday. Two years ago, IJCAI was also used as a platform to "launch an open letter signed by thousands of AI and robotics researchers including Musk and Stephen Hawking similarly calling for a ban, which helped push the UN into formal talks on the technologies," The Guardian reports.
"Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend."
--letter
More than 90 countries are expected to participate in the November meeting, along with key U.N. agencies and the International Committee of the Red Cross, according to the Campaign to Stop Killer Robots--which also expressed frustration over the cancelled first meeting and urged states "to begin negotiations on a new CCW protocol by the end of 2018 that preemptively bans fully autonomous weapons."
Just last month, members of the U.S. Senate heard similar sentiments from the second-highest ranking general in the U.S. military. As Common Dreams reported, Gen. Paul Selva addressed the dangers of killer robots at his confirmation hearing before the Senate Armed Services Committee, concluding: "I don't think it's reasonable for us to put robots in charge of whether or not we take a human life."