A hexacopter takes aerial photos at sunset.
"All signs point to the Pentagon developing 'killer robots' via Replicator, despite deflections from Pentagon representatives themselves," according to Public Citizen.
A report from the government watchdog Public Citizen released Friday gives the who, what, when, where, and why of the Pentagon's flagship Replicator initiative—a program to increase the number of weapons, particularly drones, in the hands of the U.S. military.
In the report, Public Citizen re-ups concerns about one particular aspect of the program. According to the report's author, Savannah Wooten, the Defense Department has remained ambiguous on the question of whether it is developing artificial intelligence weapons that can "deploy lethal force autonomously—without a human authorizing the specific use of force in a specific context." These types of weapons are also known as "killer robots."
"It is not yet clear whether or not these technologies are designed, tested, or intended for killing," according to the report.
"All signs point to the Pentagon developing 'killer robots' via Replicator, despite deflections from Pentagon representatives themselves," wrote Wooten in the summary of the report.
The program, which was announced last year, is part of the Department of Defense's plan to deter China.
"Replicator is meant to help us overcome [China's] biggest advantage, which is mass. More ships. More missiles. More people," said Deputy Secretary of Defense Kathleen Hicks in a speech announcing the project last year. That mission will be achieved specifically by "mastering the technology of tomorrow," Hicks said.
There will soon be a "Replicator 2.0" that will focus on counter-drone technologies—per a memo from the defense secretary released in September—according to Public Citizen's report.
In a letter sent in March, Public Citizen and 13 other civil society groups highlighted remarks Hicks made in 2023 as an example of the ambiguity the Pentagon has created around the issue.
"Autonomous weapons are inherently dehumanizing and unethical, no matter whether a human is 'ultimately' responsible for the use of force or not. Deploying lethal artificial intelligence weapons in battlefield conditions necessarily means inserting them into novel conditions for which they have not been programmed, an invitation for disastrous outcomes," the organizations wrote to Hicks and Secretary of Defense Lloyd Austin.
Wooten's report reiterates that same call: "The Pentagon owes Americans clarity about its own role in advancing the autonomous weapons arms race via Replicator, as well as a detailed plan for ensuring it does not open a Pandora’s Box of new, lethal weapons on the world by refusing to hold its own operations accountable."
Additionally, "'Artificial intelligence' should not be used as a catchall justification to summon billions more in Pentagon spending, especially when the existing annual budget for the U.S. military already dwarfs every other U.S. agency and is careening towards the $1 trillion mark," Wooten wrote.
The fear that these types of weapons would open a Pandora's Box—and set off a "reckless, dangerous arms race," as Public Citizen warned Friday—is not new. Back in 2017, dozens of artificial intelligence and robotics experts published a letter urging the United Nations to ban the development and use of so-called killer robots. As drone warfare has grown, those calls have continued.
The report also points to public statements by the head of one defense contractor selected to produce for the Replicator initiative as a hint that the program is aimed at creating weapons capable of autonomous lethal force.
In early October, Anduril CEO Palmer Luckey said that "societies have always needed a warrior class that is enthused and excited about enacting violence on others in pursuit of good aims."
"You need people like me who are sick in that way and who don't lose any sleep making tools of violence in order to preserve freedom," he said.