Report on US Killer Robots Warns of 'Reckless, Dangerous Arms Race'
"All signs point to the Pentagon developing 'killer robots' via Replicator, despite deflections from Pentagon representatives themselves," according to Public Citizen.
A report from the government watchdog Public Citizen released Friday gives the who, what, when, where, and why of the Pentagon's flagship Replicator initiative—a program to increase the number of weapons, particularly drones, in the hands of the U.S. military.
In the report, Public Citizen re-ups concerns about one particular aspect of the program. According to the report's author, Savannah Wooten, the Defense Department has remained ambiguous on the question of whether it is developing artificial intelligence weapons that can "deploy lethal force autonomously—without a human authorizing the specific use of force in a specific context." These types of weapons are also known as "killer robots."
"It is not yet clear whether or not these technologies are designed, tested, or intended for killing," according to the report.
"All signs point to the Pentagon developing 'killer robots' via Replicator, despite deflections from Pentagon representatives themselves," wrote Wooten in the summary of the report.
The program, which was announced last year, is part of the Department of Defense's plan to deter China.
"Replicator is meant to help us overcome [China's] biggest advantage, which is mass. More ships. More missiles. More people," said Deputy Secretary of Defense Kathleen Hicks in a speech announcing the project last year. That mission will be achieved specifically by "mastering the technology of tomorrow," Hicks said.
According to Public Citizen's report, a forthcoming "Replicator 2.0" will focus on counter-drone technologies, per a memo from the defense secretary released in September.
In a letter sent in March, Public Citizen and 13 other civil society groups highlighted remarks Hicks made in 2023 as an example of the ambiguity the Pentagon has created around the issue.
"Autonomous weapons are inherently dehumanizing and unethical, no matter whether a human is 'ultimately' responsible for the use of force or not. Deploying lethal artificial intelligence weapons in battlefield conditions necessarily means inserting them into novel conditions for which they have not been programmed, an invitation for disastrous outcomes," the organizations wrote to Hicks and Secretary of Defense Lloyd Austin.
Wooten's report reiterates that same call: "The Pentagon owes Americans clarity about its own role in advancing the autonomous weapons arms race via Replicator, as well as a detailed plan for ensuring it does not open a Pandora’s Box of new, lethal weapons on the world by refusing to hold its own operations accountable."
Additionally, "'Artificial intelligence' should not be used as a catchall justification to summon billions more in Pentagon spending, especially when the existing annual budget for the U.S. military already dwarfs every other U.S. agency and is careening towards the $1 trillion mark," Wooten wrote.
The fear that these types of weapons would open a Pandora's Box—and set off a "reckless, dangerous arms race," as Public Citizen warned Friday—is not new. Back in 2017, dozens of artificial intelligence and robotics experts published a letter urging the United Nations to ban the development and use of so-called killer robots. As drone warfare has grown, those calls have continued.
The report also highlights public statements by the head of one defense contractor selected to produce weapons for the Replicator initiative as a hint that the program is aimed at creating systems capable of using lethal force autonomously.
In early October, Anduril CEO Palmer Luckey said that "societies have always needed a warrior class that is enthused and excited about enacting violence on others in pursuit of good aims."
"You need people like me who are sick in that way and who don't lose any sleep making tools of violence in order to preserve freedom," he said.