People mourn as they receive the dead bodies of victims of an Israeli strike on March 29, 2024 in Rafah, Gaza.

(Photo: Ahmad Hasaballah/Getty Images)

IDF Allowed 100 Civilian Deaths for Every Hamas Official Targeted by Error-Prone AI System

"The IDF bombed them in homes without hesitation, as a first option," an Israeli intelligence officer said. "It's much easier to bomb a family's home. The system is built to look for them in these situations."

As the Biden administration issued its latest assurance that officials are pushing Israel to be "precise" in its selection of targets in Gaza, an investigation into the Israel Defense Forces' use of a previously undisclosed artificial intelligence system found that the mechanism has replaced "human agency and precision" with "mass target creation and lethality."

Four months after +972 Magazine and Local Call detailed the IDF's use of an AI system called the Gospel, which generates dozens of buildings and structures for the military to target in a single day, the two Israeli media outlets revealed that another AI machine, called Lavender, has also played a part in Israel's slaughter of at least 32,975 Palestinians since October.

Unlike the Gospel, reported +972's Yuval Abraham, "Lavender marks people—and puts them on a kill list."

Abraham spoke to six Israeli intelligence officers who have served in the IDF during Israel's current assault on Gaza and have had "first-hand involvement with the use of AI to generate targets for assassination."

"Lavender has played a central role in the unprecedented bombing of Palestinians, especially during the early stages of the war," wrote Abraham. "In fact, according to the sources, its influence on the military's operations was such that they essentially treated the outputs of the AI machine 'as if it were a human decision.'"

"It was known in advance that 10% of the human targets slated for assassination were not members of the Hamas military wing at all."

The system was designed to mark suspected military operatives in Hamas and Palestinian Islamic Jihad (PIJ) as potential bombing targets.

But the IDF gave "sweeping approval for officers to adopt Lavender's kill lists" in the early days of Israel's bombardment campaign, with "no requirement to thoroughly check why the machine made those choices or to examine the raw intelligence data on which they were based."

Before bombing the houses of suspected armed group members, intelligence officers conducted only a brief check of the data to confirm that the AI-selected target was male.

"A human being had to [verify the target] for just a few seconds," a source identified as B. told +972. "At first, we did checks to ensure that the machine didn't get confused. But at some point we relied on the automatic system, and we only checked that [the target] was a man—that was enough. It doesn't take a long time to tell if someone has a male or a female voice."

"I would invest 20 seconds for each target at this stage, and do dozens of them every day," B. added. "I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If [the operative] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage."

B. and the other sources acknowledged that the system was prone to error.

If the target had given his phone, which Lavender used to identify Hamas and PIJ suspects, to "his son, his older brother, or just a random man," said B., "that person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender."

The system marked 37,000 Palestinians—and their homes, with family members potentially inside—as targets in the first weeks of the war, when the IDF "almost completely relied on Lavender."

The intelligence officers told Abraham that the IDF was "not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity," despite Israel's persistent claims that it targets military outposts and other noncivilian infrastructure.

"On the contrary, the IDF bombed them in homes without hesitation, as a first option," an officer identified as A. told +972 and Local Call. "It's much easier to bomb a family's home. The system is built to look for them in these situations."

As a result of the military's reliance on Lavender, wrote Abraham, "thousands of Palestinians—most of them women and children or people who were not involved in the fighting—were wiped out by Israeli airstrikes, especially during the first weeks of the war."

The investigation also found that, according to two of the sources, the IDF decided in the early weeks of the war that "for every junior Hamas operative that Lavender marked, it was permissible to kill up to 15 or 20 civilians"—an unprecedented approach by Israel to so-called "collateral damage."

"A ratio of 20 civilians killed for one target works out to about 95% civilian deaths," said enterpreneur Arnaud Bertrand.

For senior Hamas officials who were targeted, the army authorized the killing of more than 100 civilians.

IDF officers, including the sources, knew that Lavender's target identifications were accurate only about 90% of the time, +972 and Local Call reported.

"In other words, it was known in advance that 10% of the human targets slated for assassination were not members of the Hamas military wing at all," wrote Abraham.

One source defended the military's use of Lavender, saying investing "manpower and time" in analyzing whether a suspected junior militant is a legitimate target was not worthwhile.

"In war, there is no time to incriminate every target," said the intelligence officer. "So you're willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it."

While collateral damage has long been a reality in violent conflicts, parties to a conflict are bound by international humanitarian law, which requires them to distinguish between civilians and combatants.

Alex Hanna, director of research for the Distributed AI Research Institute, said the report illustrated the possible "future of AI warfare" for the U.S. and other powerful countries.

Bertrand suggested the "disturbing" report called to mind the Nazis' methodical massacre of Jewish people during the Holocaust.

"It's industrialized extermination," said Bertrand, "the likes of which we haven't seen since... you know when."
