
A woman mourns as she carries the body of a child killed in Israeli attacks on Sheikh Ridwan neighborhood to the Al-Ahli Arab Hospital in Gaza City, Gaza on November 15, 2024.
Israeli Use of AI Weapons in Gaza Denounced as Blatant Violation of Treaty Obligations
"As the Israeli military weaponizes AI for oppression, the world must confront this hypocrisy and hold it accountable to save lives," said Jewish Voice for Peace.
Amid reporting that the Israel Defense Forces is using an artificial intelligence weapons system touted as improving "operator lethality," Jewish Voice for Peace said Israel's use of technological warfare is "nothing new"—but pointed out that the new reports follow the country's signing of "the first global 'safety' AI treaty."
The Framework Convention on Artificial Intelligence, spearheaded by the Council of Europe, was signed by Israel in September, and "claims to be a legal framework governing AI systems to mitigate risks to human rights, democracy, and the rule of law—but the Israeli military is using AI to do precisely the opposite," said JVP.
The Jewish-led advocacy group spoke out after Middle East Eye (MEE) reported that the IDF has been using a weapons system in Gaza that came out of a collaboration between Israel Weapons Industries (IWI) and the Indian company Adani Defence & Aerospace.
The AI system, Arbel, was first unveiled at a defense expo in Gandhinagar, Gujarat in October 2022. MEE reported it is capable of turning "machine guns and assault rifles into computerized killing machines," using algorithms to increase Israeli soldiers' chances of hitting targets with accuracy and "efficiency."
The huge death toll in Gaza—which is conservatively estimated to be about 44,000 but which nearly 100 medical professionals estimated to be more than 118,000 last month, based on their experiences in hospitals there—has previously been linked to Israel's use of AI.
As Common Dreams reported in April, an AI machine called Lavender has been used by the IDF to devise "kill lists," with the military deeming 100 civilian deaths for every Hamas official an acceptable error rate. In December, the Israeli outlets +972 Magazine and Local Call reported that another AI machine called Gospel has been used to target dozens of buildings per day.
Defense analysts told MEE that Arbel has likely been used to "carry out the carnage of Palestinians in a more efficient manner in Gaza."
Antony Loewenstein, an independent journalist who tracks the use of technological warfare in Gaza and the West Bank, told MEE that as Israel has used numerous AI systems over the past 13 months, "targeting civilians was the point. It was never about just going after Hamas."
"I have spoken to people in Gaza, I have seen the direct human impact of this kind of killing," he told the outlet. "It is horrific."
Research analyst Noah Sylvia of the Royal United Services Institute in London told MEE that the IDF "has demonstrated a disregard for civilian life in Gaza to the point of routinely targeting children with small arms, meaning that Arbel could easily be used to make the killing of civilians, of children, more efficient."
The impact of the AI system depends "on the military's operating procedures and commitment to international humanitarian law," said Sylvia.
Humanitarian groups and human rights experts have said Israel is blatantly disregarding international law with its near-total blockade of aid into Gaza and its attacks—some with U.S. weapons—on civilian infrastructure.
JVP said the IDF's use of Arbel indicates Israel is also violating the Framework Convention on Artificial Intelligence, and noted reports that the military has also used "facial recognition technology and advanced weapons to monitor, silence dissent, cut internet access, and obscure its war crimes."
Marwa Fatafta, Middle East policy and advocacy director for Access Now, warned that Israel's partnership with India—which reportedly ranks sixth in the world for AI investment, at $7.73 billion—may provide "a new and terrifying blueprint for tech-enabled warfare... this time through Indian-Israel military tech."
"Rarely does a technology stay dormant in one location," Fatafta said. "The lawlessness and impunity in which Israel commits egregious crimes with the use of AI should terrify everyone."
- Probe Documents Israeli Use of AI-Equipped 'Small Killer Drones' Against Civilians in Gaza ›
- IDF Allowed 100 Civilian Deaths for Every Hamas Official Targeted by Error-Prone AI System ›
- Israel's AI-Aided Bomb Targeting Creates Massacre 'Factory' in Gaza ›