

Palestinians gather at Yarmouk Stadium in Gaza City, Gaza to connect to the network and use the internet on January 22, 2024.
"The Israeli Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join," said a Palestinian digital rights group.
The Palestinian digital rights group Sada Social on Saturday called for an investigation into Israel's alleged use of WhatsApp user data to target Palestinians with its AI system, Lavender.
The group, which is affiliated with the Al Jazeera Media Institute and Access Now, accused Meta, which owns WhatsApp, of fueling "the 'Lavender' artificial intelligence system used by the Israeli military to kill Palestinian individuals within the Gaza enclave."
As Common Dreams reported in April, the Israel Defense Forces has relied on AI systems including Lavender to target people Israel believes to be Hamas members.
At +972 Magazine, Israeli journalist Yuval Abraham wrote that a current commander of an elite Israeli intelligence unit pushed for the use of AI to choose targets in Gaza. In a guidebook on creating such a system, the commander wrote that "hundreds and thousands" of features can be used to select targets, "such as being in a WhatsApp group with a known militant, changing cell phone every few months, and changing addresses frequently."
Sada Social asserted that it had found the Lavender system uses WhatsApp data to select targets.
"The reports monitored by the Sada Social Center indicate that one of the inputs to the 'Lavender' system relies on data collected from WhatsApp groups containing names of Palestinians or activists who are wanted by 'Israel,'" said the group in a press release. "The Israeli Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join."
The mention of Israel's use of WhatsApp data in Abraham's reporting also caught the attention last month of Paul Biggar, founder of Tech for Palestine.
"There's a lot wrong with this—I'm in plenty of WhatsApp groups with strangers, neighbors, and in the carnage in Gaza you bet people are making groups to connect," wrote Biggar. "But the part I want to focus on is whether they get this information from Meta. Meta has been promoting WhatsApp as a 'private' social network, including 'end-to-end' encryption of messages."
"Providing this data as input for Lavender undermines their claim that WhatsApp is a private messaging app," he wrote. "It is beyond obscene and makes Meta complicit in Israel's killings of 'pre-crime' targets and their families, in violation of international humanitarian law and Meta's publicly stated commitment to human rights. No social network should be providing this sort of information about its users to countries engaging in 'pre-crime.'"
Others have pointed out that Israel may have acquired WhatsApp data through means other than a leak by Meta.
Journalist Marc Owen Jones said "Meta's potential role in this is important," but noted that Israel could also obtain Palestinian users' WhatsApp data through informants, captured devices, and spyware.
Bahraini activist Esra'a Al Shafei, founder of Majal.org, told the Middle East Monitor that reports of WhatsApp user data being fed into the IDF's AI targeting system demonstrate why privacy advocates warn against the collection and storage of metadata, "particularly for apps like WhatsApp, which falsely advertise their product as fully private."
"Even though WhatsApp is end-to-end encrypted, and claims to not have any backdoors to any government, the metadata alone is sufficient to expose detailed information about users, especially if the user's phone number is attached to other Meta products and related activities," Al Shafei said. "This is why the IDF could plausibly utilize metadata to track and locate WhatsApp users."
While Meta and WhatsApp may not necessarily be collaborating with Israel, she said, "by the very act of collecting this information, they're making themselves vulnerable to abuse and intrusive external surveillance."
In turn, "by using WhatsApp, people are risking their lives," she added.
A WhatsApp spokesperson told Anadolu last month that "WhatsApp has no backdoors and we do not provide bulk information to any government," adding that "Meta has provided consistent transparency reports and those include the limited circumstances when WhatsApp information has been requested."
Al Shafei said Meta must "fully investigate" how WhatsApp's metadata may be used "to track, harm, or kill its users throughout Palestine."
"WhatsApp is used by billions of people and these users have a right to know what the dangers are in using the app," she said, "or what WhatsApp and Meta will do to proactively protect them from such misuse."