

"The Israeli Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join," said a Palestinian digital rights group.
The Palestinian digital rights group Sada Social on Saturday called for an investigation into Israel's alleged use of WhatsApp user data to target Palestinians with its AI system, Lavender.
The group, which is affiliated with the Al Jazeera Media Institute and Access Now, accused Meta, which owns WhatsApp, of fueling "the 'Lavender' artificial intelligence system used by the Israeli military to kill Palestinian individuals within the Gaza enclave."
As Common Dreams reported in April, the Israel Defense Forces has relied on AI systems including Lavender to target people Israel believes to be Hamas members.
At +972 Magazine, Israeli journalist Yuval Abraham wrote that a current commander of an elite Israeli intelligence unit pushed for the use of AI to choose targets in Gaza. In a guidebook on building such a system, the commander wrote that "hundreds and thousands" of features can be used to select targets, "such as being in a WhatsApp group with a known militant, changing cell phone every few months, and changing addresses frequently."
Sada Social asserted that it had found the Lavender system uses WhatsApp data to select targets.
"The reports monitored by the Sada Social Center indicate that one of the inputs to the 'Lavender' system relies on data collected from WhatsApp groups containing names of Palestinians or activists who are wanted by 'Israel,'" said the group in a press release. "The Israeli Lavender system, supported by artificial intelligence, identifies Palestinians by tracking their communications via WhatsApp or the groups they join."
The mention of Israel's use of WhatsApp data in Abraham's reporting also caught the attention last month of Paul Biggar, founder of Tech for Palestine.
"There's a lot wrong with this—I'm in plenty of WhatsApp groups with strangers, neighbors, and in the carnage in Gaza you bet people are making groups to connect," wrote Biggar. "But the part I want to focus on is whether they get this information from Meta. Meta has been promoting WhatsApp as a 'private' social network, including 'end-to-end' encryption of messages."
"Providing this data as input for Lavender undermines their claim that WhatsApp is a private messaging app," he wrote. "It is beyond obscene and makes Meta complicit in Israel's killings of 'pre-crime' targets and their families, in violation of international humanitarian law and Meta's publicly stated commitment to human rights. No social network should be providing this sort of information about its users to countries engaging in 'pre-crime.'"
Others have pointed out that Israel may have acquired WhatsApp data through means other than a leak by Meta.
Journalist Marc Owen Jones said the question of "Meta's potential role in this is important," but noted that informants, captured devices, and spyware could be used by Israel to gain Palestinian users' WhatsApp data.
Bahraini activist Esra'a Al Shafei, founder of Majal.org, told the Middle East Monitor that the reports that WhatsApp user data has been used by the IDF's AI machine demonstrate why privacy advocates warn against the collection and storage of metadata, "particularly for apps like WhatsApp, which falsely advertise their product as fully private."
"Even though WhatsApp is end-to-end encrypted, and claims to not have any backdoors to any government, the metadata alone is sufficient to expose detailed information about users, especially if the user's phone number is attached to other Meta products and related activities," Al Shafei said. "This is why the IDF could plausibly utilize metadata to track and locate WhatsApp users."
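The point Al Shafei is making can be illustrated with a toy sketch. The following Python example is hypothetical: the phone numbers, timestamps, and log format are invented for illustration and do not describe WhatsApp's actual servers. It shows how records of who messaged whom, with no access to message content, are enough to reconstruct a contact graph and surface everyone linked to a flagged number.

```python
# Hypothetical sketch: even when message content is end-to-end encrypted,
# metadata records (sender, recipient, timestamp) can reconstruct a social
# graph. All numbers and records below are invented for illustration.
from collections import defaultdict

# Each record is what a server could log for an encrypted chat:
# who messaged whom, and when -- but never what was said.
metadata_log = [
    ("+100", "+200", "2024-05-01T10:00"),
    ("+200", "+100", "2024-05-01T10:02"),
    ("+100", "+300", "2024-05-01T11:00"),
    ("+300", "+400", "2024-05-02T09:30"),
]

# Build an undirected contact graph from the log.
contacts = defaultdict(set)
for sender, recipient, _timestamp in metadata_log:
    contacts[sender].add(recipient)
    contacts[recipient].add(sender)

# Anyone who has exchanged messages with a "flagged" number is exposed
# purely through metadata, with no decryption involved.
flagged = "+200"
linked = sorted(contacts[flagged])
print(linked)  # ['+100']
```

Nothing in the sketch requires reading a single message, which is why privacy advocates treat metadata collection itself, not just content access, as the risk.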
While Meta and WhatsApp may not necessarily be collaborating with Israel, she said, "by the very act of collecting this information, they're making themselves vulnerable to abuse and intrusive external surveillance."
In turn, "by using WhatsApp, people are risking their lives," she added.
A WhatsApp spokesperson told Anadolu last month that "WhatsApp has no backdoors and we do not provide bulk information to any government," adding that "Meta has provided consistent transparency reports and those include the limited circumstances when WhatsApp information has been requested."
Al Shafei said Meta must "fully investigate" how WhatsApp's metadata may be used "to track, harm, or kill its users throughout Palestine."
"WhatsApp is used by billions of people and these users have a right to know what the dangers are in using the app," she said, "or what WhatsApp and Meta will do to proactively protect them from such misuse."
After a coalition of digital rights and other groups launched a campaign on Monday aimed at thwarting Facebook's planned privacy rollback on its popular WhatsApp messaging application, competitor Signal, which does not collect user data, on Tuesday began trolling the social media giant and its ubiquitous data collection in a tongue-in-cheek ad illustrating the intrusiveness of the practice.
"The way most of the internet works today would be considered intolerable if translated into comprehensible real world analogs, but it endures because it is invisible."
--Jun Harada, Signal
Amid a viral messaging campaign that has included the revelation that Facebook CEO Mark Zuckerberg "leads by example" by using the encrypted app, Signal created a multi-variant targeted ad that Jun Harada, the company's head of growth and communication, said is "designed to show you the personal data that Facebook collects about you and sells access to."
Alas, Facebook, which Signal says blocked the ad, "was not that into the idea," Harada wrote in a blog post, adding that the social media titan "is more than willing to sell visibility into people's lives, unless it's to tell people about how their data is being used."
"Being transparent about how ads use people's data is apparently enough to get banned; in Facebook's world, the only acceptable usage is to hide what you're doing from your audience," Harada continued. "So, here are some examples of the targeted ads that you'll never see on Instagram. Yours would have been so you."
"Companies like Facebook aren't building technology for you, they're building technology for your data," wrote Harada. "They collect everything they can from FB, Instagram, and WhatsApp in order to sell visibility into people and their lives."
"This isn't exactly a secret, but the full picture is hazy to most--dimly concealed within complex, opaquely rendered systems and fine print designed to be scrolled past," he added. "The way most of the internet works today would be considered intolerable if translated into comprehensible real world analogs, but it endures because it is invisible."
Signal's popularity was already soaring before the WhatsApp privacy rollback. According to Business of Apps, the number of Signal users skyrocketed from around half a million at the end of 2019 to about 40 million this January.
It seems everywhere he goes these days, Apple CEO Tim Cook is out there forcefully and publicly defending his company's decision to provide iPhone users with end-to-end text messaging and FaceTime encryption to protect against the constant threat of criminal hackers and foreign governments. The question is: when will other tech company leaders follow his lead?
If we're going to avoid having a horrible law banning encryption passed in the next year, more of the tech giants' high-profile representatives (the Mark Zuckerbergs, Marissa Mayers, and Eric Schmidts) need to use their platforms as the world's most well-known technology chiefs to make crystal clear how important encryption is to users everywhere.
US and UK officials have not let up on their months-long PR blitz villainizing encryption in an attempt to force tech companies to provide a surveillance backdoor into their products. This, despite the fact that officials still haven't produced any evidence that encryption was involved in the planning of any of the recent attacks in Paris and San Bernardino.
Not that it should matter; even if terrorists do use encrypted communications apps, there are plenty of ways to track them and plenty of reasons to still encourage the technology's use. On 60 Minutes earlier this week, Cook explained why encryption is so important and why installing backdoors for government access to everyone's communications is such a bad idea:
Here's what the situation is on your smartphone today, on your iPhone: there's likely health information, there's financial information. There are intimate conversations with your family, or your co-workers. There's probably business secrets, and you should have the ability to protect it. And the only way we know how to do that is to encrypt it. Why is that? It's because if there's a way to get in, then somebody will find the way in. There have been people that suggest that we should have a back door. But the reality is, if you put a back door in, that back door is for everybody, for good guys and bad guys.
Unfortunately, Cook is badly outnumbered by an onslaught of ignorant politicians making misleading and false statements about how encryption works and why we should ban it. And it isn't just a problem in the US and the UK: all over the world, countries are grappling with their sudden loss of power to surveil everything their citizenry says or does. A judge in Brazil briefly ordered Facebook's WhatsApp messaging application be blocked across the entire country because the service has no way of decrypting a suspect's communications.
Thankfully, the court backed down, but in his statement on that ruling, Mark Zuckerberg did not even mention the word "encryption", let alone explain why it's so important that every user be given this protection even if it means that the government can never access the content. While they rightfully oppose the government's push to ban encryption, other tech companies have not been nearly as vocal in public as Apple, in many cases speaking privately through lobbyists or industry representatives.
That's not to say these companies aren't doing anything: They've all released statements at various points condemning attacks on encryption. Facebook's WhatsApp has brought end-to-end encryption to more people—over 800 million—than any other service. Google's engineering team has been a leader in securing much of the web in the post-Snowden era.
But this is more than an engineering fight; it's a political one, where public opinion is crucial. And if the CEOs of these tech companies and their highest-profile representatives aren't out there every day loudly fighting for our right to encryption where millions of people can hear them, then we may well wake up one day to find that the US or UK has passed some awful bill, which will only encourage China to do the same, and very soon half the world may try to outlaw encryption in some way, shape, or form.