

"Meta’s reported plans to introduce this technology into broadly available consumer products is a red line society must not cross."
The ACLU and a coalition of 75 other rights organizations on Tuesday issued a warning to tech giant Meta about its plan to install facial recognition technology onto its artificial intelligence-powered eyeglasses.
In a letter organized by the ACLU, the ACLU of Massachusetts, and the New York Civil Liberties Union (NYCLU), the groups said adding facial recognition technology to Meta's Ray-Ban and Oakley glasses would pose a grave threat to Americans' privacy.
"People should be able to move through their daily lives," the letter states, "without fear that stalkers, scammers, abusers, federal agents, and activists across the political spectrum are silently and invisibly verifying their identities and potentially matching their names to a wealth of readily available data about their habits, hobbies, relationships, health, and behaviors."
When it comes to specific dangers posed by embedding this technology into the company's products, the letter points to the potential for scammers to use it to "find out, quickly and in complete stealth, not just the name of the person sitting next to them on the subway—but their address, marital status, social media profiles, workplace, income, hobbies, health information, and habits."
Because of this, the letter says that "Meta’s reported plans to introduce this technology into broadly available consumer products is a red line society must not cross."
Blocking facial recognition technology from Meta glasses "is a prerequisite for a free and safe society," reads the letter.
The letter concludes with a series of demands, including that Meta stop any plans to attach facial recognition technology to its products; publicly disclose any past instances of Meta glasses being used for stalking and harassment; and reveal any "past or ongoing" discussions with law enforcement agencies such as US Immigration and Customs Enforcement about deploying the technology.
Cody Venzke, senior staff attorney working on surveillance, privacy, and technology issues for the ACLU, described facial recognition technology as "inherently invasive and unethical," and said adding it to a widely available consumer product "would vastly increase the risk of harm to individuals, families, and our democracy itself."
Kade Crockford, director of technology and justice programs at the ACLU of Massachusetts, argued that "the American people have not consented to this massive invasion of privacy," which is why Meta must abandon plans to deploy it.
"Stalkers and scammers would have a field day with this technology," Crockford said. "Federal agents could use it to harass and intimidate their critics. It’s dangerous and dystopian, and Meta must disavow it."
"Between yesterday’s historic verdict in New Mexico and today’s ruling in California, it is clear that Big Tech’s free rein to addict and harm children is over," said one campaigner.
A Los Angeles jury on Wednesday found that Meta and Google acted negligently by harming a child user with their social media platforms' addictive design features in a landmark verdict that came on the heels of Tuesday's $375 million fine imposed on Meta by New Mexico jurors.
The California jury—which deliberated for 40 hours over nine days—ordered the companies to pay $3 million in compensatory damages to a now-20-year-old woman, known in court as Kaley G.M., for pain and suffering and other harms.
Meta—the parent company of Facebook, Instagram, and WhatsApp—must pay 70%, while Google, the Alphabet subsidiary that owns YouTube, will pay the rest.
The jury also found the companies acted fraudulently and with malice, meaning an additional punitive award will be imposed.
Kaley's legal team successfully argued that the social media companies designed products that are as addictive as cigarettes or online casinos, and that site features like infinite scrolling and algorithmic recommendations caused her anxiety and depression. Attorneys said Kaley began viewing YouTube videos when she was 6 years old and started using Instagram at age 9.
Attorney Mark Lanier called YouTube Kaley's "gateway" to social media addiction. Later, features like Instagram's "beauty filters" made her feel "fat" and unattractive.
Still, Kaley was hooked, testifying in court last month: “Every single day I was on it, all day long. I just can’t be without it.”
Kaley's lawyers submitted evidence including internal communications in which officials at the two companies privately acknowledged their products' addictiveness.
"If we want to win big with teens, we must bring them in as tweens," one YouTube strategy memo states.
A communication from an Instagram employee says: “We’re basically pushers... We’re causing reward deficit disorder, because people are binging on Instagram so much they can’t feel the reward.”
Meta CEO Mark Zuckerberg says, "Kids under 13 aren't allowed on our services." That's a lie. 2015: Internal review found 4 million kids on Instagram. 2017: Meta employees, we're "going after <13 year olds" – Zuckerberg had been talking about this "for a while."
— Tech Oversight Project (@techoversight.bsky.social) February 20, 2026 at 10:18 AM
Kaley's attorneys said in a statement following Wednesday's verdict: "For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features. Today’s verdict is a referendum—from a jury, to an entire industry—on that accountability.”
One of those attorneys, Joseph VanZandt, told The New York Times that “this is the first time in history a jury has heard testimony by executives and seen internal documents that we believe prove these companies chose profits over children."
As Courthouse News Service reported:
Kaley is the first of nearly 2,500 plaintiffs in a consolidated case in Southern California suing four tech companies—Google, Meta, TikTok, and Snap—who say their social media and streaming platforms were designed in ways that caused or worsened depression, anxiety, and body dysmorphia in minors.
TikTok and Snap settled with Kaley in the weeks before her bellwether trial but remain defendants in the broader consolidated litigation. The trial’s outcome could help spur a global settlement, though eight more bellwether trials are being prepared, with the next one scheduled to start this summer.
A Meta spokesperson told Courthouse News Service that “we respectfully disagree with the verdict and are evaluating our legal options.”
Mark Zuckerberg, Meta's CEO and co-founder, insisted during the trial that Instagram is “a good thing that has value in people’s lives.”
Appeals by the companies could drag on for years, and, as Fox Business correspondent Susan Li noted on X, "if it’s just money that they have to pay, in the end it’s just a speeding ticket as they have deep pockets of cash."
Wednesday's verdict comes amid numerous pending lawsuits against social media companies and follows Tuesday's $375 million penalty imposed on Meta by a New Mexico jury, which found that the company violated the state's Unfair Practices Act by misleading users and exposing children to harm on its platforms.
Child welfare and digital rights advocates hailed Wednesday's verdict, which The Tech Oversight Project, an advocacy group, called "an earthquake for Big Tech."
"After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years," the group's president, Sacha Haworth, said in a statement.
"These products were purposefully designed to harm [and] addict millions of young people, and lead to lifelong mental health consequences," Haworth added. "This trial was proof that if you put CEOs like Mark Zuckerberg on the stand before a judge and jury of their peers, the tech industry’s wanton disregard for people will be on full display."
Alix Fraser, vice president of advocacy at Issue One, said, “Today’s verdict is a victory for young people, their families, and all Americans, marking a critical turning point in the fight to hold Big Tech accountable."
"The message is clear: The industry cannot continue to treat the youngest generation as its guinea pigs without consequences," he continued. "The trial process exposed how these platforms are designed, how risks to young users are understood internally, and how those risks have too often been outweighed by the pursuit of growth and profit."
"Today’s verdict builds on that truth. It affirms that young people are not test subjects for unproven products that prioritize profit at all cost," Fraser added. “No other industry enjoys the level of legal protection tech companies have relied on. This verdict begins to crack that shield and move us closer to a system where accountability is the norm, not the exception."
Josh Golin, executive director of the children's advocacy group Fairplay, said, “We are so pleased that a jury has confirmed what Fairplay and the survivor parents we work with have been saying for years: Social media companies like Meta and YouTube deliberately design their products to addict kids."
"Between yesterday’s historic verdict in New Mexico and today’s ruling in California, it is clear that Big Tech’s free rein to addict and harm children is over," he added.
JB Branch, the artificial intelligence and technology policy counsel at the consumer advocacy group Public Citizen, said in a statement that "the parallels to Big Tobacco litigation are becoming harder to ignore."
"Like tobacco companies before them, social media firms built massive business models around dependency, denied or minimized mounting evidence of harm, and resisted meaningful safeguards while millions of young people were exposed to escalating risks," Branch explained. "Infinite scroll, push notifications, algorithmic amplification, and behavioral targeting were commercial design choices built to maximize attention, addiction, and revenue."
“Now more than ever, it’s time for Congress and federal regulators to establish enforceable safeguards for youth online while preserving the right of states to adopt stronger standards, including stronger product safety requirements, transparency obligations, limits on manipulative design practices, and accountability mechanisms for platforms whose business models depend on prolonged youth engagement," Branch added.
While many campaigners are urging congressional lawmakers to pass the Senate version of the Kids Online Safety Act, civil rights groups including the ACLU argue that KOSA is overbroad and poses serious risks of censorship of free speech.
While the company plans to challenge the decision, the state's attorney general said the figure "should send a clear message to Big Tech executives that no company is beyond the reach of the law."
Democratic New Mexico Attorney General Raúl Torrez and other child advocates on Tuesday celebrated a state jury's landmark verdict against Meta, despite the social media giant's plans to fight the decision requiring it to pay $375 million in civil penalties.
"The jury's verdict is a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety," said Torrez, who had accused the company behind Facebook, Instagram, and WhatsApp of violating the state's Unfair Practices Act. "Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today, the jury joined families, educators, and child safety experts in saying enough is enough."
The Associated Press highlighted that "the landmark decision comes after a nearly seven-week trial, and as jurors in a federal court in California have been sequestered in deliberations for more than a week about whether Meta and YouTube should be liable in a similar case."
Torrez said that "New Mexico is proud to be the first state to hold Meta accountable in court for misleading parents, enabling child exploitation, and harming kids. In the next phase of this legal proceeding, we will seek additional financial penalties and court-mandated changes to Meta's platforms that offer stronger protections for children."
"The substantial damages the jury ordered Meta to pay should send a clear message to Big Tech executives that no company is beyond the reach of the law," he added. "Policymakers and law enforcement officials across the country can help make this verdict a turning point in the fight for children's safety. This is a watershed moment for every parent concerned about what could happen to their kids when they go online—and this victory belongs to them."
Josh Golin, executive director of the nonprofit Fairplay, welcomed the verdict. He said in a statement that "we've known for years that Meta enables the sexual exploitation of children. Now, that has been proven by a jury."
"As an organization that fights to protect children from Big Tech's deadly business model, Fairplay thanks Attorney General Torrez for his leadership in taking Meta to court," Golin continued. "Between this case and the ongoing trial in Los Angeles, parents, survivors, and state officials are doing their part to hold Big Tech accountable. Now, it's time for our leaders in the US Congress to get off the sidelines and pass the Senate's version of the Kids Online Safety Act to force these companies to change their addictive and dangerous product designs."
As Common Dreams has reported, while a diverse coalition supports the Kids Online Safety Act, civil rights groups have also expressed concerns about the legislation. Jenna Leventoff, senior policy counsel at the ACLU, warned last year that "the overbroad language in KOSA and similar legislation risks censoring everything from jokes and hyperbole to useful information about sex ed and suicide prevention."
Amid celebrations over the New Mexico jury's decision on Tuesday, Meta said in a statement that "we respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."
NBC News noted that "separately, Meta is facing thousands of lawsuits accusing it and other social media companies of intentionally designing their products to be addictive to young people, leading to a nationwide mental health crisis. Some of the lawsuits, which have been filed in both state and federal courts, seek damages in the tens of billions of dollars, according to Meta’s filings with financial regulators."