Laura Marquez-Garrett (center), plaintiffs' attorney for Social Media Victims Law Center, gathers with family members of victims as they check their phones for news about the verdict outside the Los Angeles Superior Court on March 25, 2026.
"Between yesterday’s historic verdict in New Mexico and today’s ruling in California, it is clear that Big Tech’s free rein to addict and harm children is over," said one campaigner.
A Los Angeles jury on Wednesday delivered a landmark verdict finding that Meta and Google acted negligently by harming a child user through their social media platforms' addictive design features, a ruling that came on the heels of Tuesday's $375 million fine imposed on Meta by New Mexico jurors.
The California jury—which deliberated for 40 hours over nine days—ordered the companies to pay $3 million in compensatory damages to a now-20-year-old woman, known in court as Kaley G.M., for pain and suffering and other harms.
Meta—the parent company of Facebook, Instagram, and WhatsApp—must pay 70%, while Google, the Alphabet subsidiary that owns YouTube, will pay the rest.
The jury also found that the companies acted fraudulently and with malice, and additional punitive damages will be imposed.
Kaley's legal team successfully argued that the social media companies designed products that are as addictive as cigarettes or online casinos, and that site features like infinite scrolling and algorithmic recommendations caused her anxiety and depression. Attorneys said Kaley began viewing YouTube videos when she was 6 years old and started using Instagram at age 9.
Attorney Mark Lanier called YouTube Kaley's "gateway" to social media addiction. Later, features like Instagram's "beauty filters" made her feel "fat" and unattractive.
Still, Kaley was hooked, testifying in court last month: “Every single day I was on it, all day long. I just can’t be without it.”
Kaley's lawyers submitted evidence including internal communications in which officials at the two companies privately acknowledged their products' addictiveness.
"If we want to win big with teens, we must bring them in as tweens," one YouTube strategy memo states.
A communication from an Instagram employee says: “We’re basically pushers... We’re causing reward deficit disorder, because people are binging on Instagram so much they can’t feel the reward.”
Meta CEO Mark Zuckerberg says, “Kids under 13 aren’t allowed on our services.” That's a lie. 2015: An internal review found 4 million kids on Instagram. 2017: Meta employees said they were "going after <13 year olds” – Zuckerberg had been talking about this “for a while.”
— Tech Oversight Project (@techoversight.bsky.social) February 20, 2026 at 10:18 AM
Kaley's attorneys said in a statement following Wednesday's verdict: "For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features. Today’s verdict is a referendum—from a jury, to an entire industry—on that accountability.”
One of those attorneys, Joseph VanZandt, told The New York Times that “this is the first time in history a jury has heard testimony by executives and seen internal documents that we believe prove these companies chose profits over children."
As Courthouse News Service reported:
Kaley is the first of nearly 2,500 plaintiffs in a consolidated case in Southern California suing four tech companies—Google, Meta, TikTok, and Snap—who say their social media and streaming platforms were designed in ways that caused or worsened depression, anxiety, and body dysmorphia in minors.
TikTok and Snap settled with Kaley in the weeks before her bellwether trial but remain defendants in the broader consolidated litigation. The trial’s outcome could help spur a global settlement, though eight more bellwether trials are being prepared, with the next one scheduled to start this summer.
A Meta spokesperson told Courthouse News Service that “we respectfully disagree with the verdict and are evaluating our legal options.”
Mark Zuckerberg, Meta's CEO and co-founder, insisted during the trial that Instagram is “a good thing that has value in people’s lives.”
Appeals by the companies could drag on for years, and, as Fox Business correspondent Susan Li noted on X, "if it’s just money that they have to pay, in the end it’s just a speeding ticket as they have deep pockets of cash."
Wednesday's verdict comes amid numerous pending lawsuits against social media companies and follows Tuesday's $375 million penalty imposed on Meta by a New Mexico jury, which found that the company violated the state's Unfair Practices Act by misleading users and exposing children to harm on its platforms.
Child welfare and digital rights advocates hailed Wednesday's verdict, which The Tech Oversight Project, an advocacy group, called "an earthquake for Big Tech."
"After years of gaslighting from companies like Google and Meta, new evidence and testimony have pulled back the curtain and validated the harms young people and parents have been telling the world about for years," the group's president, Sacha Haworth, said in a statement.
"These products were purposefully designed to harm [and] addict millions of young people, and lead to lifelong mental health consequences," Haworth added. "This trial was proof that if you put CEOs like Mark Zuckerberg on the stand before a judge and jury of their peers, the tech industry’s wanton disregard for people will be on full display."
Alix Fraser, vice president of advocacy at Issue One, said, “Today’s verdict is a victory for young people, their families, and all Americans, marking a critical turning point in the fight to hold Big Tech accountable."
"The message is clear: The industry cannot continue to treat the youngest generation as its guinea pigs without consequences," he continued. "The trial process exposed how these platforms are designed, how risks to young users are understood internally, and how those risks have too often been outweighed by the pursuit of growth and profit."
"Today’s verdict builds on that truth. It affirms that young people are not test subjects for unproven products that prioritize profit at all cost," Fraser added. “No other industry enjoys the level of legal protection tech companies have relied on. This verdict begins to crack that shield and move us closer to a system where accountability is the norm, not the exception."
Josh Golin, executive director of the children's advocacy group Fairplay, said, “We are so pleased that a jury has confirmed what Fairplay and the survivor parents we work with have been saying for years: Social media companies like Meta and YouTube deliberately design their products to addict kids."
"Between yesterday’s historic verdict in New Mexico and today’s ruling in California, it is clear that Big Tech’s free rein to addict and harm children is over," he added.
JB Branch, the artificial intelligence and technology policy counsel at the consumer advocacy group Public Citizen, said in a statement that "the parallels to Big Tobacco litigation are becoming harder to ignore."
"Like tobacco companies before them, social media firms built massive business models around dependency, denied or minimized mounting evidence of harm, and resisted meaningful safeguards while millions of young people were exposed to escalating risks," Branch explained. "Infinite scroll, push notifications, algorithmic amplification, and behavioral targeting were commercial design choices built to maximize attention, addiction, and revenue."
“Now more than ever, it’s time for Congress and federal regulators to establish enforceable safeguards for youth online while preserving the right of states to adopt stronger standards, including stronger product safety requirements, transparency obligations, limits on manipulative design practices, and accountability mechanisms for platforms whose business models depend on prolonged youth engagement," Branch added.
While many campaigners are urging congressional lawmakers to pass the Senate version of the Kids Online Safety Act, civil rights groups including the ACLU argue that KOSA is overbroad and poses serious risks of censorship of free speech.