Alison Flowers, Yohance Lacour, and their colleagues attend the Peabody Awards on June 9, 2024 in Beverly Hills, California. They are among seven plaintiffs who sued artificial intelligence companies under Illinois' biometric privacy law in May 2026.

(Photo by Jon Kopaloff/Getty Images for Peabody Awards)

Journalists, Audiobook Narrators Sue AI Giants Under Illinois Biometric Privacy Law for 'Stealing Their Voices'

"They've built a billion-dollar industry on stolen voices because they thought no one would make them pay for it," said a lawyer for the plaintiffs.

In yet another display of how Illinois' pioneering biometric privacy law can be used to protect Americans, state residents who work as audio storytellers, broadcast journalists, podcasters, voice actors, and more filed class-action lawsuits against Big Tech this week for "stealing their voices" to develop artificial intelligence products.

Since Illinois legislators passed the groundbreaking Biometric Information Privacy Act (BIPA) in 2008—regulating the collection, use, safeguarding, handling, storage, retention, and destruction of biometric identifiers, including fingerprints, voiceprints, and scans of a retina, iris, hand, or face geometry—there have been thousands of lawsuits filed and major settlements with Clearview AI, Facebook, and Six Flags.

Represented by the award-winning civil rights firm Loevy + Loevy, the Illinoisans are suing Adobe, Alphabet and its subsidiary Google, Apple, Amazon, ElevenLabs, Facebook parent company Meta, Microsoft, NVIDIA, and Samsung under BIPA.

The plaintiffs are audiobook narrators Lindsay Dorcus and Victoria Nassif as well as journalists Robin Amer, Yohance Lacour, Carol Marin, and Phil Rogers. Journalist Alison Flowers is part of all lawsuits except those against Amazon and Apple. Their lawyers noted that "between them, they have multiple Emmy and Peabody awards, several Pulitzer Prizes, several Alfred I. duPont-Columbia University awards, an Edward R. Murrow award, a James Beard award, a SOVAS award, and many, many other honors."

Their cases focus on the voiceprint of each plaintiff, which is "a digital fingerprint of the human voice," as the complaints explain. "It is a mathematical capture of the acoustic features—pitch, timbre, resonance—that emerge from a person's distinctive physiology, combined with the speech patterns that person develops over a lifetime: accent, cadence, articulation. Like a fingerprint, a voiceprint identifies the individual. Like a fingerprint, it cannot be changed."

The Adobe case targets Firefly, the company's family of generative AI models. The complaint states that the company "treated the human voices that built Firefly as ownerless—ignoring the speakers' rights, taking their voiceprints without asking, paying them nothing, and giving them no notice that their voices were being used at all," and "built a mirage of commercial safety around products whose construction violated the one thing Illinois law requires before collecting a voiceprint: consent from the person."

The Google filing points out that the company "has been a repeat defendant in BIPA cases" and even "paid approximately $100 million to settle BIPA claims arising from Google Photos' face grouping feature," among other high-profile settlements.

The Meta suit highlights that "no defendant in any biometric-privacy matter pending in the United States has had more direct, more sustained, or more financially consequential notice of BIPA than Meta," given that the company "has paid the three largest biometric-privacy settlements in American history," including $650 million to resolve claims under the Illinois law regarding Facebook's photo tag suggestions.

"By the time Meta released Voicebox in June 2023, MMS in May 2023, and SeamlessM4T in August 2023, Meta had been a BIPA defendant for nearly a decade and had paid more than $2 billion in biometric-privacy settlements," the complaint continues. "The technology Meta built using plaintiffs' voices now competes with plaintiffs in the markets where they earn their living."

The Amazon filing details similar harm to plaintiffs:

Amazon extracted plaintiffs' voiceprints without notice or consent, depriving them of the right BIPA guarantees to make an informed decision about the collection and use of their biometric data. Amazon retains those voiceprints in its commercial models and continues to profit from them. Amazon has further disseminated those voiceprints, encoded in model parameters, through its cross-affiliate, subprocessor, and integration-partner networks. The technology built on those voiceprints now displaces plaintiffs in the markets where they earn their living—the broadcast journalism, investigative podcast, audiobook narration, voiceover, and voice performance markets that the voice products are designed and sold to serve.

"What we are seeing is an illegal and unethical exploitation of talent on a massive scale, and one of the largest violations of biometric privacy ever committed," said Loevy + Loevy attorney Ross Kimbarovsky in a Thursday statement.

"The legislators who wrote and passed BIPA had the foresight to realize that biometric privacy was going to be a major civil rights issue in the 21st century," the attorney continued. "Social security numbers can be changed, passwords can be reset, and credit cards can be canceled, but once your biometric data is compromised, there's nothing you can do about it."

"These companies know the law, know their liability, and know exactly how to build consent systems that comply with BIPA," Kimbarovsky added. "They've built a billion-dollar industry on stolen voices because they thought no one would make them pay for it."

In addition to Illinois, Texas and Washington state have enacted biometric privacy laws, while California, Colorado, Connecticut, Utah, and Virginia have comprehensive consumer protection policies that apply to such information, according to Bloomberg Law. However, efforts in Congress to enact federal legislation—such as the National Biometric Information Privacy Act and the Facial Recognition and Biometric Technology Moratorium Act—have been unsuccessful.

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.