Taylor Swift performs onstage during "Taylor Swift | The Eras Tour" at B.C. Place in Vancouver on December 6, 2024.
"Deepfakes are evolving faster than human sanity can keep up," said one critic. "We're three clicks away from a world where no one knows what's real."
Grok Imagine—a generative artificial intelligence tool developed by Elon Musk's xAI—has rolled out a "spicy mode" that is under fire for creating deepfake images on demand, including nudes of superstar Taylor Swift, prompting calls for guardrails on the rapidly evolving technology.
The Verge's Jess Weatherbed reported Tuesday that Grok's spicy mode—one of four presets on an updated Grok 4, alongside fun, normal, and custom—"didn't hesitate to spit out fully uncensored topless videos of Taylor Swift the very first time I used it, without me even specifically asking the bot to take her clothes off."
Weatherbed noted:
You would think a company that already has a complicated history with Taylor Swift deepfakes, in a regulatory landscape with rules like the Take It Down Act, would be a little more careful. The xAI acceptable use policy does ban "depicting likenesses of persons in a pornographic manner," but Grok Imagine simply seems to do nothing to stop people creating likenesses of celebrities like Swift, while offering a service designed specifically to make suggestive videos including partial nudity. The age check only appeared once and was laughably easy to bypass, requesting no proof that I was the age I claimed to be.
Weatherbed—whose article is subtitled "Safeguards? What Safeguards?"—asserted that the latest iteration of Grok "feels like a lawsuit ready to happen."
Grok is now creating AI video deepfakes of celebrities such as Taylor Swift that include nonconsensual nude depictions. Worse, the user doesn't even have to specifically ask for it, they can just click the "spicy" option and Grok will simply produce videos with nudity. Video from @theverge.com.
[image or embed]
— Alejandra Caraballo (@esqueer.net) August 5, 2025 at 9:57 AM
Grok had already made headlines in recent weeks after going full "MechaHitler" following an update that the chatbot said prioritized "uncensored truth bombs over woke lobotomies."
Numerous observers have sounded the alarm on the dangers of unchained generative AI.
"Instead of heeding our call to remove its 'NSFW' AI chatbot, xAI appears to be doubling down on furthering sexual exploitation by enabling AI videos to create nudity," Haley McNamara, a senior vice president at the National Center on Sexual Exploitation, said last week.
"There's no confirmation it won't create pornographic content that resembles a recognizable person," McNamara added. "xAI should seek ways to prevent sexual abuse and exploitation."
Users of X, Musk's social platform, also weighed in on the Swift images.
"Deepfakes are evolving faster than human sanity can keep up," said one account. "We're three clicks away from a world where no one knows what's real. This isn't innovation—it's industrial scale gaslighting, and y'all [are] clapping like it's entertainment."
Another user wrote: "Not everything we can build deserves to exist. Grok Imagine's new 'spicy' mode can generate topless videos of anyone on this Earth. If this is the future, burn it down."
Musk is seemingly unfazed by the latest Grok controversy. On Tuesday, he boasted on X that "Grok Imagine usage is growing like wildfire," with "14 million images generated yesterday, now over 20 million today!"
According to a poll published in January by the Artificial Intelligence Policy Institute, 84% of U.S. voters "supported legislation making nonconsensual deepfake porn illegal, while 86% supported legislation requiring companies to restrict models to prevent their use in creating deepfake porn."
During the 2024 presidential election, Swift weighed in on the subject of AI deepfakes after then-Republican nominee Donald Trump posted an AI-generated image suggesting she endorsed the felonious former Republican president. Swift ultimately endorsed then-Vice President Kamala Harris, the Democratic nominee.
"It really conjured up my fears around AI, and the dangers of spreading misinformation," Swift said at the time.