

AI data centers have been added to the limited menu for economic development in marginalized US communities, but people in those communities have good reason to oppose them.
One word—plastics. That was the holy grail Dustin Hoffman’s character learned about from a well-wisher in the movie The Graduate. I remember watching the film as a farm kid and thinking about the updated version my guidance counselors were telling me—one word: computers. We are now in the midst of the “Fourth Industrial Revolution,” and the latest mantra is: artificial intelligence. Such free advice, though, can really be a costly warning in disguise.
Granted, there is a lot of poverty in the “richest” nation on Earth, and marginalized US communities often have few choices for economic (mal) development. It becomes a twisted game of pick your own poison: supermax prison, toxic waste dump, ethanol facility, tar sands pipeline… Now, AI data centers have been added to the limited menu. Someone recently shared a map of looming AI data centers across the world. It reminded me of how a tumor spreads and Edward Abbey’s quote that “growth for the sake of growth is the ideology of the cancer cell.”
The fact that Big Data has targeted Rural America for its latest mastitis should be no surprise. We have lots of available land to grab, thanks to the legacy of settler colonialism and family-farm foreclosure. Back in August I remember driving past Beaver Dam, Wisconsin, watching bulldozers flatten over 800 acres along Hwy 151, and my first hunch was: data center. Sure enough, the secretive $1 billion deal with Meta was finally revealed in a November press release. Just north of Madison in the town of DeForest, Blackstone subsidiary QTS Realty Trust is aiming to build another $12 billion data center on close to 1,600 acres. And if we need to free up more land for AI, we quaint rural folks could just abandon growing real Xmas trees and force people to buy plastic ones instead, as one Fox News “expert” suggested over the holidays. Former President Joe Biden visited Mt. Pleasant, Wisconsin in May 2024 to promote Microsoft’s new $3.3 billion, 300+ acre AI campus on the former site of flat-screen maker Foxconn, which welcomed President Donald Trump for its groundbreaking back in 2018. Foxconn abandoned that $10 billion project and its promise of 13,000 jobs after collecting millions in state subsidies and local tax deferrals.
The Microsoft AI complex in Mt. Pleasant will also require over 8 million gallons of water per year from Lake Michigan. We still have some clean water, though that may not last long thanks to agrochemical monocultures, CAFO manure dumping, and PFAS-laden sludge spreading. And AI certainly is thirsty—the Alliance for the Great Lakes noted in its August 2025 report that a hyperscale AI data center needs up to 365 million gallons of water to keep itself cool—that is as much water as is needed by 12,000 people! A recent investigative report by Bloomberg News found that over two-thirds of the AI data centers built since 2022 are in parts of the country already facing water stress. And it is really hard to drink data.
In the Midwest we also have potential access to vast electricity (fracked natural gas, wind and solar farms, methane digesters), and relatively under-stressed high voltage grids (unlike California or Texas), though the loss of “cheaper” imported Canadian hydropower with the latest trade war could be a serious challenge. In 2023 the US had over a $2 billion electricity trade deficit vis-à-vis Canada. According to a recent Clean Wisconsin report, just two of our proposed AI data centers will require 3.9 gigawatts—1.5 times the current power demand of all 4.3 million homes in the state.
But, no worry, there are dilapidated US nuclear reactors with massive waste dumps that could be put back online such as Palisades in Michigan, despite opposition from environmental activists and family farmers. The Trump administration also just announced a $1 billion low-interest loan to reanimate Three Mile Island in Pennsylvania for the sake of AI. Until all that happens, though, regular ratepayers can expect a huge hike in their energy bills as Big Data has the market clout to siphon off what it needs first, especially as it colludes with utility monopolies. Many people in Wisconsin are already paying for $1+ billion in stranded assets—mostly defunct coal plants, as well as nuclear waste storage facilities—while utility investors continue to receive guaranteed dividends of 9-10%.
But is all the AI hype just another bubble about to burst? Rural communities (and public taxpayers) have been offered many “amazing” schemes in the past that ended up being just a “bait and switch”—another hollow promise. If we subsidize a massive data center, will the projected “market” for ever-more algorithms actually materialize? Many within the AI industry don’t think so, and are now invoking the lessons we should have learned from the Enron scandal decades ago, or its even worse sequel, the subprime mortgage-fueled financial meltdown. Corporate cheerleaders can be quite clever when it comes to inflating prices (and stocks) for goods and services that may not even exist, while hiding massive debt obligations in a cascading series of shadowy shell subsidiaries and dishonest accounting shenanigans.
Many industry insiders are ringing alarm bells. "These models are being hyped up, and we're investing more than we should," said Daron Acemoglu, who won the 2024 Nobel Economics Prize, quoted in a recent NPR story about the current AI boom or bubble. OpenAI says it will spend $1.4 trillion on data centers over the next eight years, while Amazon, Google, Meta, and Microsoft are going to throw in another $400 billion. Meanwhile, just 3% of people who use AI now pay for it, and many are frantically trying to figure out how to turn off AI mode on their internet searches and to reject AI eavesdropping on their Zoom calls. Where is the real revenue going to come from to pay for all this AI speculation? The same NPR story notes that such a flood of leveraged capital is equal to every iPhone user on Earth forking over $250 to “enjoy” the benefits of AI—and “that’s not going to happen,” adds Paul Kedrosky, a venture capitalist who is now a research fellow at MIT's Initiative on the Digital Economy. Morgan Stanley estimates AI companies will shell out $3 trillion by 2028 for this data center buildout—but less than 50% of that money will come from them. Hmmm...
Special purpose vehicle (SPV) may sound like a fancy name for a retrofitted tractor, but it is how Big Data erects a Potemkin village to hide its Ponzi scheme. Here is one example from Richland Parish, Louisiana, where Meta is now building its Hyperion Data Center—a massive $27 billion project. A Wall Street outfit, Blue Owl, borrows $27 billion, using Meta’s future rent payments for the data center to back its loan. Meta’s 20% “mortgage” on the facility gives it 100% control of the purported data crunching there. This debt never shows up on Meta’s books and remains hidden from carefree investors and shallow analysts, but, like other synthetic financial instruments such as the now infamous mortgage-backed security (MBS), the reality only comes home to roost when the house of cards collapses and Meta eventually has to pay off Blue Owl.
In the meantime, as the Louisiana Illuminator reports, the residents of Richland Parish (where 25% live below the poverty level) are bearing the brunt of the real costs of hosting an AI factory farm: dozens of crashes involving construction vehicles; damage to local roads; and massive future energy demands (three times that of the entire city of New Orleans), which will require new natural gas power plants to be built, subsidized by existing ratepayers even as fossil fuel-induced climate change floods the Louisiana delta. Beyond the initial building flurry, AI data centers are ultimately job poor. It just doesn’t take that many people to tend computers once they are built. As Meta’s VP, Brad Smith, admitted, the 250,000-square-foot Hyperion data center may need 1,500 workers to build but barely 50 to operate. Beyond all the ballyhoo, the main reason a particular community is chosen to “host” one seems to be the bought duplicity of elected officials and the excessive generosity of local taxpayers. Not a good cost-benefit analysis—unless you are Big Data.
And then there are the questionable kickback schemes between the suppliers of the technology and those owning the data centers. If you were a maker of computer chips, would you not be tempted to fork over capital to a major buyer of your own products to ensure future demand? Nvidia just announced a $100 billion stake in OpenAI to help bankroll the data centers. In turn, OpenAI signed a $300 billion deal with Oracle to actually build the AI data centers that will require Nvidia’s graphics processing units (GPUs). OpenAI also signed a separate $6+ billion deal with former Bitcoin miner CoreWeave, which rents out cloud computing access (using Nvidia’s chips once again). This type of incestuous circular financing should raise eyebrows among anyone who studies business ethics—and perhaps remind others of how a toilet operates.
What is all this AI doing? Promoters will point to many innovations—faster screening for cancer cells, closer connection to far-flung relatives, precision application of fertilizers and pesticides, elimination of drudgery in the workplace through automation. A bright future indeed—or perhaps not?
In August 2025, ProPublica reported that the Food and Drug Administration (FDA) had lost 20% of its staff devoted to food safety thanks to DOGE cuts. Inspection of food import facilities is now at a historic low even as our dependence on the rest of the world to feed us grows. But not to worry: The FDA announced in May that AI was coming to the rescue thanks to a large language model (LLM)—dubbed Elsa—that would be deployed alongside what’s left of its human staff to expedite their oversight work. Hopefully, Elsa knows melamine when it sees it. AI chatbots are also growing in popularity, available 24-7 to “talk or advise” people on all sorts of pressing issues—how to win more friends, how to cheat on an exam, how to make up fake legal opinions—even encouraging a teenager to commit suicide and suggesting to someone else that they murder their own parents.
But there is an even dirtier AI underbelly. Some have dubbed these AI slop, AI smut, and AI Stasi—three 21st-century horsemen of the digital apocalypse. What is this all about? Well, a lot of these accelerating AI algorithms are actually devoted to selling “products” that many people do not want and would find objectionable, as well as providing “services” that undermine our basic freedoms. Slop (Merriam-Webster’s word of 2025) describes AI-generated internet content meant only to make money through advertising. Right now there are thousands of wannabe internet “creatives” all over the globe, watching “how-to videos” on manufacturing AI social media content to grab the eyeballs of US consumers. That cute puppy video you see on Instagram or that shocking “news” story you read on Facebook is no accident—the goal is to monetize views, with advertisers paying per thousand impressions (cost per mille, or CPM). This is also why online content is often overly long (where is the actual recipe in this cooking blog?), since that increases ad scrolling. The average US consumer is now exposed to between 6,000 and 10,000 ads per day—70% of them online. For more on AI slop, visit: https://www.visibrain.com/blog/ai-slop-social-media.
An even worse virtual commodity is AI smut—literally algorithms creating pornography. This perverted version of AI scrapes the internet for images (high school yearbooks, red carpet fashion shows, popular music concerts, street cam footage, etc.) and then uses “face swap” programs to create personalized hardcore rubbish. There is little if any accountability for this theft of public images and violation of personal privacy—at best, those involved are “shamed” into taking down their AI sites after being exposed, due to fears of liability and prosecution for child abuse. But that has hardly stopped this seedy AI subsector. Can you imagine your face or image being put into such a lucrative sexploitative scenario without your permission? At this point, there are hardly any internet police walking the beat in the virtual AI world. We don’t even have the right to be forgotten on the internet.
Which brings us to AI Stasi—the updated version of the Cold War-era East German secret police. The University of Wisconsin-Madison just announced the creation of a College of Computing and Artificial Intelligence (CAI), in part thanks to a $140 million donation from Cisco. Few Bucky Badger fans know that 30 years ago they were used as guinea pigs while cheering at Camp Randall Stadium to help create facial recognition technology through a UW-Madison grant from the Defense Advanced Research Projects Agency (DARPA). Visitors to the UW campus today will no doubt “enjoy” the automated license plate readers (ALPRs) owned by Flock Safety. According to an August 2025 Wisconsin Examiner exposé, there are hundreds of Flock cameras across the state in use by law enforcement agencies, including Wisconsin county sheriff departments with active 287(g) cooperation agreements with Immigration and Customs Enforcement. No warrant is needed for law enforcement agencies to browse the national Flock database. In fact, agents have used Flock to track peaceful protesters, spy on spouses, or just stalk people they don’t like. To see where Flock cameras are near you, visit: www.deflock.me. Of course, Flock Safety has outsourced its AI programming to cheaper (and more secure?) Filipino contractors. Similar AI spying networks such as Pegasus have been widely exposed and have become “bread and butter” for authoritarian regimes from Israel to Saudi Arabia. China and Russia have their own versions (Skynet, SORM, etc.). Thanks to the cozy relationship between Trump and Peter Thiel, the US-based AI mercenary outfit Palantir is now being redeployed for domestic surveillance—a role first revealed via documents leaked by Edward Snowden back in 2017.
The latest executive bluster from Trump is that states’ rights are out the window when it comes to regulating AI data centers—such federal preemption of local democratic control is part of the larger neoliberal “race to the bottom” forced-trade agenda. But the cat is already out of the bag as dozens of communities have successfully blocked AI data center projects and others are poised to do the same based upon their winning strategies. Better yet, this is a bipartisan grassroots organizing issue!
What is the best way to keep out an AI factory farm?

No non-disclosure agreements (NDAs)! These are massive development schemes that could not exist without the approval and support of elected officials, so any agreement should not be secret. They can hardly claim to be providing a public good if they are not subject to transparency and oversight.

No sweetheart deals! Big Data is among the wealthiest sectors of our current economy and does not need or deserve subsidies, discounted electric rates, tax increment financing, property tax holidays, or other incentives. It is a classic move of crony capitalism to privatize the benefits and socialize the costs.

No regulatory loopholes! Given their huge demands for land, water, and energy, Big Data should not be allowed to cut legal corners and needs to follow all the rules of any other normal enterprise—full liability coverage, no special economic zones, consideration of cumulative impacts, protections for ratepayers, no unregulated toxic pollution or illegal water transfers in violation of the Clean Water Act or the Great Lakes Compact, etc. How much water your data center demands is hardly a “trade secret.”
And most important, don’t let Big Data boosters belittle your legitimate concerns as “neo-Luddite!” Everyone uses technology—even the Amish. The real issue is whether or not AI data centers are economically viable, socially appropriate, environmentally sustainable, and actually serve the public interest. People have good reasons to be wary and oppose them on all those fronts.
For more info, check out Big Tech Unchecked: A Toolkit for Community Action, as well as the North Star Data Center Policy Toolkit.
"Musk is not cloaked in some federal immunity just because he's off-again/on-again buddies with Trump."
Elon Musk is facing calls for legal accountability after Grok, the AI chatbot used on his X social media platform, produced sexually suggestive images of children.
Politico reported on Friday that the Paris prosecutor's office in France is opening an investigation into X after Grok, following prompts from users, created deepfake photographs of adult women and underage girls in which their clothing was removed and replaced with bikinis.
Politico added that the investigation into X over the images will "bolster" an ongoing investigation launched by French prosecutors last year into Grok's dissemination of Holocaust denial propaganda.
France is not the only government putting pressure on Musk, as TechCrunch reported on Friday that India's information technology ministry has given X 72 hours to restrict users' ability to generate content deemed "obscene, pornographic, vulgar, indecent, sexually explicit, pedophilic, or otherwise prohibited under law."
Failure to comply with this order, the ministry warned, could lead to the government ending X's legal immunity from being sued over user-generated content.
In an interview with Indian cable news network CNBC TV18, cybersecurity expert Ritesh Bhatia argued that legal liability for the images generated by Grok should not just lie with the users whose prompts generated them, but with the creators of the chatbot itself.
"When a platform like Grok even allows such prompts to be executed, the responsibility squarely lies with the intermediary," said Bhatia. "Technology is not neutral when it follows harmful commands. If a system can be instructed to violate dignity, the failure is not human behavior alone—it is design, governance, and ethical neglect. Creators of Grok need to take immediate action."
Corey Rayburn Yung, a professor at the University of Kansas School of Law, argued on Bluesky that it was "unprecedented" for a digital platform to give "users a tool to actively create" child sexual abuse material (CSAM).
"There are no other instances of a major company affirmatively facilitating the production of child pornography," Yung emphasized. "Treating this as the inevitable result of generative AI and social media is a harrowing mistake."
Andy Craig, a fellow at the Institute for Humane Studies, said that US states should use their powers to investigate X over Grok's generation of CSAM, given that it is unlikely the federal government under President Donald Trump will do so.
"Every state has its equivalent laws about this stuff," Craig explained. "Musk is not cloaked in some federal immunity just because he's off-again/on-again buddies with Trump."
Grok first gained the ability to generate sexual content this past summer when Musk introduced a new "spicy mode" for the chatbot that was immediately used to generate deepfake nude photos of celebrities.
Weeks before this, Grok had begun calling itself "MechaHitler" after Musk ordered his team to tweak the chatbot to make it more "politically incorrect."
"No good comes of having an AI data center near you."
The massive energy needs of artificial intelligence data centers became a major political controversy in 2025, and new reporting suggests that it will grow even further in 2026.
CNBC reported on Thursday that data center projects have become political lightning rods among politicians ranging from Sen. Bernie Sanders (I-Vt.) on the left to Republican Florida Gov. Ron DeSantis on the right.
However, objections to data centers aren't just coming from politicians but from ordinary citizens who are worried about the impact such projects will have on their local environment and their utility bills.
CNBC noted that data centers' energy needs are so great that PJM Interconnection, the largest US grid operator that serves over 65 million people across 13 states, projects that it will be a full six gigawatts short of its reliability requirements in 2027.
Joe Bowring, president of independent market monitor Monitoring Analytics, told CNBC that he's never seen the grid under such projected strain.
"It’s at a crisis stage right now," Bowring said. "PJM has never been this short."
Rob Gramlich, president of power consulting firm Grid Strategies, told CNBC that he expects the debate over data centers to become even more intense this year once Americans start getting socked with even higher utility bills.
"I don't think we've seen the end of the political repercussions," Gramlich said. "And with a lot more elections in 2026 than 2025, we'll see a lot of implications. Every politician is going to be saying that they have the answer to affordability and their opponents' policies would raise rates."
Concerns about data centers' impact on electric grids are rising in both red and blue states.
The Austin American-Statesman reported on Thursday that a new analysis from the office of Austin City Manager TC Broadnax found that data centers could overwhelm the city's system, given that they are projected to need more power than current infrastructure can possibly deliver.
"The speed in which AI is trying to be deployed creates tremendous strain on the already tight resources in both design and construction," says the analysis, which noted that some proposed data centers are seeking more than five gigawatts—more than the peak load for the entire city.
In New York, local station News 10 reported last year that the New York Independent System Operator is estimating that the state's grid could be 1.6 gigawatts short of reliability requirements by 2030 thanks in large part to data centers.
Anger over proposed data centers has even spread to Palm Beach County, Florida, home of President Donald Trump's primary residence, where local residents successfully postponed the construction of a proposed 200-acre data center complex.
According to public news station WLRN, locals opposed to the project cited "expected noise from cooling towers, servers, and diesel generators, along with heavy water use, pollution concerns, and higher utility costs" when petitioning Palm Beach County commissioners to scrap the proposal.
Corey Kanterman, a local opponent of the proposed data center, told WLRN that his goal is to shut the project down entirely.
"No good comes of having an AI data center near you," Kanterman said. "Put them in the location of least impact to the environment and people. This location is not it."