

This past year, our communities were hit with skyrocketing power bills as electricity prices increased at double the rate of inflation. A new Sierra Club tool shows that, to make matters worse, utility companies in the US are planning a massive gas buildout, and it’s going to cost everyday American families even more.
The Sierra Club’s new gas plant tracker shows that utilities are planning to build 271 gigawatts of new gas power plant capacity across more than 480 expensive, polluting gas plants. That is over 40% more than all the coal capacity still online, and it would increase the gas power plant capacity currently online by nearly 50% nationwide.
These companies have drastically increased their plans for new gas in the last few years, more than doubling planned gas power plants since the start of 2020.
This is a massive proposed buildout of new fossil fuel infrastructure that stretches across the country; new gas power plants are currently planned in 42 states. Texas has the most planned gas power plant capacity of any state, followed by Georgia, Indiana, Virginia, Missouri, and Arizona.
What do these states, spanning across the country, have in common? Data centers. All of these states face major data center proposals.
Gas power plants already provide more electricity for data center use than any other fuel, and that portion is predicted to grow without more renewable buildout. In 2024 in the US, new data center demand rivaled the amount of clean energy brought online. Data center demand is set to far exceed clean energy additions in 2025 through 2028.
Data center demand projections are still highly uncertain, meaning this level of demand may not materialize. Instead of carefully assessing this uncertainty, utilities have been too quick to propose ever more gas power plants, leaving customers on the hook to foot the bill.
Southern Company, which operates electric utilities primarily in Georgia, Alabama, and Mississippi, has the most planned gas power plant capacity of any parent company—over 20 gigawatts. This planned gas buildout is directly tied to data center proposals; for example, in Georgia, Southern subsidiary Georgia Power is planning a historic buildout of new resources specifically to serve growing demand, which is driven by data centers. What does Georgia Power want to make up the majority of that buildout? New gas power plants.
If you’ve recently looked at your utility bill and wondered why your energy costs have skyrocketed, you’re not alone. In 2025, households on average paid nearly 10% more on their utility bills than in 2024, outpacing wage growth and overall inflation. These plans to add even more gas power plants will continue to drive up our bills.
When a utility company decides to build a new gas power plant, the money it takes to build and maintain it does not come from the utility’s CEO or the Big Tech companies who want more electricity; we pay for the gas power plant in our utility bills every month. The cost of building and maintaining gas power plants has significantly and persistently increased in the US, contributing to increased prices for customers across the country. In contrast, the cost of renewables continues to fall.
When our utility companies fail to build enough low-cost clean electricity and instead increase their reliance on expensive fossil fuel power plants—as many are doing in response to data center demand—electricity prices rise.
In Virginia, for example, Dominion Energy is planning to build a massive new gas power plant that will cost Virginians at least $8 billion by the utility’s own estimates over the lifetime of the plant; the gas power plant is part of a buildout that Dominion says is necessary due to data center growth. Dominion projects that residential electric bills will more than double over the next 15 years, primarily due to data centers’ growing energy needs.
In Missouri, Ameren wants to build multiple new gas power plants to serve data centers. A single one of those gas power plants is expected to cost $900 million up front, before taking into account the volatile cost of fuel and maintenance needed throughout the plant’s lifetime. The same story is playing out across the country.
We deserve better. Data center developers and utilities can stop this onslaught of plans for new gas power plants and rely on affordable, available clean energy options instead. With proper planning, both data center developers and utilities can be part of the solution. In the meantime, we’ll continue to track utilities’ plans for new gas power plants, and you can join us to push utilities and data center developers to make better, cheaper, healthier decisions.

The buildout of lots and lots of power-gobbling data centers is not as inevitable as it appears.
Caveat—this post was written entirely with my own intelligence, so who knows. Maybe it’s wrong.
But the despairing question I get asked most often is: “What’s the use? However much clean energy we produce, AI data centers will simply soak it all up.” It’s too early in the course of this technology to know anything for sure, but there are a few important answers to that.
The first comes from Amory Lovins, the long-time energy guru who wrote a paper some months ago pointing out that energy demand from AI was highly speculative, an idea he based on… history:
In 1999, the US coal industry claimed that information technology would need half the nation’s electricity by 2020, so a strong economy required far more coal-fired power stations. Such claims were spectacularly wrong but widely believed, even by top officials. Hundreds of unneeded power plants were built, hurting investors. Despite that costly lesson, similar dynamics are now unfolding again.
As Debra Kahn pointed out in Politico a few weeks ago:
So far, data centers have only increased total US power demand by a tiny amount (they make up roughly 4.4 percent of electricity use, which rose 2 percent overall last year).
And it’s possible that even if AI expands as its proponents expect, it will grow steadily more efficient, meaning it would need much less energy than predicted. Lovins again:
For example, NVIDIA’s head of data center product marketing said in September 2024 that in the past decade, “we’ve seen the efficiency of doing inferences in certain language models has increased effectively by 100,000 times. Do we expect that to continue? I think so: There’s lots of room for optimization.” Another NVIDIA comment reckons to have made AI inference (across more models) 45,000× more efficient since 2016, and expects orders-of-magnitude further gains. Indeed, in 2020, NVIDIA’s Ampere chips needed 150 joules of energy per inference; in 2022, their Hopper successors needed just 15; and in 2024, their Blackwell successors needed 24 but also quintupled performance, thus using 31× less energy than Ampere per unit of performance. (Such comparisons depend on complex and wide-ranging assumptions, creating big discrepancies, so another expert interprets the data as up to 25× less energy and 30× better performance, multiplying to 750×.)
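The 31× figure above is easy to sanity-check from the joules-per-inference numbers in the quote; the arithmetic below is just a back-of-envelope reconstruction, not NVIDIA’s own methodology:

```python
# Back-of-envelope check of the chip-efficiency figures quoted above.
ampere_j = 150      # joules per inference, Ampere (2020)
hopper_j = 15       # joules per inference, Hopper (2022), for context
blackwell_j = 24    # joules per inference, Blackwell (2024)...
blackwell_perf = 5  # ...but at roughly 5x the performance ("quintupled")

# Blackwell's energy per unit of performance: 24 J / 5 = 4.8 J
blackwell_per_unit = blackwell_j / blackwell_perf

# Improvement over Ampere, per unit of performance
improvement = ampere_j / blackwell_per_unit
print(round(improvement))  # prints 31, matching the "31x less energy" claim
```

As the parenthetical in the quote notes, the answer swings widely with how "performance" is defined, which is why other analysts land anywhere from 25× to 750×.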
But that doesn’t mean that the AI industry, and its utility and government partners, won’t try to build ever more generating capacity to supply whatever power needs they project may be coming. In some places they already are: Internet Alley in Virginia has more than 150 large data centers, which use a quarter of the state’s power. This is becoming an intense political issue in the Old Dominion State. As Dave Weigel reported yesterday, the issue has begun to roil Virginia politics—the GOP candidate for governor sticks with her predecessor, Glenn Youngkin, in somehow blaming solar energy for rising electricity prices (“the sun goes down”), while the Democratic nominee, Abigail Spanberger, is trying to figure out a response:
Neither nominee has gone as far in curbing growth as many suburban DC legislators and activists want. They see some of the world’s wealthiest companies getting plugged into the grid without locals reaping the benefits. Some Virginia elections have turned into battles over which candidate will be toughest on data centers; other elections have already been lost over them.
“My advice to Abigail has been: Look at where the citizens of Virginia are on the data centers,” said state Sen. Danica Roem, a Democrat who represents part of Prince William County in DC’s growing suburbs. “There are a lot of people willing to be single-issue, split-ticket voters based on this.”
Indeed, it’s shaping up to be the mother of all political issues as the midterms loom—pretty much everyone pays electric rates, and under President Donald Trump they’re starting to skyrocket. The reason isn’t hard to figure out: He’s simultaneously accelerating demand with his support for data center buildout, and constricting supply by shutting down cheap solar and wind. In fact, one way of looking at AI is that its main use is as a vehicle to give the fossil fuel industry one last reason to expand.
If this sounds conspiratorial, consider this story from yesterday: John McCarrick, newly hired by industry colossus OpenAI to find energy sources for ChatGPT, is:
an official from the first Trump administration who is a dedicated champion of natural gas.
John McCarrick, the company’s new head of Global Energy Policy, was a senior energy policy advisor in the first Trump administration’s Bureau of Energy Resources in the Department of State while under former Secretaries of State Rex Tillerson and Mike Pompeo.
As deputy assistant secretary for Energy Transformation and the special envoy for International Energy Affairs, McCarrick promoted exports of American liquefied natural gas to Europe in the wake of the Russian invasion of Ukraine, and advocated for Asian countries to invest in natural gas.
The choice to hire McCarrick matches the intentions of OpenAI’s Trump-dominating CEO Sam Altman, who said in a U.S. Senate hearing in May that “in the short term, I think [the future of powering AI] probably looks like more natural gas.”
Sam Altman himself is an acolyte of Peter Thiel, famous climate denier who recently suggested Greta Thunberg might be the anti-Christ. But it’s all of them. In the rush to keep their valuations high, the big AI players are increasingly relying not just on fracked gas but on the very worst version of it. As Bloomberg reported early in the summer:
The trend has sparked an unlikely comeback for a type of gas turbine that long ago fell out of favor for being inefficient and polluting… a technology that’s largely been relegated to the sidelines of power production: small, single cycle natural gas turbines.
In fact, big suppliers are now companies like Caterpillar, not known for cutting edge turbine technology; these are small and comparatively dirty units.
(The ultimate example of this is Elon Musk’s Colossus supercomputer in Memphis, a superpolluter, which I wrote about for the New Yorker.) Oh, and it’s not just air pollution. A new threat emerged in the last few weeks, according to Tom Perkins in the Guardian:
Advocates are particularly concerned over the facilities’ use of Pfas gas, or f-gas, which can be potent greenhouse gases, and may mean datacenters’ climate impact is worse than previously thought. Other f-gases turn into a type of dangerous compound that is rapidly accumulating across the globe.
No testing for Pfas air or water pollution has yet been done, and companies are not required to report the volume of chemicals they use or discharge. But some environmental groups are starting to push for state legislation that would require more reporting.
Look, here’s one bottom line: If we actually had to build enormous networks of AI data centers, the obvious, cheap, and clean way to do it would be with lots of solar energy. It goes up fast. As an industry study found as long ago as December of 2024 (an eon in AI time):
Off-grid solar microgrids offer a fast path to power AI datacenters at enormous scale. The tech is mature, the suitable parcels of land in the US Southwest are known, and this solution is likely faster than most, if not all, alternatives.
As one of the country’s leading energy executives said in April:
“Renewables and battery storage are the lowest-cost form of power generation and capacity,” according to NextEra chief executive John Ketchum. “We can build these projects and get new electrons on the grid in 12 to 18 months.”
But we can’t do that because the Trump administration has a corrupt ideological bias against clean energy, the latest example of which came last week when a giant Nevada solar project was cancelled. As Jael Holzman was the first to report:
Esmeralda 7 was supposed to produce a gargantuan 6.2 gigawatts of power–equal to nearly all the power supplied to southern Nevada by the state’s primary public utility. It would do so with a sprawling web of solar panels and batteries across the western Nevada desert. Backed by NextEra Energy, Invenergy, ConnectGen, and other renewables developers, the project was moving forward at a relatively smooth pace under the Biden administration.
But now it’s dead. One result will be higher prices for consumers. Despite everything the administration does, renewables are so cheap and easy that markets just keep choosing them. To beat that means policy as perverse as what we’re seeing—jury-rigging tired gas turbines and refitting ancient coal plants. All to power a technology that… seems increasingly like a bubble?
Here we need to get away from energy implications a bit, and just think about the underlying case for AI, and specifically the large language models that are the thing we’re spending so much money and power on. The AI industry is, increasingly, the American economy—it accounts for almost half of US economic growth this year, and an incredible 80% of the expansion of the stock market. As Ruchir Sharma wrote in the FT last week, the US economy is “one big bet” on AI:
The main reason AI is regarded as a magic fix for so many different threats is that it is expected to deliver a significant boost to productivity growth, especially in the US. Higher output per worker would lower the burden of debt by boosting GDP. It would reduce demand for labour, immigrant or domestic. And it would ease inflation risks, including the threat from tariffs, by enabling companies to raise wages without raising prices.
But for this happy picture to come to pass, AI has to actually work, which is to say do more than help kids cheat on their homework. And there’s been a growing sense in recent months that all is not right on that front. I’ve been following two AI skeptics for a year or so, both on Substack (where, increasingly, in-depth and unorthodox reporting goes to thrive).
The first is Gary Marcus, an AI researcher who has concluded that large language models like ChatGPT are going down a blind alley. If you like to watch video, here is an encapsulation of his main points, published over the weekend. If you prefer that old-fashioned technology of reading (call me a Luddite, but it seems faster and more efficient, and much easier to excerpt), here’s his recent account from the Times explaining why businesses are having trouble finding reasons to pay money for this technology:
Large language models have had their uses, especially for coding, writing, and brainstorming, in which humans are still directly involved. But no matter how large we have made them, they have never been worthy of our trust.
Indeed, an MIT study this year found that 95% of businesses reported no measurable increase in productivity from using AI; the Harvard Business Review, a couple of weeks ago, said AI “workslop” was cratering productivity.
And what that means, in turn, is that there’s no real way to imagine recovering the hundreds of billions and trillions that are currently being invested in the technology. The keeper of the spreadsheets is the other Substacker, Ed Zitron, who writes extremely long and increasingly exasperated essays looking at the financial lunacy of these “investments” which, remember, underpin the stock market at the moment. Here’s last week’s:
In fact, let me put it a little simpler: All of those data center deals you’ve seen announced are basically bullshit. Even if they get the permits and the money, there are massive physical challenges that cannot be resolved by simply throwing money at them.
Today I’m going to tell you a story of chaos, hubris and fantastical thinking. I want you to come away from this with a full picture of how ridiculous the promises are, and that’s before you get to the cold hard reality that AI fucking sucks.
I’m not pretending this is the final word on this subject. No one knows how it’s all going to work out, but my guess is: badly. Already it’s sending electricity prices soaring and increasing fossil fuel emissions.
But maybe it’s also running into other kinds of walls that will eventually reduce demand. Maybe human beings will decide to be… human. The new Sora “service” launched by OpenAI that allows your AI to generate fake videos, for instance, threatens to undermine the entire business of looking at videos because… what’s the point? If you can’t tell if the guy eating a ridiculously hot chili pepper is real or not, why would you watch? In a broader sense, as John Burn-Murdoch wrote in the FT (and again how lucky Europe is to have a reputable business newspaper), we may be reaching “peak social media”:
It has gone largely unnoticed that time spent on social media peaked in 2022 and has since gone into steady decline, according to an analysis of the online habits of 250,000 adults in more than 50 countries carried out for the FT by the digital audience insights company GWI.
And this is not just the unwinding of a bump in screen time during pandemic lockdowns—usage has traced a smooth curve up and down over the past decade-plus. Across the developed world, adults aged 16 and older spent an average of two hours and 20 minutes per day on social platforms at the end of 2024, down by almost 10 per cent since 2022. Notably, the decline is most pronounced among the erstwhile heaviest users—teens and 20-somethings.
Which is to say: Perhaps at some point we’ll begin to come to our senses and start using our brains and bodies for the things they were built for: contact with each other, and with the world around us. That’s a lot to ask, but the world can turn in good directions as well as bad. As a final word, there’s this last week from Pope Leo, speaking to a bunch of news executives around the world, and imploring them to cool it with the junk they’re putting out:
Communication must be freed from the misguided thinking that corrupts it, from unfair competition, and the degrading practice of so-called clickbait.
Stay tuned. This story will have a lot to do with how the world turns out.
AI is everywhere. But its powerful computing comes with a big cost to our planet, our neighborhoods, and our wallets.
AI servers are so power hungry that utilities are keeping coal-fired power plants that were slated for closure running to meet the needs of massive servers. And in the South alone, there are plans for 20 gigawatts of new natural-gas power plants over the next 15 years—enough to power millions of homes—just to feed AI’s energy needs.
Multibillion dollar companies like Microsoft, Google, Amazon, and Meta that previously committed to 100% renewable energy are going back to the Jurassic Age, using fossil fuels like coal and natural gas to meet their insatiable energy needs. Even nuclear power plants are being reactivated to meet the needs of power-hungry servers.
At a time when we need all corporations to reduce their climate footprint, carbon emissions from major tech companies skyrocketed in 2023 to 150% of their average 2020 levels.
AI data centers also produce massive noise pollution and use huge amounts of water. Residents near data centers report that the sound keeps them awake at night and their taps are running dry.
Many of us live in communities that either have or will have a data center, and we’re already feeling the effects. Many of these facilities further burden communities already struggling with a lack of economic investment, access to basic resources, and exposure to high levels of pollution.
To add insult to injury, amid stagnant wages and increasing costs for food, housing, utilities, and consumer goods, AI’s demand for power is also raising electric rates for customers nationwide. To meet AI data centers’ soaring demand for energy, utilities need to build new infrastructure, the cost of which is passed on to all customers.
A recent Carnegie Mellon study found that AI data centers could increase electric rates by 25% in Northern Virginia by 2030. And NPR recently reported that AI data centers were a key driver in electric rates increasing twice as fast as the cost of living nationwide—at a time when 1 in 6 households are struggling to pay their energy bills.
All of these impacts are only projected to grow. AI already consumes enough electricity to power 7 million American homes. By 2028, that could jump to the amount of power needed for 22% of all US households.
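To get a sense of the growth that projection implies, here is a rough scale check; the ~132 million figure for US households is my own assumption (approximately the Census count), not a number from this article:

```python
# Rough scale check of the 2028 projection above.
# ASSUMPTION: ~132 million US households (approximate Census figure).
us_households = 132_000_000
homes_today = 7_000_000   # homes' worth of power AI consumes now (per the article)
projected_share = 0.22    # projected share of US households' power by 2028

projected_homes = us_households * projected_share  # ~29 million homes' worth
growth_factor = projected_homes / homes_today      # roughly quadruple today's use
print(f"{projected_homes / 1e6:.0f}M homes, ~{growth_factor:.0f}x today")
# prints: 29M homes, ~4x today
```

In other words, under these assumptions the projection amounts to AI's electricity appetite roughly quadrupling in just a few years.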
But it doesn’t have to be this way.
AI could be powered by renewable energy that is nonpolluting and works to reduce energy costs for us all. The leading AI companies, who have made significant climate pledges, must lead the way.
Microsoft, Google, Amazon, and Meta have all made promises to the communities they serve to tackle climate and pollution. They all have climate pledges. And they have made significant investments in renewable energy in the past.
Those investments make sense, since renewables are the most affordable form of electricity. These companies have the know-how and the wealth to power AI with wind, solar, and batteries—which makes it all the more puzzling that they’re relying on fossil fuels to power the future.
If these corporate giants are to be good neighbors, they first need to be open and honest about the scope and scale of the problem and the solutions needed.
As these companies invest billions in technology for AI, they must re-up investments in renewables to power our future and protect our communities. They must ensure that communities have a real voice in how and where AI data centers are built—and that our communities aren’t sacrificed in the name of profits.