"Big Tech companies have spent the past year cozying up to Trump," said one critic, "and this is their reward. It’s a fabulous return on a very modest investment—at the expense of all Americans.”
The White House is rapidly expanding its efforts to stop state legislatures from protecting their constituents by passing regulations on artificial intelligence technology, with the Trump administration reportedly preparing a draft executive order that would direct the US Department of Justice to target state-level laws in what one consumer advocate called a "blatant and disgusting circumvention of our democracy"—one entirely meant to do the bidding of tech giants.
The executive order would direct Attorney General Pam Bondi to create an AI Litigation Task Force to target laws that have already been passed in both red and blue states and to stop state legislators from passing dozens of bills that have been introduced, including ones to protect people from companion chatbots, require studies on the impact of AI on employment, and bar landlords from using AI algorithms to set rent prices.
The draft order takes aim at California's new AI safety laws, calling them "complex and burdensome" and claiming they are based on "purely speculative suspicion" that AI could harm users.
“States like Alabama, California, New York and many more have passed laws to protect kids from harms of Big Tech AI like chatbots and AI generated [child sexual abuse material]. Trump’s proposal to strip away these critical protections, which have no federal equivalent, threatens to create a taxpayer-funded death panel that will determine whether kids live or die when they decide what state laws will actually apply. This level of moral bankruptcy proves that Trump is just taking orders from Big Tech CEOs,” said Sacha Haworth, executive director of the Tech Oversight Project.
The task force would operate on the administration's argument that the federal government alone is authorized to regulate commerce between states.
Shakeel Hashim, editor of the newsletter Transformer, pointed out that this claim has been pushed aggressively in recent months by the venture capital firm Andreessen Horowitz.
President Donald Trump "and his team seem to have taken that idea and run with it," said Hashim. "It looks a lot like the tech industry dictating government policy—ironic, given that Trump rails against 'regulatory capture' in the draft order."
The DOJ panel would consult with Trump and White House AI Special Adviser David Sacks—an investor and cofounder of an AI company—on which state laws should be challenged.
The executive order would also authorize Commerce Secretary Howard Lutnick to publish a review of "onerous" state AI laws and restrict federal broadband funds to states found to have laws the White House disagrees with. It would further direct the Federal Communications Commission to adopt a federal AI standard that would preempt state laws.
The draft executive order was reported days after Trump called on House Republicans to include a ban on state-level AI regulations in the must-pass National Defense Authorization Act, which House Majority Leader Steve Scalise (R-La.) indicated the party would try to do.
The multipronged effort to stop states from regulating the technology, including AI chatbots that have already been linked to the suicides of children, comes months after the Senate voted 99-1 to strip a similar moratorium on state AI laws from the One Big Beautiful Bill Act.
Travis Hall, director for state engagement at the Center for Democracy and Technology, suggested that legal challenges would be filed swiftly if Trump moves forward with the executive order.
"The president cannot preempt state laws through an executive order, full stop," Hall told NBC News. "Preemption is a question for Congress, which they have considered and rejected, and should continue to reject."
David Dayen, executive editor of The American Prospect, said the harm the draft order could pose becomes clear "once you ask one simple question: What is an AI law?"
The draft doesn't specify, but Dayen posited that a range of statutes could apply: "Is that just something that has to do with [large language models]? Is it anything involving a business that uses an algorithm? Machine learning?"
"You can bet that every company will try to get it to apply to their industry, and do whatever corrupt transactions with Trump to ensure it," he continued. "So this is a roadmap to preempt the vast majority of state laws on business and commerce more generally, everything from consumer protection to worker rights, in the name of preventing 'obstruction' of AI. This should be challenged immediately upon signing."
The draft order was reported amid speculation among tech industry analysts that the AI "bubble" is likely about to burst, with investors dumping their shares in AI chip manufacturer Nvidia and an MIT report finding that 95% of generative AI pilot programs are failing to deliver a return on investment for companies. Executives at tech giant OpenAI recently suggested the government should provide companies with a "guarantee" for developing AI infrastructure—which was widely interpreted as a plea for a bailout.
At Public Citizen, copresident Robert Weissman took aim at the White House for its claim that AI does not pose risks to consumers, noting AI technologies are already "undermining the emotional well-being of young people and adults and, in some cases, contributing to suicide; exacerbating racial disparities at workplaces; wrongfully denying patients healthcare; driving up electric bills and increasing greenhouse gas emissions; displacing jobs; and undermining society’s basic concept of truth."
Furthermore, he said, the president's draft order proves that "for all his posturing against Big Tech, Donald Trump is nothing but the industry’s well-paid waterboy."
"Big Tech companies have spent the past year cozying up to Trump—doing everything from paying for his garish White House ballroom to adopting content moderation policies of his liking—and this is their reward," said Weissman. "It’s a fabulous return on a very modest investment—at the expense of all Americans.”
JB Branch, the group's Big Tech accountability advocate, added that instead of respecting the Senate's bipartisan rejection of the earlier attempt to stop states from regulating AI, "industry lobbyists are now running to the White House."
"AI scams are exploding, children have died by suicide linked to harmful online systems, and psychologists are warning about AI-induced breakdowns, but President Trump is choosing to protect his tech oligarch friends over the safety of middle-class Americans," said Branch. "The administration should stop trying to shield Silicon Valley from responsibility and start listening to the overwhelming bipartisan consensus that stronger, not weaker, safeguards are needed.”
A call for a new labor Bill of Rights in the age of automation.
Ask the warehouse worker training her replacement robot if progress feels inevitable.
Automation is not destiny. It is design, and design can be changed.
Internal Amazon documents reveal plans to replace more than half a million warehouse workers with robots by 2033. Executives call it innovation. Investors call it efficiency. The workers who made the company what it is call it what it feels like: erasure disguised as progress.
If Amazon can erase 500,000 jobs without consequence, every company will follow. Walmart is rolling out automated checkout. Target is testing robotic fulfillment. UPS and FedEx are developing delivery drones. Each step is described as modernization, but modernization without accountability becomes abandonment.
The United States cannot afford another era of abandonment. Since 1979, productivity has risen by more than 80%, while hourly pay for most workers has barely moved. Automation threatens to widen that divide until it defines the economy itself.
Technology is not the enemy. The problem is who it serves. Every robot that replaces a worker transfers income from wages to shareholders. Every algorithm that eliminates a job turns public innovation into private accumulation. The challenge before us is not to resist progress but to govern it.
In this political moment, that may sound impossible. Washington is consumed by austerity and spectacle. The Trump administration’s second term has stripped worker protections, defunded training programs, and rewarded corporations that offshore or automate without oversight. But political cycles end, and public memory lasts. As the country heads toward the 2026 midterms and the 2028 presidential election, progressives have a rare opening to propose something larger than repair. We can build a new social contract for the automated age—a Labor Bill of Rights that reclaims the meaning of work and the purpose of progress.
That contract should rest on three pillars: profit sharing, a national transition fund, and public oversight.
The first pillar is profit sharing for automation gains.
When technology increases productivity, a share of those gains should go to the workers who make that productivity possible. France has required large firms to share profits with employees since 1967. Germany ensures worker representation on corporate boards, which prevents modernization from becoming a zero-sum game between labor and capital.
The United States could enact a federal profit-sharing mandate for companies with more than 250 employees or over $1 billion in annual revenue. When automation reduces a company’s payroll by more than 5% in a given year, that company would distribute at least 5% of its net profits as direct employee bonuses or shares. This could be structured through the tax code as a refundable surtax on undistributed automation profits.
If a company eliminates thousands of jobs to cut costs, it would still owe a share of its gains to the people and places that built its success. The rule would keep disposable income in circulation, prevent automation from collapsing demand, and ensure that the people who make automation possible continue to benefit from it.
The second pillar is a national automation transition fund.
Corporations that profit from replacing human labor should help finance the transition for those affected. The fund would be financed by an automation contribution: a 1-2% levy on the annual revenue of large firms that automate more than 5% of their workforce in any 12-month period. The Department of Labor would administer the fund through three channels.
First, wage insurance would guarantee workers at least 70% of their prior income for up to two years while they retrain or find new employment. Second, community investment grants would go directly to counties or cities experiencing major automation-driven job loss, funding small business development, infrastructure, and public employment programs. Third, an innovation dividend would fund training in fields that cannot easily be automated, such as healthcare, renewable energy, and education.
The fund could be modeled on unemployment insurance, with employer contributions adjusted annually based on automation activity. For example, if Amazon eliminated 500,000 jobs averaging $35,000 annually, a 2% contribution on its revenue—roughly $12 billion per year—would cover retraining, income support, and regional stabilization. This policy would turn automation from a corporate windfall into a shared investment in the country’s future.
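As a rough sanity check on those figures, here is a minimal sketch of the arithmetic in Python; the revenue figure of roughly $600 billion is an assumption used for illustration only, while the jobs and salary numbers are the ones cited above.

# Back-of-the-envelope sketch of the proposed automation contribution.
# The revenue figure is assumed for illustration; the jobs and salary
# figures come from the example in the text.
assumed_annual_revenue = 600e9   # assumed: roughly Amazon-scale yearly revenue, in dollars
levy_rate = 0.02                 # 2% automation contribution on revenue
jobs_eliminated = 500_000        # jobs cited in the example
average_salary = 35_000          # average annual pay cited in the example

annual_contribution = assumed_annual_revenue * levy_rate  # about $12 billion per year
displaced_wages = jobs_eliminated * average_salary        # about $17.5 billion per year

print(f"Fund contribution: ${annual_contribution / 1e9:.1f} billion per year")
print(f"Wages displaced:   ${displaced_wages / 1e9:.1f} billion per year")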
The third pillar is public oversight of large-scale automation.
Just as environmental laws require companies to study and disclose the effects of pollution, corporations that plan to replace significant numbers of workers should disclose the social impacts of automation before acting. Any company planning to eliminate more than 250 jobs in a single year through automation should file an automation impact assessment with the Department of Labor.
The report would detail expected job losses, affected regions, and projected cost savings. It would also include a transition plan describing how the company will use part of those savings to fund retraining, relocation assistance, or community support. The Department of Labor would then coordinate with local governments and unions to review the plan, identify gaps, and recommend mitigation measures.
Failure to file or implement such a plan would carry penalties scaled to company size and profits. Repeat offenders could lose access to federal contracts and tax credits, or face fines proportional to their earnings. Transparency alone changes incentives. Once corporations must account for the social cost of their decisions, they begin to consider the communities they affect.
Together, these pillars would reattach innovation to justice. Profit sharing would reconnect wages and productivity. The transition fund would convert private efficiency gains into public stability. Oversight would replace secrecy with accountability.
None of this is radical. It is the next step in the unfinished project of democracy. When Franklin Roosevelt proposed an Economic Bill of Rights in 1944, he named the right to a useful job, to fair wages, to security, and to education as the foundations of freedom. We never completed that work. The next generation of progressives can.
That opportunity will not come from Congress as it stands. It will come from a national movement that links labor, climate, and democracy into one fight for a livable economy. The 2026 midterms will likely mark the beginning of that realignment, as voters look for something larger than a defense against decline. The 2028 election could be the first since the New Deal where a coalition wins not by promising safety, but by promising transformation.
Technology does not determine our future. Politics does. A robot can replace a worker, but it cannot replace the dignity of work or the shared purpose of a nation. If we fail to govern this transition, we will inherit an economy that no longer needs its citizens. If we succeed, we can create one where technology frees people from insecurity, not from income.
The wealth created by automation rests on a foundation built by the public. The internet that powers online retail began as a government project. The logistics networks that deliver goods rely on public roads and ports. The data that trains artificial intelligence is drawn from our collective lives. The returns should flow back into the society that made them possible.
The coming decade will decide whether automation serves democracy or displaces it. Progressives have a rare chance to lead with vision instead of reaction. The task is not to slow innovation but to make it answer to the people. The future of work must belong to workers—and that future begins when we decide that technology will serve humanity, not replace it.
"I wouldn't touch this stuff now," warned one financial analyst about the AI industry.
Several analysts are sounding alarms about the artificial intelligence industry being a major financial bubble that could potentially tip the global economy into a severe recession.
MarketWatch reported on Friday that the MacroStrategy Partnership, an independent research firm, has published a new note claiming that the bubble generated by AI is now 17 times larger than the dot-com bubble in the late 1990s, and four times bigger than the global real-estate bubble that crashed the economy in 2008.
The note was written by a team of analysts, including Julien Garran, who previously led the commodities strategy team at multinational investment bank UBS.
Garran contends that companies have vastly overhyped the capabilities of AI large language models (LLMs), and he pointed to data showing that the adoption rate of LLMs among large businesses has already started to decline. He also thinks that OpenAI's flagship chatbot, ChatGPT, may have "hit a wall" with its latest release, which he said hasn't delivered noticeably better performance than previous releases, despite costing 10 times as much.
The consequences for the economy, he warns, could be dire.
"The danger is not only that this pushes us into a zone 4 deflationary bust on our investment clock, but that it also makes it hard for the Fed and the Trump administration to stimulate the economy out of it," he writes in the investment note.
Garran isn't the only analyst expressing extreme anxiety about the potential for an AI bubble to bring down the economy.
In a Friday interview with Axios, Dario Perkins, managing director of global macro at TS Lombard, said that tech companies are increasingly taking on massive debts in their race to build out AI data centers in a way that is reminiscent of the debts held by companies during the dot-com and subprime mortgage bubbles.
Perkins told Axios that he's particularly wary because the big tech companies are claiming "they don't care whether the investment has any return, because they're in a race."
"Surely that in itself is a red flag," he added.
CNBC reported on Friday that Goldman Sachs CEO David Solomon told an audience at the Italian Tech Week conference that he expected a "drawdown" in the stock market over the next year or two, given that so much money has been pumped into AI ventures in such a short time.
"I think that there will be a lot of capital that’s deployed that will turn out to not deliver returns, and when that happens, people won’t feel good," he said.
Solomon wouldn't go so far as to definitively declare AI to be a bubble, but he did say some investors are "out on the risk curve because they’re excited," which is a telltale sign of a financial bubble.
According to CNBC, Amazon founder Jeff Bezos, who was also attending Italian Tech Week, said on Friday that there was a bubble in the AI industry, although he insisted that the technology would be a major benefit for humanity.
"Investors have a hard time in the middle of this excitement, distinguishing between the good ideas and the bad ideas," Bezos said of the AI industry. "And that’s also probably happening today."
Perkins made no predictions about when the AI bubble will pop, but he argued that it's definitely much closer to the end of the cycle than the beginning.
"I wouldn't touch this stuff now," he told Axios. "We're much closer to 2000 than 1995."