

Nuclear films have been canaries in the uranium mine; each resurgence has coincided with waves of nuclear escalation. What does it mean that top directors are tackling the subject again?
As the nuclear threat once again dominates the headlines, the nuclear blockbuster has returned to screens. Following the success of Christopher Nolan’s Oppenheimer, Kathryn Bigelow’s A House of Dynamite reminds us that the atomic bomb is not mere history. Equal parts political thriller and apocalyptic horror, the film compels viewers to imagine the unimaginable, refusing any illusion of security or assurance that catastrophe could not happen here.
There is no catharsis, no safe distance from which to retreat. What we confront on screen is not fiction but the collective madness of our current reality. The dread that Bigelow conjures does not dissipate with the closing credits; it follows us out of the theater, past the exit signs, and into a world where the possibility of instant annihilation remains stitched into everyday life.
Such films are hardly new, though their resurgence should give us pause. Since 1945, Hollywood has capitalized on the mix of fear and fascination unleashed by the atomic age. In response, studios have produced roughly 1,000 nuclear-themed films, a cinematic proliferation mirroring the buildup of nuclear arsenals (70,000 warheads by 1986). As scholar Jerome Shapiro observes, these works became “a statistically important part of the American filmgoer’s diet” for decades.
Yet atomic cinema has always been more than entertainment. It has served as both warning and witness, shaping a collective consciousness of the bomb. Within the secretive and anti-democratic architecture of the nuclear security state, these films often served as cultural critique and political resistance, piercing the veil of official classification, challenging the monopoly of defense experts, and democratizing a debate otherwise foreclosed to the public.
Nuclear films have been canaries in the uranium mine. Each resurgence has coincided with waves of nuclear escalation. But they have also served as calls to action, catalyzing mass movements demanding disarmament. To understand what this revival signals, and what more is needed to reignite the anti-nuclear movement, it is worth revisiting the earlier cycles of Cold War filmmaking that were shaped by and informed nuclear and popular culture.
The first major wave of atomic cinema emerged in the late 1950s and early 1960s. This was a time in which the United States had lost its temporary monopoly on nuclear force and both Washington and Moscow were producing, testing, and stockpiling weapons a thousand times more destructive than the now nearly obsolete fission bombs that had obliterated Hiroshima and Nagasaki. In less than a decade, the atomic bomb had evolved from a “city-killer” to a “nation-killer.”
The earliest films of this era confronted a growing public anxiety over radiation. They represented a response to official efforts to downplay the danger, as politicians prioritized the management of political fallout over the prevention of radioactive fallout. From the very beginning, there emerged attempts to trivialize radiation, none more infamous than General Leslie Groves’ 1945 remark that radiation poisoning was a “very pleasant way to die.”
By 1954, the Lucky Dragon incident had made the deadly consequences of nuclear contamination impossible to ignore. Science fiction movies such as Them! and Godzilla, both released that same year, translated these fears into monstrous allegories. Yet even as such films dramatized the terror of nuclear technology, official discourse worked to normalize it.
Strategic war planners like Herman Kahn embodied the technocratic detachment of the emerging nuclear priesthood. By 1960, Kahn was publicly arguing that nuclear war was winnable and that even scenarios resulting in tens of millions of deaths would not ultimately preclude “normal and happy lives for the majority of survivors and their descendants.”
He also dismissed concerns about radiation, insisting that the numbers of children born “seriously defective” due to such exposure would rise by “only” 10%. Noting that there are still birth defects in peacetime, he concluded, “War is a terrible thing; but so is peace.” Such statements shocked the public, revealing the moral vacancy of those entrusted with preserving life and preventing death in the atomic age, fueling a growing fear that ordinary people might be sacrificed on the altar of Cold War credibility.
After the October 1962 Cuban Missile Crisis, when the United States came to the brink of serving as ground zero of nuclear annihilation, many Americans fully awoke to the insanity. The public backlash, led by SANE and Women’s Strike for Peace, that followed the near catastrophe helped push President John F. Kennedy to sign the Limited Test Ban Treaty of 1963, a rare moment when popular and political pressure combined to produce tangible reform.
In response also came Stanley Kubrick’s Dr. Strangelove (1964), one of the most indelible films of the Cold War. Drawing inspiration from figures such as Kahn, it confronted the suicidal nihilism of the defense intellectuals. Kubrick’s dark satire exposed the Nazi-like madness underwriting “rational” deterrence, ridiculing mutual destruction while indicting the US for deepening the peril through its own policies. These ranged from stationing nuclear weapons in Turkey to attempting to overthrow Castro, actions that helped manufacture a crisis that threatened to extinguish the lives of as many as 200 million North Americans and even more Soviet citizens, to say nothing of the many millions more deaths perversely termed “collateral damage.”
Two decades later, a new wave of films emerged amid another period of nuclear escalation. Their arrival in 1979 marked what many remember as the spark that reignited the anti-nuclear movement after more than a decade of dormancy. That year saw the rise of presidential candidate Ronald Reagan, whose rhetoric revived the language of nuclear confrontation, alongside two disasters that reawakened fears of radiation: the partial meltdown at Three Mile Island and the uranium mill spill at Church Rock, New Mexico. Together, these events rekindled public anxiety about the existential dangers of nuclear weapons and deepened fears surrounding nuclear power.
Reagan’s campaign and subsequent presidency again advanced the chilling notion that nuclear war might be winnable, even at the cost of millions of lives. This sentiment persisted despite scientists warning of “nuclear winter,” stressing that a nuclear exchange could devastate the atmosphere and result in “omnicide,” the death of all life on Earth. The mix of apocalyptic scientific doomsaying and bellicose political posturing sent fear soaring. By the early 1980s, polls showed that nearly half of Americans believed they might die in a nuclear war.
Released just 12 days before the disaster at Three Mile Island, The China Syndrome (1979) captured this mounting dread with eerie prescience. What began as a fictional thriller about a near meltdown quickly became a public relations catastrophe for nuclear power. The film’s portrayal of institutional corruption and bureaucratic negligence, alongside the industry’s efforts to dismiss it as propaganda, undermined official narratives on nuclear safety. In a post-Vietnam, post-Watergate America defined by cynicism and mistrust, the movie crystallized public anxieties about nuclear power and the broader dangers of corporate and governmental deceit.
Under Reagan, the popular energy unleashed by this moment coalesced into a mass movement. By 1982, anti-nuclear activism had reached its apex. On June 12, some 1 million demonstrators filled the streets of New York City for what remains the largest single protest rally in American history. Their message was unambiguous: The nuclear status quo was intolerable.
Their reach extended far beyond the streets. The Nuclear Freeze campaign mobilized communities across the country, while the 1983 ABC television film The Day After brought the horror of nuclear annihilation directly into American living rooms. More than 100 million people, including the president himself, watched as the bucolic Midwestern town of Lawrence, Kansas, emblematic of the American heartland, was reduced to a radioactive wasteland. The film remains one of the most searing depictions of nuclear war ever produced. Yet, as historian Paul Boyer observed, even the most devastating portrayals inevitably fall short, since the only truly accurate nuclear war film, he wrote, “would be two hours of a blank screen.”
But public pressure grew impossible to ignore. In a remarkable reversal, Reagan declared that “a nuclear war cannot be won and must never be fought,” beginning direct talks with Soviet leader Mikhail Gorbachev to pursue arms reductions. This marked at least the second time that protests rendered potential nuclear weapons use not only morally unimaginable but also politically untenable (the other being the 1969 Vietnam Moratorium protests, which helped dissuade President Richard Nixon from carrying out a contemplated nuclear strike against North Vietnam).
The two major nuclear films of the past two years are significant cultural events. They have revived an apocalyptic imagination and a sense of nuclear consciousness that are essential if we are ever to confront the nuclear nightmare and end the arms race before it ends us. Yet they also fall short in critical ways, and risk being remembered as great films that stirred awareness but failed to inspire the resistance necessary to meet this perilous moment.
Oppenheimer, despite its cinematic brilliance, was a missed opportunity to reckon with Hiroshima and Nagasaki. Rather than compelling audiences to confront historical responsibility, it offered a familiar narrative of tragic necessity. The bomb had to be built; the bombings, though regrettable, were justified. The moral center of the story was not the victims in Japan but Oppenheimer himself, the tormented “American Prometheus.” The result was not reckoning but retreat into myth.
A House of Dynamite follows a similar trajectory. The film is powerful and unsettling, reminding viewers that any city, and the world, could be reduced to ashes within minutes. It captures the immediacy of the danger and the near impossibility of containing a “limited” exchange. Yet it ultimately retreats into American exceptionalism, reinforcing the comforting illusion that our nuclear arsenal exists only in a defensive posture to deter aggression while it is “our enemies” that recklessly endanger both us and the planet. In reality, the most perilous moments of the atomic age were less the product of foreign provocation than of American escalation.
To imagine the United States then as only a victim of a nuclear war is to obscure its role as the principal architect of prospective annihilation. For eight decades, Washington has held humanity hostage to the possibility of instant destruction, insisting that peace depends on the ever-present threat of total devastation, including in a US-initiated first strike. Moving beyond this suicidal logic of deterrence requires an honest reckoning with that history and the will to dismantle it.
What we need now are stories that break the spell of American innocence. The only sane position remains abolition, the dismantling of weapons for which there is no defense and whose risk to the continuity of human life is intolerable. Until that reckoning arrives, Oppenheimer, A House of Dynamite, and any future films that fail to summon the courage to speak the full truth and mobilize resistance will stand as monuments to the mythology of American victimhood, stories about the terror of being attacked told by the most heavily armed nation on earth.
A House of Dynamite is in limited theatrical release and will begin streaming on Netflix on October 24, 2025.
US hegemony, however frayed at the edges, continues to be taken for granted in ruling circles. What do we make of it these days?
[This essay is adapted from “Measuring Violence,” the first chapter of John Dower’s new book, The Violent American Century: War and Terror Since World War Two.]
On February 17, 1941, almost 10 months before Japan’s attack on Pearl Harbor, Life magazine carried a lengthy essay by its publisher, Henry Luce, entitled “The American Century.” The son of Presbyterian missionaries, born in China in 1898 and raised there until the age of 15, Luce essentially transposed the certainty of religious dogma into the certainty of a nationalistic mission couched in the name of internationalism.
Luce acknowledged that the United States could not police the whole world or attempt to impose democratic institutions on all of mankind. Nonetheless, “the world of the 20th century,” he wrote, “if it is to come to life in any nobility of health and vigor, must be to a significant degree an American Century.” The essay called on all Americans “to accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world and in consequence to exert upon the world the full impact of our influence, for such purposes as we see fit and by such measures as we see fit.”
Japan’s attack on Pearl Harbor propelled the United States wholeheartedly onto the international stage Luce believed it was destined to dominate, and the ringing title of his cri de coeur became a staple of patriotic Cold War and post-Cold War rhetoric. Central to this appeal was the affirmation of a virtuous calling. Luce’s essay singled out almost every professed ideal that would become a staple of wartime and Cold War propaganda: freedom, democracy, equality of opportunity, self-reliance and independence, cooperation, justice, charity—all coupled with a vision of economic abundance inspired by “our magnificent industrial products, our technical skills.” In present-day patriotic incantations, this is referred to as “American exceptionalism.”
The other, harder side of America’s manifest destiny was, of course, muscularity. Power. Possessing absolute and never-ending superiority in developing and deploying the world’s most advanced and destructive arsenal of war. Luce did not dwell on this dimension of “internationalism” in his famous essay, but once the world war had been entered and won, he became its fervent apostle—an outspoken advocate of “liberating” China from its new communist rulers, taking over from the beleaguered French colonial military in Vietnam, turning both the Korean and Vietnam conflicts from “limited wars” into opportunities for a wider virtuous war against and in China, and pursuing the rollback of the Iron Curtain with “tactical atomic weapons.” As Luce’s incisive biographer Alan Brinkley documents, at one point Luce even mulled the possibility of “plastering Russia with 500 (or 1,000) A bombs”—a terrifying scenario, but one that the keepers of the US nuclear arsenal actually mapped out in expansive and appalling detail in the 1950s and 1960s, before Luce’s death in 1967.
The “American Century” catchphrase is hyperbole, the slogan never more than a myth, a fantasy, a delusion. Military victory in any traditional sense was largely a chimera after World War II. The so-called Pax Americana itself was riddled with conflict and oppression and egregious betrayals of the professed catechism of American values. At the same time, postwar US hegemony obviously never extended to more than a portion of the globe. Much that took place in the world, including disorder and mayhem, was beyond America’s control.
Yet, not unreasonably, Luce’s catchphrase persists. The 21st-century world may be chaotic, with violence erupting from innumerable sources and causes, but the United States does remain the planet’s “sole superpower.” The myth of exceptionalism still holds most Americans in its thrall. US hegemony, however frayed at the edges, continues to be taken for granted in ruling circles, and not only in Washington. And Pentagon planners still emphatically define their mission as “full-spectrum dominance” globally.
Washington’s commitment to modernizing its nuclear arsenal rather than focusing on achieving the thoroughgoing abolition of nuclear weapons has proven unshakable. So has the country’s almost religious devotion to leading the way in developing and deploying ever more “smart” and sophisticated conventional weapons of mass destruction.
Welcome to Henry Luce’s—and America’s—violent century, even if thus far it’s lasted only 75 years. The question is just what to make of it these days.
We live in times of bewildering violence. In 2013, the chairman of the Joint Chiefs of Staff told a Senate committee that the world is “more dangerous than it has ever been.” Statisticians, however, tell a different story: that war and lethal conflict have declined steadily, significantly, even precipitously since World War II.
Much mainstream scholarship now endorses the declinists. In his influential 2011 book, The Better Angels of Our Nature: Why Violence Has Declined, Harvard psychologist Steven Pinker adopted the labels “the Long Peace” for the four-plus decades of the Cold War (1945-1991), and “the New Peace” for the post-Cold War years to the present. In that book, as well as in post-publication articles, postings, and interviews, he has taken the doomsayers to task. The statistics suggest, he declares, that “today we may be living in the most peaceable era in our species’s existence.”
Clearly, the number and deadliness of global conflicts have indeed declined since World War II. This so-called postwar peace was, and still is, however, saturated in blood and wracked with suffering.
It is reasonable to argue that total war-related fatalities during the Cold War decades were lower than in the six years of World War II (1939-1945) and certainly far less than the toll for the 20th century’s two world wars combined. It is also undeniable that overall death tolls have declined further since then. The five most devastating intrastate or interstate conflicts of the postwar decades—in China, Korea, Vietnam, Afghanistan, and between Iran and Iraq—took place during the Cold War. So did a majority of the most deadly politicides, or political mass killings, and genocides: in the Soviet Union, China (again), Yugoslavia, North Korea, North Vietnam, Sudan, Nigeria, Indonesia, Pakistan-Bangladesh, Ethiopia, Angola, Mozambique, and Cambodia, among other countries. The end of the Cold War certainly did not signal the end of such atrocities (as witness Rwanda, the Congo, and the implosion of Syria). As with major wars, however, the trajectory has been downward.
Unsurprisingly, the declinist argument celebrates the Cold War as less violent than the global conflicts that preceded it, and the decades that followed as statistically less violent than the Cold War. But what motivates the sanitizing of these years, now amounting to three-quarters of a century, with the label “peace”? The answer lies largely in a fixation on major powers. The great Cold War antagonists, the United States and the Soviet Union, bristling with their nuclear arsenals, never came to blows. Indeed, wars between major powers or developed states have become (in Pinker’s words) “all but obsolete.” There has been no World War III, nor is there likely to be.
Such upbeat quantification invites complacent forms of self-congratulation. (How comparatively virtuous we mortals have become!) In the United States, where we-won-the-Cold-War sentiment still runs strong, the relative decline in global violence after 1945 is commonly attributed to the wisdom, virtue, and firepower of US “peacekeeping.” In hawkish circles, nuclear deterrence—the Cold War’s MAD (mutually assured destruction) doctrine that was described early on as a “delicate balance of terror”—is still canonized as an enlightened policy that prevented catastrophic global conflict.
Branding the long postwar era as an epoch of relative peace is disingenuous, and not just because it deflects attention from the significant death and agony that actually did occur and still does. It also obscures the degree to which the United States bears responsibility for contributing to, rather than impeding, militarization and mayhem after 1945. Ceaseless US-led transformations of the instruments of mass destruction—and the provocative global impact of this technological obsession—are by and large ignored.
Continuities in American-style “warfighting” (a popular Pentagon word) such as heavy reliance on airpower and other forms of brute force are downplayed. So is US support for repressive foreign regimes, as well as the destabilizing impact of many of the nation’s overt and covert overseas interventions. The more subtle and insidious dimension of postwar US militarization—namely, the violence done to civil society by funneling resources into a gargantuan, intrusive, and ever-expanding national security state—goes largely unaddressed in arguments fixated on numerical declines in violence since World War II.
Beyond this, trying to quantify war, conflict, and devastation poses daunting methodological challenges. Data advanced in support of the decline-of-violence argument is dense and often compelling, and derives from a range of respectable sources. Still, it must be kept in mind that the precise quantification of death and violence is almost always impossible. When a source offers fairly exact estimates of something like “war-related excess deaths,” you usually are dealing with investigators deficient in humility and imagination.
Take, for example, World War II, about which countless tens of thousands of studies have been written. Estimates of total “war-related” deaths from that global conflict range from roughly 50 million to more than 80 million. One explanation for such variation is the sheer chaos of armed violence. Another is what the counters choose to count and how they count it. Battle deaths of uniformed combatants are easiest to determine, especially on the winning side. Military bureaucrats can be relied upon to keep careful records of their own killed-in-action—but not, of course, of the enemy they kill. War-related civilian fatalities are even more difficult to assess, although—as in World War II—they commonly are far greater than deaths in combat.
Does the data source go beyond so-called battle-related collateral damage to include deaths caused by war-related famine and disease? Does it take into account deaths that may have occurred long after the conflict itself was over (as from radiation poisoning after Hiroshima and Nagasaki, or from the US use of Agent Orange in the Vietnam War)? The difficulty of assessing the toll of civil, tribal, ethnic, and religious conflicts with any exactitude is obvious.
Concentrating on fatalities and their averred downward trajectory also draws attention away from broader humanitarian catastrophes. In mid-2015, for instance, the Office of the United Nations High Commissioner for Refugees reported that the number of individuals “forcibly displaced worldwide as a result of persecution, conflict, generalized violence, or human rights violations” had surpassed 60 million and was the highest level recorded since World War II and its immediate aftermath. Roughly two-thirds of these men, women, and children were displaced inside their own countries. The remainder were refugees, and over half of these refugees were children.
Here, then, is a trend line intimately connected to global violence that is not heading downward. In 1996, the UN’s estimate was that there were 37.3 million forcibly displaced individuals on the planet. Twenty years later, as 2015 ended, this had risen to 65.3 million—a 75% increase over the last two post-Cold War decades that the declinist literature refers to as the “new peace.”
Other disasters inflicted on civilians are less visible than uprooted populations. Harsh conflict-related economic sanctions, which often cripple hygiene and healthcare systems and may precipitate a sharp spike in infant mortality, usually do not find a place in itemizations of military violence. US-led UN sanctions imposed against Iraq for 13 years beginning in 1990 in conjunction with the first Gulf War are a stark example of this. An account published in the New York Times Magazine in July 2003 accepted the fact that “at least several hundred thousand children who could reasonably have been expected to live died before their fifth birthday.” And after all-out wars, who counts the maimed, or the orphans and widows, or those the Japanese in the wake of World War II referred to as the “elderly orphaned”—parents bereft of their children?
Figures and tables, moreover, can only hint at the psychological and social violence suffered by combatants and noncombatants alike. It has been suggested, for instance, that 1 in 6 people in areas afflicted by war may suffer from mental disorder (as opposed to 1 in 10 in normal times). Even where American military personnel are concerned, trauma did not become a serious focus of concern until 1980, seven years after the US retreat from Vietnam, when post-traumatic stress disorder (PTSD) was officially recognized as a mental-health issue.
In 2008, a massive sampling study of 1.64 million US troops deployed to Afghanistan and Iraq between October 2001 and October 2007 estimated “that approximately 300,000 individuals currently suffer from PTSD or major depression and that 320,000 individuals experienced a probable TBI [traumatic brain injury] during deployment.” As these wars dragged on, the numbers naturally increased. To extend the ramifications of such data to wider circles of family and community—or, indeed, to populations traumatized by violence worldwide—defies statistical enumeration.
Largely unmeasurable, too, is violence in a different register: the damage that war, conflict, militarization, and plain existential fear inflict upon civil society and democratic practice. This is true everywhere but has been especially conspicuous in the United States since Washington launched its “global war on terror” in response to the attacks of September 11, 2001.
Here, numbers are perversely provocative, for the lives claimed in 21st-century terrorist incidents can be interpreted as confirming the decline-in-violence argument. From 2000 through 2014, according to the widely cited Global Terrorism Index, “more than 61,000 incidents of terrorism claiming over 140,000 lives have been recorded.” Including September 11th, countries in the West experienced less than 5% of these incidents and 3% of the deaths. The Chicago Project on Security and Terrorism, another minutely documented tabulation based on combing global media reports in many languages, puts the number of suicide bombings from 2000 through 2015 at 4,787 attacks in more than 40 countries, resulting in 47,274 deaths.
These atrocities are incontestably horrendous and alarming. Grim as they are, however, the numbers themselves are comparatively low when set against earlier conflicts. For specialists in World War II, the “140,000 lives” estimate carries an almost eerie resonance, since this is the rough figure usually accepted for the death toll from a single act of terror bombing, the atomic bomb dropped on Hiroshima. The tally is also low compared to contemporary deaths from other causes. Globally, for example, more than 400,000 people are murdered annually. In the United States, the danger of being killed by falling objects or lightning is at least as great as the threat from Islamist militants.
This leaves us with a perplexing question: If the overall incidence of violence, including 21st-century terrorism, is relatively low compared to earlier global threats and conflicts, why has the United States responded by becoming an increasingly militarized, secretive, unaccountable, and intrusive “national security state”? Is it really possible that a patchwork of non-state adversaries that do not possess massive firepower or follow traditional rules of engagement has, as the chairman of the Joint Chiefs of Staff declared in 2013, made the world more threatening than ever?
For those who do not believe this to be the case, possible explanations for the accelerating militarization of the United States come from many directions. Paranoia may be part of the American DNA—or, indeed, hardwired into the human species. Or perhaps the anticommunist hysteria of the Cold War simply metastasized into a post-9/11 pathological fear of terrorism. Machiavellian fear-mongering certainly enters the picture, led by conservative and neoconservative civilian and military officials of the national security state, along with opportunistic politicians and war profiteers of the usual sort. Cultural critics predictably point an accusing finger as well at the mass media’s addiction to sensationalism and catastrophe, now intensified by the proliferation of digital social media.
To all this must be added the peculiar psychological burden of being a “superpower” and, from the 1990s on, the planet’s “sole superpower”—a situation in which “credibility” is measured mainly in terms of massive cutting-edge military might. It might be argued that this mindset helped “contain Communism” during the Cold War and provides a sense of security to US allies. What it has not done is ensure victory in actual war, although not for want of trying. With some exceptions (Grenada, Panama, the brief 1991 Gulf War, and the Balkans), the US military has not tasted victory since World War II—Korea, Vietnam, and recent and current conflicts in the Greater Middle East being boldface examples of this failure. This, however, has had no impact on the hubris attached to superpower status. Brute force remains the ultimate measure of credibility.
The traditional American way of war has tended to emphasize the “three Ds” (defeat, destroy, devastate). Since 1996, the Pentagon’s proclaimed mission has been to maintain “full-spectrum dominance” in every domain (land, sea, air, space, and information) and, in practice, in every accessible part of the world. The Air Force Global Strike Command, activated in 2009 and responsible for managing two-thirds of the US nuclear arsenal, typically publicizes its readiness for “Global Strike… Any Target, Any Time.”
In 2015, the Department of Defense acknowledged maintaining 4,855 physical “sites”—meaning bases ranging in size from huge contained communities to tiny installations—of which 587 were located overseas in 42 foreign countries. An unofficial investigation that includes small and sometimes impermanent facilities puts the number at around 800 in 80 countries. Over the course of 2015, to cite yet another example of the overwhelming nature of America’s global presence, elite US special operations forces were deployed to around 150 countries, and Washington provided assistance in arming and training security forces in an even larger number of nations.
America’s overseas bases reflect, in part, an enduring inheritance from World War II and the Korean War. The majority of these sites are located in Germany (181), Japan (122), and South Korea (83) and were retained after their original mission of containing communism disappeared with the end of the Cold War. Deployment of elite special operations forces is also a Cold War legacy (exemplified most famously by the Army’s “Green Berets” in Vietnam) that expanded after the demise of the Soviet Union. Dispatching covert missions to three-quarters of the world’s nations, however, is largely a product of the war on terror.
Many of these present-day undertakings require maintaining overseas “lily pad” facilities that are small, temporary, and unpublicized. And many, moreover, are integrated with covert CIA “black operations.” Combating terror involves practicing terror—including, since 2002, an expanding campaign of targeted assassinations by unmanned drones. For the moment, this latest mode of killing remains dominated by the CIA and the US military (with the United Kingdom and Israel following some distance behind).
The “delicate balance of terror” that characterized nuclear strategy during the Cold War has not disappeared. Rather, it has been reconfigured. The US and Soviet arsenals that reached a peak of insanity in the 1980s have been reduced by about two-thirds—a praiseworthy accomplishment but one that still leaves the world with around 15,400 nuclear weapons as of January 2016, 93% of them in US and Russian hands. Close to 2,000 of the latter on each side are still actively deployed on missiles or at bases with operational forces.
This downsizing, in other words, has not removed the wherewithal to destroy the Earth as we know it many times over. Such destruction could come about indirectly as well as directly, with even a relatively “modest” nuclear exchange between, say, India and Pakistan triggering a cataclysmic climate shift—a “nuclear winter”—that could result in massive global starvation and death. Nor does the fact that seven additional nations now possess nuclear weapons (and more than 40 others are deemed “nuclear weapons capable”) mean that “deterrence” has been enhanced. The future use of nuclear weapons, whether by deliberate decision or by accident, remains an ominous possibility. That threat is intensified by the possibility that nonstate terrorists may somehow obtain and use nuclear devices.
What is striking at this moment in history is that paranoia couched as strategic realism continues to guide US nuclear policy and, following America’s lead, that of the other nuclear powers. As announced by the Obama administration in 2014, the potential for nuclear violence is to be “modernized.” In concrete terms, this translates as a 30-year project that will cost the United States an estimated $1 trillion (not including the usual future cost overruns for producing such weapons), perfect a new arsenal of “smart” and smaller nuclear weapons, and extensively refurbish the existing delivery “triad” of long-range manned bombers, nuclear-armed submarines, and land-based intercontinental ballistic missiles carrying nuclear warheads.
Creating a capacity for violence greater than the world has ever seen is costly—and remunerative.
Nuclear modernization, of course, is but a small portion of the full spectrum of American might—a military machine so massive that it inspired President Barack Obama to speak with unusual emphasis in his State of the Union address in January 2016. “The United States of America is the most powerful nation on Earth,” he declared. “Period. Period. It’s not even close. It’s not even close. It’s not even close. We spend more on our military than the next eight nations combined.”
Official budgetary expenditures and projections provide a snapshot of this enormous military machine, but here again numbers can be misleading. Thus, the “base budget” for defense announced in early 2016 for fiscal year 2017 amounts to roughly $600 billion, but this falls far short of what the actual outlay will be. When all other discretionary military- and defense-related costs are taken into account—nuclear maintenance and modernization, the “war budget” that pays for so-called overseas contingency operations like military engagements in the Greater Middle East, “black budgets” that fund intelligence operations by agencies including the CIA and the National Security Agency, appropriations for secret high-tech military activities, “veterans affairs” costs (including disability payments), military aid to other countries, huge interest costs on the military-related part of the national debt, and so on—the actual total annual expenditure is close to $1 trillion.
Such stratospheric numbers defy easy comprehension, but one does not need training in statistics to bring them closer to home. Simple arithmetic suffices. The projected bill for just the 30-year nuclear modernization agenda comes to over $90 million a day, or almost $4 million an hour. The $1 trillion price tag for maintaining the nation’s status as “the most powerful nation on Earth” for a single year amounts to roughly $2.74 billion a day, over $114 million an hour.
So an era of a “new peace”? Think again. We’re only three-quarters of the way through America’s violent century and there’s more to come.
The renaming of the Defense Department should have surprised no one. US President Donald Trump is an incipient fascist doing what such figures do. Surrounded by a coterie of illiberal ideologues and careerist sycophants, he and his top aides have dispensed with pretense and precedent, moving at breakneck speed to demolish what remains of the battered façade of American democracy.
In eight months, his second administration has unleashed a shock-and-awe assault on norms and institutions, civil liberties, human rights, and history itself. But fascism never respects borders. Fascists don’t recognize the rule of law. They consider themselves the law. Expansion and the glorification of war are their lifeblood. Italian fascist leader Benito Mussolini put it all too bluntly: The fascist “believes neither in the possibility nor the utility of perpetual peace… war alone brings up to its highest tension all human energy and puts the stamp of nobility upon the peoples who have courage to meet it.”
Pete Hegseth is now equally blunt. From the Pentagon, he’s boasting of restoring a “warrior ethos” to the armed forces, while forging an offensive military that prizes “maximum lethality, not tepid legality. Violent effect, not politically correct.” The message couldn’t be clearer: When the US loses wars, as it has done consistently despite commanding the most powerful military in history, it’s not due to imperial overreach, political arrogance, or popular resistance. Rather, defeat stems from that military having gone “woke,” a euphemism for failing to kill enough people.
The recent rechristening of the Department of Defense as the Department of War was certainly a culture-war stunt like Trump’s demand that the Gulf of Mexico be renamed the Gulf of America. But it also signaled something more insidious: a blunt escalation of the criminal logic that has long underwritten US militarism. That logic sustained both the Cold War of the last century and the War on Terror of this one, destroying millions of lives.
When Hegseth defended the recent summary executions of 11 alleged Venezuelan drug smugglers on a boat in the Caribbean, he boasted that Washington possesses “absolute and complete authority” to kill anywhere without congressional approval or evidence of wrongdoing, in open defiance of international law. The next day, responding on X to a user who called what had been done a war crime, Vice President JD Vance wrote, “I don’t give a shit what you call it.” It was the starkest admission since the Iraq War that Washington no longer pretends to operate internationally under the rule of law but under the rule of force, where might quite simply makes right.
While such an escalation of verbiage—the brazen confession of an imperial power that believes itself immune from accountability—should alarm us, it’s neither unprecedented nor unexpected. Peace, after all, has never been the profession of the US military. The Department of Defense has always been the Department of War.
The US has long denied being an empire. From its founding, imperialism was cast as the antithesis of American values. This nation, after all, was born in revolt against the tyranny of foreign rule. Yet for a country so insistent on not being an empire, Washington has followed a trajectory nearly indistinguishable from its imperial predecessors. Its history was defined by settler conquest, the violent elimination of Indigenous peoples, and a long record of covert and overt interventions to topple governments unwilling to yield to American political or economic domination.
The record is unmistakable. As Noam Chomsky once put it, “Talking about American imperialism is like talking about triangular triangles.” And he was hardly the first to suggest such a thing. In the 1930s, General Smedley Butler, reflecting with searing candor on his years of military service in Latin America, described himself as “a racketeer, a gangster for capitalism… I helped make Mexico, especially Tampico, safe for American oil interests… I helped make Haiti and Cuba a decent place for the National City Bank boys to collect revenues in. I helped in the raping of half a dozen Central American republics for the benefit of Wall Street.”
Historically, imperialism and fascism went hand in hand. As Aimé Césaire argued in his 1950 Discourse on Colonialism, fascism is imperialism turned inward. The violence inherent in colonial domination can, in the end, never be confined to the colonies, which means that what we’re now witnessing in the Trumpian era is a reckoning. The chickens are indeed coming home to roost or, as Noura Erakat recently observed, “The boomerang comes back.”
In their insatiable projection of power and pursuit of profit, Washington and Wall Street ignored what European empires had long revealed: that colonization “works to decivilize the colonizer, to brutalize him… to degrade him.” English novelist Joseph Conrad recognized this in his classic 19th-century work of fiction, Heart of Darkness, concluding that it wasn’t the Congo River but the Thames River in Great Britain that “led into the heart of an immense darkness.”
Imperialism incubates fascism, a dynamic evident in the carnage of World War I, rooted, as W.E.B. Du Bois observed at the time, in colonial competition that laid the foundations for World War II. In that conflict, Césaire argued, the Nazis applied to Europe the methods and attitudes that until then had been reserved for colonized peoples, unleashing them on Europeans with similarly genocidal effect.
In the postwar years, the United States emerged from the ruins of Europe as the unrivaled global hegemon. With some 6% of the world’s population, it commanded nearly half of the global gross domestic product. Anchored by up to 2,000 military bases across the globe (still at 800 today), it became the new imperial power on which the sun never set. Yet Washington ignored the fundamental lesson inherent in Europe’s self-cannibalization. Rather than dismantle the machinery of empire, it embraced renewed militarism. Rather than demobilize, it placed itself on a permanent global war footing, both anticipating and accelerating the Cold War with that other great power of the period, the Soviet Union.
The United States was, however, a superpower defined as much by paranoia and insecurity as by military and economic strength. It was in such a climate that American officials moved to abandon the title of the Department of War in 1947, rebranding it as the Department of Defense two years later. The renaming sought to reassure the world that, despite every sign the US had assumed the mantle of European colonialism, its intentions were benign and defensive in nature.
That rhetorical shift would prove inseparable from a broader ideological transformation as the Cold War froze geopolitics into rigid Manichean camps. President Harry Truman’s March 1947 address to Congress marked the start of a new global confrontation. In that speech, the president proclaimed the United States the guardian of freedom and democracy everywhere. Leftist movements were cast as Soviet proxies and struggles for national liberation in the former colonial world were framed not in the language of decolonization and self-determination but as nefarious threats to American interests and international peace and security.
In Europe at the time, a civil war raged in Greece, while decisive elections loomed in Italy. Determined not to “lose” such countries to communism, Washington moved to undermine democracy under the guise of saving it. In Greece, it would channel $300 million to right-wing forces, many of them staffed by former fascists and Nazi collaborators, in the name of defending freedom. In Western Europe, Washington used its position as the world’s banker to manipulate electoral outcomes. In the wake of the 1947 National Security Act that created the Central Intelligence Agency, or CIA (the same bill that renamed the War Department), the agency launched its first large-scale covert operation. In 1948, the US would funnel millions of dollars into Italy and unleash a torrent of propaganda to ensure that leftist parties would not prevail.
Across the Third World, the CIA perfected that template for covert interventions aimed at toppling democratic governments and installing pliant authoritarians. The overthrow of Iran’s Mohammad Mossadegh in 1953 and Guatemala’s Jacobo Árbenz in 1954 marked the beginning of a series of regime-change operations. More assassinations and coups followed, including the assassination of Patrice Lumumba in the Congo in 1961, the ouster of Sukarno in Indonesia in 1965, and the overthrow of Salvador Allende in Chile in 1973. The utter contempt for democracy inherent in such actions was embodied in National Security Advisor Henry Kissinger’s remark: “I don’t see why we need to stand by and watch a country go communist due to the irresponsibility of its own people.”
In the aftermath of each intervention, Washington installed anticommunist dictators who had one thing in common: They murdered their own citizens, and often those of other countries as well, dismantled democratic institutions, and siphoned national wealth into personal fortunes and the coffers of multinational corporations.
By the 1980s, the CIA was bankrolling proxy wars spanning the globe. Billions of dollars were being funneled to the Afghan mujahideen and Nicaraguan Contras. In both Afghanistan and Nicaragua, those US-backed “freedom fighters” (or, as President Ronald Reagan termed the Contras, the “moral equals of our founding fathers”) deployed tactics that amounted to scaled-up terrorism. The mask occasionally slipped. As historian Greg Grandin has noted, one adviser to the Joint Chiefs of Staff described the Contras as “the strangest national liberation organization in the world.” In truth, he conceded, they were “just a bunch of killers.”
As with the CIA, the not-so-aptly-renamed “Defense Department” would oversee a succession of catastrophic wars that did nothing to make Americans safer and had little to do with the protection of democratic values. Within a year of its renaming, the US was at war in Korea. When the North invaded the South in 1950, seeking to reunify a peninsula divided by foreign powers, Washington rushed to intervene, branding it a “police action,” the first of many Orwellian linguistic maneuvers to sidestep the constitutional authority of Congress to declare war.
The official narrative that the communists launched the war to topple a democratically elected government in the South obscured its deeper origins. After World War II, Washington installed Syngman Rhee, an exile who had spent decades in the United States, as South Korea’s leader. He commanded little popular legitimacy but proved a staunch ally for American officials determined to secure an anticommunist foothold on the peninsula. Far from embodying liberal democracy, his regime presided over a repressive police state.
In 1948, two years before the war, an uprising against Rhee’s corrupt rule broke out on Jeju Island. With Washington’s blessing, his security forces launched a brutal counterinsurgency that left as many as 80,000 dead. Far from an aberration, Jeju epitomized Washington’s emerging Cold War policy: not the cultivation of democracies responsive to their citizenry (with the uncertainty that entailed), but the defense of authoritarian regimes as reliable bulwarks against communism.
The Korean War also marked a growing reliance on air power. Carpet bombing and the widespread use of napalm would reduce the North to rubble, destroying some 85% of its infrastructure and killing 2 million civilians. As future Secretary of State Dean Rusk would later admit, the US bombed “everything that moved in North Korea.” The only “restraint” exercised was the decision not to deploy atomic bombs, despite the insistence of Air Force General Curtis LeMay, who would reflect unapologetically, “Over a period of three years or so, we killed off… 20% of the population.”
A remarkably similar pattern unfolded in Vietnam. As revealed in the Pentagon Papers, the United States initially backed France in its attempt after World War II to reimpose colonial rule over Indochina. After the French forces were defeated in 1954, the partition of the country ensued. Elections to reunify Vietnam were scheduled for 1956, but US intelligence concluded that the North’s communist leader, Ho Chi Minh, would win in a landslide, so the elections were cancelled. Once again, Washington placed its support behind the unpopular, repressive South Vietnamese regime of Ngo Dinh Diem, chosen not for his legitimacy but for his reliability in the eyes of American policymakers.
The result was a futile slaughter. The US would kill well over 3 million people in Southeast Asia and drop more than three and a half times as many tons of bombs on Vietnam, Cambodia, and Laos as were used in all of World War II. That orgy of violence would lead Martin Luther King Jr., in 1967, to denounce the United States as “the greatest purveyor of violence in the world today.” The same has held true for nearly the entire span of the past 80 years.
The human toll of the Cold War exceeded 20 million lives. As historian Paul Chamberlin calculated, that amounted to some 1,200 deaths every day for 45 years. To call such an era “cold” was not only misleading but obscene. It was, in truth, a period of relentless and bloody global conflict, much of it instigated, enabled, or prolonged by the United States. And its wars also produced the blowback that would later be rebranded as the “War on Terror.”
The names of America’s adversaries may have changed over the years from Hitler to Stalin, Kim Il-Sung to Ho Chi Minh, Saddam Hussein to Xi Jinping, but the principle has remained constant. Washington reserves for itself the unilateral right to intervene, violently and antidemocratically, in the affairs of other nations to secure what it considers its interests. The reversion of the Defense Department to the War Department should be seen less as a rupture than a revelation. It strips away a euphemism to make far plainer what has long been the reality of our world.
We now face a choice. As historian Christian Appy has reminded us, “The institutions that sustain empire destroy democracy.” That truth is unfolding before our eyes. As the Pentagon budget tops $1 trillion and the machinery of war only expands in Donald Trump’s America, the country also seems to be turning further inward. Only recently, President Trump threatened to use Chicago to demonstrate “why it is called the Department of War.” Meanwhile, US Immigration and Customs Enforcement, or ICE, is set to become among the best-funded domestic “military” forces on the planet and potentially the private paramilitary of an aspiring autocrat.
If there is any hope of salvaging this country’s (not to speak of this planet’s) future, then this history has to be faced, and we must recover—or perhaps discover—our moral bearings. That will require not prolonging the death throes of American hegemony, but dismantling imperial America before it collapses on itself and takes us all with it.