

Nuclear films have been canaries in the uranium mine; each resurgence has coincided with waves of nuclear escalation. What does it mean that top directors are tackling the subject again?
As the nuclear threat once again dominates the headlines, the nuclear blockbuster has returned to screens. Following the success of Christopher Nolan’s Oppenheimer, Kathryn Bigelow’s A House of Dynamite reminds us that the atomic bomb is not mere history. Equal parts political thriller and apocalyptic horror, the film compels viewers to imagine the unimaginable, refusing any illusion of security or assurance that catastrophe could not happen here.
There is no catharsis, no safe distance from which to retreat. What we confront on screen is not fiction but the collective madness of our current reality. The dread that Bigelow conjures does not dissipate with the closing credits; it follows us out of the theater, past the exit signs, and into a world where the possibility of instant annihilation remains stitched into everyday life.
Such films are hardly new, though their resurgence should give us pause. Since 1945, Hollywood has capitalized on the mix of fear and fascination unleashed by the atomic age. In response, studios have produced roughly 1,000 nuclear-themed films, a cinematic proliferation mirroring the buildup of nuclear arsenals (70,000 warheads by 1986). As scholar Jerome Shapiro observes, these works became “a statistically important part of the American filmgoer’s diet” for decades.
Yet atomic cinema has always been more than entertainment. It has served as both warning and witness, shaping a collective consciousness of the bomb. Within the secretive and anti-democratic architecture of the nuclear security state, these films often served as cultural critique and political resistance, piercing the veil of official classification, challenging the monopoly of defense experts, and democratizing a debate otherwise foreclosed to the public.
Nuclear films have been canaries in the uranium mine. Each resurgence has coincided with waves of nuclear escalation. But they have also served as calls to action, catalyzing mass movements demanding disarmament. To understand what this revival signals, and what more is needed to reignite the anti-nuclear movement, it is worth revisiting the earlier cycles of Cold War filmmaking that were shaped by, and in turn informed, nuclear and popular culture.
The first major wave of atomic cinema emerged in the late 1950s and early 1960s. This was a time in which the United States had lost its temporary monopoly on nuclear force and both Washington and Moscow were producing, testing, and stockpiling weapons a thousand times more destructive than the now nearly obsolete fission bombs that had obliterated Hiroshima and Nagasaki. In less than a decade, the atomic bomb had evolved from a “city-killer” to a “nation-killer.”
The earliest films of this era confronted a growing public anxiety over radiation. They represented a response to official efforts to downplay the danger, as politicians prioritized the management of political fallout over the prevention of radioactive fallout. From the very beginning, there emerged attempts to trivialize radiation, none more infamous than General Leslie Groves’ 1945 remark that radiation poisoning was a “very pleasant way to die.”
By 1954, the Lucky Dragon incident had made the deadly consequences of nuclear contamination impossible to ignore. Science fiction movies such as Them! and Godzilla, both released that same year, translated these fears into monstrous allegories. Yet even as such films dramatized the terror of nuclear technology, official discourse worked to normalize it.
Strategic war planners like Herman Kahn embodied the technocratic detachment of the emerging nuclear priesthood. By 1960, Kahn was publicly arguing that nuclear war was winnable and that even scenarios resulting in tens of millions of deaths would not ultimately preclude “normal and happy lives for the majority of survivors and their descendants.”
He also dismissed concerns about radiation, insisting that the numbers of children born “seriously defective” due to such exposure would rise by “only” 10%. Noting that there are still birth defects in peacetime, he concluded, “War is a terrible thing; but so is peace.” Such statements shocked the public, revealing the moral vacancy of those entrusted with preserving life and preventing death in the atomic age, fueling a growing fear that ordinary people might be sacrificed on the altar of Cold War credibility.
After the October 1962 Cuban Missile Crisis, when the United States came to the brink of becoming ground zero for nuclear annihilation, many Americans fully awoke to the insanity. The public backlash that followed the near catastrophe, led by SANE and Women’s Strike for Peace, helped push President John F. Kennedy to sign the Limited Test Ban Treaty of 1963, a rare moment when popular and political pressure combined to produce tangible reform.
In response also came Stanley Kubrick’s Dr. Strangelove (1964), one of the most indelible films of the Cold War. Drawing inspiration from figures such as Kahn, it confronted the suicidal nihilism of the defense intellectuals. Kubrick’s dark satire exposed the Nazi-like madness underwriting “rational” deterrence, ridiculing mutual destruction while indicting the US for deepening the peril through its own policies. These ranged from stationing nuclear weapons in Turkey to attempting to overthrow Castro, actions that helped manufacture a crisis that threatened to extinguish the lives of as many as 200 million North Americans and even more Soviet citizens, to say nothing of the many millions more who would have been written off as what is perversely termed “collateral damage.”
Two decades later, a new wave of films emerged amid another period of nuclear escalation. Their arrival in 1979 marked what many remember as the spark that reignited the anti-nuclear movement after more than a decade of dormancy. That year saw the rise of presidential candidate Ronald Reagan, whose rhetoric revived the language of nuclear confrontation, alongside two disasters that reawakened fears of radiation: the partial meltdown at Three Mile Island and the uranium mill spill at Church Rock, New Mexico. Together, these events rekindled public anxiety about the existential dangers of nuclear weapons and deepened fears surrounding nuclear power.
Reagan’s campaign and subsequent presidency again advanced the chilling notion that nuclear war might be winnable, even at the cost of millions of lives. This sentiment persisted despite scientists warning of “nuclear winter,” stressing that a nuclear exchange could devastate the atmosphere and result in “omnicide,” the death of all life on Earth. The mix of apocalyptic scientific doomsaying and bellicose political posturing sent fear soaring. By the early 1980s, polls showed that nearly half of Americans believed they might die in a nuclear war.
Released just 12 days before the disaster at Three Mile Island, The China Syndrome (1979) captured this mounting dread with eerie prescience. What began as a fictional thriller about a near meltdown quickly became a public relations catastrophe for nuclear power. The film’s portrayal of institutional corruption and bureaucratic negligence, alongside the industry’s efforts to dismiss it as propaganda, undermined official narratives about nuclear safety. In a post-Vietnam, post-Watergate America defined by cynicism and mistrust, the movie crystallized public anxieties about nuclear power and the broader dangers of corporate and governmental deceit.
Under Reagan, the popular energy unleashed by this moment coalesced into a mass movement. By 1982, anti-nuclear activism had reached its apex. On June 12, some 1 million demonstrators filled the streets of New York City for what remains the largest single protest rally in American history. Their message was unambiguous: The nuclear status quo was intolerable.
Their reach extended far beyond the streets. The Nuclear Freeze campaign mobilized communities across the country, while the 1983 ABC television film The Day After brought the horror of nuclear annihilation directly into American living rooms. More than 100 million people, including the president himself, watched as the bucolic Midwestern town of Lawrence, Kansas, emblematic of the American heartland, was reduced to a radioactive wasteland. The film remains one of the most searing depictions of nuclear war ever produced. Yet, as historian Paul Boyer observed, even the most devastating portrayals inevitably fall short, since the only truly accurate nuclear war film, he wrote, “would be two hours of a blank screen.”
But public pressure grew impossible to ignore. In a remarkable reversal, Reagan declared that “a nuclear war cannot be won and must never be fought,” beginning direct talks with Soviet leader Mikhail Gorbachev to pursue arms reductions. This marked at least the second time that protests rendered potential nuclear weapons use not only morally unimaginable but also politically untenable (the other being the 1969 Vietnam Moratorium protests, which helped dissuade President Richard Nixon from carrying out a contemplated nuclear strike against North Vietnam).
The two major nuclear films of the past two years are significant cultural events. They have revived an apocalyptic imagination and a sense of nuclear consciousness that are essential if we are ever to confront the nuclear nightmare and end the arms race before it ends us. Yet they also fall short in critical ways, and risk being remembered as great films that stirred awareness but failed to inspire the resistance necessary to meet this perilous moment.
Oppenheimer, despite its cinematic brilliance, was a missed opportunity to reckon with Hiroshima and Nagasaki. Rather than compelling audiences to confront historical responsibility, it offered a familiar narrative of tragic necessity. The bomb had to be built; the bombings, though regrettable, were justified. The moral center of the story was not the victims in Japan but Oppenheimer himself, the tormented “American Prometheus.” The result was not reckoning but retreat into myth.
A House of Dynamite follows a similar trajectory. The film is powerful and unsettling, reminding viewers that any city, and the world, could be reduced to ashes within minutes. It captures the immediacy of the danger and the near impossibility of containing a “limited” exchange. Yet it ultimately retreats into American exceptionalism, reinforcing the comforting illusion that our nuclear arsenal exists only in a defensive posture to deter aggression while it is “our enemies” that recklessly endanger both us and the planet. In reality, the most perilous moments of the atomic age were less the product of foreign provocation than of American escalation.
To imagine the United States, then, as only a victim of nuclear war is to obscure its role as the principal architect of prospective annihilation. For eight decades, Washington has held humanity hostage to the possibility of instant destruction, insisting that peace depends on the ever-present threat of total devastation, including in a US-initiated first strike. Moving beyond this suicidal logic of deterrence requires an honest reckoning with that history and the will to dismantle it.
What we need now are stories that break the spell of American innocence. The only sane position remains abolition, the dismantling of weapons for which there is no defense and whose risk to the continuity of human life is intolerable. Until that reckoning arrives, Oppenheimer and A House of Dynamite, along with any future films that fail to summon the courage to speak the full truth and mobilize resistance, will stand as monuments to the mythology of American victimhood: stories about the terror of being attacked told by the most heavily armed nation on earth.
A House of Dynamite is in limited theatrical release and will begin streaming on Netflix on October 24, 2025.
US hegemony, however frayed at the edges, continues to be taken for granted in ruling circles. What do we make of it these days?
[This essay is adapted from “Measuring Violence,” the first chapter of John Dower’s new book, The Violent American Century: War and Terror Since World War Two.]
On February 17, 1941, almost 10 months before Japan’s attack on Pearl Harbor, Life magazine carried a lengthy essay by its publisher, Henry Luce, entitled “The American Century.” The son of Presbyterian missionaries, born in China in 1898 and raised there until the age of 15, Luce essentially transposed the certainty of religious dogma into the certainty of a nationalistic mission couched in the name of internationalism.
Luce acknowledged that the United States could not police the whole world or attempt to impose democratic institutions on all of mankind. Nonetheless, “the world of the 20th century,” he wrote, “if it is to come to life in any nobility of health and vigor, must be to a significant degree an American Century.” The essay called on all Americans “to accept wholeheartedly our duty and our opportunity as the most powerful and vital nation in the world and in consequence to exert upon the world the full impact of our influence, for such purposes as we see fit and by such measures as we see fit.”
Japan’s attack on Pearl Harbor propelled the United States wholeheartedly onto the international stage Luce believed it was destined to dominate, and the ringing title of his cri de coeur became a staple of patriotic Cold War and post-Cold War rhetoric. Central to this appeal was the affirmation of a virtuous calling. Luce’s essay singled out almost every professed ideal that would become a staple of wartime and Cold War propaganda: freedom, democracy, equality of opportunity, self-reliance and independence, cooperation, justice, charity—all coupled with a vision of economic abundance inspired by “our magnificent industrial products, our technical skills.” In present-day patriotic incantations, this is referred to as “American exceptionalism.”
The other, harder side of America’s manifest destiny was, of course, muscularity. Power. Possessing absolute and never-ending superiority in developing and deploying the world’s most advanced and destructive arsenal of war. Luce did not dwell on this dimension of “internationalism” in his famous essay, but once the world war had been entered and won, he became its fervent apostle—an outspoken advocate of “liberating” China from its new communist rulers, taking over from the beleaguered French colonial military in Vietnam, turning both the Korean and Vietnam conflicts from “limited wars” into opportunities for a wider virtuous war against and in China, and pursuing the rollback of the Iron Curtain with “tactical atomic weapons.” As Luce’s incisive biographer Alan Brinkley documents, at one point Luce even mulled the possibility of “plastering Russia with 500 (or 1,000) A bombs”—a terrifying scenario, but one that the keepers of the US nuclear arsenal actually mapped out in expansive and appalling detail in the 1950s and 1960s, before Luce’s death in 1967.
The “American Century” catchphrase is hyperbole, the slogan never more than a myth, a fantasy, a delusion. Military victory in any traditional sense was largely a chimera after World War II. The so-called Pax Americana itself was riddled with conflict and oppression and egregious betrayals of the professed catechism of American values. At the same time, postwar US hegemony obviously never extended to more than a portion of the globe. Much that took place in the world, including disorder and mayhem, was beyond America’s control.
Yet, not unreasonably, Luce’s catchphrase persists. The 21st-century world may be chaotic, with violence erupting from innumerable sources and causes, but the United States does remain the planet’s “sole superpower.” The myth of exceptionalism still holds most Americans in its thrall. US hegemony, however frayed at the edges, continues to be taken for granted in ruling circles, and not only in Washington. And Pentagon planners still emphatically define their mission as “full-spectrum dominance” globally.
Washington’s commitment to modernizing its nuclear arsenal rather than focusing on achieving the thoroughgoing abolition of nuclear weapons has proven unshakable. So has the country’s almost religious devotion to leading the way in developing and deploying ever more “smart” and sophisticated conventional weapons of mass destruction.
Welcome to Henry Luce’s—and America’s—violent century, even if thus far it’s lasted only 75 years. The question is just what to make of it these days.
We live in times of bewildering violence. In 2013, the chairman of the Joint Chiefs of Staff told a Senate committee that the world is “more dangerous than it has ever been.” Statisticians, however, tell a different story: that war and lethal conflict have declined steadily, significantly, even precipitously since World War II.
Much mainstream scholarship now endorses the declinists. In his influential 2011 book, The Better Angels of Our Nature: Why Violence Has Declined, Harvard psychologist Steven Pinker adopted the labels “the Long Peace” for the four-plus decades of the Cold War (1945-1991), and “the New Peace” for the post-Cold War years to the present. In that book, as well as in post-publication articles, postings, and interviews, he has taken the doomsayers to task. The statistics suggest, he declares, that “today we may be living in the most peaceable era in our species’s existence.”
Clearly, the number and deadliness of global conflicts have indeed declined since World War II. This so-called postwar peace was, and still is, however, saturated in blood and wracked with suffering.
It is reasonable to argue that total war-related fatalities during the Cold War decades were lower than in the six years of World War II (1939-1945) and certainly far less than the toll for the 20th century’s two world wars combined. It is also undeniable that overall death tolls have declined further since then. The five most devastating intrastate or interstate conflicts of the postwar decades—in China, Korea, Vietnam, Afghanistan, and between Iran and Iraq—took place during the Cold War. So did a majority of the most deadly politicides, or political mass killings, and genocides: in the Soviet Union, China (again), Yugoslavia, North Korea, North Vietnam, Sudan, Nigeria, Indonesia, Pakistan-Bangladesh, Ethiopia, Angola, Mozambique, and Cambodia, among other countries. The end of the Cold War certainly did not signal the end of such atrocities (as witness Rwanda, the Congo, and the implosion of Syria). As with major wars, however, the trajectory has been downward.
Unsurprisingly, the declinist argument celebrates the Cold War as less violent than the global conflicts that preceded it, and the decades that followed as statistically less violent than the Cold War. But what motivates the sanitizing of these years, now amounting to three-quarters of a century, with the label “peace”? The answer lies largely in a fixation on major powers. The great Cold War antagonists, the United States and the Soviet Union, bristling with their nuclear arsenals, never came to blows. Indeed, wars between major powers or developed states have become (in Pinker’s words) “all but obsolete.” There has been no World War III, nor is there likely to be.
Such upbeat quantification invites complacent forms of self-congratulation. (How comparatively virtuous we mortals have become!) In the United States, where we-won-the-Cold-War sentiment still runs strong, the relative decline in global violence after 1945 is commonly attributed to the wisdom, virtue, and firepower of US “peacekeeping.” In hawkish circles, nuclear deterrence—the Cold War’s MAD (mutually assured destruction) doctrine that was described early on as a “delicate balance of terror”—is still canonized as an enlightened policy that prevented catastrophic global conflict.
Branding the long postwar era as an epoch of relative peace is disingenuous, and not just because it deflects attention from the significant death and agony that actually did occur and still does. It also obscures the degree to which the United States bears responsibility for contributing to, rather than impeding, militarization and mayhem after 1945. Ceaseless US-led transformations of the instruments of mass destruction—and the provocative global impact of this technological obsession—are by and large ignored.
Continuities in American-style “warfighting” (a popular Pentagon word) such as heavy reliance on airpower and other forms of brute force are downplayed. So is US support for repressive foreign regimes, as well as the destabilizing impact of many of the nation’s overt and covert overseas interventions. The more subtle and insidious dimension of postwar US militarization—namely, the violence done to civil society by funneling resources into a gargantuan, intrusive, and ever-expanding national security state—goes largely unaddressed in arguments fixated on numerical declines in violence since World War II.
Beyond this, trying to quantify war, conflict, and devastation poses daunting methodological challenges. Data advanced in support of the decline-of-violence argument is dense and often compelling, and derives from a range of respectable sources. Still, it must be kept in mind that the precise quantification of death and violence is almost always impossible. When a source offers fairly exact estimates of something like “war-related excess deaths,” you usually are dealing with investigators deficient in humility and imagination.
Take, for example, World War II, about which countless tens of thousands of studies have been written. Estimates of total “war-related” deaths from that global conflict range from roughly 50 million to more than 80 million. One explanation for such variation is the sheer chaos of armed violence. Another is what the counters choose to count and how they count it. Battle deaths of uniformed combatants are easiest to determine, especially on the winning side. Military bureaucrats can be relied upon to keep careful records of their own killed-in-action—but not, of course, of the enemy they kill. War-related civilian fatalities are even more difficult to assess, although—as in World War II—they commonly are far greater than deaths in combat.
Does the data source go beyond so-called battle-related collateral damage to include deaths caused by war-related famine and disease? Does it take into account deaths that may have occurred long after the conflict itself was over (as from radiation poisoning after Hiroshima and Nagasaki, or from the US use of Agent Orange in the Vietnam War)? The difficulty of assessing the toll of civil, tribal, ethnic, and religious conflicts with any exactitude is obvious.
Concentrating on fatalities and their averred downward trajectory also draws attention away from broader humanitarian catastrophes. In mid-2015, for instance, the Office of the United Nations High Commissioner for Refugees reported that the number of individuals “forcibly displaced worldwide as a result of persecution, conflict, generalized violence, or human rights violations” had surpassed 60 million and was the highest level recorded since World War II and its immediate aftermath. Roughly two-thirds of these men, women, and children were displaced inside their own countries. The remainder were refugees, and over half of these refugees were children.
Here, then, is a trend line intimately connected to global violence that is not heading downward. In 1996, the UN’s estimate was that there were 37.3 million forcibly displaced individuals on the planet. Twenty years later, as 2015 ended, this had risen to 65.3 million—a 75% increase over the last two post-Cold War decades that the declinist literature refers to as the “new peace.”
Other disasters inflicted on civilians are less visible than uprooted populations. Harsh conflict-related economic sanctions, which often cripple hygiene and healthcare systems and may precipitate a sharp spike in infant mortality, usually do not find a place in itemizations of military violence. US-led UN sanctions imposed against Iraq for 13 years beginning in 1990 in conjunction with the first Gulf War are a stark example of this. An account published in the New York Times Magazine in July 2003 accepted the fact that “at least several hundred thousand children who could reasonably have been expected to live died before their fifth birthday.” And after all-out wars, who counts the maimed, or the orphans and widows, or those the Japanese in the wake of World War II referred to as the “elderly orphaned”—parents bereft of their children?
Figures and tables, moreover, can only hint at the psychological and social violence suffered by combatants and noncombatants alike. It has been suggested, for instance, that 1 in 6 people in areas afflicted by war may suffer from mental disorder (as opposed to 1 in 10 in normal times). Even where American military personnel are concerned, trauma did not become a serious focus of concern until 1980, seven years after the US retreat from Vietnam, when post-traumatic stress disorder (PTSD) was officially recognized as a mental-health issue.
In 2008, a massive sampling study of 1.64 million US troops deployed to Afghanistan and Iraq between October 2001 and October 2007 estimated “that approximately 300,000 individuals currently suffer from PTSD or major depression and that 320,000 individuals experienced a probable TBI [traumatic brain injury] during deployment.” As these wars dragged on, the numbers naturally increased. To extend the ramifications of such data to wider circles of family and community—or, indeed, to populations traumatized by violence worldwide—defies statistical enumeration.
Largely unmeasurable, too, is violence in a different register: the damage that war, conflict, militarization, and plain existential fear inflict upon civil society and democratic practice. This is true everywhere but has been especially conspicuous in the United States since Washington launched its “global war on terror” in response to the attacks of September 11, 2001.
Here, numbers are perversely provocative, for the lives claimed in 21st-century terrorist incidents can be interpreted as confirming the decline-in-violence argument. From 2000 through 2014, according to the widely cited Global Terrorism Index, “more than 61,000 incidents of terrorism claiming over 140,000 lives have been recorded.” Including September 11th, countries in the West experienced less than 5% of these incidents and 3% of the deaths. The Chicago Project on Security and Terrorism, another minutely documented tabulation based on combing global media reports in many languages, puts the number of suicide bombings from 2000 through 2015 at 4,787 attacks in more than 40 countries, resulting in 47,274 deaths.
These atrocities are incontestably horrendous and alarming. Grim as they are, however, the numbers themselves are comparatively low when set against earlier conflicts. For specialists in World War II, the “140,000 lives” estimate carries an almost eerie resonance, since this is the rough figure usually accepted for the death toll from a single act of terror bombing, the atomic bomb dropped on Hiroshima. The tally is also low compared to contemporary deaths from other causes. Globally, for example, more than 400,000 people are murdered annually. In the United States, the danger of being killed by falling objects or lightning is at least as great as the threat from Islamist militants.
This leaves us with a perplexing question: If the overall incidence of violence, including 21st-century terrorism, is relatively low compared to earlier global threats and conflicts, why has the United States responded by becoming an increasingly militarized, secretive, unaccountable, and intrusive “national security state”? Is it really possible that a patchwork of non-state adversaries that do not possess massive firepower or follow traditional rules of engagement has, as the chairman of the Joint Chiefs of Staff declared in 2013, made the world more threatening than ever?
For those who do not believe this to be the case, possible explanations for the accelerating militarization of the United States come from many directions. Paranoia may be part of the American DNA—or, indeed, hardwired into the human species. Or perhaps the anticommunist hysteria of the Cold War simply metastasized into a post-9/11 pathological fear of terrorism. Machiavellian fear-mongering certainly enters the picture, led by conservative and neoconservative civilian and military officials of the national security state, along with opportunistic politicians and war profiteers of the usual sort. Cultural critics predictably point an accusing finger as well at the mass media’s addiction to sensationalism and catastrophe, now intensified by the proliferation of digital social media.
To all this must be added the peculiar psychological burden of being a “superpower” and, from the 1990s on, the planet’s “sole superpower”—a situation in which “credibility” is measured mainly in terms of massive cutting-edge military might. It might be argued that this mindset helped “contain Communism” during the Cold War and provides a sense of security to US allies. What it has not done is ensure victory in actual war, although not for want of trying. With some exceptions (Grenada, Panama, the brief 1991 Gulf War, and the Balkans), the US military has not tasted victory since World War II—Korea, Vietnam, and recent and current conflicts in the Greater Middle East being boldface examples of this failure. This, however, has had no impact on the hubris attached to superpower status. Brute force remains the ultimate measure of credibility.
The traditional American way of war has tended to emphasize the “three Ds” (defeat, destroy, devastate). Since 1996, the Pentagon’s proclaimed mission has been to maintain “full-spectrum dominance” in every domain (land, sea, air, space, and information) and, in practice, in every accessible part of the world. The Air Force Global Strike Command, activated in 2009 and responsible for managing two-thirds of the US nuclear arsenal, typically publicizes its readiness for “Global Strike… Any Target, Any Time.”
In 2015, the Department of Defense acknowledged maintaining 4,855 physical “sites”—meaning bases ranging in size from huge contained communities to tiny installations—of which 587 were located overseas in 42 foreign countries. An unofficial investigation that includes small and sometimes impermanent facilities puts the number at around 800 in 80 countries. Over the course of 2015, to cite yet another example of the overwhelming nature of America’s global presence, elite US special operations forces were deployed to around 150 countries, and Washington provided assistance in arming and training security forces in an even larger number of nations.
America’s overseas bases reflect, in part, an enduring inheritance from World War II and the Korean War. The majority of these sites are located in Germany (181), Japan (122), and South Korea (83) and were retained after their original mission of containing communism disappeared with the end of the Cold War. Deployment of elite special operations forces is also a Cold War legacy (exemplified most famously by the Army’s “Green Berets” in Vietnam) that expanded after the demise of the Soviet Union. Dispatching covert missions to three-quarters of the world’s nations, however, is largely a product of the war on terror.
Many of these present-day undertakings require maintaining overseas “lily pad” facilities that are small, temporary, and unpublicized. And many, moreover, are integrated with covert CIA “black operations.” Combating terror involves practicing terror—including, since 2002, an expanding campaign of targeted assassinations by unmanned drones. For the moment, this latest mode of killing remains dominated by the CIA and the US military (with the United Kingdom and Israel following some distance behind).
The “delicate balance of terror” that characterized nuclear strategy during the Cold War has not disappeared. Rather, it has been reconfigured. The US and Soviet arsenals that reached a peak of insanity in the 1980s have been reduced by about two-thirds—a praiseworthy accomplishment but one that still leaves the world with around 15,400 nuclear weapons as of January 2016, 93% of them in US and Russian hands. Close to 2,000 of the latter on each side are still actively deployed on missiles or at bases with operational forces.
This downsizing, in other words, has not removed the wherewithal to destroy the Earth as we know it many times over. Such destruction could come about indirectly as well as directly, with even a relatively “modest” nuclear exchange between, say, India and Pakistan triggering a cataclysmic climate shift—a “nuclear winter”—that could result in massive global starvation and death. Nor does the fact that seven additional nations now possess nuclear weapons (and more than 40 others are deemed “nuclear weapons capable”) mean that “deterrence” has been enhanced. The future use of nuclear weapons, whether by deliberate decision or by accident, remains an ominous possibility. That threat is intensified by the possibility that nonstate terrorists may somehow obtain and use nuclear devices.
What is striking at this moment in history is that paranoia couched as strategic realism continues to guide US nuclear policy and, following America’s lead, that of the other nuclear powers. As announced by the Obama administration in 2014, the potential for nuclear violence is to be “modernized.” In concrete terms, this translates as a 30-year project that will cost the United States an estimated $1 trillion (not including the usual future cost overruns for producing such weapons), perfect a new arsenal of “smart” and smaller nuclear weapons, and extensively refurbish the existing delivery “triad” of long-range manned bombers, nuclear-armed submarines, and land-based intercontinental ballistic missiles carrying nuclear warheads.
Nuclear modernization, of course, is but a small portion of the full spectrum of American might—a military machine so massive that it inspired President Barack Obama to speak with unusual emphasis in his State of the Union address in January 2016. “The United States of America is the most powerful nation on Earth,” he declared. “Period. Period. It’s not even close. It’s not even close. It’s not even close. We spend more on our military than the next eight nations combined.”
Official budgetary expenditures and projections provide a snapshot of this enormous military machine, but here again numbers can be misleading. Thus, the “base budget” for defense announced in early 2016 for fiscal year 2017 amounts to roughly $600 billion, but this falls far short of what the actual outlay will be. When all other discretionary military- and defense-related costs are taken into account—nuclear maintenance and modernization, the “war budget” that pays for so-called overseas contingency operations like military engagements in the Greater Middle East, “black budgets” that fund intelligence operations by agencies including the CIA and the National Security Agency, appropriations for secret high-tech military activities, “veterans affairs” costs (including disability payments), military aid to other countries, huge interest costs on the military-related part of the national debt, and so on—the actual total annual expenditure is close to $1 trillion.
Such stratospheric numbers defy easy comprehension, but one does not need training in statistics to bring them closer to home. Simple arithmetic suffices. The projected bill for just the 30-year nuclear modernization agenda comes to over $90 million a day, or almost $4 million an hour. The $1 trillion price tag for maintaining the nation’s status as “the most powerful nation on Earth” for a single year amounts to roughly $2.74 billion a day, over $114 million an hour.
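For readers who want to check that arithmetic themselves, a minimal back-of-the-envelope sketch reproduces the figures cited above. It assumes a flat $1 trillion spread evenly over 30 years for nuclear modernization and a flat $1 trillion per year for total military-related spending, with 365-day years and no adjustment for inflation or cost overruns.

```python
# Back-of-the-envelope check of the per-day and per-hour figures cited above.
# Assumption: a flat $1 trillion over 30 years (modernization) and a flat
# $1 trillion per year (total military-related spending); ignores inflation,
# overruns, and leap years.

TRILLION = 1_000_000_000_000
DAYS_PER_YEAR = 365
HOURS_PER_DAY = 24

# 30-year nuclear modernization program
modernization_per_day = TRILLION / (30 * DAYS_PER_YEAR)         # ~$91 million
modernization_per_hour = modernization_per_day / HOURS_PER_DAY  # ~$3.8 million

# Roughly $1 trillion in annual military-related expenditure
annual_per_day = TRILLION / DAYS_PER_YEAR          # ~$2.74 billion
annual_per_hour = annual_per_day / HOURS_PER_DAY   # ~$114 million

print(f"Modernization: ${modernization_per_day / 1e6:.0f} million/day, "
      f"${modernization_per_hour / 1e6:.1f} million/hour")
print(f"Annual total:  ${annual_per_day / 1e9:.2f} billion/day, "
      f"${annual_per_hour / 1e6:.0f} million/hour")
```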
Creating a capacity for violence greater than the world has ever seen is costly—and remunerative.
So an era of a “new peace”? Think again. We’re only three-quarters of the way through America’s violent century and there’s more to come.
The contemporary US is a far cry from Orwell’s Oceania. Yet the Trump administration is doing its best to exert control over the present and the past.
When people use the term “Orwellian,” it’s not a good sign.
It usually characterizes an action, an individual, or a society that is suppressing freedom, particularly the freedom of expression. It can also describe something perverted by tyrannical power.
It’s a term used primarily to describe the present, but whose implications inevitably connect to both the future and the past.
In his second term, US President Donald Trump has revealed his ambitions to rewrite America’s official history to, in the words of the Organization of American Historians, “reflect a glorified narrative… while suppressing the voices of historically excluded groups.”
This ambition was manifested in efforts by the Department of Education to eradicate a “DEI agenda” from school curricula. It also included a high-profile assault on what detractors saw as “woke” universities, which culminated in Columbia University’s agreement to submit to a review of the faculty and curriculum of its Middle Eastern Studies department, with the aim of eradicating alleged pro-Palestinian bias.
Now, the administration has shifted its sights from formal educational institutions to one of the key sites of public history-making: the Smithsonian, a collection of 21 museums, the National Zoo, and associated research centers, principally centered on the National Mall in Washington, DC.
On August 12, 2025, the Smithsonian’s secretary, Lonnie Bunch III, received a letter from the White House announcing its intent to carry out a systematic review of the institution’s holdings and exhibitions in advance of the nation’s 250th anniversary in 2026.
The review’s stated aim is to ensure that museum content adequately reflects “Americanism” through a commitment to “celebrate American exceptionalism, [and] remove divisive or partisan narratives.”
On August 19, 2025, Trump escalated his attack on the Smithsonian. “The Smithsonian is OUT OF CONTROL, where everything discussed is how horrible our Country is, how bad Slavery was…” he wrote in a Truth Social post. “Nothing about Success, nothing about Brightness, nothing about the Future. We are not going to allow this to happen.”
Such ambitions may sound benign, but they are deeply Orwellian. Here’s how.
Author George Orwell believed in objective, historical truth. Writing in 1946, he attributed his youthful desire to become an author in part to a “historical impulse,” or “the desire to see things as they are, to find out true facts and store them up for the use of posterity.”
But while Orwell believed in the existence of an objective truth about history, he did not necessarily believe that truth would prevail.
Truth, Orwell recognized, was best served by free speech and dialogue. Yet absolute power, Orwell appreciated, allowed those who possessed it to silence or censor opposing narratives, quashing the possibility of productive dialogue about history that could ultimately allow truth to come out.
As Orwell wrote in 1984, his final, dystopian novel, “Who controls the past controls the future. Who controls the present controls the past.”
Historian Malgorzata Rymsza-Pawlowska has written about America’s bicentennial celebrations that took place in 1976. Then, she says, “Americans across the nation helped contribute to a pluralistic and inclusive commemoration… using it as a moment to question who had been left out of the legacies of the American Revolution, to tell more inclusive stories about the history of the United States.”
This was an example of the kind of productive dialogue encouraged in a free society. “By contrast,” writes Rymsza-Pawlowska, “the 250th is shaping up to be a top-down affair that advances a relatively narrow and celebratory idea of Americanism.” The newly announced Smithsonian review aims to purge counternarratives that challenge that celebratory idea.
The desire to eradicate counternarratives drives Winston Smith’s job at the ironically named Ministry of Truth in 1984.
The novel is set in Oceania, a geographical entity that covers North America and the British Isles and governs much of the Global South.
Oceania is an absolute tyranny governed by Big Brother, the leader of a political party whose only goal is the perpetuation of its own power. In this society, truth is what Big Brother and the party say it is.
The regime imposes near total censorship so that not only dissident speech but subversive private reflection, or “thought crime,” is viciously prosecuted. In this way, it controls the present.
But it also controls the past. As the party’s protean policy evolves, Smith and his colleagues are tasked with systematically destroying any historical records that conflict with the current version of history. Smith literally disposes of artifacts of inexpedient history by throwing them down “memory holes,” where they are “wiped… out of existence and out of memory.”
At a key point in the novel, Smith recalls briefly holding on to a newspaper clipping that proved that an enemy of the regime had not actually committed the crime he had been accused of. Smith recognizes the power over the regime that this clipping gives him, but he simultaneously fears that power will make him a target. In the end, fear of retaliation leads him to drop the slip of newsprint down a memory hole.
The contemporary US is a far cry from Orwell’s Oceania. Yet the Trump administration is doing its best to exert control over the present and the past.
Even before the Trump administration announced its review of the Smithsonian, officials in departments across government had taken unprecedented steps to rewrite the nation’s official history, attempting to purge parts of the historical narrative down Orwellian memory holes.
Comically, those efforts included the temporary removal from government websites of information about the Enola Gay, the plane that dropped the atomic bomb over Hiroshima. The plane was unwittingly caught up in a mass purge of references to “gay” and LGBTQ+ content on government websites.
Other erasures have included the deletion of content on government sites related to the life of Harriet Tubman, the Maryland woman who escaped slavery and then played a pioneering role as a conductor of the Underground Railroad, helping enslaved people escape to freedom.
Public outcry led to the restoration of most of the deleted content.
Over at the Smithsonian, which earlier in the year had been criticized by Trump for its “divisive, race-centered ideology,” staff removed a temporary placard referring to President Trump’s two impeachment trials from a display case on impeachment in the National Museum of American History’s exhibition on the American presidency. The references to the two impeachments were retained in a newly installed placard in the updated display, but with some details removed.
Responding to questions, the Smithsonian stated that the placard’s removal was not in response to political pressure: “The placard, which was meant to be a temporary addition to a 25-year-old exhibition, did not meet the museum’s standards in appearance, location, timeline, and overall presentation.”
Orwell’s 1984 ends with an appendix on the history of “Newspeak,” Oceania’s official language, which, while it had not yet superseded “Oldspeak” or standard English, was rapidly gaining ground as both a written and spoken dialect.
According to the appendix, “The purpose of Newspeak was not only to provide a medium of expression for the worldview and mental habits proper to the devotees of [the Party], but to make all other modes of thought impossible.”
Orwell, as so often in his writing, makes the abstract theory concrete: “The word free still existed in Newspeak, but it could only be used in such statements as ‘This dog is free from lice’ or ‘This field is free from weeds.’ … political and intellectual freedom no longer existed even as concepts.”
The goal of this language streamlining was total control over past, present, and future.
If it is illegal to even speak of systemic racism, for example, let alone discuss its causes and possible remedies, it constrains the potential for, even prohibits, social change.
It has become a cliché that those who do not understand history are bound to repeat it.
As George Orwell appreciated, the correlate is that social and historical progress require an awareness of, and receptivity to, both historical fact and competing historical narratives.
This story is an updated version of an article originally published on June 9, 2025.