In the dispiriting summer of 1979, a beleaguered President Jimmy Carter tried to sell his fellow citizens on a radical proposition: Having strayed from the path of righteousness, the nation was in dire need of moral and cultural repair.
Carter's pitch had a specific context: An "oil shock" -- this one a product of the Iranian Revolution -- had once more reminded Americans that their prevailing definition of the good life depended on the indulgence of others. The United States was running out of oil and was anxiously counting on others to provide it.
Yet the problem at hand, Carter insisted, went far beyond "gasoline lines or energy shortages." A "mistaken idea of freedom" had led too many Americans "to worship self-indulgence and consumption." The nation therefore faced a fundamental choice. Down one path lay "fragmentation and self-interest," pointing toward "constant conflict" and "ending in chaos and immobility." Down the other lay a "path of common purpose and the restoration of American values." By choosing rectitude over profligacy, the nation could save itself. Making the sacrifices needed to end their dependence on foreign oil would enable Americans to "seize control again of our common destiny."
Alas, members of the congregation weren't buying what Pastor Carter was selling. They had no interest in getting by with less. Ronald Reagan, sunny where Carter was dour and widely expected to challenge the president for reelection, was offering an alternative view: For Americans, there is always more. Besides, austerity didn't sound like much fun. Soon enough Carter himself got the message. In January 1980, he capitulated, declaring Persian Gulf oil a cause worth fighting for.
The implications of this Carter Doctrine were not immediately apparent. Yet what unfolded over the course of subsequent decades was a vast military enterprise that today finds US forces engaged in something approximating permanent war, not only in the Persian Gulf but across large parts of the Islamic world.
At odd intervals during this very long conflict, Carter's theme of moral and cultural restoration resurfaced, albeit with a twist. Observers expressed hopes that war itself might somehow provide the instrument of national redemption.
George W. Bush was eloquent on this point. In his 2002 State of the Union Address, Bush depicted 9/11 itself as an occasion for cultural transformation. "This time of adversity," he announced, offers "a moment we must seize to change our culture." Indeed, the change was already happening. "After America was attacked, it was as if our entire country looked into a mirror and saw our better selves. We were reminded that we are citizens with obligations. . . . We began to think less of the goods we can accumulate and more about the good we can do." For too long, Americans had adhered to the dictum "If it feels good, do it." Now, even as Bush was urging his fellow citizens to shop and take vacations, he commended them for embracing "a new culture of responsibility."
This was mostly nonsense, of course. Just as Americans had ignored Carter's critique of their "mistaken idea of freedom," so too they passed on Bush's "culture of responsibility." To avoid getting sucked into the Middle East, Carter had admonished Americans to change their ways. The response: Piss off. With the United States now wading into a Middle Eastern quagmire, Bush revived Carter's call for a cultural Great Awakening. Although 9/11 briefly induced a mood of "United We Stand," the invasion of Iraq, with all the mournful consequences that ensued, terminated that feel-good moment and demolished Bush's standing as moral arbiter.
In truth, the inclinations, habits, and mores that Carter bemoaned and that Bush fancied war might banish are immune to presidential authority. Presidents don't control the culture; they cope with it. In times of war, they abide by what the culture permits and adhere to what it requires. Simply put, culture shapes the American way of war.
Certainly this was the case during prior conflicts in US history such as the Civil War and World War II. During each, a widely shared (if imperfect) collective culture imbued the war effort with an effectiveness that contributed directly to victory. From the very outset of the war that the United States has for decades waged in various parts of the Islamic world, just the reverse has been true. An absence of cultural solidarity has undermined military effectiveness.
These days, American culture posits a minimalist definition of citizenship. It emphasizes choice over duty and self-gratification over sacrifice -- except where sacrifice happens to accord with personal preference. Individuals enjoy wide latitude in defining the terms of their relationship to the state. Pay your taxes and obey the law; civic obligation extends that far and no further.
So in conducting military campaigns in the Islamic world, presidents from Carter's day to our own have asked little -- indeed, next to nothing -- from the vast majority of citizens. They are spectators rather than participants.
The people find this arrangement agreeable. On occasion, some exceptionally egregious calamity such as the Beirut bombing of 1983, the "Black Hawk down" debacle of 1993, or the botched occupation of Iraq following the invasion of 2003 may briefly command their attention. But in general, they tune out what they view as not their affair.
As recently as the 1960s, antipathy toward a misguided and failing war generated mass protest. Today, instead of protest there is accommodation, with Americans remarkably untroubled by the inability of those presiding over the ebb and flow of military actions across the Greater Middle East to explain when, how, or even whether they will end.
To be sure, even today we retain a residual capacity for outrage, as the Occupy and Black Lives Matter movements have demonstrated. When the issue is inequality or discrimination based on race, gender, or sexuality, we still take to the streets. When it comes to war, however, not so much. The "peace movement," to the extent that it can be said to exist, is anemic and almost entirely devoid of clout. Our politics allows no room for anything approximating an antiwar party. Instead, the tacit acceptance of war has become a distinguishing feature of the contemporary American scene.
With The People opting out, the burden of actually conducting the various campaigns launched pursuant to the Carter Doctrine falls to those who willingly make themselves available to fight. We may compare these volunteers to fighter pilots during the Battle of Britain: They are the Few. The many have other options and act accordingly.
Given the choice between a job in finance and the chance to carry an assault rifle, Harvard grads opt for Wall Street, destination of roughly one-third of graduating seniors in recent years. In 2015, by comparison, participants in Harvard's annual military commissioning ceremony numbered exactly four. Offered the opportunity to sign with the pros or the Army, top athletes opt for the playing field rather than the battlefield. The countercultural Tillman Exception awaits replication.
So a central task for field commanders has been to figure out how to fight wars that the political class deems necessary but to which the rest of us are largely indifferent. In 2007, Admiral Mike Mullen, the Joint Chiefs of Staff chairman, neatly summarized the problem. "In Afghanistan, we do what we can," he remarked. "In Iraq, we do what we must." Implicit in Mullen's can/must formulation was the fact that in neither Afghanistan nor Iraq were commanders able to do what they wished. The constraint they labored under was not money or equipment, which were available and expended in prodigious quantities, but troops.
We tend to rank those two conflicts among this nation's "big wars." In reality, except as measured by duration, both qualify as puny. The number of troops committed to Operation Iraqi Freedom and Operation Enduring Freedom together peaked at one-third the number of Americans serving in Vietnam in 1968. However much commanders in Iraq and Afghanistan might have wanted more troops -- and they did -- more were not forthcoming.
For an explanation, look not to need but to availability. In truth, the pool of the willing is not deep. Sustaining what depth there is requires incentives. Since 9/11, new recruit pay has jumped by 50 percent. Reenlistment bonuses can run as high as $150,000. In a material culture, appeals to patriotism don't suffice to elicit and retain volunteers. Inducing people to put their lives on the line requires upfront compensation. Even then, the Few remain few.
In practice, roughly 1 percent of the population bears the burden of actually fighting our wars. A country that styles itself a democracy ought to find this troubling. Yet unlike the inequitable distribution of income, which generates considerable controversy, this inequitable distribution of sacrifice generates almost none. Even in a presidential election year, it finds no place on the nation's political agenda. In the prevailing culture of choice, those choosing to remain on the sidelines are not to be held accountable for the fate that befalls those choosing to go fight.
Yet while the availability of warriors may be limited, money is another matter. Ours is not a pay-as-you-go culture. It's go now and worry about the bills later. So it has been with the funding of recent military operations. Rather than defraying war costs through increased taxation -- thereby drawing public attention to the war's progress or lack thereof -- the government borrows, with the sums involved hardly trivial. Since 9/11 alone, the national debt has nearly quadrupled. By and large, Americans are OK with sloughing off onto future generations the responsibility of paying for wars presumably undertaken on their own behalf -- a bit like buying a pricey car and sticking your grandkids with the payments.
Granted, money partially offsets the shortage of troops. In Iraq and Afghanistan, the Pentagon found it expedient to contract out functions traditionally performed by soldiers. When each of those wars was at its height, contractors in the employ of profit-minded security firms outnumbered G.I.s. Privatizing war offers a workaround for a state with a large appetite for war and a people nursing appetites of a different sort.
In some circles, as a sort of hangover from Vietnam, the belief persists that American culture, at least in quarters where professors and artsy types congregate, is intrinsically anti-military. Nothing could be further from the truth. American culture is decidedly pro-military. All it asks is that military institutions get in step with the culture's core requirements.
And so they have, becoming open to all as venues for individual self-actualization. The Pentagon has embraced diversity -- the very signature of contemporary culture. In today's military, overt racism has ceased to exist. Women wear four stars, fly fighter jets, graduate from Ranger School, and can enlist as combat Marines. Barriers preventing members of the LGBT community from serving openly? On the way out.
This is to the good. Yet where it really counts, our culturally compliant military falls short. Thrust into a series of wars to make good on the people's expectations of more, while respecting their aversion to sacrifice, the Few -- admirable in so many ways -- find themselves unable either to win or to get out.
Whether Carter's "restoration of American values" or Bush's "culture of responsibility" would find us today in a different place is a moot point. Our culture remains on a fixed trajectory.
When it comes to making war, that culture hinders rather than helps. Rather than fighting the problem, policy makers should consider turning it to US advantage. Instead of promoting American-style freedom at the point of a bayonet, they should pursue alternatives to war. Instead of coercion, perhaps it's time to try seduction.