Foreign Policy
Comment
Rajan Menon, Zachery Tyson Brown, Kristen Cordell, Emma Ashford, Matthew Kroenig

Biden Must Rethink the American Way of War

As a presidential candidate, Joe Biden pays his respects to fallen service members on Memorial Day at Delaware Memorial Bridge Veteran’s Memorial Park in New Castle, Delaware, on May 25, 2020. Olivier Douliery/AFP/Getty Images

“You may not be interested in war,” it has often been said, “but war is interested in you.” Perhaps so, but Trotsky’s dictum hasn’t applied to the United States since January 1973, when the country, having relied on conscription from the Civil War through Vietnam, replaced a draft-based military with an all-volunteer force. Since then, only a sliver of U.S. society has ever served in the military, let alone participated in combat. The rest, even people age-eligible for military service, have been walled off from the hazards of war; because the post-9/11 wars have been financed through borrowing rather than higher taxes, Americans haven’t even had to pay for them out of their own pockets.

As a result, neither post-Cold War presidents nor members of Congress have needed to worry about mass demonstrations or electoral backlashes, which has given them greater freedom to continue wars for years.


Today, only 0.5 percent of the U.S. population is on active duty. Veterans account for 7 percent. The proportion of young people who enlist or choose a military career has declined. In a 2015 survey, 85 percent of respondents between 18 and 29 said they definitely or probably wouldn’t sign up for military service even if they were needed, never mind that a majority supported using force to combat terrorism. Financial enticements, including bonuses that run as high as $40,000 for first-time enlistees and $81,000 for those who reenlist, don’t seem to have sufficed. Despite significant increases in spending on bonuses, the Military Times reported in 2019 that only 180,000 of the 1.2 million individuals who turn 18 and meet the enlistment standards in any given year are willing and able to join the ranks. Even though that’s just about what’s required to maintain the size of the total force, in 2017 the Army, the largest branch by far, planned to spend $300 million on bonuses and publicity to add 6,000 soldiers to its ranks. As for 2018, although the Army reduced its initial recruitment goal from 80,000 to 76,500, it still fell short by 7,600 despite waiving quality standards for at least 10 percent of enlistees. Shortages affected not only specialized fields such as explosive ordnance disposal, cyberoperations, and electronic warfare—jobs that require skills for which the military must compete with the private sector—but also entry-level infantry positions.

In addition to falling on a small pool of recruits, the burdens of war now fall disproportionately on certain segments of society. In fiscal 2018, Southern states, home to 38 percent of Americans between the ages of 18 and 24, provided 46 percent of all enlistees. In 2016, four of the five states with the highest share of recruits relative to their recruitable populations were Southern: Georgia, South Carolina, Virginia, and Florida. (Hawaii was the fifth.) Conversely, no Southern state was among the five with the lowest proportion.

Although there is a widespread belief that the economically least well-off Americans are overrepresented in the armed forces, that distinction belongs to the middle class. Families from the bottom fifth of the income distribution are in fact underrepresented, as are those from the top fifth. At the same time, though, Douglas Kriner and Francis Shen contend in The Casualty Gap that soldiers from poorer parts of the country pay a disproportionately high price in combat deaths and physical injuries relative to those from wealthier areas. (In part this is because those with more education and advanced skills are less likely to serve in infantry combat units.) Beyond the corporeal toll it takes, wartime service also leaves deep psychological scars. Nearly 19 percent of veterans who have been deployed for active duty suffer from post-traumatic stress disorder. Worse, more than 6,000 veterans took their own lives each year between 2008 and 2017; the daily average for 2018 was 17.6. The veteran suicide rate far outpaces the national rate, which has itself also increased.


The uneven burden of protecting the United States has led to proposals like the one put forth by Elliot Ackerman, a decorated veteran of the Iraq and Afghanistan wars, which would require that at least 5 percent of the active duty force, around 70,000 individuals, be selected by lottery from top-tax-bracket families and assigned to combat units. The idea is that American leaders would not have quite as much leeway to wage war if more children from the wealthiest and politically most influential slice of society—including, incidentally, members of Congress, whose median net worth exceeds $1 million, and many people who land top cabinet posts—were exposed to its hazards. “Forever wars” might then encounter larger and more robust opposition. But would it work? It is impossible to say for certain, but historical evidence suggests that it would in some ways and not in others.

The main reason for skepticism is that the American public has been walled off from the effects of war in another way. Successive post-Cold War presidents have insisted that the wars they waged, or inherited but chose to continue, were essential to keep the country safe. Yet they didn’t follow up by calling on their fellow citizens to pay for those wars, as a matter of civic duty, through higher taxes. Instead, according to Brown University’s Costs of War project, virtually all of the estimated $6.4 trillion spent or obligated so far on the wars in Afghanistan and Iraq, as well as the counterterrorism campaign in Pakistan, has been borrowed. The Brown study also estimates that, three decades hence, the interest payments on that debt alone could add up to $8 trillion. So much for President Calvin Coolidge’s counsel, in his December 1926 State of the Union address, that “a country loaded with debt is a country devoid of the first line of defense.”

The armed forces command great respect in the United States, but much of the reverence is symbolic. Politicians, among the most fervent supporters of the United States’ recent wars, sport flag pins and make a point of mentioning “our men and women in uniform” in speeches. But most have never experienced war firsthand or lost a child to it. In fact, the proportion of legislators who have served in the military has declined precipitously: from 73 percent in 1971-72, to 64 percent in 1981-82, to 17.8 percent in 2019. As for the rest of society, airlines may offer military personnel “priority boarding,” and families may tie yellow ribbons around trees or affix “Support Our Troops” stickers to their bumpers. Such gestures are not insincere or meaningless, nor should they be mocked, but they are cost-free, emblematic of a mode of fighting and funding war that places virtually no demands on the public.

In consequence, American presidents have gained great latitude to use force abroad. What restraints exist have proved anemic. Consider the 1973 War Powers Resolution: since its enactment, not a single president has found it an insuperable hurdle. Although the Constitution (Article 1, Section 8) vests in Congress the authority to declare war, presidents have used military force abroad without seeking a formal declaration of war, claimed that a particular operation does not really constitute war, or asserted that Congress’s appropriation of funds for the military amounts to consent. Congress hasn’t even withheld authorization when presidents have sought it. The congressional vote on the 1991 Iraq War was 250 to 183 in the House (two members did not vote) and 52 to 47, with one abstention, in the Senate. The margin for the 2003 Iraq War was even greater. Through the 2001 Authorization for Use of Military Force, Congress went much further, effectively giving presidents carte blanche to use military force worldwide. Efforts to repeal it, or limit its duration, have come to naught.

In light of this, it’s unsurprising that presidents have used military force—for war, occupation, regime change, and coercion—far more frequently during the post-Cold War years (the past 30 years) than during two other, substantially longer periods: 1789-1917 (128 years) and the Cold War (45 years). Although a precise tally is impossible for a host of technical reasons, the approximate count (including wars, lesser uses of force, and deployments to potentially hostile environments, but excluding peacekeeping and emergency relief operations) is 103 instances from 1789 to 1917, 34 during the Cold War, and 73 since 1990. Given the differences in timespan, the post-Cold War figure stands out.

Kriner and Shen think that Americans’ attitude toward war would change if they better understood just how unequally the sacrifices demanded by war—military service, deaths, and injuries—have been shared. Their surveys found that nearly half of Americans believe those sacrifices have been shared equally. Yet they also discovered that people’s support for wars does decline once they learn that, proportionately, more soldiers from the lower economic rungs are killed or wounded than those from wealthier backgrounds. Ackerman, then, may have a point.

Until the financial costs are more evenly spread, though, politicians’ ability to countenance open-ended wars won’t diminish. True, replacing an all-volunteer force with one mustered through a draft may prove more expensive (though experts are not of one mind on this) as well as unpopular. And yes, politicians may fear that proposing war taxes would arouse voters’ anger. Yet shared sacrifice, social justice, and civic duty should be part of the equation, especially at a time when the inequality of wealth and power has been increasing.

To be clear, spreading the cost of war—both financial and human—certainly wouldn’t rule out war. But ruling out war isn’t a sensible goal anyway, unless the hard realities of the world change and states no longer need armies to deter would-be aggressors and to defend themselves if deterrence fails. Moreover, draft or not, presidents can and will initiate wars, to say nothing of less dramatic uses of force such as drone strikes—a casualty-free mode of war for the side launching them, albeit not for noncombatants.

But a more equitable distribution of the costs and risks of war might nevertheless make a difference. What’s distinctive about the post-9/11 forever wars in Afghanistan and Iraq is that they’ve dragged on for nearly 20 years. One of them, Iraq, was initiated against a country that wasn’t posing any clear and present danger to the United States. These wars have been hugely expensive, and they have killed and maimed thousands of American soldiers, and many more civilians, in Afghanistan, Iraq, and elsewhere, all without producing anything remotely resembling success. Not only have the hubristic goals of nation-building and democracy promotion proved chimerical, but the wars have also increased chaos and bloodletting. So the relevant question is whether presidents would be less likely to continue such wars, and Congress less willing to bless their continuation, if the burdens were spread more equitably among Americans.

Now may be a particularly good time to change the prevailing American way of war, what with social and economic disparities increasing at home and reminders—including very recent ones—that extremism and terrorism don’t emanate solely from distant lands or conform to prevailing perceptions.
