The Ambiguous Legacy of the Powell Doctrine

There was so much to admire and respect about Colin Powell: son of Jamaican immigrants to post-World War II Harlem, ROTC leader at City College and infantry officer in Vietnam. He matured into a powerful member of the national-security elite: White House fellow during the Nixon administration, military adviser to senior Pentagon officials in the Carter and Reagan administrations, national security adviser, chairman of the Joint Chiefs of Staff and secretary of state. He was also a symbol of what a black man might achieve while still staying within the bounds of the American establishment.

But what is perhaps his most lasting legacy is more complex and, in some ways, corrosive. The “Powell Doctrine” on the proper uses of U.S. military power—briefly, that the U.S. should engage only in conflicts that serve our vital interests, with well-defined goals, and with the use of force as a last resort—has been a constraint on American strategic thinking and a constant problem for civil-military relations. In the wake of the recent debacle in Afghanistan, the situation promises to get worse.

The humiliating retreat from Kabul almost immediately revived the ghosts of Vietnam, which are also the spirits of the Powell Doctrine. In his autobiography My American Journey, Powell wrote that “we had entered into a halfhearted half-war, with much of the nation opposed or indifferent, while a small fraction carried the burden.” To the professional soldiers of Powell’s generation, this was a grave strategic error compounded by a moral crime: American lives lightly sacrificed without national purpose, without prospect of victory, without honor.

To many, that became and remains the dominant narrative of the war and of irregular wars in general. And, especially in the last decade, the narrative has returned. “It’s hard not to make that parallel [between Afghanistan and Vietnam],” former Army Special Forces Capt. Andrew MacNeil told Michael Phillips and Nancy Youssef of the Wall Street Journal. Writing in The Hill, Richard Pierce asserted that “[p]residents can avoid future costly and tragic disasters like the war in Afghanistan by applying the Powell Doctrine every time they consider whether to commit U.S. combat forces.”   

For Powell, the belief in this Vietnam narrative solidified as he rose through the ranks of the Army and bureaucratic Washington. In November 1984 he was military assistant to then-Defense Secretary Caspar Weinberger. The year before, the bombing of a Marine barracks in Beirut had cost the lives of 241 American service members, most of them Marines; it appeared to be another disaster in pursuit of an ambiguous, “peace enforcement” mission that the Joint Chiefs of Staff, all Vietnam veterans, had vehemently opposed. In a speech to the National Press Club on the “Uses of Military Force,” Secretary Weinberger laid out the opposition argument, proposing six “tests” to govern the employment of American military power. This was Powell’s doctrine under Weinberger’s name.

First of all, the mission had to “be deemed vital to our national interest or that of our allies.” Second, troops were to be committed “wholeheartedly, and with the clear intention of winning.” Third, “political and military objectives and the ways to meet them” had to be “clearly defined.” Fourth, if “conditions change,” then the commitment “must be reassessed” in terms of the national interest. Fifth, before troops were deployed, there needed to be “some reasonable assurance” of domestic support in Congress and in public opinion. And, finally, force was only to be used as a “last resort.”

When Saddam Hussein invaded Kuwait in 1990, Powell, then chairman of the Joint Chiefs of Staff, ensured that the first Bush administration passed these tests in shaping its response. The balance of power and the energy resources in the Persian Gulf region were very much in the U.S. and Western security interest; the dispatch of 500,000 U.S. and nearly as many allied troops put the outcome of any military campaign beyond doubt; the political objective—the restoration of Kuwaiti sovereignty—was modest and the military means more than sufficient; when four days of ground counteroffensive had ejected the Iraqi army from Kuwait, the campaign was “reassessed” to a halt; the mission had a U.N. mandate, backing—if barely—from Congress, and troops returning to the United States were feted with yellow-ribbon parades; and the start of the war was delayed and staged to offer Saddam multiple options for withdrawal.

Before retiring from the Army in 1993, Powell codified his thinking in the first formal post-Cold War National Military Strategy of the United States, arguing for a concept of “Decisive Force” in shaping the inevitable defense drawdown. “Once a decision for military action has been made,” the report declared:

Half measures and confused objectives exact a severe price in the form of a protracted conflict which can cause needless waste of human lives and material resources, a divided nation at home, and defeat. Therefore one of the essential elements of our national military strategy is the ability to rapidly assemble the forces needed to win—the concept of applying decisive force to overwhelm our adversaries and thereby terminate conflicts swiftly with minimum loss of life. 

These two sentences seem, to many, to summarize America’s strategic and military mistakes of the last two decades and what ought to have happened instead. Those who regard themselves as foreign policy “realists,” in particular, have claimed Powell as an avatar. Christopher Preble eulogized Powell for the Atlantic Council think tank*: “Powell understood that the nation’s grand ambitions needed to be tempered by reality. Here’s hoping that the strategists who come after him appreciate…the Powell Doctrine’s core wisdom: that acknowledging one’s limits and setting priorities is a virtue, not a vice.” 

Indeed, the case for “decisive” uses of force seems so obvious that it is a wonder that anyone, especially history’s “sole superpower,” would do otherwise. But the shortcoming of the Powell Doctrine is that it describes an ideal of war. It is almost never, in fact, a realistic approach to strategy, especially in a nation with worldwide security interests.

Consider the supposed “tests” of Powell’s wisdom in the use of military power. What defines America’s “vital” interests? They certainly include the security of the “homeland,” extending across North America, the Pacific (at least to Hawaii and Guam) and, in practice, throughout the Caribbean Basin, as well as unconstrained access to the seas, skies, near-earth space and cyberspace. But the “American way of life” of individual liberty and justice under the rule of law is equally accounted essential. In practice, these vital interests are so broad that they are more often in conflict than in alignment.

The definition of “winning” is likewise inherently unclear; World War II is the exception to history’s rule. In Afghanistan, for example, it was a significant achievement simply to remain there in sufficient force to continue to keep the Taliban at bay and suppress al-Qaeda, ISIS and other locally based terrorists such as the Haqqani Network. The mission was also useful for constraining mischief, to which Pakistan is endemically prone. This sort of “mowing the grass” posture is something Israelis consider to be winning, even in a longer-running “forever war.”

Aligning political ends, strategic ways, and military means is a most delicate art, one that tends to reward flexibility, improvisation, and sheer luck as much as or more than clear foresight. In the Civil War, Abraham Lincoln shifted the war’s purpose profoundly, from union to abolition. He embraced at least three very different strategies, beginning with Winfield Scott’s “Anaconda” plan to “strangle” the Confederacy into submission, thence to Grant’s brutal “choking and chewing” on Southern military forces and finally to Sherman’s disembowelment of the plantation economy. And, with three major theaters of war and countless secondary fronts, the forces available were never enough; resources were shifted according to the crisis of the month.

In war, not only do tactical, technical, and operational realities shift, but so can the understanding of fundamental national interests. Again, the Civil War offers an example: Early in the war, the abolition of slavery was not deemed to be a national interest, yet by 1862, it had become the primary interest of the contest and indeed a redefinition of America’s national purpose. Yet this “reassessment” was entirely informal, in the minds of the president and the people, and occasioned no direct shift in military planning.

And if Lincoln had heeded Powell’s prescriptions on public and congressional opinion and support, he would never have defended Fort Sumter. The nation was at daggers drawn and the remaining Congress divided between a rump of Union Democrats—many of them “Copperheads”—and an emerging majority of “Radical” Republicans pushing Lincoln toward ever-more-aggressive prosecution of the war. Even the Union armies and generals were split along partisan lines. The president could never be confident of support, as he cobbled together new coalitions on a daily basis.

Perhaps most specious of all is Powell’s demand that military force must be a “last resort.” There is always an alternative to war—as the Biden administration appears determined to demonstrate. Lincoln might have just let the South go, and George McClellan would have done so, even in 1864. But more profoundly, military power is the ultima ratio regum, the justification for sovereign states; foreign policy is inherently a militarized endeavor, an exercise, first and foremost, in “hard power.”

Thus does the Powell Doctrine reveal itself less as a prescription for the conduct of war than as a proscription against it. The late Secretary of State George Shultz, who famously feuded with Defense Secretary Weinberger, highlighted this even in the aftermath of the Beirut bombing. We “cannot allow ourselves to become the Hamlet of nations, worrying endlessly over whether and how to respond,” he told the Park Avenue Synagogue in October 1984. “A great nation with global responsibilities cannot afford to be hamstrung by confusion and indecisiveness.”

As Polonius-like as the Powell Doctrine may be as a strategy, it is much more disturbing for its effects on civil-military affairs. It has encouraged military leaders to encroach upon the prerogatives of their civilian masters, who are limited, in Powell’s model, to a binary “decision for war.” In our republic, it is not for the president to submit to a test set by the generals on the employment of the troops that he or she, by constitutional design, commands. Powell’s career as chairman of the Joint Chiefs of Staff was notable not only for his Desert Storm triumph, but also for his efforts to strong-arm the Clinton administration against intervening in the Balkans or permitting gays and lesbians to serve openly in the armed forces.

The unhappy response to Powell’s power grab was to engender suspicion of the military among civilians in subsequent administrations of both parties. Anything framed as “best military advice” was viewed with a jaundiced eye—consider the Biden administration’s conduct of the Afghanistan withdrawal—and the propensity has increasingly been to promote officers for their political reliability. In combination with the decline of military experience or expertise among political appointees, this has contributed much to the strategic decline of the United States in recent years.

Colin Powell’s American journey is both an inspiration and a caution. His service and his successes represent the best of us, but his views on the use of force have created a crippling legacy and an increasingly dysfunctional civil-military relationship.

*Correction, October 22: This article originally identified Christopher Preble as a Cato Institute scholar.

Giselle Donnelly is a senior fellow in foreign and defense policy studies at the American Enterprise Institute.
