Where would American strategic debate be without doctrines? ‘Doctrine’ is a simple concept. In the context of killing people and breaking things, it merely means the principles that guide action, or the ideas that underpin the use of force. It is about how, when, how much and why to bleed. The word has also taken on a deeper resonance, as the mark of presidential gravitas and a way to put whole chapters of diplomatic history into a nutshell. The Monroe Doctrine of 1823 declared America’s hemisphere off limits to outsiders. The Truman Doctrine of 1947 dedicated the country to a far-reaching struggle against Soviet communism and signaled the expanding horizons of American power. In our own time, the Bush Doctrine marked the era of the Global War on Terror. It announced Washington’s bid to tame a world of rogues and terrorists back into order through a triad of preventive war, unilateralism and democracy promotion. We don’t know how history will judge it. But the adventure in hawkish Wilsonian idealism has been a tiring and costly one. Observers of American statecraft crave yet another doctrine to guide the superpower as it picks its way through the chaos. What should it be?
If we need a fresh doctrine, it does not have to be a new one. It does not even have to be one coined by a President. The most prudent doctrine for an era of protracted wars, debt-deficit crisis, political gridlock and psychological exhaustion is one that was laid down almost thirty years ago. On 28 November 1984 at the National Press Club, Secretary of Defense Caspar Weinberger explained why and under what conditions the United States should use force. He spelt out six tenets: the vital interests of the United States or its allies must be at stake; once committed, the effort should be wholehearted; political objectives must be clearly defined; objectives must be continually reassessed to preserve a balance between means and ends; there must be Congressional and public support; and war should be undertaken only as a last resort.
In other words, if America has to fight, it should do so for tightly defined and achievable goals, with strong backing from the political class and the masses, and it must strike hard. This became known as the ‘Weinberger Doctrine’, and was later reaffirmed by General Colin Powell in the debate about American interventions abroad after the Cold War. It was an attempt at strategy in its classical sense: the limitation of war to make it serve policy, and the avoidance of wars in which chaotic violence escalates out of proportion and even out of control.
The shadow of the Vietnam War (1955-1975) hung over this restrictive doctrine. Memories have dimmed for today’s generation, but that struggle in Southeast Asia bred a fear of fighting prolonged peripheral engagements that were increasingly divisive, where the costs outstripped the gains, where half-measures were seen to be ineffective and where military means and policy ends could not be bridged. In the eyes of many, it wasted American lives, not to mention Vietnamese lives, for a struggle that had perverse results. The war threatened rather than bolstered America’s containment of communism. It eroded rather than strengthened Washington’s credibility abroad. Instead of fortifying American democracy in the world, it polarized and poisoned it at home. And the expansion of the war into Cambodia helped to produce a genocidal totalitarian nightmare under the Khmer Rouge. The fears bred by Vietnam were also geopolitical. In America’s wider competition with the Soviet Union, its officials feared that deepening ‘sideshow wars’ could drain resources and political will from the main struggle.
Not for the first time, the National Press Club was the venue for a doctrine that fell out of fashion. Looking back, the Weinberger Doctrine has been dismissed as a relic of its age, an overreaction to the Vietnam syndrome that is a better guide to the anxieties of policymakers in the late Cold War than it is to our current problems. The Doctrine has drawn critical fire from several directions. Its concern for control and boundaries displeased those like George Shultz and Henry Kissinger who preferred that Washington reserve wide latitude for using force on its own terms, scale and timetable, unfettered by prior limitations. The Doctrine irritated those who believe that America’s armed forces are only valuable if they are used, overlooking their broader worth as a means of deterrence or as insurance for a rainy day. That was the complaint of Secretary of State Madeleine Albright, frustrated as she was by the caution of the Chairman of the Joint Chiefs of Staff Colin Powell. In her words, ‘What’s the point of having this superb military that you’re always talking about if we can’t use it?’[i]
For the historian Brian McAllister Linn in his landmark study of US Army traditions The Echo of Battle, the Weinberger Doctrine is guilty of control freakery born of ‘the propensity to view war as an engineering project in which the skilled application of the correct principles could achieve a predictable outcome…a conviction that it was possible to wage a scientifically exact, one-sided war, making the enemy no more than the passive recipient of overwhelming military power.’[ii] For the theorist and practitioner of ‘small wars’ John Nagl, the strategic future is inevitably one of fighting to win over populations rather than convenient clashes like the first Gulf War to expel an invader. This makes the notion of only playing to one’s strengths naive wishful thinking.[iii] For former Secretary of the Navy James Webb, writing only months after al-Qaeda’s attacks on New York and Washington, the doctrine was just ‘a “best-case scenario” for situations where the U.S. could respond at its own discretion, using a schedule of its choosing, against an enemy whose military makeup allowed such a response. But sometimes a nation must fight even when it cannot muster up an overwhelming advantage, as in the early days of World War II. And sometimes massive force is irrelevant, as in the anti-terrorist campaigns we are waging today.’[iv]
The gravamen of the disagreement between the doctrine’s defenders and its critics is the question of discretion. How far can America select its wars? Where Weinberger and his followers impose conditions and criteria, their critics insist this is futile. In their fatalistic version of history, war comes to America unbidden, at times in highly inconvenient forms. The US cannot afford to plan on the basis of convenient wars, selecting its clashes from a menu. War is imposed upon it. When a Tojo, Kim Il Sung or Bin Laden attacks and shocks the US directly or assaults a vital interest, America must make war, and not under conditions of its choosing. To ignore this pattern is to leave the country’s forces exposed, as they were in Iraq from 2003: designed for a swift interstate clash in which they would find, fix and smash a conventional military, they were then lost in a struggle against a different kind of opponent and a different kind of war they had not prepared for. For General David Petraeus, this is a lesson written in the blood of America’s dead in the 9/11 wars.[v] For Max Boot, ‘the United States cannot determine the nature of its future wars; the enemy has a vote, and the more evident the U.S. inability to deal with guerrilla or terrorist tactics, the more prevalent those tactics will become.’ In this way, strategy (the big picture linking military means to policy ends) is reduced to an issue of tactics. The discussion focuses on the enemy’s tactical vote on how to fight American troops, rather than America’s strategic vote on whether and why to place them there in the first place.
But on the contrary, America almost always does get to choose the wars it fights. That is one of the conditions its founders and leaders coveted. The ability to choose or refrain was one of the bounties of the gradual effort to maneuver European empires out of the continent, make the country bicoastal, and establish hemispheric dominance. Being able to choose which fights to pick and which to avoid is part of being a nuclear offshore superpower with unthreatening enemies, strong maritime and air shields and a potent army and Marine Corps to boot. The United States is not interwar Poland, stuck between larger predators with few reliable friends and vulnerable to a snap invasion across contiguous territory. If ever a state existed that usually, emphatically, does not have to accept war being imposed by others, it is this one.
The phrase ‘the enemy gets a vote’ has become a recurrent cliché in the debate, but it deserves unpacking. Translated into this debate, it effectively means that the state and its military should obey the enemy’s orders, and that the military especially should confine itself without question to preparing for counterinsurgency as its destiny. Adversaries may get a vote in how the US engages them, but in the choice of conflicts, and in where, when and how to deploy troops, they have no vote. Whether or not to send an expeditionary force into terrain where there is an excellent chance of violent resistance is a matter for discretion, not compulsion. Unless Washington reverts to a rigid Prussian conception of a divide between war and politics, the military is not there simply to obey orders silently. America’s military is not just charged with devising tactics while awaiting orders for where and whom to fight. It also has a broader and higher advisory role, and is involved in the formulation of strategy with its civilian masters. One of its responsibilities is to inform the choices of policymakers. In offering that professional advice, the military can advise not only on how to fight COIN campaigns, but on whether they are a good idea. War itself is a choice. It is an elective political act, not a state of being that one power can inflict on another. Short of another state directly trying to annihilate the defender, there is always a value judgment to be made.
Enemies’ ‘voting’ to send America to war is a concept rooted in collective memories of the past, namely the strategic shock of 7 December 1941. But World War Two was not imposed upon a sleeping, passive isolationist America. It resulted from an escalating conflict in East Asia against Japan over its brutal occupation of China, as the Roosevelt administration imposed crippling sanctions. Before America became a formal belligerent when Japan struck Pearl Harbor in December 1941, it was consciously fighting an undeclared war in the Atlantic, providing armed escorts, extending its defense perimeter, engaging in secret staff talks with Britain, and generally being a most un-neutral neutral. The United States did not have to enter hostilities with Nazi Germany in 1940-41 any more than Britain had to make guarantees to Poland in 1939. Even when it did, it made choices continually: where and when to fight, which theatres to prioritize, how to trade off sacrifices and gains, and how far to favor peripheral or direct operations.
As for 9/11, that atrocity too was a violent political act that required interpretation. It did not have to lead to an ambitious war on terror. It did not have to result in a bid to bring democratic transformation to Afghanistan, where after a decade the US is fighting to preserve a kleptocratic regime barely ruling a fractured country, while bargaining with a Taliban that optimists once declared defeated. It certainly did not have to result in an invasion of Iraq. The US had wide scope to define its adversary (one terrorist network, terrorism itself, or beyond that Iraqi Sunni supremacists, warlords or Hamas?), to define victory (containment or rollback?), and to balance risks and costs.
The most compelling argument for the Weinberger Doctrine is the history of what happened when it was most radically abandoned. Consider the Global War on Terror, which has had its victories, degrading Al Qaeda’s capabilities, but at Pyrrhic costs, and which reached its crescendo in the invasion of Iraq in 2003. That invasion was undertaken in a spirit antithetical to the careful statecraft laid out by Weinberger. Against his realist caution, it was launched in a spirit of self-celebratory triumphalism, reflecting the neoconservative vision of a politics of heroic greatness and its impatience with what it sees as an alien and un-American realism, with its concern for limitation. Rather than carefully defining limited goals, the makers of war in 2003 ignored warnings that toppling Saddam could sow mischief in a fractured and frightened Iraqi population, assuming instead (with exiles urging them on) that overthrowing the regime would naturally unleash a peaceful democracy. Instead of planning for a difficult ‘post-conflict’ reconstruction, Washington directed its energy into the campaign, not the aftermath. In a unipolar moment, hawkish liberal idealists argued, America should unleash its power and pay no heed to doctrines of self-restraint.
The cumulative direct and indirect financial costs of the Iraq conflict alone, according to the Nobel laureate and former chief economist of the World Bank Joseph Stiglitz, exceed $3 trillion, treasure that could have been used productively to address problems in Social Security provision or oil dependence, or not spent at all. The war killed 4,500 U.S. troops and 3,400 contractors, and injured more than 32,000, including thousands with critical brain and spinal injuries. Looking after them will not be cheap, given the costs of long-term treatment and disability benefits flowing from post-traumatic stress disorder (PTSD) and traumatic brain injury (TBI). The war also resulted in the deaths of hundreds of thousands of Iraqis killed in communal violence. To be sure, Iraq now has a constitutional government of sorts, no longer under the yoke of Baath party rule. But it is becoming an increasingly authoritarian state that is still wracked by communal violence. Thousands have been killed and wounded in bombings since the American withdrawal. If there is an emerging geopolitical winner, it is Iran. And all of this to overthrow a regime crippled by sanctions, with no serious ties to Al Qaeda. These are disappointing gains for such heavy costs. It would be unwise to conclude that the grand lesson is to prepare to fight such an unforced war more competently. The essence of the Weinberger Doctrine, as in 1984, is to restore strategy in order to limit war, not just to refine tactics to get it right the next time.
Moments of overreach have prompted debate before about strategic doctrines and the scope of military commitments. Two such moments were the crossing of the 38th parallel by American-led UN forces on the Korean peninsula in 1950, and the escalation of America’s expeditionary land commitment to Vietnam in 1964-65. The former expanded what could have been a successful limited territorial war into a conflict against communist China, after many credible warnings from the enemy and from informed observers were ignored. The latter proved to be the most polarizing, costly and demoralizing conflict of the Cold War, endangering the very legitimacy of containment. Both represented a shift from territorially-conceived and bounded security interests to psychological and universal ones. Without a disciplined concept of vital interests (literally, ‘needed for life’), those interests become limitless, the strategic frontiers everywhere, and war begins to serve not policy but itself.
The embrace of a limitless concept of strategic interests and of the scope for military action has weighty policy implications. In John Nagl’s words, ‘Sept. 11 conclusively demonstrated that instability anywhere can be a real threat to the American people here at home. Defeating instability through effective counterinsurgency operations is therefore a core mission of the Defense Department.’[vi] In fact, there are many measures short of expensive and difficult COIN campaigns that can disrupt violent guerrilla threats against the United States from afar. Military occupations, moreover, are not historically optimum antidotes to instability, whether abroad or at home (as France found in Algeria, Israel in Lebanon or the Soviet Union in Afghanistan). And September 11 was linked more to American entanglement in the politics of the Arab-Islamic world than to some generic condition of instability. But there is a darker implication of Nagl’s statement. If American security relies on ‘Defeating instability’, and ‘anywhere’, then Nagl and his fellow small-wars advocates are offering a doctrine of permanent revolution. In pursuit of absolute security, they would put America on a footing of endless war. In place of careful means-ends calculation, they offer a field manual.
There is, in fact, an alternative model, found not only in the doctrine’s declaratory form but in practice: America’s response to the crisis that gave rise to the doctrine in the first place. It is the response of President Ronald Reagan in October 1983 to the bombing of the US Marine barracks in Beirut, which killed 241 American servicemen on one of the bloodiest days for America’s military forces since World War Two. Reagan denounced the attack, pledged to stay and ordered retaliatory bombings, but only months afterwards withdrew the US Marines offshore. In response to Islamists using asymmetric methods, Reagan did not decide that America had no choice but to enter an ambitious land war of regime change and armed nation-building. He pulled the ground forces out. A disciplined and prudent choice was available and he took it. That decision is rarely spoken of as a great strategic blunder. For all Reagan’s ringing rhetorical offensives and sunny visions of Cold War triumph, when it came to ground wars the Gipper knew when to stop. We can only imagine the pleas of today’s small-wars faction to act differently under similar circumstances.
If now is a time for fresh doctrine, it is no time for dogma. As Michael Cohen argues, a return to harder thinking about the criteria and purpose of war ‘doesn’t mean a slavish devotion to the criteria outlined by either Weinberger or Powell. There will always be times when the limited use of force is appropriate. But it does mean a far more rigorous consideration of costs and benefits, as well as the national interest, before such force is employed.’ In that pragmatic recognition of the limits of power, a better way would lie. The greatest complaint about the doctrine is its greatest virtue: that if applied, it would raise the threshold for using force. We can only hope.
[i] Madeleine Albright, Madam Secretary: A Memoir (New York: Miramax, 2003), p. 182.
[ii] Brian McAllister Linn, The Echo of Battle: The Army’s Way of War (Cambridge: Harvard University Press, 2007), p.199.
[iii] John A. Nagl, ‘A better war in Iraq: Learning counterinsurgency and making up for lost time’, Armed Forces Journal (2006), pp. 22–28.
[iv] James Webb, ‘A New Doctrine for New Wars’, Wall Street Journal, 30 November 2001.
[v] David Petraeus, ‘We must be coldly realistic over the use of force’, Telegraph, 10 June 2013: ‘Our enemies will typically attack us asymmetrically, avoiding the conventional strengths that we bring to bear. Clearly, the continuation of so-called “small wars” cannot be discounted. And we should never forget that we don’t always get to choose the wars we fight.’
[vi] Nagl, above n. iii.