At first sight, the recurring difficulties experienced by conventional military forces when confronted by unconventional opponents are quite surprising. Unconventional war – whether called insurgency, guerrilla war, asymmetric war, or some other term du jour – is as old as history, the natural response of those who feel unable to confront conventional opponents openly. Yet armies, even quite good ones, often fumble their initial responses to such opponents, greatly complicating – and sometimes precluding – successful campaigns against them. Why? The answer will of course vary somewhat in each situation, but broadly speaking, three factors seem to recur: ingrained attitudes, defective institutional memory, and professional priorities.
Attitudes are the most obvious. Professional military organizations have a hard time taking seriously as opponents (or allies, for that matter) those who do not look or behave in the least like themselves. John Ford caricatured such attitudes in his classic 1948 western, “Fort Apache”; his West Point-trained colonel’s biases are the prelude to an Apache victory. Ford had his finger on something all too real – and about to recur. Less than two decades later, Lyndon Johnson famously (or notoriously) remarked that no group of “raggedy-assed little guerrillas in black pajamas” was going to defeat the armed forces of the United States. It is easy to identify – indeed, to parody – such attitudes; it is harder to fix them. After all, armies do need to prepare to confront their like. Traditions of victory in such encounters foster pride, and pride reinforces cohesion and morale, constituents of continuing victory. It is all too easy, then, for armies to assume that scruffy irregulars will pose no problem. Since no one wants an army without belief in itself and pride in its record, how do we prevent something valuable from leading to disaster when invoked in an unconventional situation? Here is where institutional memory should come in.
The preservation of “lessons learned” in a military institution – at least beyond the professional lifetime of those who learned the lessons – is the job of staffs and the doctrine they create. In the nineteenth century no army fought more “little wars” than the British. Lacking a staff, however, the British Army never developed any doctrine on the subject. The closest approach was an unofficial study, Charles E. Callwell’s Small Wars: Their Principles and Practice (1896), which appeared only on the eve of the Boer War. In that conflict the British Army had to develop its response to the challenge of guerrilla war on the fly, with predictable muddles, disasters and scandals. When it was all over, the British Army was reorganized and a general staff created. Yet two decades later, confronted by another intractable insurgency (with an urban dimension not present in the Boer War), the British Army again found itself improvising a response. The problem of institutional “forgetfulness,” even with a staff system and doctrine to preserve lessons learned, takes us to the final item: the professional priorities of armies.
When the British Army reorganized after the Boer War, it was to configure itself to fight on the continent against the German Army. A century of little wars and even the expensively purchased experience from South Africa were no longer of professional interest to the British Army or to its new general staff. Then came World War I. When unconventional war again became a concern in Ireland in 1919-20, an army that had been accustomed to thinking in terms of corps and army operations supported by artillery barrages and tactical air power found itself starting from scratch and handicapped, among other things, by the sense that the messy, ambiguous conflict in Ireland was not real soldiering (and would not advance anyone’s career). That in turn made it easier to “outsource” counter-insurgency to an improvised special force, the Black and Tans, who – while certainly damaging the insurgents – inflicted far more collateral damage on the credibility and reputation of the British government.
All this has obvious relevance to recent American military history. The great formative moment in the U.S. Army’s twentieth-century history was World War II. That triumph – the apotheosis of what the late Russell Weigley, in his most important book, called The American Way of War – made the U.S. Army reluctant to reshape itself for a different type of conflict, especially with the Red Army sitting just over the horizon. As Lieutenant Colonel John Nagl has shown in his penetrating Learning to Eat Soup with a Knife (2002), this “Fulda Gap” mentality reasserted itself rapidly after the Vietnam War, aided by the understandable institutional desire to close the door on a very unhappy episode. Careers were made by staying in the military mainstream, not by worrying about unconventional war – that could be left to a few (professionally marginalized) specialists. When the next big conventional conflict came, it was not – fortunately – against the Red Army but against Iraq, a rather less formidable opponent. The dazzling success the American military enjoyed in that Gulf war, however, served only to deepen its commitment to the “decisive battle” at precisely the moment when the likelihood of symmetrical war was rapidly receding. And so we came to our current situation – a very good army once again finding that its finely honed skills and expensive equipment could not cope with the situation it faced, and forced to play costly catch-up, improvising equipment, revising techniques and updating doctrine as it went.
Does this always have to happen? Once again, military history may have a hint to offer. Britain in its Victorian imperial heyday had two armies: the regular British Army and the Indian Army. The latter fought continuously, for decades, on the Northwest Frontier of the Raj (today’s “Tribal Belt” of Pakistan) and found a way to preserve and pass on the lessons of its campaigns, gradually building up a formidable body of doctrine and expertise despite the absence, through most of the period, of either a general staff or any “lessons learned centers.” Why? Largely because in the officer corps of the Indian Army (separate from, and rather disdained by, the regular British Army) careers were made – or broken – on the Frontier. Perhaps in the end, important as the collective memory of staffs and the cogency of doctrine are, what really determines the response an army can offer when confronted with unconventional war is the definition it has given to “real soldiering” and the promotion choices that definition drives. Attitudes, it would seem, are central.