Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Private Papers (website of Victor Davis Hanson) (10-12-09)
I have believed in the power of the goddess Nemesis (“dispenser of dues”) ever since I was introduced to the concept as a teenager studying classics, especially in the texts of Hesiod, Herodotus, and Sophocles.
Some of you know her also as a variant of eastern Karma, or the folk notion of “what goes around, comes around,” or the now common “ain’t payback a bitch?” We all agree on the symptoms: overweening success and surfeit (koros) lead to hubris (gratuitous arrogance), which in turn promotes destructive behavior (atê), which at last calls you to the attention of divine Nemesis — who ensures your ruin. At Rhamnous on the Attic coast there is a beautiful temple to the goddess, proof of her ubiquity and power.
Obama as All-Knowing Oedipus
As sure as the sun rises, you readers knew, as early as 2007, that Obama’s fiery rhetoric about the disaster in Iraq and the good war in Afghanistan was not only disingenuous but would come back to haunt him — especially given the efforts of the talented David Petraeus, and the myriad challenges of the age-old tribalism in Afghanistan.
And so it has. He now owns the “good” and “necessary” war that, according to Obama, we supposedly wrongly “took our eye off of.” Now at last Obama is free, as he wished, to go into Pakistan in hot pursuit of terrorists (as he once boasted in the debates, amid the trashing of the then big-target Bush administration).
Snap My Fingers — Guantanamo Closed!
Remember Guantanamo? He could have said in January: “Tough call. Eric Holder once thought it was fine. Where else do you put non-uniformed murderers, who are something like foreign soldiers in a global war, unlike domestic criminals, yet not soldiers as we have traditionally defined them at Geneva? We will take a long look at the facility, get bipartisan input from the Bush administration and the Congress, and then choose the bad rather than the worse choice.”
Nope. Instead, we got the soaring hope-and-change cadences about shutting it down within “a year,” and “reset button” inanity — ad nauseam. That will prove to be impossible. Already he is throwing his Guantanamo czar under the bus, even as Mr. Craig blames (you guessed it) the Bush administration for his inability to dispose of the detainees. (Did he really think that the Germans and British leftists who shouted that we were running a Stalag would really want their own terrorists back home rather than in Cuba under lock and key?)...
If Richard Nixon had a bad habit of being vindictive and bending the rules for political purposes, so too Obama has believed that glibness, a casual acquaintance with facts, and flashy rhetoric are substitutes for accomplishment. Just as Nemesis struck Nixon in 1973 at his apogee for long-accrued but previously unpaid sins, so too Obama is now caught and tumbling to or below 50% approval. (Despite the media blitz, the worn racist charge, the glamour, the youth and the interviews, the novelty of his presidency — despite all that, one of every two Americans, within a few months of his inauguration, simply does not support him or his agenda.)...
Posted on: Thursday, October 15, 2009 - 23:05
SOURCE: Private Papers (website of Victor Davis Hanson) (10-9-09)
Afghanistan is a messy war, but so far it has been conducted with a minimum loss of American life while achieving some important goals. We can argue about current strategies, fault what’s been done in the past, deplore the length of the war, lament its cost, or blame each other for its inconclusiveness, but the facts remain that we removed the Taliban, weakened al Qaeda in the region, fostered a consensual government in the most unlikely of places, and helped to prevent another catastrophic attack on our nation originating from that part of the world — and did all this with a degree of skill that is reflected in losses that by historical standards are quite moderate.
After the initial invasion, the Afghan front was largely inactive for years. U.S. annual fatalities from 2001 through 2007 (12, 49, 48, 52, 99, 98, 117) averaged about 68. In comparison, the murder total in Chicago for 2007 was 509. Some parts of Chicago were far more dangerous than the Hindu Kush. The decisive first three months of the war (October to December 2001) accounted for a little over 1 percent of American military deaths that year, one in which there were no other major combat operations.
Indeed, in 2002 Afghanistan accounted for about 4 percent of all military personnel lost while on duty. For the first two years of the war, an American soldier was far more likely to die of illness while stationed outside of Afghanistan than to be killed inside Afghanistan. By 2006, the fatality rate in Afghanistan was 6 percent of all military deaths, including those lost to accidents, illnesses, and the war in Iraq.
We are rightly alarmed about the spike in fatalities over the last few months, but even at these highest monthly death rates of the entire war, we have lost 153 over roughly the last 100 days of combat — an average of about 1.5 per day. For much of the 1990s, we lost well over 3 American soldiers per day to accidents, illnesses, and suicides, a military fatality rate far higher than the rate of combat losses in both Iraq and Afghanistan during the last three months. (In comparison, Iraq’s fatalities over those same 100 days were 27 American deaths, or less than one loss every three days)...
Posted on: Thursday, October 15, 2009 - 23:03
SOURCE: Private Papers (website of Victor Davis Hanson) (10-11-09)
Nobel Prizes from Lala Land
Norway is a tiny country that was born lucky. It is weak and defenseless: it was quickly overrun in World War II (while neighboring, neutral Sweden sold the Third Reich 40% of its iron ore, which went into everything from the Tiger tanks that killed Americans to the ovens at Auschwitz — with free shipping across the Baltic included as a favor). In the late 1940s it would have been Finlandized during the Cold War, if not for American-led NATO. And the world’s largest military is still pledged to its defense, in case any of the nations to whose icons it bestows awards some day decides to send terrorists or nukes its way.
Second, it sits on or near enough oil to allow what is otherwise a rather insignificant country to be the wealthiest per capita oil producer in the world, and to enjoy the influence that many in the Gulf have grown accustomed to. Throw in minerals, natural gas, timber, and fish, and the nation sits on a bonanza of natural wealth. No wonder there are philosophers who ponder how to dispense the largess, and absenteeism is a national crisis (one receives almost ad infinitum the same cash whether “sick” at home or well on the job). The population of under 5 million is largely homogeneous (90% Nordic), and is thus stable, and both rich and safe beyond its wildest dreams. It does not border a Third World country; “difference” and the “other” — even with recent Islamic immigration — are still defined as speaking Swedish or Danish.
In other words, Norway has the leisure to be utopian, and cannot quite understand why other countries are not as liberal as it has proven to be. So Norway loves to give awards to all sorts of right-thinking frauds (Menchú), scoundrels (ElBaradei), terrorists (Arafat), Stalinists (Le Duc Tho), Elmer Gantrys (Jimmy Carter), and hucksters (Gore) — as it sits in judgment of others from Lala land.
Remember, though, the Norwegians privately would not like to live under Central American communism of the Ortega brand, or right next to nutty nuclear Iran, or have Palestinian terrorists on their borders, or in general live the real life that the nation sanctimoniously advocates in the abstract. It sees what happens to neighboring Denmark’s cartoonists when they exercise free speech. It once saw what Neville Chamberlain wrought for its own neighborhood.
Norway is, in other words, the Hollywood nation. Imagine it as the son or daughter of a movie star, one who grew up in Malibu and feels so terrible about it that he lectures the U.S. about everything from global warming to George Bush’s assorted sins — confident that he will never have to work at Ace Hardware, and never have to live near South Central LA. That sums up Norway.
Effort and intention, not achievement, matter to these pious Europeans. We should honor preseason favorites, not 20-game winners; praise dazzling book proposals, not best sellers; gush about on-the-shelf Pentagon plans, not battle victories. Don’t dare end the Cold War, or save millions in Africa from AIDS, or get rid of Milosevic; but most certainly do dare to convince the world that the Muslims jump-started the Renaissance. For that brave assertion, global peace will surely follow.
Norway on the Potomac
More seriously, the Obama Prize represents two larger recent Nobel trends: 1) an effort to curtail American foreign policy in favor of international deference (as in the case of rewarding Carter and Gore for their defamation of Bush in their opposition to Iraq); 2) a general disconnect from accomplishment in favor of leftist intentions, as in the case of ElBaradei or Rigoberta Menchú, who accomplished essentially nothing (and spoke or wrote about that nothing in suspect fashion), but were a hit among international Western elites as authentically anti-Western non-Westerners.
Anyone who has taught in a university over the last thirty years has witnessed dozens of miniature Nobel Prizes of a sort handed out each year to faculty on the basis of what they represent or say rather than what they have accomplished; but it is still remarkable to see such postmodernism hit the world stage, where reality is virtual and constructed out of language and expressed intent.
Think of tiny Norway’s Machiavellianism: A utopian American president is now supported for his rhetoric — and yet also sent a signal that brave new Nobel Prize laureates simply don’t support Israel, pressure Iran, stay in Afghanistan or Iraq, or keep open Guantanamo. It is as if Oslo is saying “our man in Washington” is, well, now really “our man in Washington.”
The vision of Norway is now to be the aspiration of the world, albeit with the understanding that in the era of cap-and-trade someone will still buy Norway’s oil to power their carbon-foot-printing cars, and its timber for their ungreen homes, and still offer icky planes, rockets, nukes, and carriers to ensure Norway is safe in a fashion that it was not sixty-five years ago. Quisling is still its chief loanword to the English language...
Posted on: Thursday, October 15, 2009 - 23:00
SOURCE: Philadelphia Inquirer (10-14-09)
Four boys assault their teacher, who later dies of her injuries. Across the country, newspapers compete to unearth the most lurid details of the episode. It seems the boys were annoyed at being detained after school.
So they threw rocks and other debris at the screaming teacher, until she couldn't scream anymore.
A modern-day example of inner-city youth violence? Hardly. It happened in the small town of Canton, Mass. - in 1870.
I thought of the Canton tragedy as I watched Attorney General Eric Holder at last week's news conference about youth violence in Chicago. Standing beside Education Secretary Arne Duncan, the former schools chief in Chicago, Holder expressed outrage at the recent murder of 16-year-old Derrion Albert as he walked home from school. Holder called for a return to America's "old-time" values.
But school violence is itself a time-honored American tradition, dating to the very dawn of the Republic. Despite our nostalgia for the good old days, America's schools have always been disorderly and violent places. By pretending otherwise, we might miss what is truly new - and truly troubling - about present-day violence.
In the one-room schools of the 19th century, older boys faced off against their teachers in a brutal struggle for control. When the "big boys" got out of hand, teachers hit them with sticks, rulers, and paddles. Sometimes teachers drew blood; in rare cases, they caused permanent injuries.
In his first piece of published fiction, in 1841, Walt Whitman describes a vicious schoolteacher who kills a boy by flogging him. Whitman's title told the whole story: "Death in the School Room (A FACT)".
But the boys fought back, too. In New Hampshire, they ripped the ruler from a violent schoolmaster's hand and threw him down an icy hillside; in Virginia, they bound a teacher hand and foot; and in Georgia, they chased a drunken teacher into the woods and covered him up with leaves.
"Such life-and-death struggles are as inseparably associated with the little red schoolhouse as they are with the ruins of the Roman amphitheater," one educator wrote in 1894. "As the early Christians were stretched over slow fires, and stung to death by bees, and torn to pieces by wild beasts, so the young man beginning a term in a new school expected to be tormented by older boys."
In the 20th century, as larger institutions came to replace the one-room schoolhouse, more and more "older boys" attended high schools. School violence changed, too. Clustered with peers of their own age, teenagers typically fought each other rather than their teachers.
By the 1950s, they had formed gangs. Bands of working-class "toughs" or "hoods" roamed school corridors and parking lots, bullying the weak and defacing property. Most of all, they attacked one another. Across urban America, gangs "rumbled" with knives, brass knuckles, and sawed-off baseball bats.
School violence would spike in the 1960s and early '70s, echoing the overall rise of crime in American society. Increasingly, though, it involved guns. By 1991, 26 percent of high school students reported that they had brought a weapon to school in the previous 30 days; of those, about a third said they had carried a gun.
Contrary to public perceptions, most forms of school violence have decreased since the 1990s. So has the reported carrying of weapons, to about 18 percent of high school students. But roughly a third of those still carry guns, which remain the most common cause of youth homicide in America.
And when it comes to guns, the White House has dragged its feet. Although President Obama promised during the 2008 campaign to restore the ban on assault weapons, he hasn't done so yet. Nor has the administration moved to close the loophole that allows people to purchase arms at gun shows without background checks.
Such reforms would do little by themselves to reduce school violence, which involves handguns more often than it does rifles. Symbolically, however, the gun measures would show that the White House takes the issue seriously.
So would a frank admission that most of the kids who die in or near schools are victims of gun violence, not just of "school violence" in general.
In this sense, Derrion Albert was an exception: Caught between rival gangs, he was beaten to death with wooden planks. The killing was captured on cell-phone video and posted on the Internet, where thousands have watched it.
Historically, however, this case could obscure a larger truth: We've always had school violence, and we've had youth gangs for a long time, too. The new factor is gun possession, plain and simple. Nothing will change until we're honest about that.
Posted on: Thursday, October 15, 2009 - 01:20
SOURCE: guardian.co.uk (10-12-09)
We are going through another of those odd periods when corners of our daily newspapers look as if they are reporting things that happened over 65 years ago. There are rows over what the Latvians did or did not do in the second world war, arguments about why the German Luftwaffe bombed Coventry and, most recently, Stephen Fry has upset the Poles with a careless remark about Auschwitz. What all of these spats show is that history matters.
Versions of the past remain central to a country's national identity and how its citizens think about themselves. The way that history, especially national history, is told and taught is a matter of public policy, and hence inevitably a political issue. It can even intrude into international relations, as demonstrated by the response of much of the international community to Mahmoud Ahmadinejad's proclivity for denying that millions of Jews were systematically murdered by the Germans and their allies between 1939 and 1945. The saga of the Bloody Sunday inquiry cannot be disentangled from the resolution of the conflict in northern Ireland and the long-term future of the province.
Fry's remarks, however, reveal something more specific. They exemplify the time lag between scholarship that demolishes historical myths and the more slowly shifting public understanding of the past. Fry, who admits to knowing "a little history", seems to think that Auschwitz was in wartime Poland and was, in some way, connected to "rightwing Catholicism". The camp was, in fact, in a part of Poland annexed to Germany and was a German creation. Before it was expanded and adapted to include a death camp devoted to the mass murder of Europe's Jews, tens of thousands of Catholic Poles died there. The camp's initial function was to terrorise the Polish population.
Fry also seems blissfully unaware of the research into Polish-Jewish history that has transformed our knowledge of that conflicted and tragic relationship. Over the last 20 years, beginning roughly with the debate over Claude Lanzmann's film Shoah in 1985-86, Poles have confronted the history of Polish antisemitism and the stance of the population towards the persecution of Polish Jewish citizens during the German occupation. There are now flourishing centres for the study of Polish Jewish history at several Polish universities, and a major Jewish museum is under construction in Warsaw.
Many Jewish historians, meanwhile, have shown the closeness between the two communities and challenged the stereotype that Jews and Christians on Polish soil lived in separate worlds. Relations between them, especially in small towns and villages, were more cordial and intimate than was once thought to be the case. The knowledge of the slaughter of the Jews in Poland and the bitter aftermath, including the attacks on survivors by rightwing Poles in 1945-7, created a distorting lens through which the past was viewed for decades. Fry is, evidently, still squinting backwards through these blood-coloured spectacles...
Posted on: Wednesday, October 14, 2009 - 21:00
SOURCE: Commentary (10-14-09)
The terms counterterrorism and counterinsurgency have become common currency this decade in the wake of September 11, the invasion of Afghanistan, and the war in Iraq. To a layman’s ear, they can sound like synonyms, especially because of our habit of labeling all insurgents as terrorists. But to military professionals, they are two very different concepts. Counterterrorism refers to operations employing small numbers of Special Operations “door kickers” and high-tech weapons systems such as Predator drones and cruise missiles. Such operations are designed to capture or kill a small number of “high-value targets.” Counterinsurgency, known as COIN in military argot, is much more ambitious. According to official Army doctrine, COIN refers to “those military, paramilitary, political, economic, psychological, and civic actions taken by a government to defeat insurgency.” The combined approach typically requires a substantial commitment of ground troops for an extended period of time.
When General Stanley McChrystal was selected on May 11 of this year as the American and NATO commander in Afghanistan, it was by no means certain which approach he would employ. His background is almost entirely in counterterrorism. He had been head of the Joint Special Operations Command (comprising elite units such as the Army’s Delta Force and the Navy’s SEALs) when it was carrying out daring raids to capture Saddam Hussein and kill Abu Musab al-Zarqawi, the leader of al-Qaeda in Iraq. If he had decided to follow the same approach in Afghanistan, he would have had the support of Vice President Joe Biden and numerous congressional Democrats who favor a narrow counterterrorism strategy to fight al-Qaeda and who want to cut the number of American troops to a bare minimum.
But that is not what McChrystal has chosen to do. He has decided, as he put it in an “interim assessment” dated August 30 that was later leaked to Bob Woodward of the Washington Post, that “success demands a comprehensive counterinsurgency (COIN) campaign.” A close reading of that document, which was directed at the Pentagon and White House, as well as the “Counterinsurgency Guidance” drafted at his behest around the same time and directed at his own troops, provides a window into his thinking. It shows why a COIN campaign is needed, how it would be carried out, and why the kind of narrow counterterrorism effort favored by so many amateur military strategists is unlikely to succeed.
The case against a counterterrorism approach in Afghanistan is laid out most clearly in the Counterinsurgency Guidance. McChrystal’s focus is on explaining why conventional military operations cannot defeat the insurgency in Afghanistan, but the same arguments apply to counterterrorism generally, which is a smaller-scale version of the same conceit—that the U.S. military can defeat an insurgency simply by killing insurgents. McChrystal writes that the math doesn’t add up:
From a conventional standpoint, the killing of two insurgents in a group of ten leaves eight remaining: 10 — 2 = 8. From the insurgent standpoint, those two killed were likely related to many others who will want vengeance. If civilian casualties occurred, that number will be much higher. Therefore, the death of two creates more willing recruits: 10 minus 2 equals 20 (or more) rather than 8.
He goes on to note that the “attrition” approach has been employed in Afghanistan over the past eight years by a relatively small number of American forces and their NATO allies. Yet, he writes, “eight years of individually successful kinetic operations have resulted in more violence.” He continues: “This is not to say that we should avoid a fight, but to win we need to do much more than simply kill or capture militants.”
What else, then, must coalition forces do? McChrystal’s answer:
An effective “offensive” operation in counterinsurgency is one that takes from the insurgent what he cannot afford to lose—control of the population. We must think of offensive operations not simply as those that target militants, but ones that earn the trust and support of the people while denying influence and access to the insurgents.
The Counterinsurgency Guidance points out that firing guns and missiles can often make it more difficult to win “trust and support.” An anecdote makes the point:
An ISAF [International Security Assistance Force] patrol was traveling through a city at a high rate of speed, driving down the center to force traffic off the road. Several pedestrians and other vehicles were pushed out of the way. A vehicle approached from the side into the traffic circle. The gunner fired a pen flare at it, which entered the vehicle and caught the interior on fire. As the ISAF patrol sped away, Afghans crowded around the car. How many insurgents did the patrol make that day?
As an example of how “self-defeating” the use of force can be, McChrystal could just as easily have chosen an example involving a Predator drone firing a Hellfire missile or an F-16 dropping a 500-pound bomb—the kind of strike that often causes considerable “collateral damage” and that, if the more limited counterterrorism approach were to be adopted, would become the centerpiece of our strategy.
McChrystal counsels his troops to take a different path, to “embrace the people,” to “partner with the ANSF [Afghan National Security Forces] at all echelons,” and to “build governance capacity and accountability.” He urges coalition troops to be “a positive force in the community; shield the people from harm; foster stability. Use local economic initiatives to increase employment and give young men alternatives to insurgency.”
This would mean putting less emphasis not only on using force but also on “force protection” measures (such as body armor and heavily armored vehicles), which distance the security forces from the population. As an example of what he expects, McChrystal cites an anecdote involving an “ISAF unit and their partnered Afghan company” that were “participating in a large shura [tribal council] in a previously hostile village.” During the shura, which was attended by “nearly the entire village,” he writes, “two insurgents began firing shots at one of the unit’s observation posts.” The sergeant in charge of the post could have returned fire but he chose not “to over-react and ruin the meeting.” “Later,” this example concludes, “the village elders found the two militants and punished them accordingly.”
While counterintuitive to a conventional military mind, such thinking is hardly novel for anyone familiar with the history of counterinsurgency. McChrystal’s advice to embrace the population and be sparing in the use of firepower has been employed by successful counterinsurgents from the American Army in the Philippines at the turn of the 20th century; to the British in Malaya in the 1950s and Northern Ireland from the 1970s to the 1990s; to, more recently, the Americans in Iraq. By contrast, counterinsurgency strategies that rely on firepower have usually failed, whether tried by the French in Algeria, by the U.S. in Vietnam, or by the Russians in Afghanistan.
The risk of the counterinsurgency approach—which helps to explain why it has not been adopted in Afghanistan until now or in Iraq until 2007—is that, in the short term, it will result in more casualties for coalition forces. Placing troops among the people and limiting their expenditure of firepower makes them more vulnerable at first than if they were sequestered on heavily fortified bases and ventured out only in heavily armored convoys. But in the long term, as the experience of Iraq shows, getting troops off their massive bases is the surest way to pacify the country and bring down casualties, both for civilians and security forces...
Posted on: Wednesday, October 14, 2009 - 11:34
SOURCE: The End is Coming (History Blog) (10-12-09)
For many years, the Egyptian director of antiquities, Mr. Zahi Hawass, has called upon museums worldwide to return his country’s archaeological heritage. In this case, Egypt petitioned Paris’ Louvre museum to return some frescoes that were stolen from the Valley of the Kings in the 1980s. Indeed, Hawass claims that the Louvre knew of the artefacts’ stolen origin and nevertheless purchased the frescoes in 2000 and has displayed them ever since. As a result, Egypt declared an end to relations between itself and the Louvre, terminating a 200-year “partnership.” Although similar measures have been taken by Greek and Egyptian authorities in the past, this time it actually worked: France has announced the imminent return of the frescoes to Cairo. Consequently, we may see increased political pressure to return mountains of stolen archaeology in the coming years.
From New York’s Metropolitan Museum of Art to London’s British Museum and Berlin’s Egyptian Museum, thousands of pieces have been plundered or otherwise acquired since the Renaissance to adorn the walls of the once-great empires. From Napoleon’s wholesale acquisition of Egyptian antiquities in the late 1700s to Heinrich Schliemann’s violent plunder of Troy in the late 1800s, pieces of ancient history were whisked away to great museums at the expense of national ownership. Some would say that ancient Athens and Sparta have as much to do with modern-day Australia as with modern-day Greece; regardless, modern nations trace their identities back to these antique city-states and statesmen.
Now that Egypt has established a precedent for archaeological restitution, the door is open for the return of much more.
The Rosetta Stone, made in 196 BC, was a rather mundane proclamation of the Greco-Macedonian pharaoh Ptolemy V and of his new taxation regulations; it was Jean-François Champollion who made the 760-kilogram stone famous. Characteristic of Ptolemaic Egypt’s culture, the stone carries its text in ancient Egyptian hieroglyphs, in the Egyptian Demotic script, and finally in the Greek alphabet familiar to scholars. This is how the Frenchman translated the stone and eventually broke the code of hieroglyphics in 1822, finally revealing the true patrimonial treasure of Egypt. The stone has been on display at the British Museum since 1802; Egyptian authorities have tried in vain to repatriate it, but may now have new means of obtaining restitution.
More famous still is the Bust of Nefertiti, made c. 1345 BC (or 1912, depending on its authenticity). The royal consort of the pharaoh Akhenaten and possibly the mother of Tutankhamun, her only detailed portrait has been preserved on a plaster-covered bust found in the early 20th century. It has been essential for art historians, elucidating ancient Egyptian style and artistic influences; it has furthermore been hailed as a representative model of ancient beauty. Several specialists now affirm it is a fake, or perhaps a copy of a once-true bust. Nonetheless, Mr. Hawass has demanded its return to Egypt from the directors of Berlin's Altes Museum, who have categorically refused. It remains Berlin’s crowning jewel as a world-famous artefact, but Cairo may soon use the political leverage it has discovered to repatriate the bust.
Finally, a relatively obscure yet poignant example of the quest for archaeological restitution is the tale of the Elgin Marbles. In the early 1800s, Thomas Bruce, Earl of Elgin, carved away and crated up approximately half of the Parthenon’s frieze reliefs and many of the sculptures on Athens’ acropolis. Following a decade-long voyage back to England, the priceless treasures vandalized and carefully looted from Greece were proudly displayed in the British Museum, where they remain today for all to enjoy. Greek officials have lobbied for the return of their heritage for many decades without much success, or indeed any response of any kind. In protest, Greece’s new Acropolis Museum was built with a very large room containing virtually nothing save for one sign that says “Elgin Marbles,” to remind visitors of the perceived theft and to stand ready for an eventual return. Greece has not given up, and may now follow in Egypt’s footsteps in bringing political pressure to bear.
In the end, it will not matter who displays the artefacts as long as they are intact. They predate nationalism and national affiliation by millennia and will continue to represent the collective cultural legacy to all of western civilization and indeed to the world. That being said, it is inevitable that the more we move them, the greater the risk of loss and damage. And a stolen relic on display in Paris is better than no relic at all.
Posted on: Wednesday, October 14, 2009 - 02:41
SOURCE: Slate (10-9-09)
Sixty-seven years after his famous vow, Gen. Douglas MacArthur has returned. Gen. Stanley McChrystal's recent warnings that the White House had better heed his call for an Afghanistan surge have provoked comparisons to the showdown between President Truman and his truculent commander in Korea. To the left, McChrystal's shot across President Obama's bow amounts to insubordination; some have demanded his firing. To the right, McChrystal is a soldier carrying out his mission, giving his superior his best advice on how to meet his own stated goals successfully.
Like most historical analogies bandied about in the media, this one is overdrawn. As serious as it is, the war in Afghanistan hardly approaches the Korean War in its magnitude or impact. McChrystal, for his part, has nothing like MacArthur's prestige, and his testing of Obama's supremacy falls well short of MacArthur's defiance. Yet the story of Truman and MacArthur remains useful to remember—not because it directly mirrors today's but because it created a dynamic in which subsequent presidents felt unduly constrained by the prospect of military commanders undermining them.
In 1950, the first year of the Korean War, MacArthur showed his strategic brilliance with his marine landing at Inchon, South Korea, then behind enemy lines. That move reversed the sagging fortunes of the multinational United Nations force, which the United States was leading to repel North Korea's invasion of the South. Alas, MacArthur's self-confidence, already outsized, could have done without the boost. Determined to rout the North Koreans, he ignored signals that a U.N. offensive beyond the 38th parallel—the line dividing the Koreas—would spur Chinese intervention, and his northward push resulted in a massive setback.
Disinclined to own up to defeat, MacArthur issued a volley of public statements blaming Washington for keeping him from attacking Chinese bases within Manchuria. Truman's decision to contain the war, he said, imposed "an enormous handicap, without precedent in military history." Until that point, Truman had indulged MacArthur's ego and recoiled from his mystique. He had let the general run the Japanese occupation unchecked and had praised him extravagantly. Now he responded, albeit with a mere wrist slap, ordering MacArthur to clear future statements with Washington.
Like the rift between McChrystal and Obama's civilian team, the Truman-MacArthur conflict stemmed from a split over war aims: MacArthur wanted victory at all costs, while the administration, seeking to avoid a wider conflict, set forth the more limited goal of restoring South Korea's integrity. To this end, the Truman administration was pursuing a diplomatic settlement. Yet MacArthur was incorrigible. (Perhaps he thought he was in Corregidor.) He publicly insisted on an enemy surrender and Korean reunification, derailing the diplomacy. Privately, the president resolved to fire him, but he delayed—letting the Joint Chiefs instead send a curt rebuke.
Soon, MacArthur forced Truman's hand. The general sent a letter to House Republican leader Joe Martin endorsing the congressman's demagogic call to have Chiang Kai-shek's Chinese Nationalists open a second front. "There is no substitute for victory," MacArthur portentously asserted.
"Rank insubordination," Truman wrote in his diary. His trusted aides—Secretary of State Dean Acheson, Defense Secretary George Marshall, Averell Harriman, Gen. Omar Bradley—agreed that MacArthur's time was up. Political advisers warned of an outcry, but Truman took solace in Abraham Lincoln's Civil War decision to fire Gen. George McClellan—a controversial move for which Lincoln was ultimately vindicated.
MacArthur's sacking enraged the right. "This country today," charged Sen. William Jenner of Indiana, "is in the hands of a secret inner coterie which is directed by agents of the Soviet Union." Richard Nixon of California, typically, preferred insinuation: "The happiest group in the country will be the Communists and their stooges." Cries arose for Truman's impeachment. The foot soldiers of the grass-roots right hung effigies of the president and Acheson from the trees.
Returning home to the United States, MacArthur received a classic hero's welcome. Martin invited him to address the Congress, where he famously declared, "Old soldiers never die, they just fade away." A Senate inquiry grilled Acheson, Bradley, Marshall, and others on the decision to dismiss the general. The mayor of Chicago declared MacArthur Day and invited the general to star in a parade and a rally at Soldier Field.
MacArthur's support, though considerable, was disproportionately magnified by the ballyhoo. In a landmark 1953 essay, sociologists Gladys and Kurt Lang showed that the media hyped and overstated the enthusiasm for the deposed general. People who were actually present at MacArthur Day found the crowds to be sedate, with few attendees riled about the general's firing. A poll showed that most people came out of curiosity, not hero worship. Those who watched on TV, in contrast, had their expectations of wild crowds validated by the tenor of the coverage. The notion that the American people were lining up behind MacArthur had been blown out of proportion—and as the idea seared itself into the public consciousness, it became a self-fulfilling prophecy. (Not too self-fulfilling: MacArthur's bid for the Republican presidential nomination the next year foundered.)
The picture that took root—of a hotheaded, charismatic general with an angry and dedicated popular following—was worrisome, especially with the general positioned as the president's political rival. Americans didn't have to fear full-blown fascism to grasp that MacArthur's grandstanding could erode Truman's constitutional authority. While history came to look well on the president's decision as brave and correct, the episode nonetheless left a lasting current of popular sentiment that in matters of war and peace, the military really knows best. At odds with the American tradition of the primacy of civilian rule, this attitude—call it MacArthurism—has continued sporadically to haunt American politics. More than McChrystal's opinions on strategy, this disposition is what Obama now needs to address.
For all our incantations about the wisdom of civilian control, politicians remain afraid to cross the top brass. Democrats in particular take pains to show their respect for admirals and generals. Excepting Dwight Eisenhower, whose military bona fides were for obvious reasons never questioned, every postwar president has had to deal with MacArthurism. Curtis LeMay blustered through the Kennedy administration, though JFK resisted his counsel to invade Cuba during the missile crisis. LeMay and other generals urged Lyndon Johnson to intensify the bombing of North Vietnam and escalate the war. While LBJ managed to ease LeMay into retirement, he never conquered his fear of incurring a reputation for softness in the face of Communist aggression. That insecurity kept him from winding down the war until his last year in office.
Subsequent presidents found themselves bolstering the culture's MacArthurist tendencies by their very efforts to deflect them. Worried about being challenged by military leaders, they chose not to lay down the law, à la Truman, but to showcase their deference to the armed forces—often to little avail.
In 1992, Bill Clinton tried to palliate concerns about his ability to be commander in chief by touting endorsements from Adm. William Crowe, a former chairman of the Joint Chiefs, and 21 other retired officers. But these gestures didn't stop Gen. Colin Powell or others in the armed forces from undermining him as president—particularly over letting gays serve openly. Even after that humiliation, Clinton still pandered to the culture's MacArthurism with unseemly touches that Reagan had inaugurated, such as habitually saluting servicemen and turning every D-Day anniversary into a publicity stunt. Later, John Kerry, a real war hero, learned that stressing his military credentials hardly insulated him from—and in fact encouraged—MacArthurite demagoguery about his fitness to lead.
In 2008, Obama, under fire for lacking foreign policy chops, followed the Clinton route. At the Democrats' Denver convention he trotted out a procession of generals, who stood erect as a stadium of Democrats cheered. The tableau implied that the fence-sitting public should put special stock in these generals' judgment about who should make decisions on war and peace. Obama's selection of the relatively low-profile Gen. Jim Jones as his national security adviser also seemed designed to pre-empt military criticisms of the sort that hobbled Clinton. But having struck the familiar bargain with the forces of MacArthurism, Obama is now imperiled by them, as his own general threatens to undercut his authority as commander in chief.
It's premature to discuss McChrystal's firing. For now, the rebukes from Jones and Defense Secretary Robert Gates seem to have chastened him. But Obama should remember that, whatever the short-term political risks, history has rewarded those who stood tall for constitutionalism—and remember, too, that capitulating to MacArthurism may serve only to make it stronger.
Posted on: Wednesday, October 14, 2009 - 02:31
SOURCE: Washington Decoded (10-11-09)
Under the editorship of Graydon Carter, Vanity Fair has become the most reliable purveyor of Camelot nostalgia.
Over the last decade, either the glossy magazine or vanityfair.com has published 19 articles, book excerpts, or photo spreads about the Kennedys. In the last three years alone, Kennedys (Jack, Bobby, Jackie, or some combination thereof) have three times graced Vanity Fair’s cover, one of the most sought-after pieces of magazine real estate in the business. Single-handedly, Graydon Carter seems bent on extending what Garry Wills once called “the Kennedy time in our national life.”
The latest cover story in the October 2009 issue is unlike its predecessors, though. It is neither a syrupy excerpt from a fawning JBK book nor an amply illustrated puff piece. Instead, Sam Kashner’s “A Clash of Camelots” purports to be a serious, original piece of journalism about the 1966-67 “Manchester affair,” the first episode to tarnish the Kennedys’ escutcheon following the 1963 assassination of JFK. As the table of contents states, “[VF contributing editor] Sam Kashner unearths the story behind [William Manchester’s The Death of a President,] a best-seller that captivated the nation, divided the Kennedys, and nearly destroyed the author.”
Far from unearthing the story, or even shedding new light on it, the article mostly recycles what has long been known. Newspapers, especially The New York Times, covered the so-called “battle of the book” in agonizing detail for months. Afterwards, the heavily publicized struggle was the subject of three books, two of them authored by Times-men. Eventually, even Manchester had his unabridged say, publishing a 60-page chapter about the controversy in a compilation of his essays.
Still, there wouldn’t be much to criticize—and it would be churlish to do so—if this boast were the only issue. The genuine problem is the article’s many errors of commission and omission, including Kashner’s uncritical acceptance of clichés and untruths in Manchester’s own narrative that have been circulating for 42 years. Some of these issues have long been points of dispute, while the truth about others has only become known recently, through the release of such sources as the once-secret Lyndon Johnson tape recordings.
The VF editors’ note says Kashner’s interest in the Manchester story was first aroused in 1984, when John F. Kennedy, Jr. told Kashner that The Death of a President was the “only book he would ever read about his father’s assassination. This revelation piqued Kashner’s obsession with the controversial history . . . ” If so, Kashner’s obsessiveness is strictly limited to sources that conform to a flattering portrayal of William Manchester and his book, which Kashner considers a “masterpiece.”
Here are some of the outstanding errors of fact, interpretation, and history in “A Clash of Camelots.”
• And [Manchester] would have one crucial source that the [Warren] Commission did not: Jacqueline Kennedy.
This assertion reflects one of the themes in Kashner’s article: that Manchester’s work was superior to that of the supposedly discredited Warren Commission, even though the author and that panel happened to agree on who killed President Kennedy.
It is true that Manchester enjoyed unlimited access to the president’s widow. It is definitely untrue, however, that the Warren Commission had no access at all. On June 5, 1964, Supreme Court Chief Justice Earl Warren and the panel’s general counsel, J. Lee Rankin, interviewed Mrs. Kennedy at her Georgetown home in the presence of Robert Kennedy. That the interview lasted only 10 minutes had everything to do with Warren’s desire not to make the former First Lady recount her husband’s violent death in any more detail than necessary.
Besides being inaccurate, Kashner’s assertion is also misleading. What Mrs. Kennedy had to say was of widely differing value, given that the Warren Commission and William Manchester had distinct assignments (which Kashner acknowledges). The Commission’s primary task was not to write a history, but to determine insofar as possible who had killed President Kennedy. For that purpose Jacqueline Kennedy’s testimony turned out to be wrenching, but of marginal probative value—the Zapruder film of her reactions conveyed more information than her testimony. To anyone writing a narrative account, however, Mrs. Kennedy’s recollections were indispensable. Her refusal to be interviewed by anyone but Manchester was precisely why (as noted by Kashner) author Jim Bishop, Manchester’s arch-rival, complained, “She’s trying to copyright the assassination.”
• With [Jacqueline Kennedy’s] style, her youth and beauty, her intelligence, she was one of the president’s most formidable assets.
Kashner looks back at JBK with rose-colored glasses, his view influenced by her dazzling and steely performance after the assassination. In that he is hardly alone. In truth, however, reviews of Jacqueline Kennedy’s behavior and value as a political spouse were very mixed before November 22, 1963. Her distaste for retail politics was a poorly kept secret, if not the subject of venomous gossip. She was often aloof, like a modern-day Marie Antoinette, and frequently inconsiderate of the needs of other political wives. On the morning of November 22, for example, JBK didn’t bother to inform the other political spouses of her wardrobe selection, as any thoughtful First Lady would do, especially one as fashion-conscious as Jackie. Consequently, Nellie Connally was deeply chagrined at breakfast to find that she “had chosen wrong”—she had also donned a pink wool suit.
While First Lady, Jacqueline Kennedy regarded the repetitive dreariness of politicking as beneath her, making her far from a formidable campaigner, though the “charm of her shyness” made her a vivid attraction. The president’s sisters (and sister-in-law, Ethel) were always far more visible on the campaign trail than Jacqueline, and there were rumblings of discontent among many Democrats over her common refusal to participate in party functions.
• Manchester later showed [Jacqueline Kennedy] still frames of dress manufacturer Abraham Zapruder’s film, which had caught the entire assassination on 8mm Kodachrome.
The notion that the assassination was captured in full on the Zapruder film is a fallacy, one of the oldest factoids about the Kennedy assassination. Manchester wrongly accepted it and Kashner repeats it without hesitation, even though the film’s first few frames reveal Secret Service agents already reacting to the first shot. That means the film captured the assassination only after it commenced, as a New York Times op-ed explained in 2007, and as two subsequent articles laid out in abundant detail.
• [Manchester] discovered deep political enmities that had simmered at the time of the assassination, not just against the Kennedys, but among the Democrats as well. Indeed that’s what had compelled Kennedy’s trip to Dallas in the first place . . . Kennedy didn’t want to lose the state in the upcoming ’64 election so he’d agreed to go to Dallas in an attempt to heal the rift.
The suggestion that Manchester “discovered” the rift among Texas Democrats would be laughable to any self-respecting Texas political reporter from the early ‘60s. The bitter postwar struggle for the soul of the Democratic Party in Texas (there was no Republican Party, except in name, until the late ‘60s) had provided fodder for local political journalists for years. Even Manchester, who was quite proud of his extensive research for the book, never claimed to have made such a discovery.
It is true that in the immediate aftermath of the assassination, Kennedy aides put out the cover story that the president went to Texas to “heal” the state Democratic Party—but that’s because the truth, which has been known since at least 1967, was a lot less appealing. President Kennedy had gone to Texas for the same reason Willie Sutton frequented banks: that’s where the money was. As Charles Caldwell, a long-time aide to then-Senator Ralph Yarborough (D-Texas), recalled about Kennedy’s visit, “Is Texas a place to go for [campaign] money? It sure is! That’s what you do. You go to Texas to raise money.”
A month before the trip to Texas, the president had kicked off his personal fund-raising efforts with a mammoth, $100-a-plate dinner at the Commonwealth Armory in Boston that enriched party coffers by as much as $680,000. The next logical site was Texas because of Lyndon Johnson’s place on the ticket, which gave Kennedy entrée to well-heeled Democrats, of whom there were many in Texas. The president had been prodding Texas’s reluctant governor, John Connally, to organize a big fund-raiser since mid-1962. But Connally, acutely aware that the president was not particularly popular in Texas, and that a presidential visit would put a strain on the state’s seriously riven Democratic Party, had continually made excuses, until the president finally pinned him down in June 1963.
“If we don’t raise funds in another state, I want to do so in Massachusetts and Texas,” Kennedy told Connally as they planned the trip in June. “If we don’t carry another state next year, I want to carry Texas and Massachusetts.” Indeed, President Kennedy’s initial impulse was to have fund-raisers in four Texas cities. Connally only fended him off by arguing that Kennedy would be seen as “trying to financially rape the state” after having made few appearances there during his first term.
Once it was decided to hold the fund-raiser in “neutral” Austin—the rivalry between Houston and Dallas being second only to the split in the state’s Democratic Party—a three-day tour of the state’s four largest cities was added in recognition of Texas’s vital electoral role, as well as the general slide in the president’s ratings in the South and West. Some of the Confederate states he had relied on in 1960 were undoubtedly going to be lost to him in 1964, making it more vital to hang onto more diversified and growing Southern states like Texas and Florida. Yet the trip’s primary purpose remained fund-raising.
After the assassination, Kennedy aides wanted to obscure that the president had gone to Texas for something as crass as raising money. So they invented the notion that he was on a mission to heal the state’s Democrats and perform a kind of political triage. That was “pure hogwash,” as John Connally put it in 1978, and there is plenty of evidence to back him up.
But William Manchester bought into the fable, as does Kashner. Indeed, to this day one can find the story repeated credulously and endlessly in publications ranging from The Nation to The Washington Post. That doesn’t make it true though.
• Though he would eventually come to share [the Warren Commission’s] conclusion that Lee Harvey Oswald had been the sole gunman in the assassination . . . [Manchester] was not overly impressed by the men on the Commission, especially as much of their research fell to junior staff. (“I have more investigative experience than any of them,” he felt.) And Robert Kennedy remained skeptical of [the Warren Commission’s] findings for the rest of his life, which would be ended by another assassin’s bullet five years later.
Perhaps Kashner is merely trying to be sophisticated here, since it isn’t fashionable to admit the Warren Commission did a good (though not perfect) job. The findings of the “junior staff,” to which Manchester allegedly felt superior, have withstood the most important test of all, the test of time. On the most critical forensic issue—the shooting sequence in Dealey Plaza—the Commission’s junior staffers realized that the FBI’s initial analysis was impossible, and correctly found that one shot fired by Oswald had actually struck both President Kennedy and Governor John Connally.
Meanwhile, Manchester—the supposedly more experienced investigator who “watched the Zapruder film close to 100 times,” according to Kashner—both relied heavily on the Commission’s work, and also offered his own scenario that made even less sense than the admittedly confused explanation offered in the Warren Report. The majority of reliable witnesses heard three shots, and three expended cartridges were found in the assassin’s nest. Manchester’s own bias, however, led him to believe that only two shots were probably fired.
As a fellow ex-Marine, and because of the distances involved, Manchester decided that three shots were unlikely because Oswald “could scarcely have missed.” Consequently, Manchester alone suggested that Oswald was such a bungler that he might have started out with an expended cartridge in the rifle’s breech, thus accounting for the three cartridges on the floor. Simultaneously, and to mollify critics who might note that most witnesses heard three shots, Manchester suggested that Oswald could have fired his rifle three times in less than six seconds, and still have managed his feat of arms of hitting the president in the back (and then Connally) with one shot, and then mortally wounding JFK in the head with another, while missing with a third. But three shots in that time-span and with that degree of accuracy are all but impossible to achieve in real life.
In short, while the Warren Commission’s explanation was less than ideal, Manchester, who had the luxury of almost two more years to research the assassination, did even worse.
Kashner’s assertion that Robert Kennedy “remained skeptical” is also an exaggeration, or at the very least woefully misleading. While many critics of the Warren Commission, ranging from Kennedy hagiographer Arthur M. Schlesinger, Jr. to the shyster Mark Lane, have alleged that RFK privately harbored deep reservations about the panel’s verdict, the one and only time Robert Kennedy spoke publicly about the Commission he endorsed it without qualification. After declaring for the presidency in 1968, Kennedy stated that he was against reopening the investigation into his brother’s assassination and “stands by the [Warren] Report.” If Kennedy harbored any private doubts, they undoubtedly stemmed from his own complicity in the CIA’s attempts to assassinate Fidel Castro, and from the constraints on the Commission’s investigation of possible Cuban involvement. The Commission did not state that there was no foreign complicity—only that it had been unable to find evidence of a foreign-directed (Soviet or Cuban) conspiracy.
Kashner’s errors of commission are compounded by several errors of omission. One would not know from reading the article that many of the poignant, heart-tugging episodes rendered in The Death of a President were simply not true, as the tape recordings from the Johnson presidency underscore. One of the most infamous smears, of course, was the “deer-hunting incident.” According to Manchester, Lyndon Johnson had shamed a reluctant President-elect Kennedy into going deer-hunting on the LBJ Ranch in November 1960, and then later, pestered an “inwardly appalled” JFK until the president finally displayed a mounted deer head in the White House. The “memory of the creature’s death had been haunting” to JFK, wrote Manchester, “and afterward he had relived it with his wife . . . to heal the inner scar.” Manchester actually opened his book with this episode until wiser heads prevailed. Supposedly, it was a graphic illustration of Texans’ penchant for violence, the same propensity that would later stoke Lee Harvey Oswald into action. It mattered not that the incident didn’t happen, at least not the way Manchester depicted it.
Manchester’s description of the hunt was not an isolated mistake, but one of many inaccuracies and apocryphal tales. And in every significant instance, the errors ran in only one direction—against Lyndon Johnson. They all tended to cast him in the worst possible light, almost as if he embodied the forces of violence and irrationality that allegedly incited Oswald. And when Johnson wasn’t the personification of Texas blood lust, he was the boorish and impatient usurper, so anxious to get his hands on the levers of power that he could not bear to wait to be sworn in.
In essence, The Death of a President attempted to reverse what was rightly seen as Lyndon Johnson’s finest hour, and recast it to the Kennedys’ perceived advantage. Nor was Manchester shy, at least before the controversy erupted, about admitting what he had done. As he wrote Mrs. Kennedy in July 1966, “though I tried desperately to suppress my bias against a certain eminent statesman who always reminds me of somebody in a Grade D movie on the late show, the prejudice showed through.” Little wonder why Garry Wills, writing in 1994, called William Manchester “the Parson Weems of the Kennedy cult.”
Near the end of his article, Kashner openly regrets that Manchester’s masterpiece is out of print and hints at something untoward. This lament, too, is misleading, because it is hardly unusual for a book published in 1967 to be out of print. Indeed, a majority (12) of Manchester’s 18 titles are out of print, and with one exception, the ones still in print were all published in 1978 or later.
If there is something to lament, it’s that Manchester’s powerful, but romantic and at times malignant mytho-drama, in print or not, still defines how many Americans remember the assassination.
Posted on: Wednesday, October 14, 2009 - 02:23
SOURCE: CNN (10-12-09)
Did President Obama deserve the Nobel Peace Prize? That debate will likely continue for weeks to come. But the more interesting question may be about what impact the prize will have on President Obama himself and the key decisions he must make about national security.
The case of Woodrow Wilson, the last sitting president to be awarded the prize, offers some useful lessons.
On December 10, 1920, Albert Schmedeman, the American Minister to Norway, accepted the Nobel Prize on behalf of President Wilson, who was being honored for his work in creating the League of Nations. The president had first been nominated in 1918, but strong disagreement within the committee delayed his receiving the prize. It was his hard-fought campaign for ratification of the League of Nations covenant in 1919 that persuaded the committee he had earned the recognition.
Schmedeman read a statement from Wilson, who was in poor health after suffering a stroke, that said: "In accepting the honor of your award, I am moved by the recognition of my sincere and earnest efforts in the cause of peace, but also by the very poignant humility before the vastness of the work still called for by this cause."
Wilson realized that the award came toward the end of a presidency where he had failed to achieve many of his goals. There was a certain irony that the prize was awarded right at the time that President Wilson had failed to persuade the U.S. Senate to ratify the Treaty of Versailles, the agreement signed at the end of World War I.
One of the Norwegian newspapers, the Aftenposten, ran an editorial that stated: "After disappointment in Versailles he returned home a beaten man, ridiculed by his adversaries and fellow-citizens. By circumstances out of his control he was restrained from promoting his international peace work. As President of the United States he was unable to do anything more, but history will keep memory of him as creator of the League of Nations.
"To Europe and to great parts of America President Wilson looms as the man of peace who broke with the old doctrines and showed the way toward new ideas. He is, first and last, the great peace promoter -- popular among the victorious and among those beaten."
When Wilson received the Nobel Prize, his presidency was one of dashed expectations. In addition to the fact that the U.S. Senate had refused to ratify the Treaty of Versailles -- despite a massive campaign by the president to pressure them into doing so -- many other things had been difficult in Wilson's second term. Though he had run for reelection in 1916 as a president who would keep the nation out of war, Wilson led American troops into a bloody battle...
Posted on: Tuesday, October 13, 2009 - 01:44
SOURCE: WSJ (10-11-09)
More than 30 years have passed since North Vietnam, in gross violation of the 1973 Paris Peace Accords, conquered South Vietnam. That outcome was partly the result of greatly increased logistical support to the North from its communist backers. It was also the result of America's failure to keep its commitments to the South.
Those commitments included promises to maintain a robust level of financial support, to replace combat materiel, and even the use of air power to support the South in case of aggression by the North. That failure was the doing of a U.S. Congress that had tired of the country's long involvement in a war in Southeast Asia and cared nothing for the sacrifices of its own armed forces or those of the South Vietnamese people.
Since then, whenever America has entered into other military actions abroad or contemplated such commitments, the specter of Vietnam has been raised. It is entirely appropriate that earlier military experiences be examined for such "lessons learned" as they may yield. But it is equally essential that those prior campaigns be accurately understood before any valid comparisons are made. When it comes to the Vietnam War, much skewed or inaccurate commentary has impeded our understanding of that conflict and its outcome.
All the better-known early works on the Vietnam War—by Stanley Karnow, Neil Sheehan, George Herring—concentrated disproportionately on the early period of American involvement when Gen. William C. Westmoreland commanded U.S. forces. As a consequence, many came to view the entirety of the war as more or less a homogeneous whole, and to apply to the whole endeavor valid criticisms of the early years, ignoring what happened after Gen. Creighton Abrams took command soon after the 1968 Tet Offensive.
William Colby, who headed American support for the South Vietnamese pacification program (and was later director of the CIA), once remarked that the prevalence of such truncated treatments of the Vietnam War was like what Americans would know about World War II if the histories of that conflict had stopped before Stalingrad, the invasion of North Africa and Guadalcanal.
We now know, or should, that virtually everything changed when Abrams took command. The changes grew out of his understanding of the nature of the war, and of his conviction that upgrading South Vietnam's armed forces and rooting out the enemy's covert infrastructure in rural hamlets and villages must be accorded equal priority with combat operations. Even combat operations were radically reconfigured...
Posted on: Tuesday, October 13, 2009 - 01:05
SOURCE: The Root (10-8-09)
First Lady Michelle Obama’s maternal third-great-grandfather was a white man who fathered a son, Dolphus T. Shields, with her maternal third-great-grandmother, Melvinia Shields; both mother and son were slaves. This discovery, like all recoveries of the identities of ancestors we thought had been obliterated in the crucible of slavery, is first and foremost a welcome gift for the first family, especially for Michelle’s mother, Marian Shields Robinson, and the Shields family line. And for anyone still naïve enough to believe in the myth of racial purity, it is one more corroboration that the social categories of “white” and “black” are and always have been more porous than can be imagined, especially in that nether world called slavery.
As I have learned since embarking upon my African American Lives series (for PBS), never before have more African Americans been determined to ferret out the names of their slave ancestors, and never before have more resources, especially online, been available to facilitate these searches. But be prepared. To paraphrase the Bible: seek, but fasten your seat belt for what ye may find.
For those of us fortunate enough to lift the veil on our family’s slave past and identify our actual ancestors, these genealogical searches often yield startling results—two in particular. The first shock? That Cherokee Princess that family lore says is your great-great-grandmother most probably never existed. The sad truth is that the overwhelming percentage of African-American people have very little Native American ancestry in their DNA.
A Harvard colleague of mine likes to say, “DNA don’t lie.” And the Reverend Eugene Rivers likes to say that “DNA has freed more black men than Abraham Lincoln.” But genealogy and DNA tests are also “freeing” a lot of our white ancestors as well, revealing the vast extent of white ancestry that each black American has. Here are the facts: Only 5 percent of all black Americans have at least 12.5 percent Native American ancestry, the equivalent of at least one great-grandparent. Those “high cheek bones” and “straight black hair” your relatives have bragged about at every family reunion and holiday meal since you were 2 years old? Where did they come from? To paraphrase a well-known French saying, “Seek the white man.”
African Americans, just like our first lady, are a racially mixed or mulatto people—deeply and overwhelmingly so. Fact: Fully 58 percent of African-American people, according to geneticist Mark Shriver at Morehouse College, possess at least 12.5 percent European ancestry (again, the equivalent of that one great-grandparent). As a matter of fact, if I analyzed the y-DNA (which a man inherits exactly from his father, and he from his father, etc.) of all the black players in the NBA, fully one-third (somewhere between 30 percent and 35 percent) would, incredibly, discover that they were descended from a white male who impregnated a black female, most likely a female slave, just as a white man did Michelle Obama’s third-great-grandmother. In the ‘60s, we were fond of saying that we are an “African people.” Well, our DNA proclaims loudly that we are a European people, a multicultural people, a people black as well as white. You might think of us as an Afro-Mulatto people, our genes recombined in that test tube called slavery.
For African American Lives, I’ve tested 21 African Americans, possessing a range of phenotypes—from a person who could “pass” for “white,” and whose father actually did, to people with darker and more traditionally African features, such as Don Cheadle and Chris Rock. Not once has any person tested turned out to be 100 percent African. Chris Rock, for example, is 20 percent European. Don Cheadle is 19 percent European. That straight black hair and those high cheek bones on your grandmama’s head? Look at a Google Map of Europe, find Italy, then look straight north to England and Ireland, Germany and France. That’s where, in all probability, your ancestor’s hair texture and lighter complexion come from, not from the rendezvous of a fugitive slave and a Native American compatriot, united in enmity toward a common enemy, sitting around a campfire, smoking a peace pipe and woofing on the white man...
Posted on: Tuesday, October 13, 2009 - 00:45
SOURCE: The Weekly Standard (10-12-09)
Ever since the 1972 Democratic convention nominated George McGovern over the objections of the AFL-CIO, the standard wisdom has been that organized labor's power in American politics has declined dramatically. The failure of the current Democrat-dominated Congress to pass labor's highest legislative priority, the Employee Free Choice Act ("card check"), is taken as indicative of unions' political incapacity. But the picture looks very different on the state and local level where public sector employee unions have gone from one victory to another. Indeed, they are the one group, besides Goldman Sachs executives, that's done well during the current Great Recession. Public sector unions have become political powerhouses in New York, New Jersey, Washington, California, and a host of other states. They have become so powerful as to threaten the Madisonian system set up to constrain any one faction from overwhelming the public interest.
Once upon a time public sector workers received less pay than their private sector counterparts in return for better benefits and greater job security. But that bargain has been breached. Public sector wages have more than caught up, while the differential between public and private sector benefits has increased so much that public sector work, particularly for the unskilled, is greatly coveted. To protect such benefits, the unions have tenaciously opposed Senator Max Baucus's plan to tax expensive health insurance plans to finance an extension of coverage. Supporters of public sector union power have developed a rationale for the government employees' gold-plated perks. The argument is that public employees are the vanguard of the working class. As such, the benefits they achieve will eventually have to be matched by private sector employers. As Carla Katz, the leader of New Jersey's Communications Workers of America, explained to Paul Mulshine of the Newark Star-Ledger, reformers embrace "the progressive theory that unless you create a substantial wage and benefits package that reflects good jobs and the ability to have a middle-class life style, there will be a perpetual race to the bottom."
Katz not only represents thousands of state employees, she is also the richly rewarded former girlfriend of New Jersey governor Jon Corzine. Katz's influence on Corzine became clear in 2006 when the impassioned governor spoke to a Trenton rally of roughly 10,000 public workers and shouted out: "We will fight for a fair contract." Corzine was of course management in that situation, not labor. But with the power of the public sector unions to drive election outcomes, they now sit on both sides of the bargaining table. Unlike private sector unions, the sheer number of workers represented is not the linchpin of their influence. Private sector unions have a natural adversary in the owners of the companies with whom they negotiate. But public sector unions have no such natural counterweight. They are a classic case of "client politics," where an interest group's concentrated efforts to secure rewards impose diffused costs on the mass of unorganized taxpayers. Also unlike private sector unions, those in the public sector can achieve influence on both sides of the bargaining table by making campaign contributions and organizing get-out-the-vote drives to elect politicians who then control the negotiations over their pay, benefits, and work rules. The result is a nefarious cycle: Politicians agree to generous government worker contracts; those workers then pay higher union dues a portion of which are funneled back into those same politicians' campaign war chests. It is a cycle that has driven California and New York to the edge of bankruptcy.
Consider what happened in Washington State. After helping Democrats win full control of the legislature in 2002, the state affiliate of the American Federation of State, County, and Municipal Employees (AFSCME) and other unions persuaded lawmakers to lift the collective bargaining restrictions. Within three years the number of union members had doubled. With more state employees paying dues, the amount of union dollars flowing into the coffers of Democrats running in state elections also doubled. A prime beneficiary of such union generosity was Christine Gregoire, who became governor in 2004 after one of the closest elections in the state's history. (AFSCME gave $250,000 to the state Democratic party to help pay for the recount that handed her the election by 129 votes.) Once in office, Gregoire negotiated contracts with the unions that resulted in double-digit salary increases, some exceeding 25 percent, for thousands of state employees. In 2007, J. Vander Stoep, an adviser to Republican Dino Rossi, Gregoire's 2004 opponent, prophetically remarked that the unions' arrangement with the Democrats was "a perfect machine to generate millions of dollars for her reelection. .  .  . They are building something that conceivably can never be undone--at taxpayer expense." In their 2008 rematch, Rossi lost again to Gregoire, this time by 194,614 votes.
Public sector unions with political influence can negotiate detailed work rules in which they largely exempt themselves from accountability in return for providing political support for their nominal managers. In New York City, Mayor Michael Bloomberg and the United Federation of Teachers (UFT) have created a cartel to advance their own interests at the expense of the citizens and students. The teachers' contract is over 200 pages of small print. Reminiscent of the 12,000 United Auto Workers (UAW) members who were paid not to work in that union's heyday, nearly 800 Gotham "rubber room" teachers who have problems on the job are being paid not to work. The UFT has also negotiated with Bloomberg, mistakenly called an education reformer, a reduction in the number of days teachers must work to prepare for classes before school begins in September, even as their salary increases have run at better than twice the rate of inflation.
But the teachers are not the only politically powerful labor force in New York, the nation's most-unionized state where 69 percent of public sector workers belong to collective bargaining units. In the nominally private health care sector, employees depend heavily on government programs, principally Medicare and Medicaid, for their livelihood. In the 1970s and 1980s, the local 1199 Drug, Hospital and Health Care Employees Union fought a running battle with New York's largely state and federally funded voluntary hospitals. Under the brilliant leadership of Dennis Rivera, 1199 built a top-notch political operation, and with the hospitals, which were barred from political activity, formed a partnership to maximize the flow of government revenue. The union-hospital alliance has been so successful in aligning itself with politicians, Democrat and Republican alike, that not only has 1199 been largely untouched by the downturn, but New York spends as much on Medicaid as California and Texas combined. And come boom or bust, hospital and health care employment in the state keeps growing. Rivera, who merged his local with the SEIU (Service Employees International Union), has now brought his political acumen to Washington as the SEIU's point-man on health care reform.
The combined power of the teachers and health care workers has made the New York state legislature a wholly owned subsidiary of the public sector unions. The law mandates that all new legislation be evaluated for its fiscal impact. In recent years those calculations were performed by an actuary named Jonathan Schwartz. In 2008, when Schwartz found that a piece of bipartisan legislation allowing city workers to retire early with full pension benefits would impose no new costs, the New York Times blew the whistle. Schwartz, who had been fired from a city job, worked not only for the state assembly but also, it turned out, for District Council 37 of AFSCME. When asked which other unions he had worked for, he replied, "How many unions are there?" His client list included the teachers, firefighters, detectives, correction officers, and bridge and tunnel officers. Not surprisingly, New York State has the highest per-employee pension costs in the country.
Prior to World War II, a New York State Supreme Court justice neatly summarized the prevailing attitude toward public sector unions: "To tolerate or recognize any combination of Civil Service employees of the government as a labor organization or union is not only incompatible with the spirit of democracy but inconsistent with every principle upon which our Government is founded." Laws permitting collective bargaining for public employees were virtually nonexistent. Even labor-friendly economists thought organizing most public sector employees was illegitimate. AFL-CIO president George Meany believed it was "impossible to bargain collectively with the government."
What produced the enormous expansion of public sector unions? In a case of unintended consequences, government unionism ironically developed from actions taken by those hostile to it. Many of the icons of the labor-left like New York's great mayor Fiorello LaGuardia and President Franklin Roosevelt were adamantly opposed to public sector unions. LaGuardia, who pledged to make New York a "one hundred percent [private sector] union" town, had a civic vision of public employees as the people's workers, exemplars of the common good. Famed for dropping in unexpectedly on city offices and dressing down slackers, LaGuardia explained that he did "not want any of the pinochle club atmosphere to take hold" in his city government. "The right to strike against the government," he insisted, "is not and cannot be recognized."
In 1935, Roosevelt signed the Wagner Act, the first peacetime effort to support the growth of private sector unions. Its aim in the words of its sponsor, New York senator Robert Wagner, was "encouraging the practice and procedure of collective bargaining." But like his close ally LaGuardia, Roosevelt drew a definite line when it came to government workers. "Meticulous attention," the president insisted, "should be paid to the special relations and obligations of public servants to the public itself and to the Government. . . . The process of collective bargaining, as usually understood, cannot be transplanted into the public service." Both men feared that liberalism would be compromised by the unavoidably self-serving nature of public sector unionism.
But the mayor and the president opened the door to just what they opposed. In the bad old days of Tammany Hall, which had fought both LaGuardia and Roosevelt, the average tenure of a cop or teacher or garbage collector was five years. But with the rise of civil service reform backed by both men in the 1930s, public employees both in New York and the federal government began to gain lifetime security. Civil service reform, it turned out, was the precondition for unionization because it gave workers a long-term interest in their jobs and facilitated their capacity to express collective concerns. In 1958, New York mayor Robert Wagner, son of the senator behind the 1935 federal act, issued an executive order generally known as "the little Wagner Act." It gave city employees bargaining rights and provided their unions with exclusive representation. The city was soon turning over the dues from its workers to the union. Those dues soon provided political action funds to support union-backed candidates.
Running for reelection in 1961, Wagner faced a Democratic party revolt. The party's five borough chiefs were supporting his opponent, and Wagner made the unions the basis of his winning campaign. It was a turning point. Looking back in the wake of New York's mid-1970s fiscal crisis, Alex Rose, the head of the once powerful (and now defunct) New York Liberal party and a former labor leader, concluded that "the little Wagner Act" had proven a dreadful "mistake." Rose, who had also led the private sector clothing workers, explained that public sector "workers are not extracting a share of the profits but rather a share of taxes." Ultimately, he noted, his workers would be among those "footing the bill."
Ten weeks after Wagner's victory, President John F. Kennedy, who had been elected by the narrowest of margins in 1960, decided to mobilize public sector workers as a new source of political support. In mid-January 1962, he issued Executive Order 10988 giving federal workers the right to organize, though not to collectively bargain. Kennedy's action and Wagner's victory set off a wave of local union activity across the nation's major cities.
In states with laws favorable to unionism, public sector organizing has flourished; in states without such laws, it has not. If there is a specific point from which to mark the beginning of the current looming fiscal crisis in many blue states, it would be the wave of local strikes by public employees that were set in motion by Kennedy's executive order. His strategy succeeded beyond his wildest expectations. Like entitlement programs, the expansion of public sector unionism produced a self-generating dynamic for continual expansion. Public sector unions would occasionally experience temporary setbacks--as in the New York fiscal crisis of 1975--but they had the political clout to claw back any concessions made under duress.
During the Reagan years, the growth in local and state jobs was double the rate of population growth. In the downturn of the early 1990s, the New York Times warned that the states faced a "fiscal calamity." In 2002, during the next serious downturn, the National Governors Association insisted that the "states face the most dire fiscal situation since World War II." But in each case the growth of government and public sector pay packages merely stalled. It resumed as soon as the economy recovered.
There is broad agreement among economists that public sector unions' political power increases government spending. As reported in the New York Times, public-sector wages and benefits over the past decade have grown twice as fast as those in the private sector. An Empire Center for New York State Policy study found that in 2006 state and local government employees in New York were paid higher average salaries than their private-sector counterparts in eight of the state's ten regions. If one excludes jobs in finance in New York City and the Southern Tier, private sector employees statewide earned slightly less than government ones.
The downturn has been very tough on private sector workers. But the public sector, particularly when it comes to pensions for uniformed workers, has been a different matter. In New York City, where public sector union benefits have grown twice as fast as those in the private sector since 2000, firefighters may retire after 20 years at half pay. Pension benefits for a new retiree averaged just under $73,000 (all exempt from state and local taxes). Many also collect an annual $12,000 "Christmas bonus." To top it off, they receive a health insurance policy that is worth about $10,000 annually. New York City is also paying benefits to 10,000 retired police officers under 50 years of age.
Such cases abound. According to the Boston Globe, 225 of the 2,338 Massachusetts state police officers made more than Governor Deval Patrick's $140,535 annual salary in 2006. Four state troopers received more than $200,000, and 123 others were paid more than $150,000. The Chicago Sun-Times reports that in suburban Chicago, there are school administrators--a unionized profession--who are making over $400,000. California teachers are represented by one of the country's most powerful teachers' unions and earn 25 percent more than the national average. Forbes has reported that there are California prison guards making $300,000 a year.
While the wage parity between public and private sector workers is largely unchanged since 2002, public sector benefits are a different matter. For every $1-an-hour pay increase, noted Dennis Cauchon in USA Today, public employees have gotten $1.17 in new benefits; private workers have gotten just 58 cents in benefits for every $1 raise. This gap worries left-liberal labor economist Barry Bluestone. The price of state and local public services increased by 41 percent nationally between 2000 and 2008, while private services increased by only 27 percent. The benefit growth has continued unabated into the Great Recession, and Bluestone says the gap will inevitably produce a backlash.
Like banks, but with even less self-control, state governments make long-term promises in boom times while depending on the short-term flow of revenues. But when the boom ends, the benefits that have been ratcheted up have to be paid for out of a declining private sector economy. Barring a sharp recovery, state and local government tax-funded pension contributions in New York are likely to triple over the next five years in order to pay out the pension benefits guaranteed by the state constitution. (This is equally true in Illinois.) California's public pension fund liability has already topped $200 billion, and in cities such as Oakland, Vallejo, and Rio Vista bankruptcy looms.
In the states and cities where government workers' unions are strong, they have formed alliances with nonprofit advocacy groups such as ACORN and foundations committed to greater government involvement in the economy and society. The Manhattan Institute's Steven Malanga argues that this constellation of forces is in effect a new Tammany Hall. It is, says Seymour Lachman, a former New York state senator who now heads a center for government reform at Wagner College, "the ward heeler system of Boss Tweed's Tammany Hall wrapped in some kind of progressive disguise." The old Tammany, however, was subject to electoral defeats. The new Tammanies have proved self-perpetuating. In California, Governor Schwarzenegger's ill-organized effort to roll back public sector union power in 2005 led to the muscleman's first defeat, then his political evisceration, and now the Golden State's fiscal humiliation. New York City and State are on a similar course. Across the country the new political machine has mostly been aligned with the Democratic party. Some individual unions, however, such as California's prison guards and New York's hospital workers, have been protected and advanced by Republicans. Still others play a pragmatic balance-of-power game, forging short-lived marriages of convenience with either political party.
Public sector unions are beginning to strike out on their own, too. If the recent primary elections in New York are any indication, it is only a matter of time before, using the vehicle of the Working Families party (WFP), they take control of New York City government. New York allows third parties on the ballot, and the Working Families party--organized in 1998 as an alliance between labor unions and ACORN--cross-endorses allies in the Democratic party. Yet the WFP is thriving while New York's Democrats atrophy. In last week's New York City primaries, WFP candidates for city council won easily, as did the party's candidates for the city's second and third highest offices: comptroller and public advocate. Those are the best platforms from which to make a run for mayor of New York City when Bloomberg finally gives up his throne.
Public sector unions bring to the fore what James Madison called "the violence of faction" and its threat to the "permanent and aggregate interests of the community." This can't be blamed on the unions; they're advancing their members' interests. The fault lies with politicians, particularly those governors and mayors who have been willing to sabotage the public interest to smooth the path to their own reelections.
In the absence of tough-minded reform leaders who will take on the public sector unions, the fiscal future of states and localities is bleak.
Posted on: Monday, October 12, 2009 - 02:08
SOURCE: TomDispatch.com (10-11-09)
It's early in 1965, and President Lyndon B. Johnson faces a critical decision. Should he escalate in Vietnam? Should he say "yes" to the request from U.S. commanders for more troops? Or should he change strategy, downsize the American commitment, even withdraw completely, a decision that would help him focus on his top domestic priority, "The Great Society" he hopes to build?
We all know what happened. LBJ listened to the generals and foreign policy experts and escalated, with tragic consequences for the United States and calamitous results for the Vietnamese people on the receiving end of American firepower. Drawn deeper and deeper into Vietnam, LBJ would soon lose his way and eventually his will, refusing to run for reelection in 1968.
President Obama now stands at the edge of a similar precipice. Should he acquiesce to General Stanley A. McChrystal's call for 40,000 to 60,000 or more U.S. troops for Afghanistan? Or should he pursue a new strategy, downsizing our commitment, even withdrawing completely, a decision that would help him focus on national health care, among his other top domestic priorities?
The die, I fear, is cast. In his "war of necessity," Obama has evidently already ruled out even considering a "reduction" option, much less a withdrawal one, and will likely settle on an "escalate lite" program involving more troops (though not as many as McChrystal has urged), more American trainers for the Afghan army, and even a further escalation of the drone war over the Pakistani borderlands and new special operations actions.
By failing his first big test as commander-in-chief this way, Obama will likely ensure himself a one-term presidency, and someday be seen as a man like LBJ whose biggest dreams broke upon the shoals of an unwinnable war.
The Conventional Wisdom: Military Escalation
To whom, we may ask, is Obama listening as he makes his decision on Afghanistan strategy and troop levels? Not the skeptics, it's safe to assume. Not the free-thinkers, not today's equivalents of Mary McCarthy or Norman Mailer. Instead, he's doubtless listening to the generals and admirals, or the former generals and admirals who now occupy prominent "civilian" positions at the White House and inside the beltway.
By his actions, Obama has embraced the seemingly sober, conventional wisdom that senior military officers, whether on active duty or retired, have, as they say in the corridors of the Pentagon, "subject matter expertise" when it comes to strategy, war, even foreign policy.
Don't we know better than this? Don't we know, as Glenn Greenwald recently reminded us, that General McChrystal's strategic review was penned by a "war-loving foreign policy community," in which the usual suspects -- "the Kagans, a Brookings representative, Anthony Cordesman, someone from Rand" -- were rounded up to argue for more troops and more war?
Don't we know, as Tom Engelhardt recently reminded us, that Obama's "civilian" advisors include "Karl W. Eikenberry, a retired lieutenant general who is the U.S. ambassador to Afghanistan, Douglas Lute, a lieutenant general who is the president's special advisor on Afghanistan and Pakistan (dubbed the "war czar" when he held the same position in the Bush administration), and James Jones, a retired Marine Corps general, who is national security advisor, not to speak of Secretary of Defense Robert Gates, a former director of the Central Intelligence Agency"? Are we surprised, then, that when we "turn crucial war decisions over to the military, [we] functionally turn foreign policy over to them as well"? And that they, in turn, always opt for more troops, more money, and more war?
One person unsurprised by this state of affairs would have been Norman Mailer, who died in 2007. War veteran, famed author of the war novel The Naked and the Dead (1948) as well as the Pulitzer Prize-winning report on Vietnam-era protests, The Armies of the Night (1968), self-styled tough guy who didn't dance, Mailer witnessed (and dissected) the Vietnam analog to today's Afghan events. Back in 1965, Mailer bluntly stated that the best U.S. option was "to get out of Asia." Period.
The Unconventional Wisdom: Military Extrication
Can Obama find the courage and wisdom to extricate our troops from Afghanistan? Courtesy of Norman Mailer, here are three unconventional pointers that should be driving him in this direction:
1. Don't fight a war, and clearly don't escalate a war, in a place which means so little to Americans. In words that apply quite readily to Afghanistan today, Mailer wrote in 1965: "Vietnam [to Americans] is faceless. How many Americans have ever visited that country? Who can say which language is spoken there, or what industries might exist, or even what the country looks like? We do not care. We are not interested in the Vietnamese. If we were to fight a war with the inhabitants of the planet of Mars there would be more emotional participation by the people of America."
2. Beware of cascading dominoes and misleading metaphors, whether in Southeast Asia or anywhere else. The domino theory held that if Vietnam, then split into north and south, was united under communism, other Asian countries, including Thailand, the Philippines, perhaps even India, would inevitably fall to communism as well, just like so many dominoes toppling. Instead, it was communism that fell or, alternately, morphed into a version that we could do business with (to paraphrase former British Prime Minister Margaret Thatcher).
We may no longer speak metaphorically of falling dominoes in today's Af-Pak theater of operations. Nevertheless, our fears are drawn from a similarly misleading image: If Afghanistan falls to the Taliban, Pakistan will surely follow, opening a nuclear Pandora's Box to anti-American terrorists in which, in our fevered imaginations, smoking guns will once again become mushroom clouds.
Despite the fevered talk of falling dominoes in his era, Mailer was unmoved. Such rhetoric suggests, he wrote in 1965, "that we are not protecting a position of connected bastions so much as we are trying to conceal the fact that the bastions are about gone -- they are not dominoes, but sand castles, and a tide of nationalism is on the way in. It is curious foreign policy to use metaphors in defense of a war; when the metaphors are imprecise, it is a swindle."
To this I'd add that, in viewing countries and peoples as so many dominoes, which by the actions -- or the inaction -- of the United States are either set up or knocked down, we vastly exaggerate our own agency and emphasize our sense of self-importance. And before we even start in on the inevitable argument about "Who lost Afghanistan?" or "Who lost Pakistan?" is it too obvious to say that never for a moment did we own these countries and peoples?
3. Carrots and sticks may work together to move a stubborn horse, but not a proud people determined to find their own path. As Mailer put it, with a different twist: "Bombing a country at the same time you are offering it aid is as morally repulsive as beating up a kid in an alley and stopping to ask for a kiss."
As our Predator and Reaper drones scan the Afghan terrain below, launching missiles to decapitate terrorists while unintentionally taking innocents with them, we console ourselves by offering aid to the Afghans to help them improve or rebuild their country. As it happens, though, when the enemy hydra loses a head, another simply grows in its place, while collateral damage only leads to a new generation of vengeance-seekers. Meanwhile, promised aid gets funneled to multi-national corporations or siphoned off by corrupt government officials, leaving little for Afghan peasants, certainly not enough to win their allegiance, let alone their "hearts and minds."
If we continue to speak with bombs while greasing palms with dollars, we'll get nothing more than a few bangs for our $228 billion (and counting).
What if LBJ Had Listened to Mailer in '65?
Not long before LBJ crossed his Rubicon and backed escalation in Vietnam, he could have decided to pull out. Said Mailer:
"The image had been prepared for our departure -- we heard of nothing but the corruption of the South Vietnam government and the professional cowardice of the South Vietnamese generals. We read how a Viet Cong army of 40,000 soldiers was whipping a government army of 400,000. We were told in our own newspapers how the Viet Cong armed themselves with American weapons brought to them by deserters or captured in battle with government troops; we knew it was an empty war for our side."
Substitute "the Hamid Karzai government" for "the South Vietnam government" and "Taliban" for "Viet Cong" and the same passage could almost have been written yesterday about Afghanistan. We know the Karzai government is corrupt, that it stole the vote in the last election, that the Afghan army is largely a figment of Washington's imagination, that its troops sell their American-made weapons to the enemy. But why do our leaders once again fail to see, as Mailer saw with Vietnam, that this, too, is a recognizably "empty war for our side"?
Mailer experienced the relentless self-regard and strategic obtuseness of Washington as a mystery, but that didn't stop him from condemning President Johnson's decision to escalate in Vietnam. For Mailer, LBJ was revealed as "a man driven by need, a gambler who fears that once he stops, once he pulls out of the game, his heart will rupture from tension." Johnson, like nearly all Americans, Mailer concluded, was a member of a minority group, defined not in racial or ethnic terms but in terms of "alienat[ion] from the self by a double sense of identity and so at the mercy of a self which demands action and more action to define the most rudimentary borders of identity."
This American drive for self-definition through constant action, through headlong acceleration, even through military escalation, the novelist described, in something of a mixed metaphor, as "the swamps of a plague" in which Americans had been caught and continued to sink. He saw relief of the desperate condition coming only via "the massacre of strange people."
To be honest, I'm not sure what to make of Mailer's analysis here, more emotionally Heart-of-Darkness than coolly rational. But that's precisely why I want someone Mailer-esque -- pugnacious, free-swinging, prophetic, provocative, and profane -- advising our president. Right now.
As Obama's military experts wield their battlefield metrics and call for more force (to be used, of course, with ever greater precision and dexterity), I think Mailer might have replied: We think the only thing they understand is force. What if the only thing we understand is force?
Mailer, I have no doubt, would have had the courage to be seen as "weak" on defense, because he would have known that Americans had no dog in this particular fight. I think he would intuitively have recognized the wisdom of the great Chinese strategist Sun Tzu, who wrote more than 2,000 years ago in The Art of War that "to win one hundred victories in one hundred battles is not the acme of skill. To subdue the enemy without fighting is the acme of skill." Our generals, by way of contrast, seem to want to fight those 100 battles with little hope of actually subduing the enemy.
What Obama needs, in other words, is fewer generals and ex-generals and more Norman Mailers -- more outspoken free-thinkers who have no interest in staying inside the pentagonal box that holds Washington's thinking tight. What Obama needs is to silence the endless cries for more troops and more war emanating from the military and foreign policy "experts" around him, so he can hear the voices of today's Mailers, of today's tough-minded dissenters. Were he to do so, he might yet avoid repeating LBJ's biggest blunder -- and so avoid suffering his political fate as well.
Posted on: Sunday, October 11, 2009 - 23:44
SOURCE: Informed Comment (blog of Juan Cole) (10-9-09)
The Right in the US objected to Obama getting the peace prize on the alleged grounds that he had not yet done anything to deserve it. But the Right in the United States is to peace as velociraptors were to vegetarianism. They don't believe in the ideal for which the award stands in the first place. And they find President Obama laughable, so they can't imagine him getting any awards. They have underestimated him badly and will probably pay a price for that. They misunderstand the Nobel Peace Prize and its history, and the Rupert Murdoch Right (he pays for a lot of this pollution of our airwaves) would not have agreed with any of the past awards.
Alfred Nobel outlined in his will the grounds on which the Peace Prize was to be given, saying it should go annually to the person who "shall have done the most or the best work for fraternity between nations, for the abolition or reduction of standing armies and for the holding of peace congresses." The modern committee considers work toward the reduction of nuclear arsenals in the same light as the reduction of standing armies, hence its award to Linus Pauling.
The American Rightwing would not have approved of Woodrow Wilson getting the prize for helping found the League of Nations. They do not believe in international cooperation or multilateralism in the first place. They think America should cowboy it. They are the tribe of 'bring 'em on' and 'wanted dead or alive.' They are about trapping the country in quagmires so as to throw cash to their cronies in the military-industrial complex. They like wars, not peace. They don't care how many people they kill in the global south. A million Iraqis dead? They deny it or justify it or blame it on someone else. They are bottom feeders.
They would have considered Frederic Passy, the first peace Nobelist, a woolly-headed dreamer and laughed at a Universal Peace Conference organized just a little over a decade before the mass slaughter of World War I. They would have dismissed Jane Addams as a "socialist." And what would have provoked them to more gales of laughter than the 1935 award to the German pacifist Carl von Ossietzky? How'd that work out, they'd snicker as they elbowed each other (with any luck breaking some of each other's ribs). If there is anyone they find more laughable than Barack Obama, it is Jimmy Carter (the greatest ex-president in American history), the 2002 awardee. Mohamed ElBaradei of the International Atomic Energy Agency repeatedly got in the way of the American Right's war plans, so presumably they didn't rejoice at his 2005 prize. They don't believe in climate change or global warming and want us to switch to the dirtiest coal possible, so Al Gore's 2007 prize set them giggling, as well.
Matt Corley explained at the time how Murdochians insisted that Al Gore had no accomplishments worthy of the Nobel Peace Prize and that it should have gone to Gen. Petraeus instead. I admire both men, but by the criteria outlined in Nobel's will, it was Gore who had a claim on the prize.
Barack Obama was given the prize because he is a game-changer. Obama has dedicated himself to reducing and ultimately scrapping the nuclear arsenals that threaten the world with nuclear winter or a destruction of the ozone layer; either event would be catastrophic for human beings' existence on the planet. Obama has already made a substantial change in relations between the US and the Muslim world. Two years ago we were talking about whether Cheney could convince Americans to go to war on Iran. Now Washington is engaging in direct talks with Tehran that have eased tensions.
Whether he or she actually achieves peace is unpredictable, but game changers are clearly visible to everyone. The handshake between Rabin and Arafat in the early 1990s was potentially a game changer, and the Oslo deal would have profoundly enhanced world peace if it had worked (it might even have averted 9/11 and the subsequent wars). Al Gore's campaign for the environment was a game changer. Shirin Ebadi's dedication to the rule of law in Iran is a game changer, and she gives hope to many otherwise cynical youth and women.
For those who are giggling and demanding concrete improvements, it is worth noting that most of the recipients have been idealists rather than practical persons. Obama is both, and therefore he has a real shot at vindicating the social worth of his policies in the future. Rightwing policies were tried for 8 years and they failed. Miserably.
Posted on: Sunday, October 11, 2009 - 23:10
SOURCE: Private Papers (website of Victor Davis Hanson) (10-6-09)
Two-Front Wars — Theirs and Ours
Something is not quite right with the conventional wisdom about the Afghanistan war. For nearly eight years, yearly casualties in Afghanistan were sometimes less than a month's losses in the dire days in Iraq (e.g., 98 Americans killed in 2006 in Afghanistan, 112 killed in Iraq during December 2006). And while many argue that we took our eye off the ball, to quote the president, by going into Iraq to fight the optional war and shorting the essential one, it remains true that while Iraq was hottest, Afghanistan was weirdly sometimes quietest.
One might suggest of course that the Taliban and their Arab terrorist allies were quietly and stealthily lying low, regrouping, gathering support, and then blew back onto the scene in a fury in late 2008 and 2009, but that would still be at a post-surge time in Iraq when we were already deploying more Marines to Afghanistan.
Just as likely are two other developments never mentioned:
1) Just as Iraq was our second theater in the war on terror, so it was for al Qaeda and generic jihadists as well. They diverted thousands into Anbar Province and Baghdad proper rather than into Afghanistan; and while for a period they gained traction, ultimately they lost thousands in combat or through defection. That fact may have weakened their efforts in Afghanistan rather than strengthened them; and after their material and psychological defeat in Iraq they have returned their attention to the single front in Afghanistan. In other words, they took their eye off the ball in Afghanistan and focused on Iraq, but lost both materially and psychologically, and now, like us, are refocusing on the single front.
2) We were far more able to inflict casualties (given the terrain, geopolitics, and nature of the fighting) in Iraq than in Afghanistan, and that resulted in both more damage to terrorism in general, and a greater sense of deterrence than was true of the fighting alone in Afghanistan/Pakistan. When bin Laden and Zawahiri announced that Iraq was the major front in the terrorist war on the U.S., they raised the stakes, and were in essence inviting terrorists to go there rather than to Waziristan. Note we hear no more from either one of them about winning in Iraq, the central front in Iraq, the need to join jihad in Iraq, etc. Now, it is all Afghanistan again.
Polls in the Middle East are now quite different from radical Islam's glory days following 9/11 when al Qaeda and bin Laden were iconic; the latter's ratings have nosedived along with the tactic of suicide bombing. Rather than seeing the spike in violence in Afghanistan as a sign of a lost theater, it may well be that the Islamists are now increasingly unpopular, down to one front, and wagering their all on a last big effort to demoralize us. Both in conventional wars and in insurgencies (as we saw in 2007 in Iraq) sometimes the fiercest fighting is near the end rather than the beginning of the war, as a final offensive is seen as a last gambit. All this means that we should meet the challenge, support the president, and deal with the Taliban and its al-Qaeda allies as we did in 2007 to the terrorists in Iraq, despite the wide differences in culture and conditions on the ground in the respective countries.
If there really is such a thing as a global war on radical Islamic terrorism, and bin Laden is to be taken at his word that both Afghanistan and Iraq have at times been alternately central fronts in that war, then it would be a tragedy that after fighting a two-front war, and winning one, we, rather than the losing enemy, would become demoralized by our success, and they emboldened by their defeat.
I think soon the president will be entering the murky zone in Afghanistan that George W. Bush went through in 2006 in Iraq. A once seemingly successful war had stalled. The public had turned against the effort. Political opponents who once voiced "let me at 'em" enthusiasm about the war had long ago bailed. "Wise men" frowned, cleared their throats, and declared Iraq lost in op-ed columns. The base was jittery and saw that the war was costing Republicans influence and power. And then Bush did the lonely and good thing by placing his all with Petraeus, the surge, and keeping our commitment to those brave Iraqis who had risked their lives for the common goal of consensual government.
Obama now finds himself with those same bad and worse choices. The good war he once championed as a candidate on the stump has turned bad among his supporters. No one is talking about finally having our eye back on the ball. He has so polarized conservatives that it is apparently hard for him to reach out to them for support on the war, though on the war they are not the vocal opponents that the Left proved to be with Bush. Pundits now talk of the preferable "more rubble/less trouble" punitive route of bombs, Predators, and incursions rather than the messy, unpopular, 8-year effort at saving Afghanistan from al Qaeda, the Taliban, and Islamic medievalism in order to prevent a repeat of 9/11. Most agree that surging in Afghanistan, as was true in Iraq, could spike casualties rather than reduce them for several months to come. MoveOn.org could also take out another New York Times-subsidized ad about the purported perfidy of General McChrystal, who might still be grilled on Capitol Hill with sneers that his testimony requires a suspension of disbelief. And so on...
Posted on: Sunday, October 11, 2009 - 22:32
SOURCE: NYT (10-10-09)
PRESIDENT OBAMA’S surprise Nobel Peace Prize is only the second in the last century that a sitting president has received. The first was presented in December 1920, when the Nobel Committee of the Norwegian Parliament awarded Woodrow Wilson the peace prize for 1919. Beyond the coincidence of both men residing in the White House, however, Presidents Obama and Wilson look like the starkest study in contrasts in when and how each received this prize.
For example, while Mr. Obama is being honored at the beginning of his presidency and while his popularity is high, Wilson’s prize came three months before the end of his presidency and at the lowest point in his personal and political life. It came a year and a half after the signing of the Treaty of Versailles, which ended World War I. The intervening months had witnessed a series of defeats and disasters for Wilson and his policies.
He had returned home from Paris to face a spite-filled, partisan deadlock over ratification of the peace treaty, which would carry with it membership in the new League of Nations — the international organization Wilson had played the central role in creating. In the Senate, some of his Republican opponents rejected league membership altogether, while others, most notably Henry Cabot Lodge of Massachusetts, would accept membership only under severe restrictions that many believed would cripple America’s ability to help the organization fulfill its mission.
Wilson first tried negotiating with the senators, and when they rebuffed his efforts, he took his case to the people, mounting a whirlwind campaign-style speaking tour across the country. He never completed that tour. His doctor, seeing him exhausted and showing symptoms of impending collapse, rushed Wilson back to the White House, where he suffered a stroke. He never fully functioned as president again, and his remaining time in office saw the worst instance of presidential disability in the nation’s history.
Physically, Wilson’s stroke left him partly paralyzed and enfeebled; psychologically, it unhinged his emotional balance and impaired his judgment. Faced with Lodge’s reservations, Wilson from his sickbed rejected all talk of compromise and ordered Democratic senators to accept virtually unconditional approval of the treaty. After an initial defeat of the treaty in November 1919, a bipartisan group of senators tried to find a middle ground, only to fail in the face of intransigence by both Lodge and Wilson. The treaty went down to a second defeat in March 1920, prompting the president to tell his doctor, “the Devil is a busy man.” As a result of those votes, the United States never ratified the Treaty of Versailles and never joined the League of Nations.
Still worse humiliation followed for Wilson. Public opinion largely turned against him. The Republican nominee in 1920, the handsome, mellifluous Warren Harding, called for “not heroics but healing, not nostrum but normalcy, not revolution but restoration, not agitation but adjustment, not surgery but serenity” — all ways of drawing a sharp line between himself and Wilson. Not surprisingly, the Republicans won one of the biggest popular and electoral victories ever; Cordell Hull, then a just-defeated member of Congress but later Franklin Roosevelt’s secretary of state, called it “a tidal wave.”
Coming just a month after that repudiation at the polls, the peace prize offered balm to Wilson's wounded body and soul. The news surprised him, but it should not have. With the Great War over, the leading peacemaker and author of a bold new plan to rid the world of war was the logical, well-nigh inescapable choice.
Yet the prize was a case of a prophet enjoying greater honor among others than among his own people. Abroad, reactions were generally approving, even among some figures from the defeated Central Powers. At home, Republicans in general and Lodge in particular ignored the event, while privately they were itching to take power and have a chance to show what they trumpeted as their better approaches to peace and security — plans that did not include the League of Nations...
Posted on: Sunday, October 11, 2009 - 21:31
SOURCE: Truthout (10-10-09)
I was dismayed when I heard Obama was given the Nobel Peace Prize. A shock, really, to think that a president carrying on wars in two countries and launching military action in a third country (Pakistan), would be given a peace prize. But then I recalled that Woodrow Wilson, Theodore Roosevelt and Henry Kissinger had all received Nobel Peace Prizes. The Nobel Committee is famous for its superficial estimates and for its susceptibility to rhetoric and empty gestures, while ignoring blatant violations of world peace.
Yes, Wilson gets credit for the League of Nations - that ineffectual body which did nothing to prevent war. But he also bombarded the Mexican coast, sent troops to occupy Haiti and the Dominican Republic and brought the US into the slaughterhouse of Europe in the first World War - surely, among stupid and deadly wars, at the top of the list.
Sure, Theodore Roosevelt brokered a peace between Japan and Russia. But he was a lover of war, who participated in the US conquest of Cuba, pretending to liberate it from Spain while fastening US chains around that tiny island. And as president he presided over the bloody war to subjugate the Filipinos, even congratulating a US general who had just massacred 600 helpless villagers in the Philippines. The Committee did not give the Nobel Prize to Mark Twain, who denounced Roosevelt and criticized the war, nor to William James, leader of the anti-imperialist league...
Posted on: Sunday, October 11, 2009 - 21:23
SOURCE: danielpipes.org (10-9-09)
"He won what?" is the universal first reaction.
And second, at least on the Right: "Why did they do that?"
Even the Nobel committee's citation does not pretend Barack Obama has actually achieved anything. Rather, it was given to him "for his extraordinary efforts to strengthen international diplomacy and cooperation between peoples." That's efforts, not achievements.
Reading carefully through the entire citation suggests that Obama is being celebrated for two reasons. First, its chatter about "a new climate," the United Nations, a "vision of a world free from nuclear arms," and "great climatic challenges" points to his being the anti-George W. Bush.
Second, the prize committee hopes to constrain Obama's hands vis-à-vis Iran. It lauds him for not using force: "Dialogue and negotiations are preferred as instruments for resolving even the most difficult international conflicts." This is obviously gibberish: whereas Bush did not use force against North Korea, Obama does not rely on dialogue in Afghanistan. But the statement does pressure Obama not to use force in the theater that counts the most, namely the Iranian nuclear build-up.
So, from the Leftist Norwegian point of view, it's a twofer – bash Bush and handcuff Obama.
My prediction: The absurdity of the prize decision will harm Obama politically in the United States, contrasting his role as international celebrity with his record devoid of accomplishments. Michael Steele, chairman of the Republican National Committee, notes that Obama "won't be receiving any awards from Americans for job creation, fiscal responsibility, or backing up rhetoric with concrete action." Expect to hear much more along those lines.
Posted on: Sunday, October 11, 2009 - 14:36
SOURCE: TomDispatch.com (blog of Tom Engelhardt) (10-8-09)
An unremarkable paragraph in a piece in my hometown paper recently caught my eye. It was headlined "White House Believes Karzai Will Be Re-elected," but in mid-report Helene Cooper and Mark Landler of the New York Times turned to Afghan War commander General Stanley McChrystal's "redeployment option." Here's the humdrum paragraph in question: "The redeployment option calls for moving troops from sparsely populated and lawless areas of the countryside to urban areas, including Kandahar and Kabul. Many rural areas 'would be better left to Predators,' said an administration official, referring to drone aircraft."
In other words, the United States may now be represented in the Afghan countryside, as it already is in the tribal areas on the Pakistani side of the border, mainly by Predators and their even more powerful cousins, Reapers, unmanned aerial vehicles with names straight out of a sci-fi film about implacable aliens. If you happen to be an Afghan villager in some underpopulated part of that country where the U.S. has set up small bases -- two of which were almost overrun recently -- they will be gone and "America" will instead be soaring overhead. We're talking about planes without human beings in them tirelessly scanning the ground with their cameras for up to 22 hours at a stretch. Launched from Afghanistan but flown by pilots thousands of miles away in the American West, they are armed with two to four Hellfire missiles or the equivalent in 500-pound bombs.
To see Earth from the heavens, that's the classic viewpoint of the superior being or god with the ultimate power of life and death. Zeus, that Greek god of gods, used lightning bolts to strike down humans who offended him. We use missiles and bombs. Zeus had the knowledge of a god. We have "intelligence," often fallible (or score-settling). His weapon of choice destroyed one individual. Ours take out anyone in the vicinity.
He made his decisions from Mount Olympus; we make ours from places like Creech Air Force Base outside Las Vegas, and Davis-Monthan Air Force Base in Tucson, Arizona. Those about whom we make life-and-death decisions, as they scurry below or carry on as best they can, have -- like any beings faced with the gods -- no recourse or appeal. Seen on screens, they are, to us, distant, grainy figures, hardly larger than ants. This is what implacable means.
Soothing the Children
And none of this strikes us as strange. Quite the opposite, it represents reasonable policy. Comments like the one quoted above are now commonplace. In the Washington Post, for instance, Rajiv Chandrasekaran recently recorded the thoughts of an anonymous U.S. officer in Afghanistan: "If more forces are not forthcoming to mount counterinsurgency operations in those parts of the province, he concluded, the overall U.S. effort to stabilize Kandahar -- and by extension, the rest of Afghanistan -- will fail. 'We might as well pack our bags and go home… and just keep a few Predators flying overhead to whack the al-Qaeda guys who return.'"
We know as well that, in the Washington debate over what to do next in the Afghan War, Vice President Joe Biden has come down on the side of "counterterrorism." He wants to put more emphasis on those drones and on special operations forces, while focusing more on Pakistan (though without dropping U.S. troop levels in Afghanistan). At the same time, the Pentagon has just created an Afghan Hands program and a Pakistan-Afghanistan Coordination Cell, two units focused on improving military performance in the Af-Pak theater of operations over the next three to five years. All of this represents the norm for military and civilian leaders who, whatever their differences, believe wars that go on for endless years thousands of miles from home are the sine qua non of American safety.
And none of this seems less than reasonable to us, especially given the much publicized "success" of the drone assassination program in taking out Taliban and al-Qaeda leadership figures. What does strike us as strange, though, is that the locals, whether in Pakistan or Afghanistan, find all this upsetting. A recent U.S. poll in Pakistan typically reported "that 76 percent of the respondents were opposed to Pakistan partnering with the United States on missile attacks against extremists by American drone aircraft."
Then again, we take it for granted that the people of such backward lands are strange, touchy types. Not like us. In George Packer's recent New Yorker profile of Richard Holbrooke, the president's special representative for Afghanistan and Pakistan, there were some classic lines reflecting this.
Packer describes Holbrooke on a flying visit to Afghanistan this way: "He seemed less like a visiting emissary than like a proconsul inspecting a vast operation over which he commanded much of the authority." When that same proconsul makes it out of impoverished, shattered Afghanistan (where the U.S. Embassy, at one point, had to deny he had engaged in a "shouting match" with Afghan President Hamid Karzai) and into Pakistan, a fractious, disturbed, unnerved country of genuine significance, he packs the proconsul away and, according to Packer, becomes Washington's cajoler-in-chief. As Packer writes, "In moments when I overheard him talking to Pakistani leaders, he took the solicitous tone of someone reassuring an unstable friend. 'It's like dealing with psychologically abused children,' a member of his staff said. 'You don't focus on the screaming and the violence -- you just hug them tighter.'"
So, if Afghan and Pakistani peasants in the mountainous tribal borderlands are so many ants or rabbits, Pakistani leaders are "children." It matters little that Holbrooke has a reputation himself as an egotist and a screamer who demands his way. (Among diplomats back in the 1990s when he was negotiating in the former Yugoslavia, one joke went: What's the most dangerous place in the Balkans? The answer: Between Dick Holbrooke and a camera.)
Packer reports Holbrooke's disappointment over the amount of aid Congress is ponying up for Pakistan ($7.5 billion) and, to add to his set of frustrations, there's this: "Because of Pakistan's sensitivity about its sovereignty, he had been unable to persuade its military to allow American helicopters to bring aid to the refugees," who had been driven from the Swat Valley by the Taliban and a Pakistani military offensive.
Let's think about that for a moment, especially since it's a commonplace of American reporting from the region and so reflects official thinking on the subject. Karen DeYoung and Pamela Constable, for instance, write in a Washington Post piece: "Pakistanis, who are extremely sensitive about national sovereignty, oppose allowing foreign troops on their soil and have protested U.S. missile attacks launched from unmanned aircraft against suspected Taliban and al-Qaeda targets inside Pakistan." In fact, let's reverse the situation.
Imagine that, after the next Katrina, Pakistani military helicopters based on a Pakistani aircraft carrier in the Gulf of Mexico are preparing to deliver supplies to New Orleans. Of course, you also have to imagine, minimally, that the Pakistanis are in the process of building a three-quarters of a billion dollar fortress of an embassy in Washington D.C. (to be guarded by armed Pakistani private contractors), that Pakistani drones are regularly cruising the Sierra Nevada mountains, launching missiles at residences in small towns below, that the Pakistanis are offering billions of dollars in desperately needed aid to a hamstrung American government and military in return for not complaining too much about whatever they might want to do in the United States, that top Pakistani military and civilian officials are constantly shuttling through Washington demanding "cooperation," and finally that Pakistani reporters covering all this regularly point to an "extreme American sensitivity about national sovereignty," as illustrated by a bizarre unwillingness to accept Pakistani aid delivered in Pakistani military helicopters. Then again, you know those Americans: combustible as spoiled kids.
Such reversals are, of course, inconceivable and so, nearly impossible to imagine. Today, were a Pakistani military helicopter to approach the U.S. coast with anything on board and refuse to turn back, it would undoubtedly be shot down. So much for American touchiness.
But here's a question that comes to mind: Why is it that Americans like Holbrooke seem to feel so at home so far away from home? Why, for instance, do U.S. military spokespeople so regularly refer to our indigenous enemies in Iraq as "anti-Iraqi forces," and in Afghanistan as "anti-Afghan forces"? Why does our military in Iraq speak of the neighboring Iranians as "foreign forces" without ever including our own military in that category?
Resistant as Washington may be to the thought, the obvious has recently been crossing some influential minds. Amid the debate over war options -- more troops, more training of the Afghan military and police, more drone attacks in Pakistan, or some mix-and-match version of all of the above, but certainly not a withdrawal from the country -- it has become more common to express concern that deploying up to 40,000 more U.S. troops might create too big an American "footprint." As Peter Baker and Thom Shanker of the New York Times wrote in a profile of Robert Gates, the secretary of defense "has repeatedly declared his concern that more troops would make Americans look increasingly like occupiers."
After almost eight years of war, only now does the danger that we might "look increasingly like occupiers" rise to the surface. Since "occupier" is a role Americans just can't imagine occupying, let's consider a fantasy alternative instead, one perhaps easier to imagine: What if it turns out that we are the Martians?
Crushing the Rabbits
The first Martian invasion of this planet -- they landed near the town of Woking in England and, before they were done, laid waste to London -- took place in 1898, thanks to the Tasmanians, and if you don't think that's worth considering more than a century later, think again. In fact, General McChrystal, President Obama, Proconsul Holbrooke, as you're doing your reassessments of the Afghan War, do I have a book for you.
I was perhaps 12 years old when I first read it -- under the covers by flashlight long after I was supposed to be asleep -- and it scared the hell out of me. Even now, when alien invasion plots are a dime a dozen, I have a hunch that it could do the same for you. I'm talking, of course, about H.G. Wells's The War of the Worlds. If you remember, that other Welles, Orson, successfully redid it in a 1938 radio version in which the fictional Martians landed in New Jersey, and many perfectly real New Yorkers were reportedly unnerved. (The 2005 Steven Spielberg movie version, the second film made from Wells's classic, had all the expectable modern pyrotechnics, but none of the punch of the book.)
Back in the era when Wells wrote his book, invasion novels were already commonplace in England, with the part of the implacable, inhuman invader normally played by the Germans. Wells, on the other hand, almost single-handedly created the alien invader genre, arming his brainy monsters from the dying planet Mars with poison gas and a laser-like heat ray, and then supplying them with giant walking tripods (think elevated tanks without treads) -- all prefiguring the weaponry of the world wars to come (and even of wars beyond our own).
However, nothing in the book -- not the weaponry, not even the destruction -- is more terrifying than the attitude of the Martians ("intellects vast and cool and unsympathetic"), for this is one of the great role-reversal novels of all time. They are implacable exactly because they see the English as we would see rabbits, or as English colonists in Australia did indeed see the Tasmanians, a people they all but exterminated with hardly a twinge of regret. In fact, that's where The War of the Worlds evidently began. It seems that Wells's brother Frank brought up the extermination of the Tasmanians one day and so launched the idea for a book still in print 111 years later. Evidently, the question that came to Wells's mind was this: What if someone arrived in England with the same view of the superior English that the English had had of the Tasmanians, and the sort of advanced weaponry and technology capable of turning that attitude into a grim reality?
As his unnamed central character comments in the first pages of the novel: "The Tasmanians, in spite of their human likeness, were entirely swept out of existence in a war of extermination waged by European immigrants, in the space of fifty years. Are we such apostles of mercy as to complain if the Martians warred in the same spirit?"
The Martians (actually transmogrified Englishmen) advance through the English countryside and into London, frying everything in sight in a version of what, in the next century, would come to be known as total war -- that is, war visited not just on the warriors, but on the civilian population. At the same time, they harvest humans and feed off their blood. In the coming century, there would indeed be Martians aplenty on this planet, more than ready to feed off the blood of its inhabitants.
General McChrystal, President Obama, Proconsul Holbrooke, The War of the Worlds, old as it is, offers a rare example of how to imagine us from the point of view of them. I urge you to study it with the intensity you now apply to counterinsurgency and counterterrorism strategies. After all, in our own way, we could be considered the Martians of the twenty-first century and (how typical!) we don't even know it.
Unlike Wells's Martians, who arrived on this planet without a propaganda department or a care in the world about English "hearts and minds," we landed in Afghanistan talking a people-friendly game, and we've never stopped, even if much of the palaver has been for home consumption. And yet during the first eight years of our Afghan War, as General McChrystal recently admitted in his 66-page report to the secretary of defense, we could hardly have exhibited a more profound ignorance of the Afghan world, or a more Martian lack of interest in finding out about it, even as we were blowing Afghans away.
Now, the Pentagon is attempting to correct that by setting up a new intelligence unit "to provide military and civilian officials in Afghanistan with detailed analysis of the country's tribal, political and religious dynamics." As Robert Dreyfuss of the Nation's Dreyfuss Report points out, however, this unit will be based at a center in Tampa, Florida; we will, that is, now study the Afghans as anthropologists might once have studied the Trobriand Islanders. Then we will process that information thousands of miles away, just as our "pilots" do.
Perhaps it's time to study ourselves instead. What if, from an Afghan point of view, we really are Wells's Martians? Then, it's not a matter of counterinsurgency versus counterterror, or more American troops versus more American-trained Afghan ones, or even nation-building versus stabilization. What if -- and this is an un-American thought -- there is no American solution to Afghanistan? What if no alternative, or combination of alternatives, will work? What if the only thing Martians can effectively do is destroy -- or leave? (Remember, even Wells's aliens finally and involuntarily chose to abandon their occupation of England. They died, thanks to bacteria to which they had no immunity.)
What if the Afghans will never see those Predators -- our equivalent of the Martian "tripods" and death rays combined -- as their protectors? After all, our drones represent the technologically advanced, the alien, and the death-dealing along with, as Toronto Sun columnist Eric Margolis wrote recently, the whole panoply of our "B-1 heavy bombers, F-15s, F-16s, F-18s, Apache and AC-130 gunships, heavy artillery, tanks, radars, killer drones, cluster bombs, white phosphorus, rockets, and space surveillance." Even our propaganda, dropped from the air (as if from another universe), can kill. Recently, an Afghan girl died after being hit by a box of propaganda leaflets, released from a British plane, that "failed to come apart." Her heart and mind may be stilled, but rest assured, those of her parents, her relatives, and others who knew her, undoubtedly aren't.
Here's a little exchange, as reported at a New York Times blog from an alien "encounter" in another land. A U.S. Army major, Guy Parmeter, had it near Samara in Iraq's Salahuddin province in 2004 ("[I]t made me think: how are we perceived, who are we to them?"):
Maj. Guy Parmeter: "Seen any foreign fighters?"
Iraqi farmer: "Yes, you."
Sometimes it takes 66 pages to report on a war. Sometimes a century-old novel can do the trick. Sometimes you can write tomes about the "mistakes" made in, and the "tragedy" of, an American counterinsurgency war in a distant land. Sometimes a simple "yes, you" will do.
[Note on sources and resources: I thought I might mention several websites that I read avidly and rely on in writing pieces like this one, starting with Robert Dreyfuss's invaluable work at his Dreyfuss Report blog at the Nation magazine. On Iran, Afghanistan, and Iraq, it should not be missed. In addition, there are my long-term favorites: Antiwar.com (and Jason Ditz's regular news summaries there); Juan Cole's Informed Comment website -- always a must read, but lately he's been producing remarkable columns day after day; and the War in Context, another website I simply couldn't do without. I also find Noah Shachtman's Danger Room blog (http://www.wired.com/dangerroom/) at Wired magazine of special interest on military matters. On the Afghan War, check out Robert Greenwald's Rethink Afghanistan (and his striking new film of the same name), as well as the Af-Pak Channel and its "daily brief" newsletter. Finally, a small bow to Michael Maddox who, in a letter to the New York Times, brought Major Parmeter's exchange to my attention.]
Posted on: Friday, October 9, 2009 - 09:45