Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes excerpts by political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Tabsir (1-31-08)
Tomorrow I am scheduled to teach a class on the political advice of Niccolo Machiavelli, some five centuries removed. If this noted Florentine were alive today, he would probably display an unseemly Italian gesture at the political disarray of his beloved Italy and the ineptness of the world’s remaining superpower’s involvement in his geographic Orient. Can you imagine this advice in an updated edition of The Prince? When in doubt or unwilling to act, form a study group. Just over a year ago we had the highly touted and now conveniently shelved Iraq Study Group. The media touted the prominent bipartisan members, the report was available free to the public, and the sitting President more or less brushed aside any recommendation that did not flatter him. This would no doubt please Machiavelli’s realism: he just surged ahead, sending more troops rather than admitting a flawed policy in the first place.
Each day the news media report suicide bombings, now more commonly in Pakistan and Afghanistan. For some this might mean the surge is working. But what about the surge in violence outside Iraq, especially in Afghanistan, the place where it all started? Somebody forgot to form a study group for Afghanistan, but now we have one. The Center for the Study of the Presidency has just issued the report of its Afghanistan Study Group, chaired by a former Marine Corps general and a former ambassador. And who could naysay the lofty goal:
“The Afghanistan Study Group will convene a series of working sessions in each topic in order to identify recommendations and milestones that would substantively aid in the economic and political development of the region.”
After all, as Secretary of State Condoleezza Rice is quoted as saying by the Study Group, “An Afghanistan that does not complete its democratic evolution and become a stable, terrorist-fighting state is going to come back to haunt us.” So the Afghanistan Study Group combed the old haunts of the think tanks, collected a number of study-group-savvy men, read the tea leaves, and concluded that Afghanistan is on the brink of becoming a “failed state.”
So what is the overarching recommendation in the report, apart from the points obvious to just about anyone before the study began? More study.
“For that purpose, the Study Group proposes to establish an Eminent Persons Group to develop a long-term, coherent international strategy for Afghanistan and a strategic communications plan to garner strong public support for that strategy.”
The bottom line appears to be as follows: President Bush and his administration have failed by coupling Afghanistan with Iraq; NATO has so far failed by not having enough troops; and egregious Taliban-backed suicide bombings continue to cripple the feeble and corrupted development efforts. So bring in the “Eminent Persons” for more study. But who could be more eminent than Afghans themselves, and who could have more at stake than the current leaders in Afghanistan? Have they asked for another report, in English no less? Do we really need more commissions and more reports and more of the same talk? One year from now, will the Afghanistan Study Group report have any more resonance than the one issued with patriotic fanfare by the Iraq Study Group only a year ago?
What more needs to be studied? The political history of Afghanistan, since the Neolithic, has been well documented by scholars. Certainly the long U.S. covert support for Islamic freedom fighters against the godless Communists generated lots of government reports. Afghans themselves have written about the events destroying their country and what to do. Take a country that has never in history been unified, except in name under the most coercive force; destroy its fledgling infrastructure through years of internal and external strife; allow its people to remain largely illiterate and uneducated apart from madrasa ABCs; feed the corrupt bureaucracy with a massive influx of foreign aid; and demand that Afghans embrace a democracy imposed from the outside. What part of this scenario has not been studied? The main thing that needs to be studied is why there is such a need for more studies.
But the Eminent Persons Group is not about studying, certainly not in an academic fact-finding sense, as much as it is about political posturing. If our government officials currently making the decisions are not “eminent,” then the buck should stop there. Gathering a posse of former ambassadors and political functionaries who only seem to become eminent when they retire from government service allows the buck not to stop anywhere. Study groups do not solve problems; they mainly provide temporary cover. Blue-ribbon commissions on issues of war seldom do anything to stop red-blooded individuals from being killed.
Read the Afghanistan Study Group report, but also look at the day’s news. “A suicide bomber has killed the deputy governor of Afghanistan’s southern Helmand province, along with five other people. Another 18 people were wounded in the attack, when the bomber exploded among worshipers at a mosque in the town of Lashkar Gah on Thursday,” notes Al-Jazeera.
All the study group reports along the Beltway will not bring any of these victims back to life. Nor will this report stop the death toll from surging forward tomorrow.
Posted on: Thursday, January 31, 2008 - 16:07
SOURCE: New Republic (1-29-08)
If the mixed results in the early Republican primaries--a Huckabee here, a McCain or Romney there--portend a split between the GOP's religious, fiscally conservative, and security-state wings, it won't be the first time a national American political coalition has failed. But it will be the third time in a hundred years that an apparently strong Republican majority cracked up due to the party's inability to govern. By contrast, Democratic coalitions have failed mostly because the party has overreached after governing successes.
In the midst of an economic depression, the Republican Party assembled a presidential majority in 1896 for William McKinley and his conservative platform. McKinley won despite the revolt of many traditionally Republican western states, whose citizens believed the party's elite had grown too cozy with industrial and financial leaders, while leaving the stricken farmers of the heartland in the cold.
Within months of McKinley's election, the depression began to lift (although owing to factors outside of his or any American's control, including a worldwide increase in the supply of gold). McKinley then swept to an even more secure victory in 1900, bringing much of the West back into the fold, partly by putting Theodore Roosevelt onto the ticket.
With McKinley, the Republican Party shifted away from its post-Civil War habit of bludgeoning the South, and McKinley ran as a candidate of sectional reconciliation. He wooed the South with symbolic gestures, like declaring that their soldiers had demonstrated "American valor" in battle and taking federal responsibility for Confederate war graves. He wooed the West with promises of renewed prosperity under his tariff and monetary policies. And Roosevelt's subsequent presidency--he took 56 percent of the popular vote in 1904--appeared to show that the Republicans could campaign and govern as a truly national party.
But the seeming solidity of this coalition concealed real divisions, owing largely to the Republicans' unwillingness to give Westerners what they demanded. Out there in the new states, voters began agitating for and adopting democratic measures--women's suffrage; initiative, referendum, and recall; and ways to popularly elect Senators and presidential candidates. Mere national prosperity, unevenly spread as it was and almost never trickling down to farmers, wasn't going to satisfy them. They actually wanted to take part in the country's government and change it for themselves.
Roosevelt made the right noises in response to this stirring insurgency--as one observer dryly wrote, "He smote with many a message." But, since he was a Republican beholden to eastern industry, he could do little more than talk, and he did that very carefully. As another student of Rooseveltiana more acutely noted, he was "the greatest concocter of 'weasel' paragraphs on record." This was not to say that Roosevelt accomplished nothing--rather, that he accomplished just enough to contain the forces of the western insurgency. And, for what it's worth, the Republican Party might have enjoyed an entirely different history had he sought renomination in 1908.
Roosevelt's successor, William Howard Taft, couldn't weasel charmingly enough for an electorate increasingly dissatisfied with Republican complacency. In 1910, the Democrats took Congress. Roosevelt tried to push his party back in his direction, and when that failed, he led a third-party movement in 1912 that put Woodrow Wilson into the White House, along with a Democratic House and Senate.
It was a democratic as well as a Democratic revolution, as constitutional amendments brought the direct election of Senators and women's suffrage. Moreover, through the early 1910s, Congress enacted legislation that responded materially, rather than rhetorically, to constituents' economic concerns, passing laws to restrict working hours, levy an income tax, lower tariffs, and regulate banking by creating the Federal Reserve System--all measures that have, on balance, endured as methods of governing the United States.
Then they went too far: Not only did Wilson lead the nation into war despite campaigning on the promise not to, but he also created an immense domestic propaganda machine to promote the military effort and silence dissent. The Republicans took Congress in 1918 and, after a disillusioning conclusion to war, the presidency in 1920....
Posted on: Wednesday, January 30, 2008 - 16:25
SOURCE: Claremont Review of Books (12-14-07)
Preoccupied with the daily news from Baghdad, we seem to think our generation is unique in experiencing the heartbreak of an error-plagued war. We forget that victory in every war goes to the side that commits fewer mistakes—and learns more from them in less time—not to the side that makes no mistakes. A perfect military in a flawless war never existed—though after Grenada and the air war over the Balkans we apparently thought otherwise. Rather than sink into unending recrimination over Iraq, we should reflect about comparable blunders in America's past wars and how they were corrected. Without such historical knowledge we are condemned to remain shrill captives of the present....
[For example: Poor Leadership]
Have there ever been lapses in military leadership like the ones that purportedly mar our Iraq effort? The so-called "Revolt of the Generals" against Secretary of Defense Donald Rumsfeld was nothing compared to the "Revolt of the Admirals" that led to Secretary of Defense Louis Johnson's forced resignation in the midst of the bitter first year of the Korean War. Johnson himself had come to office following the removal (or resignation), and then probable suicide, of Secretary James Forrestal, whose last note included a lengthy quotation from Sophocles' Ajax. Johnson's successor, the venerable General George Marshall, lasted less than a year—hounded out by Joseph McCarthy, and an object of furor in the wartime 1952 election that brought in Eisenhower (who did not defend his former superior from McCarthy's slanders). The result was that four different secretaries of defense—Forrestal, Johnson, Marshall, and Robert Lovett—served between 1949 and 1951.
Critics of the Iraq war wonder how a workmanlike Lieutenant General Ricardo Sanchez, on whose watch Abu Ghraib occurred, had obtained command of all coalition ground forces in the first place, and later why General George Casey persisted in tactics that were aimed more at downsizing our forces than going after the enemy and fighting a vigorous war of counterinsurgency. But surely these armchair critics can acknowledge that such controversies over personnel pale in comparison to past storms. Lincoln serially fired, ignored, or bypassed mediocrities like Generals Burnside, Halleck, Hooker, McClellan, McDowell, Meade, Pope, and Rosecrans before finding Grant, George Thomas, Sherman, and Philip Sheridan—all of whom at one time or another were under severe criticism and nearly dismissed.
World War II was little better. By all accounts General John C.H. Lee set up an enormous logistical fiefdom in Paris that thrived on perks and privilege while American armies at the front were short on manpower, materials, and fuel. To this day military historians cannot quite fathom how and why Major General Lloyd Fredendall was ever given an entire corps in the North Africa campaign. His uninspired generalship led to the disaster at the Kasserine Pass and his own immediate removal. Lieutenant General Simon Bolivar Buckner, a competent officer, was bewildered by the unexpected Japanese resistance on Okinawa, and unimaginatively plowed head-on through fortified enemy positions—until killed in action on the island, the most senior-ranking officer to die by enemy fire in World War II. The generalship of Mark Clark in Italy was often disastrous.
The story of the U.S. army at war is one of sacking, sidetracking, or ostracizing its highest and best-known commanders in the field—Grant after Shiloh, Douglas MacArthur in Korea, Patton in Sicily, or William Westmoreland in Vietnam—for both good and bad reasons. Iraq and Afghanistan are peculiar in that there have been so few personnel changes, much less a general consensus about perceived military incompetence. In comparison to past conflicts, the wonder is not that a gifted officer like David Petraeus came into real prominence relatively late in the present war, but that his unique talents were recognized quickly enough to allow him the command and latitude to alter the entire tactical approach to the war in Iraq....
Posted on: Wednesday, January 30, 2008 - 15:06
SOURCE: Huffington Post (Blog) (1-29-08)
Ted Kennedy's statement Monday that Obama was like JFK set off a storm of historical analogies. Hillary's side fired back that she is like Bobby Kennedy - at least that's what three of Bobby's kids said the next day: "Like our father, Hillary has devoted her life to embracing and including those on the bottom rung of society's ladder," Kathleen Kennedy Townsend, Robert F. Kennedy Jr. and Kerry Kennedy declared.
Hillary herself claimed not so long ago that SHE is our JFK: "A lot of people back then said, 'America will never elect a Catholic as president,' " she said in New Hampshire last March. "When people tell me 'a woman can never be president,' I say, we'll never know unless we try." And of course she also compared herself to LBJ, whose political skills, she said, made it possible for him to sign into law what she called "Dr. King's dream."
Nicholas Kristof of the New York Times compared Obama to Lincoln (both were undistinguished newcomers when they ran for president). Paul Krugman of the New York Times compared Hillary to Grover Cleveland (both were conservative Democrats in a Republican era). Biographer Joseph Ellis compared Obama to Thomas Jefferson (both spoke in favor of nonpartisan politics).
Sorting out these claims is, of course, a job for professionals - professional historians. They too are partisans. The only organized political group of historians in this campaign is "Historians for Obama," which includes Joyce Appleby, former president of the American Historical Association; Robert Dallek, the award-winning presidential biographer; David Thelen, former editor of the Journal of American History; and the Pulitzer Prize-winning Civil War historian James McPherson.
Their statement made some sweeping analogies: "Lincoln signed the Emancipation Proclamation and kept the nation united; Franklin D. Roosevelt persuaded Americans to embrace Social Security and more democratic workplaces; John F. Kennedy advanced civil rights and an anti-poverty program. Barack Obama has the potential to be that kind of president."
On the other side, there is no historians-for-Hillary organization, but there is Sean Wilentz--the Princeton professor and award-winning author of The Rise of American Democracy: Jefferson to Lincoln who testified for the defense at the Clinton impeachment hearing. He recently took on the key Obama analogies in an LA Times op-ed. First, he said, Obama is no JFK: "By the time he ran for president, JFK had served three terms in the House and twice won election to the Senate," Wilentz wrote. "Before that, he was, of course, a decorated veteran of World War II, having fought with valor in the South Pacific."
And to compare Obama to Lincoln, Wilentz says, is "absurd": "Yes, Lincoln spent only two years in the House," but in 1858, when he ran unsuccessfully for the Senate, Lincoln "engaged with Stephen A. Douglas in the nation's most important debates over slavery before the Civil War."
On the other hand, Robert Dallek, author of biographies of LBJ and Kennedy, has explained that the appeal of JFK in 1960 has clear parallels to Obama's campaign today: "it's the aura, it's the rhetoric, the youthfulness, the charisma," he told the Chicago Tribune blog "The Swamp."
Then there is the Lincoln analogy. Eric Foner, the former American Historical Association president and author of Reconstruction, points out that, in 1860, the Republicans had to choose between two candidates: one who claimed decades of experience in politics, the other with much less, who won support because his oratory was so inspiring and he was deemed more electable. In 1860, the candidate with experience lost the nomination to Lincoln; he was William H. Seward. That makes it fair to say that Hillary could be our Seward.
Posted on: Wednesday, January 30, 2008 - 02:31
SOURCE: Washington Monthly (1-1-08)
... In light of the failures of the Bush presidency, Barack Obama recently remarked that the president’s job should be “to set a vision,” not “to run some bureaucracy.” Senator Obama seems to prize what President Bush’s father, who had a fine and firm grasp of the intricacies of foreign policy, once famously belittled as “the vision thing.” Obama claims he can execute his vision through an ability to “tease” policy details from his advisors. Senator Hillary Rodham Clinton, meanwhile, has countered that presidents “have to be able to manage and run the bureaucracy,” as well as to set a vision for government.
Other candidates, in both parties, will likely offer other conceptions of the ideal presidency in the coming weeks and months. All seem to agree that, as one of Republican Mitt Romney’s campaign slogans puts it, “Washington is broken.” But how they propose fixing Washington will depend on their ideas about what the job of president entails. Although it’s possible that the candidates may govern differently than they present themselves on the campaign trail, much as President Bush has, presidents’ ingrained notions of what the executive office should be have, in the past, aptly foreshadowed the kind of administrations they led. Voters should be able to compare the ideas of today’s candidates to the three major styles of presidential leadership of the past century, and determine for themselves what kind of leadership is best suited to current times.
The first model is the strong presidency. Strong presidents have come from both parties. They have been hands-on executives who have commanded the executive branch and decided what its priorities ought to be. Typically, these presidents have had considerable experience in Washington, often specifically in the executive branch, before they reach the Oval Office. While in the White House, they have applied this expertise to lead the country in what they believe is the right direction. Building on a tradition that originated during the presidency of Andrew Jackson, advocates of the strong presidency have claimed that the president, as the one federal officer (apart from the vice-president) who is elected by the people at large, is a singular carrier of the people’s trust and ought to be the driving force at the center of government.
Franklin Delano Roosevelt, who had been Assistant Secretary of the Navy during World War I, was the most distinguished of the modern strong presidents, and his expertise and conviction carried the nation through the Great Depression and then, after 1941, helped him become the self-described “Dr. Win-the-War.” Roosevelt’s successor, former Vice President Harry S. Truman – with his famous declaration “The Buck Stops Here” – established the modern notion that the president bears ultimate responsibility for the federal government’s successes and failures. Dwight D. Eisenhower, after serving as supreme allied commander in World War II, governed with what the presidential scholar Fred I. Greenstein has called “a hidden hand,” but was very much the supreme commander of his administration. Richard M. Nixon, another vice president who became a strong president, achieved several important reforms – not least with respect to redirecting superpower politics abroad – before his taste for unbridled power led to the creation and downfall of an imperial presidency in defiance of the Constitution.
The second model has been the advisory presidency, in which the chief executive relies on his counselors for basic information and guidance on major policy decisions. Warren G. Harding was one such president, whose cunning advisors exploited him as an unwitting front-man for corruption. Lyndon B. Johnson, a strong president when it came to promoting civil rights and the Great Society, took a disastrous turn when he deferred to the counsel of his Pentagon advisers on the topic of Vietnam. ...
With the federal bureaucracy in disrepair, military resources stretched to the breaking point, America’s diplomatic clout in the world at ebb tide, and the domestic economy sliding into recession, the nation has reached a turning point – one in which considerations of performance should utterly overshadow candidates’ personalities. In choosing our next president, we face a choice over conceptions of presidential power in a time of national crisis.
Posted on: Wednesday, January 30, 2008 - 02:24
SOURCE: Jerusalem Post (1-30-08)
Startling developments in Gaza highlight the need for a change in Western policy toward this troubled territory of 1.3 million persons.
Gaza's contemporary history began in 1948, when Egyptian forces overran the British-controlled area and Cairo sponsored the nominal "All-Palestine Government" while de facto ruling the territory as a protectorate. That arrangement ended in 1967, when the Israeli leadership defensively took control of Gaza, reluctantly inheriting a densely populated, poor, and hostile territory.
Nonetheless, for twenty years Gazans largely acquiesced to Israeli rule. Only with the intifada beginning in 1987 did Gazans assert themselves; its violence and political costs convinced Israelis to open a diplomatic process that culminated with the Oslo accords of 1993. The Gaza-Jericho Agreement of 1994 then off-loaded the territory to Yasir Arafat's Fatah.
Those agreements were supposed to bring stability and prosperity to Gaza. Returning businessmen would jump-start the economy. The Palestinian Authority would repress Islamists and suppress terrorists. Yasir Arafat proclaimed he would "build a Singapore" there, actually an apt comparison, for independent Singapore began inauspiciously in 1965, poor and ethnically conflict-ridden.
Of course, Arafat was no Lee Kuan Yew. Gazan conditions deteriorated and Islamists, far from being shut out, rose to power: Hamas won the 2006 elections and in 2007 seized full control of Gaza. The economy shrank. Rather than stop terrorism, Fatah joined in. Gazans began launching rockets over the border in 2002, increasing their frequency, range, and deadliness with time, eventually rendering the Israeli town of Sderot nearly uninhabitable.
Faced with a lethal Gaza, the Israeli government of Ehud Olmert decided to isolate it, hoping that economic hardship would cause Gazans to blame Hamas and turn against it. To an extent, the squeeze worked, for Hamas' popularity did fall. The Israelis also conducted raids against terrorists to stop the rocket attacks. Still, the assaults continued; so, on January 17, the Israelis escalated by cutting fuel deliveries and closing the borders. "As far as I'm concerned," Olmert announced, "Gaza residents will walk, without gas for their cars, because they have a murderous, terrorist regime that doesn't let people in southern Israel live in peace."
That sounded reasonable but the press reported heart-rending stories about Gazans suffering and dying due to the cutoffs, and these immediately swamped the Israeli position. Appeals and denunciations from around the world demanded that Israelis ease up.
[Photo: Gazans crossing into Egyptian territory on January 23 through a breach in the 13-meter-tall fence.]
Israelis had brought themselves to this completely avoidable predicament through incompetence – signing bad agreements, turning Gaza over to the thug Arafat, expelling their own citizens, permitting premature elections, acquiescing to the Hamas conquest, and abandoning control of Gaza's western border.
What might Western states now do? The border breaching, ironically, offers an opportunity to clean up a mess.
Washington and other capitals should declare the experiment in Gazan self-rule a failure and press President Hosni Mubarak of Egypt to help, perhaps providing Gaza with additional land or even annexing it as a province. This would revert to the situation of 1948-67, except this time Cairo would not keep Gaza at arm's length but take responsibility for it.
Culturally, this connection is a natural: Gazans speak a colloquial Arabic identical to the Egyptians of Sinai, have more family ties to Egypt than to the West Bank, and are economically more tied to Egypt (recall the many smugglers' tunnels). Further, Hamas derives from an Egyptian organization, the Muslim Brethren. As David Warren of the Ottawa Citizen notes, calling Gazans "Palestinians" is less accurate than politically correct.
Why not formalize the Egyptian connection? Among other benefits, this would (1) end the rocket fire against Israel, (2) expose the superficiality of Palestinian nationalism, an ideology under a century old, and perhaps (3) break the Arab-Israeli logjam.
It's hard to divine what benefit American taxpayers have received for the US$65 billion they have lavished on Egypt since 1948; but Egypt's absorbing Gaza might justify their continuing to shell out $1.8 billion a year.
Posted on: Tuesday, January 29, 2008 - 19:05
SOURCE: Financial Times (1-25-08)
The picture is, as usual, especially bleak in Africa, where two erstwhile democratic role models find themselves in serious difficulty. Only five years ago, Mwai Kibaki’s election as president was supposed to mark a new dawn for Kenya after 24 long years of misrule by Daniel arap Moi. But now allegations that Kibaki in effect stole last month’s presidential election from the opposition leader Raila Odinga have unleashed bloody ethnic conflict between Kikuyus and other tribes.
The problem in South Africa is not violent (as yet) but it is equally troubling. There, the African National Congress has chosen as its new leader, and therefore the country’s most likely next president, a man who currently faces serious corruption charges involving payments of more than R4m. Already, some of Jacob Zuma’s more radical supporters are warning that there will be “blood spilt in the courtroom” if he is convicted. It is not without significance that Zuma is a Zulu, while his arch-rival Thabo Mbeki is a Xhosa.
In Asia, too, democracy is in retreat. Benazir Bhutto’s assassination in Pakistan on December 27, two weeks before elections were due to be held there, has significantly reduced the chances of a peaceful transition from military rule back to democracy. In Thailand, the generals are still in power 16 months after staging a coup against Thaksin Shinawatra (another democratically elected leader facing accusations of corruption). Meanwhile, a much nastier military junta continues to rule Burma with the mailed fist, having crushed last summer’s protests by political dissidents and Buddhist monks. It is scarcely worth adding that the prospects for democracy in the world’s most populous country look little brighter. The Chinese Communist party shows no sign of wanting to relinquish its monopoly on power.
To be sure, communist rule is a thing of the past in the territory of the former Soviet Union. But Time magazine’s Person of the Year, Vladimir Putin, is making a mockery of the Russian constitution by, in effect, handing the presidency to one of his own sidekicks, who intends to appoint Putin as prime minister. Nor is Russia the only former Soviet Republic slipping back into old autocratic habits. In Kyrgyzstan, last month’s elections were condemned by international observers. Kazakhstan is little more than an Oriental despotism; the same goes for Tajikistan, Turkmenistan and Uzbekistan. Even Georgia’s “Rose Revolution” seems to be withering fast.
Latin America offers some consolations, though it still remains to be seen if Venezuela’s Hugo Chávez will really accept the unexpected defeat he was handed in last month’s referendum on constitutional “reform”. As for the greater Middle East, the Bush administration’s bid to spread democracy at gunpoint has proved far more costly in lives, money and time than almost anyone in Washington envisaged five years ago. Despite the success of the recent military “surge”, Iraq continues to teeter on the verge of civil war. Afghanistan is little better.
It was not supposed to be like this. Nearly 20 years ago, on the eve of the fall of the Berlin Wall, Francis Fukuyama published a seminal essay, “The End of History”, in which he prophesied “the end point of mankind’s ideological evolution and the universalization of Western liberal democracy as the final form of human government”.
In fairness to Fukuyama, he was writing after more than a decade of sustained improvement in global governance. In the mid-1970s, roughly half the world’s states could be classified as “autocracies”. By 1989 the number had very nearly halved. And the trend continued much as Fukuyama foresaw – by 2002 it was down to fewer than 30. In its 1998 report, the International Institute for Democracy and Electoral Assistance was able to announce that, for the first time, a majority of the world’s population were living in democracies. There really did seem to be a democratic wave, beginning in the Iberian peninsula in the mid-1970s, spreading to Latin America and parts of Asia in the 1980s and sweeping eastwards from central Europe in 1989-91. All Fukuyama did was surf it.
The trouble with waves is that sooner or later they break....
Why does democracy flourish in some countries, but shrivel and die in others? The simplest answers on offer are economic. According to the political scientist Adam Przeworski, there is a straightforward relationship between per capita income and the likelihood that a democracy will endure. In a country where the average income is below $1,000 a year, democracy is unlikely to last a decade. Once average income exceeds $6,000 a year, it is practically indestructible. This certainly seems plausible at first sight. The countries with the maximum Freedom House scores are, with the exception of Barbados, the rich countries of north-western Europe. The countries with the lowest scores include some of Africa’s poorest....
The key to spreading democracy is clearly not just to overthrow undemocratic regimes and hold elections. Nor is it simply a matter of waiting for a country to achieve the right level of income or rate of growth. The key, as Stanford political scientist Barry Weingast has long argued, is to come up with rules that are “self-enforcing”, so that the more they are applied, the more respected they become, until at last they become inviolable.
There is no reason why that should not be possible in any of the world’s civilisations. As the British example makes clear, however, it can (and probably must) be a very protracted process. And that is precisely why it would be rash, after a few bad years, to prophesy the death of democracy – as rash as it was to predict its triumph after a few good ones.
Posted on: Tuesday, January 29, 2008 - 18:15
SOURCE: Informed Comment (Blog) (1-29-08)
When Bush first came in, the comedian Will Ferrell did a skit on the television show "Saturday Night Live" that depicted the president cowering under his desk as bombs went off in Washington and the country went down the tubes. Coming after the prosperity and relative peace of the Clinton years, it seemed a fantastic parody. Little did we know that, if anything, SNL did not begin to capture the full extent of the catastrophe.
Nobody cares any more, unlike in 2003 when shills for the war were always on my case to "report the good news" and lay off Bush. Some of my "arguments with Bush" during the past 7 years were internet bestsellers. Now, the man has discredited himself so badly, he can't even get people to so much as yawn at him. But in honor of all those arguments of the past, I'm doing it one last time.
As usual, most of what he said in the State of the Union address was transparent lies. He praised private groups for doing charity work in Louisiana because he hasn't followed through on his own promises after Katrina. He did that phony thing of reporting the average tax "increase" if his "tax cuts" were allowed to expire. If I'm in the room with someone who made a billion dollars last year and Bush doesn't cut my taxes at all but he cuts those of the billionaire such that he saves 5% of his income, then the two of us in the room have an average tax cut of $25 million apiece. But in the real world, I get bupkus and the billionaire gets $50 million. That shell game sums up the Republican "tax cut" scam they keep running on the American middle class, which always falls for it.
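[To make the arithmetic of that hypothetical explicit, using the author's own figures: a 5% saving on a $1,000,000,000 income is $50,000,000, so the average cut for the two people in the room is ($50,000,000 + $0) / 2 = $25,000,000 apiece, even though one of them gets nothing at all.]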
So here are some last arguments with the man's bald-faced lies, for old times' sake.
Bush assertion:"We believe that the most reliable guide for our country is the collective wisdom of ordinary citizens."
Sad fact: Indiana GOP tries to keep ordinary citizens from voting with restrictive photo identification law.
Bush assertion:"And so, in all we do, we must trust in the ability of free peoples to make wise decisions and empower them to improve their lives for their futures."
Sad fact: Amit Paley writes, "A strong majority of Iraqis want U.S.-led military forces to immediately withdraw from the country, saying their swift departure would make Iraq more secure and decrease sectarian violence, according to new polls by the State Department and independent researchers.
In Baghdad, for example, nearly three-quarters of residents polled said they would feel safer if U.S. and other foreign forces left Iraq, with 65 percent of those asked favoring an immediate pullout . . ."
Bush assertion:"We've seen Afghans emerge from the tyranny of the Taliban and choose a new president and a new parliament."
Sad fact: "Afghanistan Journalist sentenced to Death for Blasphemy" and I don't think women would agree with Bush's rosy picture of progressive democracy in Kabul. Not to mention that half the country's gross domestic product is generated by the heroin trade. Bush goes on to say that his democratic projects are only being interrupted by terrorists; but all the problems above are problems with the establishment, not with terror groups.
Bush assertion:"From expanding opportunity to protecting our country, we've made good progress."
Sad fact: Bush's Iraq is a major generator of terrorism, which it was not before 2003. "Iraq has replaced Afghanistan as the prime training ground for foreign terrorists who could travel elsewhere across the globe and wreak havoc, according to U.S. counterterrorism officials and classified studies" by the CIA and the Department of State, Warren P. Strobel reported July 4, 2005. "Iraq's emergence as a terrorist training ground appears to challenge President Bush's rationale for invading and overthrowing leader Saddam Hussein in March 2003," Strobel wrote. So we are safer how again?
Bush assertion:"We launched a surge of American forces into Iraq. We gave our troops a new mission: Work with the Iraqi forces to protect the Iraqi people, pursue the enemy in his strongholds, and deny the terrorists sanctuary anywhere in the country."
Sad fact:"The Iraqi Red Crescent Organization and the U.N. reported last month that the “number of Iraqis fleeing their homes has soared since the American troop increase began in February. . . The chart reports some decreases in the intensity of “ethno-sectarian violence” in certain Baghdad districts (Note: This is based on military data). But where there have been decreases, they are due largely to the fact that “mixed Muslim” areas are being overrun by either Shia or Sunni enclaves.The map above demonstrates that Shias have been gradually taking over all of Baghdad (noted by the green mass that now covers much of the city), wiping out Sunni communities that stood in their path. Center for American Progress analyst Brian Katulis estimated that Baghdad, which once used to be a 65 percent Sunni majority city, is now 75 percent Shia."
A large proportion of the 1.5 million Iraqi refugees in Damascus was displaced to Syria during 2007, apparently as a side effect of Bush's troop surge.
So all this involves "protecting the Iraqi people" how, exactly? Does Bush think Iraqis are safer when they are refugees in a foreign country?
He won't be missed.
Posted on: Tuesday, January 29, 2008 - 16:03
SOURCE: LAT (1-26-08)
"God alone knows the future," Ambrose Bierce reputedly wrote, "but only an historian can alter the past." Although Bierce was undoubtedly right about historians, he should perhaps have added politicians and their ardent supporters as well.
In recent weeks, some of the presidential candidates and their surrogates have been evoking history more insistently than ever. Not surprisingly, those evocations often have been flimsy and faulty.
On the Republican side, the misuse of history has mostly centered on the presidency of Ronald Reagan; indeed, the GOP contest has at times looked like an "American Idol"-style competition over who can deliver the most convincing imitation of Reagan. At the Fox News debate on Jan. 5, the GOP candidates invoked the former president's name 34 times -- yet, on closer inspection, their evocations have more to do with nostalgia for a happier time for conservatives than with historical accuracy.
The more grievous abuses of history, though, have come from the Democrats, and particularly from the Barack Obama side, including his many avid supporters in the media and the academy. (Perhaps this is a good place to note that I am on record as a supporter of Hillary Clinton.)
Few will disagree that it is very rare for a candidate with as little experience in politics and government as Obama to capture the imagination of so many influential Americans. One way for a candidate like this to minimize his lack of experience is to pluck from the past the names of great presidents who also, supposedly, lacked experience. Early in the campaign, Obama's backers likened him to the supposed neophyte John F. Kennedy. More recently, some have pointed out (as did New York Times columnist Nicholas Kristof, among others) that Abraham Lincoln served only one "undistinguished" term in the House before he was elected president in 1860.
These comparisons distort the past beyond recognition. By the time he ran for president, JFK had served three terms in the House and twice won election to the Senate, where he was an active member of the Foreign Relations Committee. In total, he had held elective office in Washington for 14 years. Before that, he was, of course, a decorated veteran of World War II, having fought with valor in the South Pacific. Kennedy, the son of a U.S. ambassador to Britain, had closely studied foreign affairs, which led to his first book, "Why England Slept," as well as to a postwar stint in journalism.
This record is not comparable to Obama's eight years in the Illinois Legislature, his work as a community organizer and his single election to the Senate in 2004 -- an election he won against a late entrant, right-wing Republican Alan Keyes, in a state where the GOP was in severe disarray.
The Lincoln comparison is equally tortured. Yes, Lincoln spent only two years in the House after winning election in 1846. Yet his deep involvement in state and national politics began in 1832, the same year he was elected a captain in the Illinois militia -- and 28 years before he ran for president. He then served as a leader of the Illinois Whig Party and spent his far-from-undistinguished term in Congress courageously leading opposition to the Mexican War.
After returning home, he became one of the leading railroad lawyers in the country, emerged as an outspoken antislavery leader of Illinois' Republican Party -- and then, in 1858, ran unsuccessfully for the Senate and engaged with Stephen A. Douglas in the nation's most important debates over slavery before the Civil War. It behooves the champions of any candidate to think carefully when citing similarities to Lincoln's record. In this case, the comparison is absurd.
But on to the founding fathers. The historian Joseph Ellis, writing in the Los Angeles Times, likened Obama to Thomas Jefferson and James Madison, in a hazy way, as an advocate of nonpartisan politics. Yet Ellis had to sidestep what even he admitted is a large, inconvenient fact: Jefferson and Madison were not nonpartisan -- they actually founded what has evolved into the Democratic Party. Through highly selective and misleading quotations, Ellis then described them as nonpartisan at heart, ignoring Madison's recognition, in 1792, that "in every political society, parties are unavoidable," or Jefferson's pledge, as president, to sink the Federalist Party "into an abyss from which there shall be no resurrection for it."
Returning to more recent history: The Obama campaign, in asserting a supposedly innovative post-partisan politics, has endorsed a partisan Republican account of the post-Reagan years that is at odds with the facts. Obama has asserted that the GOP has been the "party of ideas" over the last 10 to 15 years -- that is, since 1993 or so. In other words: the old (and long discredited) right-wing bromides repackaged as the "Contract with America" in 1994, the Republican attack on Medicare that led to the government shutdown a year later, the endless recycling of supply-side economics (especially ironic, given the current meltdown), and the other ideological agendas pushed by Newt Gingrich, Tom DeLay, George W. Bush and Dick Cheney, have made the GOP the party of intellectual daring and innovation.
Historians cannot expect all politicians and their supporters to know as much about American history as, say, John F. Kennedy, who won the Pulitzer Prize for a work of history. But it is reasonable to expect respect for the basic facts -- and not contribute to cheapening the historical currency.
Spreading bad history is no way to make history.
Posted on: Tuesday, January 29, 2008 - 14:41
SOURCE: Britannica Blog (1-28-08)
At the January 21 South Carolina debate (the one co-sponsored by the Congressional Black Caucus, in which Barack Obama is not just a guest but a member), Hillary Clinton was asked to weigh in on whether or not her husband deserved Toni Morrison’s praise as “the first black president.” She took that opportunity (and every other one) to tie herself to that day’s celebration of Martin Luther King, Jr. It was an opportunity to redeem her faux pas of daring to suggest that MLK’s activism might have passed for naught if Lyndon Baines Johnson had not driven the Civil Rights Act through Congress. She came up with this nugget:
You have got a son of the South. You’ve got an African-American. You have a woman. What better way to celebrate the legacy of Dr. King than to look at this stage right here tonight? And, you know, I’m reminded of one of my heroes, Frederick Douglass, who had on the masthead of his newspaper in upstate New York, “The North Star,” that right has no sex and truth has no color. And that is really the profound message of Dr. King.
It looks great as a rhetorical turn. She got in broad claims of inclusiveness, tied herself every bit as closely to Dr. King’s legacy as Barack Obama can, and demonstrated that she has (gasp) more than one African-American hero. Perhaps she came up with that one off the cuff, but I suspect that her writers are not on strike.
Little comments like this one can be zingers or they can be deathly errors. In the hands of opponents willing to twist each vignette into a dozen embarrassing implications, they provide terrible weapons - ask Obama what happens when you dare suggest that Ronald Reagan and his advisors might have had “ideas.” If he were half as brutal in twisting Clinton’s King remarks as she has been with his Reagan ones, she might have lost every single African-American vote in South Carolina, but it is not clear that this opening (tempting though it must be) would profit Obama, or any Democrat, in the long run.
The deeper and more intriguing complexities of these attempts to tie the candidates to historical icons are rarely acknowledged. Clinton’s rhetorical embrace of Frederick Douglass is as complex and unintentionally illuminating as any.
Although Clinton cited Frederick Douglass to escape any lingering suspicion that she might discount the works of great black leaders, she could have cited him in defense of her earlier claims that some took as detracting from King. In his oration “In Memory of Abraham Lincoln,” Douglass offered a remarkably complex portrait of the relationship between social transformation and political power. He did not shrink from taking Lincoln to task for the many ways in which he was a lukewarm and late convert to the ideal of full equality for blacks in America. But he then proclaimed, in no uncertain terms, that it was Lincoln’s presidency that freed the slaves. After enumerating all of his shortcomings, Douglass proclaimed that Lincoln, the man who combined a practical adherence to the right ideas with the undoubted potency of presidential power, stands “at the head of [our] great movement.”
On the question of the primacy of moral and political leadership, it appears that Douglass might have been willing to give Johnson his due.
However, there is another, more disturbing complexity in the invocation of Douglass - one that has too many eerie echoes in the current debate:
At the end of the Civil War, Frederick Douglass’s long-time friendship and alliance with the leaders of the suffrage movement quickly deteriorated into an ugly argument about who deserved the vote “more.” In her editorials in “The Revolution,” Elizabeth Cady Stanton invoked the ignorance, lack of education, and lowly employments of the newly freed slaves in her no-holds-barred efforts to press the case that well-educated white women were far better suited for the vote than black men. In a particularly uncomfortable echo of what some say is the Clintons’ current strategy, Stanton suggested that blacks would prove so incapable of making informed decisions about how to vote that they would simply express racial solidarity, giving votes en bloc to candidates who might in fact be tools of their oppressors. Bill Clinton’s efforts to link Obama’s victory to those of Jesse Jackson in the 1980s, thus treating the candidate who is at least a co-frontrunner as little more than a symbolic racial favorite son, suggest that black votes are as unthinking and unthoughtful as Stanton said they would be - 150 years ago.
Douglass, in turn, lashed back at the convention of the Equal Rights Association, insisting that the oppressions suffered by African-Americans were far worse than those inflicted upon women. Susan B. Anthony, incredibly, retorted that Douglass did not know what he was talking about when he spoke of oppression.
I fear it is this little bit of Douglass’s historical drama that we are re-enacting today but with less reason and less excuse.
In the 1860s, two terribly oppressed groups turned against each other in a desperate struggle to see which one would seize something that both clearly needed and completely deserved. It is quite likely that the division of the suffrage movements in the 1860s cost both dearly. Black men inherited a franchise that was soon rendered meaningless by racist mobs who controlled much of the country, and white women waited fifty years, during which the respective fears of the two groups became the pretext for the oppression of both by southern white males.
In this incarnation, we have two ambitious, Ivy League-educated leaders trying to claim the mantle of their respective group’s oppression as a superior title to rule. They both risk exploiting real historical injustices for dubious political advantages that may in the end have little impact beyond raising the now slim chances that the Republicans retain the White House in 2008.
Is it more revolutionary to choose a black man or a white woman for a major party’s presidential nominee? It has taken too long to do either, and “first” is less important than “whether.” It looks like we will have one or the other in 2008, if they don’t manage to sabotage each other in their desperate desire to win.
If Hillary Clinton and Barack Obama consider Frederick Douglass, Susan B. Anthony, and Elizabeth Cady Stanton heroes for their lofty ideals, they should both reconsider whether expropriating the worst elements of their tactics is a very good idea.
Posted on: Monday, January 28, 2008 - 15:07
SOURCE: San Francisco Chronicle (1-25-08)
This seems to be the moment for which feminists have waited. Those of us who came out of the women's movement of the late '60s and '70s have longed for a greater presence of women in the political and public spheres. So why are we not ringing doorbells for Democratic presidential candidate Hillary Clinton? In my case, I would say it is ambivalence. I am certainly not against her. I just can't bring myself to be for her.
To be sure, I love the idea of a female president; and I appreciate the value of a role model in the office. There are moments when I writhe at the blatant misogyny of the comments about her tears, the timbre of her voice or the cut of her pantsuits. But in the end, being a woman is not enough. My generation of feminists had at least two goals: one of these was equal opportunity and equality for individual women in the home and in public. The second was, and is, more community oriented: we believed that social justice could only be achieved within a fairer and more humane world. We were pretty successful at achieving the first goal, but the second one got put on hold - not least by the climate of conservatism that prevailed in the '80s and '90s, which extolled the free market while denigrating any positive role for government in the solution of social and economic problems.
In the '90s, with a conservative ideology dominant, the mantra of "centrism" fostered by the Clintons seemed about as far as Democrats could go politically. I disliked many of the Clintons' pro-business and anti-welfare policies. But still, I admired Hillary's place as an engaged and politically active first lady; her advocacy of children; even her flawed effort to take a lead on the health care issue.
But this is a new world with new opportunities for reaching our feminist dreams. The conservative ethos of the '90s is on the wane; three-quarters of Americans, according to the latest poll, think the country is moving in the wrong direction. Neoliberal desires to "starve the beast" are in question. A large majority of our population now believes that we live in a "global" context, where we share environmental, labor and family concerns that we can resolve only by acting across national borders. More people question the value of an unrestrained free market; or say they are willing to increase taxes to pay for things such as universal health care; or debate the cost to an embattled environment of ever-increasing individual consumption.
In this context, the 2008 presidential election poses an opportunity for candidates to offer a new vision that evokes the sense of shared responsibility intrinsic to liberal democracy. Each of the three leading Democratic candidates has begun to talk about a more effective government, and one that balances the marketplace. From all three of them we hear calls for "change"; a cry to unify the nation; demands for accountability in government; a rhetoric of equality and of empathy for the economic problems of the neglected poor; an end to the war in Iraq. I like what I am hearing in general, but I'm waiting for each of them to mobilize us to reach for these goals.
I am not reassured by learning that John Edwards is the son of a mill worker, that Barack Obama is black, or that Clinton is a woman. Each of those identities reflects something about which I care deeply. But I can't vote for any of them because of who they are. Supporting Hillary because she is a woman fosters a debate about whether to place race or sex or religion at the top of our list of priorities. If I support Hillary (or any candidate) because I am drawn to her identity, I am simply encouraging others to support their candidates for the same reason. And identity is no guarantee that a particular individual will speak for feminist values and issues. Remember that Margaret Thatcher supervised the dismantling of the British welfare state, and that Clarence Thomas has routinely made judgments that have closed the gates to economic opportunity for African Americans.
As a feminist, I want a president who will inspire us to achieve at least some of the values that I care about. I want a president who will use government resources to make this a more humane and equitable society by enhancing educational opportunity and economic security for the poor and constructing health care for all. I want a president who will actively protect our civil liberties and civil rights; one who will speak loudly against the grotesque impositions of secrecy, surveillance, torture and incarceration. I want a president who will not only end this war quickly, but who will change the direction of American foreign policy in acknowledgement of the new global realities; I want a president who will eschew fear-mongering, tear down fences, and work out dignified ways to cope more effectively and more humanely with the inevitable movement of workers across the borders. I want these things as a feminist.
Hillary's record, like those of the other candidates, falls far short of what's needed to achieve my desires. She has voted against expanding funds for subsidized day care or allowing poor women to use welfare funds to get an education; her health care program is still too closely tied to the insurance industry. Her foreign policy seems to look toward the more hawkish Democratic advisers of the '90s; she has equivocated about how rapidly she would withdraw American forces from the Middle East. But I have hopes that each of the three candidates will push the others into articulating the kind of vision that recalls the feminism I once knew. When and if Hillary gets there, I will be her most ardent supporter.
Posted on: Monday, January 28, 2008 - 14:44
SOURCE: Nation (2-4-08)
The controversy inspired by Hillary Clinton's remark crediting Lyndon Johnson with the civil rights movement's successes seems to have subsided. Contrary to much recent punditry, this contretemps does not prove that the Democratic primary has been reduced to a zero-sum game of identity politics. Rather, it reveals the complexity of bringing together the aspirations of different social groups within a single political movement--something Americans have experienced before.
Some commentators have already compared Barack Obama to Frederick Douglass, the former slave and crusader for emancipation who insisted that the post-Civil War years constituted the "Negro's hour" and that the struggle for the rights of the newly freed slaves took precedence over gaining the vote for women. In this scenario, Hillary Clinton is a latter-day Elizabeth Cady Stanton, who broke with her male allies when they called on women to subordinate their claims. But like many historical analogies, this one distorts as much as it reveals.
American feminism was born of the abolitionist movement, with its powerful insistence on universal equality. Before the Civil War, abolitionists and feminists, male and female, worked together for an end to slavery and a new definition of citizenship in which rights would not be limited by race or gender. During the war feminists put aside the campaign for women's rights to join in the struggle to save the Union and free the slaves. But they saw Reconstruction as a golden opportunity to claim for women their own emancipation.
In the war's aftermath, Congress rewrote the Constitution to guarantee equality before the law for blacks and the right to vote for black men. The Fourteenth Amendment, ratified in 1868, decreed that all people born in the United States were citizens who must enjoy equal protection of the law. But for the first time, the amendment introduced the word "male" into the Constitution, in a convoluted section related to voting. It punished states that failed to enfranchise black men with a loss of some of their seats in Congress. There was no penalty when women were denied suffrage. The Reconstruction Act and the Fifteenth Amendment barred states from denying the right to vote on the basis of race but left them free to do so because of gender.
These measures launched the era known as Radical Reconstruction, the first experiment in interracial democracy (for men) in our history. For the first time, large numbers of African-Americans voted and held office. Mississippi elected two black men to the Senate. (In the century and a half since, only three have followed, including Obama.)
But feminists like Stanton saw abolitionist support for these laws and amendments as a betrayal of the movement's long-standing commitment to full equality. A bitter controversy ensued, which resulted in Stanton and her supporters cutting their ties with their allies and forming an independent national organization to promote women's suffrage. They now felt free to appeal to racial and ethnic prejudices, arguing on occasion that native-born white women deserved the vote more than nonwhites and immigrants.
This episode has come down to us as the feminist-abolitionist split. But the story is more complicated. What actually happened was a split within the feminist movement. Nearly every black feminist supported the Fourteenth and Fifteenth amendments. So did many white women. Supporters believed these measures were necessary to protect all African-Americans from oppression in the aftermath of slavery. They saw the enfranchisement of black men as a step toward universal suffrage, not a retreat from it. Women like Abby Kelley, one of the era's greatest feminists, formed their own women's suffrage group, still linked to the abolitionist tradition. Not until the 1890s were the rival organizations reconciled.
The point is not that one position was right and one wrong--either in 1868 or 2008. During Reconstruction, both sides offered cogent arguments. One thing we can learn from their experience is that debating who is more oppressed is a fool's game. Advocates of the rights of African-Americans and women achieve more working together than fighting among themselves.
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Monday, January 28, 2008 - 14:23
SOURCE: NYT (1-27-08)
Senator Hillary Clinton has based her campaign on experience — 35 years of it by her count. That must include her eight years in the White House.
Some may debate whether those years count as executive experience. But there can be no doubt that her husband had the presidential experience, fully. He has shown during his wife’s campaign that he is a person of initiative and energy. Does anyone expect him not to use his experience in an energetic way if he re-enters the White House as the first spouse?
Mrs. Clinton claims that her time in that role was an active one. He can hardly be expected to show less involvement when he returns to the scene of his time in power as the resident expert. He is not the kind to be a potted plant in the White House.
Which raises an important matter. Do we really want a plural presidency?
This is not a new question. It was intensely debated in the convention that formulated our Constitution. The Virginia Plan for the new document, submitted by Edmund Randolph, and the New Jersey Plan, submitted by William Paterson, both left open the number of officers who would hold the executive power.
Some (like Hugh Williamson of North Carolina) argued for a three-person executive, each member coming from a different region of the country. More delegates (like George Mason of Virginia) argued for a multiple-member executive council.
The objection to giving executive power to a single person came from the framers’ experience with the British monarchy and the royal governors of the colonies. They did not want another monarch.
But as the debate went forward a consensus formed that republican rule would check the single initiative of a president. In fact, accountability to the legislature demanded that responsibility be lodged where it could be called to account. A plural presidency would leave it uncertain whom to check. How, for instance, would Congress decide which part of the executive should be impeached in case of high crimes and misdemeanors? One member of the plural executive could hide behind the other members....
Posted on: Sunday, January 27, 2008 - 11:56
SOURCE: National Post (1-10-08)
Today's religious map of the Middle East traces to the unification of the Arabian tribes under the banner of Islam in the 7th century, and their subsequent conquest of much of the known world. Muhammad's genius was in finding a way to unite the myriad fissiparous, feuding Bedouin tribes of northern Arabia into a cohesive polity. Just as he had provided a constitution of rules under which the people of Medina could live together, so he provided a constitution for all Arabs, but this one had the imprimatur not just of Muhammad, but of God. Submission -- Islam -- to God and His rules, spelled out in the Koran, bound Arabian tribesmen into the community of believers, the umma.
Building on the tribal system of "balanced opposition" -- the subject of yesterday's essay -- Muhammad was able to frame an inclusive structure within which the tribes had a common, God-given identity as Muslims. But unification was only possible by creating a tribalized enemy against which Muslims could make common cause. This Muhammad did by opposing Muslims against infidels; and the dar al-Islam, the land of Islam and peace, against the dar al-harb, the land of infidels and conflict. Through the precepts of Islam, traditional Bedouin raiding was sanctified as an act of religious duty.
With every successful battle against local unbelievers, especially after the critical early battle against the Meccans, more Bedouin joined the umma. Once united, the Bedouin warriors of the umma turned outward, teaching the world the meaning of jihad, holy war. The rest, as they say, is history.
The Arabs, in lightning thrusts, challenged and beat the Byzantines to the north and the Persians to the east, both weakened by their continuous wars with one another, thus imposing their control over the Christian majority in the Levant and the Zoroastrian majority in Persia, and therefore over the entire Middle East. These stunning successes were rapidly followed by conquests of Christian and Jewish populations in Egypt, Libya and North Africa's Maghreb (Arabic for "the West"), and, in the east, central Asia and the Hindu population of northern India. Not content with these triumphs, Arab armies invaded and subdued much of Christian Spain and Portugal, and all of Sicily. Since the Roman Empire, the world had not seen such power and reach. All fell before the Saracen blades.
Most accounts of Islamic history, even Lindholm's esteemed The Islamic Middle East, glide over these conquests, as if they were friendly takeovers. But the truth was very different.
The evidence is overwhelming that vast numbers of infidel male warriors and civilians were slain, and that most of those spared, particularly the women and children, were enslaved for domestic and sexual servitude. While men who willingly converted were spared, their wives and children were taken as slaves. In conquered regions, children were regularly taken from parents, while on the borders -- especially in Central and Eastern Europe, Central Asia and Africa south of the Sahara -- raiding for slaves was normal practice. Of the male slaves, a substantial number were made eunuchs by the removal of sex organs, in order to serve in harems. This account of the Arab campaign in northern India illustrates the usual procedures:
"During the Arab invasion of Sindh (712 CE), Muhammad bin Qasim first attacked Debal…It was garrisoned by 4,000 Kshatriya soldiers and served by 3,000 Brahmans. All males of the age of 17 and upwards were put to the sword and their women and children were enslaved.
"[Seven hundred] beautiful females, who were under the protection of Budh (that is, had taken shelter in the temple), were all captured with their valuable ornaments, and clothes adorned with jewels." Muhammad dispatched one-fifth of the legal spoil to Hajjaj, which included 75 damsels, the other four-fifths were distributed among soldiers."...
Posted on: Saturday, January 26, 2008 - 21:45
SOURCE: Cato Unbound (1-14-08)
Any serious discussion of the future of marriage requires a clear understanding of how marriage evolved over the ages, along with the causes of its most recent transformations. Many people who hope to “re-institutionalize” marriage misunderstand the reasons that marriage was once more stable and played a stronger role in regulating social life.
For most of history, marriage was more about getting the right in-laws than picking the right partner to love and live with. In the small-scale, band-level societies of our distant ancestors, marriage alliances turned strangers into relatives, creating interdependencies among groups that might otherwise meet as enemies. But as large wealth and status differentials developed in the ancient world, marriage became more exclusionary and coercive. People maneuvered to orchestrate advantageous marriage connections with some families and avoid incurring obligations to others. Marriage became the main way that the upper classes consolidated wealth, forged military coalitions, finalized peace treaties, and bolstered claims to social status or political authority. Getting “well-connected” in-laws was a preoccupation of the middle classes as well, while the dowry a man received at marriage was often the biggest economic stake he would acquire before his parents died. Peasants, farmers, and craftsmen acquired new workers for the family enterprise and forged cooperative bonds with neighbors through their marriages.
Because of marriage’s vital economic and political functions, few societies in history believed that individuals should freely choose their own marriage partners, especially on such fragile grounds as love. Indeed, for millennia, marriage was much more about regulating economic, political, and gender hierarchies than nourishing the well-being of adults and their children. Until the late 18th century, parents took for granted their right to arrange their children’s marriages and even, in many regions, to dissolve a marriage made without their permission. In Anglo-American law, a child born outside an approved marriage was a “filius nullius” - a child of no one, entitled to nothing. In fact, through most of history, the precondition for maintaining a strong institution of marriage was the existence of an equally strong institution of illegitimacy, which denied such children any claim on their families.
Even legally-recognized wives and children received few of the protections we now associate with marriage. Until the late 19th century, European and American husbands had the right to physically restrain, imprison, or “punish” their wives and children. Marriage gave husbands sole ownership over all property a wife brought to the marriage and any income she earned afterward. Parents put their children to work to accumulate resources for their own old age, enforcing obedience by periodic beatings.
Many people managed to develop loving families over the ages despite these laws and customs, but until very recently, this was not the main point of entering or staying in a union. It was just 250 years ago, when the Enlightenment challenged the right of the older generation and the state to dictate to the young, that free choice based on love and compatibility emerged as the social ideal for mate selection. Only in the early 19th century did the success of a marriage begin to be defined by how well it cared for its members, both adults and children.
These new marital ideals appalled many social conservatives of the day. “How will we get the right people to marry each other, if they can refuse on such trivial grounds as lack of love?” they asked. “Just as important, how will we prevent the wrong ones, such as paupers and servants, from marrying?” What would compel people to stay in marriages where love had died? What would prevent wives from challenging their husbands’ authority?
They were right to worry. In the late 18th century, new ideas about the “pursuit of happiness” led many countries to make divorce more accessible, and some even repealed the penalties for homosexual love. The French revolutionaries abolished the legal category of illegitimacy, according a “love child” equal rights with a “legal” one. In the mid-19th century, women challenged husbands’ sole ownership of wives’ property, earnings, and behavior. Moralists predicted that such female economic independence would “destroy domestic tranquility,” producing “infidelity in the marriage bed, a high rate of divorce, and increased female criminality.” And in some regards, they seemed correct. Divorce rates rose so steadily that in 1891 a Cornell University professor predicted, with stunning accuracy, that if divorce continued rising at its current rate, more marriages would end in divorce than death by the 1980s.
But until the late 1960s, most of the destabilizing aspects of the love revolution were held in check by several forces that prevented people from building successful lives outside marriage: the continued legal subordination of women to men; the ability of local elites to penalize employees and other community members for then-stigmatized behaviors such as remaining single, cohabiting, or getting a divorce; the unreliability of birth control, combined with the harsh treatment of illegitimate children; and above all, the dependence of women upon men’s wage earning.
In the 1970s, however, these constraints were swept away or seriously eroded. The result has been to create a paradox with which many Americans have yet to come to terms. Today, when a marriage works, it delivers more benefits to its members — adults and children — than ever before. A good marriage is fairer and more fulfilling for both men and women than couples of the past could ever have imagined. Domestic violence and sexual coercion have fallen sharply. More couples share decision-making and housework than ever before. Parents devote unprecedented time and resources to their children. And men in stable marriages are far less likely to cheat on their wives than in the past.
But the same things that have made so many modern marriages more intimate, fair, and protective have simultaneously made marriage itself more optional and more contingent on successful negotiation. They have also made marriage seem less bearable when it doesn’t live up to its potential. The forces that have strengthened marriage as a personal relationship between freely-consenting adults have weakened marriage as a regulatory social institution.
In the 1970s and 1980s, the collapse of the conditions that had forced most people to get and stay married led to dramatic - and often traumatic - upheavals in marriage. This was exacerbated by an economic climate that made the 1950s ideal of the male breadwinner unattainable for many families. Divorce rates soared. Unwed teen motherhood shot up. Since then, some of these destabilizing trends have leveled off or receded. The divorce rate has fallen, especially for college-educated couples, over the past 20 years. When divorce does occur, more couples work to resolve it amicably, and fewer men walk away from contact with their children. Although there was a small uptick in teen births last year, they are still almost 30 percent lower than in 1991.
Still, there is no chance that we can restore marriage to its former supremacy in coordinating social and interpersonal relationships. Even as the divorce rate has dropped, the incidence of cohabitation, delayed marriage and non-marriage has risen steadily. With half of all Americans aged 25-29 unmarried, marriage no longer organizes the transition into regular sexual activity or long-term partnerships the way it used to. Although teen births are lower than a decade ago, births to unwed mothers aged 25 and older continue to climb. Almost 40 percent of America’s children are born to unmarried parents. And gay and lesbian families are permanently out of the closet....
Posted on: Saturday, January 26, 2008 - 21:43
SOURCE: China Beat (1-25-08)
Last year the International Olympic Committee (IOC) invited me to write an essay on the Beijing Olympics, and “The Beijing Effect” was published in the July-September 2006 issue of The Olympic Review. At the end of that essay I wrote, “China hopes that it will change the Olympic Games, but is the West really open to that possibility? Are we truly ready for ‘One World, One Dream’?” Since that article appeared in the official magazine of the IOC, it is not implausible that Beijing decided to answer my question. On August 8, 2007, Beijing marked the one-year countdown to the Games with the premiere of what became a hit song and a slogan that one can see everywhere on TV advertisements and billboards: “We Are Ready,” 我们准备好了. Indeed, Beijing’s preparations exceed those for all previous Olympic Games in their scale and financial investment. Beijing is ready for us. But are we ready for Beijing?
I don’t think the outside world is ready to understand what it will see in August 2008. So I am doing my small part to get it there. My participation on The China Beat is one part of my effort. If you want to know more about me and my experience of China, take a look at the interview with me that was just posted by my fellow Fulbrighter in Beijing, Dan Beekman, who is “Blogging Beijing” on the homepage of the Seattle Times.
As one of the world’s few academic experts on Chinese sports, I am getting a lot of requests from journalists these days. And then there are my opinionated and sometimes politically misguided family members in the U.S. (you know who you are), and my academic colleagues (thanks, Allen Guttmann). Since there are a few basic questions that get repeated over and over, I have started compiling my e-mail responses into Beijing Olympic FAQs. Below I give my answer to FAQ#1: Is it possible to keep politics out of the Beijing 2008 Olympics?
FAQ#1: Is it possible to keep politics out of the Beijing 2008 Olympics?
I get a little impatient with this naive question: "Is it possible to keep politics out of the Olympics?" The Olympics have been intimately tied to national politics at least since the 1906 Intermediate Olympic Games in Athens. These were the first Olympic Games at which athletes marched into the stadium behind national flags and the three flags of the medalists were raised in the awards ceremony. To protest that Irish athletes had not been allowed to compete as a separate nation, the silver medalist in the triple jump, Peter O'Connor, climbed up the flagpole to wave the Irish flag in place of the British Union Jack that had been raised. [The first Olympics in Athens in 1896 were so well-supported by the Greeks that the IOC approved a Greek request to hold intermediate Olympic Games in the middle of the Olympiad. The 1906 Intermediate Games were the first and last because of political and economic instability in Greece.]
The reviver of the modern Olympics, Pierre de Coubertin, was a rather sophisticated thinker about the relationship between sports and politics, and always understood that politics were an integral part of the Olympic Movement. IOC presidents during the Cold War (Sigfrid Edström, Avery Brundage, and Lord Killanin) often tried to forbid people from "mixing sport and politics," but that was largely part of their effort to keep the political conflicts over which they had no control from disrupting the Olympic Games. It was never official IOC policy. And it is not today. The IOC’s only official stance on politics is contained in Fundamental Principle #5 of the Olympic Charter, which states, “Any form of discrimination with regard to a country or a person on grounds of race, religion, politics, gender or otherwise is incompatible with belonging to the Olympic Movement.”
The Olympic Games have often functioned as an alternative to mainstream diplomatic channels. The IOC is a non-governmental organization, which therefore is able to function in the cracks between governments. And it is important for it to maintain that independent intermediate position, so its presidents and other leading thinkers have correctly understood that they must maintain political independence from national governments to the degree possible. This complex political reality was captured in sayings like "keep the politics out of sport," but in order to understand what this really means, you have to delve a little bit deeper and understand the global structure that underlies Olympic sport. I will get into that in my answer to FAQ#2.
So the answer is, no, it is not possible to keep politics out of the Olympics, and in fact their political role is what makes them important in today's world and in the quest for world peace. This is as true in 2008 as it was over 100 years ago.
Stay tuned for FAQ#2: Will calls for a boycott of the 2008 Olympic Games be successful?
Posted on: Saturday, January 26, 2008 - 21:24
SOURCE: Informed Comment (Blog) (1-25-08)
About 3:00 am on Wednesday morning, Jan. 23, well-coordinated explosions demolished the iron wall built by Israel to seal the southern border between the Gaza Strip and Egypt (the Philadelphi axis). Tens of thousands of Palestinians streamed across the border and entered the Egyptian side of the town of Rafah, which had been bisected by the wall, in search of food, gasoline, and other basic commodities which have been in short supply for many months in Gaza. The first wave of Palestinians to cross consisted of hundreds of women, who were met with water cannons and beatings by Egyptian security forces.
The wall was the starkest expression of the international boycott of Hamas imposed by the United States, Israel, and the European Union after Hamas won a majority of the seats in the Palestinian Legislative Council elections of January 2006 and formed a government the following March. Hamas has been in sole control of the Gaza Strip since it executed a coup d'état against Palestinian President Mahmoud Abbas in June 2007. Since then, Israel has tightened the siege of Gaza, which had been in effect since June 2006.
In response, Hamas and Palestinian Islamic Jihad militants have fired thousands of Qassam missiles at the town of Sderot and other Israeli population centers near the Gaza Strip. According to the 2007 annual report of B'Tselem, the Israeli human rights organization, Hamas and Jihad killed twenty-four Israeli civilians and thirteen Israeli military personnel in the West Bank and the Gaza Strip during 2006 and 2007.
In retaliation, Israel escalated the pace of its targeted assassinations of Hamas and Jihad militants, killing hundreds of civilians in the process. Based on B'Tselem's 2007 annual report, a Ha-Aretz investigation (Jan. 14, 2008) concluded that Israeli forces killed 816 Palestinians in the Gaza Strip during 2006 and 2007; at least 360 of them were civilians not affiliated with any armed organizations; 152 of the casualties were under age 18, and 48 were under the age of 14.
Despite the siege, Israel continued to provide electricity and water to the Gaza Strip, allowing people to live on the edge of survival, hoping that the economic pressure would bring down the Hamas government. Half the population now depends on charity handouts from the UN refugee relief organization and other humanitarian NGOs. Four days before the wall came crashing down, Israel sharply cut back fuel and water supplies, imposing a harsh collective punishment on the entire population of 1.5 million.
According to Ha-Aretz columnist Amira Hass (Jan. 24, 2008), for several months Hamas leaders had been discussing measures to end Gaza's torment, described by Rela Mazali, an Israeli feminist peace activist with the New Profile organization and an editor of Jewish Peace News, as "an abomination." Apparently, Hamas decided that four days of hermetic closure, following months of siege, created conditions in which Egypt and the international community would be willing to accept bringing down the wall. Hamas did not take official responsibility for blowing up the wall, but praised the action.
The Egyptian press reported that, several days before the wall was blown up, the General Guide of the Muslim Brothers, the largest opposition force in Egypt, spoke by telephone to Khaled Mash'al, the head of the Political Bureau of Hamas who resides in Damascus. Hamas emerged from the Palestinian branch of the Muslim Brothers; and there is a high likelihood that the actions of the two organizations were coordinated. Following this consultation, the Brothers began to organize demonstrations throughout Egypt beginning on Friday, Jan. 18. The number of its supporters in the street gradually increased, culminating on Wednesday, Jan. 23. That morning, thousands of Egyptian security forces surrounded Tahrir Square in downtown Cairo and arrested hundreds (according to some reports thousands) of people who were attempting to demonstrate in solidarity with the people of Gaza. The demonstration was supported by both the Muslim Brothers and secular nationalists.
Meanwhile, at Rafah, Egyptian security forces initially tried to stop the Palestinians from streaming across the border. But as the numbers swelled to tens of thousands, the government had no choice but to acquiesce. President Hosni Mubarak told journalists that he had instructed the security forces to "let them come in to eat and buy food" and return "as long as they are not carrying weapons."
What are the implications of these developments?
It appears that the Annapolis summit and the sham "peace process" it was supposed to have reinvigorated are dead -- killed by tens of thousands of unarmed Palestinians crossing the border into Egypt to meet their basic human needs. Shortly before President George W. Bush's visit to the Middle East, Israel began an expanded campaign of pressure on the Gaza Strip, including an escalation in targeted assassinations. Hamas has sent several signals that it was prepared for an informal ceasefire with Israel. But the political perspective articulated at Annapolis and its aftermath requires that Palestinian Authority President Mahmoud Abbas cooperate with Israel in crushing Hamas rather than try to restore Palestinian national unity. Egypt's task in this drama is to stand silently by.
This is an impossible task and cannot in any way contribute to peace. Even if Mahmoud Abbas were to come to terms and sign an agreement with Israel, it would have no credibility and would be very short-lived without some degree of approval and participation from Hamas. A government of national unity that represents all the factions of the Palestinian people is the only entity capable of signing a viable peace agreement with Israel.
The Israeli government led by Prime Minister Ehud Olmert opposes the kind of agreement that a Palestinian national unity government would demand, as has every previous government of Israel. Such an agreement would require recognition of Palestinian national rights rather than paternalistic "concessions" granted by a magnanimous but ultimately all-powerful Israel.
The limited capacity of the Egyptian government to acquiesce to this program has been exposed. The Mubarak regime would like very much to see Hamas crushed, since it is an ally of the Muslim Brothers, its most substantial domestic opposition force. But the Palestinian cause is too popular and emotional an issue in Egypt for Mubarak to appear to be assisting Israel in starving the people of Gaza. Moreover, some of the demonstrations in solidarity with Gaza also raised slogans against the drastic rise in the price of food in recent months and against Hosni Mubarak himself. Opposition demonstrations linking the Palestine cause with domestic economic issues and autocracy have the potential to threaten a regime whose legitimacy is already minimal.
Palestine, Israel, and Egypt after the fall of the Gaza wall are more unstable than before. It is desirable, but alas unlikely, that this instability will bring the leaderships to their senses and impel them to negotiate a just peace for the benefit of all. But it is more likely that Olmert, Abbas, and Mubarak -- all weak and discredited leaders -- will seek to hold onto power by clinging to the United States, which has a long record of opposing Palestinian-Israeli peace. The people of the Gaza Strip have taken their survival into their own hands and have shown that the power of ordinary people is more likely to shape the future than polished diplomatic formulas.
Posted on: Friday, January 25, 2008 - 15:39
SOURCE: Christian Science Monitor (1-24-08)
Accra, Ghana | Last Sunday evening, in the final minutes of play, Ghana's Sulley Muntari scored a dramatic goal to defeat visiting Guinea. Seven hours later, at 3 a.m., we could still hear the horns and revelers on the street outside of our house.
And believe it or not, the party has just begun.
Mr. Muntari's heroics came in the opening match of the three-week Africa Cup of Nations soccer tournament, which Ghana is hosting. The entire country has been gripped by "football fever," as the newspapers call it. Taxis and buses all fly the Ghanaian flag, honking incessantly as they pass. Wherever you look – on shirts, neckties, scarves, hats, and even earrings – you see the bright national colors of red, green, and yellow.
To an American, the most refreshing part of this celebration is the apparent absence of national jingoism or chauvinism. The Ghanaians are intensely patriotic, of course, but you don't hear people talking about how much better they are than other countries.
Quite the contrary: the airwaves and papers are filled with demands for the nation to better itself, especially with the tournament placing Ghana in a world spotlight. Please don't litter, a television news reporter told us last week; please use proper food hygiene; please conserve water; please drive more carefully.
And please, most of all, pray for the soccer team. In Ghana, a deeply devout and increasingly Christian country, religious fervor merges easily with football fever. "Sometimes one is tempted to believe that God is a Ghanaian," a local newspaper exulted, the morning after Ghana's first-round victory. "For the very spiritual, it was the hidden hand of God at work or simply the miraculous."
But even here, Ghanaians exude gratitude rather than arrogance. They all want God to shine upon the team, naturally, but there's no suggestion that He ought to do so. I even heard one television reporter urge viewers to pray for all of the African nations, lest anyone get left out.
In public interviews, the players steer away from comments that might sound smug or conceited. We need to work hard, they say; we need to stay focused; and, with God's grace, we need to remain healthy. But I haven't heard any athlete promise to score a goal or to shut down an opponent, which is the type of thing you hear in the United States all the time.
And you'll probably hear even more of it during the next two weeks, as the hype for Super Bowl XLII intensifies. Football players often predict victory, waving index fingers in the air or thumping their chests. And on game day, as in Ghana, I suspect that many will kneel in prayer and ask God for the same.
The difference is that Americans actually believe they deserve it. The US was founded by people who imagined that God had assigned them a special destiny; they would be a city on the hill, lighting the way for everybody else.
At its best, this ideal has inspired Americans to fight for the universal ideals in our founding documents – liberty and justice for all. But at its worst, it has tricked us into thinking that we're actually better than the rest of the universe.
In 1945, near the end of the world's most destructive war, George Orwell drew a now-famous distinction between patriotism and nationalism.
Patriotism, Orwell wrote, is simply a "devotion to a particular place and a particular way of life." It is inward-looking and generally beneficial, insofar as it connects humans through shared communities. But nationalism is aggressive and outward-looking, always aiming to enhance the power of one community over another. Patriots and nationalists both love their countries, but only nationalists insist that their own country is – or should be – the superior one.
That's what Americans are wont to believe, given the burden of our history. But perhaps we could take a few cues from the Ghanaians, who manage to celebrate their country without diminishing others. As the Africa Cup of Nations tournament continues, I'll root hard for Ghana. And no matter who wins, I'll be grateful for the lesson.
Posted on: Thursday, January 24, 2008 - 22:18
SOURCE: TomDispatch.com (1-22-08)
The military adventurers of the Bush administration have much in common with the corporate leaders of the defunct energy company Enron. Both groups of men thought that they were the "smartest guys in the room," the title of Alex Gibney's prize-winning film on what went wrong at Enron. The neoconservatives in the White House and the Pentagon outsmarted themselves. They failed even to address the problem of how to finance their schemes of imperialist wars and global domination.
As a result, going into 2008, the United States finds itself in the anomalous position of being unable to pay for its own elevated living standards or its wasteful, overly large military establishment. Its government no longer even attempts to reduce the ruinous expenses of maintaining huge standing armies, replacing the equipment that seven years of wars have destroyed or worn out, or preparing for a war in outer space against unknown adversaries. Instead, the Bush administration puts off these costs for future generations to pay -- or repudiate. This utter fiscal irresponsibility has been disguised through many manipulative financial schemes (such as causing poorer countries to lend us unprecedented sums of money), but the time of reckoning is fast approaching.
There are three broad aspects to our debt crisis. First, in the current fiscal year (2008) we are spending insane amounts of money on "defense" projects that bear no relationship to the national security of the United States. Simultaneously, we are keeping the income tax burdens on the richest segments of the American population at strikingly low levels.
Second, we continue to believe that we can compensate for the accelerating erosion of our manufacturing base and our loss of jobs to foreign countries through massive military expenditures -- so-called "military Keynesianism," which I discuss in detail in my book Nemesis: The Last Days of the American Republic. By military Keynesianism, I mean the mistaken belief that public policies focused on frequent wars, huge expenditures on weapons and munitions, and large standing armies can indefinitely sustain a wealthy capitalist economy. The opposite is actually true.
Third, in our devotion to militarism (despite our limited resources), we are failing to invest in our social infrastructure and other requirements for the long-term health of our country. These are what economists call "opportunity costs," things not done because we spent our money on something else. Our public education system has deteriorated alarmingly. We have failed to provide health care to all our citizens and neglected our responsibilities as the world's number one polluter. Most important, we have lost our competitiveness as a manufacturer for civilian needs -- an infinitely more efficient use of scarce resources than arms manufacturing. Let me discuss each of these.
The Current Fiscal Disaster
It is virtually impossible to overstate the profligacy of what our government spends on the military. The Department of Defense's planned expenditures for fiscal year 2008 are larger than all other nations' military budgets combined. The supplementary budget to pay for the current wars in Iraq and Afghanistan, not part of the official defense budget, is itself larger than the combined military budgets of Russia and China. Defense-related spending for fiscal 2008 will exceed $1 trillion for the first time in history. The United States has become the largest single salesman of arms and munitions to other nations on Earth. Leaving out of account President Bush's two on-going wars, defense spending has doubled since the mid-1990s. The defense budget for fiscal 2008 is the largest since World War II.
Before we try to break down and analyze this gargantuan sum, there is one important caveat. Figures on defense spending are notoriously unreliable. The numbers released by the Congressional Research Service and the Congressional Budget Office do not agree with each other. Robert Higgs, senior fellow for political economy at the Independent Institute, says: "A well-founded rule of thumb is to take the Pentagon's (always well publicized) basic budget total and double it." Even a cursory reading of newspaper articles about the Department of Defense will turn up major differences in statistics about its expenses. Some 30-40% of the defense budget is "black," meaning that these sections contain hidden expenditures for classified projects. There is no possible way to know what they include or whether their total amounts are accurate.
There are many reasons for this budgetary sleight-of-hand -- including a desire for secrecy on the part of the president, the secretary of defense, and the military-industrial complex -- but the chief one is that members of Congress, who profit enormously from defense jobs and pork-barrel projects in their districts, have a political interest in supporting the Department of Defense. In 1996, in an attempt to bring accounting standards within the executive branch somewhat closer to those of the civilian economy, Congress passed the Federal Financial Management Improvement Act. It required all federal agencies to hire outside auditors to review their books and release the results to the public. Neither the Department of Defense nor the Department of Homeland Security has ever complied. Congress has complained but has not penalized either department for ignoring the law. The result is that all numbers released by the Pentagon should be regarded as suspect.
In discussing the fiscal 2008 defense budget, as released to the press on February 7, 2007, I have been guided by two experienced and reliable analysts: William D. Hartung of the New America Foundation's Arms and Security Initiative and Fred Kaplan, defense correspondent for Slate. They agree that the Department of Defense requested $481.4 billion for salaries, operations (except in Iraq and Afghanistan), and equipment. They also agree on a figure of $141.7 billion for the "supplemental" budget to fight the "global war on terrorism" -- that is, the two on-going wars that the general public may think are actually covered by the basic Pentagon budget. The Department of Defense also asked for an extra $93.4 billion to pay for hitherto unmentioned war costs in the remainder of 2007 and, most creatively, an additional "allowance" (a new term in defense budget documents) of $50 billion to be charged to fiscal year 2009. This comes to a total spending request by the Department of Defense of $766.5 billion.
But there is much more. In an attempt to disguise the true size of the American military empire, the government has long hidden major military-related expenditures in departments other than Defense. For example, $23.4 billion for the Department of Energy goes toward developing and maintaining nuclear warheads; and $25.3 billion in the Department of State budget is spent on foreign military assistance (primarily for Israel, Saudi Arabia, Bahrain, Kuwait, Oman, Qatar, the United Arab Emirates, Egypt, and Pakistan). Another $1.03 billion outside the official Department of Defense budget is now needed for recruitment and reenlistment incentives for the overstretched U.S. military itself, up from a mere $174 million in 2003, the year the war in Iraq began. The Department of Veterans Affairs currently gets at least $75.7 billion, 50% of which goes for the long-term care of the grievously injured among the at least 28,870 soldiers so far wounded in Iraq and another 1,708 in Afghanistan. The amount is universally derided as inadequate. Another $46.4 billion goes to the Department of Homeland Security.
Missing as well from this compilation is $1.9 billion to the Department of Justice for the paramilitary activities of the FBI; $38.5 billion to the Department of the Treasury for the Military Retirement Fund; $7.6 billion for the military-related activities of the National Aeronautics and Space Administration; and well over $200 billion in interest for past debt-financed defense outlays. This brings U.S. spending for its military establishment during the current fiscal year (2008), conservatively calculated, to at least $1.1 trillion.
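As a quick check on the arithmetic in the two preceding paragraphs, the short Python sketch below simply re-adds the component figures quoted above. This is an editor's verification aid, not part of the original article; the dictionary labels are shorthand rather than official budget line items, and the interest figure is entered at its stated floor ("well over $200 billion").

    # Re-adding the FY2008 defense-related figures quoted above (billions of dollars).
    pentagon_request = {
        "base budget (salaries, operations, equipment)": 481.4,
        "supplemental for the 'global war on terrorism'": 141.7,
        "extra 2007 war costs": 93.4,
        "'allowance' charged to FY2009": 50.0,
    }
    other_departments = {
        "Energy (nuclear warheads)": 23.4,
        "State (foreign military assistance)": 25.3,
        "recruitment and reenlistment incentives": 1.03,
        "Veterans Affairs": 75.7,
        "Homeland Security": 46.4,
        "Justice (FBI paramilitary activities)": 1.9,
        "Treasury (Military Retirement Fund)": 38.5,
        "NASA (military-related activities)": 7.6,
        "interest on past debt-financed defense outlays": 200.0,  # stated as "well over"
    }
    dod_request = sum(pentagon_request.values())
    grand_total = dod_request + sum(other_departments.values())
    print(f"DoD request: ${dod_request:,.1f} billion")            # 766.5
    print(f"Defense-related total: ${grand_total:,.2f} billion")  # about 1,186

The Pentagon items alone reproduce the $766.5 billion request, and the full tally comes to roughly $1.19 trillion, consistent with the article's conservative "at least $1.1 trillion."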
Such expenditures are not only morally obscene, they are fiscally unsustainable. Many neoconservatives and poorly informed patriotic Americans believe that, even though our defense budget is huge, we can afford it because we are the richest country on Earth. Unfortunately, that statement is no longer true. The world's richest political entity, according to the CIA's "World Factbook," is the European Union. The EU's 2006 GDP (gross domestic product -- all goods and services produced domestically) was estimated to be slightly larger than that of the U.S. However, China's 2006 GDP was only slightly smaller than that of the U.S., and Japan was the world's fourth richest nation.
A more telling comparison that reveals just how much worse we're doing can be found among the "current accounts" of various nations. The current account measures the net trade surplus or deficit of a country plus cross-border payments of interest, royalties, dividends, capital gains, foreign aid, and other income. For example, in order for Japan to manufacture anything, it must import all required raw materials. Even after this incredible expense is met, it still has an $88 billion per year trade surplus with the United States and enjoys the world's second highest current account balance. (China is number one.) The United States, by contrast, is number 163 -- dead last on the list, worse than countries like Australia and the United Kingdom that also have large trade deficits. Its 2006 current account deficit was $811.5 billion; second worst was Spain at $106.4 billion. This is what is unsustainable.
It's not just that our tastes for foreign goods, including imported oil, vastly exceed our ability to pay for them. We are financing them through massive borrowing. On November 7, 2007, the U.S. Treasury announced that the national debt had breached $9 trillion for the first time ever. This was just five weeks after Congress raised the so-called debt ceiling to $9.815 trillion. If you begin in 1789, at the moment the Constitution became the supreme law of the land, the debt accumulated by the federal government did not top $1 trillion until 1981. When George Bush became president in January 2001, it stood at approximately $5.7 trillion. Since then, it has increased by about 58% (see the quick check below). This huge debt can be largely explained by our defense expenditures in comparison with the rest of the world.
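A similar one-line check (again an editor's sketch, not the author's) confirms the debt-growth percentage from the two endpoints the paragraph itself cites:

    # Federal debt growth over the period cited above, in trillions of dollars.
    debt_jan_2001 = 5.7   # approximate debt when George W. Bush took office
    debt_nov_2007 = 9.0   # Treasury announcement of November 7, 2007
    increase = (debt_nov_2007 - debt_jan_2001) / debt_jan_2001
    print(f"Increase since January 2001: {increase:.0%}")  # about 58%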
The world's top 10 military spenders and the approximate amounts each country currently budgets for its military establishment are:
1. United States (FY08 budget), $623 billion
2. China (2004), $65 billion
3. Russia, $50 billion
4. France (2005), $45 billion
5. United Kingdom, $42.8 billion
6. Japan (2007), $41.75 billion
7. Germany (2003), $35.1 billion
8. Italy (2003), $28.2 billion
9. South Korea (2003), $21.1 billion
10. India (2005 est.), $19 billion
World total military expenditures (2004 est.), $1,100 billion
World total (minus the United States), $500 billion
Our excessive military expenditures did not occur over just a few short years or simply because of the Bush administration's policies. They have been going on for a very long time in accordance with a superficially plausible ideology and have now become entrenched in our democratic political system, where they are starting to wreak havoc. This ideology I call "military Keynesianism" -- the determination to maintain a permanent war economy and to treat military output as an ordinary economic product, even though it makes no contribution to either production or consumption.
This ideology goes back to the first years of the Cold War. During the late 1940s, the U.S. was haunted by economic anxieties. The Great Depression of the 1930s had been overcome only by the war production boom of World War II. With peace and demobilization, there was a pervasive fear that the Depression would return. During 1949, alarmed by the Soviet Union's detonation of an atomic bomb, the looming communist victory in the Chinese civil war, a domestic recession, and the lowering of the Iron Curtain around the USSR's European satellites, the U.S. sought to draft basic strategy for the emerging Cold War. The result was the militaristic National Security Council Report 68 (NSC-68) drafted under the supervision of Paul Nitze, then head of the Policy Planning Staff in the State Department. Dated April 14, 1950, and signed by President Harry S. Truman on September 30, 1950, it laid out the basic public economic policies that the United States pursues to the present day.
In its conclusions, NSC-68 asserted: "One of the most significant lessons of our World War II experience was that the American economy, when it operates at a level approaching full efficiency, can provide enormous resources for purposes other than civilian consumption while simultaneously providing a high standard of living."
With this understanding, American strategists began to build up a massive munitions industry, both to counter the military might of the Soviet Union (which they consistently overstated) and also to maintain full employment as well as ward off a possible return of the Depression. The result was that, under Pentagon leadership, entire new industries were created to manufacture large aircraft, nuclear-powered submarines, nuclear warheads, intercontinental ballistic missiles, and surveillance and communications satellites. This led to what President Eisenhower warned against in his farewell address of January 17, 1961: "The conjunction of an immense military establishment and a large arms industry is new in the American experience" -- that is, the military-industrial complex.
By 1990, the value of the weapons, equipment, and factories devoted to the Department of Defense was 83% of the value of all plants and equipment in American manufacturing. From 1947 to 1990, the combined U.S. military budgets amounted to $8.7 trillion. Even though the Soviet Union no longer exists, U.S. reliance on military Keynesianism has, if anything, ratcheted up, thanks to the massive vested interests that have become entrenched around the military establishment. Over time, a commitment to both guns and butter has proven an unstable configuration. Military industries crowd out the civilian economy and lead to severe economic weaknesses. Devotion to military Keynesianism is, in fact, a form of slow economic suicide.
On May 1, 2007, the Center for Economic and Policy Research of Washington, D.C., released a study prepared by the global forecasting company Global Insight on the long-term economic impact of increased military spending. Guided by economist Dean Baker, this research showed that, after an initial demand stimulus, by about the sixth year the effect of increased military spending turns negative. Needless to say, the U.S. economy has had to cope with growing defense spending for more than 60 years. The study found that, after 10 years of higher defense spending, there would be 464,000 fewer jobs than in a baseline scenario that involved lower defense spending. As the report put it:
"It is often believed that wars and military spending increases are good for the economy. In fact, most economic models show that military spending diverts resources from productive uses, such as consumption and investment, and ultimately slows economic growth and reduces employment."
These are only some of the many deleterious effects of military Keynesianism.
Hollowing Out the American Economy
It was believed that the U.S. could afford both a massive military establishment and a high standard of living, and that it needed both to maintain full employment. But it did not work out that way. By the 1960s, it was becoming apparent that turning over the nation's largest manufacturing enterprises to the Department of Defense and producing goods without any investment or consumption value was starting to crowd out civilian economic activities. The historian Thomas E. Woods, Jr., observes that, during the 1950s and 1960s, between one-third and two-thirds of all American research talent was siphoned off into the military sector. It is, of course, impossible to know what innovations never appeared as a result of this diversion of resources and brainpower into the service of the military, but it was during the 1960s that we first began to notice Japan was outpacing us in the design and quality of a range of consumer goods, including household electronics and automobiles.
Nuclear weapons furnish a striking illustration of these anomalies. Between the 1940s and 1996, the United States spent at least $5.8 trillion on the development, testing, and construction of nuclear bombs. By 1967, the peak year of its nuclear stockpile, the United States possessed some 32,500 deliverable atomic and hydrogen bombs, none of which, thankfully, was ever used. They perfectly illustrate the Keynesian principle that the government can provide make-work jobs to keep people employed. Nuclear weapons were not just America's secret weapon, but also its secret economic weapon. As of 2006, we still had 9,960 of them. There is today no sane use for them, while the trillions spent on them could have been used to solve the problems of social security and health care, quality education and access to higher education for all, not to speak of the retention of highly skilled jobs within the American economy.
The pioneer in analyzing what has been lost as a result of military Keynesianism was the late Seymour Melman (1917-2004), a professor of industrial engineering and operations research at Columbia University. His 1970 book, Pentagon Capitalism: The Political Economy of War, was a prescient analysis of the unintended consequences of the American preoccupation with its armed forces and their weaponry since the onset of the Cold War. Melman wrote (pp. 2-3):
"From 1946 to 1969, the United States government spent over $1,000 billion on the military, more than half of this under the Kennedy and Johnson administrations -- the period during which the [Pentagon-dominated] state management was established as a formal institution. This sum of staggering size (try to visualize a billion of something) does not express the cost of the military establishment to the nation as a whole. The true cost is measured by what has been foregone, by the accumulated deterioration in many facets of life by the inability to alleviate human wretchedness of long duration."
In an important exegesis on Melman's relevance to the current American economic situation, Thomas Woods writes:
"According to the U.S. Department of Defense, during the four decades from 1947 through 1987 it used (in 1982 dollars) $7.62 trillion in capital resources. In 1985, the Department of Commerce estimated the value of the nation's plant and equipment, and infrastructure, at just over $7.29 trillion. In other words, the amount spent over that period could have doubled the American capital stock or modernized and replaced its existing stock."
The fact that we did not modernize or replace our capital assets is one of the main reasons why, by the turn of the twenty-first century, our manufacturing base had all but evaporated. Machine tools -- an industry on which Melman was an authority -- are a particularly important symptom. In November 1968, a five-year inventory disclosed (p. 186) "that 64 percent of the metalworking machine tools used in U.S. industry were ten years old or older. The age of this industrial equipment (drills, lathes, etc.) marks the United States' machine tool stock as the oldest among all major industrial nations, and it marks the continuation of a deterioration process that began with the end of the Second World War. This deterioration at the base of the industrial system certifies to the continuous debilitating and depleting effect that the military use of capital and research and development talent has had on American industry."
Nothing has been done in the period since 1968 to reverse these trends and it shows today in our massive imports of equipment -- from medical machines like proton accelerators for radiological therapy (made primarily in Belgium, Germany, and Japan) to cars and trucks.
Our short tenure as the world's "lone superpower" has come to an end. As Harvard economics professor Benjamin Friedman has written:
"Again and again it has always been the world's leading lending country that has been the premier country in terms of political influence, diplomatic influence, and cultural influence. It's no accident that we took over the role from the British at the same time that we took over… the job of being the world's leading lending country. Today we are no longer the world's leading lending country. In fact we are now the world's biggest debtor country, and we are continuing to wield influence on the basis of military prowess alone."
Some of the damage done can never be rectified. There are, however, some steps that this country urgently needs to take. These include reversing Bush's 2001 and 2003 tax cuts for the wealthy, beginning to liquidate our global empire of over 800 military bases, cutting from the defense budget all projects that bear no relationship to the national security of the United States, and ceasing to use the defense budget as a Keynesian jobs program. If we do these things we have a chance of squeaking by. If we don't, we face probable national insolvency and a long depression.
[Note: For those interested, click here to view a clip from a new film, "Chalmers Johnson on American Hegemony," in Cinema Libre Studios' Speaking Freely series, in which he discusses "military Keynesianism" and imperial bankruptcy. For sources on global military spending, please see: (1) Global Security Organization, "World Wide Military Expenditures" as well as Glenn Greenwald, "The bipartisan consensus on U.S. military spending"; (2) Stockholm International Peace Research Institute, "Report: China biggest Asian military spender."]
Copyright 2008 Chalmers Johnson
This article first appeared on www.tomdispatch.com, a weblog of the Nation Institute, which offers a steady flow of alternate sources, news and opinion from Tom Engelhardt, a longtime editor in publishing, the author of The End of Victory Culture, and a fellow of the Nation Institute.
Posted on: Thursday, January 24, 2008 - 21:59
SOURCE: Guardian (9-17-04)
Andrew Roberts:
... How predictable and absurd has been the knee-jerk outrage expressed over Ferry's schoolboyish prank. Once the prank was linked to the rather less stylish Fathers4Justice stunt at Buckingham Palace, the media, police and politicians vied with one another to shed any sense of proportion in their analysis of what is really going on. "The most serious attack on the Houses of Parliament in living memory," intoned Sky TV, overlooking the Luftwaffe's direct hit on the Commons chamber in May 1941. As for charging the young toffs with burglary and "uttering a forged instrument" - whatever that may be - one might as well charge Bertie Wooster with theft for pinching a policeman's helmet after dinner at the Drones Club on Boat Race Night. Yet Peter Hain has thundered about the house's "antiquated" security arrangements and shadow home secretary David Davis has portentously informed the nation, via the Today programme, that: "What we have witnessed is something which puts a large number of people at risk, not just in the House of Commons."
I might as well declare my interest right away; I was sacked from my minor public school 20 years ago after a series of pranks so juvenile as to defy satire. Statue painting, chapel roof climbing, rearranging the furniture in the quad, you name it: I thought it was the very acme of sophisticated wit. Because I always, always got caught, the authorities finally chucked me out after no fewer than four rustications. Just as my Cambridge college couldn't have cared less about these stupid pranks, so today's police and media and Home Office spokesmen ought not to hyperventilate about what are - even at Buckingham Palace and the House of Commons - inherently harmless pranks. (Of course the scenes outside the Palace of Westminster are a different matter entirely.)
The people who invaded parliament, or "the very heart of our democracy" as it was put in endless po-faced editorials, were instantly recognisable as a bunch of hoorays, polo players, auctioneers, stud owners, chefs and point-to-point jockeys .... Their caper had precisely zero implications for the vulnerability of parliament to a genuine terrorist attack. Which of Osama bin Laden's chums has the wit or brio to invent the "All Party Electrical and Skills Group" - or don an ill-fitting Batman suit, for that matter? (One can almost see the copper yawning as he waved through the group headed for an assignation as dull as that.)
Rather, we ought to see both these pro-hunting hoorays and the (far less appealing) Fathers4Justice as part of that long British protest tradition of pushing single issues up the political agenda by using humour and high jinks instead of weapons. Whether you approve of their messages - as I do - or disapprove - as I imagine most Guardian readers do - it is important not to judge them by different standards than other protesters over the centuries.
The Suffragettes who handcuffed themselves to the railings of Buckingham Palace, the anti-Vietnam demonstrators who chanted witty anti-LBJ jibes in the 60s, the feminists who publicly burned their bras, the anti-globalisation campaigners who gave Winston Churchill's statue a mohican haircut on May Day 2000 (much to my disgust) - all used eye-catching, provoking, sometimes highly amusing stunts to grab the attention of an often bemused public. The phrase "However, we will not be able to provide your team with safety wear" is just the latest part of a very long - and for the most part honourable - British tradition....
Posted on: Thursday, January 24, 2008 - 12:32