Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Christian Science Monitor (1-14-10)
Thus far, President Obama has done all the right things in response to Tuesday’s catastrophic earthquake in Haiti. He quickly dispatched American ships, transport planes, and more than 2,000 troops to help distribute supplies, search for missing victims, and maintain order in the devastated Caribbean nation. At a press conference on Thursday, the president also pledged $100 million in immediate US aid.
But there’s one pledge Mr. Obama hasn’t made: to visit Haiti himself.
And he should, as soon as possible.
When disaster strikes in the United States, after all, presidents traditionally travel to the stricken areas to offer symbolic as well as material support. A presidential visit signals the nation’s concern for the victims, drawing renewed attention to their plight. The time has come for Obama to extend this practice outside our own borders to a country that is suffering beyond anything we can imagine....
An Obama visit to Haiti would galvanize American and international relief efforts, as nothing else could. It wouldn’t hurt that Obama is America’s first black president, and Haiti the West’s first black republic. Most of all, though, Obama’s visit would remind Haitians – and the world – about the common humanity that binds us all....
Posted on: Thursday, January 14, 2010 - 18:39
SOURCE: Private Papers (1-11-10)
In 480 B.C., a decade of Aegean tension culminated in the Persian invasion of Greece. Nothing seemed able to stop the onslaught of King Xerxes as he broke through the pass of Thermopylae — until the Greeks under Themistocles rallied at the sea battle of Salamis and saved the West.
In A.D. 69, the Roman Empire was tottering on its very foundations. Rome had been rocked by decades of corruption, assassinations, coups and military revolts. By the end of 69, Vespasian — the fourth emperor that year! — had put an end to nearly a century of erratic Julio-Claudian rule when he brought sanity back to Roman government.
Fast-forward to the modern era. The rise of fascism erupted into war and conquest in 1939. That year, Franco's Nationalists won the civil war in Spain. The Soviet Union fought Japan in a border war — during which it signed the Molotov-Ribbentrop non-aggression pact with Hitler's Germany. Weeks later, the Nazi invasion of Poland marked the start of the Second World War....
2010 may turn out to be a similar year of destiny. In 2009, the United States gave Iran at least four deadlines to stop its nuclear program. All were ignored. Does an emboldened theocracy believe this now is the year to become nuclear and change the entire strategic makeup of the Middle East?...
Meanwhile, the cash-flush Chinese have not been idle. This year they will continue to use their vast budget surpluses to expand their armed forces — as skyrocketing debts in the years ahead force us to curtail our own....
In 2010, our year of decision, events may come to a head and overwhelm the existing American-led global order unless President Obama can galvanize Western allies to meet the mounting danger.
Posted on: Thursday, January 14, 2010 - 16:53
SOURCE: The New Republic (12-25-09)
The airplane rises from the runway. Bent, folded, and spindled into the last seat in coach class--the one that doesn’t really recline--I pull my Kindle out of the seat pocket in front of me, slide the little switch, and lose myself in Matthew Crawford’s story of his passage from policy wonk to motorcycle mechanic. The gritty world of his workshop takes shape as I read. The airplane noise fades away, the pain in my cramped knees almost disappears, my eyes cease to blink. I am Kindled. It is by no means the first time. These days, in fact, I spend a lot of my time that way.
For the last few months, my Kindle has come with me pretty much everywhere: on research trips to Cambridge, London, Munich; to weekend conferences; even to lectures and movies, for the moments before they start. (So many books, so little time.) The one in my possession is the second-generation Kindle, not the big new DX that my hip colleagues have begun to pull out at meetings, their PDFs neatly readable on its 9.7” screen. But it is a neat little device--light, slender, as easy to hold for a long time as a small paperback--and it works very well. The battery charges quickly, which is more than I can sometimes say about myself, and then works for an astonishingly long time. Amazon’s Kindle Shop, made accessible by Sprint, is only a click or two away, and downloading books and periodicals from it is quick and simple. And everything you need to do to move forwards and backwards through the texts, mark them up and store them, immediately becomes apparent. The device seems to arrive borne on clouds of ease and simplicity. When I took it to Britain, a single search on Google turned up a converter that enabled me to recharge it as easily in my dorm room at Trinity College Cambridge as at home.
The Kindle liberates you. Once you have it, you don’t have to ransack the Fiction and Literature sections at airport bookshops, pulse fluttering and breath becoming shorter, as you face the prospect of an eight-hour flight with nothing but a laptop full of work to do (we addicts have a hard time with withdrawal). You don’t have to break your back schlepping six books in case you run out of reading matter during a sleepless night in your hotel. You can even get the Homeland Security people to crack a smile and engage in conversation--at least if you carry your Kindle, as I do, inside a special book hollowed out for the purpose, and produce it when you undergo your screening. Any road warrior with a taste for reading will find the Kindle indispensable....
What does the move from codex to Kindle mean? Will electronic reading kill the long, demanding novel and the deep, complex work of scholarship? Or--an even scarier thought--will it foster the vogue for vooks? (Vooks are hybrid works that combine text with video. They are to real books what PowerPoint presentations are to real lectures.) In search of help in thinking about the future that readers confront I turned, first, to a recent essay by the French philosopher Jean-Luc Nancy. An expert on Hegel, Schelling, and Heidegger, an interlocutor of Bataille, Lacoue-Labarthe, and Derrida, Nancy is not the horror-inspiring hater of the humanities that those names will evoke in all too many readers. He loves books in the good old way, in all their gritty materiality: “the engraved and impressed characters now repeated in numerous copies on mobile supports, like those that preceded them, traced with a stylus and copied numerous times on skin, bark, or silk.” But in the end, Nancy argues, books cannot be reduced to stable, material things. Born thanks to an author’s feverish creative energy, texts shift endlessly in form and meaning as publishers and booksellers, critics and readers, respond to them....
The Kindle is terrific for downtime reading. I have enjoyed reading several new books that I would never have bought at full price or put on my material bookshelves, but that were easy to find and whiled away an hour or two. I am even more pleased to know that the complete Sherlock Holmes and several Dickens novels, which cost pennies, are always available (as they are, of course, on laptops and iPhones). The bigger version offers a good way to bring all your papers, once scanned and turned into PDFs, to meetings. But it is not a great way to read demanding or technical books.
An experiment at Princeton and other universities, in which some courses have put all of their readings on Kindle, has yet to arouse much enthusiasm on my campus. “I hate to sound like a Luddite, but this technology is a poor excuse of an academic tool,” said one student who is taking part, as quoted in the Daily Princetonian. “It’s clunky, slow, and a real pain to operate.” He explained that “Much of my learning comes from a physical interaction with the text: bookmarks, highlights, page-tearing, sticky notes, and other marks representing the importance of certain passages--not to mention margin notes, where most of my paper ideas come from and interaction with the material occurs. All these things have been lost, and if not lost they’re too slow to keep up with my thinking, and the ‘features’ have been rendered useless.” There are readers--Adobe has an excellent one--that allow really efficient markup and search, but they are expensive, and they have to be used on a normal laptop or one of the increasingly popular miniature laptops known as netbooks. And by the unchanging laws of human nature, if twenty students in a class have their computers open, ten of them will be looking at the assignment--and the other ten will be checking their mail, messaging, Tweeting, surfing the Web, or playing Minesweeper. The Kindle does not offer these distractions, but for students and teachers working intensively together on a set text, it works far less well, so far, than books....
Posted on: Wednesday, January 13, 2010 - 15:35
SOURCE: Politico (1-12-10)
Still, there are a number of reasons why Democratic leaders should be pleased with Congress’s performance. Congress passed a $787 billion economic stimulus bill that many experts say helped stabilize the financial markets and is leading to a steady, if slow, economic recovery. Congress passed and the president signed several important pieces of legislation, including an expansion of the State Children’s Health Insurance Program and a bill protecting women from workplace discrimination. The House and Senate have both passed versions of health care reform and are close to sending President Barack Obama a final bill. The House has also passed a number of bills that are awaiting action in the Senate, including ones addressing climate change and financial regulation.
House Speaker Nancy Pelosi (D-Calif.) has skillfully navigated divisions between moderates and liberals in her caucus and has proved a pragmatic leader willing to cut deals, compromise and use party assets and appropriations to reach the magic number of 218 votes. She does this while preserving her broader agenda, constantly seeking to find ways to make contemporary liberalism politically viable. Time magazine named her a runner-up for person of the year.
Then why are so many Democrats feeling blue? The problem is that after the 2008 election, many of Obama’s supporters were hoping for much more. They thought that, by now, Congress would have passed a larger number of the major items on the president’s agenda and that the legislation that passed would not be so watered down. Too many bills (climate change) seem to be stuck in the Senate, while others (immigration reform) are off the agenda altogether....
Reform is not impossible. During the Progressive Era, congressional reformers empowered committees, weakened the speaker of the House and passed an amendment that resulted in the direct election of senators. During the 1970s, congressional reformers weakened committee chairs, opened the legislative process to public scrutiny, created new ethics rules and campaign finance regulations and lowered the number of members needed to end a filibuster.
Unless Democrats take on the ways in which Congress works, the political playing field will remain as treacherous as it was in 2009. Legislators with the best of intentions and the greatest of skills will still face the institutional pressures that bore down on them throughout Obama’s first term. In turn, the president will find that the possibility for bold change will remain slim.
Posted on: Wednesday, January 13, 2010 - 11:56
SOURCE: National Review Online (1-8-10)
Most of the current acrimony over counterterrorism is stale. The debate is simply a rehash of issues that were discussed and, in fact, resolved early last decade.
Let us review them one more time.
MOST TERRORISTS ARE NOT POOR AND DOWNTRODDEN
September 11 taught us that a Mohammed Atta or a Khalid Sheikh Mohammed does not commit mass murder out of hunger, want, illiteracy, or Western oppression.
No doubt Middle Eastern poverty contributes to religious violence. But the poor in Palestine, Saudi Arabia, Egypt, and Yemen are no more impoverished than those in the slums of São Paulo, Mexico City, Ho Chi Minh City, or Johannesburg. And the latter, despite their frequent claims against the West, do not feel a need to murder en masse in the name of their particular religion.
A Major Nidal Hasan or an Umar Farouk Abdulmutallab wishes to kill Westerners not because he is poor or even on behalf of the poor, but rather out of a warped sense of pride, hurt, and anger.
Such passions derive from a radical religious creed that insists that comparative failure in the modern Middle East is not self-induced — much less a product of fundamentalism, anti-Enlightenment thinking, autocracy, gender apartheid, tribalism, corruption, and statism. Instead the fact that there is no longer an intercontinental caliphate of rich and powerful believers is due to some sort of contemporary Jewish or Western oppression.
The wealthier, better educated, and more Westernized the radical Muslim, often the greater the sense of shame, alienation, and anger that he and his religion are not shown proper deference. We knew all that in 2001, but have apparently forgotten it during eight years of relative calm.
Hasan hated American soldiers not because our system had discriminated against him, much less because of “secondary post-traumatic-stress syndrome,” or any of the other wacky excuses that followed his crime. Instead, in part he sensed that the American military had bent over backwards for him and accommodated his extremism — and was therefore, in his own distorted worldview, weak, decadent, and deserving of what he would dish out.
THERAPY IS NO ANSWER
Radical Islam’s anger is irrational. It is not predicated on the degree of outreach shown by the United States. A contrite and compliant Jimmy Carter, after all, prompted the creation of the slur “The Great Satan.” The year 2009 saw the greatest number of foiled terrorist plots against America since 9/11. Indeed, one-third of all such attempts in the last eight years happened last year — the time of the Obama Al Arabiya interview, the Cairo speech, the bowing to Saudi royals, the promises to close Guantanamo Bay, and the ritual trashing of the Bush anti-terrorism policies.
We need not be gratuitously rude. There surely is a role for sober diplomacy and soft speech. But the degree to which radical Islam will be aggressive toward the West hinges a lot on what it imagines will be our reaction — in terms both of military responses, and of the sense of confidence we project about our own civilization.
Islamists, after all, ignore past American help to, and support for, Islamic peoples in Afghanistan, Bosnia, Chechnya, Indonesia, Iraq, Kosovo, Kuwait, and Somalia — only to pay far more deference to the Chinese and Russians, who have systematically oppressed and often butchered fellow Muslims. Apparently, Dr. Zawahiri and Osama bin Laden would rather recommend reading by Noam Chomsky and Jimmy Carter than offend Vladimir Putin and earn another Grozny.
The popularity of bin Laden and the tactic of suicide bombing itself plummeted throughout the Middle East between 2001 and 2009. And that was not because of the mellifluousness of George Bush’s Texas twang or a sudden love for our policy in Iraq.
Rather, the change of heart developed because bin Laden and his epigones were considered to be losing in Afghanistan and Iraq. They were endangering those who supported them, and murderously turning on their own — even as the United States was projecting both an image of confidence and readiness to extend support for consensual government and personal freedom.
In contrast, the current policy of apology and kowtow — coupled with a cynical realism (albeit cloaked in nonjudgmental, multicultural relativism) and presented abroad with a sense of hesitation and self-doubt — is, in fact, a prescription for reviving radical Islam.
That lesson likewise was apparent after 9/11.
A PROJECTION OF WEAKNESS IS DANGEROUS
Much of radical Islam’s posture is predicated on our expected response. When we did nothing during the Iranian hostage crisis, more or less whined after the Marine-barracks bombing, sent a few cruise missiles after the East African embassy attacks, litigated the 1993 World Trade Center bombing, and forgot the USS Cole, bin Laden concluded that the West was the “weak horse” and pressed on.
To some degree, Afghanistan and Iraq changed that impression, especially the devastating defeat of al-Qaeda in al-Anbar province in 2006–2008. But that costly progress was accompanied by more recrimination against the Bush administration than anger directed at radical Islam.
Equally important, the Western world said very little about the Danish-cartoon threats, the killing of Theo Van Gogh, and various premodern Muslim actions like rioting after the Pope’s Byzantine exegesis and the false stories of Koran burning in Guantanamo. Had Europe and the United States shown a united front on behalf of freedom of expression, rather than a fear of Islamic reaction, such incidents would have been written off as the lunacy they were.
Instead of reacting to perceived Muslim grievances, we should be continually directing questions to Islam: Why are there numerous mosques in the West, but few churches in Islamic countries? Why are Korans freely disseminated in the West, but Bibles not so under Islamic rule? Why do Muslims enjoy more freedom and rights under Western secular law than in their own countries? The aim of such interrogatories is not to score points, but to suggest to radical Muslims that we hold them to the same standards as we hold ourselves.
ISRAEL IS NOT THE PROBLEM
Just because radical Muslims and the Arab Street claim that a Jewish presence on the West Bank is the catalyst for terrorist outrage does not make it so — any more than Hitler’s insistence that Versailles drove him to the invasion of Poland in 1939, or Argentinians’ claims that their problems in the early 1980s originated with the British “occupation” of the Malvinas.
No Germans today are blowing up Poles for the loss of Danzig and East Prussia. Greek Cypriots are not planting IEDs at Turkish embassies to force the return of ancestral homelands. And the world is not concerned about the divided city of Nicosia or Russian occupation of the Kuriles.
No, what privileges the Palestinian question is largely three factors that have nothing to do with disputed ground: the presence of huge amounts of oil on Arab lands, endemic anti-Semitism in the West and at the U.N., and fear of radical Islamic terrorism.
Take those considerations out of the equation, and the West Bank is about as important to the world as a disputed South Ossetia. We forget that there were three Middle Eastern wars well before the so-called occupation of Palestine. Gaza did not become a calm place once the Israelis left.
Should Palestinians cease the violence, welcome investment from elsewhere in the Arab world, and establish a consensual government, one transparent and free of corruption, the West Bank could become like Dubai — and deal with Israel as a responsible neighbor adjudicating a common border. And yet radical Islamic terrorism in general would nevertheless continue with fresh and always mutating grievances.
All that was clear around 2001 — but is apparently now ignored.
THE SO-CALLED WAR ON TERROR WAS WORKING
We constantly argue and bicker about what we should be doing rather than showing some appreciation for our past successes. Our country has not experienced another terrorist attack on the scale of 9/11. For all the tragedy of Iraq, what was unthinkable in 2006 has now become accepted — a continuing constitutional government and a month of “war” without a single American fatality. The U.S. military is not broken; in fact it has fought brilliantly in both Iraq and Afghanistan. General Petraeus’s surge, unfairly caricatured at the time and now largely forgotten, was a remarkable military and political victory.
There are now proven protocols for dealing with terrorism that work and are not at odds with the Constitution. For all the talk of al-Qaeda’s resilience, it has lost thousands of its top echelon. The regime in Iran is shaky — and shakier still for the continuance of a constitutional system in neighboring Iraq. Europe is shedding its politically correct appeasement of Islam, and several countries have already enacted statutes about Islamic dress and mosques unthinkable in the United States.
“Bush did it” is becoming ironic, and is having the unintended consequence of reminding us how well we once defended ourselves — and how risky it is not to appreciate why and how.
WILL WE NEVER LEARN?
In short, soon after September 11 the United States correctly sized up radical Islam, its nature, its aims, and its pseudo-grievances. We may have made mistakes in implementation, and at times in tactics and strategy, but in large part we had contained the threat, and radical Islam was losing its currency.
Apparently we’ve forgotten why that was so, and thus continue to beat the old dead horse in our own self-recrimination.
— NRO contributor.
Posted on: Tuesday, January 12, 2010 - 19:37
SOURCE: TomDispatch.com (1-12-10)
Viewed objectively, though, this assumption is over-optimistic. It overlooks cardinal differences between the present moment and the 1978-1979 events which led to the overthrow of the Shah of Iran and the founding of an Islamic Republic under Ayatollah Ruhollah Khomeini. History shows that a revolutionary movement triumphs only when two vital factors merge: it is supported by a coalition of different social classes and it succeeds in crippling the country’s governing machinery and fracturing the state’s repressive apparatus.
Two Movements, Two Moments
A short review of Iran’s 31-year-old revolution is in order. In February 1979, the autocratic monarchy of the Shah collapsed when the country’s economy ground to a halt due to strikes not only by the religiously observant merchants of the bazaar, but also by civil servants, factory employees, and (crucially) leftist oil workers. At the same time, the foundations of the modern state -- the armed forces, special forces, armed police, and intelligence agencies, as well as the state-controlled media -- cracked.
The street demonstrations, launched in October 1977 by Iranian intellectuals and professionals to protest human rights violations by SAVAK, the Shah’s brutal secret police, lacked both focus and an overarching set of coherent demands articulated by a towering personality. That changed when Khomeini, a virulently anti-Shah ayatollah exiled to neighboring Iraq for 14 years, was drawn into the process in January 1978. From then on, the ranks of the protestors swelled exponentially.
Today, the key question is: Have the recent street protests, triggered by the rigged presidential poll of last June, drawn one or more of those segments of society which originally ignored the electoral fraud or dismissed the claims to that effect?
The evidence so far suggests that the protests, while remaining defiant and resilient, have gotten stuck in a groove -- even though on December 27, the day of the Shiite holy ritual of Ashura, they spread to the smaller cities for the first time. What has remained unchanged is the social background of the participants. They are largely young, university educated, and well dressed, equipped with mobile phones, and adept at using the Internet, YouTube, Facebook, and Twitter.
In the capital, they are usually from upscale North Tehran, which contains about a third of the city’s population of nine million. It is home to affluent families, many of whom have relatives in Western Europe or North America. They often spend their vacations in the West; and most are fluent in English and at ease with computers.
Naturally, then, Western reporters and commentators identify with this section of Iranian society, and focus largely on them, inadvertently or otherwise.
In the autumn of 1977, too, such people predominated in the street protests against the Shah. The difference now is one of scale. Since the Islamic Revolution, there has been an explosion in higher education. Between 1979 and 1999, while the population doubled, the number of university graduates grew nine-fold, from a base of 430,000 to nearly four million. The student bodies of universities and colleges have soared to three-quarters of a million young Iranians. That explains the vast size of the protests and their sartorial uniformity.
Now, the foremost question for Iran specialists ought to be: Over the past six months have significant numbers of residents from downscale South Tehran, with its six million people, joined the protest? Going by the images on the Internet and Western TV channels, the answer is “no.” South Tehranis do not wear fashionable jeans, and any protesting women would appear veiled from head to toe and without noticeable make-up.
It is South Tehran that contains the Grand Bazaar, covering five miles of warren-like alleyways and more than a dozen mosques. That bazaar is the commercial backbone of the nation with its intricately woven strands of trade, Islamic culture, and politics. Its lead is followed by all the other bazaars of Iran. Because Prophet Muhammad was a merchant, there has been a symbiotic relationship between the commercial class and the mosque from the early days of Islam. Iran is no exception and the importance of the bazaar’s influence still cannot be overestimated. After all, it was barely a century ago that oil was first found in the country, while industrialization gained a foothold only after World War II.
So, have bazaar merchants begun to shut their shops in solidarity with the protestors -- as they did during the anti-Shah movement? No again.
Leaving aside the shuttering of stores, if some bazaar traders were simply to resort to setting up their own blogs and joining the protests online, that in itself would surely draw the attention of the regime of Supreme Leader Ayatollah Ali Khamanei and might even lead it to consider a compromise with the reformers.
The Limits of 2010
So far the opposition has been led by the defeated candidates for the presidency -- Mir Hussein Mousavi and Mahdi Karroubi -- neither of whom has anything like the charisma or religious standing of a Khomeini.
Furthermore, the opposition suffers from the lack of a single overarching demand. During the 1978-1979 movement, Khomeini rallied diverse anti-Shah forces -- from Shia clerics to Marxist-Leninist groups -- around a maximum demand: Dethrone the Shah.
Then Khomeini managed to hold together this unwieldy alliance by championing the causes of each of the social classes in the anti-Shah coalition. The traditional middle classes of merchants and artisans saw in him an upholder of private property and a believer in Islamic values. The modern middle classes regarded him as a radical nationalist committed to ending royal dictatorship and foreign influence in Iran. The urban working class backed him because of his repeated commitment to social justice which, it felt, could only be achieved by transferring power and wealth from the affluent to the needy. The rural poor saw him as the one to provide them with arable land, irrigation facilities, roads, schools, and electricity.
Khomeini performed this superhuman task by maintaining a studied silence on such controversial issues as democracy, the status of women, and the role of clerics in the future Islamic republic.
Today, the most popular slogan of the protestors is “Death to the Dictator,” meaning Supreme Leader Khamanei. (In Persian, “Marg bar Diktator” rhymes well.) Yet that is certainly not what either Mousavi or Karroubi wants.
On his website, Mousavi recently demanded the release of all political prisoners and the amending of the electoral laws, along with the enforcement of freedom of expression, assembly, and the press as stated in the Iranian constitution. In short, he wants to reform the present system, not overthrow it.
As it is, there is a mechanism in the constitution for the removal of the Supreme Leader. The popularly elected 86-member Assembly of Experts has the authority to appoint or dismiss him.
That Assembly is presided over by Ali Akbar Hashemi Rafsanjani. As a former close aide to Ayatollah Khomeini, his revolutionary credentials are on a par with Ali Khamanei’s.
Rafsanjani backed Mousavi in his presidential bid with funds and strategic planning. Now, if he decides, he can summon the Assembly of Experts for an emergency session to debate the present crisis caused by the divisions at the top. Normally the Assembly meets only twice a year. But being a shrewd politician, Rafsanjani would first consult senior Assembly members individually to test the waters. It seems so far that he has not succeeded in gaining strong enough support for a special session.
At the grass-roots level, the numerous oppositional blogs and websites rarely deal with the big picture. They are mainly focused on highlighting the brutal repression and arguing that Khamanei’s regime has strayed wildly from its Islamic roots and its revolutionary promises of justice, freedom, and independence.
Their critique, however, covers only one major aspect of the situation. It is not enough to bring about regime change in the country. A second, complementary side would have to spell out some specifics about how the protestors want to see their vision of change realized in practice. At the very least, the opposition ought to debate the issue, which it is not doing now; or it could emulate Mousavi, who has dropped his earlier demand for a fresh presidential poll to be supervised not by the interior ministry but by a non-governmental body. That gesture could, sooner or later, open the way for a compromise with President Mahmoud Ahmadinejad that might lead to a national unity government composed of his partisans and the opposition leaders.
One major difference between 1979 and 2010 is that the Internet provides a great opportunity for a kind of debate that was unthinkable until a decade ago. On the other hand, what the 1979 movement and the present one have in common is the idea of making political use of the Shiite religious days, the Islamic custom of commemorating a dead person on the 40th day of his or her demise, as well as of the martyr complex engrained among Shiites. It was Ayatollah Khomeini who pioneered such tactics. He consistently used the 40th day of mourning for the martyrs of the Shah’s regime to draw ever bigger, ever more enthusiastic crowds in the streets, and used the holy month of Ramadan to charge the nation with revolutionary fervor.
The attempts of today’s opposition leaders to emulate Khomeini’s example have not succeeded chiefly because their camp lacks a religious leader of his stature.
The near-fatal blow that Khomeini struck at the Shah’s regime lay in the fatwa he issued decreeing that firing on unarmed protestors was equivalent to firing at a copy of the holy Quran. Most of the Shah’s soldiers, being Shiite and often young conscripts, accepted Khomeini’s interpretation. Many of them had already lost faith in their commanders after bank employees revealed, in September 1978, that top army officers had been transferring vast sums abroad. Little wonder that, by the time the Shah left Iran in January 1979, the army’s strength had plummeted from 300,000 to just over 100,000, mainly due to desertions.
By contrast, there is little evidence so far that the present regime’s security forces -- the heavily indoctrinated Revolutionary Guards, the Basij militia, or the armed police -- are vacillating when ordered to break up demonstrations with force. For its part, the regime, aware of the danger of creating martyrs and of the historical precedent, has taken care to make minimal use of live fire in dispersing protesting crowds.
During the 12 months of the revolutionary movement that stretched from 1978 into 1979, the indiscriminate use of live fire by the Shah’s regime led to between 10,000 -- the government figure -- and 40,000 -- the opposition’s statistic -- deaths. In the six months of the street protest this time around, the total, according to the opposition, is 106.
Nationalism as a Factor
If this interpretation of the current situation in Iran has focused solely on internal political dynamics, that doesn’t mean external forces are unimportant. Given the geo-strategic significance of Iran in the region and the world, any move by not-too-friendly Western governments against Tehran is bound to alter the domestic situation dramatically.
Were the Western powers, for instance, to succeed in ratcheting up economic sanctions against Tehran through the United Nations Security Council, the opposition would undoubtedly cease its protests and cooperate with the Ahmadinejad administration to face a common national threat under the banner of patriotism.
With a proud recorded history stretching back six millennia, Iranians have evolved into staunch nationalists in modern times. That is a simple, if overarching, fact which leaders in the West cannot afford to ignore.
Posted on: Tuesday, January 12, 2010 - 17:54
SOURCE: American Thinker (1-11-10)
This past weekend it was revealed that Harry Reid in 2008 referred to then-Senator Obama as "light skinned" and "with no Negro dialect, unless he wanted to have one." The reaction to these revelations came as expected: Republicans called for Reid to resign from his leadership position, while Democrats circled the wagons around the Senate Majority Leader. Despite their criticisms, Republicans have missed the full meaning of Reid's statement in 2008 as well as his recent clarification.
While the immediate acceptance of Reid's apology by those who censured Rush Limbaugh's attempt to buy an NFL franchise is striking, the last part of Reid's comment and his subsequent statement from over the weekend give away his original motive. In his comments the Senate Majority Leader explicitly acknowledged that using race is a valuable instrument in the Democratic political playbook. When Reid noted that President Obama had "no Negro dialect, unless he wanted to have one," he recognized the ability to shift racial accents as a political asset. In his apology he regretted only "using such a poor choice of words," not the meaning of the statement itself. In defense of the Nevada senator, other liberals have only reinforced Reid's admission....
As others and I have noted, throughout the year Democrats have had a penchant for using race as a way to attack their conservative critics instead of engaging with real opposition to their programs. Rather than just demanding that Harry Reid resign from his Senate leadership post (after all, he is not going anywhere), conservatives should remember his words as an acknowledgment that many Democrats see racial diversity as a political weapon to exploit, not as something that makes the United States a unique place in the world. It's an opportunity not only to show the hypocrisy of one senator, but to shed light on the racial cynicism found among some members of an entire party.
Posted on: Tuesday, January 12, 2010 - 15:17
SOURCE: LA Times (1-11-10)
"For a long time now there's been too much secrecy in this city." That's what President Obama said on his first day in office. He was talking about the way George W. Bush and Dick Cheney had used 9/11 as a pretext for pulling a veil over many of their key policies and actions. Last week, Obama announced he was replacing Bush's executive order on classified documents with a new one designed to reduce secrecy. Obama's policies are a distinct improvement, but they don't really solve the underlying problem.
The basic idea is a simple one. As Obama said in the order: "Our democratic principles require that the American people be informed of the activities of their government." Officials rely on secrecy to avoid being held responsible for their failures and to conceal illegality and misconduct -- waterboarding of suspected terrorists, for example. If practices like waterboarding are a good idea, the details of why, when, how and who should be knowable and defendable in public debate. That's the principle behind the Freedom of Information Act, which permits "any person" to request government documents.
As the new rule states, the "secret" classification should be reserved for documents "that would clearly and demonstrably reveal: (a) the identity of a confidential human source or a human intelligence source; or (b) key design concepts of weapons of mass destruction."
Obama's order requires all federal agencies to review their classification systems and identify information that no longer needs protection. Most important, the order states: "No information may remain classified indefinitely."
But the enemies of openness have had powerful weapons in their hands, and Obama probably hasn't done enough to defeat them. Agencies maintain secrecy first of all by delay. It took 19 years for the government to release documents -- long supposed to be declassified -- on the 1959 Berlin crisis requested by the National Security Archive at George Washington University.
Four hundred million documents dating from World War II and the Cold War remain classified, despite a long-standing mandate that all such documents be released after 25 years unless they fall under a few narrowly specified exemptions. In 2000, President Clinton gave agencies a deadline of Dec. 31, 2003, to obey the order. When they failed to do so, President Bush established a new deadline: Dec. 31, 2009. They didn't meet that one either. Obama's approach? Set a new deadline, four years from now.
He has also introduced procedures that ought to help agencies meet the deadline. The review is one of them, but so is a new rule about documents that belong to more than one agency. If a document has been circulated to the FBI, the CIA and the State Department, for example, each in sequence must review it before it can be released. To avoid this, Obama established a single authority, the National Declassification Center.
Another weapon in the hands of the enemies of openness was a rule decreed by Bush in 2003 giving the head of the intelligence community the power to veto decisions to release documents made by an interagency panel. Since the CIA lives by secrecy, the results were disastrous. Obama has repealed that rule; now only the president can reverse such decisions.
But there's a bigger problem that Obama's new order does not address. We have an antiquated Cold War-era secrecy machine that kept citizens in the dark about what their government was doing, but it didn't prevent the Soviets from getting our biggest secrets. Rather than tweaking the present system, as Obama is doing -- eliminating some roadblocks to declassification, streamlining the review process -- we need wholesale changes, in part because the threats today are so different from what they were in the 1950s. We need a broader public debate about how to combat terrorism, and we need a public that is knowledgeable and informed about how we are dealing with terrorist threats.
To start, all documents more than 25 years old should be automatically declassified. Cold War secrets are irrelevant in today's world. We don't need to spend taxpayer dollars going through these documents page by page. (In fact, a Department of Defense task force concluded that "perhaps 90%" of technical and scientific information could be safely revealed within five years of classification.)
Then we need a requirement that declassification rules serve the public's right to know. Without such a directive, it will be much easier for the Obama administration to continue to keep secret aspects of Bush-era national security policy. Jameel Jaffer, director of the ACLU National Security Project, has a list: "The CIA is still withholding documents about its rendition, detention and interrogation program. The Justice Department is still withholding the legal memos that supplied the basis for the National Security Agency's warrantless wiretapping program. The Defense Department is still withholding the interrogation directives used by special forces in Afghanistan." We need this information if we are to avoid repeating abuses from the past and to evaluate the wisdom of government policy in the present.
The principle that democracy requires open government applies not only to Bush administration records but also to Obama's. And yet Obama has violated that principle in his executive order by increasing the secrecy around his own decision-making, as pointed out by Patrice McDermott, director of OpenTheGovernment.org.
Under Clinton, information originating from the president and his staff was exempt from release while the president remained in office. George W. Bush added to that the vice president and his staff. Now Obama has expanded that to include "committees, commissions or boards appointed by the incumbent president; or other entities within the executive office of the president that solely advise and assist the incumbent president."
The strongest counterargument to greater openness is that it would permit terrorists to learn more about our vulnerabilities -- for example, if security lapses at nuclear sites were made public. But what's more likely to lead to improving nuclear site security: public disclosure of lapses, followed by public outrage and public demands for change, or continued secrecy?
In 2000, Congress created a Public Interest Declassification Board to solicit suggestions from the public and make recommendations about declassifying documents. In 2008, according to Steven Aftergood, director of the Project on Government Secrecy at the Federation of American Scientists, Bush ordered the FBI, the CIA and other agencies to comment on the board's recommendations. Open government groups then filed FOIA requests for copies of the agencies' comments. The agencies refused to release them. That suggests how difficult it will be to get the agencies to change, even with Obama's new policies.
Posted on: Monday, January 11, 2010 - 13:21
SOURCE: The End is Coming (History Blog) (1-7-10)
Historians are abuzz over a new announcement from the Vatican. It will be releasing 105 documents from its secret archives to the public, many of which were never even known to exist. Kept in the vaults of St. Peter for a millennium, this very small sampling of fascinating literary sources would probably not have braved the centuries were it not for the extremely stringent conservation and access policies of the Catholic institution. That being said, these documents are going to be revealed in the form of a volume available at your nearest bookstore. The Vatican has kept these treasures of history (among others) and will now be profiting from their sale. Why does this make me uneasy?
Among the documents to be released, we will get a first glimpse of correspondence with Mozart and Voltaire, an 1863 letter to Pope Pius IX defending the Confederacy in America, a 14th-century investigation of the Knights Templar, 1934 letters from Adolf Hitler to Pope Pius XI, a 1550 note from the artist Michelangelo insisting on payment (for renovating St. Peter’s Basilica) that was three months late and, finally, a 1246 letter from Grand Khan Guyuk (grandson of Genghis) demanding Pope Innocent IV’s homage and submission to the Mongolian Empire.
Each one of these incredible sources has the historical community salivating yet there is an air of unease at the great breadth and variety of the works and the Roman plan to sell them. This not only implies that they are the exclusive property of the Vatican, but it also means that the Vatican will have thousands of critical historical documents to raffle off, a bit at a time, to potentially make billions.
I am not the only one in history who has felt uneasy about this simony (the Biblical crime of trafficking in Church property) and the exclusive, secret nature of the archives. In 1881, following unending centuries of accumulating archives for no one to see, one Pope wondered whether it was selfish of the Church to hoard, and keep secret, a bigger collection of historical heritage than exists anywhere else on the planet. The humanist and progressive Pope Leo XIII opened the archives to professional researchers on the very day he was elected, inviting scrutiny, analysis and diffusion of a huge new part of human history (including some very dark chapters in the history of the Vatican itself). Despite this opening, however, it has remained tremendously difficult for ANYONE to gain access to the archives, even with the proper credentials, faith, experience, references and qualifications. That is why a handy book, available to all and perhaps interpreted in a certain light, would be perfect. This brings me to a second problem.
The Sources Remain Secret
If one is lucky enough to be admitted to the Vatican’s secret archives, or clever enough to break in, one can consult the tens of thousands of documents contained therein. Once in book form, however, the documents may well have been edited, with a lengthy interpretation included for the (less-instructed) public. This will provide an official Vatican explanation for what was essentially unadulterated history. I have no doubt that the Vatican’s historians are professionals of great repute, but since I do not believe in complete objectivity, there will inevitably be a taint of bias and thus a rewriting of history, which rather defeats the purpose of revealing these primary sources.
And talking of Vatican historians, the historical community has not forgotten how they swindled us with the Donation of Constantine.
In the early 300s AD, the first Christian Roman emperor, Constantine I, supposedly drafted this edict giving Pope Sylvester (who technically held the title of Vicar of Rome at the time), as successor to St. Peter, “dominion over lands in Judea, Greece, Asia, Thrace, Africa, as well as the city of Rome, with Italy and the entire Western Roman Empire”. The edict was fortuitously revealed centuries later and used by a series of Popes to maintain their political power over the “Papal States” of Italy at times when the Vatican itself was threatened by medieval conquerors. The Donatio effectively defended the holdings of the Popes, around Rome at least, to this day. But wait, there’s more. By the 1400s, the suspicious royal courts of Europe had proven the document to be a complete fabrication. It was written around 750 AD by monks and/or Vatican historians (it is assumed) to affirm Christian territorial power in the face of aggressive Lombards and Franks. If you were wondering, it was the language that gave it away: feudal terms (“fief”) and anachronistic Latin proved that the document could never have been written so early.
With this in mind, it is no wonder that the only way the general population is going to get its hands on the treasure trove of the Vatican is after the material has first been submitted to intense official scrutiny and interpretation. With all these concerns in mind and on this page, I admit defeat: I have no other way of EVER seeing these documents, and yes, I will be buying the damn thing.
Posted on: Friday, January 8, 2010 - 10:16
SOURCE: New York Times (1-4-10)
The article didn’t make for pleasant reading, especially for people like myself who think that efficient railway services and other forms of well-run mass transport are a subtle but nifty measure of a country’s level of civilization and, in most cases, of its social and economic fabric. (Having been born in the village in northeast England where George and Robert Stephenson invented the world’s first locomotive makes me biased here, but no matter.) It was a report in the Financial Times on Dec. 27 of the successful initial runs in China of an ultra-high-speed train service, which provides an amazing under-three-hour link across the 1,100 kilometers that separate the cities of Guangzhou and Wuhan.
Was I irritated that the Chinese, who plan to build a rail network of 18,000 kilometers by 2012, can legitimately claim that they have sprung to the fore in humanity’s train development, overtaking at one leap the super-speed trains of Japan and France and Germany? Certainly not. After all, the People’s Republic of China seems as purposeful in its planned development of railways, ports, new cities and nuclear-power plants as it is in its control of civil disobedience and obstruction of free expression.
What was depressing to me was the laconic comment by the article’s author, one of many foreign observers attending China’s stunning demonstration of high-speed rail. After noting that the “Harmony Train” averaged 350 kilometers per hour, as compared with the maximum speed of 300 k.p.h. by the Japanese and French high-speed trains, he added: “In America, Amtrak’s Acela ‘Express’ service takes three-and-a- half hours to trundle between Boston and New York, a distance of only 300 kilometers.” Ouch.
The Acela is, as noted, America’s “express” train, and only exists along parts of the Eastern Seaboard. Most rail commuters in this region have to take slower Amtrak connections, or even slower trains like the Metro-North, where the overcrowding and the bone-shaking ride make one think this must be a train rattling across the plains of northern India. But we are the lucky ones: A large number of Americans don’t have access to any rail transportation system at all.
The comparison with Japan and Europe is staggering. When I am on study leave in Cambridge, in Britain, I can take a nonstop train to London twice an hour — and the trip takes less than 45 minutes. On disembarking at King’s Cross station, I can stroll through a tunnel to St. Pancras, and take the high-speed Eurostar train every hour to either Brussels or Paris; those trips take slightly over two hours.
Public and governmental fury over the Eurostar’s recent problems was because that system is expected to work on time, and normally does.
The high-speed Shinkansen makes it possible for someone from Tokyo to have lunch and a stroll in Kyoto in the middle of the day, and be back in the capital city by early evening. Now China is joining the club. Presumably the Gulf States will be next.
The reasons for America’s laggardliness are easily explained. To begin with, the initial investment in a rail network costs an awful lot of money, which national governments usually provide as a “public good.” That in turn means that the taxpayer pays, which is much less disagreeable when the taxpayer can observe the satisfactory results of that investment. In America, most of the country feels that it is handing over funds solely to support East Coast and West Coast commuters.
Then there is the American obsession with the automobile, and with aircraft. Given the sheer spread of the country, this once seemed to make a lot of sense and still does for many journeys today. Air travel is a fantastic conqueror of distance, and for decades car ownership has been synonymous with American individualism and love of the open road. Finally, there are considerable parts of America that are so under-populated that it would be highly uneconomical to have a rail network even one-quarter as dense as, say, Belgium’s. This article is in no way arguing for an end to air and road transportation....
Posted on: Friday, January 8, 2010 - 02:52
SOURCE: The Atlantic (1-7-10)
The promise and selling point of Barack Obama’s 2008 campaign—breaking with the past, delivering something new—was the oldest promise in American politics. Since European settlers crossed the Atlantic imagining (mistakenly) a “new world” without history, Americans have rewarded talk of new beginnings. The early colonists sought to create a society de novo in ways that Europe—with its religious wars, social stratification, and finitude of land—made impossible. To the Revolutionary generation, the acts of declaring independence and drafting a constitution seemed to ratify this mythology. And in every era since, Americans have fallen, starry-eyed, for leaders who speak of a future unencumbered by history’s weight. Theodore Roosevelt’s New Nationalism, Woodrow Wilson’s New Freedom, FDR’s New Deal, JFK’s New Frontier, even George H. W. Bush’s New World Order—all began with the promise of the new.
Of course, after the flush of a campaign, both voters and presidents have invariably discovered that history imposes constraints. After the Civil War, a cohort of young intellectuals invested hope in Ulysses S. Grant, only to see rampant corruption persist and the dream of reconstructing the South dissolve. After World War I, the crash-and-burn of Wilson’s noble quest for “peace without victory” soured Americans on an energetic executive for a decade. Bill Clinton’s New Covenant, a dead-on-arrival slogan, presaged the letdown that came as his followers realized that liberalism’s revival would require more than a few token compromises.
Obama in 2008 was just the latest aspirant to talk of beginning anew. He bested Hillary Clinton for the Democratic nomination in part by saddling her with the record of not one but two past presidents: the residual regret over her husband’s supposedly small-bore and blandly centrist Third Way agenda, and the collective buyers’ remorse over the Iraq War. In contrast to the dreaded “incrementalism” of the Clintons, Obama’s candidacy tantalized voters with a chance for what he called “transformational” or “fundamental” change.
One year later, transformation looks like a fleeting dream. No one knows whether Obama can deliver massive change on the scale of Lincoln, Wilson, FDR, or LBJ. But right now, the opportunity that loomed last fall seems to have passed. Conservatives—uncharacteristically mute last winter—have regained their voice, nearly derailing Obama’s health-care plan and keeping the administration on defense in the daily media wars. Meanwhile, liberals and leftists, who largely muffled their doubts when Obama had a presidency to win, are suddenly seething over his moderation and compromises—keeping suspected terrorists jailed indefinitely, countenancing his treasury secretary’s coziness with financial CEOs, letting center-right senators weaken his health-care plan. Washington pundits, for their part, intoned throughout 2009 that in taking on health care, energy, and financial reform in his first year, the president was attempting “too much.”
Yet the now-prevalent pessimism about Obama’s presidency is surely unwarranted. True, we can no longer expect Obama to be the agent of a post-partisan politics, or an uncorrupted anti-politician incapable of spin or triangulation, or America’s most civil-libertarian president, or a socialist. But in the modern age, presidents are never able to meet such expectations. Our hunger for presidential intervention, leadership, and salvation now exceeds any individual’s capacities. So the eclipse of these campaign-trail fantasies about Obama’s presidency hardly signals its death. On the contrary, it marks the true beginning.
“If there is anything that history has taught us,” John F. Kennedy said on the campaign trail in 1960, “it is that the great accomplishments of Woodrow Wilson and of Franklin Roosevelt were made in the early days, months, and years of their administrations. That was the time for maximum action.” But Kennedy was wrong—unless you choose to focus exclusively on the word years instead of days and months. As rich in opportunity as presidential honeymoons can be—and the best executives have used them to get important things done—a president’s real work doesn’t occur when he has what Obama calls the righteous wind at his back. It occurs when he has to soldier on into a fight, despite blustery headwinds.
Like the unit of 100 days, the benchmark of a president’s first year matters a lot to journalists but relatively little to historians. The 100-days concept itself, which originated with Roosevelt’s flurry of activity in early 1933, soon devolved into a transparent public-relations gimmick, as media-age presidents sweated over how to boost their grades on what soon came to be recognized as the president’s initial report card. Similarly, the now-ritualized year-one evaluation, though harmless as an exercise in journalistic stock-taking, offers a weak basis for predicting future performance. Indeed, none of the three presidents Obama has taken as his role models—Lincoln, FDR, and Kennedy—enjoyed a first year that foretold the direction of his presidency. Transformation doesn’t happen overnight.
Abraham Lincoln is Obama’s favorite president and his aspirational model. In 2007, the senator from Illinois launched his bid for the Oval Office in Lincoln’s shadow, on the steps of the Springfield Old State Capitol. With his message of national conciliation, Obama often echoed Lincoln’s second inaugural address. Even when he attacked his rivals, he suggested that he was merely combating their retrogressive politics, while he was summoning the better angels of our nature. At times, the Lincoln comparisons taxed credulity: Obama’s devotees even pointed to Lincoln’s one-term service in Congress—and his subsequent rise to become America’s greatest president—to answer the charge that Obama hadn’t accomplished enough in his career to earn him the White House. It was no surprise when, in January 2009, the incoming president took his inaugural oath on the Bible Lincoln had used, and presided over festivities branded as “A New Birth of Freedom.”
Yet as Obama surely knows, Lincoln—a transformative president if there ever was one—started his administration on a shaky note. His inaugural address fumblingly extended an olive branch to the seceding states of the South, promising (to no avail) that he would enforce the fugitive-slave law and uphold slavery in the states where it was legal. The Confederate attack on Fort Sumter forced Lincoln to change course. But on the crucial matter of slavery, the president—who had never considered himself an abolitionist—remained fairly conservative. “If I could save the Union without freeing any slave I would do it, and if I could save it by freeing all the slaves I would do it,” he wrote to Horace Greeley in 1862, “and if I could save it by freeing some and leaving others alone I would also do that.” Few foresaw that his presidency would end with the abolition of slavery and a redefinition of freedom, union, and equality.
Lincoln also needed time to gain his footing as commander in chief. Unsure of himself in military affairs, he was at the mercy of his generals, including the aging and detached Winfield Scott. Dispiriting defeats—notably at the First Battle of Bull Run, in July 1861—emboldened the South. Even after Lincoln mustered the wisdom to replace Scott, George B. McClellan, his new top commander, frustrated the president by declining to advance against Confederate forces. As for his domestic agenda, Lincoln, like most 19th-century presidents, followed Congress’s lead. But even there, despite a Republican leadership eager to exploit the sudden absence of Southerners, major laws—the Homestead Act, the Pacific Railway Act, and the Morrill Land Grant Act—didn’t get the president’s signature until 1862.
No one could say that Franklin Roosevelt began his first year in office hesitantly. His first 100 days were indeed a whirlwind of legislative and executive feats. But FDR geared his first-year efforts almost entirely toward recovery—a necessary but hardly transformative goal.
Certain measures—like solving the banking crisis, which had reached catastrophic proportions on the eve of his inauguration—made a palpable difference. But the core elements of FDR’s “First New Deal” turned out to be, on the whole, ineffectual or unconstitutional—or both. The National Recovery Administration, the centerpiece of it all, which relied on industry leaders to agree to production codes, was flawed in both conception and execution, and it failed miserably. When the Supreme Court unanimously ruled it unconstitutional, Roosevelt’s aide Robert Jackson called the decision a blessing in disguise, since it spared the president from having to watch Congress decline to renew the act. The Agricultural Adjustment Act, which regulated farm production through central planning, was also struck down. And then there was Roosevelt’s Economy Act, a misguided effort in budget balancing taken up before Washington discovered the wisdom of deficit spending.
Most of the New Deal’s lasting elements didn’t come until 1935. Only after taking a beating on the airwaves from demagogic populists like Senator Huey Long of Louisiana and the radio priest Charles Coughlin did FDR sign on to the Social Security Act, which created unemployment insurance, old-age pensions, and a safety net for the disabled. And not until his second term did his administration embrace a Keynesian strategy of aggressive spending to lift the economy out of crisis. If Roosevelt’s first year was historic for its activist spirit and purposeful intervention, its economic philosophy left little mark.
While Obama styled himself Lincolnian in his rhetoric of reconciliation, and Rooseveltian in his steadfastness in the face of economic distress, he just as often summoned the Kennedy mystique, presenting himself as the telegenic, inspirational torchbearer of an ascendant generation. Obama suggested that he wanted to “move the country in a fundamentally different direction,” as he believed Kennedy had. Just as Kennedy’s election shattered the anti-Catholic taboo in presidential politics, Obama’s promised to topple an age-old wall of racial prejudice. The Baby Boomers who flocked to Obama’s candidacy said he brought back memories of JFK. The claim was echoed most tellingly by the fallen president’s own brother, who anointed Obama as JFK’s successor after perceiving a slight to the family name in Hillary Clinton’s assertion that the skill of Lyndon Johnson—she didn’t mention Jack—had been instrumental in passing the 1964 Civil Rights Act.
In fact, on civil rights, as in other areas, Kennedy’s first-year performance dismayed his enthusiasts. As a candidate, he had vowed to desegregate federal housing with “a stroke of the presidential pen.” But once in office, he demurred; fearful of alienating powerful southern Democrats whose support he needed on other issues, he focused instead on foreign-policy problems. Not until he’d cleared the 1962 midterm elections did Kennedy issue the housing proclamation. Caution likewise informed his response to the Freedom Riders—the activists who rode buses across the South starting in May 1961 to force the government to uphold the Supreme Court’s desegregation of interstate travel. When white southerners brutally beat the activists, Kennedy and his aides, unprepared, at first tried to stop the rides, sending in federal marshals only when it seemed that the violence might turn deadly.
In foreign policy, too, the biggest developments of JFK’s debut year yielded little positive transformation. The Bay of Pigs invasion, an ill-conceived CIA scheme hatched under Dwight Eisenhower, redounded to Kennedy’s benefit only because he had the sense not to duck responsibility. At his June summit in Vienna with Nikita Khrushchev, the new president felt he was verbally pummeled by the Soviet premier, in what Kennedy called the “roughest thing in my life.” Kennedy’s tepid response may have encouraged Khrushchev to erect the Berlin Wall that fall. When that happened, too, JFK was slow to act (“Kennedy: You can’t stop tanks with words,” read one West Berliner’s protest sign), and even his decision to send retired General Lucius Clay and Vice President Johnson to West Berlin to boost morale did nothing to deter the Soviets. At the end of 1961, Kennedy’s aide Ted Sorensen mentioned that two reporters were considering writing books about the year gone by. Kennedy was mystified: “Who would want to read a book on disasters?”
The presidency that Obama’s resembles most so far isn’t any of these but, ironically, that of Bill Clinton—ironic because Obama, speaking in January 2008 about what makes a good president, implicitly denigrated Clinton even as he praised Ronald Reagan for having “changed the trajectory of America” and “put us on a fundamentally different path.” Obama, many speculated at the time, may have been playing head games with his peevish predecessor, goading him into another outburst that would thrill the press pack. Even so, it was a strange reading of history. Reagan’s election, after all, did not initiate but culminated a long conservative effort to gain control of the levers of power; his decisions as president moved his party to the right, but they also introduced fissures and frustrations into the conservative alliance. Clinton’s tenure, in contrast, began a new era for the Democrats, and after his eight years, virtually all of the party’s leading lights embraced what had been controversial stands in 1992: an internationalist foreign policy, a growth-centered economics, and a willingness to link social policies to family values.
The point would be trivial had Obama not reached for Clinton’s 1992 playbook during the fall 2008 campaign. Obama’s battle with John McCain, which centered on the hard-pressed middle class, showed that Obama represented less a repudiation of Clinton (as the primaries had suggested) than a continuation. His rhetoric wafted to earth to focus on everyday economic concerns. His convention speech opened, after the preliminaries, not with soaring visions of post-partisan unity but with issue-based, it’s-the-economy-stupid plain language:
Tonight, more Americans are out of work and more are working harder for less. More of you have lost your homes and even more are watching your home values plummet. More of you have cars you can’t afford to drive, credit-card bills you can’t afford to pay, and tuition that’s beyond your reach.
Obama discovered this idiom just in time for the financial chaos and the debates with McCain.
Obama’s successes and struggles in his first year bear striking resemblances to Clinton’s. Both men were elected with similar mandates—Clinton won 370 electoral votes, Obama 365—and majorities in both houses of Congress. Both opened their first years well by signing a few queued-up executive orders and bills—including the Family and Medical Leave Act, for Clinton, and the Lilly Ledbetter Fair Pay Act and the expansion of the Children’s Health Insurance Program, for Obama. And both made economic revival their first priority. Both men also entered office facing tooth-and-nail resistance from a right wing that had just lost the presidency. The right imagined Clinton, as it does Obama, to be far more radical than he really was, and it thus tried to delegitimize him. A short line connects the “Who shot Vince Foster?” conspiracy theories to those surrounding Obama’s citizenship.
Republicans also forced Clinton to pass his first economic plan without their support, much as they tried to scuttle Obama’s stimulus package. And despite losing the legislative battle, they succeeded in shaping public perception of these economic bills after their passage. Clinton’s 1993 budget—which not only set the government on course for a record surplus, but also cut taxes for millions while raising them on very few—was nonetheless portrayed, and viewed by most Americans, as a tax hike. In parallel fashion, economic evidence suggests that Obama’s spring stimulus bill has already done some appreciable good. But according to an August Gallup poll, Americans consider it too big and are uncertain about its benefits. And while Obama seems likely, as of this writing, to emerge from his first health-care fight with more to show for it than Clinton did from his, the final bill probably won’t be more than an incremental step or two forward—less like Medicare than like the 1996 Kennedy-Kassebaum Act, a now-forgotten consolation prize that Clinton garnered later in his presidency.
The reassertion of political limits and the deflation of campaign-season euphoria make it unlikely that Obama’s presidency will be “transformational” in the sense that he spoke of on the campaign trail—Lincolnian in its boldness, Rooseveltian in its activism, or Kennedyesque in its uplift. More likely, it will resemble Clinton’s presidency, with eight years of muddling through, frequent bouts of sharp partisan opposition, fluctuating poll ratings, and dashed hopes.
This should be no cause for distress. Obama could do worse than to emulate Clinton, who, at the end of the day, left the country better off than when he took office. Clinton’s record remains undervalued, partly because a misleading narrative took hold (that his impeachment cost him the chance to do more), and partly because many of his gains were achieved not through the big-ticket stand-alone legislation that journalists recite in their year-end summaries but through less visible allocations within the interstices of the federal budget. No single law or presidential order gave us the longest economic expansion in history, the lowest unemployment rates in three decades, or the declines in poverty, crime, and teen pregnancy. Nor does Clinton deserve sole credit for these feats. But all were accomplished during his eight years.
Twenty-five years ago, the political scientist Theodore Lowi published a book called The Personal President. It argued that the increasingly large responsibilities placed on the president since Franklin Roosevelt’s time—of regulation, social provision, and economic management, to say nothing of the leadership of the free world—have exploded into impossible expectations. Every postwar chief executive, Lowi noted—and the observation still holds—has begun his presidency with high approval ratings and left office with the public chastened of its early optimism, if not disillusioned altogether. (The president who has exited the White House with the highest approval ratings, post-FDR, is Clinton.)
It is easy to propose that we lower our expectations for our new presidents—even, or perhaps especially, for presidents who come bearing lofty promises of transformation. But we can’t correct the problem, Lowi’s diagnosis suggested, simply by resolving to demand less from our chief executives or by vowing to learn from the past. The problem is rooted in nothing less than the presidency’s assumption of immense powers, and of a central role in our imagination. Candidates have no better path to victory than by inspiring us with dreams of a new political era, and presidents have no choice but to attempt “too much.” In doing so, however, they can only disappoint us.
Posted on: Thursday, January 7, 2010 - 12:45
SOURCE: Tomdispatch.com (1-3-10)
[Nick Turse is the associate editor of TomDispatch.com and the winner of a 2009 Ridenhour Prize for Reportorial Distinction as well as a James Aronson Award for Social Justice Journalism. His work has appeared in the Los Angeles Times, the Nation, In These Times, and regularly at TomDispatch. Turse is currently a fellow at New York University's Center for the United States and the Cold War. He is the author of The Complex: How the Military Invades Our Everyday Lives (Metropolitan Books). His website is NickTurse.com.]
According to the Chinese calendar, 2010 is the Year of the Tiger. We don’t name our years, but if we did, this one might prospectively be called the Year of the Assassin.
We, of course, think of ourselves as something like the peaceable kingdom. After all, the shock of September 11, 2001 was that “war” came to “the homeland,” a mighty blow delivered against the very symbols of our economic, military, and -- had Flight 93 not gone down in a field in Pennsylvania -- political power.
Since that day, however, war has been a stranger in our land. With the rarest of exceptions, like Army psychiatrist Major Nidal Hasan’s massacre at Fort Hood, Texas, this country has remained a world without war or any kind of mobilization for war. No other major terrorist attacks, not even victory gardens, scrap-metal collecting, or rationing. And certainly no war tax to pay for our post-9/11 trillion-dollar “expeditionary forces” sent into battle abroad. Had we the foresight to name them, the last few years domestically might have reflected a different kind of carnage -- 2006, the Year of the Subprime Mortgage; 2007, the Year of the Bonus; 2008, the Year of the Meltdown; 2009, the Year of the Bailout. And perhaps some would want to label 2010, prematurely or not, the Year of Recovery.
Although our country delivers war regularly to distant lands in the name of our “safety,” we don’t really consider ourselves at war (despite the endless talk of “supporting our troops”), and the money that has simply poured into Pentagon coffers, and then into weaponry and conflicts is, with rare exceptions, never linked to economic distress in this country. And yet, if we are no nation of warriors, from the point of view of the rest of the world we are certainly the planet’s foremost war-makers. If money talks, then war may be what we care most about as a society and fund above all else, with the least possible discussion or debate.
In fact, according to military expert William Hartung, the Pentagon budget has risen in every year of the new century, an unprecedented run in our history. We dominate the global arms trade, monopolizing almost 70% of the arms business in 2008, with Italy coming in a vanishingly distant second. We put more money into the funding of war, our armed forces, and the weaponry of war than the next 25 countries combined (and that’s without even including Iraq and Afghan war costs). We garrison the planet in a way no empire or nation in history has ever done. And we plan for the future, for “the next war” -- on the ground, on the seas, and in space -- in a way that is surely unique. If our two major wars of the twenty-first century in Iraq and Afghanistan are any measure, we also get less bang for our buck than any nation in recent history.
So, let’s pause a moment as the New Year begins and take stock of ourselves as what we truly are: the preeminent war-making machine on planet Earth. Let’s peer into the future, and consider just what the American way of war might have in store for us in 2010. Here are 10 questions, the answers to which might offer reasonable hints as to just how much U.S. war efforts are likely to intensify in the Greater Middle East, as well as Central and South Asia, in the year to come.
1. How busted will the largest defense budget in history be in 2010?
Strange, isn’t it, that the debate about hundreds of billions of dollars in health-care costs in Congress can last almost a year, filled with turmoil and daily headlines, while a $636 billion defense budget can pass in a few days, as it did in late December, essentially without discussion and with nary a headline in sight? And in case you think that $636 billion is an honest figure, think again -- and not just because funding for the U.S. nuclear arsenal and actual “homeland defense,” among other things most countries would chalk up as military costs, wasn’t included.
If you want to put a finger to the winds of war in 2010, keep your eye on something else not included in that budget: the Obama administration’s upcoming supplemental funding request for the Afghan surge. In his West Point speech announcing his surge decision, the president spoke of sending 30,000 new troops to Afghanistan in 2010 at a cost of $30 billion. In news reports, that figure quickly morphed into “$30-$40 billion,” none of it in the just-passed Pentagon budget. To fund his widening war, sometime in the first months of the New Year, the president will have to submit a supplemental budget to Congress -- something the Bush administration did repeatedly to pay for George W.’s wars, and something this president, while still a candidate, swore he wouldn’t do. Nonetheless, it will happen. So keep your eye on that $30 billion figure. Even that distinctly low-ball number is going to cause discomfort and opposition in the president’s party -- and yet there’s no way it will fully fund this year’s striking escalation of the war. The question is: How high will it go or, if the president doesn’t dare ask this Congress for more all at once, how will the extra funds be found? Keep your eye out, then, for hints of future supplemental budgets, because fighting the Afghan War (forget Iraq) over the next decade could prove a near trillion-dollar prospect.
Neither battles won nor al-Qaeda and Taliban commanders killed will be the true measure of victory or defeat in the Afghan War. For Americans at home, even victory as modestly defined by this administration -- blunting the Taliban’s version of a surge -- could prove disastrous in terms of our financial capabilities. Guns and butter? That’s going to be a surefire no-go. So keep watching and asking: How busted could the U.S. be by 2011?
2. Will the U.S. Air Force be the final piece in the Afghan surge?
As 2010 begins, almost everything is in surge mode in Afghanistan, including rising numbers of U.S. troops, private contractors, State Department employees, and new bases. In this period, only the U.S. Air Force (drones excepted) has stood down. Under orders from Afghan War commander General Stanley McChrystal, based on the new make-nice counterinsurgency strategy he’s implementing, air power is anything but surging. The use of the Air Force, even in close support of U.S. troops in situations in which Afghan civilians are anywhere nearby, has been severely restricted. There has already been grumbling about this in and around the military. If things don’t go well -- and quickly -- in the expanding war, expect frustration to grow and the pressure to rise to bring air power to bear. Already unnamed intelligence officials are leaking warnings that, with the Taliban insurgency expanding its reach, “time is running out.” Counterinsurgency strategies are notorious for how long they take to bear fruit (if they do at all). When Americans are dying, maintaining a surge without a surge of air power is sure to be a test of will and patience (neither of which is an American strong suit). So keep your eye on the Air Force next year. If the planes start to fly more regularly and destructively, you’ll know that things aren’t looking up for General McChrystal and his campaign.
3. How big will the American presence in Pakistan be as 2010 ends?
Let’s start with the fact that it’s already bigger than most of us imagine. Thanks to Nation magazine reporter Jeremy Scahill, we know that, from a base in Pakistan’s largest city, Karachi, officers of the U.S. Joint Special Operations Command, with the help of hired hands from the notorious private security contractor Xe (formerly Blackwater), “plan targeted assassinations of suspected Taliban and Al Qaeda operatives, ‘snatch and grabs’ of high-value targets and other sensitive action inside and outside Pakistan.” Small numbers of U.S. Special Forces operatives have also reportedly been sent in to train Pakistan’s special forces. U.S. spies are in the country. U.S. missile- and bomb-armed drones, both CIA- and Air Force-controlled, have been conducting escalating operations in the country’s tribal borderlands. U.S. Special Operations forces have conducted at least four cross-border raids into Pakistan’s tribal borderlands unsanctioned by the Pakistani government or military (only one of which was publicly reported in this country). And the CIA and the State Department have been attempting (against some Pakistani resistance) to build up their personnel and facilities in-country. This, mind you, is only what we know in a situation in which secrecy is the order of the day and rumors fly.
In the meantime, the Obama administration has been threatening to widen its drone war (and possibly other operations) to the powder-keg province of Baluchistan, where most of the Afghan Taliban’s leadership reportedly resides (evidently under Pakistani protection) and to the fighters of the Haqqani network, linked to both the Taliban and al-Qaeda, in the Pakistani border province of North Waziristan. Right now, these threats from Washington are clearly meant to motivate the Pakistani military to do the job instead. But as that is unlikely -- both groups are seen by Pakistan’s military as key players in the country’s future anti-Indian policies in Afghanistan -- they may not remain mere threats for long. Any such U.S. moves are only likely to widen the Af-Pak war and further destabilize nuclear-armed Pakistan. In addition, the Pakistani military is not powerless vis-à-vis the U.S. For one thing, as Robert Dreyfuss of the Nation’s “Dreyfuss Report” recently pointed out, it has a potential stranglehold on the tortuous U.S. supply lines into Afghanistan, already under attack by Taliban militants, that make the war there possible.
Pakistan is the Catch-22 of Obama’s surge. As in the Vietnam War years, sanctuaries across the border ensure limited success in any escalating war effort, but going after those sanctuaries in a major way would be a war-widening move of genuine desperation. As with the Air Force in Afghanistan, watch Pakistan not just for spreading drone operations, but for the use of U.S. troops. If by year’s end Special Operations forces or U.S. troops are periodically on the ground in that country, don’t be shocked. However it may be explained, this will represent a dangerous failure of the first order.
4. How much smaller will the American presence in Iraq be?
Barack Obama swept into office, in part, on a pledge to end the U.S. war in Iraq. Almost a year after he entered the White House, more than 100,000 U.S. troops are still deployed in that country (about the same number as in February 2004). Still, plans developed at the end of the Bush presidency, and later confirmed by President Obama, have set the U.S. on an apparent path of withdrawal. On this the president has been unambiguous. “Let me say this as plainly as I can,” he told a military audience in February 2009. “By August 31, 2010, our combat mission in Iraq will end... I intend to remove all U.S. troops from Iraq by the end of 2011.” However, Robert Gates, his secretary of defense, has not been so unequivocal. While recently visiting Iraq, he disclosed that the U.S. Air Force would likely continue to operate in that country well into the future. He also said: “I wouldn't be a bit surprised to see agreements between ourselves and the Iraqis that continues a train, equip, and advise role beyond the end of 2011.”
For 2010, expect platitudes about withdrawal from the President and other administration spokespeople, while Defense Department officials and military commanders offer more “pragmatic” (and realistic) assessments. Keep an eye out for signs this year of a coming non-withdrawal withdrawal in 2011.
5. What will the New Year mean for the Pentagon's base-building plans in our war zones?
As the U.S. war in Afghanistan ramps up, look for American bases there to continue along last year’s path, becoming bigger, harder, more numerous, and more permanent-looking. As estimates of the time it will take to get the president’s extra boots on the ground in Afghanistan increase, look as well for the construction of more helipads, fuel pits, taxiways, and tarmac space on the forward operating bases sprouting especially across the southern parts of that country. These will be meant to speed the movement of surge troops into rural battle zones, while eschewing increasingly dangerous ground routes.
In Iraq, expect the further consolidation of a small number of U.S. mega-bases as American troops pull back to ever fewer sites offering an ever lower profile in that country. Keep your eyes, in particular, on giant Balad Air Base and on Camp Victory outside Baghdad. These were built for the long term. If Washington doesn’t begin preparing to turn them over to the Iraqis, then start thinking 2012 and beyond. Elsewhere in the Persian Gulf region, look for the U.S. military to continue upgrading its many bases, while militarily working to strengthen the security forces of country after autocratic country, from Saudi Arabia to Qatar, in part to continue to rattle Iran’s cage. If those bases keep growing, don’t imagine us drawing down in the region any time soon.
6. Will the U.S. and Israel thwart the Iranian insurgency?
Iran has long been under siege. A founding member of George W. Bush’s “Axis of Evil,” the Islamic Republic was long on his administration’s hit list. It also found itself in the unenviable position of watching the American military occupy and garrison two bordering countries, Iraq and Afghanistan, while also building or bolstering bases in nearby Qatar, Bahrain, Kuwait, Oman, and the United Arab Emirates. The Obama administration is now poised to increase key military aid to Iran’s nemesis, Israel, and the Pentagon has flooded allied regimes in the region with advanced weaponry. Years of saber-rattling and sanctions, encirclement and threats nonetheless seemed to have little palpable effect. In 2009, however, a disputed election brought Iranians into the streets and, months later, they’re still there.
What foreign militarism couldn’t do, ordinary Iranians themselves now threaten to accomplish. In earlier street protests, young middle-class activists in Tehran chanting “Where is our vote?” were beaten and martyred by security forces. Today, the protests continue and oppositional Iranians from all social strata are refusing to retreat while, when provoked, sometimes fighting back against the police or the regime’s fearsome Basiji militia, even inducing some of them to step aside or switch sides.
A continuing cycle of ever-spreading arrests, protests, and violence in 2010 threatens to further destabilize the regime. How Washington reacts could, however, deeply affect what happens. The memory of the CIA’s toppling of Iranian Prime Minister Mohammed Mossadegh in 1953 is still alive in Iran. Any perceived U.S. interference could have grave results for the Iranian insurgency, as could Israeli actions. Recently, President Obama, evidently trying to bring the Chinese into line on the question of imposing fiercer sanctions, reportedly told China’s president that the United States could not restrain Israel from attacking Iran’s nuclear facilities much longer. Such an Israeli attack would certainly strengthen the current Iranian regime; so, undoubtedly, would pressure to increase potentially crippling sanctions on that country over its nuclear program. Either or both would help further cement the current tumultuous status quo in the Middle East.
7. Will Yemen become the fourth major front in Washington’s global war?
George W. Bush unabashedly proclaimed himself a “war president.” President Obama seems to be taking up the same mantle. Right now, the Obama administration’s war fronts include the inherited wars in Iraq and Afghanistan, a not-so-covert war in Pakistan, and a potential new war in Yemen. (There are also rarely commented upon ongoing military actions in the Philippines and a U.S.-aided drug war in Colombia, as well as periodic strikes in Somalia.) Though the surge in Afghanistan and Pakistan was supposed to contain al-Qaeda there, the U.S. now finds itself focusing on yet another country and another of that organization’s morphing offspring.
In 2002, a USA Today article about a targeted assassination in Yemen began: “Opening up a visible new front in the war on terror, U.S. forces launched a pinpoint missile strike in Yemen...” Just over seven years later, following multiple U.S. cruise missiles launched into the country and targeted air strikes by the air force of the U.S.-aided Yemeni regime against “suspected hide-outs of Al Qaeda,” the New York Times announced, “In the midst of two unfinished major wars, the United States has quietly opened a third, largely covert front against Al Qaeda in Yemen.” In the wake of a botched airplane terror attack by a single young Nigerian Muslim, and credit-taking by a group calling itself al-Qaeda in the Arabian Peninsula, the usual cheery crew of U.S. war advocates are lining up behind the next potential front in the war on terror. (Senator Joseph Lieberman: "Iraq was yesterday's war. Afghanistan is today's war. If we don't act preemptively, Yemen will be tomorrow's war.") What began as a one-off Bush assassination effort now threatens to become another of Obama’s wars.
The U.S. has not only sent Special Forces teams into the country, but is now pouring tens of millions of dollars into Yemen’s security forces in a dramatic move to significantly arm yet another Middle Eastern country. At the same time, U.S.-backed Saudi Arabia -- whose alliance with Washington ignited the current war with al-Qaeda -- is aiding the Yemeni forces in a war against Houthi rebels there.
This is a witch’s brew of trouble. Keep your eye on Yemen (with an occasional side glance at Somalia, the failed state across the Gulf of Aden). Expect more funding, more trainers, more proxy warfare, and possibly a whole new conflict for 2010.
8. How brutal will the American way of war be in 2010?
When it comes to war, American-style, the key word of 2009 was “counterinsurgency” or COIN. Think of it as the kindly version of war the American way, a strategy based on “clearing and holding” territory and “protecting” the civilian population. Its value, as expounded by Afghan War commander McChrystal, lies not in killing the enemy but in winning over “the people.” On paper, it sounds good, like a kinder, gentler version of war, but historically counterinsurgency operations have almost invariably gone into the ditch of brutality. So here’s one word you should keep your eyes out for in 2010: “counterterrorism.” Consider it the dark underside of counterinsurgency. Instead of boots on the ground, it’s bullets to the head.
General McChrystal was, until recently, a counterterrorism guy. He ran the Joint Special Operations Command (JSOC) in Iraq and Afghanistan. His operatives were referred to, more or less politely, as “manhunters.” Think: assassins. With McChrystal, a general who credits his large-scale assassination program for a great deal of the Iraq surge’s success in 2007, it was just a matter of time before counterterrorism -- which is just terrorism put in uniform and given an anodyne name -- was ramped up in Afghanistan (and undoubtedly Pakistan as well). Though the planes may still be grounded, the special ops guys who kick in doors in the middle of the night and have often been responsible for grievous civilian casualties will evidently be going at it full tilt.
As 2009 ended, the news that black-ops forces were being loosed in a significant way was just hitting the press. So watch for that word “counterterrorism.” If it proliferates, you’ll know that the expanding Afghan War is getting down and dirty in a big way. For Americans, 2010 could be the year of the assassin.
9. Where will the drones go in 2010?
If there’s one thing to keep your eye on in the coming year, it might be the unmanned aerial vehicles -- drones -- flown secretly, in the case of the Air Force, from distant al-Udeid Air Base in Qatar and, in the case of the CIA, even more distantly out of Langley, Virginia. American drones are already in a widening air war in the Pakistani tribal borderlands, while Washington threatens to create an even wider one. Think of these robotic planes as the leading edge of global war, American-style. While “hot pursuit” into Pakistan may still be forbidden to U.S. troops in Afghanistan, the drones have long had a kind of hot-pursuit carte blanche in Pakistan’s tribal borderlands.
Perhaps more important, they can, to steal a Star Trek line, boldly go where no man has gone before. Since the first drone assassination attack of the Global War on Terror -- in Yemen in 2002 -- in which several men, reputedly al-Qaeda militants, were incinerated inside a car, drones have been taking war into new territory. They have already struck in Iraq, Pakistan, Afghanistan, and possibly Somalia. As the first robot terminators of our age, they symbolize the loosing of American war-making powers from the oversight of Congress and the American people. In principle, they have made borders (hence national sovereignty) increasingly insignificant as assassination attacks can be launched 24/7 against those we deem our enemies, on the basis of unknown intelligence or evidence.
With our drones, there is little price to be paid if, as has regularly enough been the case, those enemies turn out not to be in the right place at the right time and others die in their stead. Globally, we have become the world’s leading state assassins -- a judge, jury, and executioner beyond the bounds of all accountability. In essence, those pilot-less planes turn us into a law of war unto ourselves. It’s a chilling development. Watch for it to spread in 2010, and keep an eye out for which countries, fielding their own drones, follow down the path we’re pioneering, for in our age all war-making developments invariably proliferate -- and fast.
The Element of Surprise
We know one thing: 2010 will be another year of war for the United States and, from assassination campaigns to new fronts in what is no longer called the Global War on Terror but is no less global or based on terror, it could get a lot uglier. The Obama administration may, from time to time, talk withdrawal, but across the Middle East and Central Asia, the Pentagon and its contractors are digging in. In the meantime, more money, not less, is being put into preparations and planning for future wars. As William Hartung points out, “if the government’s current plans are carried out, there will be yearly increases in military spending for at least another decade.”
When it comes to war, the only questions are: How wide? How much? Not: How long? Washington’s answer to that question has already been given, not in public pronouncements, but in that Pentagon budget and the planning that goes with it: forever and a day.
Of course, only diamonds are forever. Sooner or later, like great imperial powers of the past, we, too, will find that the stress of fighting a continuous string of wars in distant lands in inhospitable climes tells on us. Whether we “win” or not in Iraq, Afghanistan, Pakistan, and now Yemen, we lose.
Which brings us to our last question:
10. What will surprise us in 2010?
It would be the height of hubris to imagine that we can truly see into the future, especially when it comes to war. It is, in fact, Washington’s hubris to believe itself in control of its own war-making destiny, whether via shock-and-awe tactics that are certain to work, a netcentric military-lite that can’t fail, or most recently, a force dedicated to a “hearts and minds” counterinsurgency war in Afghanistan and, in the future, globally (under the ominous new acronym GCOIN).
The essence of war is surprise. So, despite all those billions of dollars and the high-tech weaponry, and the nine areas discussed above, keep your eyes open for the unexpected and confounding, and in the meantime, welcome to the grim spectacle of war American-style as the second decade of the twenty-first century begins in turmoil.
Posted on: Tuesday, January 5, 2010 - 09:48
SOURCE: WSJ (1-3-10)
On Dec. 5, 1933, Americans liberated themselves from a legal nightmare called Prohibition by repealing the 18th Amendment to the Constitution. Today most people think Prohibition was fueled by puritanical Protestants who believed drinking alcohol was a sin. But the vocal minority who made Prohibition law believed they were marching in the footsteps of the abolitionists who sponsored a civil war to end another moral evil—slavery…
On Dec. 22, 1917, Congress passed the 18th Amendment, turning the whole nation dry—if and when three-fourths of the states ratified it...
The ratification process moved slowly at first. By the fall of 1918, only 14 states had approved the 18th Amendment. To speed things up, the drys in Congress tacked a rider on a vital agricultural appropriation bill, establishing national Prohibition as of July 1, 1919.
In the White House, President Wilson's Irish-American adviser, Joseph Tumulty, urged Wilson to veto the bill. Tumulty warned it would alienate millions of ethnic Democrats in the big cities in the upcoming midterm elections. Tumulty called the Dry rider "mob legislation pure and simple." But Wilson conferred with other members of his cabinet, who recommended signing it. Wilson had been re-elected in 1916 by a heavy percentage of dry states. The president signed the bill and, as Tumulty predicted, outraged Irish and German Americans voted Republican and the Democrats lost Congress…
For the next 13 years, Prohibition corrupted and tormented Americans from coast to coast...
In 1933, a new president, Franklin D. Roosevelt, made the repeal of the 18th Amendment one of his priorities. But the evil effects of this plunge into national redemption linger to this day, most notably in the influence of organized crime, better known as the Mafia, in many areas of American life.
In 2010, with talk of restructuring large swaths of our economy back in vogue, Prohibition should also remind us that Congress, scientists and economists seized by the noble desire to achieve some great moral goal may be abysmally wrong.
Posted on: Monday, January 4, 2010 - 17:51
SOURCE: Foreign Policy (1-4-10)
Neither a cold-blooded realist nor a bleeding-heart idealist, Barack Obama has a split personality when it comes to foreign policy. So do most U.S. presidents, of course, and the ideas that inspire this one have a long history at the core of the American political tradition. In the past, such ideas have served the country well. But the conflicting impulses influencing how this young leader thinks about the world threaten to tear his presidency apart -- and, in the worst scenario, turn him into a new Jimmy Carter.
Obama's long deliberation over the war in Afghanistan is a case study in presidential schizophrenia: After 94 days of internal discussion and debate, he ended up splitting the difference -- rushing in more troops as his generals wanted, while calling for their departure to begin in July 2011 as his liberal base demanded. It was a sober compromise that suggests a man struggling to reconcile his worldview with the weight of inherited problems. Like many of his predecessors, Obama is not only buffeted by strong political headwinds, but also pulled in opposing directions by two of the major schools of thought that have guided American foreign-policy debates since colonial times.
In general, U.S. presidents see the world through the eyes of four giants: Alexander Hamilton, Woodrow Wilson, Thomas Jefferson, and Andrew Jackson. Hamiltonians share the first Treasury secretary's belief that a strong national government and a strong military should pursue a realist global policy and that the government can and should promote economic development and the interests of American business at home and abroad. Wilsonians agree with Hamiltonians on the need for a global foreign policy, but see the promotion of democracy and human rights as the core elements of American grand strategy. Jeffersonians dissent from this globalist consensus; they want the United States to minimize its commitments and, as much as possible, dismantle the national-security state. Jacksonians are today's Fox News watchers. They are populists suspicious of Hamiltonian business links, Wilsonian do-gooding, and Jeffersonian weakness.
Moderate Republicans tend to be Hamiltonians. Move right toward the Sarah Palin range of the party and the Jacksonian influence grows. Centrist Democrats tend to be interventionist-minded Wilsonians, while on the left and the dovish side they are increasingly Jeffersonian, more interested in improving American democracy at home than exporting it abroad.
Some presidents build coalitions; others stay close to one favorite school. As the Cold War ended, George H.W. Bush's administration steered a largely Hamiltonian course, and many of those Hamiltonians later dissented from his son's war in Iraq. Bill Clinton's administration in the 1990s mixed Hamiltonian and Wilsonian tendencies. This dichotomy resulted in bitter administration infighting when those ideologies came into conflict -- over humanitarian interventions in the Balkans and Rwanda, for example, and again over the relative weight to be given to human rights and trade in U.S. relations with China.
More recently, George W. Bush's presidency was defined by an effort to bring Jacksonians and Wilsonians into a coalition; the political failure of Bush's ambitious approach created the context that made the Obama presidency possible.
Sept. 11, 2001, was one of those rare and electrifying moments that waken Jacksonian America and focus its attention on the international arena. The U.S. homeland was not only under attack, it was under attack by an international conspiracy of terrorists who engaged in what Jacksonians consider dishonorable warfare: targeting civilians. Jacksonian attitudes toward war were shaped by generations of conflict with Native American peoples across the United States and before that by centuries of border conflict in England, Scotland, and Ireland. Against "honorable" enemies who observe the laws of war, one is obliged to fight fair; those who disregard the rules must be hunted down and killed, regardless of technical niceties.
When the United States is attacked, Jacksonians demand action; they leave strategy to the national leadership. But Bush's tough-minded Jacksonian response to 9/11 -- invading Afghanistan and toppling the Taliban government that gave safe haven to the plotters -- gave way to what appeared to be Wilsonian meddling in Iraq. Originally, Bush's argument for overthrowing Saddam Hussein rested on two charges that resonated powerfully with Jacksonians: Hussein was building weapons of mass destruction, and he had close links with al Qaeda. But the war dragged on, and as Hussein's fabled hoards of WMD failed to appear and the links between Iraq and al Qaeda failed to emerge, Bush shifted to a Wilsonian rationale. This was no longer a war of defense against a pending threat or a war of retaliation; it was a war to establish democracy, first in Iraq and then throughout the region. Nation-building and democracy-spreading became the cornerstones of the administration's Middle East policy.
Bush could not have developed a strategy better calculated to dissolve his political support at home. Jacksonians historically have little sympathy for expensive and risky democracy-promoting ventures abroad. They generally opposed the humanitarian interventions in Somalia, Bosnia, and Haiti during the Clinton years; they did not and do not think American young people should die and American treasure should be scattered to spread democracy or protect human rights overseas. Paradoxically, Jacksonians also opposed "cut and run" options to end the war in Iraq even as they lost faith in both Bush and the Republican Party; they don't like wars for democracy, but they also don't want to see the United States lose once troops and the national honor have been committed. In Bush's last year in office, a standoff ensued: The Democratic congressional majorities were powerless to force change in his Iraq strategy and Bush remained free to increase U.S. troop levels, yet the war itself and Bush's rationale for it remained deeply unpopular.
Enter Obama. An early and consistent opponent of the Iraq war, Obama was able to bring together the elements of the Democratic Party's foreign-policy base who were most profoundly opposed to (and horrified by) Bush's policy. Obama made opposition to the Iraq war a centerpiece of his eloquent campaign, drawing on arguments that echoed U.S. anti-war movements all the way back to Henry David Thoreau's opposition to the Mexican-American War.
Like Carter in the 1970s, Obama comes from the old-fashioned Jeffersonian wing of the Democratic Party, and the strategic goal of his foreign policy is to reduce America's costs and risks overseas by limiting U.S. commitments wherever possible. He's a believer in the notion that the United States can best spread democracy and support peace by becoming an example of democracy at home and moderation abroad. More than this, Jeffersonians such as Obama think oversize commitments abroad undermine American democracy at home. Large military budgets divert resources from pressing domestic needs; close association with corrupt and tyrannical foreign regimes involves the United States in dirty and cynical alliances; the swelling national-security state threatens civil liberties and leads to powerful pro-war, pro-engagement lobbies among corporations nourished on grossly swollen federal defense budgets.
While Bush argued that the only possible response to the 9/11 attacks was to deepen America's military and political commitments in the Middle East, Obama initially sought to enhance America's security by reducing those commitments and toning down aspects of U.S. Middle East policy, such as support for Israel, that foment hostility and suspicion in the region. He seeks to pull U.S. power back from the borderlands of Russia, reducing the risk of conflict with Moscow. In Latin America, he has so far behaved with scrupulous caution and, clearly, is hoping to normalize relations with Cuba while avoiding collisions with the "Bolivarian" states of Venezuela, Ecuador, and Bolivia.
Obama seeks a quiet world in order to focus his efforts on domestic reform -- and to create conditions that would allow him to dismantle some of the national-security state inherited from the Cold War and given new life and vigor after 9/11. Preferring disarmament agreements to military buildups and hoping to substitute regional balance-of-power arrangements for massive unilateral U.S. force commitments all over the globe, the president wishes ultimately for an orderly world in which burdens are shared and the military power of the United States is a less prominent feature on the international scene.
While Wilsonians believe that no lasting stability is possible in a world filled with dictatorships, Jeffersonians like Obama argue that even bad regimes can be orderly international citizens if the incentives are properly aligned. Syria and Iran don't need to become democratic states for the United States to reach long-term, mutually beneficial arrangements with them. And it is North Korea's policies, not the character of its regime, that pose a threat to the Pacific region.
At this strategic level, Obama's foreign policy looks a little bit like that of Richard Nixon and Henry Kissinger. In Afghanistan and Iraq, he hopes to extract U.S. forces from costly wars by the contemporary equivalent of the "Vietnamization" policy of the Nixon years. He looks to achieve an opening with Iran comparable to Nixon's rapprochement with communist China. Just as Nixon established a constructive relationship with China despite the radical "Red Guard" domestic policies Chinese leader Mao Zedong was pursuing at the time, Obama does not see ideological conflict as necessarily leading to poor strategic relations between the United States and the Islamic Republic. Just as Nixon and Kissinger sought to divert international attention from their retreat in Indochina by razzle-dazzle global diplomacy that placed Washington at the center of world politics even as it reduced its force posture, so too the Obama administration hopes to use the president's global popularity to cover a strategic withdrawal from the exposed position in the Middle East that it inherited from the Bush administration.
This is both an ambitious and an attractive vision. Success would reduce the level of international tension even as the United States scales back its commitments. The United States would remain, by far, the dominant military power in the world, but it would sustain this role with significantly fewer demands on its resources and less danger of war.
Yet as Obama is already discovering, any president attempting such a Jeffersonian grand strategy in the 21st century faces many challenges...
Posted on: Monday, January 4, 2010 - 11:36
SOURCE: CNN (1-3-10)
Almost as soon as the botched Christmas airplane bombing hit the airwaves, the politics of national security reared its head…
While politicians play the blame game when things go wrong -- looking for the individual at fault -- what is more important is to look at public policies and government institutions to understand how and why the system failed. In this case, we are talking about the policies of multiple countries…
The Christmas incident made clear several problems. The first is that we are still having trouble connecting the dots. This was one of the main problems revealed by the examination of 9/11; authorities had a substantial amount of information about the perpetrators but failed to share it with each other or to put the story together…
A second question has to do with airline security. Once the decision was made to allow him to fly, it is simply confounding that he was able to bring high explosives on board with relative ease. As UCLA professor Amy Zegart, the author of the best book on intelligence reform, recently told the New York Times: "This is textbook Al Qaeda 2001. They tried to hit the hardest target we have, the one on which the most money and attention has been spent since 2001. And yet we didn't prevent it."
The government must review our airline security program. In Politico, Josh Gerstein provided a useful analysis of issues that must receive more consideration now. For example, more thought will have to be given to assembling a more expansive "no-fly" list. Governments will need to think about additional funding for whole-body imaging technology in airports, whose installation has been slowed by privacy concerns and budgetary limitations…
There has been an ongoing debate about expanding the number of air marshals on airplanes and training other personnel so that they can better handle these kinds of situations. The Israeli airline El Al has famously depended on undercover air marshals on its flights as one component of its successful deterrent strategy. Doing this on every flight would of course be prohibitively expensive, but it is worth considering whether we are providing flights with enough marshals as a last line of defense.
These are just some of the questions to come out of this incident that must be examined…
Posted on: Sunday, January 3, 2010 - 19:21
SOURCE: Daily Mail (UK) (12-10-09)
No matter whether the measure is fair, enforceable or even legal, action to punish the greed of the City's fattest felines is assured of popular support.
The Chancellor's initiative is ruthlessly political. It represents his blunderbuss answer to an issue which is causing sleepless nights for finance ministers throughout the Western world, and to which there are no easy answers.
The international banking community has shown itself to possess a crocodile appetite, jungle morality and skin thickness to make rhinos envious.
Successful bankers have always been well paid, but only in the past 20 years have they institutionalised the bonus culture, whereby successful dealers and traders qualify for annual awards which can amount to £10 million or even £20 million, and average £500,000.
Many of the recipients possess limited talents - if they were cleverer, the financial crisis would not have happened.
Today, of course, even such vast and supposedly highly profitable businesses as Goldman Sachs would be ruined, but for taxpayer bail-outs of the financial system.
Hence the public outrage on both sides of the Atlantic about these proven losers continuing to award themselves telephone-number pay.
Since the system collapsed about their ears, the bankers have displayed an absence of shame which would command admiration among the accused at the Hague war crimes court.
They shrug: 'We are worth it. Our businesses cannot be allowed to fail. We have mansions and yachts to support. What's your problem?'
No employee, far less boss, would get onto the payrolls of such institutions as Goldman Sachs or J.P. Morgan if they subscribed to conventional standards of human decency.
There they are today, still rejecting voluntary, self-imposed curbs on remuneration, while maintaining a ruthless squeeze on loans to businesses in the real economy.
They brush aside assertions of the simple truth that they would have no profits from which to award bonuses, but for taxpayer-funded cheap money and systemic guarantees.
Stalin would have shot them. Lacking that option, most national leaderships are baffled.
The Obama administration, together with the German and French governments, has implicitly concluded that there is little it can do, though Berlin has capped the pay of bankers who received direct taxpayer support.
Most Western democracies have rejected more drastic measures - first, on the grounds that the financial institutions represent vital national interests which must be supported; and second, because punitive fiscal measures against one class of rich citizens are judged discriminatory and unconstitutional.
Alistair Darling, however, has chosen to adopt a ruthlessly political attitude, which Labour is confident fits the public mood.
Bankers have behaved antisocially and their threats to quit Britain should not be taken too seriously.
How great is international demand for the services of men and women who have presided over the worst financial catastrophe of modern times?
If nothing is done to curb bank behaviour, a year or two from now Britain will be a divided society.
So far, pain from our ghastly financial position has barely begun to be felt among the public. The Government is arranging matters so that it will not work through until after next year's general election.
Thereafter, almost every family in the land will suffer its share of grief with more job losses, higher taxes, further businesses going bust and public sector employment slashed.
It will be deeply damaging if the country finds itself divided between a real world in which hardship prevails, and the banking community still paying itself rewards so large that ordinary taxation scarcely affects lifestyles. Some attempt had to be made, however clumsy, to call time on the bankers.
Yet it would be foolish not to recognise the dangers inherent in Darling's measure. For 40 years after World War II, Britain was cursed by the 'politics of envy'.
Prohibitively high tax rates prevailed, choking enterprise. The socialist ethic held that if everybody could not become rich, nobody should.
Margaret Thatcher broke that culture in the 1980s, reducing top tax rates progressively from 83 per cent to 40.
Few pundits doubt that her boldness and imagination in raising incentives for success played an important part in rescuing the British economy from its slough of despond.
The fear now must be that cutting the bankers down to size will revive the broader Old Labour tradition of taxing the rich 'until the pips squeak', which would be a long-term disaster.
The Tories must thank their stars they are not in office yet. They support the charge against the bankers, but recognise the danger that, if a government goes too far down this road, the ghastly old 'equality of misery' doctrine which ruined Britain for two generations will once more take hold. It is a difficult balance to strike.
I think Darling is right to have done as he did yesterday, imposing a levy designed to punish the bankers' hubris, and fire a warning shot about future behaviour.
Given the cupidity and incompetence of thousands of City and Wall Street moguls which created the financial crisis, it is remarkable that only former Royal Bank of Scotland boss Sir Fred Goodwin's house has had its windows broken.
But Darling's strike against the bankers must be clearly perceived as a symbolic one-off. It should not represent the onset of a new socialist purge of 'the rich'.
It is foolish for the Government - for instance - to threaten to cap the salaries of top-earning civil servants, modest enough by private sector standards. We desperately need good people to run big councils and Whitehall departments. The gold-plated pensions of state employees are a much bigger scandal than their pay.
The bankers have brought public opprobrium and its consequences on themselves. But once this round of blood-letting is over, we should cherish at least a slim hope that they learn the lesson, so that they and the Chancellor can call a truce.
It is in nobody's interests, least of all those of the British economy, for the bankers' well-deserved six of the best to become a sustained caning for the nation's executives and wealth-generators, who have done nothing to justify being crippled by socialist crossfire.
Posted on: Saturday, January 2, 2010 - 22:21