Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Telegraph (UK) (2-25-07)
The best explanation is in fact the simplest. Being hated is what happens to dominant empires. It comes - sometimes literally - with the territory. George Orwell knew the feeling. As a young man he served as an assistant police superintendent in British-run Burma, an experience he memorably described in his essay "Shooting an Elephant". Called upon to kill a rogue pachyderm that had run amok, Orwell was suddenly aware "of the watchful yellow faces behind" him:
"The sole thought in my mind was that if anything went wrong those two thousand Burmans would see me pursued, caught, trampled on and reduced to a grinning corpse like that Indian up the hill. And if that happened it was quite probable that some of them would laugh."
Eric Blair, as Orwell was known then, could scarcely have been better prepared for his role as a colonial official. Born in Bengal, the son of a colonial civil servant, he had been educated at Eton, where boys learn not to worry much about being hated. Yet even he found the resentment of the natives hard to bear: "In the end the sneering... faces of young men that met me everywhere, the insults hooted after me when I was at a safe distance, got badly on my nerves ... [It] was perplexing and upsetting."
That's a feeling American soldiers in Baghdad must know pretty well. How does that old Randy Newman song go? "No one likes us - I don't know why. / We may not be perfect, but heaven knows we try."
But who hates Americans the most? You might assume that it's people in countries that the United States has recently attacked or threatened to attack. Americans themselves are clear about who their principal enemies are. Asked by Gallup to name the "greatest enemy" of the United States today, 26 per cent of those polled named Iran, 21 per cent named Iraq and 18 per cent named North Korea. Incidentally, that represents quite a success for George W. Bush's concept of the "Axis of Evil". Six years ago, only 8 per cent named Iran and only 2 per cent North Korea.
Are those feelings of antagonism reciprocated? Up to a point. According to a poll by Gallup's Centre for Muslim Studies, 52 per cent of Iranians have an unfavourable view of the United States. But that figure is down from 63 per cent in 2001. And it's significantly lower than the degree of antipathy towards the United States felt in Jordan, Pakistan and Saudi Arabia. Two thirds of Jordanians and Pakistanis have a negative view of the United States, as do a staggering 79 per cent of Saudis. Sentiment has also turned hostile in Lebanon, where 59 per cent of people now have an unfavourable opinion of the United States, compared with just 41 per cent a year ago. No fewer than 84 per cent of Lebanese Shiites say they have a very unfavourable view of Uncle Sam.
These figures suggest a paradox in the Muslim world. It's not America's enemies who hate the United States most, it's people in countries that are supposed to be America's friends, if not allies.
The paradox doesn't end there. The Gallup poll (which surveyed 10,000 Muslims in 10 different countries) also revealed that the wealthier and better-educated Muslims are, the more likely they are to be politically radical. So if you ever believed that anti-Western sentiment was an expression of poverty and deprivation, think again. Even more perplexingly, Islamists are more supportive of democracy than Muslim moderates. Those who imagined that the Middle East could be stabilised with a mixture of economic and political reform could not have been more wrong. The richer these people get, the more they favour radical Islamism. And they see democracy as a way of putting the radicals into power....
Posted on: Wednesday, February 28, 2007 - 17:55
SOURCE: Legal Times (2-26-07)
It is refreshing to see that the unsavory history of our late Chief Justice William Rehnquist is at last getting a fresh examination as a consequence of the release of FBI files in January under the Freedom of Information Act.
It is particularly noteworthy that the files reveal how Rehnquist’s friends at the FBI (read J. Edgar Hoover) tried to protect Rehnquist at his 1986 Senate confirmation hearings for chief justice from allegations that as a young Republican lawyer in Phoenix he worked to keep Hispanic voters from casting ballots. The allegation about Rehnquist’s conduct is not new, but the fact that the FBI tried to intimidate witnesses into not testifying about it is.
The new revelations show just how courageous former Assistant U.S. Attorney James Brosnahan was when he stepped forward to tell the Senate Judiciary Committee how he confronted Rehnquist at a Phoenix polling place in 1962. Brosnahan testified that he warned Rehnquist that voter intimidation was a violation of federal law.
Noting that Rehnquist had denied the allegations in his own testimony, Brosnahan told the senators, “This does not comport with my recollection of the events I witnessed in 1962, when Mr. Rehnquist served as a challenger.”
NOT ANCIENT HISTORY
But it is important to keep in mind that this is not ancient history. After the Senate went ahead and confirmed him as a justice in 1971, Rehnquist crafted a doctrine that to this day bars some 5 million Americans, disproportionately African-American and Hispanic, from voting.
It was Rehnquist’s opinion in 1974, in Richardson v. Ramirez, that has allowed 48 of our 50 states and the District of Columbia to bar anyone convicted of a crime from voting. Thirty-five of those states continue the practice even after the offender has completed his sentence.
It was Rehnquist’s cynical sleight of hand in Ramirez that reinterpreted Section 2 of the 14th Amendment, adopted right after the Civil War. Rehnquist transformed it from a prohibition on states’ barring ex-slaves from voting into an authorization to do so — so long as the states first arrested them and convicted them of a crime.
The language of Section 2 of the 14th Amendment is straightforward. It provides that any state prohibiting ex-slaves from voting would have its congressional representation proportionately reduced. It also had an exception “for participation in rebellion or other crime.” Despite the fact that practically no blacks were permitted to vote in the former Confederate states for the next 100 years, no state ever lost a single seat in Congress.
But in Ramirez, Rehnquist found a use for Section 2. He ruled that it was an affirmative authorization for the states to disfranchise blacks — and anyone else — so long as the states first convicted them of a crime. Thus, he said, the proviso in Section 2, intended to reduce a state’s representation in Congress, also provided the states an exception from Section 1 of the 14th Amendment, which forbids any state from denying any group the equal protection of the laws.
As Justice Thurgood Marshall said in dissent in Ramirez, this was in direct disregard of the “historical purpose” of the Section 2 proviso, which was to “put Southern States to a choice — enfranchise Negro voters or lose congressional representation.” ...
Posted on: Wednesday, February 28, 2007 - 17:30
SOURCE: Japan Focus (2-20-07)
Once upon a time, you could trace the spread of imperialism by counting up colonies. America's version of the colony is the military base; and by following the changing politics of global basing, one can learn much about our ever more all-encompassing imperial "footprint" and the militarism that grows with it.
It is not easy, however, to assess the size or exact value of our empire of bases. Official records available to the public on these subjects are misleading, although instructive. According to the Defense Department's annual inventories from 2002 to 2005 of real property it owns around the world, the Base Structure Report, there has been an immense churning in the numbers of installations.
The total of America's military bases in other people's countries in 2005, according to official sources, was 737. Reflecting massive deployments to Iraq and the pursuit of President Bush's strategy of preemptive war, the trend line for numbers of overseas bases continues to go up.
Interestingly enough, the thirty-eight large and medium-sized American facilities spread around the globe in 2005 -- mostly air and naval bases for our bombers and fleets -- almost exactly match in number Britain's thirty-six naval bases and army garrisons at its imperial zenith in 1898. The Roman Empire at its height in 117 AD required thirty-seven major bases to police its realm from Britannia to Egypt, from Hispania to Armenia. Perhaps the optimum number of major citadels and fortresses for an imperialist aspiring to dominate the world is somewhere between thirty-five and forty.
Using data from fiscal year 2005, Pentagon bureaucrats calculated that the military's overseas bases were worth at least $127 billion -- surely far too low a figure but still larger than the gross domestic products of most countries -- and an estimated $658.1 billion for all of them, foreign and domestic (a base's "worth" is based on a Department of Defense estimate of what it would cost to replace it). During fiscal 2005, the military high command deployed to our overseas bases some 196,975 uniformed personnel as well as an equal number of dependents and Department of Defense civilian officials, and employed an additional 81,425 locally hired foreigners.
The worldwide total of U.S. military personnel in 2005, including those based domestically, was 1,840,062, supported by an additional 473,306 Defense Department civil service employees and 203,328 local hires. The military's overseas bases, according to the Pentagon, contained 32,327 barracks, hangars, hospitals, and other buildings that it owned, and 16,527 more that it leased. The size of these holdings was recorded in the inventory as covering 687,347 acres overseas and 29,819,492 acres worldwide, making the Pentagon easily one of the world's largest landlords.
These numbers, although staggeringly big, do not begin to cover all the actual bases we occupy globally. The 2005 Base Structure Report fails, for instance, to mention any garrisons in Kosovo (or Serbia, of which Kosovo is still officially a province) -- even though it is the site of the huge Camp Bondsteel built in 1999 and maintained ever since by the KBR corporation (formerly known as Kellogg Brown & Root), a subsidiary of the Halliburton Corporation of Houston.
The report similarly omits bases in Afghanistan, Iraq (106 garrisons as of May 2005), Israel, Kyrgyzstan, Qatar, and Uzbekistan, even though the U.S. military has established colossal base structures in the Persian Gulf and Central Asian areas since 9/11. By way of excuse, a note in the preface says that "facilities provided by other nations at foreign locations" are not included, although this is not strictly true. The report does include twenty sites in Turkey, all owned by the Turkish government and used jointly with the Americans. The Pentagon continues to omit from its accounts most of the $5 billion worth of military and espionage installations in Britain, which have long been conveniently disguised as Royal Air Force bases. If there were an honest count, the actual size of our military empire would probably top 1,000 different bases overseas, but no one -- possibly not even the Pentagon -- knows the exact number for sure.
In some cases, foreign countries themselves have tried to keep their U.S. bases secret, fearing embarrassment if their collusion with American imperialism were revealed. In other instances, the Pentagon seems to want to play down the building of facilities aimed at dominating energy sources, or, in a related situation, retaining a network of bases that would keep Iraq under our hegemony regardless of the wishes of any future Iraqi government. The U.S. government tries not to divulge any information about the bases we use to eavesdrop on global communications, or our nuclear deployments, which, as William Arkin, an authority on the subject, writes, "[have] violated its treaty obligations. The U.S. was lying to many of its closest allies, even in NATO, about its nuclear designs. Tens of thousands of nuclear weapons, hundreds of bases, and dozens of ships and submarines existed in a special secret world of their own with no rational military or even 'deterrence' justification."
In Jordan, to take but one example, we have secretly deployed up to five thousand troops in bases on the Iraqi and Syrian borders. (Jordan has also cooperated with the CIA in torturing prisoners we deliver to them for "interrogation.") Nonetheless, Jordan continues to stress that it has no special arrangements with the United States, no bases, and no American military presence.
The country is formally sovereign but actually a satellite of the United States and has been so for at least the past ten years. Similarly, before our withdrawal from Saudi Arabia in 2003, we habitually denied that we maintained a fleet of enormous and easily observed B-52 bombers in Jeddah because that was what the Saudi government demanded. So long as military bureaucrats can continue to enforce a culture of secrecy to protect themselves, no one will know the true size of our baseworld, least of all the elected representatives of the American people.
In 2005, deployments at home and abroad were in a state of considerable flux. This was said to be caused both by a long overdue change in the strategy for maintaining our global dominance and by the closing of surplus bases at home. In reality, many of the changes seemed to be determined largely by the Bush administration's urge to punish nations and domestic states that had not supported its efforts in Iraq and to reward those that had. Thus, within the United States, bases were being relocated to the South, to states with cultures, as the Christian Science Monitor put it, "more tied to martial traditions" than the Northeast, the northern Middle West, or the Pacific Coast. According to a North Carolina businessman gloating over his new customers, "The military is going where it is wanted and valued most."
In part, the realignment revolved around the Pentagon's decision to bring home by 2007 or 2008 two army divisions from Germany -- the First Armored Division and the First Infantry Division -- and one brigade (3,500 men) of the Second Infantry Division from South Korea (which, in 2005, was officially rehoused at Fort Carson, Colorado). So long as the Iraq insurgency continues, the forces involved are mostly overseas and the facilities at home are not ready for them (nor is there enough money budgeted to get them ready).
Nonetheless, sooner or later, up to 70,000 troops and 100,000 family members will have to be accommodated within the United States. The attendant 2005 "base closings" in the United States are actually a base consolidation and enlargement program with tremendous infusions of money and customers going to a few selected hub areas. At the same time, what sounds like a retrenchment in the empire abroad is really proving to be an exponential growth in new types of bases -- without dependents and the amenities they would require -- in very remote areas where the U.S. military has never been before.
After the collapse of the Soviet Union in 1991, it was obvious to anyone who thought about it that the huge concentrations of American military might in Germany, Italy, Japan, and South Korea were no longer needed to meet possible military threats. There were not going to be future wars with the Soviet Union or any country connected to any of those places.
In 1991, the first Bush administration should have begun decommissioning or redeploying redundant forces; and, in fact, the Clinton administration did close some bases in Germany, such as those protecting the Fulda Gap, once envisioned as the likeliest route for a Soviet invasion of Western Europe. But nothing was really done in those years to plan for the strategic repositioning of the American military outside the United States.
By the end of the 1990s, the neoconservatives were developing their grandiose theories to promote overt imperialism by the "lone superpower" -- including preventive and preemptive unilateral military action, spreading democracy abroad at the point of a gun, obstructing the rise of any "near-peer" country or bloc of countries that might challenge U.S. military supremacy, and a vision of a "democratic" Middle East that would supply us with all the oil we wanted. A component of their grand design was a redeployment and streamlining of the military. The initial rationale was for a program of transformation that would turn the armed forces into a lighter, more agile, more high-tech military, which, it was imagined, would free up funds that could be invested in imperial policing.
What came to be known as "defense transformation" first began to be publicly bandied about during the 2000 presidential election campaign. Then 9/11 and the wars in Afghanistan and Iraq intervened. In August 2002, when the whole neocon program began to be put into action, it centered above all on a quick, easy war to incorporate Iraq into the empire. By this time, civilian leaders in the Pentagon had become dangerously overconfident because of what they perceived as America's military brilliance and invincibility as demonstrated in its 2001 campaign against the Taliban and al-Qaeda -- a strategy that involved reigniting the Afghan civil war through huge payoffs to Afghanistan's Northern Alliance warlords and the massive use of American airpower to support their advance on Kabul.
In August 2002, Secretary of Defense Donald Rumsfeld unveiled his "1-4-2-1 defense strategy" to replace the Clinton era's plan for having a military capable of fighting two wars -- in the Middle East and Northeast Asia -- simultaneously. Now, war planners were to prepare to defend the United States while building and assembling forces capable of "deterring aggression and coercion" in four "critical regions": Europe, Northeast Asia (South Korea and Japan), East Asia (the Taiwan Strait), and the Middle East, be able to defeat aggression in two of these regions simultaneously, and "win decisively" (in the sense of "regime change" and occupation) in one of those conflicts "at a time and place of our choosing." As the military analyst William M. Arkin commented, "[With] American military forces ... already stretched to the limit, the new strategy goes far beyond preparing for reactive contingencies and reads more like a plan for picking fights in new parts of the world."
A seemingly easy three-week victory over Saddam Hussein's forces in the spring of 2003 only reconfirmed these plans. The U.S. military was now thought to be so magnificent that it could accomplish any task assigned to it. The collapse of the Baathist regime in Baghdad also emboldened Secretary of Defense Rumsfeld to use "transformation" to penalize nations that had been, at best, lukewarm about America's unilateralism -- Germany, Saudi Arabia, South Korea, and Turkey -- and to reward those whose leaders had welcomed Operation Iraqi Freedom, including such old allies as Japan and Italy but also former communist countries such as Poland, Romania, and Bulgaria. The result was the Department of Defense's Integrated Global Presence and Basing Strategy, known informally as the "Global Posture Review."
President Bush first mentioned it in a statement on November 21, 2003, in which he pledged to "realign the global posture" of the United States. He reiterated the phrase and elaborated on it on August 16, 2004, in a speech to the annual convention of the Veterans of Foreign Wars in Cincinnati. Because Bush's Cincinnati address was part of the 2004 presidential election campaign, his comments were not taken very seriously at the time. While he did say that the United States would reduce its troop strength in Europe and Asia by 60,000 to 70,000, he assured his listeners that this would take a decade to accomplish -- well beyond his term in office -- and made a series of promises that sounded more like a reenlistment pitch than a statement of strategy.
"Over the coming decade, we'll deploy a more agile and more flexible force, which means that more of our troops will be stationed and deployed from here at home. We'll move some of our troops and capabilities to new locations, so they can surge quickly to deal with unexpected threats. ... It will reduce the stress on our troops and our military families. ... See, our service members will have more time on the home front, and more predictability and fewer moves over a career. Our military spouses will have fewer job changes, greater stability, more time for their kids and to spend with their families at home."
On September 23, 2004, however, Secretary Rumsfeld disclosed the first concrete details of the plan to the Senate Armed Services Committee. With characteristic grandiosity, he described it as "the biggest re-structuring of America's global forces since 1945." Quoting then-undersecretary Douglas Feith, he added, "During the Cold War we had a strong sense that we knew where the major risks and fights were going to be, so we could deploy people right there. We're operating now [with] an entirely different concept. We need to be able to do [the] whole range of military operations, from combat to peacekeeping, anywhere in the world pretty quickly."
Though this may sound plausible enough, in basing terms it opens up a vast landscape of diplomatic and bureaucratic minefields that Rumsfeld's militarists surely underestimated. In order to expand into new areas, the Departments of State and Defense must negotiate with the host countries such things as Status of Forces Agreements, or SOFAs, which are discussed in detail in the next chapter. In addition, they must conclude many other required protocols, such as access rights for our aircraft and ships into foreign territory and airspace, and Article 98 Agreements. The latter refer to article 98 of the International Criminal Court's Rome Statute, which allows countries to exempt U.S. citizens on their territory from the ICC's jurisdiction.
Such immunity agreements were congressionally mandated by the American Service-Members' Protection Act of 2002, even though the European Union holds that they are illegal. Still other necessary accords are acquisitions and cross-servicing agreements or ACSAs, which concern the supply and storage of jet fuel, ammunition, and so forth; terms of leases on real property; levels of bilateral political and economic aid to the United States (so-called host-nation support); training and exercise arrangements (Are night landings allowed? Live firing drills?); and environmental pollution liabilities.
When the United States is not present in a country as its conqueror or military savior, as it was in Germany, Japan, and Italy after World War II and in South Korea after the 1953 Korean War armistice, it is much more difficult to secure the kinds of agreements that allow the Pentagon to do anything it wants and that cause a host nation to pick up a large part of the costs of doing so. When not based on conquest, the structure of the American empire of bases comes to look exceedingly fragile.
Posted on: Wednesday, February 28, 2007 - 16:19
SOURCE: Counterpunch (2-24-07)
Not long ago, I met Eyal Naveh, an Israeli historian, who explains that the United States has been the "model" for the Israeli state and society. He claims that the US was first a model for the Zionist pioneers, then for the founders of the state of Israel. Like the US, Israel was to be an entirely new country created in a savage, untamed land peopled only by savages. Like the US, Israel would be unique in its democratic institutions, its multicultural society and its modernity. Israel would also, like the US, apply the most advanced technology in the resolution of existential problems and towards the achievement of a high standard of living.
I agree with Naveh that the US influence over the Zionist enterprise is important. What is less understood is how Israel has become a model for the US. Recently the work of John Mearsheimer and Stephen Walt has raised the question of how Israel, through the Zionist lobby in the US, has perhaps come to exercise virtually direct control over US policy in the Middle East. This is an important debate to which others, such as Noam Chomsky and Bill and Kathleen Christison, have made significant contributions. In this debate, in my opinion, the cultural connections between Zionism and the United States should not be minimized.
Because the state of Israel was created in part under the inspiration of the US, the frontier society forged in North America, images of the US have come to constitute an essential element of the vision that many Americans have of Israel and Palestine. In great part, the US understanding of the Israeli-Palestinian conflict involves an image of the US itself, an image first projected onto the Zionist settlements, and then onto the state of Israel. This is a process of "image transfer" which began long before the recognition of the state of Israel in 1948 and the substitution of US authority in the region for that of Great Britain.
The US presence, or involvement, in Israeli and Palestinian affairs was prepared long in advance of any concern for the "peace process". This US involvement has been not only the initiative of individual presidents, whatever their motivations, but an emotional commitment generated by a sense of identification. Identification between the American experience and the Zionist-Israeli experience was prepared by the refraction of a certain image of the United States through the prism of Zionist propaganda and colonization in Palestine. In the history of the United States in relation to Israel, this refracted image is both the means and the end (the objective) in the process of ideological formation.
How did the historical experience of the United States help shape the image of Palestine? How did the "New Jerusalem" contribute to a change in the vision of the "old Jerusalem"?
A first connection is between an understanding of the Jewish Diaspora and the Protestant-puritan Diaspora of the seventeenth century. Despite deep currents of anti-Semitism, the parallel between John Winthrop leading the brave Puritans to the Promised Land and Moses leading the children of Israel back to the Holy Land has been regularly exploited in (what is today) the United States. For example, Thomas Jefferson suggested that the official seal of the United States could depict the "Children of Israel" following a pillar of light sent by God.
The associations envisioned by Jefferson are eloquent: the notion of a chosen people, the Elect, to whom providence has assigned a spiritual mission linked to the conquest of a particular land. All this provides the basis for an affinity that is, in fact, more than elective: it is divine. More specifically, both chosen peoples were, ultimately, "people without a land" called upon to colonize "a land without a people".
When we speak of the colonizers of America and Palestine, it is logical to forget the indigenous inhabitants of both places, for it was the land that was colonized--not the people living on it. The importance of the American Indians and the Palestinians comes from the fact that they have figured as obstacles to the fulfillment of the missions in question. Both groups have, in different ways, been characterized as lower forms of civilization slowing the march of progress. Both peoples have been described as savage and cruel.
This image, at its worst racist and genocidal, at its best paternalistic, is well documented as it concerns Native Americans. As regards non-Jewish Palestinians, there is less documentation and more controversy. The rise of cultural prejudice and even racism concerning the non-Christian and Jewish populations of the Middle and Near East is not a popular subject in the West. The ideas presented in, for example, Edward Saïd's Orientalism, or in Martin Bernal's Black Athena, are in no way flattering to Western culture or to Western people in general.
The history of this negative form of "Orientalism" is being written today. I, for one, have attempted to elucidate how an already prejudiced perception of Palestinians was sharpened in the 1920s by Zionist spokespersons. Over a period of several years, religious designations, or territorial designations, ceased to be used in reference to non-Jewish inhabitants of Palestine. By the mid-1920s, only two parties in conflict were referred to: the "Jews" and the "Arabs". A concurrent tendency existed to refer to both groups as "races". I call this the "racializing of ethnicity". Although the vogue of racializing social terminology was abandoned (in most informed circles) after the outbreak of World War II, the cultural prejudices have persisted.
The development of a more exclusionary terminology used to designate the undesirable populations is certainly one characteristic of colonization. In order to preserve their own dignity, the colonizers are morally constrained to denigrate the human obstacles to the accomplishment of their project. Comparison of the two colonial experiences reveals how each borrowed from the other.
The history of the British colonies in North America, and then of the United States throughout the nineteenth century, is one of continuous colonization. The religious and economic motives typical of the seventeenth century continued to inspire settlers until the "closing" of the Frontier in the 1890s. The real novelty of the nineteenth century was the proliferation of utopian experiments in communal living. Hundreds of socialistic communities were established throughout the United States during the nineteenth century. To our day, such initiatives continue as part of the social and cultural landscape.
The Zionist settlements in Palestine combined all these same motivations. Not only were the Zionist colonies of different types; they sometimes, as in the case of the Kibbutzim, united in themselves religious Puritanism and secular socialistic modernity. This was a phenomenon appealing to United-Statesians reared on frontier myths, such as the idea of cultural-spiritual regeneration through a confrontation with adversity and violence.
The "closing" of the US frontier in the early 1890s, accompanied by the rapid development of a mythologized literature and cinema concerning the Western hero, certainly facilitated support for the Zionist project. The idea of pioneers struggling to establish themselves in a hostile environment was romantic, and familiar.
Related to the settlement of frontiers by hardy pioneers, another affinity between Americans and Israelis is the development and application of new agricultural techniques. "Making the desert bloom" was a powerful slogan and image for both emergent national cultures. US botanical technology, such as new plant varieties, insecticides, and chemical fertilizers, contributed to the success of Jewish settlements in Palestine. Going from the Great American Desert to Palestine was more than a symbolic transfer of images. In both cases, it also involved a denial of the agricultural achievements of the indigenous inhabitants.
Another affinity between the creations of the American and Israeli "nations" is the demographic importance of immigration. Both populations are considered the product of disparate "waves" of new immigrants and their assimilation into a "New World" culture, including a new language seen as deriving from existing ones (although "American" cannot be said to be as innovative as modern "Hebrew"). The interconnection of American and Zionist immigration has meant the projection of an image of the United States onto the Zionist project. This projection has been assisted by 1) the idea of immigration as the means of recomposing or regenerating a population and, 2) the fact that so many Jews from Russia, Poland and elsewhere immigrated to the United States. Jewish immigrants in the US were prone to support emigration to Palestine. (In the latter half of the twentieth century, a significant number of their descendants immigrated to Israel.)
Other factors in the development of support for Zionism in the United States include a Christian education tending to reinforce revulsion for the "loss" of the Holy Land to Islam. The Christian Crusades of the Middle Ages tended to be particularly celebrated in the US towards the end of the nineteenth century.
Anti-Semitism also encouraged acceptance of the Zionist project in Palestine. Those who resented the presence of Jews viewed favorably their transfer to a relatively desolate part of the world. This factor intensified after World War II, when Jewish refugees became an embarrassment to Western governments, even though anti-Semitism was declining.
Such are some of the cultural affinities and conditions that have contributed to the orientation of US policies relative to the Israel-Palestine conflict. In some significant ways, US nationalism is linked to, or seen as having affinities with, Jewish nationalism as represented first by the Zionist movement and then by the Israeli state. This is why Israel is not seen in the United States as an alien culture in the Middle East, but rather as an extension of American historical experience. It is perhaps in this cultural-ontological sense that Israel is the "51st state" (and not primarily because of the extensive economic, financial and military ties).
For all of these reasons, the rhetoric of nationalism in the Israel-Palestine conflict tends to reinforce established cultural values, values stemming from American historical experience. It is also why, in the United States, many people find it difficult to take seriously Palestinian claims, just as they could not take seriously the claims of the "Indian Nations". The similarities, in any case, are striking. One century later, the Palestinian resistance to colonization and ethnic cleansing is being dealt with in much the same ways as that of the Indians: forced evacuation, concentration in "reservations" (which could be called "Bantustans" or "autonomous territories"), periodic massacre and racist humiliations.
Consider, in the above light, how differently Israeli and Palestinian leadership must be perceived. On the one hand, there have been Israeli leaders like Golda Meir and Benjamin Netanyahu, Americans or American-educated, speaking faultless "American". On the other hand, the Palestinian leaders most often have an alien aspect; not to speak of the late Yassir Arafat, with his colorful headdress and his strange uniform of dubious origin. The cultivated descendants of brave Western-like pioneers make a singular contrast with the Palestinians.
The analogies and metaphors are there, underlying a US policy conceiving of "peace" mostly in terms of acquiescence or accommodation to the image and interests of the United States projected onto the Israeli state, an Israeli state considered by US policy makers to be a model for the Middle East in general.
For these US policymakers, it is not only a question of propagandistic manipulation, of the conscious deception of the public. The metaphors and analogies founded upon the special affinities between the US and the state of Israel are, rather, rooted in the social and cultural histories of the two societies and their politics. If hypocrisy and bad faith are integral to political behavior, in the service of collective interests as much as of individual designs, it is to be expected that such self-deception should be pronounced both in the critical, early phases of nation-state-making and during the construction of an imperial presence in the Middle East.
Posted on: Wednesday, February 28, 2007 - 14:59
SOURCE: Chronicle of Higher Education (3-5-07)
In the summer of 1960, the noted American poet Kenneth Rexroth issued a warning to college undergraduates. The previous spring had witnessed a wave of student sit-ins against racial discrimination in the South, as well as a protest by students in the Bay Area against the U.S. House of Representatives Committee on Un-American Activities. That the famously "silent generation" on campuses in the 1950s was making way for a new generation of student activists seemed a welcome development to Rexroth, who had a history of involvement in left-wing causes stretching back to the 1930s.
But Rexroth was less encouraged to learn that the trend had also reached the ears of the students' left-wing elders — his contemporaries. Writing in The Nation magazine, Rexroth imagined how the aging leaders of all the moribund sects of the American left would immediately conclude "Myself when young!" and dash off for the nearest campus with a stack of application blanks. "As the kids go back to school this fall," he warned, "this is going to be the greatest danger they will face — all these eager helpers from the other side of the age barrier, all these cooks, each with a time-tested recipe for the broth."
I was reminded of Rexroth's sense of unease when I learned last spring that a new group of eager helpers from the other side of the age barrier — this time, my contemporaries — planned to revive Students for a Democratic Society, the principal campus radical organization of the 1960s. When SDS first took form in 1960-62 under the leadership of Al Haber and Tom Hayden, it was a small organization of a few hundred members. Its Port Huron Statement, largely written by the 22-year-old Hayden and adopted at a convention of SDS delegates in Port Huron, Mich., in 1962, called for the creation of a "new left in America ... with real intellectual skills, committed to deliberativeness, honesty, reflection as working tools." By the time I joined the Reed College chapter as a freshman in 1968, SDS had grown into a very large organization — at least by the standards of the American left — with perhaps as many as 100,000 members.
But by that time, leaders of SDS, if not all of its rank and file, had largely forgotten the organization's original goals and values. At a final fractious convention in Chicago in the summer of 1969, a faction known as Weatherman (from a Bob Dylan lyric quoted in its manifesto's title) gained control of the SDS national office. A few months later, Weatherman leaders shut down the organization. True revolutionaries, they believed, should forsake the campuses for clandestine armed struggle. The national office, along with several dozen followers, disappeared into the "Weather Underground," leaving more than 99,000 of us with no place to belong.
The armed struggle did not turn out well: Three Weathermen blew themselves up in a Greenwich Village townhouse in March 1970 as they assembled a nail-studded dynamite bomb that they planned to set off at an enlisted men's dance at Fort Dix in New Jersey. Had they been better bomb makers, Weatherman might well have taken scores of innocent lives and, in the process, destroyed the antiwar movement as it had already destroyed SDS.
Historians have debated the meaning of those events ever since. But in retrospect one thing is clear: The legacy that SDS left for future generations of American radicals is dark and complicated, a cautionary example of good intentions gone awry in the midst of a decade defined by tragic and unintended consequences.
So when I heard that a new SDS was in the offing, I did not immediately conclude "Myself when young!" and reach for the application blanks. I felt that even if the organizers were determined to avoid a repetition of past disasters, it would still prove a mistake to revive an organization whose very name imposed on its members the necessity of constantly explaining to skeptical outsiders that, no, it wasn't the SDS of 1969 they sought to emulate, but that of earlier, saner years....
Posted on: Monday, February 26, 2007 - 22:31
SOURCE: Nation (2-22-07)
A baby is born. A child develops a high fever. A spouse breaks a leg. A parent suffers a stroke. These are the events that throw a working woman's delicate balance between work and family into chaos.
Although we read endless stories and reports about the problems faced by working women, we possess inadequate language for what most people view as a private rather than a political problem. "That's life," we tell each other, instead of trying to forge common solutions to these dilemmas.
That's exactly what housewives used to say when they felt unhappy and unfulfilled in the 1950s: "That's life." Although magazines often referred to housewives' unexplained depressions, it took Betty Friedan's 1963 bestseller to turn "the problem that has no name" into a household phrase, "the feminine mystique"--the belief that a woman should find identity and fulfillment exclusively through her family and home.
The great accomplishment of the modern women's movement was to name such private experiences--domestic violence, sexual harassment, economic discrimination, date rape--and turn them into public problems that could be debated, changed by new laws and policies or altered by social customs. That is how the personal became political.
Although we have shelves full of books that address work/family problems, we still have not named the burdens that affect most of America's working families.
Call it the care crisis.
For four decades, American women have entered the paid workforce--on men's terms, not their own--yet we have done precious little as a society to restructure the workplace or family life. The consequence of this "stalled revolution," a term coined by sociologist Arlie Hochschild, is a profound "care deficit." A broken healthcare system, which has left 47 million Americans without health coverage, means this care crisis is often a matter of life and death. Today the care crisis has replaced the feminine mystique as women's "problem that has no name." It is the elephant in the room--at home, at work and in national politics--gigantic but ignored.
More than three decades after Congress passed comprehensive childcare legislation in 1971--Nixon vetoed it--childcare has simply dropped off the national agenda. And in the intervening years, the political atmosphere has only grown more hostile to the idea of using federal funds to subsidize the lives of working families.
The result? People suffer their private crises alone, without realizing that the care crisis is a problem of national significance. Many young women agonize about how to combine work and family but view the question of how to raise children as a personal dilemma, to which they need to find an individual solution. Most cannot imagine turning it into a political debate. More than a few young women have told me that the lack of affordable childcare has made them reconsider plans to become parents. Annie Tummino, a young feminist active in New York, put it this way: "I feel terrified of the patchwork situation women are forced to rely upon. Many young women are deciding not to have children or waiting until they are well established in their careers."
Now that the Democrats are running both houses of Congress, we finally have an opportunity to expose the right's cynical appropriation of "family values" by creating real solutions to the care crisis and making them central to the Democratic agenda. The obstacles, of course, are formidable, given that government and businesses--as well as many men--have found it profitable and convenient for women to shoulder the burden of housework and caregiving.
It is as though Americans are trapped in a time warp, still convinced that women should and will care for children, the elderly, homes and communities. But of course they can't, now that most women have entered the workforce. In 1950 less than a fifth of mothers with children under age 6 were in the labor force. By 2000 two-thirds of these mothers worked in the paid labor market.
Men in dual-income couples have increased their participation in household chores and childcare. But women still manage and organize much of family life, returning home after work to a "second shift" of housework and childcare--often compounded by a "third shift," caring for aging parents.
Conservatives typically blame the care crisis on the women's movement for creating the impossible ideal of "having it all." But it was women's magazines and popular writers, not feminists, who created the myth of the Superwoman. Feminists of the 1960s and '70s knew they couldn't do it alone. In fact, they insisted that men share the housework and child-rearing and that government and business subsidize childcare.
A few decades later, America's working women feel burdened and exhausted, desperate for sleep and leisure, but they have made few collective protests for government-funded childcare or family-friendly workplace policies. As American corporations compete for profits through layoffs and outsourcing, most workers hesitate to make waves for fear of losing their jobs.
Single mothers naturally suffer the most from the care crisis. But even families with two working parents face what Hochschild has called a "time bind." Americans' yearly work hours increased by more than three weeks between 1989 and 1996, leaving no time for a balanced life. Parents become overwhelmed and cranky, gulping antacids and sleeping pills, while children feel neglected and volunteerism in community life declines.
Meanwhile, the right wins the rhetorical battle by stressing "values" and "faith." In the name of the family they campaign to ban gay marriage and save unborn children. Yet they refuse to embrace public policies that could actually help working families regain stability and balance.
For the very wealthy, the care crisis is not so dire. They solve their care deficit by hiring full-time nannies or home-care attendants, often from developing countries, to care for their children or parents. The irony is that even as these immigrant women make it easier for well-off Americans to ease their own care burdens, their long hours of paid caregiving often force them to leave their own children with relatives in other countries. They also suffer from extremely low wages, job insecurity and employer exploitation.
Middle- and working-class families, with fewer resources, try to patch together care for their children and aging parents with relatives and baby sitters. The very poor sometimes gain access to federal or state programs for childcare or eldercare; but women who work in the low-wage service sector, without adequate sick leave, generally lose their jobs when children or parents require urgent attention. As of 2005, 21 million women lived below the poverty line--many of them mothers working in these vulnerable situations.
The care crisis starkly exposes how much of the feminist agenda of gender equality remains woefully unfinished. True, some businesses have taken steps to ease the care burden. Every year, Working Mother publishes a list of the 100 most "family friendly" companies. In 2000 the magazine reported that companies that had made "significant improvements in 'quality of life' benefits such as telecommuting, onsite childcare, career training, and flextime" were "saving hundreds of thousands of dollars in recruitment in the long run."
Some universities, law firms and hospitals have also made career adjustments for working mothers, but women's career demands still tend to collide with their most intensive child-rearing years. Many women end up feeling they have failed rather than struggled against a setup designed for a male worker with few family responsibilities.
The fact is, market fundamentalism--the irrational belief that markets solve all problems--has succeeded in dismantling federal regulations and services but has failed to answer the question, Who will care for America's children and elderly?
As a result, this country's family policies lag far behind those of the rest of the world. A just-released study by researchers at Harvard and McGill found that of 173 countries studied, 168 guarantee paid maternal leave--with the United States joining Lesotho and Swaziland among the laggards. At least 145 countries mandate paid sick days for short- or long-term illnesses--but not the United States. One hundred thirty-four countries legislate a maximum length for the workweek; not us.
The media constantly reinforce the conventional wisdom that the care crisis is an individual problem. Books, magazines and newspapers offer American women an endless stream of advice about how to maintain their "balancing act," how to be better organized and more efficient or how to meditate, exercise and pamper themselves to relieve their mounting stress. Missing is the very pragmatic proposal that American society needs new policies that will restructure the workplace and reorganize family life.
Another slew of stories insist that there simply is no problem: Women have gained equality and passed into a postfeminist era. Such claims are hardly new. Ever since 1970 the mainstream media have been pronouncing the death of feminism and reporting that working women have returned home to care for their children. Now such stories describe, based on scraps of anecdotal data, how elite (predominantly white) women are "choosing" to "opt out," ditching their career opportunities in favor of home and children or of caring for aging parents. In 2000 Ellen Galinsky, president of the Families and Work Institute in New York, wearily responded to reporters, "I still meet people all the time who believe that the trend has turned, that more women are staying home with their kids, that there are going to be fewer dual-income families. But it's just not true."
Such contentious stories conveniently mask the reality that most women have to work, regardless of their preference. They also obscure the fact that an absence of quality, affordable childcare and flexible working hours, among other family-friendly policies, greatly contributes to women's so-called "choice" to stay at home.
In the past few years, a series of sensational stories have pitted stay-at-home mothers against "working women" in what the media coyly call the "mommy wars." When the New York Times ran a story on the controversy, one woman wrote the editor, "The word 'choice' has been used...as a euphemism for unpaid labor, with no job security, no health or vacation benefits and no retirement plans. No wonder men are not clamoring for this 'choice.' Many jobs in the workplace also involve drudgery, but do not leave one financially dependent on another person."
Most institutions, in fact, have not implemented policies that support family life. As a result, many women do feel compelled to choose between work and family. In Scandinavian countries, where laws provide for generous parental leave and subsidized childcare, women participate in the labor force at far greater rates than here--evidence that "opting out" is, more often than not, the result of a poverty of acceptable options.
American women who do leave their jobs find that they cannot easily re-enter the labor force. The European Union has established that parents who take a leave from work have a right to return to an equivalent job. Not so in the United States. According to a 2005 study by the Wharton Center for Leadership and Change and the Forte Foundation, those who held advanced degrees in law, medicine or education often faced a frosty reception and found themselves shut out of their careers. In her 2005 book Bait and Switch, Barbara Ehrenreich describes how difficult it was for her to find employment as a midlevel manager, despite waving an excellent résumé at potential employers. "The prohibition on [résumé] gaps is pretty great," she says. "You have to be getting an education or making money for somebody all along, every minute."
Some legislation passed by Congress has exacerbated the care crisis rather than ameliorated it. Consider the 1996 Welfare Reform Act, which eliminated guaranteed welfare, replaced it with Temporary Assistance for Needy Families (TANF) and set a five-year lifetime limit on benefits. Administered by the states, TANF aimed to reduce the number of mothers on welfare rolls, not to reduce poverty.
TANF was supposed to provide self-sufficiency for poor women. But most states forced recipients into unskilled, low-wage jobs, where they joined the working poor. By 2002 one in ten former welfare recipients in seven Midwestern states had become homeless, even though they were now employed.
TANF also disqualified higher education as a work-related activity, which robbed many poor women of an opportunity for upward mobility. Even as the media celebrate highly educated career women who leave their jobs to become stay-at-home moms, TANF requires single mothers to leave their children somewhere, anywhere, so they can fulfill their workfare requirement and receive benefits. TANF issues vouchers that force women to leave their children with dubious childcare providers or baby sitters they have good reasons not to trust.
Some readers may recall the 1970 Women's Strike for Equality, when up to 50,000 women exuberantly marched down New York's Fifth Avenue to issue three core demands for improving their lives: the right to an abortion, equal pay for equal work and universal childcare. The event received so much media attention that it turned the women's movement into a household word.
A generation later, women activists know how far we are from achieving those goals. Abortion is under serious legal attack, and one-third of American women no longer have access to a provider in the county in which they live. Women still make only 77 percent of what men do for the same job; and after they have a child, they suffer from an additional "mother's wage gap," which shows up in fewer promotions, smaller pensions and lower Social Security benefits. Universal childcare isn't even on the agenda of the Democrats.
Goals proposed in 1970, however unrealized, are no longer sufficient for the new century. Even during these bleak Bush years, many writers, activists and organizations have begun planning for a different future. If women really mattered, they ask, how would we change public policy and society? As one writer puts it, "What would the brave new world look like if women could press reboot and rewrite all the rules?"
Though no widely accepted manifesto exists, many advocacy organizations--such as the Institute for Women's Policy Research, the Children's Defense Fund, the National Partnership for Women and Families, Take Care Net and MomsRising--have argued that universal healthcare, paid parental leave, high-quality subsidized on-the-job and community childcare, a living wage, job training and education, flexible work hours and greater opportunities for part-time work, investment in affordable housing and mass transit, and the reinstatement of a progressive tax structure would go a long way toward supporting working mothers and their families. (In these pages in 2003, Deborah Stone documented campaigns on many of these issues by organizations in California, Massachusetts and Washington.)
Democrats don't need to reinvent the wheel; these groups have already provided the basis for a new progressive domestic agenda. And if Democrats embrace large portions of this program, they might attract enough women to widen the gender gap in voting, which shrank from 14 percent in 1996 to only 7 percent in 2004.
This is an expensive agenda, but the money is there if we end tax cuts for the wealthy and reduce expenditures for unnecessary wars, space-based weapons and the hundreds of American bases that circle the globe. If we also reinstate a progressive tax structure, this wealthy nation would have enough resources to care for all its citizens. It's a question of political will.
Confronting the care crisis and reinvigorating the struggle for gender equality should be central to the broad progressive effort to restore belief in the "common good." Although Americans famously root for the underdog, they have shown far less compassion for the poor, the vulnerable and the homeless in recent years. Social conservatives, moreover, have persuaded many Americans that they--and not liberals--are the ones who embody morality, that an activist government is the problem rather than the solution and that good people don't ask for help.
The problem is that many Democrats, along with prominent liberal men in the media, don't view women's lives as part of the common good. Consciously or unconsciously, they have dismissed women as an "interest group" and treated women's struggle for equality as "identity politics" rather than part of a common national project. Last April Michael Tomasky, then editor of The American Prospect, penned an essay on the "common good" that is typical of such manifestoes. It never once addressed any aspect of the care crisis. Such writers don't seem to grasp that a campaign to end the care crisis could mobilize massive support for this idea of the common good, because it affects almost all working families.
Now that Democrats are emerging from the wilderness, there are scattered indications they are willing to use their power to address the mounting care crisis. The Congressional Caucus for Women's Issues, one of the largest caucuses, has access to Speaker Nancy Pelosi, who has supported previous efforts to address the care crisis. The Senate has just created a new Caucus on Children, Work and Family, a sign, says Valerie Young, a lobbyist with the National Association of Mothers' Centers, that "this is no longer a personal problem--it's a national problem." Connecticut Senator Chris Dodd says he will introduce legislation that would provide paid leave for workers who need to care for sick family members, newborns or newly adopted children. Senator Pat Roberts of Kansas has just introduced the Small Business Child Care Act, which would help employers provide childcare for their workers. Members in both houses of Congress are reopening the discussion of universal healthcare reform.
The truth is, we're living with the legacy of an unfinished gender revolution. Real equality for women, who increasingly work outside the home, requires that liberals place the care crisis at the core of their agenda and take back "family values" from the right. So far, no presidential candidate has made the care crisis a significant part of his or her political agenda. So it's up to us, the millions of Americans who experience the care crisis every day, to take every opportunity--through electoral campaigns and grassroots activism--to turn "the problem that has no name" into a household word.
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Sunday, February 25, 2007 - 20:31
SOURCE: Nation (3-12-07)
An old marketing adage states that no product exists whose sales cannot be improved by associating it with Abraham Lincoln. The same seems to be true in politics. As Congress debated resolutions condemning the escalation of the Iraq War, the remaining supporters of George W. Bush's Iraq policy invoked Lincoln to tar the war's opponents with the brush of treason. But this reflects a complete misunderstanding of Lincoln's record.
The latest example of the misuse of Lincoln came in a February 13 article in the Washington Times by conservative writer Frank Gaffney. Gaffney quoted Lincoln as declaring that wartime Congressmen who "damage morale and undermine the military" should be "exiled or hanged." Glenn Greenwald, on Salon, quickly pointed out that the "quote," which has circulated for the past few years in conservative circles, is a fabrication. (Conservative use of invented Lincoln statements is nothing new--Ronald Reagan used a series of them in a speech to the 1992 Republican National Convention. But today, when Lincoln's entire works are online and easily searchable, there is no possible excuse for invoking fraudulent quotations.)
Greenwald did not point out that Lincoln's record as a member of Congress during the Mexican War utterly refutes the conservative effort to appropriate his legacy. Lincoln was elected to the House of Representatives in 1846, shortly after President James Polk invaded Mexico when that country refused his demand to sell California to the United States. Polk falsely claimed that he was responding to a Mexican invasion.
Shortly before Lincoln's term in Congress began, he attended a speech in Lexington, Kentucky, by his political idol Senator Henry Clay. "This is no war of defense," Clay declared in a blistering attack on Polk, "but one of unnecessary and offensive aggression." A month later, Lincoln introduced a set of resolutions challenging Polk's contention that Mexico had shed American blood on American soil and voted for a statement, approved by the House, that declared the war "unnecessarily and unconstitutionally begun by the President."
Clay and Lincoln objected as strenuously as any member of Congress today to a war launched by a President on fabricated grounds. When Lincoln's law partner, William Herndon, defended the President's right to invade another country if he considered it threatening, Lincoln sent a devastating reply. Herndon, he claimed, would allow a President "to make war at pleasure. Study to see if you can fix any limit to his power in this respect.... If, to-day, he should choose to say he thinks it necessary to invade Canada, to prevent the British from invading us, how could you stop him?" The Constitution, he went on, gave the "war-making power" to Congress precisely to prevent Presidents from starting wars while "pretending...that the good of the people was the object."
Like Bush, Lincoln spoke of the United States as a beacon of liberty, an example to the world of the virtues of democracy. But he rejected the idea of American aggression in the name of freedom. He included in an 1859 speech a biting satire of "Young America," a group of writers and politicians who glorified territorial aggrandizement. Young America, he remarked, "owns a large part of the world, by right of possessing it; and all the rest by right of wanting it, and intending to have it.... He is a great friend of humanity; and his desire for land is not selfish, but merely an impulse to extend the area of freedom. He is very anxious to fight for the liberation of enslaved nations and colonies, provided, always, they have land." Substitute "oil" for "land" and the statement seems eerily relevant in the early twenty-first century.
Conservatives should think twice before invoking Lincoln's words, real or invented, in the cause of the Iraq War and before equating condemnations of Bush's policies and usurpations with treason.
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Sunday, February 25, 2007 - 20:30
SOURCE: WaPo (2-25-07)
At the center of Baghdad's neglected North Gate War Cemetery, near the edge of the old city walls, stands an imposing grave. Sheltered from the weather by a grandiose red sandstone cupola, it is the final resting place of a man from whom George W. Bush could have learned a great deal about the perils of intervening in Iraq.
Lt. Gen. Sir Frederick Stanley Maude was head of the British army in Mesopotamia when he marched into Baghdad on a hot, dusty day in March 1917. Soon thereafter, he issued the British government's "Proclamation to the People of Baghdad," which eerily foreshadowed sentiments that Bush and his administration would express 86 years later: British forces, Maude declared, had entered the city not as conquerors, but as liberators.
Maude had arrived in Baghdad after a long and arduous military campaign. British forces had been fighting the Ottoman army for 2 1/2 years and had suffered one of the worst defeats of World War I in the six-month siege of the eastern city of Kut, which had ended in an ignominious surrender to the Turks in April 1916.
Having rallied from that loss and finally reached Baghdad, Maude tried to create common cause between the British army and the city's residents, whom he saw as having been oppressed by 400 years of Ottoman rule. "Your lands have been subject to tyranny," he declared in his proclamation, and "your wealth has been stripped from you by unjust men and squandered." He promised that it was not "the wish of the British Government to impose upon you alien institutions." Instead, he called on residents to manage their own civil affairs "in collaboration with the political representatives of Great Britain."
Maude did not live to see the failure of his efforts to rally the people of Iraq to the British occupation. He died eight months later, having contracted cholera from a glass of milk.
After his death, British policy toward Iraq changed repeatedly as the army attempted to dominate the country and suppress the population, while the government strove to adjust to Britain's diminished role in the international system after World War I. Initially, the aim was simply to annex the territory and make it part of the Empire, run in a fashion similar to India. But Woodrow Wilson's Fourteen Points speech in January 1918 put an end to that idea. In setting out America's vision for the postwar world, Wilson expressly attacked the duplicitous diplomacy of European imperialism, which he blamed for dragging the world into prolonged military conflict.
This meant that a modern, self-determining state was now to be built in Iraq. Britain was to take the lead, but its effort was to be continually scrutinized by the League of Nations, which had been set up under Wilson's watchful eye at the Paris Peace Conference at the end of the war.
In an echo of what is happening under the U.S. occupation, hopes for a joint Anglo-Iraqi pact to rebuild the country were dashed by a violent uprising. On July 2, 1920, a revolt, or thawra, broke out along the lower Euphrates, fueled by popular resentment of Britain's heavy-handed behavior in Iraq. The British army had set about taxing the population to pay for the building of the Iraqi state, while British civil servants running the administration refused to consult Iraqi politicians, judging them too inexperienced to play a role in the new government.
The rebellion quickly spread across the south and center of the country. Faced with as many as 131,000 insurgents armed with 17,000 modern rifles left over from the war, the British army needed eight months to regain full control of Iraq; 2,000 British troops were killed, wounded or taken prisoner and 8,450 Iraqis were killed. To make matters worse, the British government was forced to pour troops back into Iraq, long after the end of the war, to stabilize the situation.
The revolt forced Britain to devolve real power to Iraqi politicians. At the head of this new administration the British placed a newly created king, Faisal ibn Hussein, famous for his association with Lawrence of Arabia during the war. But the revolt had as much influence in Britain as it did in Iraq itself. The "blood and treasure" expended in putting down the violence made the continued occupation extremely unpopular. The public's discontent reached its peak in the general election campaign of November 1922. The leader of the opposition, the Conservative Andrew Bonar Law, captured the national mood when he declared: "We cannot alone act as the policeman of the world."...
Posted on: Sunday, February 25, 2007 - 18:04
I'm in broad agreement with almost everything that has been said. What I'd like to do is complement it by looking at what one might call the big picture.
You asked in the title of this hearing, "Next Steps in the Israeli-Palestinian Peace Process." I shall argue three points: first, that the peace negotiations have so far been so counterproductive that they could better be called a war process; second, that their failure results from an Israeli conceptual error 15 years ago about the nature of warfare; and third, that the U.S. government should urge Jerusalem to forgo negotiations and instead return to its earlier policy of deterrence.
So first, Mr. Chairman, to review the peace process.
It is embarrassing to recall today the elation and expectations that accompanied the signing of the Oslo Accords in September 1993, when Prime Minister Yitzhak Rabin shook hands with Yasser Arafat. For some time after this, "The Handshake," as it was known, served as a symbol of brilliant diplomacy, whereby each side achieved what it most wanted -- dignity and autonomy for the Palestinians; recognition and security for the Israelis.
President Clinton lauded that deal as, quote, "a great occasion of history," unquote. Yasser Arafat called it, quote, "An historic event, inaugurating a new epoch," unquote. Shimon Peres, the prime minister of Israel, discerned in it, quote, "the outline of peace in the Middle East," unquote.
These heady expectations were then grievously disappointed. Before Oslo, when Palestinians still lived under Israeli control, they benefited from the rule of law and a growing economy independent of international welfare. They enjoyed functioning schools and hospitals; they traveled without checkpoints and had free access to Israeli territory. They even founded universities.
Terrorism was declining as acceptance of Israel increased. Then, however, came Oslo, which brought Palestinians not peace but tyranny, failed institutions, poverty, corruption, a death cult, suicide factories, and Islamist radicalization.
Yasser Arafat early on promised that the West Bank and Gaza would evolve into what he called, quote, "the Singapore of the Middle East," unquote, but the reality that he shaped became a nightmare of dependence, inhumanity and loathing.
As for the Israelis, for them Oslo brought unprecedented terrorism. If the two hands in the Rabin-Arafat handshake symbolize Oslo's early hopes, it is the two bloody hands of a young Palestinian male who had just lynched Israeli reservists in Ramallah in October 2000 that represented its dismal end.
Oslo provoked deep internal rifts and harmed Israel's standing internationally. Israelis watched helplessly as Palestinian rage spiraled upwards, spawning such moral perversions as the United Nations World Conference against Racism in Durban in 2001. That rage also re-opened among Westerners the issue of Israel's continued existence, especially on the hard left. From Israel's perspective, seven years of Oslo diplomacy undid 45 years' success in warfare.
Palestinians and Israelis agree on little, but they concur that Oslo was a disaster.
Now, why was it a disaster? Where did things go so badly wrong? Why did the peace process turn into a war process? Where lay the flaws in so promising an agreement?
Of its many errors -- and I think all analysts will agree there are many -- the ultimate mistake lay in Yitzhak Rabin's misunderstanding of how a war ends. It is revealed in the catchphrase he repeated again and again: "One does not make peace with one's friends. One makes peace with one's enemy."
The Israeli prime minister implied by this that wars are concluded through a mix of goodwill, conciliation, concessions, mediation, flexibility, restraint, generosity and compromise, all topped off with signatures on official documents. In this spirit, his government initiated an array of concessions, hoping that the Palestinians would reciprocate, but they did not. Those concessions, in fact, made matters worse.
Still in a war mode, Palestinians understood the Israeli efforts to "make peace" as signals, instead, of demoralization and of weakness. The concessions reduced Palestinian awe of Israel, made it appear vulnerable, and incited irredentist dreams of its annihilation. Each Oslo-negotiated gesture by Israel further exhilarated, radicalized, and mobilized the Palestinian body politic. The quiet hope of 1993 to eliminate Israel gained traction, becoming a deafening demand by the year 2000.
Rabin made a shattering mistake, which his successors then repeated. One does not in fact make peace with one's enemy; one makes peace with one's former enemy. Peace nearly always requires one side in a conflict to give up its goals by being defeated. Rather than vainly trying to close down a war through goodwill, the way to end a war, Mr. Chairman, is by winning it.
"War is an act of violence to compel the enemy to fulfill our will." That's what the Prussian strategist Carl von Clausewitz wrote in 1832. War is an act of violence to compel the enemy to fulfill our will. And however much technological advancement there's been in the nearly two centuries since he wrote that, the basic insight remains valid. Victory consists of imposing one's will on the enemy by compelling him to give up his war goals. Wars usually end when one side gives up its hope of winning; when its will to fight has been crushed.
Arabs and Israelis since 1948 have pursued static and binary goals. Arabs have fought to eliminate Israel; Israelis have fought to win their neighbors' acceptance. The details have varied over the decades, with multiple ideologies, strategies, leading actors and so forth, but the goals have barely changed. The Arabs have pursued their war aims with patience, determination and purpose. In response, Israelis sustained a formidable record of strategic vision and tactical brilliance in the period 1948 to 1993.
Over time, however, as Israel developed into a vibrant, modern, democratic country, its populace grew impatient with the humiliating, slow, tedious task of convincing Arabs to accept their political existence. By now, almost no one in Israel sees victory as the goal; no major political figure on the scene today calls for victory in war. Since 1993, in brief, Mr. Chairman, the Arabs have sought victory while Israelis have sought compromise.
It is my view that he who does not win loses. To survive, Israelis must eventually return to their pre-1993 policy of establishing that Israel is strong, tough and permanent -- the policy of deterrence: the long, boring, difficult, bitter and expensive task of convincing Palestinians and others that the Jewish state is permanent and that dreams of eliminating it are doomed.
This will not be quick or easy. Perceptions of Israel's weakness due to terrible missteps during the Oslo years and even after, such as the Gaza withdrawal of 2005, have sunk into Palestinian consciousness and will presumably require decades of effort to reverse. Nor will it be pretty. Defeat in war typically entails experiencing the bitter crucible of deprivation, failure and despair.
I look at this process, Mr. Chairman, through a simple prism. Any development that encourages Palestinians to think they can eliminate Israel is negative; any development that encourages them to give up that goal is positive. The Palestinians' defeat will be recognizable when, over a protracted period and with complete consistency, they prove that they have accepted Israel.
My third and final point: American policy.
Like all outsiders to the conflict, Americans face a stark choice. Do we endorse the Palestinian goal of eliminating Israel, or do we endorse the Israeli goal of winning its neighbors' acceptance?
To state this choice is to make clear that there is no choice—the first is offensive in intent; the second defensive. No decent person can endorse the Palestinians' goal of eliminating their neighbor, and along with every president since Harry S Truman and every congressional resolution and vote since then, the 110th Congress must continue to stand with Israel in its drive to win its acceptance.
Not only is this an obvious moral choice, but I think it is important to add that a Palestinian defeat at Israel's hands would actually be the best thing that ever happened to them. Compelling Palestinians finally to give up on their foul, irredentist dream would liberate them to focus on their own polity, economy, society and culture.
Palestinians need to experience the certitude of defeat to become a normal people—one where parents stop celebrating their children becoming suicide terrorists; where something matters beyond the evil obsession of anti-Zionist rejectionism. Americans especially need to understand Israel's predicament and help it win its war, for the U.S. government has, obviously, a vital role in this theater.
My analysis implies a radically different approach for the Bush administration, and for this Congress.
On the negative side, it implies that Palestinians must be led to understand that benefits will flow only after they prove their acceptance of Israel. Until then, no diplomacy, no discussion of final status, no recognition as a state and certainly no financial aid or weapons.
On the positive side, the administration and Congress should work with Israel, the Arab states and others to induce the Palestinians to accept Israel's existence by convincing them that the jig is up -- that they have lost.
Diplomacy aiming to shut down the Arab-Israeli conflict is premature until Palestinians give up their hideous anti-Zionist obsession. When that moment arrives, negotiations can re-open with the issues of the 1990s—borders, resources, armaments, sanctities, residential rights—taken up anew. But that moment is years or decades away. In the meantime, a war needs to be won.
Posted on: Sunday, February 25, 2007 - 17:24
SOURCE: Open Democracy (2-19-07)
Jan Morris spoke for many around the world in a piece in the Guardian on 14 February 2007 in which she admitted to disenchantment with what the United States has become. "[The] missionary instinct", she wrote, "which impelled Americans into so many noble policies, was to be perverted by power". And even, "[far] from being the most beloved country on earth, today the US is the most thoroughly detested".
No doubt she exaggerated there: think of North Korea. And she was wrong to trace American exceptionalism back to Abraham Lincoln and his belief that America was "the last best hope": exceptionalism goes back much farther than that. But she has put her finger on something that puzzles and angers many Americans and distresses those of us who have loved what we thought America stood for.
There are, I think, two points to be made:
America has changed, hardened and mobilised since the cold war, and even more since the collapse of the Soviet Union and the coming of the age of terror on 11 September 2001;
America was always a tougher, more ruthless society than American patriots wanted to admit, less innocent, less free from the common failings of mankind than American exceptionalists claimed.
The path to militarisation
Paul Nitze, arch-cold warrior, and incidentally the mentor of Paul Wolfowitz, drafted NSC-68 in 1950, the call to arms that persuaded the Truman administration to quadruple defence expenditure and gird itself, in Dean Acheson's image, to stand at Armageddon and do battle against communism for the Lord.
What is less well known is that, in NSC-68, Nitze wrote "the resort to force, to compulsion, to the imposition of its will is therefore a difficult and dangerous act for a free society, which is warranted only in the face of even greater dangers. The necessity of the act must be clear and compelling; the act must commend itself to the overwhelming majority as an inescapable exception to the basic idea of freedom." Measure the distance between the arch-hawk's reluctant formulation, and the triumphalist acceptance in 2003 of America's right to start a pre-emptive, or preventive, or even a merely punitive war in Iraq.
That is not the only example of a certain coarsening of American attitudes in international relations. Woodrow Wilson, revered as a prophet by both the Clinton and George W Bush administrations, hesitated to commit America to war for months and even years after many of his friends and allies thought America's entry into the great war was inevitable. "There is such a thing", he pronounced in a speech after the torpedoing of the Lusitania by a German U-boat in 1915, "as being too proud to fight".
Two days before the second world war broke out in Europe, President Franklin D Roosevelt lectured the belligerents about the "inhuman barbarism" of "the ruthless bombing from the air of civilians". To be sure, British bombers had already bombed villages in Iraq, and would destroy Berlin, Hamburg and Dresden from the air. German dive-bombers had already flattened Guernica, and would blitz Rotterdam and London, while the Japanese had indiscriminately bombed civilians in Shanghai and elsewhere. Yet it was Roosevelt who set in motion the machinery that led to the United States killing 100,000 civilians in one night in Tokyo, and then (just after his death) dropping atomic bombs on Hiroshima and Nagasaki.
Before the cold war, the United States remained an essentially civilian society, albeit one that had raised huge armies to fight in the civil war and in the first and second world wars. But the cold war militarised American society in myriad ways. There was the draft and the GI Bill, one of the most benevolent measures in American history, but one justified by war; for the first time the federal government subsidised universities through "defence" programmes. Intellectuals were terrified by what came to be known as McCarthyism, a wave of infringements of civil liberties ostensibly justified by fear of communism, but in practice often used to suppress radical or progressive or just unpopular thought.
There was the rise of what President Dwight Eisenhower, a Republican military hero, warned his countrymen against: the "military-industrial complex". Interestingly, in the penultimate draft of that famous 1961 speech, Eisenhower warned against a "military-industrial-congressional" complex; political advisers persuaded him that to say that might upset members of Congress.
Interstate highways were defence highways, and gigantic investments in aviation, electronics and finally in computer and IT technology were justified on grounds of national security and funded by the Pentagon. Congressmen and senators fought like cats in a sack to bring defence plants, military and air bases and space stations to their districts and states. Even before "homeland security", a growing proportion of American workers need security passes to get into their places of work.
An old lady in Washington once told me that in the 1920s, if it began to rain when her father was driving along Pennsylvania Avenue, he would drive in through the White House gates, pull up under the marquee outside the main entrance, get out and put up the hood. If you did that today, or at any time since the Kennedy assassination, you would be shot.
America is no longer a profoundly civilian society occasionally goaded into war, which is the way Americans used to see their country. It is a profoundly militarised society, one faction of whose leaders openly proclaim that they intend to maintain military supremacy over all comers for the 21st century.
Yet this militarisation is not just a horrible aberration provoked by outrage at al-Qaida's attacks on New York and Washington, or by the machinations of a few dozen neo-conservatives. There have always been two aspects of American society. As with the Roman god Janus, one face was for peace, the other for war.
America's two faces
Robert Kagan, he who so misunderstood European history as to say that Europeans were from Venus, not from Mars, has just published a remarkable history of the United States entitled Dangerous Nation. You do not have to agree with everything Kagan says to agree that there has always been an aggressive and bellicose strand in the American political personality.
The Americans did not conquer the lion's share of the north American continent in a fit of absent-mindedness, nor was it a question, as Israeli apologists like to say, of a people without land filling a land without people. From the Pequot war of 1636-37, when the godly folk of New England burned several hundred inhabitants of a native American village alive and celebrated the event with a thanksgiving, white Americans pushed the Indians relentlessly until the last tribes were conquered.
In the 1840s, while the halls of Congress echoed to cries of "Fifty-four forty or fight!" from politicos who wanted a third war with Britain to annex Canada, President James K Polk took advantage of a border incident to declare an aggressive war on Mexico. After an easy victory over an impoverished and divided Mexican republic, New York newspapers screamed for "all Mexico". But John C Calhoun warned against annexing so many non-white people, and the "empire of liberty" settled for only Texas, New Mexico, Arizona, Nevada, part of Colorado and California.
The great theme of Henry James's novels was the contrast between the "innocence" of his American heroines and the corrupt Europeans they wanted to marry. But the fathers and grandfathers of those sensitive maidens had expropriated the Indians, dismembered Mexico and slaughtered the buffalo. They were the robber barons of the gilded age. They had many virtues, including brains, courage and energy. But innocent they were not.
Perhaps the best way of understanding what has happened in the United States over the past thirty years is that one side of a dualist tradition has come to predominate. Americans were always a contradictory people: godly and dangerous, peaceful and warlike, deeply convinced that their republican constitution, dedicated to the sovereignty of the people and the rule of law, was the "last best hope of earth", and yet contemptuous of foreigners and quick to seize whatever they wanted.
They were idealists. But they were also realists. As long as they needed allies, against the Axis and against communism, they were constrained to muffle their universal ambitions. Then, abruptly, the Soviet empire dissolved. There seemed no reason to restrain any longer their ancient contempt for Europe. After all, most Americans are descended from ancestors who left Europe because they had had a bad time there. And after the atrocities of 9/11 there seemed no need to bother overmuch with what Thomas Jefferson called "a decent respect to the opinions of mankind".
For a time, the harsh face of America appeared. The present attorney-general is a man who considers the Geneva conventions "quaint". The vice-president appears to see nothing wrong with the imposition of what is indistinguishable from torture. American agents kidnap Muslims and drag them off to sinister prisons. The president analyses the world into the conflict between "good guys" and "bad guys", as if the world of men were a school playground.
Jan Morris is wrong. The United States is much, much better than the unworthy government its people have twice - or at any rate once! - elected. Nor, on the other hand, will everything change overnight when President George W Bush leaves office. The United States will remain an indispensable, but also a dangerous, nation. As Jefferson wrote to John Adams near the end of their lives, the United States will go on "prospering and puzzled as before". And we should go on, as before, forever grateful for what the United States has done for us in so many ways, but forever wary of what it can become when idealism sours into bullying.
Posted on: Friday, February 23, 2007 - 19:54
SOURCE: New York Post (2-23-07)
HEY, the Africans are trying to impose their culture on us!
That's what Episcopalians in the United States are saying about last week's summit in Tanzania, where global Anglican leaders urged Americans to bar homosexuals from becoming bishops and to stop blessing same-sex unions. As The New York Times reported, Episcopalians condemned "meddling" foreigners for "imposing their culture and theological interpretations on the American church."
In a sense, the Americans are right. Episcopalians in this country shade to the left, in theology as well as politics, while their African brethren tend to be more conservative. So it's not surprising that the African leaders would oppose gay marriage or that they'd demand that the entire Anglican communion do the same.
What is surprising, in light of history, is that the Africans are imposing on the West, not the other way around.
For nearly 500 years, Christians from Europe and the Americas tried to foist their own language, culture and religion upon Africa. Now the tables have turned.
To understand why, we need to return to the era immediately following World War Two. As anti-colonial movements swept Africa, sympathetic Western missionaries began to question the arrogant and ethnocentric assumptions that had marked so much Christian effort on the continent.
Decrying prior campaigns to "civilize" the Africans, liberals from the West substituted the language of culture. Every people had a culture, the argument went; no culture was inherently better or worse than another; hence Westerners should take special care to respect and even defend the cultures they encountered in Africa.
But how could you preserve African culture, even as you converted Africans to your own religion? For some missionaries, the answer lay in new syncretic forms of worship that fused indigenous traditions to Christian doctrine. For many Western liberals, however, the rise of the culture concept cast the entire missionary endeavor into doubt.
"We questioned what right we have to intervene in the education of people of another culture and what our motives are in desiring to intervene," wrote two American missionaries, in a typical statement."Do we want to 'domesticate' the people in one way or another, make them like us, convince them to adopt our culture?" The question contained its own answer.
To shed their ethnocentric baggage, indeed, liberal Americans increasingly abandoned the term "missionary" itself. One mission renamed its project "overseas service"; other missionaries simply called themselves volunteers, echoing the Peace Corps and other secular agencies. "The very word 'missionary' calls up notions of superiority," explained one American.
And in an era of culture, that was the one thing nobody wanted to be.
Into this breach stepped a confident new generation of conservative missionaries, seeking to convert new souls to Christ. Conversant with African history and traditions, they did their best to couch their message in culturally appropriate terms. But they never wavered from the message itself: Jesus was Lord, Scripture was literal Truth, and anyone who believed otherwise was destined for hell.
Today, nine of 10 Westerners who call themselves "missionaries" hail from a conservative or evangelical church. And they have done their job well. That's why African Christians stand so far to the right of their brethren in the West on a host of religious and cultural questions: abortion, gay rights, the ordination of female priests and more.
And that's why they're starting to evangelize us, to the chagrin of many Americans.
The battle inside the Anglican Communion is only the first of many struggles that we can expect in the next few years, pitting Third World conservatives against liberals in the West.
For almost half a millennium, Christians from the West told the rest of the globe how to think, behave and believe. Now, for the first time, we're getting a taste of our own medicine. For liberals, especially, it might be a very bitter pill to swallow.
Posted on: Friday, February 23, 2007 - 19:03
SOURCE: Salon (2-23-07)
Tony Blair's announcement that Britain would withdraw 1,600 troops from southern Iraq by May, and aim for further significant withdrawals by the end of 2007, drew praise from U.S. Vice President Dick Cheney. "What I see," said Cheney, "is an affirmation of the fact that there are parts of Iraq where things are going pretty well."
In reality, southern Iraq is a quagmire that has defeated all British efforts to impose order, and Blair was pressed by his military commanders to get out altogether -- and quickly. The departure has only been slowed, for the moment, by the pleas of Bush administration officials like Cheney. And far from the disingenuously upbeat prognosis offered by the vice president, the British withdrawal could spell severe trouble for both the Iraqi government and for U.S. troops in that country.
The British helped provide the security that allowed private supply convoys bearing fuel, food and ammunition to travel from Kuwait up through Shiite-held territory to the U.S. military's forward operating bases in and around Baghdad and in Anbar province. Col. Pat Lang, a retired senior officer with the Defense Intelligence Agency, has pointed out that if Shiite militias began attacking those trucks, American troops in the center-north of the country would become sitting ducks for the Sunni Arab guerrillas.
The other danger posed by the British withdrawal is to Iraq's economy. The southern port city of Basra is the country's primary economic window on the world. The 1.6 million barrels a day of petroleum that Iraq managed to produce in January all went out through Basra. The pipeline that used to take Iraqi exports from the northern oil city of Kirkuk to Ceyhan on Turkey's Mediterranean coast has been subject to constant sabotage. The Iraqi state depends for its survival on the revenue from Basra's exports. As it is, militias have been charged with siphoning off $2 billion a year in petroleum revenues through smuggling operations. Were the central government to lose control of even more of those revenues, it could be starved to death.
And the danger is imminent. Although it is often alleged that Basra is relatively calm because it lacks Sunnis, neither claim is true. Though heavily Shiite, Basra also has tens of thousands of Sunnis -- even, perhaps, the odd al-Qaida operative. Sunni spokesmen such as Qatar-based Yusuf al-Qaradawi maintain that thousands of Sunnis have been driven out of the city and that a hundred Sunni mosques have been confiscated by Shiites. The Shiite-dominated Iraqi government denies these allegations.
And Basra is certainly not calm. ...
Posted on: Friday, February 23, 2007 - 14:17
SOURCE: New Republic (2-19-07)
Anyone familiar with academic culture might have guessed that Lawrence Summers's tenure as president of Harvard University would be rocky. It was not that he came to the university from the power centers of the federal government and was unaccustomed to the expectations and sensitivities of faculty members. It was that he was trained as an economist and spent his formative years in the Harvard Economics Department. And, as most academics will tell you, economists tend to think that they're smarter than everybody else, can find the answer to any important question, and don't need to listen carefully to other opinions. Pity the poor fellow who must present research to an economics department seminar: He can hardly get a word in edgewise.
In selecting Drew Gilpin Faust to succeed Summers as the university's president, the Harvard Corporation and Board of Overseers have moved in a different direction. Faust is a distinguished historian of the American South and the Civil War era. What, if anything, might that mean for Harvard and her presidency?
Historians can be as arrogant and tone-deaf as any people who claim intellectual authority, but the nature of their work disposes them to be otherwise. Although historians pose large questions, they are skeptical of easy answers. Although they like to bring order out of apparent chaos, they quickly recognize the complexity of human undertakings. Although they seek to recover something of the past, they soon discover how much digging that requires. They come to learn that historical writing and historical experience involve conflicting perspectives and that they need to confront viewpoints different than their own. Historians have to be prepared to follow unexpected leads and uncharted paths. And they must develop skills (and patience) to hear and understand what their subjects are trying to tell them. It is all a very humbling process.
Faust has been one of America's most productive and influential historians (despite directing the Radcliffe Institute and participating in a months-long presidential search, she has just sent a new book off to press), yet two characteristics of her scholarship seem particularly relevant to her new position as Harvard's president. One is her ability to engage, with depth and sensitivity, people and problems that can be distasteful. The other is her intellectual curiosity and growth.
Faust was born in Virginia during the waning years of Jim Crow, and she grew up in Harry Byrd's Clarke County--one of the centers of "massive resistance" to desegregation. But, even as a young girl, she wondered why black people weren't "given a chance" and began to "confront the paradox of being both a southerner and an American." That is to say, she was something of a rebel and eventually involved herself in civil rights and antiwar activism. By the time she enrolled in graduate school at the University of Pennsylvania and commenced a doctoral dissertation, she was looking for new ways to come to terms with her past. Rather than focus upon those who resisted the regimes of slavery and white supremacy, she chose to study those who were central to their creation and sustenance.
Throughout her scholarly career, Faust has written chiefly about men and women who owned slaves, defended slavery, and offered the slave South and the Confederacy cultural and intellectual support. They included writers, ministers, politicians, pro-slavery theorists, Confederate nationalists, and elite women who took charge of the homefront when their husbands and sons went off to war. They were not exactly an admirable lot, and Faust is not especially admiring. But she takes their thought and activities very seriously and tries to understand how they simultaneously reflected and shaped the world in which they lived--and how they attempted to make "sense out of a world that seems to us ... to make very little sense at all." She was drawn to their projects and dilemmas and to their efforts--amid increasingly tragic circumstances--to explain themselves. Her ear, as a consequence, has become finely tuned....
Posted on: Thursday, February 22, 2007 - 23:27
SOURCE: New Republic (2-22-07)
In October 2005, things looked grim for the Bush White House. The president was reeling from the serial disasters of Hurricane Katrina, the botched Harriet Miers nomination, and escalating chaos in Iraq. Perhaps worst of all, U.S. Attorney Patrick Fitzgerald's inquiry into "Plamegate"--the blowing of CIA agent Valerie Plame's cover--seemed poised to bear fruit. Left-wing bloggers giddily anticipated a "Fitzmas" day when Karl Rove might be "frog-marched out of the White House," as Plame's husband, Joseph C. Wilson IV, so memorably put it.
As it happened, Fitzgerald soon indicted not Rove but I. Lewis "Scooter" Libby, then Vice President Dick Cheney's chief of staff, on perjury and related charges. Despite some sadness that Rove eluded the prosecutor, left-liberal precincts continued to brim with kudos for Fitzgerald and with scorn for the targets of his probe and for the big-name journalists--notably columnist Bob Novak and The New York Times' Judith Miller--caught up in its snare.
Now, as the saga nears its end, we can begin to tally its costs and benefits. While the Bush administration remains intact (barely) and Rove is still ensconced in the West Wing, Libby seems likely to face some penalty for failing to come clean about his conversations with reporters--if not a guilty verdict, then the loss of his job and reputation and someday an obituary that will describe him as one of the few pelts that liberals were able to nail to the wall in the otherwise dismal years of the Bush presidency.
There are many emotional satisfactions to be gained from Libby's plight. It's easy to relish the thought of this administration, whose dirty politics went too long unpunished, finally being held accountable. And, given the subtext of the affair--a debate over the casus belli of a now widely reviled war--the invasion's opponents have naturally found vindication in the pursuit first of Rove and now of Libby. Indeed, on a cosmic level, this comeuppance is deserved. It's contemptible for the White House to have unmasked a CIA officer--and the Republicans' decades-long demagoguery on the security issue should now be seen for the hypocrisy and opportunism that it has always been.
But I think my fellow liberals, partaking in some hypocrisy of their own, have failed to grasp the true toll of this inquisition. We're supposed to be champions of the First Amendment and foes of overzealous prosecutors. For most of the postwar era, we were the ones who demanded greater exposure of government secrets, sharper skepticism about blanket claims of "national security," and stronger support for reporters against the assaults of the organized right. In keeping with those convictions, we should have protested this overwrought case from the start. In fact, applauding it actually benefits the Bush administration--and future regimes of its ilk--by further sanctifying secrecy and demonizing the press. ...
Posted on: Thursday, February 22, 2007 - 23:25
SOURCE: Nation (2-22-07)
Palestinian Authority President Mahmoud Abbas and Hamas leader Khaled Meshal went to Mecca after more than 130 people had been killed, hundreds wounded and a burning sense of urgency had taken over the Palestinian streets. Gazans had been confined to their homes for days, this time not as the result of an Israeli curfew or aerial attack but rather from fear of getting caught in the crossfire. Palestinian society, it seemed, was on the brink of civil war.
The pressure from below had an impact, and within two days Fatah and Hamas leaders, through the mediation of Saudi Arabia's King Abdullah, succeeded in accomplishing what they had not managed to achieve during the preceding year: a unity government. Thousands of Palestinians spontaneously filled the streets of Gaza to celebrate what seemed to be a historic moment.
As I write, the clashes have indeed subsided, but it is unclear whether the calm will persist. On the one hand, the agreement's fine points have yet to be hammered out, and the devil is often in the details. On the other hand, what lies ahead does not depend solely on goodwill between Fatah and Hamas.
Most pundits have understood the sectarian clashes as either a struggle over who will control the Palestinian government and resources or as a local manifestation of a much broader international conflict between fundamentalist and secular forces in the Islamic world. Such interpretations, however, have obscured the central role Israel and the United States have played.
Even before Hamas won the January 2006 democratic elections, after which the Quartet of Middle East mediators (the United States, the European Union, Russia and the United Nations) cut off aid to the Palestinians, Israel's closure of the territories had triggered an economic crisis. More than 60 percent of the Palestinian inhabitants were living under the international poverty line of $2 a day, with the World Bank reporting that 9 percent of children were suffering from acute malnutrition. Considering that the financial aid provided to the Palestinians had amounted to almost one-third of the per capita gross national income in the territories, the imposition of economic sanctions has been catastrophic.
The closure and sanctions have not only wreaked havoc on the Palestinian economy but have also helped precipitate the violent clashes among the factions. Indeed, the idea behind the sanctions, which both Israel and the United States have pressured other countries to enforce, is to shape power relations within Palestinian society by adopting a scheme that, for clarity's sake, one could call the Somalia Plan.
For months now the Palestinian Authority has been unable to pay the salaries of its 160,000 employees. These workers provide the livelihoods of more than 1 million people--almost a third of the population. Some 70,000 work for the numerous security organizations, most of which are linked to political factions. Like those employed by civil institutions, such as the education and health ministries, they are deeply angry because they cannot feed their families. But unlike the civilian workers, they are armed. Under these conditions, it is not surprising that a power struggle has erupted. Inadequate resources, economic sanctions, thousands of armed men in distress and foreign support of certain factions are, after all, the ingredients from which warlordism, à la Somalia, is made. Insofar as this is the case, success of the unity government is contingent on ending the economic sanctions.
But here comes the snag: Israel's unilateralist government is not really interested in a negotiating partner and sees Palestinian unity as a threat. Even before the Palestinian leaders returned home, Israel launched a well-orchestrated diplomatic campaign to persuade the Quartet to uphold sanctions. Simultaneously, Prime Minister Ehud Olmert declared that any future negotiations would be informed by a "three no" approach: no to dividing Jerusalem, no to a withdrawal to the 1967 borders and no to a solution to the Palestinian refugee problem. Thus, Israel is unwilling to discuss any of the contentious issues that need to be resolved.
The unfolding events also set the stage for making the February 19 Jerusalem summit inconsequential. Nonetheless, Secretary of State Condoleezza Rice did make some significant statements during her Middle East visit. It is unlikely that a Palestinian state will be created before President Bush ends his term, she told the Palestinian newspaper Al Ayam. And to the Israeli daily Ha'aretz she added that the Palestinian unity government will not be recognized--suggesting that sanctions would continue--until it abides by Israel's three conditions: that the Palestinian government recognize Israel, renounce violence and ratify the Oslo agreements and the road map. While these demands are in many respects legitimate, they could readily be part of negotiations rather than a weapon used to precipitate internal violence and thus destroy Palestinian society.
Ironically, Israel's position is inimical to its own interests. If the internal Palestinian clashes resume and develop into a full-blown civil war, there will be no hope of resolving the conflict between Israel and the Palestinians. One has to be extremely shortsighted not to see how the absence of a united Palestinian leadership will undermine all efforts to bring about local and regional peace.
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Thursday, February 22, 2007 - 18:11
SOURCE: Boston Globe (2-18-07)
Pity the overschooled old maid and the lonely career woman. Highly educated or high-achieving women are less likely to marry and have children than other women. If they do marry, they are more likely to divorce. Even if they don't divorce, their marriages will be less happy. And, oh, yes, they'll be sexually frustrated, too.
These maxims, widely accepted for at least two centuries, are bad news for a state so focused on brainy pursuits. Thirty-five percent of Massachusetts women 25 and older have a bachelor's degree or more, a level of educational attainment almost 10 points higher than the national average. So perhaps it follows that 28 percent of women in the state have never been married. Massachusetts's proportion of never-married females is the third highest in the nation, topped only by the District of Columbia and the state of New York. But are these women really educating themselves out of the marriage market? If a woman reads Proust or computes calculus, is she unable to attract a mate?
Conventional wisdom says the answer to both questions is yes. But a close look at the historical transformation of marriage in America suggests that educated women now have a surprising advantage when it comes to matrimony.
WHEN I WAS IN THE FIFTH GRADE IN 1954, my teacher pulled me aside after a class party to give me some friendly advice. "Stephanie," he said, "the boys would like you more if you didn't use such big words." I still remember his exact words, because they came as such a shock. Until that moment, it had never occurred to me that the boys might not like me. My teacher's advice didn't stop me from using big words or aspiring to academic success. I entered the citywide spelling bee that spring and was more upset by coming in second than I had been by my teacher's warning. But while my disappointment at losing the spelling bee quickly faded, the teacher's words stuck in my head. For the next 20 years, I believed that the things I most liked to do and most wanted to be made me less attractive to men.
I certainly wasn't the first girl to grow up thinking that aspiring to higher education or a fulfilling career meant jeopardizing her chance of marriage, motherhood, and personal happiness. As early as 1778, according to Harvard University historian Nancy F. Cott, author of the 2000 book Public Vows: A History of Marriage and the Nation, Abigail Adams complained to her husband, John, about the fashion of ridiculing female learning. In 1838, a prominent marriage adviser labeled intellectual women "mental hermaphrodites," less capable of loving a man or bearing a child than a "true" woman. In 1873, Dr. Edward H. Clarke, a prominent professor at Harvard Medical School, noted that the rigors of higher education diverted blood from a woman's uterus to her brain, making her irritable and infertile. Women who pursued careers, he warned, had little chance of marrying and even less chance of bearing a healthy child. Early in the next century, another doctor asserted that when women saw themselves as competent in school or at work, they acquired a "self-assertive, independent character, which renders it impossible to love, honor, and obey." In consequence, he complained, middle- and upper-class males were forced to remain single or dip into the lower classes to find an "uneducated wife" who would not scorn to perform the duties of her sex....
THE MYTH OF THE BITTER, sexually unsatisfied female college graduate has never been true. Surveys from the 1890s to the present reveal that college-educated women have always been at least as satisfied with their emotional and sexual lives as their less-educated counterparts. But until recently, it was true that women who completed the highest levels of education or landed high-status, high-paying jobs were less likely than other women to marry and have children. They were often perfectly happy with their choices, but the fact remains that many women did have to choose between family life and achievement in the public sphere.
One reason for this was that men of the past were more interested in marrying someone who would cook or clean for them than in an intellectual equal. ...
Posted on: Wednesday, February 21, 2007 - 00:39
We who began "sighting" religion in American public life a half-century ago had to open a file on "religion and presidential candidates" when "Ike" ran against "Adlai." Presidents Roosevelt and Truman were very religious, but Roosevelt's form of mainstream Protestantism was seen as inoffensive, and Truman disdained the "use" of religion in political contention. Then came Adlai Stevenson, who was controversial because he was a Unitarian -- and, of course, he was utterly dismissed by religious conservatives (pre-Reagan) because he had been divorced. Dwight Eisenhower ushered in the new era with what a critic called his "very fervent faith in a very vague religion." In 1960 religion came to the fore in the Kennedy-as-Catholic campaign, and it has stayed there ever since. One has to marvel at the naiveté or historical short-sightedness of communicators and analysts today who think that controversial religious identifications among candidates are new. The cast of characters changes; the stage is the same.
So the files bulge fatly and prematurely in this too-long campaign. Not a single candidate is discussed apart from her or his religious commitments. We can save comment on the other candidates for the seasons between now and November 2008. First off, meanwhile, we have the Mormon context and involvement of new candidate Mitt Romney of Massachusetts. There is no hiding his Latter-day Saint identification, nor does he try to hide it. He did his three-year missionary stint, and is by all signs wholly engaged with his faith community. And that is controversial. We are told that though he is working his way into acceptability on hot-button issues among religious conservatives, they have more reservations than do other Americans about his being a Mormon.
Such reservations were better described as suspicion and hatred in the 1840s and for the century that followed. Almost nothing galvanized publics in that century more than the perceived threat posed by the Mormons. The practice of Church-sanctioned polygamy shocked non-Mormons in a time before the public accepted our serial polygamy of celebrities, stars, and sometimes neighbors. In case no one has noticed: except among fringe groups, all that is past. That Latter-day Saint theology differs radically from Christian orthodoxies is also obvious. That Mormons, especially in Utah, tend to be rather homogeneous in their expressions of right-wing Americanism is evident in polls and outcomes.
So we line things up. First, there is the U.S. Constitution, with its "no religious tests" clause. That settles things legally -- but it is in the ethos, mores, prejudices, and preferences of the public that this all has to be fought out. My own favorite among commentaries by Mormons is "Mormon President? No Problem: Have Faith," by Richard Lyman Bushman in the New Republic. Bushman is as notable and fair-minded an American historian as we have, and has written perceptively about Mormon history. His reassurances are well grounded. "Beliefs do matter. Romney's values, as he has said repeatedly, come from his Mormonism. But teasing out possible implications of theological positions can verge on fantasy. We should ask Romney what he believes, but it would be wrong to predict his future course."
In short: Let's keep the inquiry cool.
"Mormon President? No Problem: Have Faith," by Richard Lyman Bushman (New Republic, January 29, 2007), can be read by subscribers online at:
Posted on: Monday, February 19, 2007 - 14:05
SOURCE: TomDispatch.com (2-13-07)
"…the finest Secretary of Defense this nation has ever had." -- Vice President Dick Cheney
"The past was not predictable when it started." -- Donald Rumsfeld
On a farewell flight to Baghdad in early December 2006, the departing Secretary of Defense reminisced about his start in politics more than forty years before. Aides leaned in to listen intently, but came away with no memorable revelations. It hardly mattered. As usual with this man who dominated government as no cabinet officer before him -- including the power-ravenous Henry Kissinger he so despised and outdid in effect, if not celebrity -- authentic history and Don Rumsfeld's version of it bore little resemblance to each other.
There was portent in those beginnings. He came out of an affluent Chicago suburb in the 1950s with brusque confidence and usable contacts at Princeton, among them Frank Carlucci, a future Defense Secretary of mediocre mind yet possessed of the iron conceit and shrewd fealty far more effectual in government than intellect or sensibility. After college and two years as a Navy pilot, Rumsfeld did politic stints as a Capitol Hill intern and Republican campaign aide, and by twenty-nine, back in Chicago in investment banking, was running for Congress.
As with much to come, a darker thread lay beneath the surface from the start. In a Republican primary tantamount to election, he was outwardly the boyish, speak-no-evil, underfunded, underdog challenger of an old party stalwart set to inherit the open seat. In fact, he was generously financed by wealthy friends, while his operatives -- including Jeb Stuart Magruder of later Watergate infamy -- furtively harried and smeared his opponent, using tactics never traced to Rumsfeld.
He went to Washington in December 1962 a handsome, square-jawed, safe-seat tribune from the North Shore's lakeside preserves, epitomized by the leafy estates of Winnetka and high-end Evanston. The old Thirteenth District of Illinois was one of the wealthiest in the nation and had been smoothly in Republican grip for most of a century. In the House, Rumsfeld was soon seen by some as he always saw himself -- a prodigy in the dull ranks of his Party.
Then, as afterward, he had no authentic qualifications or independent achievements. But that was always masked by the same muscular, aggressive style he took onto the mat as an Ivy League wrestler -- "sharp elbows," a meeker, envious colleague called it -- as well as by the flaccid banality of most of the GOP in the 1960s. The Republican Party Rumsfeld strode into was already caught between the wasting death of Eisenhower worldliness and moderation (with Richard Nixon's haunted succession in the wings) and a fitful right-wing urge to seize control that, in little more than a decade, would deliver the Reagan Reaction.
Rumsfeld's own rightist mentality, his New Deal-phobic corporatist cant and Cold War chauvinism, came dressed more in modish vigor than telltale substance -- and he was already attracted by a tough-minded layman's zeal for the era's pre-micro-processing but grandly prospering military technology. Like most of his generation born in the early 1930s, the scrap-drive, victory-bond children of World War II who came to govern the postwar world and would be the decisive elders of the post-9/11 era, he had no doubt about the natural nobility of America's sway or the invincibility of its arms; all this was made ever sleeker, ever more irresistible by the demonstrable twin deities of American capitalism -- technology and "modern" management.
That, after all, was the unquestioned, unquestioning faith of North Shore fathers and other successes like them across the nation. That was the world, according to postwar Princeton, as well as Harvard Business School. That was the supposed genius of future Secretary of Defense Robert McNamara's duly quantified Ford Motor Company as well as his Vietnam-era "systems analysis" Pentagon, and so much more.
In the early 1960s, that received world ended just beyond the suites and suburbs. Given America's moral and material omnipotence, its exemplary excellence (so evident on the North Shore), the remainder of the planet required no particular exploration, knowledge, or historical-political understanding, nor did such men need to have the slightest recognition of America's own non-mythologized past. Alert decision-makers, busy with the numbered bottom-line results, had no time for such "academic" ephemera.
When money or force needed to be applied to Asians, Arabs, Latins, or Africans, a crisp briefing by some underling who had read the necessary memos would always do. Caught up as we all have been in Rumsfeld's kinetic, churlish descent into the bloody chaos of his Iraq, it has been easy to neglect how richly cultural it all was from the beginning -- America's haunted half-century of vast might and presumption set beside our still vaster ignorance and irresponsibility. It was in 1963, during Don Rumsfeld's first months in Congress, that the Iraqi Ba'ath Party -- since 1959 recruited, funded, marshaled and directed by the CIA, and trailing a twenty-six-year-old Tikriti street thug named Saddam Hussein (himself a CIA-paid assassin) along with lists of hundreds of left-leaning Iraqi political figures and professionals to be murdered after the coup -- seized power in Baghdad.
On Capitol Hill, the spirited young Republican legislator was then absorbed in exhilarating new appropriations in aeronautics and weaponry. His trademark clipped fervor and biting sarcasm in questions and speeches already held a hint of the Pentagon E-Ring canon four decades later: the superpower military as classic wrestler -- lithe, superbly equipped, swift to pin a dazed foe, dominant beyond doubt, and with garlands all around. It was only a matter -- he began to learn early from helpful briefings and testimony by military-industrial executives -- of making the commanders (the branch managers, after all) change their sluggish old ways. The by-word would be: Procure to prevail. So superior was new technology and the management that went with it that it scarcely mattered who the competitor might be. In those long-gone days, in obscure Washington hearings unheard, in colloquies before empty chambers, there were the first faint drums of distant disaster in the Hindu Kush, Mesopotamia, and beyond.
Of course, in the 1960s, Rumsfeld's ardor for a high-tech military was only stirring, a minor dalliance compared to his preoccupation with advancement. While few seemed to notice, the brash freshman made an extraordinary rush at the lumbering House. In 1964, before the end of his first term, he captained a revolt against GOP Leader Charles Halleck, a Dwight D. Eisenhower loyalist prone to bipartisanship and skepticism of both Pentagon budgets and foreign intervention. By only six votes in the Republican Caucus, Rumsfeld managed to replace the folksy Indianan with Michigan's Gerald Ford.
In the inner politics of the House, the likeable, agreeable, unoriginal Ford was always more right-wing than his benign post-Nixon, and now posthumous, presidential image would have it. Richard Nixon called Ford "a wink and a nod guy," whose artlessness and integrity left him no real match for the steelier, more cunning figures around him. To push Ford was one of those darting Capitol Hill insider moves that seemed, at the time, to win Rumsfeld only limited, parochial prizes -- choice committee seats, a rung on the leadership ladder, useful allies.
Taken with Rumsfeld's burly style that year was Kansas Congressman Robert Ellsworth, a wheat-field small-town lawyer of decidedly modest gifts but outsized ambitions and close connections to Nixon. "Just another Young Turk thing," one of their House cohorts casually called the toppling of Halleck.
It seems hard now to exaggerate the endless sequels to this small but decisive act. The lifting of the honest but mediocre Ford higher into line for appointment as vice president amid the ruin of President Richard Nixon and his Vice President, Spiro Agnew; Ford's lackluster, if relatively harmless, interval in the Oval Office and later as Party leader with the abject passing of the GOP to Ronald Reagan in 1980; Ellsworth's boosting of Rumsfeld into prominent but scandal-immune posts under Nixon; and then, during Ford's presidency, Rumsfeld's reward, his elevation to White House Chief of Staff, and with him the rise of one of his aides from the Nixon era, a previously unnoticed young Wyoming reactionary named Dick Cheney; next, in 1975-1976, the first Rumsfeld tenure at a Vietnam-disgraced but impenitent Pentagon that would shape his fateful second term after 2001; and eventually, of course, the Rumsfeld-Cheney monopoly of power in a George W. Bush White House followed by their catastrophic policies after 9/11 -- all derived from making decent, diffident Gerry Ford Minority Leader that forgotten winter of 1964.
They were Nixon men. Rumsfeld and Cheney rose via the half-shunned political paternity of a cynical president who abided and used some he distrusted, even came to deplore. Brought into Nixon's 1968 presidential campaign through Ellsworth's influence, Rumsfeld fell into an opportune role -- spying on the Democratic Convention in Chicago, which exploded in the infamous "police riot" against antiwar demonstrators that tore apart the Democrats and lent the spy's reports unexpected gravity. (Among faces in the crowd watching the mayhem was another onlooker out of a comfortable Republican suburb, a twenty-one-year-old Wellesley student from Park Ridge named Hillary Rodham.) Though he gained attention in the Democrats' disaster, Rumsfeld ran up against Nixon's equally barbed campaign manager, Bob Haldeman, and, despite their election victory, returned to Congress in 1969 without reward.
Bipartisan collusion rescued him. By 1968, President Lyndon Johnson's four-year-old Office of Economic Opportunity (OEO), the heralded antipoverty program with its grassroots "Community Action" and its Legal Services for the poor, had become a potential success story -- and thus anathema for powerful Democrats as well as Republicans. Denied a 1964 cigarette tax (that would have funded it securely) by the tobacco lobby, then starved by the sinking of resources into the maw of the Vietnam War, OEO was ultimately doomed when the nascent political, economic, and legal assertiveness it nurtured among the thirty to fifty million dispossessed threatened the hold of vested-interest donors and the mingled power bases of governors and mayors, congressmen and legislators of both parties. As early as 1966 they began trooping in numbers through the Old Executive Office Building -- liberal and conservative but uniformly self-preserving, the single party of incumbent power -- to lobby Vice President Hubert Humphrey, who planned to cut the program when he himself became president.
With Nixon's victory over Humphrey, OEO's death became a certainty, though a tough infighter was needed as director to take out the agency's life support systems. Nixon first ignored the appointment; then, later in 1969, at the urging of ranking Senate and House Democrats as well as Ford and Ellsworth, named Rumsfeld to the post. He, in turn, chose as his deputy Princeton pal Frank Carlucci, already off to a buccaneering start in the Foreign Service amid early 1960s CIA coups and assassinations in the Congo. The writ was plain. On Capitol Hill, they called Rumsfeld "the undertaker."
So it was that a slight, already balding 28-year-old Republican Congressional intern, Richard Bruce Cheney, soon steered to the new OEO Director a 12-page memo setting out how to run the agency in a way that would kill what they all deplored. Cheney had failed at Yale. Returning to his native Casper to work as a telephone lineman, he eventually went to college in Wyoming and, avoiding the Vietnam draft like the plague, on to graduate school and a DC internship meant to satisfy his ambitious fiancée Lynne and to retrieve a white-collar career. Like so many in the neo-conservative swarm he came to head after 2001, Cheney brought to public life no intellectual distinction or curiosity, and certainly no knowledge of the wider nation and world. Washington in 1968 marked the first time he had lived in a town of more than 200,000.
Over his glacial insularity, though, lay a reassuringly phlegmatic manner. In Washington, he found he had an instinct for the quiet, diligent subordinate's exploitation of institutional indolence, and he brought with him a clenched-teeth, right-wing animus that more visible Republicans judged impolitic to express but impressive in a backroom staff man.
"Dick said what they all thought but didn't say aloud," a Hill aide (and later Congressman) recalled of often raw conversations about money, race, partisanship, and particularly Cheney's angry, acid scorn for college antiwar protests that gave reassuring voice to the publicly muted abhorrence of Republican politicians. Having earlier rejected him as a House intern, Rumsfeld now made the young right-winger his key personal assistant at OEO, where he proved devotedly efficient. The hiring brought three future Secretaries of Defense -- Rumsfeld, Carlucci, and Cheney -- into the same office, toiling to abort the unwanted embryonic empowerment of the poor.
When they became celebrities, there would be much written about how the styles of Rumsfeld and Cheney meshed -- Rummy's sheer brio, his relishing combat and the limelight, his free-wheeling way of sparking ideas and decisions helter-skelter (his famous routine of dropping to the floor for one-arm push-ups, a tic that a bureaucrat-benumbed Washington media always found fetching); and steady, backroom Dick, the methodical organizer, the modest detail man seeing to practical execution.
Close up, the bond was even deeper. Across an age gap of almost a decade, despite the distance between charged and calm, North Shore and Casper, Princeton and Wyoming, country club Congressman and lumpen-proletarian repairman, they shared something rarely then so openly admitted on the right: an abhorrence of the liberations sweeping the 1960s, not just the right's pet scourges of bureaucracy, crime, drugs, social fragmentation, and (however suitably coded) racial integration, but the unsettling ferment of newfound freedoms and honesty, the defiance of cultural and institutional oppressions -- especially by minorities and women. They detested Lyndon Johnson's Great Society, the way it seemed to advance beyond the New Deal and Progressivism at the expense of settled money and power.
Altogether it was a moment of hurtling change that many saw as ominous weakness and laxity, of new public programs for the long-excluded, which the world of Rumsfeld and Cheney imagined as "socialism." For them, the balancing regulation of long-dominant business power was nothing short of "tyranny"; the new arrangements of race and class, the myriad threats of sheer liberty in a more equitable society and economy, were menacing.
Whatever their other ties, Rumsfeld and Cheney were two of the era's visceral reactionaries in the classic sense of the term. Musing with younger aides on one of his last days in the White House, Johnson came up with a telling term for their ilk. "The haters," he called them. "They hate what they can't run any more" was the way he put it. The calamity Rumsfeld and Cheney later wrought in American foreign policy traced not only to profound ignorance and immense, careless pretense about the world at large, but in some part to a four-decade-old kindred fear and loathing at home.
OEO began the Rumsfeld myths. "He saved it," Carlucci would blithely tell oblivious post-9/11 reporters hardly apt to check the actual fate of the agency. Carlucci would spin an image of an ever-energetic Rumsfeld taking up the cause of the needy, streamlining and fortifying the laggard agency despite the funeral that had been ordered. It was a blasé postmortem lie. Community Action, Head Start, VISTA, Job Corps, and most decisively Legal Services (whose leadership Rumsfeld and Cheney together decapitated in 1970) -- one by one, each of these beleaguered efforts was stifled or sloughed off to political sterility. This mission, at least, was accomplished. By the time the burial was complete -- with the agency's quiet extinction in 1973, unmourned by the powers of either party -- the undertaker had moved on to higher office.
In 1971, Nixon had been stymied in his plan to use Rumsfeld in a cabinet shakeup and so took him into the White House as a domestic affairs "counselor." The Rumsfeld White House interval over the next two years is captured on Nixon's infamous secret tapes. With his ever-aggressive, if not megalomaniacal, 40-year-old aide, the 60-year-old president adopts an avuncular tone, while Rumsfeld angles brazenly to supplant Henry Kissinger as a special envoy on Vietnam or even to replace Vice President Spiro Agnew on the 1972 ticket. Patiently, yet with audible derision and occasional incredulity, Nixon suggests seasoning in more modest positions. Thus, after the president's 1972 reelection triumph, an eager Rummy would be made ambassador to NATO, spoils previously in the hands of their mutual friend Ellsworth, who urged Rumsfeld for the job.
It all yielded more myths, more confected history by a submissive, uninformed media profiling post-9/11 power. There would be the image of Rumsfeld as White House "dove" on Vietnam, when his bent was exactly the opposite; or that Nixon, it would be claimed, saw him as uniquely in touch with the diversity of the country, especially the young -- when the reality was that Rumsfeld, having served an impatient three terms from his lavishly unrepresentative rotten borough of Winnetka wealth, with his generic contempt for the 1960s and his part at OEO suppressing the emergence of millions of the young poor, was anything but.
At the time, privately at least, his grasping shallowness led to withering -- now long-forgotten -- verdicts from knowing witnesses. Even a jaded Nixon would eventually deplore him as "a man without idealism." His extensive experience with despots giving the judgment added weight, Henry Kissinger came to think Rumsfeld the "most ruthless" official he had ever known.
In a Washington that routinely hides its ugly inner truths of character and incompetence, none of it mattered. Away at NATO in Brussels, frustrated by multinational diplomacy but expanding his own sense of political-military mastery, Rumsfeld managed to escape the Watergate incriminations of 1973-74. Instead, he seemed like a fresh face when Gerald Ford succeeded the disgraced Nixon in August 1974. Anxious to be rid of Nixon co-conspirators like then-White House Chief of Staff Alexander Haig, but facing a period of rule with inadequate crony aides, the earnest new president called back clean, hard-charging Don to be his chief of staff. Rumsfeld promptly brought in Cheney, just on the verge of vanishing mercifully into private business -- and the rest is history.
Barely a year after moving next to the Oval Office (and contrary to Ford's innocent, prideful recollection decades later that it was his own idea), Don and Dick characteristically engineered their "Halloween Massacre." Subtly exploiting Ford's unease (and Kissinger's jealous rivalry) with cerebral, acerbic Defense Secretary James Schlesinger, they managed to pass the Pentagon baton to Rumsfeld at only 43, and slot Cheney, suddenly a wunderkind at 34, in as presidential Chief of Staff.
In the process, they even maneuvered Ford into humbling Kissinger by stripping him of his long-held dual role as National Security Advisor as well as Secretary of State, giving a diffident Brent Scowcroft the National Security Council job and further enhancing both Cheney's inherited power at the White House and Rumsfeld's as Kissinger's chief cabinet rival. A master schemer himself, Super K, as an adoring media called him, would be so stunned by the Rumsfeld-Cheney coup that he would call an after-hours séance of cronies at a safe house in Chevy Chase to plot a petulant resignation as Secretary of State, only to relent, overcome as usual by the majesty of his own gifts.
With such past trophies on their shelves, it would never be a contest for Rumsfeld and Cheney after 2001. That fall of 1975, 29-year-old George W. Bush, the lineage's least fortunate son, was in Midland, Texas, partying heartily and scrounging for some role on the rusty fringes of the panhandle oil business.
By December 1975, having pushed aside Watergate-appointed Vice President Nelson Rockefeller, the longtime abomination of the Republican right, Rumsfeld was already positioning himself to be Ford's 1976 running mate -- and eventual successor. But that spring Ronald Reagan came so close to wresting the nomination from Ford, with primary victories in North Carolina and Texas, that the President's other advisors, many of whom detested Rumsfeld anyway, sprang to appease the Reagan camp by persuading the President to put choleric right-wing Kansas Senator Bob Dole on the ticket instead.
Among those advisors was George H.W. Bush, then-CIA Director. (He had gotten the job thanks to a cynical recommendation from Rumsfeld, who calculated that putting Bush at the scandal-ridden agency would eliminate him as a potential rival.) Another was Bush's onetime Texas campaign aide, a moneyed corporate lawyer and would-be power-broker from Houston, and now an obscure Commerce Department official who became Ford's 1976 campaign manager, James Baker III. It was an adroit back-corridor move, the sort Rumsfeld himself had been practicing so adeptly, and it embittered him for years toward his old patron Ford as well as Bush, Baker, and others -- one more wisp of a seamy, unseen history, of customary Republican cannibalism that wafted ironically over the last days of 2006 with Baker's Iraq Study Group and the Ford funeral.
Designs on the Oval Office thwarted but by no means given up, Rumsfeld spent scarcely fifteen months at the Pentagon in 1975-1976, but the interval was quietly, ominously historic -- far more significant and premonitory than commonly portrayed. In many ways, it both foreshadowed 9/11 and prepared the way for the fateful sequel to it.
At every turn, the new SecDef pulled policy to the right -- aligning Washington even more egregiously than usual with reactionary regimes in Asia and Latin America, smothering the nation's only serious attempt at intelligence reform, beginning the demolition of détente with Russia that would climax in its extinction under Jimmy Carter. At home and abroad, Rumsfeld seeded the Middle East for future crises and, even more insidiously, joined the military leadership in cravenly abandoning the post-Vietnam battlefield of historical understanding and institutional change.
In his first days in office, he quickly allied himself with the longtime (but until then vain) efforts of the Joint Chiefs of Staff to stall the pending Strategic Arms Control Agreement with Moscow. He also pushed Kissinger and Ford into one of the more disgraceful acts of that presidency (discreetly ignored in the recent Ford retrospectives) -- the assuring of the Indonesian military junta that U.S. support and arms would continue to flow, despite the brutal suppression about to be unleashed on East Timor.
It was only a taste of the Rumsfeld preference for uniformed right-wing tyrants, indulged over the next year in an ever closer Defense Department liaison with military dictatorships in Latin America, most notably through Operation Condor, joint covert actions involving several regimes, among them Gen. Augusto Pinochet's Chile and the Argentine military dictatorship, with Pentagon attachés and intelligence advisors looking on approvingly. The result was a plague of kidnappings, disappearances, and assassinations throughout the Hemisphere, including, in 1976, the brazen car bomb murder of former Chilean Foreign Minister Orlando Letelier and an American colleague on Massachusetts Avenue in downtown Washington. Unfailingly backed and expanded by Rumsfeld, the collusion with Indonesian and Latin American despots underwrote more than a decade of some of the most savage repressions of the second half of the twentieth century.
The customary Pentagon-State Department bureaucratic war Rumsfeld waged against Kissinger (with a vengeance fired by the Defense Secretary's presidential ambitions) involved a furtive alliance with Capitol Hill's über-hard-line Democrat, Armed Services Committee Chairman (and Kissinger nemesis) Henry "Scoop" Jackson. A backwoods, shoreline-county prosecutor from Washington State, Jackson had become the "Senator from Boeing." Jackson's Russophobia, demagoguery on arms control, and zealous backing of Israel (especially on the then-charged issue of Jewish emigration from the USSR) would land Rumsfeld in the milieu of the Israeli lobby, already formidable if only a kernel of the special interest colossus it would later become.
Jackson's Cold War mania was fattening military budgets along with the requisite Puget Sound contracts, not to speak of the senator's own war chest for a 1976 presidential run, and all this was being fomented by a bustling, pretentious, pear-shaped young Jackson aide named Richard Perle. Perle's somber, if oily, manner hid his own considerable lack of intellect or knowledge about either Russia or the Middle East, but his hard-line anti-Soviet and Zionist zeal gave him, as Jackson's policy broker in the politics of the moment, a cachet and following far beyond his meager substance. Rumsfeld's collusion with Jackson would thus introduce him to some of the still marginal publicists, ideologues, and Washington hangers-on who would take the term neoconservative as the label for their career-plumping chauvinism and, less audibly, their tragically intermingled allegiances to the right wings of both the U.S. and Israel.
In Rumsfeld's early tie to this wanna-be-establishment claque were omens of the history they would make together after 2001. It was his "sharp elbows" that were thrown to create the notorious "Team B," a collection of incipient neocons and Russophobes in and out of government, including Paul Wolfowitz. They were summoned to offer a fearsome analysis of Soviet capabilities and intentions that would be an alternative to comparatively unfrightening (and accurate) CIA assessments being attacked by Ronald Reagan and his right-wing minions in the 1976 campaign. In this surrender to election-year demagoguery could be found the hands of the White House and the elder Bush at the CIA (more Ford regime shame politely forgotten in the mournful, anxiety-ridden, anyone-compared-to-George-W. fin de 2006 moment), but Rumsfeld's role was crucial -- and the consequences historic.
The absurdity and ideological corruption of Team B's "analysis" of the Soviet bogeyman (along with a desired future confrontation with China, a nakedly racist, essentially right-wing Israeli view of the Arab world, and a refusal to face the Vietnam defeat) were plain even then; decades later, the post-Soviet archives would definitively reveal it for the fraud it was. As it was meant to, it fed the massive arms buildup of the Reagan 80s, and with it the engorging of the military-industrial colossus that, in turn, filled Republican campaign coffers. And all of this, of course, including the ensuing distortions in domestic priorities, would pave the way for Rumsfeld's eventual return to power.
The "Team B" scandal also foreshadowed an insidious post-9/11 plague, the right-wing assault on relatively non-ideological national intelligence that was to lead to the blatant substitution of alternative "intelligence" operations in Rumsfeld's Pentagon and Cheney's vice-presidential office (full-time versions of "Team B," as it were), as well as the coercion and corruption of conventional CIA channels.
In 1976, Rumsfeld worked assiduously to undercut any intelligence that challenged his right-wing bias and, with Cheney helpfully in the background at the White House, fought hard to drown any meaningful intelligence reforms after mid-1970s hearings chaired by Senator Frank Church and Congressman Otis Pike offered shocking revelations of CIA covert-operations abuses. The resulting half-measures and truncated accountability sent unmistakable signals through Washington, setting the stage for various CIA rampages of the 1980s under Reagan campaign manager William Casey (and one of Casey's ambitious, agreeable aides named Robert Gates). The direct consequences in blowback and loss of professional integrity would be felt for decades to come.
Then, there was the Middle East. In mid-1976, exiled Palestinians allied with a Lebanese nationalist coalition to politically and economically challenge the traditional privileged rule of the West's Christian-dominated client regime in Beirut. Faced with this, the Secretary of Defense was decisive in the secret US-Israeli instigation of a Syrian military intervention meant to thwart both the Palestinians and the Lebanese rebels. Rumsfeld muscled the covert action through, despite Kissinger's initial hesitation. It ushered in a three-decade-long Syrian occupation of Lebanon, with relentless machinations in the Levant involving the Israeli intelligence service, the Mossad, the CIA and, beginning under Rumsfeld as never before, the Pentagon's Defense Intelligence Agency (DIA).
Already significant in the 1950s, the CIA-Mossad collaboration in Lebanon and elsewhere certainly pre-dated Rumsfeld, and crucial decisions in the deepening collusion would come after him. But the 1976 intervention, which he backed so strongly, would take the complicity to a new level, with a twisting sequel of tumult and intrigue that directly paved the way for the 1982 Israeli invasion of Lebanon, and thus for the eventual rise of Hizbullah.
At the same time, Rumsfeld avidly stepped up ongoing U.S. arms shipments to the Shah of Iran's corrupt, U.S.-installed oligarchic tyranny -- its torture-ready SAVAK secret police intimately allied with the Mossad, the CIA and the DIA. In 1976, Rumsfeld also pressed the sale to the waning Shah of up to eight nuclear reactors with fuel and lasers capable of enriching uranium to weapons-grade levels. Ford was prudently uneasy at first, but relented under unanimous pressure from his men. Cheney backed Rumsfeld from the start in urging an Iranian nuclear capability; and, in this at least, they were joined by their arch-rival Kissinger, ever solicitous of his admirer the Shah, ever oblivious to internal Islamic politics -- he himself primed by an obscure but vocal thirty-three-year-old State Department aide named Paul Wolfowitz.
At its Rumsfeldian peak in 1976, U.S. weapons and intelligence trafficking with the rotting Iranian imperial regime took up the time of some eight hundred Pentagon officers. Barely two years later, the Shah's regime would fall to the Ayatollah Khomeini's Islamic Revolution, in part under the sheer weight and waste of the Pentagon's patronage. Like CIA-DIA connivance with SAVAK -- which included coordinated assassinations of Iranian opposition political figures or clerics and, in 1977, even Khomeini's son -- Pentagon complicity with the hated old order made all but inevitable the widespread anti-American sentiment in Iran that would in the future be so effectively exploited by the Islamic regime's propaganda. Detonating in the 1979 seizure of U.S. embassy hostages in Tehran, popular Iranian hostility would burn out of a history of intervention and intrigue few Americans ever knew the slightest thing about.
In this way, Rumsfeld and others, including Gates and his slightly mad patron Casey at the CIA, would all, in some degree, become policy godfathers of the mullahs' regime in Tehran as well as of Hizbullah.
"The Dark Ages"
Even more costly would be the toll the Rumsfeld interregnum would exact deep inside the American military. However brief, Rumsfeld's mid-1970s rule over the Defense Department proved, in certain respects, the most crucial moment at the Pentagon since World War II. In seven tumultuous years from Johnson's fall to Nixon's, spanned by defeat and de facto mutiny in Vietnam, four secretaries would troop through Defense, each consumed by war or politics, none engaging the institution's historic plight.
Taking office six months after the fall of Saigon, Rumsfeld would inherit the first truly post-Vietnam military. Fittingly, the institutional crisis he faced had come into being over the full two decades of his adult life since the 1950s. By the time he settled in at the Pentagon, that crisis had already been extensively studied and well documented. Conclusions were available for the asking -- or hearing or reading -- in any Pentagon ring, at any military post at home or abroad as well as in Congress, the White House, and the press, not to speak of the American public. It was unmistakable in the searing experiences of a war whose dark-soil graves at nearby Arlington were still fresh.
By any measure, Rumsfeld arrived at a rare and exceedingly fleeting moment when the enormous U.S. war machine might have come to terms with its past, and so the future. The failure to do so -- hardly Rumsfeld's alone, though his role was decisive -- would haunt America and the world into the twenty-first century.
Vietnam had laid bare the malignant decaying of America's armed forces that began in the wake of their first unwon war in Korea. There was "no substitute for victory," General Douglas MacArthur had written a Congressman in the letter that finally prodded President Harry Truman to fire him as commander of U.S.-U.N. forces in Korea in 1951. The services nonetheless promptly found a perfectly reasonable substitute -- for a while -- in the warm bath of a careerist managerial ethic.
Ruled in World War II by an ever-growing bureaucracy, ever more inhospitable to the officer as individual, America's superpower military was, as the Korean War began in 1950, already a sclerotic giant. "A glandular thing" was how Secretary of Defense Robert Lovett would describe it a decade later to John Kennedy. The brutal Korean stalemate, following on the early rout of a billet-flabby, semi-demobilized occupation army from Japan, and later the frozen, bloody retreat from a heedless, MacArthur-led advance to conquer North Korea right up to the Chinese border, added to the curse.
Faced with the demanding, unnerving politics of a nuclear-armed peace, a supposedly matchless force met its match in Korea not just on the battlefield, but in the murky realms of political sophistication. In response, grappling to redefine its place (and reassure itself at the same time), the military in the 1950s came to produce a preponderance of what one critic called the "formlessly ambitious" officer: one who treated climbing the military ladder like ascent in any other corporate culture. To a blight that Charles de Gaulle once deplored in his French Army as "solely careerism," the post-Korea U.S. military added the fetish and pseudoscience of "management" -- warriors astride desks, commanding paper flow and brandishing the numerology of budgets with ever-more expensive weapons systems.
Procurement plunder and corruption, the venal revolving door between senior officers and corporate contractors, the inveterate lack of authentic accounting and accountability at almost every level -- all the old Pentagon scourges now ran rampant. The good staff life rather than active command, "ticket punching," the right job at the right time -- all of this fostered an officer corps overwhelmingly pursuing rank as an end in itself, at pains to do no more than what one embittered combat colonel recalled as "a necessary but minimal amount of field duty."
As credentials merely accumulated, as efficiency reports inflated and grew meaningless, there was the inevitable atrophy of ethics and the military art. Oddly enough, management itself, the faith and practice of the new creed, was the first casualty of institutional shallowness and self-protection. Winners emerged compromised and cynical; losers, alienated and contemptuous of superiors. General morale, credible command authority, and old-fashioned élan as well as esprit de corps were decimated in the process. Graduates and non-graduates alike trained their disillusion on institutions like West Point, which, by the early 1960s, many privately mocked as the South Hudson Institute of Technology -- SHIT. The Academy's sacred "duty, honor, country" now seemed eclipsed in practice by any mammoth organization's immutable rule of survival: Cover your ass.
Despite the need to understand the history and politics of vast new arenas of American policy -- regions of potential military embroilment such as Asia or the Middle East -- once-elite service graduate schools like the War Colleges became what one study termed "usually superficial and vapid." There would be no twentieth-century American Clausewitz, wrote Ward Just, the best of the era's military-affairs journalists, surveying the wreckage of a defense establishment driven by corporate inanity, "because the writing of Vom Kriege (On War) took time and serious thought."
Much of this bureaucratic decadence overtook other arms of government in the 1950s, not least the State Department. As Vietnam soon would prove, however, a craven ethos and command mediocrity in a military -- whose business, as Korea savagely reminded everyone, is sometimes to fight wars -- would be catastrophic.
Within the system, there were predictable if vain attempts to hide the approaching disgrace. When, in 1970, a war-college study of "professionalism" in Vietnam was done with implications (as a pair of reviewing experts described it) "devastating to the officer corps," the Joint Chiefs of Staff quickly classified and suppressed the findings. Yet none of the inner withering was a secret, or even arcane knowledge, in government. Before, during, and after Rumsfeld's first regime at the Pentagon, Congressional hearings, journalism and memoirs exposed the reality for what it was; while nationally noted, amply documented books, often written by veteran officers or based on their testimony, appeared under titles that spoke eloquently of the disaster still to come: Crisis in Command: Mismanagement in the Army, Defeated: Inside America's Military Machine, Self-Destruction: The Disintegration and Decay of the United States Military, The Death of the Army.
Vietnam nearly made the figurative death literal. Ironically, there had been a portent of the debacle ahead in Southeast Asia (and of Iraq and Afghanistan 30 years later, for that matter) in a book discussed in Washington to the point of fad just as Rumsfeld began his political career in the early 1960s.
General Maxwell Taylor was a handsome, much-decorated World War II airborne hero, a Missouri country boy who became a reputed military intellectual, albeit one given to the pandemic provincialism and gall typical of postwar American officialdom, whose nation's new world power so outstripped its knowledge of the planet. The general could thus unabashedly extol the Shah's repressive Iranian troops as among the "armies of freedom," and instruct a West Point class on the eve of Vietnam that they were entering a world in which "the ascendancy of American arms and American military concepts is accepted as [a] matter of course."
More grandly, Taylor proposed to correct the errors of the key strategic doctrine of the Eisenhower presidency, the policy of "massive retaliation" in which America's overwhelming nuclear superiority -- its bombers ringing the USSR and China, some within minutes of their targets -- was to deter any move by Soviet or Chinese forces across the Cold War's post-Korea established boundaries. That strategy might keep the Red Armies in their kennels, Taylor argued, but it was hardly a response to campaigns waged by proxy communists on the periphery in the Third World.
To meet that threat -- and, not incidentally, to rescue his beloved Army from the mission and budget predations of the nuclear-armed Air Force throughout the 1950s -- Taylor proposed a new orthodoxy of "limited wars," adding to nuclear deterrence a "strategy of flexible response." He defined his breakthrough in a celebrated book, The Uncertain Trumpet, as "the need for a capability to react across the entire spectrum of possible challenge for coping with anything from general atomic war to infiltration and aggressions…"
On whether the United States could practically, or should politically, as a matter of national interest cope "with anything," the confident paratrooper Taylor wisely did not elaborate. His point, after all, was at heart a bigger, better army with bigger, better budgets. Properly selected "limited wars," with newly created forces chafing to be used, would presumably take care of themselves. But Taylor at least did warn that it would be necessary "to deter or win quickly," dictating an overwhelming application of men and weaponry and a victory so swift and decisive that everyone, including the defeated enemy, would accept it. "Otherwise," he noted ominously in a passage the general as well as his admirers later tended to overlook, "the limited war which we cannot win quickly may result in our piecemeal attrition."
Minus this gloomy caveat, Taylor's theme enjoyed swift vogue in the early 1960s -- with both Republicans and Democrats eager to engage what were seen as ubiquitous Russians and native communists scavenging post-colonial turmoil in the Third World. Among them were right-wingers like Rumsfeld, impatient with the aged caution of the Eisenhowers and Hallecks in their own Party, and among the Democrats, President John F. Kennedy himself. He promptly made Taylor a ranking advisor on Southeast Asia and other matters. Crippled by careerism, the military thus readied itself to fight in reassuring theory what in Vietnamese reality would be Maxwell Taylor's oxymoronic nightmare -- a limited war of attrition.
That war, of course, had its men of courage and integrity. More than ever, though, they were the exceptions to the prevailing system, and few of them made it as intact survivors to highest rank in the twenty-first century. The machinery that in peacetime routinely ground out rhapsodic officer efficiency reports instantly applied the same practiced reflexes to the surreal paperwork of Saigon and its offshore carrier groups, fattening Vietcong body counts, bombing damage assessments, and accounts of South Vietnamese client efficacy that seemed to prove victory ever on the way. When intelligence reports discovered awkward enemy strength and resilience or detected unwanted signs of another losing war, they were simply falsified, destroyed, or buried.
The massively beribboned chests of commanders in Iraq and Afghanistan three decades later -- many of them former junior officers in Southeast Asia -- would be unintended reminders of how much the Vietnam fraud fed on even the old honor of citations. Like a debased currency, ribbons for courage or exceptional service lost value as they accumulated, with awards snidely known as "gongs" and oak leaf clusters as "rat turds." Once-respected air medals (800,000 of them) were handed out for almost any non-combat flight in that helicopter-swarming war, or even for hauling holiday frozen turkeys snugly behind the lines.
Decorations were heaped so bountifully on generals along with lesser staff officers that valor in such numbers, wrote one combat veteran, was "incomprehensible." To Vietnam's "grunts," as they related again and again, the war was too often fought with their officers 2,000 feet up in the comparative safety of "eye in the sky" command helicopters rather than with their "ass in the grass" with their troops.
Casualty figures were telling. In over a decade of fighting, with over 58,000 American dead, only four generals and eight colonels fell in combat. Commissioned rank was a guarantee of survival as in no other modern military at war (save perhaps Iraq and Afghanistan, in figures yet to come, where high-ranking officers were likewise seldom at the front). "The officer corps simply did not die in sufficient numbers or in the presence of their men often enough," concluded two postwar analysts of the army's resulting "crisis."
With the corruption of standards came an inevitable loss of morale. To soldiers of honor at every level the ignorance, self-protection, and widespread opportunism of so many superiors made Vietnam what one colonel called "the dark ages in the army's history." Through the ranks, unprecedented, ran the unchecked contagion of disintegration -- refusal of orders amounting to mutiny; desertions in the tens of thousands; a drug epidemic and race riots; uncounted, unaccountable atrocities; and not least the assassination of officers and noncoms by their own men.
The American military's internecine murder acquired its own ugly Vietnam name, "fragging." Among the officer corps, according to a war-college appraisal, there had been "a clear loss of military ethic," not to be explained simply by a largely citizen-soldier, draft-dependent army. Altogether, another study concluded still more clinically and bluntly, the Armed Forces in Vietnam bordered on "an undisciplined, ineffective, almost anomic mass," its commanders high and low manifesting "severe pathologies."
Added to the war's vast profiteering and waste, all this spurred an exodus of disillusioned military professionals (unprecedented and unmatched until the Iraq War), depriving the services of many of their most promising young leaders. It also produced by 1975-1976 an unparalleled outpouring of public and internal criticism with often shocking revelations by officers, enlisted men, and other knowledgeable observers in and out of government.
The Great Evasion
Yet atop the Pentagon at the immediate postwar height of the now furious, anguished outcry -- what an admiral witnessing it called a "real rebellion of the heart" -- Rumsfeld took no meaningful part in the airing or soul-searching; nor did he take control of, or cleanse, the pestilent contract and accounting scandals. What he did was effectively ignore, dismiss, or on occasion repress and even punish critics and whistle-blowers.
Typically, when new Congressional questions began to be asked about the involvement of the U.S. military as well as the CIA in the Saigon regime's infamous "Tiger Cage" torture camps in South Vietnam -- an issue that surfaced well before his tenure at the Pentagon but arose anew in 1975-1976 after fresh revelations of US-aided torture and assassinations -- Rumsfeld led the Ford Administration in blocking damaging disclosures until the issue eventually trailed off. It was yet another grim foreshadowing of Iraq with its Abu Ghraib and Afghanistan with its Bagram prison, housed in cavernous structures at the old Afghan and Soviet air base. And it was one more plot of buried history -- along with a seedy CIA front, the Office of Public Safety, implicated in advising and abetting the secret police "renditions" and torture practices of client regimes worldwide until its quiet disbanding by Congress in 1975 -- with echoes into the twenty-first century.
Officially, the crumbling of discipline and performance in Vietnam would be blamed not on the military's long-festering venality and incompetence, but on the ready scapegoats of antiwar agitation and the larger social turbulence of the 1960s, a perfect fit with Rumsfeld-Cheney demonology. To the relief of the Joint Chiefs, the Secretary of Defense scoffed at, or swiftly suppressed, any institutional self-examination, and the counterattack on critics was vicious. "Overlong in battle and emotionally unbalanced" was the way one Pentagon-kept military columnist smeared an officer of legendary heroism who publicly deplored service careerism.
As America gladly celebrated its Bicentennial under Gerald Ford's calming, anodyne post-Watergate presidency, the tide of self-awareness in the Pentagon was "allowed to recede," as a later study recorded, and officers "whose careers were deeply rooted in the policies and practices [of the war] finally prevailed." The latter included leaders of the 1991 Gulf War and the 2003 Iraq debacle, most famously Colin Powell, who as a mid-grade careerist was personally involved in a whitewash of the My Lai massacre.
When a superintendent of West Point was earlier removed for his implication in the My Lai cover-up, he bade farewell to a dining hall full of sympathetic cadets with the old adage of General Joe Stilwell, "Don't let the bastards grind you down." Who the superintendent's "bastards" were, the new Secretary of Defense and his unreconstructed high command had no doubt in 1975-1976.
In the siege mentality of Rumsfeld's post-Vietnam Pentagon, the besieging force was never a blindly misjudged nationalism, an intrepid insurgency, corrupt, untenable clients, or persistent myopia, folly, self-delusion, and ultimate self-betrayal of U.S. policy. It was the curse of wavering civilian masters at home -- craven Washington politicians and the old foreign policy establishment, especially Democrats -- and a public too easily swayed by the treachery of a mythological "liberal media." Humiliation in Vietnam had come not from colossal blunder, but from homefront perfidy, from the hoary stab in the back. "Do we get to win this time?" Rambo famously asks about his return to Vietnam, echoing in popular lore that denial of debacle.
It was Rumsfeld's historic legacy to rubber-stamp the Great Evasion performed by America's military and sullen ideological right, as both fled headlong from the Vietnam reckoning. In the process, they all jettisoned responsibility, much as Saigon's American-bred profiteers cast cumbersome loot from their Mercedes sedans as they honked south through pitiful hordes of refugees just ahead of the final North Vietnamese offensive in the spring of 1975.
While U.S. foreign policy -- in heedless covert action as well as an orgy of globalism begun even before the fall of the Soviet Union, and then the reactionary mania loosed by 9/11 -- broadcast the seeds of new insurgencies (the prospects for what a handful of largely ignored theorists were calling "Fourth Generation Warfare"), serious study of counterinsurgency all but vanished from Pentagon planning and even from the service schools' curricula. The Iraq war would be years old and long lost by the time the Army revised, postmortem as it were, its little-read counterinsurgency manual written two decades before and anachronistic even then.
With Vietnam lessons unlearned and careerist blight as well as contract pillage uninterrupted, the military system's answer -- already emerging as orthodoxy under Rumsfeld in 1976 -- would be the simplistic, foolproof dictum, claimed by Colin Powell but hardly original to him, of fighting only with overwhelming forces, crushing firepower, and uncontested air cover (and even then having a precise "exit strategy" in place). This was, in sum, a version of General Taylor's "deter or win quickly." (As a "doctrine," it was as if the Army or Navy football team would go on the field only with its own rules, its own referees, and a 33-man team in the latest equipment to face an opposite 11 without helmets, pads, or the ability to pass.)
The so-called Powell Doctrine would soon be applied in settings allowing the post-Vietnam Pentagon's ever costlier, ever more "managed" high-tech bludgeon to be wielded against suitably feeble foes, without troublesome duration of engagement or the need for political understanding. Intelligence gaffes and the usual civilian carnage ("collateral damage") aside, the results looked encouraging in Grenada in 1983, Panama in 1989, and most notably the 1991 "turkey shoot" of the First Gulf War, carefully conducted to keep American casualties to the level of industrial accidents.
Fastidious, blameless brevity and detachment tended, of course, to sacrifice controlling the political outcome in any geopolitically meaningful arena -- as in, for instance, allowing Saddam Hussein to remain in power after his troops were expelled from Kuwait, and then, in defeat, to butcher Shiite rebels who, at the call of the first Bush administration in the persons of Baker, Cheney, Powell, and Scowcroft, thought the moment ripe to overthrow the tyrant themselves. Regrettably, they misread Pentagon imperatives. Chilled by a ghost they stoutly denied for decades, joint chiefs and defense secretaries would not repeat hot pursuit into North Korea or Vietnam's limited war of attrition -- not until the undertaker's fortuitous last chance at greatness arrived so explosively and irresistibly on September 11, 2001.
This article first appeared on www.tomdispatch.com, a weblog of the Nation Institute, which offers a steady flow of alternate sources, news, and opinion from Tom Engelhardt, a longtime editor in publishing, the author of The End of Victory Culture, and a fellow of the Nation Institute.
Posted on: Sunday, February 18, 2007 - 21:46
SOURCE: Weekly Standard (2-19-07)
On December 12, 2006, Iranian president Mahmoud Ahmadinejad personally brought to a close the infamous Holocaust deniers' conference in Tehran. A strange parade of speakers had passed across the podium: former Ku Klux Klan leader David Duke, the nutty followers of the anti-Zionist Jewish sect Neturei Karta, and officials of the neo-Nazi German National party, along with the familiar handful of professional Holocaust deniers. Frederick Töben had delivered a lecture entitled "The Holocaust--A Murder Weapon." Frenchman Robert Faurisson had called the Holocaust a "fairy tale," while his American colleague Veronica Clark had explained that "the Jews made money in Auschwitz." A professor named McNally had declared that to regard the Holocaust as a fact is as ludicrous as believing in "magicians and witches." Finally, the Belgian Leonardo Clerici had offered the following explanation in his capacity as a Muslim: "I believe that the value of metaphysics is greater than the value of history."
If this motley crew had assembled in a pub in Melbourne, nobody would have paid the slightest attention. What gave the event historical significance was that it was held by invitation, at the Iranian foreign ministry: on government premises, in a country that possesses the world's second-largest oil reserves (after Saudi Arabia) and second-largest natural gas reserves (after Russia). And in this setting, the remarks quoted above provoked not dismissive laughter, but applause and attentive nods. On the walls hung photographs of corpses with the inscription "Myth," and others of laughing concentration camp survivors with the inscription "Truth."
The Tehran deniers' conference marks a turning point not only because of its state sponsorship, but also because of its purpose. Up until now, Holocaust deniers have wanted to revise the past. Today, they want to shape the future: to prepare the way for the next Holocaust.
In his opening speech to the conference, the Iranian foreign minister, Manucher Mottaki, left no doubt on this point: If "the official version of the Holocaust is called into question," Mottaki said, then "the nature and identity of Israel" must also be called into question. The purpose of denying, among all the Nazis' war measures, specifically the persecution of the Jews is to undermine a central motive for the establishment of the state of Israel. Auschwitz is delegitimized in order to legitimize the elimination of Israel--that is, a second genocide. If it should turn out, however, that the Holocaust did happen after all, then, Ahmadinejad explains, it was a result of European policies, and any homeland for the Jews would belong not in Palestine but in Europe. Either way, the result is the same: Israel must vanish.
This focus explains why the conference's sponsors attached so much importance to the participation of a delegation from the Jewish sect Neturei Karta. Although it does not deny the Holocaust, the sect welcomes the destruction of Israel. That objective was the common denominator uniting all the participants in the conference. In his closing speech, Ahmadinejad formulated it with perfect clarity: "The life-curve of the Zionist regime has begun its descent, and it is now on a downward slope towards its fall. . . . The Zionist regime will be wiped out, and humanity will be liberated."
Holocaust denial and the nuclear program
Just as Hitler sought to "liberate" humanity by murdering the Jews, so Ahmadinejad believes he can "liberate" humanity by eradicating Israel. The deniers' conference as an instrument for propagating this project is intimately linked to the nuclear program as an instrument for realizing it. Five years ago, in December 2001, former Iranian president Hashemi Rafsanjani first boasted that "the use of even one nuclear bomb inside Israel will destroy everything," whereas the damage to the Islamic world of a potential retaliatory nuclear attack could be limited: "It is not irrational to contemplate such an eventuality." While the Islamic world could sacrifice hundreds of thousands of "martyrs" in an Israeli retaliatory strike without disappearing--so goes Rafsanjani's argument--Israel would be history after the first bomb.
It is precisely this suicidal outlook that distinguishes the Iranian nuclear weapons program from those of all other countries and makes it uniquely dangerous. As long ago as 1980, Khomeini put it this way: "We do not worship Iran, we worship Allah. For patriotism is another name for paganism. I say let this land [Iran] burn. I say let this land go up in smoke, provided Islam emerges triumphant in the rest of the world."
Anyone inclined to dismiss the significance of such statements might want to consider the proclamation made by Mohammad Hassan Rahimian, representative of the Iranian Supreme Leader Ali Khamenei, who stands even higher in the Iranian hierarchy than Ahmadinejad. A few months ago, on November 16, 2006, Rahimian explained: "The Jew"--not the Zionist, note, but the Jew--"is the most obstinate enemy of the devout. And the main war will determine the destiny of mankind. . . . The reappearance of the Twelfth Imam will lead to a war between Israel and the Shia." The country that has been the first to make Holocaust denial a principle of its foreign policy is likewise the first openly to threaten another U.N. member state with, not invasion or annexation, but annihilation.
Yet a puzzle remains. If Iran wishes Israel ill, why does it deny the Holocaust rather than applaud it? Ahmadinejad's Holocaust denial has been especially well received in the Arab world, where it has won praise from Hezbollah, the Egyptian Muslim Brotherhood, and Hamas. Yet in that same world, Hitler is admired not for building highways or conquering Paris, but for murdering Jews. How can Holocaust denial be most prevalent in a region where admiration for Hitler remains widespread? To unlock this paradox it is necessary to examine the anti-Semitic mind.
Brother Hitler and Eichmann the Martyr
Holocaust denial is anti-Semitism at its most extreme. Whoever declares Auschwitz a myth implicitly portrays the Jews as the enemy of humanity: The assumption is that the all-powerful Jews, for filthy lucre, have been duping the rest of humanity for the past 60 years. Whoever talks of the "so-called Holocaust" implies that over 90 percent of the world's media and university professorships are controlled by Jews and are thereby cut off from the "real" truth. No one who accuses Jews of such perfidy can sincerely regret Hitler's Final Solution. For this reason alone, every denial of the Holocaust contains an appeal to repeat it.
Consider this passage written by an Egyptian columnist for the state-controlled newspaper Al-Akhbar, Egypt's second-largest daily, and published in April 2002:
The entire matter [of the Holocaust], as many French and British scientists and researchers have proven, is nothing more than a huge Israeli plot aimed at extorting the German government in particular and the European countries in general. But I, personally and in light of this imaginary tale, complain to Hitler, even saying to him from the bottom of my heart, "If only you had done it, brother, if only it had really happened, so that the world could sigh in relief [without] their evil and sin."
Often, however, enthusiasm for the Holocaust is expressed unvarnished. In 1961, when the trial of Adolf Eichmann dominated the headlines, such enthusiasm became evident for the first time. The Jordanian Jerusalem Times published an "Open Letter to Eichmann," which stated: "By liquidating six million you have . . . conferred a real blessing on humanity. . . . But the brave Eichmann can find solace in the fact that this trial will one day culminate in the liquidation of the remaining six million to avenge your blood." Arab writers such as Abdullah al-Tall eulogized "the martyr Eichmann," "who fell in the Holy War." In her book Eichmann in Jerusalem, Hannah Arendt summarized the mood in the Arab world:
The newspapers in Damascus and Beirut, in Cairo and Jordan did not conceal either their sympathy for Eichmann or their regret that he "did not finish the job"; a radio broadcast from Cairo on the opening day of the trial even included a little sideswipe at the Germans, reproaching them for the fact that "in the last war, no German plane had ever flown over and bombed a Jewish settlement."
This heartfelt desire to see all Jews exterminated was reiterated in the Egyptian daily Al-Akhbar in April 2001 by the columnist Ahmad Ragab: "[Give] thanks to Hitler. He took revenge on the Israelis in advance, on behalf of the Palestinians. Our one complaint against him was that his revenge was not complete enough."
Obviously, from a logical point of view, enthusiasm for the Holocaust is incompatible with its denial. Logic, however, is beside the point. Anti-Semitism is built upon an emotional infrastructure that substitutes for reason an ephemeral combination of mutually exclusive attributions, all arising from hatred of everything Jewish. As a result, many contradictory anti-Jewish interpretations of the Holocaust can be deployed simultaneously: (1) the extermination of millions was a good thing; (2) the extermination of millions was a Zionist fabrication; (3) the Holocaust resulted from a Jewish conspiracy against Germany that Hitler thwarted and punished; (4) the Holocaust was a joint enterprise of the Zionists and Nazis; (5) the Zionists' "Holocaust industry" exaggerates the murder of the Jews for self-interested reasons; (6) Israeli actions against the Palestinians are the "true" Holocaust--and so on.
We are dealing here with a parallel universe in which the reality principle is ignored, and blatantly contradictory fantasies about Jews all have their place so long as they serve to reinforce anti-Semitic paranoia and hatred: a universe in which the laws of reason have been abolished and all mental energy is harnessed to the cause of anti-Semitism.
Amid the confusion, this universe is characterized by two constants: the refusal to come to terms with the facts of the Holocaust as it actually took place; and a willingness to find in the Holocaust a source of encouragement and inspiration, a precedent proving that it is possible to murder Jews by the million. This is why the precise content of Ahmadinejad's Holocaust tirades is not the issue. He is obsessed with the subject because he is fascinated by the possibility of a second Holocaust.
Why, then, did Ahmadinejad repeatedly and publicly embrace the ultra-orthodox Jews at the conference? Why did he personally greet every Jew present and say that "Zionism should be strictly separated from the Jewish faith"? Let us take a look at modern anti-Semitism in Iran.
Ahmadinejad and the Jews
Ahmadinejad's great inspiration, the Ayatollah Khomeini, not only recognized the mobilizing power of anti-Semitism in the struggle against the shah, he made use of it himself, as far back as the 1960s. "I know that you do not want Iran to lie under the boots of the Jews," he cried out to his supporters on April 13, 1963. That same year, he called the shah a Jew in disguise and accused him of taking orders from Israel. This drew a huge response from the public. Khomeini had found his theme.
Khomeini's biographer Amir Taheri writes: "The Ayatollah was by now convinced that the central political theme of contemporary life was an elaborate and highly complex conspiracy by the Jews--'who controlled everything'--to 'emasculate Islam' and dominate the world thanks to the natural wealth of the Muslim nations." When in June 1963 thousands of Khomeini-influenced theology students set off for Tehran to demonstrate and were brutally stopped by the shah's security forces, Khomeini channeled all their anger toward the Jewish nation: "Israel does not want the Koran to survive in this country. . . . It is destroying us. It is destroying you and the nation. It wants to take possession of the economy. It wants to demolish our trade and agriculture. It wants to grab the wealth of the country."
After the Six Day War of 1967, the anti-Semitic agitation, which drew no distinction between Jews and Israelis, intensified. "[I]t was [the Jews] who first established anti-Islamic propaganda and engaged in various stratagems, and as you can see, this activity continues down to the present," Khomeini wrote in 1970 in his principal work, Islamic Government. "[T]he Jews . . . wish to establish Jewish domination throughout the world. Since they are a cunning and resourceful group of people, I fear that . . . they may one day achieve their goal." Then in September 1977, he declared, "The Jews have grasped the world with both hands and are devouring it with an insatiable appetite, they are devouring America and have now turned their attention to Iran and still they are not satisfied." Two years later, Khomeini was the unchallenged leader of the Iranian revolution.
Khomeini's anti-Semitic attacks found favor with the opponents of the shah, both leftists and Islamists. His anti-Semitism ran along the same lines as The Protocols of the Elders of Zion, the turn-of-the-century hoax beloved of the Nazis that purports to expose a Jewish conspiracy to rule the world. The Protocols was published in Persian in the summer of 1978 and was widely disseminated as a weapon against the shah, Israel, and the Jews. In 1984, the newspaper Imam, published by the Iranian embassy in London, printed excerpts from The Protocols. In 1985, Iranian state authorities did a mass printing of a new edition. Somewhat later, the periodical Eslami serialized The Protocols under the title "The Smell of Blood: Jewish Conspiracies."
Just two years ago, in 2005, at the Iranian booth at the Frankfurt Book Fair, I was readily able to buy an English edition of The Protocols published by the Islamic Propagation Organization of the Islamic Republic of Iran. Other anti-Semitic classics were also available, such as Henry Ford's The International Jew and Mohammad Taqi Taqipour's screed Tale of the "Chosen People" and the Legend of "Historical Right." The cover of the latter volume caught my eye: a red Star of David superimposed over a grey skull and a yellow map of the world. Obviously, even after the death of Khomeini in 1989, the worldwide dissemination of anti-Semitism by Iran continued.
The fact that 25,000 Jews now live in Iran, making it the largest Jewish community in a Muslim country, is not incompatible with the foregoing. The Jews in Iran are clearly made to feel their subordinate dhimmi status. Thus, they are not allowed to occupy higher positions than Muslims and so are disqualified from the leading ranks in politics and the military. They are not allowed to serve as witnesses in court, and Jewish schools must be managed by Muslims and stay open on the Sabbath. Books in the Hebrew language are forbidden. Up to the present, the regime, which has time and again published anti-Semitic texts and caricatures, has prevented such hate-mongering from resulting in violence against Jews. Nevertheless, the combination of incitement and restraint leaves the Jewish community in a state of permanent insecurity. Today, the Jewish community serves Ahmadinejad not only as an alibi in his power game, but also, increasingly, as a deterrent: In the event of an Israeli attack on Iranian nuclear facilities, this community would find itself a hostage, vulnerable to acts of reprisal.
Irrespective of the leeway that Ahmadinejad has, for the time being, left the Iranian Jews, his rhetoric is steeped in an anti-Semitism that is unprecedented for a state leader since World War II. Ahmadinejad does not say "Jews" are conspiring to rule the world. He says, "Two thousand Zionists want to rule the world." He says, "The Zionists" have for 60 years now blackmailed "all Western governments." "The Zionists have imposed themselves on a substantial portion of the banking, financial, cultural, and media sectors." "The Zionists" fabricated the Danish Muhammad cartoons. "The Zionists" are responsible for the destruction of the dome of the Golden Mosque in Iraq.
The pattern is familiar. Ahmadinejad is not a racist social Darwinist who, Hitler-like, wants to eliminate every last trace of "Jewish blood." The term "half-Jew" is not used in Islamist discourse. But he invests the word "Zionist" with exactly the same meaning Hitler poured into "Jew": the incarnation of evil.
The Iranian regime can court the Jewish Israel-haters of Neturei Karta all it wants, but anyone who makes Jews responsible for the ills of the world--whether calling them Judas or Zionists--is clearly driven by an anti-Semitism of genocidal potential. Demonization of Jews, Holocaust denial, and the will to eliminate Israel--these are the three elements of an ideological constellation that collapses as soon as any one of them is removed.
Ahmadinejad inhabits a delusional world that is sealed off from reality. The louder the liberal West protests against Holocaust denial or the Islamists' demands for the destruction of Israel, the more convinced Ahmadinejad becomes of Zionist domination. In a conversation with the editors of the German newsweekly Der Spiegel, the Iranian president reacted as follows to the remark that the magazine does not question Israel's right to exist: "I am glad that you are honest people and say that you are required to support the Zionists." Only when we too finally realize that the Holocaust is a Jewish lie--only when we too want to annihilate Israel--only then will Ahmadinejad be convinced that we are academically credible and politically free. It is this lunacy that makes the revolutionary mission of the Iranian leadership so dangerous.
Which brings us to the question of the broader significance of Iranian Holocaust denial. The Islamist mission is by no means restricted to Israel.
In his first speech on the guiding principles of his politics, Ahmadinejad made this clear: "We are in the process of an historical war, . . . and this war has been going on for hundreds of years," he declared in October 2005. This is a war, then, that is not fundamentally about the Middle East conflict and will not end with the elimination of Israel. He continued: "We have to understand the depth of the disgrace of the enemy, until our holy hatred expands continuously and strikes like a wave." This "holy hatred" is boundless and unconditional. It will not be mitigated by any form of Jewish or non-Jewish conduct--other than subordination to sharia and the Koran.
In his letter to George W. Bush, the Iranian president described his objective: "Those with insight can already hear the sounds of the shattering and fall of the ideology and thoughts of the liberal democratic systems." The letter also tells how the liberal democracies will be shattered. Even here (if slightly diluted), the ideology of martyrdom--You love life, we love death--is propagated: "A bad ending belongs only to those who have chosen the life of this world. . . . A good land and eternal paradise belong to those servants who fear His majesty and do not follow their lascivious selves."
Shiite Islamism confronts us with an adversary who reviles the achievements of modernity as Satan's work, who denounces the international system created after 1945 as a "Jewish-Christian conspiracy," and who therefore wishes to overturn the accepted historiography of the postwar period. At the start of the Holocaust deniers' conference, Foreign Minister Mottaki explained that the problem is that the "wording of historical occurrences and their analysis [are written from] the perspective of the West." As against this "Western" historiography, Islamism wants to create a new historical "truth," in which the Holocaust is declared a myth while the Twelfth Imam is deemed real. The delusional worldview of Holocaust denial is elevated to the norm, and any deviation from it is denounced as a symptom of "Jewish domination."
Even as he is conducting his religious war, Ahmadinejad is also playing the role of a global populist. He addresses his speeches to all the world's "oppressed." He cultivates good relations with Fidel Castro and Hugo Chávez and ingratiates himself with the Western left by using anti-American rhetoric. His use of the word "Zionist" is strategic. It is the Trojan horse by which he makes his anti-Semitism respectable, allowing him to be at once an anti-Semite and Holocaust denier and the ultimate spokesman for the "oppressed nations."
Of course, Iran would not have to rely on Holocaust denial to pursue its strategic objectives. Yet Ahmadinejad insists on the point, in order to provide ideological undergirding to his push to destroy Israel. He also speculates that this project might win the approval of the Europeans. After all, in Europe the delegitimization of Israel has been going on for some time--if for different reasons. Recently the BBC organized a symposium on the question of whether Israel would still exist in 50 years. In a poll taken four years ago in the E.U., 59 percent saw Israel as "the biggest danger to world peace." Even in the United States, a growing number of intellectuals are convinced that Israel and its American supporters are the real source of the problems facing American foreign policy.
The alarm cannot be sounded loudly enough. If Iran is not put under pressure without delay and forced to choose between changing course and suffering devastating economic sanctions, the only remaining alternatives will be a bad one--the military option--and a dreadful one--the Iranian bomb.
SOURCE: LAT (2-18-07)
It was funny, in a grim sort of way. Last week, Secretary of Defense Robert M. Gates responded to Russian President Vladimir V. Putin's polemical attack on the United States by remembering the 50-year Cold War as a "less complex time" and saying he was "almost nostalgic" for its return....
Was the Cold War era, on the whole, a safer one? Ponder the following counterarguments:
First, however tricky our relationships with Putin's Russia and President Hu Jintao's China are nowadays, the prospect of our entering a massive and mutually cataclysmic conflict with either nation is vastly reduced.
We seem to have forgotten that our right-wing hawks argued passionately for "nuking" communist China during the Korean War and again during the Taiwan Straits crisis of 1954. We also have apparently forgotten — although newly released archival evidence overwhelmingly confirms this — how close we came to a nuclear Armageddon during the Cuban Missile Crisis.
Likewise, we've forgotten the shock of the Soviet invasion of Afghanistan in 1979, which prompted then-German Chancellor Helmut Schmidt to ask, "Is this the new Sarajevo?" a reference to the outbreak of World War I. And who still remembers 1984-85, when we were riveted by Jonathan Schell's argument in the New Yorker that even a few nuclear explosions would trigger such dust storms as to produce a "nuclear winter"?
Those were really scary times, and much more dangerous than our present circumstances, because the potential damage that could be inflicted in an East-West conflagration was far, far greater than anything Al Qaeda can do to us now. No one has the exact totals, but the two sides probably had 20,000 missiles pointed at each other, often on high alert. And the threat of an accidental launch was high.
None of today's college-age students were alive in 1945, 1979 or maybe even 1984. None lived with those triangular signs proclaiming their schools to be nuclear bomb shelters.
To recapture those frightening atmospherics these days, university professors must resort to showing Cold War movies: "The Manchurian Candidate," "Fail Safe," "Dr. Strangelove," "The Hunt for Red October," "Seven Days in May," "The Spy Who Came in from the Cold." Students look rather dumbfounded when told that we came close, on several occasions, to World War III.
Yet what if, for example, Josef Stalin had prevented American and British supply aircraft from flying into Berlin in 1948-49? Phew!

The years 1945 to, say, 1990 were horrible on other accounts. Mao Tse-tung's ghastly Great Leap Forward in China led to as many as 30 million deaths, the greatest loss of life since the Black Death. The Soviet Union was incarcerating tens of thousands of its citizens in the gulags, as were most of the other members of the Warsaw Pact. The Indo-Pakistan wars, and the repeated conflicts between Israel and its neighbors, produced enormous casualties, but nothing like the numbers being slaughtered in Angola, Nigeria, the Congo, Vietnam and Cambodia. Most of the nations of the world were "un-free."
It is hard to explain to a younger generation that such delightful countries as Greece, Spain, Portugal, Chile, Brazil, South Africa, Poland and Czechoslovakia (to name only a few) were run in those days by fascist generals, avowed racists or one-party totalitarian regimes. I am ancient enough to remember the long list of countries I would not visit for summer holidays; old enough to recall how creepy it was to enter Walter Ulbricht's East German prison house of a state via Checkpoint Charlie in the late 1960s. Ugh.
Let us not, then, wax too nostalgic about the good old days of the Cold War. Today's global challenges, from Iraq to Darfur to climate change, are indeed grave and cry out for solutions.
But humankind as a whole is a lot more prosperous, a great deal more free and democratic and a considerable way further from nuclear obliteration than we were in Dwight Eisenhower and John F. Kennedy's time. We should drink to that.