Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
Let's face it, making war is fast superseding sports as the American national pastime. Since 1980, overtly or covertly, the United States has been involved in military actions in Grenada, Libya, Nicaragua, Panama, Iraq, Afghanistan, El Salvador, Haiti, Somalia, Yugoslavia, Liberia, Sudan, the Philippines, Colombia, Haiti (again), Afghanistan (again), and Iraq (again) -- and that's not even the full list. It stands to reason when the voracious appetites of the military-corporate complex are in constant need of feeding.
As representatives of a superpower devoted to (and enamored with) war, it's hardly surprising that the Pentagon and allied corporations are forever planning more effective ways to kill, maim, and inflict pain -- or that they plan to keep it that way. Whatever the wars of the present, elaborate weapons systems for future wars are already on the drawing boards. Planning for the projected fighter-bombers and laser weapons of the decades from 2030 to 2050 is underway. Meanwhile, at the Department of Defense's (DoD's) blue-skies research outfit, the Defense Advanced Research Projects Agency (DARPA), even wilder projects -- from futuristic exoskeletons to Brain/Machine Interface initiatives -- are being explored.
Such projects, as flashy as they are frightening, are magnets for reporters (and writers like yours truly), but it's important not to lose sight of the many more mundane weapons currently being produced that will be pressed into service in the nearer term in Iraq, Afghanistan, or some other locale the U.S. decides to add to the list of nations where it will turn people into casualties or "collateral damage" in the next few years. These projects aren't as sexy as building future robotic warriors, but they're at least as dangerous and deadly, so let's take a quick look at a few of the weapons our tax dollars are supporting today, before they hurt, maim, and kill tomorrow.
Set Phasers on Extreme Pain
Recently, the Air Force Research Laboratory called for "research in support of the Directed Energy Bioeffects Division of the Human Effectiveness Directorate." The researchers were to "conduct innovative research on the effects of directed energy technologies" on people and animals. What types of innovative research? One area involved identifying "biological tissue thresholds (minimum visible lesion) and damage mechanisms from laser and non-laser sources." In other words, how excruciating can you make it without leaving telltale thermal burns? And a prime area of study? "Pain thresholds." Further, there was a call for work to: "Determine the effects of electromagnetic and biomechanical insults on the human body." Sounds like something out of Star Trek, right? Weaponry of the distant future? Think again.
In a TomDispatch piece last spring, I mentioned a "painful energy beam" weapon, the Active Denial System, that was about to be field-tested by the military. Recent reports indicate that military Humvees will be outfitted with exactly this weapon by the end of the year.
I'm sad to report that the Active Denial System isn't the only futuristic weapon set to be deployed in the near term. Pulsed Energy Projectiles (PEPs) are also barreling down the weaponry-testing turnpike. They are part of a whole new generation of weapons systems that the Pentagon promotes under the label "non-lethal." The term conveniently obscures the fact that such weapons are meant to cause intense physical agony without any of the normal physical signs of trauma. (This, by the way, should make them -- or their miniaturized descendants -- excellent devices for clandestine torture.)
PEPs utilize bursts of electrically charged gas (plasma) that yield an electromagnetic pulse on impact with a solid object. Such pulses affect nerve cells in humans (and animals), causing searing pain. PEPs are designed to inflict "excruciating pain from up to 2 kilometers away." No one knows the long-term physical or psychological effects of this weapon, which is set to roll out in 2007 and is designed specifically to be employed against unruly civilians. But let's remember, the Pentagon isn't the Food and Drug Administration. No need to test for future effects when it comes to weapons aimed at someone else.
20th Century Weaponry for 21st Century Killing
Just recently the Department of Defense's Defense Contracting Command-Washington put out a call for various technologies capable of "near-immediate transition to operations/production at the completion of evaluation." In other words, make it snappy.
In addition to a plethora of high-tech devices, from laser sights for weapons to battlefield computers, the US Special Operations Forces had a special request: 40mm rifle-launched flechette grenades. For the uninitiated, flechettes are razor-sharp deadly darts with fins at their blunt ends. During the Vietnam War, flechette weaponry was praised for its ability to shred people alive and virtually nail them to trees. The question is, where will those Special Ops forces use the grenades and which people will be torn to bits by a new generation of American flechettes? Only time will tell, but one thing is certain -- it will happen.
The Special Ops troops aren't the only ones with special requests. The Army has also put out a call to arms. While Army officials recently hailed the M240B 7.62mm Medium Machine Gun as providing "significantly improved reliability and more lethal medium support fire to ground units," they just issued a contract to FN Manufacturing Inc. to produce a lighter-weight, hybrid titanium/steel variant of the weapon (known as the M240E6). And these are just a few of the new and improved weapons systems being readied to be rushed onto near-future American battlefields.
Obviously, the military is purchasing guns and other weapons for a reason: to injure, maim, and kill. But the extent of the killing being planned for can only be grasped if one examines the amounts of ammunition being purchased. Let's look at recent DoD contracts awarded to just one firm -- Alliant Lake City Small Caliber Ammunition Company, L.L.C., a subsidiary of weapons-industry giant Alliant Techsystems (ATK):
Awarded Nov. 24, 2004: "a delivery order amount of $231,663,020 as part of a $303,040,883 firm-fixed-price contract for various Cal .22, Cal .30, 5.56mm, and 7.62mm small caliber ammunition cartridges." Work is expected to be completed by Sept. 30, 2006.
Awarded February 7, 2005: "a delivery order amount of $20,689,101 as part of a $363,844,808 firm-fixed-price contract for various 5.56mm and 7.62mm Small Caliber Ammunition Cartridges." Work is expected to be completed by Sept. 30, 2006.
Awarded March 4, 2005: "a delivery order amount of $8,236,906 as part of a $372,586,618 firm-fixed-price contract for 5.56mm, 7.62mm, and .50 caliber ammunition cartridges." Work is expected to be completed by Sept. 30, 2006.
You and I can buy 400 rounds of 7.62mm rifle ammunition for less than $40. Imagine, then, what federal purchasing power and hundreds of millions of dollars can buy!
Alliant Ammunition and Powder Co. is also making certain that, as the years go by, ammo capacity won't be lacking. In February 2005, Alliant was awarded "a delivery order amount of $19,400,000 as part of a $69,733,068 firm-fixed-price contract for Services to Modernize Equipment at the Lake City Army Ammunition Plant" -- a government-owned facility operated by ATK. Alliant notes that this year it is churning out 1.2 billion rounds of small-caliber ammunition at its Lake City plant alone. But that, it seems, isn't enough when future war planning is taken into account. As it happens, ATK and the Army are aiming to increase the plant's "annual capacity to support the anticipated Department of Defense demand of between 1.5 billion and 1.8 billion rounds by 2006." Think about it. This year alone, one single ATK plant will produce enough ammunition, at one bullet each, to execute every man, woman, and child in the world's most populous nation -- and next year they're upping the ante.
The Military-Corporate Complex's Merchants of Death
Once upon a time, a company like ATK would have been classified as one of the world's "Merchants of Death." Then again, once upon a time -- we're talking about the 1930s here -- the Senate was a place where America's representatives were willing to launch probing inquiries into the ways in which arms manufacturers, their huge profits, and their influence on international conflicts were linked to the dead of various lands. Back then, simple partisanship was set aside as the Senate's Democratic majority appointed North Dakota's Republican Senator Gerald P. Nye to head the "Senate Munitions Committee."
While today's fawning House members can barely get aging baseball heroes to talk to them, the 1930s inquiry hauled some of the most powerful men in the world, like J.P. Morgan Jr. and Pierre du Pont, before the committee. Even back in the 1930s, however, the nascent military-industrial complex was just too powerful, and the Senate Munitions Committee was eventually thwarted in its investigations. As a result, the committee's goal of nationalizing the American arms industry went down in flames.
Today, the very idea of such a committee even attempting such an investigation is simply beyond the pale. The planning for futuristic war of various horrific sorts, not to speak of the production and purchase of weapons and ammunition by the military-corporate complex, is now beyond reproach, accepted without question as necessary for national (now homeland) security -- a concept which long ago trumped the notion of national defense.
The Future Is Now
While the military-academic complex and DARPA scientists are hard at work creating the sort of killing machines that a generation back were the stuff of unbelievable sci-fi novels, old-fashioned firearms and even new energy weapons are being readied for use by the American imperial army tomorrow or just a few short years in the future. In February 2005, Day & Zimmerman Inc., a mega-company with its corporate fingers dipped in everything from nuclear security and munitions production to cryogenics and travel services, inked a deal to deliver 445,288 M67 fragmentation hand grenades (which produce casualties within an effective range of 15 meters) to the Army in 2006. In which country will a civilian lose an eye, a leg, or a life as a result? Weapons made to kill are made to be used. This year ATK's Lake City Army Ammunition Plant will produce 1.2 billion rounds of ammunition at the DoD's behest, and the company proudly proclaims, "Approximately 75% of the ammunition produced annually is consumed."
With all those exotic pain rays, flechettes, super-efficient machine guns, and rounds and rounds of ammunition readied for action -- and they represent only a small part of the spectrum of weaponry and munitions being produced for war, American-style -- more people are sure to die, while others presumably will experience "intense pain" from PEP weapons and the like. Back in October of last year, a team of researchers from Johns Hopkins University, Columbia University, and Al-Mustansiriya University in Baghdad, knocking on thousands of doors throughout Iraq, demonstrated that an estimated 100,000 civilians had already died violently as the direct or indirect consequence of the U.S.-led invasion of Iraq. The main cause of these deaths: attacks by coalition (read as "U.S.") forces. The future promises more of the same.
No one should be surprised by these figures -- though many were (and many also continue to deny the validity of these numbers). It's obvious that if you build them, they will kill. And you thought that we were supposed to "err on the side of life"?
Nick Turse is a doctoral candidate at the Center for the History & Ethics of Public Health in the Mailman School of Public Health at Columbia University. He writes for the Los Angeles Times, the Village Voice and regularly for Tomdispatch on the military-corporate complex and the homeland security state.
This article first appeared on www.tomdispatch.com, a weblog of the Nation Institute, which offers a steady flow of alternate sources, news and opinion from Tom Engelhardt, a longtime editor in publishing, the author of The End of Victory Culture, and a fellow of the Nation Institute.
Copyright 2005 Nick Turse
Posted on: Thursday, April 7, 2005 - 18:08
In 1987, after he was exonerated of corruption charges, former Secretary of Labor Raymond Donovan issued the classic plea of the wronged man: "Which office do I go to to get my reputation back?" Whichever office it is, Ahmad Chalabi may want to apply there as well.
The leader of the Iraqi National Congress has been the most unfairly maligned man on the planet in recent years. If you believe what you read, Chalabi is a con man, a crook and, depending on which day of the week it is, either an American or Iranian stooge.
The most damning charge is that he cooked up the phony intelligence that led to the invasion of Iraq. In the words of that noted foreign policy sage Maureen Dowd: "Ahmad Chalabi conned his neocon pals, thinking he could run Iraq if he gave the Bush administration the smoking gun it needed to sell the war."
Such calumnies are so ingrained by now that La Dowd published that sentence on Sunday, three days after the release of the Robb-Silberman report that refutes it. The bipartisan commission headed by Chuck Robb and Laurence Silberman did not give Chalabi a totally clean bill of health. It found that two INC-supplied defectors were "fabricators." But it also determined that the most notorious liar popularly linked to the INC — a defector known as "Curveball" who provided false information on Saddam Hussein's biological weapons — "was not influenced by, controlled by, or connected to the INC."
"In fact, over all," the Robb-Silberman report concluded, "CIA's postwar investigations revealed that INC-related sources had a minimal impact on prewar assessments." Translation: The CIA's attempts to scapegoat Chalabi for its own failures won't wash.
This is only one of many unsubstantiated accusations against Chalabi. Last August, for instance, an Iraqi judge issued an arrest warrant for Chalabi and his nephew, Salem Chalabi. Ahmad was supposedly guilty of counterfeiting, Salem of having an Iraqi official murdered. Within weeks the bizarre charges were dropped for lack of evidence.
Unfortunately, no court of law has examined the accusations made by anonymous U.S. spooks that Chalabi told the Iranian government that one of its codes had been broken by the United States. U.S. officials claimed that they found out Chalabi was the source of the leak because they were able to decode a message to that effect to Tehran. But why would Iranian agents use the compromised code to transmit that information? And how would a foreign national such as Chalabi get access to secret intercepts? Guess we're supposed to take the U.S. intelligence community's word for all this, even though its judgment has been discredited in every outside inquiry.
Then there's the charge that Chalabi was guilty of fraud at a Jordanian bank he once owned. A secret Jordanian military tribunal convicted him in absentia in 1992. Chalabi argues that this was a frame-up by Jordanians eager to seize his assets and curry favor with Hussein. The truth may come out in a lawsuit that Chalabi has filed in the U.S. against the Jordanian government. In the meantime, claims that he's a swindler must be treated with skepticism....
Posted on: Thursday, April 7, 2005 - 15:21
With the passage last week of a budget bill in Israel, the government of Ariel Sharon appears ready to remove, by force if necessary, more than 8,000 Israelis living in Gaza.
In addition to the legal dubiousness of this step and its historically unprecedented nature (challenge to the reader: name another democracy that has forcibly removed thousands of its own citizens from their lawful homes), the planned withdrawal of all Israeli installations from Gaza amounts to an act of monumental political folly.
It also comes as an astounding surprise. After the Oslo round of Israeli-Palestinian negotiations (1993-2001) ended in disaster, many Israelis looked back on Oslo's faulty assumptions and their own naïveté, and resolved not to repeat that bitter experience. Israelis awoke from the delusion that giving the Palestinians land, money, and arms in return for airy-fairy and fraudulent promises would lessen Palestinian hostility. They realized that, to the contrary, this imbalance enhanced Palestinian rejection of the very existence of the Jewish state.
By early 2001, a divided Israeli electorate had largely re-unified. When Mr. Sharon became prime minister in February 2001, a wiser leadership had apparently taken over in Jerusalem, one that recognized the need for Israel to return to toughness and deterrence.
These optimistic expectations were indeed fulfilled for nearly three years, 2001-03. Mr. Sharon engaged in a quite masterful double diplomacy in which he simultaneously showed a cheery face (toward the American government and his leftist coalition partners) and a tough one (toward his Likud constituents and the Palestinians). The purposefulness and underlying consistency of his premiership from the start impressed many observers, including this one; I assessed Sharon's record to be "a virtuoso performance of quietly tough actions mixed with voluble concessions."
Mr. Sharon decisively won re-election in January 2003 over Amram Mitzna, a Labor opponent who advocated an Oslo-style unilateral retreat from Gaza. Mr. Sharon unambiguously condemned this idea back then: "A unilateral withdrawal is not a recipe for peace. It is a recipe for war." After winning the election, his talks in February 2003 about forming a coalition government with Mr. Mitzna failed because Mr. Sharon so heavily emphasized the "strategic importance" of Israelis living in Gaza.
By December 2003, however, Mr. Sharon himself endorsed Mr. Mitzna's unilateral withdrawal from Gaza. While he did so in a spirit very different from the prior Oslo diplomacy, his decision has the same two main characteristics.
First, because the decision to retreat from Gaza took place in the context of heightened violence against Israelis, it vindicates those Palestinian voices arguing for terrorism. The Gaza retreat is, in plain words, a military defeat. It follows on the ignominious Israeli abandonment of its positions and its allies in Lebanon in May 2000, a move which much eroded Arab respect for Israeli strength, with dire consequences. The Gaza withdrawal will almost certainly increase Palestinian reliance on terrorism.
Second, the retreat is heating up the political climate within Israel, bringing back the dangerous mood of exaggeration, incivility, hostility, and even lawlessness. The prospect of thousands of Israelis evicted from their homes under threat of force is rudely interrupting what had been a trend toward a healthier atmosphere during the relative calm of 2001-03.
Mr. Sharon's plans at least have a disillusioned quality to them, sparing Israel the woolly notions of a "new Middle East" that so harmed the country a decade ago. But in another way, Mr. Sharon's plans are worse than Oslo; at least that disaster was carried out by the clueless Left. A Right -- led by Mr. Sharon -- valiantly and staunchly opposed it. This time, it is the Right's hero who, allied with the far Left, is himself leading the charge, reducing the opposition to marginality.
There are many theories for what reversed Mr. Sharon's views on the matter of a unilateral Gaza withdrawal in the 10 months between February and December 2003 -- I have my own ideas about the hubris of elected Israeli prime ministers -- but whatever the reason, its consequences are clear.
Mr. Sharon betrayed the voters who supported him, wounding Israeli democracy. He divided Israeli society in ways that may poison the body politic for decades hence. He aborted his own successful policies vis-à-vis the Palestinians. He delivered Palestinian, Arab, and Muslim rejectionists their greatest boost ever. And he failed his American ally by delivering a major victory to the forces of terrorism.
This article is reprinted with the permission of Daniel Pipes. It first appeared in the New York Sun.
Posted on: Tuesday, April 5, 2005 - 18:08
According to a remarkable article by Scott Macleod in the April 4 issue of Time Magazine, the suicide bomber who carried out the worst atrocity in Iraq since the collapse of the Saddam Hussein regime was a 32-year-old Jordanian who had lived for two years in California.
Ra'ed Mansour al-Banna was born in Jordan in 1973 and grew up in a religious, economically prosperous merchant family. He studied law at the university, graduating in 1996, and then started his own law practice in the Jordanian capital of Amman. After three years, he gave it up and in 1999 he worked a half year without pay for the United Nations High Commissioner for Refugees in Amman, helping Iraqis who fled Saddam Hussein's tyranny.
In 2001, sometime before 9/11, Banna received a visa and moved to the United States, where he apparently lived in California for nearly two years, moving from one unskilled job to another – factory worker, bus driver, and pizza maker. According to his father, Ra'ed even worked "in one of the Californian airports." If Ra'ed did not make it economically, he seemed to fit in well, traveling to such destinations as the Golden Gate Bridge and the World Trade Center, growing his hair long, and taking up American popular music. Photographs sent to his family in Jordan show Banna eating a crab dinner, walking on a beach in California, mounted on a motorcycle, and standing in front of a military helicopter while holding an American flag. He even planned to marry a Christian woman until her parents demanded that the wedding take place in a church.
Banna apparently loved America, reporting back to his family about the people's honesty and kindness: "They respect anybody who is sincere." Talal Naser, a young man engaged to one of Ra'ed's sisters, explained how Ra'ed "loved life in America, compared to Arab countries. He wanted to stay there." His father, Mansour, recounted that, despite the September 11 attacks, Ra'ed "faced no problems with his American workmates, who liked him."
Banna visited home in 2003 but on his return to the United States he was denied entry, accused of falsifying details on a visa application. He returned to Jordan and became withdrawn, holing up in a makeshift studio apartment, sleeping late, and displaying a new interest in religion. He began praying five times a day and listening to the Koran. In November 2004, he went on pilgrimage to Mecca, returning to Saudi Arabia in January 2005.
On Jan. 27, Banna crossed into Syria, presumably on the way to Iraq. He apparently spent February with Sunni jihadis in Iraq, during which time he called home several times, with the last call on about Feb. 28.
Feb. 28 also happens to be the date when Banna suited up as a suicide bomber and blew himself up at a health clinic in Al-Hilla, killing 132 people and injuring 120, the worst such attack of the 136 suicide bombings that have taken place since May 2003. On March 3, the family received a call informing them of Ra'ed's fate: "Congratulations, your brother has fallen a martyr."
A friend revealed that Banna became politically radicalized against American policies in the Muslim world while living in the United States. He was especially distraught about developments in Iraq. A neighbor, Nassib Jazzar, recalled Banna upset with the coalition occupation: "He felt that the Arabs didn't have honor and freedom."
The father notes that Ra'ed wore Western-style clothing, rarely went to mosque, and was ignorant of the names of local sheikhs: "I am shocked by all of this because my son was a very quiet man, not very religious and more interested in pursuing his law profession and building a future for himself."
As Time cautiously concludes from this tale,
On the basis of accounts given by his family, friends and neighbors, Ra'ed apparently led a double life, professing affection for America while secretly preparing to join the holy war against the U.S. in Iraq. "Something went wrong with Ra'ed, and it is a deep mystery," says his father Mansour, 56. "What happened to my son?"
Ra'ed al-Banna's biography inspires several observations:
(1) When it comes to Islamist terrorists, appearances often deceive. That Banna was said to "love life in America," be "not very religious," and be interested in "building a future for himself" obviously indicated nothing about his real thinking and purposes. The same pattern recurs in the biographies of many other jihadis.
(2) Moving to the West often spurs Muslims to despise the West more than they did before they got there. This appears to be what happened with Banna.
(3) Taking up the Islamist cause, even to the point of sacrificing one's life for it, usually happens in a discreet manner, quite unobservable even to a person's closest relatives.
In brief, Banna's evolution confirms the point I have made repeatedly about the regrettable but urgent need to keep an eye on all potential Islamists and jihadis, which is to say Muslims.
Posted on: Monday, April 4, 2005 - 20:48
Chalmers Johnson, in In These Times (3-31-05):
The Rubicon is a small stream in northern Italy just south of the city of Ravenna. During the prime of the Roman Republic, roughly the last two centuries B.C., it served as a northern boundary protecting the heartland of Italy and the city of Rome from its own imperial armies. An ancient Roman law made it treason for any general to cross the Rubicon and enter Italy proper with a standing army. In 49 B.C., Julius Caesar, Rome's most brilliant and successful general, stopped with his army at the Rubicon, contemplated what he was about to do, and then plunged south. The Republic exploded in civil war, Caesar became dictator and then in 44 B.C. was assassinated in the Roman Senate by politicians who saw themselves as ridding the Republic of a tyrant. However, Caesar's death generated even more civil war, which ended only in 27 B.C. when his grand nephew, Octavian, took the title Augustus Caesar, abolished the Republic and established a military dictatorship with himself as "emperor" for life. Thus ended the great Roman experiment with democracy. Ever since, the phrase "to cross the Rubicon" has been a metaphor for starting on a course of action from which there is no turning back. It refers to the taking of an irrevocable step.
I believe that on November 2, 2004, the United States crossed its own Rubicon. Until last year's presidential election, ordinary citizens could claim that our foreign policy, including the invasion of Iraq, was George Bush's doing and that we had not voted for him. In 2000, Bush lost the popular vote and was appointed president by the Supreme Court. In 2004, he garnered 3.5 million more votes than John Kerry. The result is that Bush's war changed into America's war and his conduct of international relations became our own.
This is important because it raises the question of whether restoring sanity and prudence to American foreign policy is still possible. During the Watergate scandal of the early '70s, the president's chief of staff, H. R. Haldeman, once reproved White House counsel John Dean for speaking too frankly to Congress about the felonies President Nixon had ordered. "John," he said, "once the toothpaste is out of the tube, it's very hard to get it back in." This homely warning by a former advertising executive who was to spend 18 months in prison for his own role in Watergate fairly accurately describes the situation of the United States after the reelection of George W. Bush.
James Weinstein, the founding editor of In These Times, recently posed for me the question "How should US foreign policy be changed so that the United States can play a more positive role on the world stage?" For me, this raises at least three different problems that are interrelated. The first must be solved before we can address the second, and the second has to be corrected before it even makes sense to take up the third.
Sinking the Ship of State
First, the United States faces the imminent danger of bankruptcy, which, if it occurs, will render all further discussion of foreign policy moot. Within the next few months, the mother of all financial crises could ruin us and turn us into a North American version of Argentina, once the richest country in South America. To avoid this we must bring our massive trade and fiscal deficits under control and signal to the rest of the world that we understand elementary public finance and are not suicidally indifferent to our mounting debts.
Second, our appalling international citizenship must be addressed. We routinely flout well-established norms upon which the reciprocity of other nations in their relations with us depends. This is a matter not so much of reforming our policies as of reforming attitudes. If we ignore this, changes in our actual foreign policies will not even be noticed by other nations of the world. I have in mind things like the Army's and the CIA's secret abduction and torture of people; the trigger-happy conduct of our poorly trained and poorly led troops in places like Iraq and Afghanistan; and our ideological bullying of other cultures because of our obsession with abortion and our contempt for international law (particularly the International Criminal Court) as illustrated by Bush's nomination of John R. "Bonkers" Bolton to be US ambassador to the United Nations.
Third, if we can overcome our imminent financial crisis and our penchant for boorish behavior abroad, we might then be able to reform our foreign policies. Among the issues here are the slow-moving evolutionary changes in the global balance of power that demand new approaches. The most important evidence that our life as the "sole" superpower is going to be exceedingly short is the fact that our monopoly of massive military power is being upstaged by other forms of influence. Chief among these is China's extraordinary growth and our need to adjust to it.
Let me discuss each of these three problems in greater depth.
Posted on: Sunday, April 3, 2005 - 01:46
"Social Security is the soft underbelly," says right-wing activist Stephen Moore. "If you can jab your spear through that, you can undermine the whole welfare state." But things aren't going well for the jabbers. Their claims of crisis have been proved false, their motivations meanly ideological (shredding the New Deal safety net) or merely mercenary (billions in profits for Wall Street). It's tempting to think they've overreached, as did Newt when he tried to shutter the government, and that with facts on our side, saving Social Security could be a slam-dunk.
Big mistake. This battle will not be settled by fact-laden presentations, any more than the last election was, because no matter how compelling such messages may seem, recipients filter them through a pre-existing mesh of views and values--accepting, adjusting, resisting or rejecting incoming exhortations, depending on how they comport with previously accumulated information and attitudes. Given that citizens are being asked to transfer custodianship of their future from the government to the financial marketplace, it's crucial to understand what people think and how they feel about Wall Street, to understand how it's seen in the culture at large, to know whether its stock has been rising or falling.
Which makes the arrival of Steve Fraser's book, an account of how Americans have perceived Wall Street over the past 200 years, incredibly timely. But timeliness is not its only virtue. Every Man a Speculator: A History of Wall Street in American Life is fascinating in its own right. Though the title suggests a focus on financial affairs, it belongs on the shortlist of books that encompass and illuminate the entire trajectory of the American experience. That's because Fraser knows that Wall Street is far more than a workplace for bankers and brokers; rather, it is a place where Americans "have wrestled with ancestral attitudes and beliefs about work and play, about democracy and capitalism, about wealth, freedom and equality, about God and Mammon, about heroes and villains, about luck and sexuality, about national purpose and economic well-being."
Every Man's ambitious breadth is matched by investigative depth. In exploring Wall Street's cultural impact, Fraser has drawn on an astonishing array of sources--potboiler novels and classics of American literature, biographies and memoirs, magazine pieces and poems, movies and plays, speeches and sermons, cartoons and board games, folk songs and Tin Pan Alley ditties, artworks and radio shows, legislative deliberations and judicial decisions, the work of academic economists and mass-movement pamphleteers. The range and abundance of material he has uncovered gives the book great authority--though also, in places, an overstuffed, repetitious and convoluted quality that slows the narrative flow: The manuscript would have benefited from one last pass through the editorial wringer. But the writing is so vivid, the sensibility so witty, the analysis so theoretically astute (yet utterly jargon free), that readers will shoot successfully through the occasional eddies and rapids.
So what can we glean from Fraser about the cultural resources available to contending parties in the Social Security debate? One of the most striking features of his survey is how starkly Manichean the responses to Wall Street have been over the centuries. From the very beginning--just after the Revolution, when, freed from Britain's imperial constraints, banking and stock trading emerged--Wall Street has been both revered and reviled. In part this mirrors its dualistic and contradictory character. It's been a site for assembling capital to fertilize productive enterprises. It's also been the place to speculate in ownership shares of such enterprises--speculation that has had beneficial but also deleterious and, at times, catastrophic consequences.
In the 1790s, wealthy merchants leapt to buy and trade public securities, to the delight of Treasury Secretary Alexander Hamilton, who, Fraser notes, "conceived of the Street as an engine of future national glory." Hamilton was sure these ur-speculators would invest their profits in long-term economic growth. Instead, in Wall Street's ur-burst of irrational exuberance, they chased after instant riches by plunging back into the market itself, led by Hamilton's close associate William Duer, the Darth Vader of early American finance. Drawing on insider information, Duer manipulated a bull market that sucked in unwary investors, then overreached and went belly up, precipitating the country's first stock market crash and first recession (a short one, given Wall Street's marginality to the existing agrarian economy).
Hamilton, dismayed by what he called "extravagant sallies of speculation," conceded that stock trading "fosters a spirit of gambling." But he resisted market regulation, fearing idle capital more than chaotic conditions and hoping that "public infamy" would draw the line between "dealers in the funds and mere unprincipled gamblers." Thomas Jefferson, appalled that (as he put it) "the credit and fate of the nation seem to hang on the desperate throws and plunges of gambling scoundrels," doubted such a line could be drawn. The antidemocratic implications of the links Hamilton was forging between financial elites and the federal government deeply troubled Jefferson, whose critique of monied aristocrats as threats to the Republic helped pave the way for his party's political triumph in 1800.
This initial debate, argues Fraser, foreshadowed the terms and tone of nineteenth- century combat over the steadily expanding role of financiers in the growing American economy, with Wall Street's status rising and falling with the country's economic fortunes. During upswings, speculators gained kudos for mobilizing investment in canals, railroads and industrialization: Their willingness to place bets on the future made them emblems of America's buoyant optimism and entrepreneurial audacity. And while few Americans as yet actually participated in the financial marketplace, the fortunes won there by clashing cliques of bulls and bears stimulated avaricious fantasies among a far wider circle. The fact that men of plebeian background could get rich by speculative wheeling and dealing also gave the Street a democratic sheen in a democratic era.
Wall Street titans were also credited with warriorlike virtues, even sexual prowess. Fraser is particularly sharp in spotting the emerging association of financiers, from at least the days of August Belmont and Commodore Vanderbilt, with rebellion against the buttoned-down standards of bourgeois masculinity. Freebooting speculators like Daniel Drew and Jim Fisk had bravado and dash--"I was born to be bad," Fisk boasted--and their defiance of conventional proprieties gave them a raffish glamour. Ordinary folks relished reading dime-novel accounts of their swashbuckling exploits, drinking their health while belting out barroom favorites like "Jim Fisk, or He Never Went Back on the Poor" and poring over inspirational accounts like James D. McCabe's Great Fortunes and How They Were Made (1870).
But even in flush times, when Wall Street exerted a mass fascination, it also inspired mass revulsion. Pious Protestants considered speculation a form of gambling and thus a sin, not least in dangling delusions of effortless gain before the improvident. "Wall Street is a thousand times deadlier than Monte Carlo," hissed a character in the 1887 Broadway play The Henrietta. At best, its rewards were ill gotten, leeched off the enterprise of others, not earned the old-fashioned way, through pluck, perseverance and productive labor. At worst, it seemed to be a species of con game--a conviction bolstered by the steadily rising number of frauds, defalcations and market manipulations. The Street's "code of laws," the Louisville Courier editorialized in 1857, was that of "the sharper, the imposter, the cheat, and the swindler," a view echoed in Herman Melville's mordant Confidence-Man. From this perspective, speculators were not lovable rogues but menacing outlaws. Jay Gould--whose failed effort to corner the nation's gold market in 1869 precipitated a crisis that shuttered hundreds of businesses and cost thousands of workers their jobs--suffered universal execration. Gould, said Gustavus Myers in his History of the Great American Fortunes, was a "pitiless human carnivore, glutting on the blood of his numberless victims...an incarnate fiend"; Hamilton couldn't have wished for a greater degree of "public infamy," but it did nothing to rein in speculation.
Worst of all was the ever more evident link between Wall Street's investment mechanisms and cyclical derangements like the panics of 1837, 1857, 1873 and 1893, each of which ushered in depressions and increasingly violent capital-labor conflicts. Among those most alienated by a competitive capitalism run amok were key figures on Wall Street itself. Investment bankers like J.P. Morgan took a longer-term, more systemic view of the national economy. They loathed speculators (whom they considered sociopathic gamblers) and the wild "free market" whose gyrations they engendered. From the 1870s on, Morgan and his colleagues struggled to limit self-destructive competition, and by century's end they had engineered the massive consolidations that ushered in corporate capitalism, establishing a privately owned command economy run largely by financiers like themselves.
For Fraser the heyday of finance capitalism that followed--he calls it the Age of Morgan and dates it, reasonably enough, from 1890 to 1920--marks both a distinct phase in the development of US capitalism and a cultural breakthrough for Wall Street, the moment when it attained middle-class respectability. Many Americans, he finds, especially in the burgeoning ranks of well-paid professionals and managers, accepted Wall Street's hegemony for much the same reason they voted for McKinley Republicans: as a bulwark against the mob and economic crises. Bankers and entrepreneurial grandees were lionized as financial and industrial Napoleons in popular fiction and middle-class magazines. Speculators were rehabilitated as well--no longer mere gamblers, they were anointed as professionals in their own right, practitioners of the "science" of speculation, members of the "securities industry," which irrigated industrial growth and opened speculative opportunities for the upper middle class. Academic economists offered guilt-free defenses of stock and commodities exchanges (and derided critics as ignorant rustics), while pragmatist philosophers like William James and Charles Sanders Peirce re-evaluated (as Fraser astutely observes) the very notion of chance, making it seem, like market risk, "both inescapable and controllable."
Yet the new Wall Street met with considerable opposition, too. Some critiques were rooted in the older economic and cultural order, like those of the genteel, old-monied intelligentsia (Henry Adams, Henry James, Edith Wharton), who deplored the financial oligarchy for its vulgarity and avarice. In The House of Mirth, as Fraser observes, Wharton provided a withering portrait of "the lethal moral and social psychology that lay concealed beneath the frivolity and mercenary pre-occupations of the Wall Street world"; James, for his part, expressed acute discomfort in The American Scene with the Street's "pushing male crowd."
Prairie farmer populists also had one foot in the vanishing moral economy. True, they straightforwardly castigated Eastern bankers and bondholders as regional exploiters, a charge neatly captured by an earthy sketch (in the bestselling tract Coin's Financial School) of a continentally scaled cow feeding in the West while getting milked in New York. But "Morganization," Fraser makes clear, threatened a way of life, not just a source of income, and it stirred ancient cultural anxieties. Populists produced ferocious denunciations of the "Money Kings of Wall Street" that depicted them as fiendish conspirators and sybaritic plutocrats out to overturn the moral foundations of the Republic, rhetoric that occasionally curdled into anti-Semitic rant. But their resistance was forward-looking as well as retrograde, Fraser insists, noting that their "ultimate goal was to wrest control of the monetary system from the Wall Street elite and vest it in the hands of the U.S. Treasury."
Other opponents (small businessmen, urban workers, many professionals) were more thoroughly enmeshed in modern industrial life; they accepted the new corporate order but rejected the financial elite's overlordship as pernicious and irrational. These Progressives devoured the writings of muckrakers--critics who deflated the puffery surrounding the heroes of finance and industry, but did so in a pragmatic, empirical and secular fashion. Assaults on "predatory wealth" accelerated after the Panic of 1907. Exposés by investigative journalists (Charles Edward Russell's Lawless Wealth), jurists (Louis Brandeis's Other People's Money) and renegade speculators (Thomas Lawson's Frenzied Finance) demonstrated that Wall Streeters traded on inside information, produced their own speculative booms and crises, blocked fruitful competition, suffocated innovation, made obscene amounts of money and used it to corrupt politicians and the press. These critiques, as Fraser shows, flowed out into ever wider channels of popular culture, from the Broadway musical (one play featured a chorus of demons chanting, "This seat's reserved for Morgan/That great financial Gorgon") to the newly arrived motion picture (in D.W. Griffith's 1909 film A Corner in Wheat, a wheat king is literally buried alive by his ill-gotten gains).
Once again, cultural subversion facilitated political initiatives. In the Progressive era many of Wall Street's opponents rallied around Theodore Roosevelt. As the scion of an ancient-monied clan, Roosevelt possessed both a "highly developed sense of social obligation" and a contempt for financial plutocrats that "was bred in the bone, part of an upbringing that dismissed materialistic strivings as unworthy, debilitating, and effeminate." Woodrow Wilson and Eugene Debs--TR's opponents in the election of 1912--had different vantage points, different social backing and different approaches to curbing the "Money Trust." But the overall popular mood had clearly turned against Wall Street and modest reforms ensued, including the Federal Reserve Act (1913), though the arrival of World War I, which Wall Street played a crucial role in underwriting, short-circuited the Progressive crusade.
The ensuing 1920s-30s boom-bust cycle followed the well-established sine curve trajectory, but the oscillations were far more extreme, producing what Fraser calls Wall Street's first "season in utopia," which led to the most serious challenge yet to private control of the nation's capital markets.
The 1920s surfed in on a wave of commodities, and Wall Street regained its luster by channeling capital to frontier technologies like radio and movies, airplanes and autos. The stock market again shook off its moral stigma, its Dionysian aspects resurfaced and speculating on margin became as sexy as wearing short skirts and drinking bathtub gin. Indeed, "playing" the market, Fraser suggests, became part of a more epochal development--the shouldering aside of the Puritan work-and-thrift moral economy by a shiny new culture of consumption. Constraints on corruption eased, too: Rogues were back in vogue, and stock pools--conspiracies by insider wolves to shear outsider lambs--were reported like sporting events.
With the Dow on a roll, a broader (yet still narrow) swath of the population rushed to get in on the action--heightening the aura of democratization--but the biggest profits still rose to the top, where they triggered another round of profligate consumption. Mellon Republicans green-flagged the process--approving mergers, slashing taxes on the rich--while pocketing initial stock offerings before they went public, burnishing crony capitalism to glossy perfection. Yet seldom was heard a discouraging word. Religious critics lacked fervor and moral authority, while surviving Populist and Progressive skeptics were dismissed as killjoys or cranks. Even Henry Ford's anti-Semitic ravings about Wall Street's sinister influence on the country's moral fiber failed to gain traction. Wall Street oracles smugly announced the arrival of a "new era," the end of history.
The Crash punctured this fantasy. The Street's most august figures were exposed as cheats and felons--a pillar of rectitude like New York Stock Exchange president Richard Whitney went off to Sing Sing wearing his bowler hat. Wall Street was reimagined as a bestiary of parasites, gamblers and noxious con men--a return of the repressed popular iconography of shame. Many, like F. Scott Fitzgerald, reconsidered the 1920s as one long drunk in which stocks had functioned like booze (as Fraser summarizes Babylon Revisited, Fitzgerald's post-mortem) "lubricating childish daydreams about an eternity of good times, anesthetizing any sense of responsibility, fostering a careless and criminal negligence." Meanwhile, theoreticians like John Maynard Keynes were deriding the purported "science" of stockbrokerage as a regime in which "enterprise becomes the bubble on a whirlpool of speculation." Worst of all, Wall Street was widely ridiculed. "Laughter is a punishing historical sentence," Fraser observes, and the flood of lampooning cartoons, literary satires and iconoclastic biographies "whittled away the puissance of the old ruling class."
This latest round of cultural subversion fatally compromised Wall Street's ability to hold its own against New Deal reformers. FDR--who, like TR, inherited a patrician disdain for the nouveaux riches--declared that "we cannot allow our economic life to be controlled by that small group of men whose chief outlook upon the social welfare is tinctured by the fact that they can make huge profits from the lending of money and the marketing of securities." The financial world was "subjected to a real if flawed public supervision under the New Deal"; social democratic programs like Social Security were set in place; and state-capitalist investment mechanisms loosened Wall Street's chokehold on capital formation--even when it came to financing World War II, the Street remained a junior partner to government.
Fraser is very good on the New Deal era--not surprisingly, as he's given us a superb biography of Sidney Hillman and co-edited an essential collection about the period--but he's particularly smart about the 1940s-60s boom, the first in which Wall Street failed to instantly overcome Depression-era obloquy. The 1930s had been too searing to be soon forgotten: Postwar America remained security-conscious, and flamboyant speculators stayed in the doghouse. Once again a gray-flannel Wall Street presented itself as the prudential guardian of widows and orphans, and as a patriotic bulwark against un-Americanism (broker Charles Merrill argued that nothing "would provide a stronger defense against the threat of Communism, than the wide ownership of stocks in the country"). Unfortunately, the little investors remained gun-shy, and the new institutional purchasers like pension funds and insurance companies bought only the bluest of blue-chip stocks. Worse, the great corporations now financed expansion and innovation largely out of internal capital. The result, Fraser argues, was that Wall Street, having been for a century "an essential element of the country's cultural iconography," now "vanished from the front page and lived out its life in the business section of the daily newspaper." Financiers, to be sure, became commanding figures in the economic and defense establishment. But Wall Street itself, "once a main thoroughfare running through the American imagination, now seemed deserted." Some swagger resurfaced in the go-go 1960s, only to vanish in the crash of 1970, which ushered in a decadelong period of stagflation and decline that Keynesian remedies failed to reverse.
The convergence of economic crisis with military defeat in Vietnam, Fraser suggests, left the Wall Street establishment open to another onslaught, this time from the right. In the 1980s, a rising generation of speculators like Ivan Boesky and Carl Icahn, more plebeian and outer-borough than the white-shoe old guard, stormed the twin fortresses of the ancien régime wielding "a capitalist version of liberationist theology." Wall Street warriors and Reagan Republicans repudiated big government and launched a counterrevolution against the New Deal. They also lit into the "Corpocracy" of complacent industrial managers, claiming that corporate takeovers and makeovers would revive the economy and succor disenfranchised shareholders.
Supply-side zealots, mergers and acquisitions wizards, greenmailers and assorted asset strippers, Fraser points out, failed to accomplish their professed goals--hostile mergers impaired efficiency, 1980s investment in plants and equipment sank below 1970s levels and S&L bailouts wasted billions--but they did manage to reignite the speculative sector. A torrent of capital surged into paper entrepreneurism--an endless reshuffling of nominal assets--and Wall Street went from being the economy's spark plug to being its engine, a structural shift (from finance capitalism to financialized capitalism) akin to the revolution wrought during the Age of Morgan.
The 1980s also revived--and deepened--the country's infatuation with Wall Street. With Americans hungry for signs that the days of defeat and decline were over, strident macho posturing by bond traders in suspenders seemed a Viagraesque antidote for a "wilted national masculinity." Corporate raiders, greenmailers and "hot-to-trade portfolio managers with ice in their veins" made out like financial samurai, sporting copies of The Art of War by Sun Tzu in an atmosphere, Fraser notes, that "reeked of pure male fantasy." Scorning what guru "Adam Smith" (a k a George Goodman) had derided in The Money Game as, in Fraser's words, the "stultifying emphasis on safety and security that had settled over the markets since the war," they chased after the biggest and quickest payoffs. Shareholders were happy--and their numbers exploded--as junk-bond financing generated handsome returns for those in mutual funds, pension funds and investment clubs (there were 7,000 of the latter by the end of the 1980s). And Wall Street resurfaced, big-time, in popular culture. Magazines like Success, Millionaire and Vanity Fair published lavish accounts of the "New Tycoonery"; soap operas (Dynasty, Dallas) doted on the doings of the nouveaux riches; and a multitude of media retailed each Gilded Age excess from masquerade balls to kitsch Hamptons palaces.
The 1987 crash momentarily soured the national mood, as did revelations (Liar's Poker, Den of Thieves, The Predators' Ball, Barbarians at the Gate) of gross corruption on the part of radical outsiders turned privileged insiders. Mike Milken and Ivan Boesky were sent to the slammer, and there was a brief reckoning with the social costs of the latest Great Barbecue--zooming homelessness, industrial collapse, declining wages, growing inequality. In popular culture, the trader went from being a Master of the Universe to being a villain, best exemplified by Gordon Gekko, the pitiless parasite played by Michael Douglas in Oliver Stone's film Wall Street ("I create nothing. I own," Gekko gloats). But much of the criticism, Fraser argues, was more ironic than indignant, as if any hope for serious reform had been abandoned. There was no stopping the free-market utopians' cultural momentum, in part because the crisis was quickly contained and the paper boom roared on.
For Fraser, the 1990s were the moment when a decisive cultural counterrevolution was wrought. At the decade's beginning, Wall Street's reputation rebounded, thanks to its financing of Internet innovation. Toward its end, as dot-com development reached its reality-based limits, capital sluiced instead into faith-based speculation. Ongoing casinoization was spurred by Republicans and Clintonian Democrats alike, the latter having shed their New Deal social conscience and converted to free-market orthodoxy. Together they cut capital gains taxes, deregulated the financial sector and turned a blind eye to the growth of gigantic speculative vehicles (the misnamed "hedge" funds), freed from government supervision but ready to accept government rescue.
Middle- and upper-class Americans piled into the market, assured that "risk-controlling" techniques--courtesy of university departments of "financial engineering"--had all but eliminated the downside. For the first time, roughly half the population participated, passively or actively: By 1998 there were 3,513 mutual funds. Wall Street seized hold of the popular imagination. TV viewers glued themselves to CNN-FN, Bloomberg and CNBC, captivated by financial analysts, business talk shows and real-time tickers scrolling (from 1996 on) across the bottom of their screens. At bookstores, parents snatched up bestsellers like Dow 36,000 and The Roaring 2000s: Building the Wealth and Lifestyle You Desire in the Greatest Boom in History, and took home Wow the Dow for the kiddies--who as likely as not were playing The Stock Market Game in grade school, or joining high school investment teams and competing in tournaments.
Media pundits and think tanks hailed this popular participation as a breakthrough for democracy--a triumphalism, as Fraser shrewdly notes, that mirrored American exaltation at winning the cold war. We were now the "Shareholder Nation," with the financial marketplace replacing the town meeting as the place where citizens were freest to determine their fate. The 1990s were also the moment, Fraser believes, when the merger between Wall Street and the wider consumer culture that began in the 1920s was finally consummated. Not only had the capital casino become a 24/7 playground in its own right, but the willingness (or need) of average people to spend tomorrow's anticipated capital gains today sustained purchasing in other sectors of the economy. Ordinary Americans did make money in the 1990s, though with the boom at white heat, 86 percent of the gains accrued to the top 10 percent. And the top 1 percent, which had 19.9 percent of the country's wealth going into the decade, had 40 percent of it coming out. Another round of gilded arrogance and rococo extravagance ensued, but this time there were virtually no complaints about "barbarians" or "vanities": "The '90s," Maureen Dowd noted, "are the '80s without the moral disgust."
In 2000 the bubble burst, with big losses all around. Stunning frauds were revealed that implicated virtually the entire Wall Street establishment. Corporate directors had cooked books (with help from banks and accounting firms), propped up share prices to cash in on stock options and looted pension funds on their way out the door. But this time, there was no backlash. The political fallout was minimal, the legislative response toothless.
Fraser takes this lack of reaction as evidence that a Rubicon has been crossed. The popular culture that for two centuries cast a cold or at least an ambivalent eye on Wall Street has now assumed a "stance of fateful inevitability about the reign of the free market." The national imagination that "once peopled the Street with usurers, monopolists, con men, aristocrats, and sinners" is now "a dimming memory." This waning of cultural antipathy has in turn sapped political energy; the wellsprings of outrage seem to have dried up, along with "the instinct to collectively resist the usurpations of presumptuous wealth." Wall Street, "for the moment at least," seems to have "won the war for hearts and minds."
The only thing that might conceivably reverse this triumph, Fraser thinks, would be another truly systemic crisis. But given the far greater centrality of today's financial sector, and the intricate interconnectedness of the international economy, a financial earthquake higher up on the Richter scale than that of 2000 might well trigger a global economic tsunami of horrendous proportions--which could, as Fraser is well aware, have horrific political consequences. Every Man thus ends on a dark and dispirited note.
I think this is too pessimistic. Let's set aside the question of the degree to which 9/11 absorbed and deflected the popular outrage that might otherwise have burst over the heads of Enronizers and privateers, turning princes back into frogs. And let's assume Fraser is right in attributing current complacency to the fact that the cultural underpinnings of resistance have been eroded, the moral goalposts moved down the field. Still, it's worth considering the possibility that the political attack on Social Security might prove the functional equivalent of an economic collapse and provide the shock needed to delegitimize reigning elites and revitalize popular opposition. It's better than an equivalent, in fact, because we confront not a sudden catastrophe that might encourage a panicky acceptance of authority but an Iraq-style voluntary war that affords us time to mobilize resistance in the court of public opinion.
In doing so, Every Man a Speculator can be a valuable resource. Fraser's reconnaissance of Americans' longstanding reservations about Wall Street has mapped the location of ancient stress points that are worth re-examining for signs of contemporary strain; and his assessment of previously proffered reforms can help distinguish outmoded critiques, inextricably tied to vanished moral economies, from proposals of continuing salience, if updated and refashioned.
Thus, it would clearly be witless to denounce speculation as gambling, and gambling as a sin. But I suspect that a culture chockablock with chapters of Gamblers Anonymous remains hip to the kinship between Wall Street-style irrational exuberance and Vegas-style bingeing, hence cautious about transferring nest eggs to casinos. On the corruption front, while cynicism about the possibility of checking financial fraud is understandably rampant, I'd bet that even those prepared to roll the dice with their retirement funds would prefer not to play in a rigged game, and might well back calls to quintuple the SEC's budget and put a pit bull like Eliot Spitzer in charge of the croupiers.
Similarly, while the social gospel is clearly in remission, surely the vast numbers of church people now running the nation's innumerable soup kitchens include many who find ethically despicable the zestful spear jabbers' lack of concern for those who would face an impoverished old age, should smiley-faced market projections prove as wrong as they have repeatedly in the past. Such community-minded activists have also probably spotted "ownership society" flummery for what it is: the latest version of "It's your money" self-centeredness, an irresponsible repudiation of social solidarity.
The nineteenth century's Victorian family code is definitely passé, but Fraser's intriguing identification of a centuries-long link between hypermasculinity and speculative excess should alert us both to the possible appeal of such macho posturing to some younger male voters, and its likely alienating effect on less testosterone-driven constituencies. (How's "Would you trust your rainy-day fund to Gordon Gekko?" as a slogan?) Moreover, there must be many Americans who see the privatizers' ugly effort to divide children from parents for what it is--a menace to contemporary family values. Most people know full well that Social Security has not only been a lifesaver for the old but has provided a measure of independence to the young, shifting some of the burden of caring for aged parents to the country's broad collective shoulders.
Fraser's recounting of how often euphoric complacency has given way to rude awakening might well resonate with those who lived through the 1990s boom and meltdown. People who personally experienced ulcer-making anxieties about the future as they watched their 401(k)s sag--and who agree the record suggests such lurches will likely recur--might be ready to oppose efforts to end psychic security as we've known it and pitch us into a Pepcid 'R' Us society.
Add these up and you've got a fair number of purchase points for moral resistance to self-serving ideologues like Stephen "Jabber" Moore, who would return us to a tooth-and-claw world. But Fraser offers far more than assistance on the Social Security front. His sweeping historical reconstruction is a powerful reminder that our current economic arrangements are the product of centuries of debate and struggle, not the inevitable legacy of invisible "market forces." In historicizing (and thus demystifying) the present, Every Man joins the long list of acts of cultural subversion, and invitations to political action, so admirably chronicled in its pages.
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Saturday, April 2, 2005 - 03:07
President Bush says we must “stay the course” in Iraq, and he promises to continue during his second administration the radical foreign and domestic policies laid out during his first term. We believe it is time to change course.
But can the course of U.S. foreign policy ever truly be altered?
Has there ever been a model for a dramatic shift away from militarism and unilateralism toward international cooperation and peace?
The answer to these questions is yes.
In the late 1920s, the State Department, Commerce Department, and War Department were all weary of staying the course. Reacting to popular protest and rising concern from business, Washington and Wall Street began turning away from territorial acquisition and imperialism as preferred instruments of U.S. foreign policy. Instead of considering it the mission of a “master race” to manage the affairs of the “weaker races,” as Teddy Roosevelt had, leaders in politics and commerce now spoke about the need for nations to be good neighbors.
The Good Neighbor Policy of the Franklin D. Roosevelt presidency in the pre-World War II period marked a dramatic turn in U.S. foreign affairs. The new policy constituted a public repudiation of imperialism, cultural and racial stereotyping, and military interventions and occupations.
Can such a far-reaching reversal be replicated?
If history is a guide, then again the answer is yes.
U.S. foreign policy is once again at a crossroads, and its present course could be disastrous. One way out of the current morass is to look back to the interwar period and see what lessons it holds for us today.
Can Roosevelt’s Good Neighbor Policy of the 1930s provide a model for a Global Good Neighbor Policy for the 21st century?
The seeds of FDR’s new foreign policy had already been planted. By the early part of the 20th century it was becoming clear that territorial conquests, military intervention, and occupations were proving costly and counterproductive. The Spanish-American War of 1898 had proved the new global reach of U.S. military power, but the Yanquis quickly found that the Cubans and Filipinos hated them as much as they hated the Spanish.
“Big Stick” policies did not stabilize and democratize the countries where the United States intervened but instead incited armed popular rebellions. As in Iraq, post-intervention attempts to suppress these insurrections and impose order cost the United States more in lost lives and financial resources than the initial interventions.
As early as 1904, novelist and political humorist Mark Twain warned of the moral and political hazards to the United States if it continued to follow the imperialist path to progress. Twain observed that by occupying the Philippines the United States was committing “one grievous error, that irrevocable error,” playing the “European game” of imperialism and colonization.
By the late 1920s, a new consensus was emerging among Washington political leaders and Wall Street barons. After three decades of imperial conquest, followed by the Gunboat Diplomacy of military occupations and the Dollar Diplomacy of heavy-handed financial control of other nations, the country’s elites found themselves agreeing with the popular wisdom of Mark Twain.
In election year 1928, both Republican candidate Herbert Hoover and Democratic Party leader Franklin D. Roosevelt advanced a new vision of international relations in which diplomacy and commerce would trump brute financial and military power. In a Foreign Affairs article in 1928, Roosevelt wrote that by seeking the “cooperation of others we shall have more order in this hemisphere and less dislike.”
Following his victory, President-elect Hoover undertook a goodwill trip to Central America. Citing complaints about Washington’s overbearing and interventionist behavior, Hoover announced that a new policy was in the offing. “We have a desire to maintain not only the cordial relations of governments with each other,” he said, “but also the relations of good neighbors.”
As president, however, Hoover did little to pursue a new direction in foreign policy. Troop withdrawals did commence in Nicaragua and Haiti, but after the onset of the Great Depression in 1929, Hoover only rarely addressed foreign policy issues.
It took FDR’s vision and political smarts—and the skills of his influential wife, Eleanor Roosevelt—to fashion a new policy agenda. Leveraging widespread dissatisfaction with existing directions in U.S. domestic and foreign policy, Roosevelt crafted a bold policy blueprint that addressed the crises both at home and abroad.
Roosevelt’s view of international relations was a startling departure from the ideological frameworks that previously dominated foreign policy discourse. His perspectives on how nations should behave appealed to both common sense and moral values.
Two months after he moved into the White House, FDR promised to help “spell the end of the system of unilateral action, the exclusive alliances, the spheres of influence, the balances of power, and all the other expedients.”
To replace this prevailing system, Roosevelt began to chart a new system guided by international cooperation. “Common ideals and a community of interest, together with a spirit of cooperation, have led to the realization that the well-being of one nation depends in large measure upon the well-being of its neighbors,” the new president asserted.
Being a good global neighbor for Roosevelt meant promoting peace and deglorifying war. As he put it: “I have seen war on land and sea. I have seen blood running from the wounded. I have seen children starving... I have seen the agony of mothers and wives... I hate war.”
Roosevelt repeatedly alerted the nation about the rise of fascism and the new imperial ambitions of Germany and Japan. “We are not isolationists,” said FDR, “except so far as we seek to isolate ourselves completely from war. Yet we must remember that so long as war exists on earth there will be some danger that even the nation which most ardently desires peace may be drawn into war.”
At the same time, though, Roosevelt was formulating a foreign policy doctrine of nonaggression and demilitarization that would ensure that the United States did not precipitate wars as it had in the recent past with Spain and Mexico. “We seek to dominate no other nation,” he declared. “We ask no territorial expansion. We oppose imperialism. We desire reduction in world armaments.”
President Roosevelt intended that his Good Neighbor Policy improve U.S. relations with nations around the world. But it was in the Western Hemisphere that FDR’s new foreign policy framework had its most dramatic impact....
Posted on: Friday, April 1, 2005 - 18:34
Carolyn Eisenberg, in Newsday (3-31-05):
Carolyn Eisenberg is a professor of history at Hofstra University and the author of "Drawing the Line: The American Decision to Divide Germany."
Two months past the dramatic day when millions of brave Iraqis lined up to vote, the country still lacks a functioning government.
Progress has been halted by the inability to select a new president and two vice presidents, who would together designate a prime minister. Whenever this demoralizing logjam is finally broken, it is important to recognize that the real source of failure resides in Washington and not Baghdad.
Americans are eager to believe that we have set Iraq on the road to freedom. How else to justify the deaths of more than 1,500 of our troops, the 10,000 wounded, the numerous veterans who are returning to their families with anguished memories that will shadow their lives? It is not surprising that the recent election resonated so widely here in the United States or that many critics of the Bush administration have been silenced.
Yet, since the overthrow of Saddam Hussein, the ability of the Iraqis to shape their own political destiny has been compromised by U.S. interventions. While hawking democracy, the Americans have not trusted Iraqis to choose the right leaders or to enact the right laws.
Hence, their endless tinkering with the machinery of governance, their unilateral promulgation of 100 laws under the Coalition Provisional Authority, and their imposition of an "interim constitution" that now constrains political life.
In recent months, the American press has barely mentioned this "interim constitution" or Transitional Administrative Law, signed in March 2004. Written behind closed doors by American legal experts and handpicked Iraqis, it is this document that has complicated the efforts of elected Iraqi representatives to choose a Presidency Council. The relevant provision requires that the new president and the two deputies must be chosen by two-thirds of the National Assembly.
This may seem innocuous. But it is worth noting that in November, President George W. Bush was returned to office by a mere 51 percent of the voters. What would have been the impact here if the Electoral College or Congress had been required to produce a two-thirds majority in order to install a chief executive?
A fair rejoinder is that these arrangements are only temporary and that during the next months elected Iraqis will have the opportunity to produce their own permanent charter. But the "interim" document will continue to have an inhibiting effect because of its stipulation that two-thirds of the voters in three of the 18 governorates can block ratification of a new constitution....
Although our politicians and pundits are ignoring the point, "the new Iraq" remains an occupied land, not a free country. For this reason, our misused troops have been consigned to a mission impossible.
Posted on: Friday, April 1, 2005 - 04:04