Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: TruthDig.com (3-27-09)
Great crises and problems often have become the subjects of extensive congressional investigation and oversight. Congress has made prominent inquiries into, for example, the Civil War, Reconstruction, the “money trust” in the Progressive Era, the banking follies of the 1920s and the Great Depression, the prewar defense preparations at Pearl Harbor, the oversight of military contracts during World War II, the Korean War and the emerging character of Cold War foreign policy during the mid-1950s.
Congress’ work gave us transparency and usually led to useful, progressive legislation. And now comes Financial Services Committee Chairman Barney Frank’s choreographed extravaganza in the House of Representatives, supported by an echoing committee, with sound bites worthy of a night in the Borscht Belt. The ostensible probe of executive bonuses at AIG—forget about any investigation of the company’s decisions that so damaged the financial world—offered a painful reminder of Congress’ now largely ignored unique power of investigation, derived from its constitutionally sanctioned authority to legislate. True, Congress has abused this power from time to time, but that is no argument against its existence.
Rep. Frank provided a perfunctory, carefully staged hearing this month. His fellow committee members had been prepped and primed—seemingly by their press aides rather than by any legal staff. The “hearing” proceeded with hilarity and irony, especially coming from legislators who over the past 20 years had enabled much of the corporate chicanery. The mice that roared eventually produced only a parody of legislation, mercifully about to die.
Frank’s congressional sideshow made more imperative than ever the need for thorough, uncompromising investigations and hearings on any number of issues that have brought us to the present crisis. When Republicans controlled Congress, they disdained anything that might detract from the doings of a Republican administration or would interfere with their fundraising for the next election. Democratic control has offered little beyond the one-day, made-for-television soap operas of Henry Waxman, chairman of the House Committee on Oversight and Government Reform. Alas, these short showings proved to be only a television pilot, not fit for renewal or continuation.
Remember the brief appearance by Monica Goodling, a graduate of Pat Robertson’s school of law, who vetted Justice Department appointees for ideological purity during the Bush administration, making certain no elite liberals (or elite anything, it seemed) made the grade? Goodling acknowledged in testimony in 2007 that she had “crossed the line” when she improperly used political considerations to evaluate applicants. But she testified for just one day, and followed her attorney’s strategy of running out the clock. Why did the committee fail to follow up? Why do witnesses appear for quick one-offs, offering only a limited opportunity for probing questions? Owen Lattimore probably set the record when he testified for 12 days in 1952, with famed attorneys Thurman Arnold and Abe Fortas providing the best civil liberties that money could buy.
Two examples from the not-so-distant past are instructive for what we do not need. Traditional congressional investigations can turn out to be duds—witness the Iran-Contra hearings in 1987. A joint committee conducted aimless, fragmentary proceedings. The senators and representatives vied for precious television exposure. The co-counsels—Arthur Liman for the Senate and John Nields for the House—reflected different cultures and constituencies. Liman was a senior partner in Paul, Weiss, and he had been counsel for the New York state investigation of the 1971 Attica prison riot. Nields had prosecuted Mark Felt for his role in the FBI’s illegal break-ins. To watch the committee’s proceedings was to view two field generals marching backward, constantly stumbling into each other.
Despite their impressive talents, Liman and Nields simply were overwhelmed by the committee’s elephantine proportions and its festering internal rivalries. Their task was not helped by President Reagan’s “memory lapses”; Vice President George H.W. Bush’s insistence he knew nothing; the generally unhelpful testimony by administration officials, some of whom were convicted (and later pardoned by the first President Bush) for unlawfully withholding information from Congress; and by the competing criminal investigation by Special Prosecutor Lawrence Walsh. Back then, we did not appreciate the doings of Rep. Richard Cheney (R-Wyo.), whose minority report a decade later morphed into disturbing theories of the “unitary executive,” with its notions of unbridled executive power.
The Senate Select Committee on Campaign Finance in 1973—better known as the Watergate Committee—offered a contrasting image. The seven-man committee was led by Sam Ervin (D-N.C.), who was highly respected by his colleagues on both sides of the aisle and who, despite a folksy and sometimes bumbling appearance, was a shrewd, savvy man totally in command of the proceedings. He selected Samuel Dash as his chief counsel, and together they worked nearly four months to prepare their inquiry.
The first 37 days of the Watergate investigation offer a model for other inquiries. Ervin successfully co-opted ranking Republican committee member Howard Baker (R-Tenn.), while another Republican member, Lowell Weicker (R-Conn.), soon distanced himself from President Nixon’s supporters. Dash engaged minority counsel Fred Thompson in what may have been the mismatch of the century. Dash carefully prepared his case, working from the bottom up. He and his staff selectively leaked information, usually to pique public interest. Dash began the public hearings by questioning relatively obscure but important officials, such as the financial officers of Nixon’s re-election committee or relatively low-level White House aides. The media predictably criticized him, demanding that Dash instead call Nixon’s top aides. He did, but by the time they appeared, Dash had made his case.
Our current economic and fiscal crisis deserves serious, thoughtful consideration. AIG’s mistakes involve billions of dollars, not millions; similar poor, reckless choices were made throughout the financial sector. The administration of George W. Bush must account for a variety of actions that include torture and rendition; we have had only glimpses into how politics trumped science and good public policy across a broad range of issues in the Bush era. All that—and more—is grist for congressional inquiry and, inevitably, outrage. President Obama is offering up plans for financial recovery, but he is terribly reliant upon a corps of advisers, a number of whom enabled the causes of our present economic condition. Congress, with its own expertise, might prod the executive into accepting alternatives. But given its present reactive, blustering responses, it is now merely a pathetic giant.
Posted on: Monday, March 30, 2009 - 19:10
SOURCE: Salon (3-30-09)
Obama realizes that after seven years, Afghanistan war fatigue has begun to set in with the American people. Some 51 percent of Americans now oppose the Afghanistan war, and 64 percent of Democrats do. The president is therefore escalating in the teeth of substantial domestic opposition, especially from his own party, as voters worry about spending billions more dollars abroad while the U.S. economy is in serious trouble.
He acknowledged that we deserve a "straightforward answer" as to why the U.S. and NATO are still fighting there. "So let me be clear," he said, "Al-Qaida and its allies -- the terrorists who planned and supported the 9/11 attacks -- are in Pakistan and Afghanistan." But his characterization of what is going on now in Afghanistan, almost eight years after 9/11, was simply not true, and was, indeed, positively misleading. "And if the Afghan government falls to the Taliban," he said, "or allows al-Qaida to go unchallenged -- that country will again be a base for terrorists who want to kill as many of our people as they possibly can."
Obama described the same sort of domino effect that Washington elites used to ascribe to international communism. In the updated, al-Qaida version, the Taliban might take Kunar Province, and then all of Afghanistan, and might again host al-Qaida, and might then threaten the shores of the United States. He even managed to add an analog to Cambodia to the scenario, saying, "The future of Afghanistan is inextricably linked to the future of its neighbor, Pakistan," and warned, "Make no mistake: Al-Qaida and its extremist allies are a cancer that risks killing Pakistan from within."
This latter-day domino theory of al-Qaida takeovers in South Asia is just as implausible as its earlier iteration in Southeast Asia (ask Thailand or the Philippines). Most of the allegations are not true or are vastly exaggerated. There are very few al-Qaida fighters based in Afghanistan proper. What is being called the "Taliban" is mostly not Taliban at all (in the sense of seminary graduates loyal to Mullah Omar). The groups being branded "Taliban" only have substantial influence in 8 to 10 percent of Afghanistan, and only 4 percent of Afghans say they support them. Some 58 percent of Afghans say that a return of the Taliban is the biggest threat to their country, but almost no one expects it to happen. Moreover, with regard to Pakistan, there is no danger of militants based in the remote Federally Administered Tribal Areas (FATA) taking over that country or "killing" it....
Posted on: Monday, March 30, 2009 - 18:30
SOURCE: TomDispatch.com (3-29-09)
Let's start by stopping.
It's time, as a start, to stop calling our expanding war in Central and South Asia "the Afghan War" or "the Afghanistan War." If Obama's special representative to Afghanistan and Pakistan, Richard Holbrooke, doesn't want to, why should we? Recently, in a BBC interview, he insisted that "the 'number one problem' in stabilizing Afghanistan was Taliban sanctuaries in western Pakistan, including tribal areas along the Afghan border and cities like Quetta" in the Pakistani province of Baluchistan.
And isn't he right? After all, the U.S. seems to be in the process of trading in a limited war in a mountainous, poverty-stricken country of 27 million people for one in an advanced nation of 167 million, with a crumbling economy, rising extremism, advancing corruption, and a large military armed with nuclear weapons. Worse yet, the war in Pakistan seems to be expanding inexorably (and in tandem with American war planning) from the tribal borderlands ever closer to the heart of the country.
These days, Washington has even come up with a neologism for the change: "Af-Pak," as in the Afghanistan-Pakistan theater of operations. So, in the name of realism and accuracy, shouldn't we retire "the Afghan War" and begin talking about the far more disturbing "Af-Pak War"?
And while we're at it, maybe we should retire the word "surge" as well. Right now, as the Obama plan for that Af-Pak War is being "rolled out," newspaper headlines have been surging when it comes to accepting the surge paradigm. Long before the administration's "strategic review" of the war had even been completed, President Obama was reportedly persuaded by former Iraq surge commander, now CentCom commander, General David Petraeus to "surge" another 17,000 troops into Afghanistan, starting this May.
For the last two weeks, news has been filtering out of Washington of an accompanying civilian "surge" into Afghanistan ("Obama's Afghanistan 'surge': diplomats, civilian specialists"). Oh, and then there's to be that opium-eradication surge and a range of other so-called surges. As the headlines have had it: "1,400 Isle Marines to join Afghanistan surge," "U.S. troop surge to aid Afghan police trainers," "Seabees build to house surge," "Afghan Plan Detailed As Iraq Surge 'Lite,'" and so on.
It seems to matter little that even General Petraeus wonders whether the word should be applied. ("The commander of the U.S. Central Command said Friday that an Iraq-style surge cannot be a solution to the problems in Afghanistan.") There are, however, other analogies that might better capture the scope and nature of the new strategic plan for the Af-Pak War. Think bailout. Think A.I.G.
The Costs of an Expanding War
In truth, what we're about to watch should be considered nothing less than the Great Afghan (or Af-Pak) bailout.
On Friday morning, the president officially rolled out his long-awaited "comprehensive new strategy for Afghanistan and Pakistan," a plan without a name. If there was little news in it, that was only because of the furious leaking of prospective parts of it over the previous weeks. So many trial balloons, so little time.
In a recent "60 Minutes" interview (though not in his Friday announcement), the president also emphasized the need for an "exit strategy" from the war. Similarly, the American commander in Afghanistan, General David McKiernan, has been speaking of a possible "tipping point," three to five years away, that might lead to "eventual departure." Nonetheless, almost every element of the new plan -- both those the president mentioned Friday and the no-less-crucial ones that didn't receive a nod -- seems to involve the word "more"; that is, more U.S. troops, more U.S. diplomats, more civilian advisors, more American and NATO military advisors to train more Afghan troops and police, more base and outpost building, more opium-eradication operations, more aid, more money to the Pakistani military -- and strikingly large-scale as that may be, all of that doesn't even include the "covert war," fought mainly via unmanned aerial vehicles, along the Pakistani tribal borderlands, which is clearly going to intensify.
In the coming year, that CIA-run drone war, according to leaked reports, may be expanded from the tribal areas into Pakistan's more heavily populated Baluchistan province, where some of the Taliban leadership is supposedly holed up. In addition, so reports in British papers claim, the U.S. is seriously considering a soft coup-in-place against Afghan President Hamid Karzai. Disillusioned with the widespread corruption in, and inefficiency of, his government, the U.S. would create a new "chief executive" or prime ministerial post not in the Afghan constitution -- and then install some reputedly less corrupt (and perhaps more malleable) figure. Karzai would supposedly be turned into a figurehead "father of the nation." Envoy Holbrooke has officially denied that Washington is planning any such thing, while a spokesman for Karzai denounced the idea (both, of course, just feeding the flames of the Afghan rumor mill).
What this all adds up to is an ambitious doubling down on just about every bet already made by Washington in these last years -- from the counterinsurgency war against the Taliban and the counter-terrorism war against al-Qaeda to the financial love/hate relationship with the Pakistani military and its intelligence services underway since at least the Nixon years of the early 1970s. (Many of the flattering things now being said by U.S. officials about Pakistani Chief of the Army Staff General Ashfaq Pervez Kayani, for instance, were also said about the now fallen autocrat Pervez Musharraf when he held the same position.)
Despite that mention of the need for an exit strategy and a presidential assurance that both the Afghan and Pakistani governments will be held to Iraqi-style "benchmarks" of accountability in the period to come, Obama's is clearly a jump-in-with-both-feet strategy and, not surprisingly, is sure to involve a massive infusion of new funds. Unlike with A.I.G., where the financial inputs of the U.S. government are at least announced, we don't even have a ballpark figure for how much is actually involved right now, but it's bound to be staggering. Just supporting those 17,000 new American troops already ordered into Afghanistan, many destined to be dispatched to still-to-be-built bases and outposts in the embattled southern and eastern parts of the country for which all materials must be trucked in, will certainly cost billions.
Recently, the Washington Post's Walter Pincus dug up some of the construction and transportation costs associated with the war in Afghanistan and found that, as an employer, the U.S. Army Corps of Engineers comes in second only to the Afghan government in that job-desperate country. The Corps is spending about $4 billion this year alone on road-building activities, and has slated another $4-$6 billion for more of the same in 2010; it has, according to Pincus, already spent $2 billion constructing facilities for the expanding Afghan army and police forces, and has another $1.2 billion set aside for more such facilities this year. It is also likely to spend between $400 million and $1.4 billion on as many as six new bases, assorted outposts, and associated air fields American troops will be sent to in the south.
Throw in hardship pay, supplies, housing, and whatever else for the hundreds of diplomats and advisors in that promised "civilian surge"; add in the $1.5 billion a year the president promised in economic aid to Pakistan over the next five years, a tripling of such aid (as urged by Vice President Biden when he was still a senator); add in unknown amounts of aid to the Pakistani and Afghan militaries. Tote it up and you've just scratched the surface of Washington's coming investment in the Af-Pak War. (And lest you imagine that these costs might, at least, be offset by savings from Obama's plan to draw down American forces in Iraq, think again. A recent study by the Government Accountability Office suggests that "Iraq-related expenditures" will actually increase "during the withdrawal and for several years after its completion.")
Put all this together and you can see why the tactical word "surge" hardly covers what's about to happen. The administration's "new" strategy and its "new" thinking -- including its urge to peel off less committed Taliban supporters and reach out for help to regional powers -- should really be re-imagined as but another massive attempted bailout, this time of an Afghan project, now almost 40 years old, that in foreign policy terms is indeed our A.I.G.
As Obama's economic team overseeing the various financial bailouts is made up of figures long cozy with Wall Street, so his foreign policy team is made up of figures deeply entrenched in Washington's national security state -- former Clintonistas (including the penultimate Clinton herself), military figures like National Security Adviser General James Jones, and that refugee from the H.W. Bush era, Defense Secretary Robert Gates. They are classic custodians of empire. Like the economic team, they represent the ancien régime.
They've now done their "stress tests," which, in the world of foreign policy, are called "strategic reviews." They recognize that unexpected forces are pressing in on them. They grasp that the American global system, as it has existed since the truncated American century began, is in danger. They're ready to bite the bullet and bail it out. Their goal is to save what they care about in ways that they know.
Unfortunately, the end result is likely to be that, as with A.I.G., we, the American people, could end up "owning" 80% of the Af-Pak project without ever "nationalizing" it -- without ever, that is, being in actual control. In fact, if things go as badly as they could in the Af-Pak War, A.I.G. might end up looking like a good deal by comparison.
The foreign policy team is no more likely to exhibit genuinely outside-the-box thinking than the team of Tim Geithner and Larry Summers has been. Their clear and desperate urge is to operate in the known zone, the one in which the U.S. is always imagined to be part of the solution to any problem on the planet, never part of the problem itself.
In foreign policy (as in economic policy), it took the Bush team less than eight years to steer the ship of state into the shallows where it ran disastrously aground. And yet, in response, after months of"strategic review," this team of inside-the-Beltway realists has come up with a combination of Af-Pak War moves that are almost blindingly expectable.
In the end, this sort of thinking is likely to leave the Obama administration hostage to its own projects as well as unprepared for the onrush of the unexpected and unknown, whose arrival may be the only thing that can be predicted with assurance right now. Whether as custodians of the imperial economy or the imperial frontier, Obama's people are lashed to the past, to Wall Street and the national security state. They are ill-prepared to take the necessary full measure of our world.
If you really want a "benchmark" for measuring how our world has been shifting on its axis, consider that we have all lived to see a Chinese premier appear at what was, in essence, an international news conference and seriously upbraid Washington for its handling of the global economy. That might have been surprising in itself. Far more startling was the response of Washington. A year ago, the place would have been up in arms. This time around, from White House Press Secretary Robert Gibbs ("There's no safer investment in the world than in the United States...") to the president himself ("Not just the Chinese government, but every investor can have absolute confidence in the soundness of investments in the United States..."), Washington's response was to mollify and reassure.
Face it, we've entered a new universe. The "homeland" is in turmoil, the planetary frontiers are aboil. Change -- even change we don't want to believe in -- is in the air.
In the end, as with the Obama economic team, so the foreign policy team may be pushed in new directions sooner than anyone imagines and, willy-nilly, into some genuinely new thinking about a collapsing world. But not now. Not yet. Like our present financial bailouts, like that extra $30 billion that went into A.I.G. recently, the new Obama plan is superannuated on arrival. It represents graveyard thinking.
Posted on: Monday, March 30, 2009 - 18:24
SOURCE: Daniel Pipes website (3-29-09)
Some examples, culled over the years (with the caveat that some of these reporters no longer cover religion):
Geneive Abdo of the Chicago Tribune, who spoke at the 2003 annual conference of the Muslim Public Affairs Council.
David Crumm of the Detroit Free Press, who glorified an Islamist religious leader in "Dearborn's Imam Qazwini: A champion for Islam's future."
Felix Hoover of the Columbus Dispatch, who was provided with meticulous, detailed information about problems at an Islamist school, the Sunrise Academy, only to ignore it.
Robert King of the Indianapolis Star, who covered the Islamic Society of North America convention as though it were the Elks or Masons.
Shirley Ragsdale of the Des Moines Register, who wrote a near hagiography of Ibrahim Dremali, still remembered for having exhorted a crowd in Florida "not to be sad for the martyrs, or be afraid to die for what they believed in."
Bill Tammeus of the Kansas City Star wrote the memorably bad"Women of cover," a glorification of the hijab.
Rachel Zoll of the Associated Press, who naively accepted that a supposed anti-terror petition supported by the Council on American-Islamic Relations, "Not in the Name of Islam," is what it purports to be.
But, no surprise, my nominee for worst religion reporter is someone I have been watching since 2004:
Comment: (1) Is this incompetence a result of the mainstream media being so liberal that it cannot understand religion in general and radical Islam in particular? Probably. (2) As the MSM loudly laments its own demise, we conservatives see this as a mixed development, one that offers a chance for real improvement – and nowhere more than in the realm of reporting on religion.
Posted on: Monday, March 30, 2009 - 18:08
SOURCE: Telegraph (UK) (3-28-09)
The news that Gordon Brown has opened talks with Buckingham Palace over altering the 1701 Act of Settlement, which bars members of the Royal family from succeeding to the throne if they marry Roman Catholics, has profound implications for the long-term future of this country. For the Act of Settlement is not the bigoted, irrelevant and obsolete law that Downing Street presents it as – it is one of the key pieces of legislation that has defined what Britain was and still is. For a Prime Minister who claims to care deeply about the concept of Britishness, the Act should be sacrosanct, rather than sacrificed in a gross bout of politically correct gimmickry.
Britain is a Protestant country today largely because of the Act of Settlement. It secured the Hanoverian succession 13 years after the Glorious Revolution replaced the Catholic King James II with the Protestant William III (of Orange) and Mary II. Since the only surviving son of Mary's sister, the future Queen Anne, had died, it settled the Crown after her upon the Electress Sophia of Hanover, a granddaughter of James I, and her heirs – if they were Protestants, and married to Protestants, as indeed the four King Georges were.
Because it is a central tenet of the Catholic Church that the children of Catholics should be raised as Catholics, it was understood that the marriage of a Royal to a Catholic opened up the possibility either of a Catholic one day sitting on the Throne, or of a Catholic parent committing apostasy by allowing their child to be raised as a Protestant – neither of which was a desirable outcome politically, religiously or morally. Since the monarch is also Supreme Governor of the (Protestant) Church of England, above whom there is no one in the Church hierarchy – including the Pope – the ban on Catholics makes further sense.
The Glorious Revolution, Act of Settlement and Protestant succession ushered in a long period of prosperity and order for Britain, and ended a lengthy period of bloodshed and civil war. Religious turmoil was over, and the knowledge that the head of state would always be of the same faith as the national, Established, Anglican Church greatly contributed to that. The emancipation of Catholics proceeded throughout the 19th century, and in 1926 all restrictions were abolished except that the Sovereign, Regent, Lord Chancellor and Lord Keeper could not be Catholics, and that Roman Catholic priests could not sit in the House of Commons. There is no law preventing Catholics from marrying into the Royal family, merely from inheriting the Throne if they do. Prince Michael, for example, nobly and uncomplainingly gave up his right to the Throne when he married.
So there is a "discriminatory" ban on Catholics becoming monarch – but since only direct descendants of King George II can do that, 99.9 per cent of us are discriminated against on that basis, whatever our religion. Similarly, the proposals to "open up" the monarchy to women, by scrapping the laws of primogeniture, are another way in which Gordon Brown is treating our thousand-year-old monarchy as though it is just another public-sector job, like those advertised in the Guardian, subject to anti-discrimination legislation...
Posted on: Monday, March 30, 2009 - 11:24
SOURCE: Guardian (UK) (3-25-09)
Behind the headlines about Wikipedia replacing the second world war and vlogging ousting Queen Victoria, Sir Jim Rose's new plans for the primary school curriculum aren't all bad. Indeed, if delivered intelligently, they might even begin to chip away at the nefarious "Hitler and the Henries" approach to history teaching in the classroom.
Of course, the mature-in-years Rose has been well and truly had by the Hoxton-finned IT brigade when it comes to courses on blogging and podcasts. School should be about learning and understanding, not delivering the endlessly shifting networking and social skill set that is easily picked up outside the school gates. Indeed, by the time the Twitter sub-committee has finalised its memorandum on "communication and technological understanding" today's technology fads will have gone the way of CD-Roms and Betamax.
More important is what happens to history. Rightly, Rose is stripping away the endless tier of regulations and stipulations that the Department for Children, Schools and Families – the last, great centralising Whitehall Lubyanka – has imposed on teachers, for a slimmed-down curriculum of six core "learning areas". Depressingly, I think history falls under something entitled "human, social and environmental understanding" (can you believe it?). And what Rose wants is for schools to focus on two key periods of British history – but it would be up to teachers to decide which.
If this means the end of the second world war for under-11s, then so be it. In fact, it's a good thing. For the 1940s is the one area of history that suffuses our public understanding of the past: on the radio, in newspapers, on television and film and at pretty much every major museum, the second world war is well and truly covered. More than that, at Key Stage 3 (11-14), then GCSE, and then AS- and A-level, the Reichstag fire, the rise of Hitler, the Nazi-Soviet pact and the D-Day invasion are pretty much all the history school kids learn. About 80% of A-level students study the Nazis. As a result, as a recent Ofsted report made plain, we have completely lost sight of the 18th century as a topic of teaching – not to mention the wars of the roses, the English civil war and the history of empire. Letting go of the second world war – even saying goodbye to Queen Victoria – could prove a liberation...
Posted on: Thursday, March 26, 2009 - 08:58
SOURCE: Time Magazine (3-21-09)
Our pundits worry that a populist rage is loose in the land—pitchforks everywhere! My first reaction upon hearing that was to dismiss the word "populist" as a distraction, an epithet meant to recall episodes in which mass rage made sound policy deliberation impossible. Think of dispossessed 19th-century farmers letting their righteous rage at bankers tumble easily into free-floating anger at "Jewish bankers" and then simply at Jews; of 1970s white South Boston parents stabbing busing advocates with American flags. My second reaction was to dismiss the word as inaccurate. What makes this rage "populist"? This is ordinary rage, rational and focused. The lead pitchfork bearers, after all, are people like New York Times business columnist Joe Nocera, who wrote that AIG's Financial Products unit was guilty of a "scam" at which "we should be furious." You might more accurately call that common sense.
Casting my eye over the broader sweep of history, though, I no longer fear populism. The habit of messily dividing the world into "the people" and "the elite"—whether it's left calling out right, or right calling out left—is distinctively, ineluctably American. It's not going away. And there's much more to it than the name-calling of angry political factions. It is the governing folk wisdom of a nation without an inherited aristocracy, distrustful of privilege that is not "earned." It is our American common sense.
The first Americans to call themselves populists, in the 1890s, were the first to base a political program on the explicit principle that wealth properly belongs to those who produce it. They believed farmers to be the truest "producers"; what financial speculators did was not properly "work" at all. We're past that now; those categories no longer make sense in our present deskbound world. But the moral intuition behind separating out "productive" and "unproductive" classes, while often badly abused by demagogues (bad American populists include the Ku Klux Klan and the Weather Underground), is evergreen. It was best summarized in a maxim by, ironically, a Brit, John Maynard Keynes: "Nothing corrupts society more than to disconnect effort and reward."...
Take away taxpayers' sense of ownership stake in an issue (especially, as with AIG, when taxpayers literally own the company) and their rage will not go away. It festers. Quagmires result. And that's when the "bad" kind of populism—the hateful kind; the violent kind; the demagogic kind—can flourish.
Posted on: Wednesday, March 25, 2009 - 21:48
SOURCE: Guardian (UK) (3-25-09)
Imagine a country that appoints someone who has been found guilty of striking a 12-year-old boy to be its foreign minister. The person in question is also under investigation for money-laundering, fraud and breach of trust; in addition, he was a bona fide member of an outlawed racist party and currently leads a political party that espouses fascist ideas. On top of all this, he does not even reside in the country he has been chosen to represent.
Even though such a portrayal may appear completely outlandish, Israel's new foreign minister, Avigdor Lieberman, actually fits the above depiction to the letter.
• In 2001, following his own confession, Lieberman was found guilty of beating a 12-year-old boy. As part of a plea bargain, Lieberman was fined 17,500 shekels and had to promise never to hit young children again.
• In 2004, Lieberman's 21-year-old daughter Michal set up a consulting firm, which received 11m shekels from anonymous overseas sources. Lieberman, according to the police, received a salary of more than 2.1m shekels from the company for two years of employment. In addition, according to an investigation by Haaretz, he allegedly received additional severance pay – amounting to hundreds of thousands of shekels – in 2006 and 2007, while he was minister of strategic affairs and deputy prime minister. Under Israeli law, this is illegal.
• Lieberman is an ex-member of Meir Kahane's party, Kach, which was outlawed due to its blatantly racist platform. Moreover, his views towards Arabs do not appear to have changed over the years. In 2003, when reacting to a commitment made by Prime Minister Ariel Sharon to give amnesty to approximately 350 Palestinian prisoners, Lieberman declared that, as minister of transport, he would be more than happy to provide buses to take the prisoners to the sea and drown them there.
• In January 2009, during Israel's war on Gaza, Lieberman argued that Israel "must continue to fight Hamas just like the United States did with the Japanese in the second world war. Then, too, the occupation of the country was unnecessary." He was referring to the two atomic bombs dropped on Nagasaki and Hiroshima.
• Lieberman does not live in Israel according to its internationally recognised borders, but rather in an illegal settlement called Nokdim. Legally speaking, this would be like US Secretary of State Hillary Clinton residing in Mexico and UK Foreign Secretary David Miliband living on the Canary Islands.
And yet, despite these egregious transgressions, newly elected Prime Minister Binyamin Netanyahu has no qualms about appointing Lieberman to represent Israel in the international arena. Netanyahu's lust for power has led him to choose a man who actually poses a serious threat to Israel. Lieberman's message and style are not only violent but have clear proto-fascist elements; and, as Israeli commentators have already intimated, he is extremely dangerous.
Politics being politics, most western leaders will no doubt adopt a conciliatory position towards Lieberman, and agree to meet and discuss issues relating to foreign policy with him. Such a position can certainly be justified on the basis of Lieberman's democratic election; however much one may dislike his views, he is now the representative of the Israeli people. Those who decide to meet him can also claim that ongoing diplomacy and dialogue lead to the internalisation of international norms and thus moderate extremism.
These justifications carry weight. However, western leaders will also have to take into account that the decision to meet Lieberman will immediately be associated with the ban on Hamas, at least among people in the Middle East. In January 2006, Hamas won a landslide victory in elections that were no less democratic than the recent elections in Israel. While Hamas is, in many respects, an extremist political party that espouses violence, its politicians are representatives of the Palestinian people and are seen as struggling for liberation and self-determination.
If western leaders want to be seen as credible, they must change their policy and meet with Hamas as well. Otherwise, their decision to meet Lieberman will rightly be perceived as hypocritical and duplicitous, and the pervasive perception in the region – that the United States and Europe are biased in Israel's favour – will only be strengthened.
Posted on: Wednesday, March 25, 2009 - 19:11
SOURCE: Japan Focus/Asia-Pacific Journal (3-24-09)
2008—Annus Horribilis for the world economy—produced successive food, energy and financial crises, initially devastating the global poor in particular, but quickly extending to the commanding heights of the US and core economies and ushering in the sharpest downturn since the 1930s depression.
As all nations strive to respond to the financial gridlock that began in the United States and quickly sent world industrial production and trade plummeting, there has been much discussion of the ability of the high-flying Chinese economy to weather the storm, of the prospects for the intertwined US and Chinese economies, even of the potential for China to rise to a position of regional or global primacy. The present article critically explores these possibilities.
In “China’s Way Forward,” James Fallows offers an astute ground's-eye assessment of that nation's economic prospects and reflects comparatively on the experience of the United States, Japan and others in the teeth of the storm of 2008-09. Beginning with compelling images of migrant workers in their millions returning to the countryside, where they face protracted unemployment while container ships sit idle in port, Fallows explains why China's export-dependent industrial economy will be hard hit by the looming world depression. He believes, however, that China will not only weather the storm, but is likely to emerge stronger from it.
History can provide important clues to future possibilities. Financial specialist Michael Pettis has compared China today with the United States of the 1920s, when the US, taking advantage of World War I, transformed its substantial trade deficit into a surplus and became the workshop of the world and a major creditor nation. The inflow of gold paying for US agricultural and industrial goods drove the US economy. When the depression struck in the 1930s, the US was harder hit by unemployment than many others, including Europe and Japan, yet it emerged from depression and war as the global hegemon. The geopolitics of war worked uniquely to US advantage in the first half of the twentieth century, but only then, in fueling industrial advance, in decimating all major rivals, and in extending the US reach through military bases. China today, with burgeoning industry and a huge trade surplus, but five times the trade dependency of the US in the 1930s, faces the daunting prospect of industrial implosion, declining exports and spiraling unemployment. How will China respond? And with what effect on others, particularly the United States, in light of US-China economic and financial interdependence?
China’s trade surplus continued to grow even as its exports fell dramatically between December 2008 and February 2009. As economist Brad Setser documents, that surplus facilitated further purchases of US Treasuries and securities even as China's Premier Wen Jiabao warned the United States of its need to protect the value of China's investment against the declining value of the dollar. It is precisely China's competitively priced exports, now including a strong array of technologically sophisticated high-end manufactures, together with its Treasury and agency purchases, which have allowed the US to continue its profligate debtor ways. Or, viewed conversely, the US market was critical to China's industrial advance. For its part, the US now calls on China to reduce its surplus by revaluing its currency and consuming more. The real worry for both, however, is that a surge of protectionism at a time of recession—signs of which were already emerging in spring 2009—would irreparably damage both nations and the global economy. It could, more ominously, touch off a protectionist wave leading eventually to hostilities and war.
Fallows believes that China will not only weather the storm but may emerge from it stronger than before. He offers several reasons: Unlike deficit nations such as the US, China has vast surpluses, and it is vigorously allocating part of them to boost production and reduce unemployment. Indeed, not only is China vigorously promoting construction that will boost employment, it has also embarked on massive labor retraining programs. As Keith Bradsher reported, this year Guangdong province alone has begun to implement three-to-six-month programs to train 4 million workers. Many of these programs combine training with part-time work in factories that are expected to hire the trainees. The low wages paid to trainees are part of a process that is driving down wages so that China will be more competitive when export markets again expand. Nevertheless, the short-term prospects are bleak. China's manufactures, by World Bank reckoning, account for 33% of GDP, so declining output and exports quickly bring substantial job losses. Significantly, China, the world's number one steel producer, faces plummeting production and exports in 2009. China's Iron and Steel Association on March 18 projected an 80% fall in 2009 steel exports on top of a 6% drop in 2008. US steel production in the first three and a half months of 2009 fell by 52.8% to 22.5 million tons, with a capacity utilization rate of 42.9%, compared with 90.5% in 2008. [6] The critical issue, however, is not whether Chinese investment in industry and training will solve the immediate problem of unemployment. The outcome will hinge on whether these measures simply fuel overproduction, leading to sharpening international conflict, or whether investment and retraining can be directed to new industries and technologies that can thrive when economic recovery begins, showing the way toward more environmentally friendly and less destructive forms of development while creating jobs.
Fallows emphasizes Chinese inventiveness, and entrepreneurship, comparing the Chinese national mood to that of a recovering Europe in the 1950s when everything seemed possible. His buoyant views of Chinese entrepreneurship are best illustrated by the case of BYD Battery, a firm whose horizons are not only dynamic but also green. Based in Shenzhen, BYD rose from a household enterprise within a decade to become the world’s leading battery producer. It is now investing heavily in technology that it hopes will drive the cleaner cars of the future. Indeed, it has begun producing its own plug-in electric car and anticipates international sales in the near future.
Fallows is at his best in drawing on interviews to convey a sense of that nation’s entrepreneurial energies. To assess China’s prospects within the sweep of the history of capitalism in general and of East Asia in particular, consider the observations of Giovanni Arrighi in a recent interview and his major works.  Building on Braudel and Marx, Arrighi observes that the US sequence of deindustrialization and financial expansion since the 1970s, culminating in the crash of 2007-09, is characteristic of the autumn of hegemonic systems. Analyzing five centuries of the geopolitics of capitalism and empire, Arrighi highlights the recurrent pattern of financialization giving rise to a period of chaos and the emergence of a new hegemon. Could China—or perhaps a greater East Asia region—emerge to reshape the world economy in the new millennium? Or, to the contrary, might the US restore its hegemonic position through astute reforms leading to new technological breakthroughs and a sounder financial order? Would transition through a world depression be smooth, or would a new order emerge out of the ruins of economic and financial implosion, protracted class struggles or wars?
Arrighi shares Fallows’ appreciation of Chinese strengths and energies. Drawing critically on the work of the economic historian Sugihara Kaoru on the “industrious revolution” in Europe and East Asia, he notes the specific character of China's partial proletarianization, which lies behind its dramatic surge of production and export-driven development. Central to this understanding is the dynamic role played by the more than 130 million migrant workers who have fueled China's low-wage industrialization while retaining land ownership rights in their villages, even as some have worked for decades in the cities. If China's migrant workers have much in common with the tens of millions of undocumented workers in the US, including vulnerability to arrest and deportation (from the cities, not across national borders) during periods of economic downturn, there are important differences. Working in the cities but denied the benefits associated with urban citizenship by dint of their rural household registration, many migrants display an entrepreneurial ethos. Indeed, China's household contract system guarantees equal land shares for rural (including migrant) people, a system that preserves household cultivation rights for all villagers, thus dodging the bullet confronting the scores of millions of landless farmers in other developing countries. The system, with links to the earlier household plots that complemented collective agriculture, provides more than a haven in hard times. Its importance becomes apparent in periods of downturn as a fallback against starvation, but its household-centered character also provides a breeding ground for the petty entrepreneurship that has been among the driving forces of China's economy since the 1970s.
Far from mythologizing China’s inexorable rise to preeminence, however, Arrighi draws attention to the stability of world structures of inequality, which have preserved the dominance of the North over the South since the nineteenth century with little change in relative per capita incomes. For all the growth and income gains of recent decades, China’s per capita income remains low compared with that of core countries. Indeed, Arrighi finds that China’s per capita income grew only from 2% to 4% of that of the wealthy nations (more, of course, in PPP terms). And, if we exclude China, the position of the nations of the South has actually declined in relative terms since the 1980s; with China included, it has risen only slightly. This underlines both the extraordinary stability of the world order of inequality and how far China is from a position of equity, not to speak of preeminence.
If there has been significant upward mobility over the last half century as measured by per capita GDP, its primary locus has not been China but the East Asia region, led by Japan and including the Newly Industrializing Economies of Taiwan, South Korea, Singapore and Hong Kong as well as China. From this perspective, a rising China is far from achieving a position of primacy, even in Asia, not to speak of the world. It will not reach the commanding heights in economic, technological or income terms any time soon. And, despite systematic military buildup in recent decades, and even recognizing the vulnerability of the vast American structure of bases, battleships, and nuclear bombs, as indicated by US stalemate and defeat in successive wars, China is unlikely to be able to project its military power decisively on a global or even a regional scale. 
The more interesting possibilities, certainly in the short to medium term, center on the rise of East Asia. But can the region respond effectively to the contemporary economic and financial crisis? More important, can it overcome historical and political differences, including conflicting understandings of war and colonialism and deep divisions between its two most powerful nations, China and Japan, to construct a new regional or eventually world economic order? The challenges of any attempt to do so are illustrated by the heavy international strains that Europe and the Euro face in the context of world depression despite the institutional strengths and accomplishments of the European Union. In economic terms, a critical issue is whether China, Japan and other East Asian nations will achieve pre-eminence in the new green technologies that will critically shape the economies of the future. What is certain is that, while regions have risen and fallen over the centuries, there has been no regional as opposed to national hegemony in historical capitalism to date, an outcome precluded by inter-state conflicts.
It is necessary, moreover, to consider the internal problems confronting China. Fallows, along with many contemporary analysts, notes the proliferation of popular struggles in recent decades, but dismisses the possibility of intense social conflict or revolutionary change in China in the face of economic crisis. While recognizing the potential instability that could arise from large-scale unemployment and falling incomes, he emphasizes the fact that worker and villager discontent has addressed specific grievances, rather than targeting the system or the state. Such a perspective slights both the legacies of history and the significance of strikes and protests that shape societies without precipitating a revolutionary rupture, as in the US in the 1930s and many nations in the 1960s.
It is critical, in particular, to recall that in the course of the last two centuries, China has repeatedly been in the eye of the world storm of rebellion and revolution. Indeed, it has perhaps the world's longest and most fully developed tradition of rebellion and regime change from below. As Arrighi and Binghamton colleagues in the World Labor Group documented, the twentieth century was marked by two massive waves of worker and/or national insurgency, prior to and following the two world wars, giving rise to both national independence and revolutionary movements and transforming the social balance, with China figuring prominently in each. Particularly if economic turmoil leads to regional and global wars, the possibility of tumultuous class struggle should not be ruled out for China, Asia, or other regions.
In the face of rising challenges from below, in recent years the Chinese state, with an accent on stability, has demonstrated uncanny ability to limit protest by preventing horizontal alliances, keeping protesters isolated, and channeling most protest into the legal system.  But it has done so while riding the wave of economic growth, mobility, and rising prosperity since the 1970s. In the face of world depression, the Chinese state has moved far more boldly than the United States or any industrialized nation to create jobs through funding construction and fostering new industries. Equally important, as Wang Shaoguang has documented, there is evidence that the current Chinese leadership has begun to reconstruct the welfare and health safety net that was largely swept aside in China, as in England and the US, in recent decades: through a basic income program, health care and pension programs, for example.  These measures, together, suggest the kinds of flexible response that the Chinese state is capable of mounting in the face of challenges from below.
China nevertheless confronts three formidable immediate and long-term hurdles above and beyond the current world overproduction and financial crisis. The first of these is the specter of famine. North and Northwest China are in the midst of the most severe drought in at least half a century, with precipitation levels 70-90% below normal and water tables ruinously depleted from excessive well drilling. The FAO’s 2009 report on “Crop Prospects and Food Situation” indicates that 9.5 million hectares of winter wheat in seven provinces have been severely affected by drought.  In this respect, China shares with other developing nations acute problems of hunger and poverty. Here, too, proactive state policies will be essential if the disaster is to be mitigated. Nevertheless, while the problems are acute, China’s financial and institutional resources appear to be greater than those of many other, and particularly developing, nations. 
Perhaps most challenging in the long run is whether China can shift gears to an environmentally sustainable development course. Thus far, with World Bank and US plaudits, it has followed the path of earlier developers to achieve rapid sustained growth, but at the price of an environmentally disastrous combination of toxic industrialization, construction of the world's largest dams, heavy reliance on coal- and oil-driven production, and mass automobilization. Cumulatively, these have taken an immense toll on land, water, and air. If China's reckless development trajectory followed in the footsteps of earlier pioneers such as the US and Japan, the environmental consequences have been graver. All signs point toward a leadership that remains deeply committed to pursuit of mega-engineering projects for damming and water diversion, with potentially dire consequences not only for the Chinese earth and Chinese people, but for China's neighbors in Southeast Asia threatened by water diversion. China may eventually join an emerging consensus that prioritizes green technology and even, perhaps, begins to rein in the God of Growth . . . but with its vast legions of rural poor, this will not be any time soon. Whether China, as exemplified by BYD's green automotive production, can become a pioneer in the emerging new industry remains to be seen.
A third challenge is the specter of rising inequality. In the course of three decades of rapid development, China's developmental priorities transformed a highly egalitarian income distribution pattern into one of the world's most skewed distributions, with class, city-countryside and ethnic divisions all pronounced. This structurally determined outcome coincided, moreover, with the dismantling of the nation's extensive welfare network. Can this genie be put back in the bottle? The state's recent proactive welfare policies, if deepened and sustained, could help. Strikingly, US programs, and not only its bailouts for billionaires, thus far ignore the issue of inequality in a nation in which income inequality soared and the welfare structure was eviscerated in the same years as China's.
Arrighi argues in light of the history of capitalist transitions and financialization that US hegemony entered its twilight in the 1970s and reached its terminal phase with the collapse of the financial and real estate bubble in 2008, a conclusion made inevitable by the earlier transition from industrial to financial primacy and the neo-liberal regime that gave the latter free rein. Perhaps . . . Yet, while recognizing the formidable problems confronting the Obama administration, in the absence of a serious contender in the form of a new hegemon, whether a nation or a region, such a conclusion seems at least premature. The strengthening of the dollar in the face of the US financial meltdown and huge deficits, and the Obama administration’s attempts to launch the next wave of US growth on green foundations, suggest possible policy alternatives that could help to restore American economic preeminence and prevent, or at least forestall, the imminent demise of its hegemonic power. We should not rule out such possibilities, in particular a protracted muddling through in which the US remains indisputably the most powerful among rival powers for the foreseeable future. This could take place even under circumstances in which attempts to bail out the nearly bankrupt financial sector show few signs of gaining traction, in which a continued war in Iraq and an expanding war in Afghanistan and Pakistan, together with the stable growth both of the military budget and the global network of military bases, are emblematic of US vulnerability rather than of hegemony.
Recommended citation: Mark Selden, “China’s Way Forward? Historical and Contemporary Perspectives on Hegemony and the World Economy in Crisis,” The Asia-Pacific Journal, 13-1-09.
• I am indebted to Andrew DeWit, Gavan McCormack, R. Taggart Murphy and especially Giovanni Arrighi for suggestions of sources and perspectives on the issues.
1. James Fallows, “China’s Way Forward,” Atlantic Monthly, April, 2009.
2. “There are monetary echoes from the 1930s too,” China Financial Markets, January 21, 2009.
3. “Did SAFE really buy that many US (and global) equities?,” Follow the Money, March 19, 2009.
4. “In Downturn, China Sees Path to Growth,” The New York Times, March 17, 2009. http://www.nytimes.com/2009/03/17/business/worldbusiness/17compete.html?_r=1
5. “Steeling for 80% Export Growth,” Shanghai Daily, March 19, 2009. http://www.shanghaidaily.com/sp/article/2009/200903/20090319/article_394723.htm
6. “This Week’s Raw Steel Production,” The American Iron and Steel Institute, Steelworks, March 14, 2009. http://www.steel.org/AM/Template.cfm?Section=Raw_Steel_Production1&TEMPLATE=/CM/HTMLDisplay.cfm&CONTENTID=29318
I am indebted to Andrew DeWit for data on Japanese and US steel production and exports.
7. Giovanni Arrighi, interviewed by David Harvey, “The Winding Paths of Capital,” New Left Review 56, Mar-Apr 2009. See also The Long Twentieth Century: Money, Power and the Origins of Our Time (London: Verso, 1994) and Adam Smith in Beijing: Lineages of the 21st Century (London: Verso, 2008).
8. Arrighi notes, however, factors which could work in China’s favor: (1) the importance of demographic size should be left open and (2) the possibility that China has more to gain from the US getting stuck in wars that it cannot win—as envisaged in Adam Smith in Beijing (part III)—should be left open. Personal communication March 23, 2009.
9. Giovanni Arrighi, Beverly Silver and Melvyn Dubofsky, eds., "Labor Unrest in the World-Economy, 1870-1990", special issue of Review, vol. 18, no. 1, Winter, 1995. The analysis is further developed in Beverly J. Silver, Forces of Labor: Workers' Movements and Globalization since 1870, Cambridge University Press, 2003.
10. Elizabeth J. Perry and Mark Selden, eds., Chinese Society: Change, Conflict and Resistance, 2nd edition, 2003.
11. “Double Movement in China,” Economic and Political Weekly, Jan 13, 2009. http://epw.in/epw/user/loginArticleError.jsp?hid_artid=13017
12. FAO report. http://www.reliefweb.int/rw/rwb.nsf/db900SID/MVDU-7PD4Q8?OpenDocument
13. For early rumblings of North China drought, see Edward Friedman, Paul G. Pickowicz and Mark Selden, Chinese Village, Socialist State (New Haven: Yale University Press, 1991) and Revolution, Resistance and Reform in Village China (New Haven: Yale University Press, 2005).
14. See Ching Kwan Lee and Mark Selden, “Inequality and Its Enemies in Revolutionary and Reform China,” Economic and Political Weekly, Jan 13, 2009. http://epw.in/epw/user/loginArticleError.jsp?hid_artid=13014
Posted on: Wednesday, March 25, 2009 - 19:05
SOURCE: American Conservative (3-23-09)
Since the New Deal, fears of terrorism and subversion have played a central role in U.S. political life. But the ways in which government and media conceive those menaces can change with astonishing speed. Such tectonic shifts usually occur because of the ideological bent of the administration in power. When a strongly liberal administration takes office, it brings with it a new rhetoric of terrorism, and new ways of understanding the phenomenon.
Based on the record of past Democratic administrations, in the near future terrorism will almost certainly be coming home. This does not necessarily mean more attacks on American soil. Rather, public perceptions of terrorism will shift away from external enemies like al-Qaeda and Hezbollah and focus on domestic movements on the Right. We will hear a great deal about threats from racist groups and right-wing paramilitaries, and such a perceived wave of terrorism will have real and pernicious effects on mainstream politics. If history is any guide, the more loudly an administration denounces enemies on the far Right, the easier it is to stigmatize its respectable and nonviolent critics.
It’s difficult to understand modern American political history without appreciating the florid conspiracy theories that so often drive liberals, and by no means only among the populist grassroots. Time and again, Democratic administrations have proved all too willing to exploit conspiracy fears and incite popular panics over terrorism and extremism. While we can mock the paranoia that drives the Left to imagine a Vast Right-Wing Conspiracy, such rhetoric can be devastatingly effective—as we may be about to rediscover.
Long before Sept. 11, 2001, America experienced repeated outbreaks of concern over terrorism. In terms of shaping liberal perceptions, the most important was that of the FDR years, when anti-government sentiment spawned a number of extremist organizations. Some were “shirt” groups, modeled on European fascists—America, too, had its Black Shirts and Silver Shirts—while the German-American Bund attracted Hitler devotees. Isolationism and anti-Semitism drew some urban Irish-Americans into the Christian Front, while the Klan experienced one of its sporadic revivals. Beyond doubt, far-Right extremism did exist, and these movements had their violent side, to the point of organizing paramilitary training. A few plotted real terrorist acts.
But the public response was utterly out of proportion to any danger these groups posed. From 1938 through 1941, the media regularly presented stories suggesting that the U.S. was about to be overwhelmed by ultra-Right fifth columnists, millions strong, intimately allied with the Axis powers. (Actual numbers of serious militants were in the low thousands at most.) Reportedly, the militant Right was armed to the teeth and plotting countless domestic terror attacks—bombings in New York and Washington, assassinations and pogroms, the wrecking of trains and munitions plants. Plotters were rumored to have high-placed allies in the military, raising the specter of a putsch. The ensuing panic was orchestrated by newspapers and radio and reinforced by films, newsreels, and comic books. Historians characterize these years as the Brown Scare.
If the more bizarre accusations sound like the common currency of the show trials in Stalin's Russia in these very years, that is no coincidence. The main exposés of fascist conspiracy emanated from Communist Party journalists like Albert Kahn and John Spivak. (Spivak himself was an operative for the Soviet NKVD.) Charges circulated through Kahn's newssheet The Hour before being picked up in the liberal press. The Red agenda was straightforward: the Brown Scare allowed the Left to discredit any opponent of radical New Deal policies. Scratch the surface of any enemy of the Left, they claimed, and you would find a fascist spy, a lyncher, a storm trooper....
Posted on: Tuesday, March 24, 2009 - 21:40
SOURCE: Telegraph (UK) (3-24-09)
... Suppose that a government can have any two of the following things, but not all three: globalisation, in the sense of openness to international flows of goods, services, capital and labour; social stability; and a small state. Or, to put it differently, conservatives can pick any two from an open economy, a stable society and political power – but not all three.
Why is this? Globalisation, it turns out, gives rise to an economic system which is highly efficient most of the time, because resources are optimally allocated through the effects of the division of labour and comparative advantage. But it is also prone to crises: minor ones roughly every decade, major ones roughly every 50 years.
The effect of globalisation is therefore double-edged. Most of the time, economic volatility is reduced by international integration. However, recent events show that volatility on a large scale has not gone for good, and never went away on the small scale, for the individual citizen or firm – on the contrary, globalisation seems to have increased it. In the short run, we have to live with bigger and more frequent changes in employment and income; in the long run, with the likelihood that we shall experience at least one really big global crisis.
It is far from coincidental, and far from illogical, that this crisis is perceived by many on the Left – not least the chief of staff of the most Left-wing Democrat ever elected to be President of the United States – as an opportunity too "good" to be wasted. Their vision is to resurrect the General Theory of John Maynard Keynes, with its claim that shortfalls in private consumption and investment can and should be compensated for by increases in government expenditure funded by borrowing.
The fact that these patent remedies for the last Great Depression ultimately got the Western world into the mess some of us remember from the 1970s is neither here nor there. As conservatives around the English-speaking world are discovering, the arguments that proved so effective in the Eighties are ineffective now. Should they decry the irresponsibility of such large deficits? Should they remain committed to tax cuts? It's hard to do both at the same time. Ken Clarke's recent faux pas on the abolition of inheritance tax on estates worth less than £1 million illustrates the problem: is it a Tory "aspiration"? A "commitment"? Or perhaps just an irrelevance?
By trying to have it three ways, Conservatives end up being identified with the social disruption globalisation brings in its wake, and particularly the losses of jobs associated with outsourcing, off-shoring and immigration. Only the Left appears to have a credible response: globalisation, plus social stability, plus a strong, interventionist state.
Is there any way for Conservatives to resolve the trilemma without abandoning either their commitment to the free market, their commitment to social order or their commitment to the small state?
I believe there is, but it will require them to redefine each of these things. They need to recognise not only the inherent vulnerability of the liberalised global economy, but also the continuing distortions imposed on the world market by governments, which have served to increase that vulnerability. Also, social stability should not in fact be sacrosanct: conservatives should be prepared to embrace social change on what might be called the "Leopard" principle, after the aphorism in Giuseppe Tomasi di Lampedusa's great novel Il Gattopardo: "If we want things to remain as they are, things have to change". Finally, they must argue that a small but "smart" state can more effectively regulate the interaction of globalisation and social change than the old Keynesian model.
Therefore, the first part of the Conservative solution to the current crisis is to reaffirm our faith in Adam Smith's vision of the wealth of nations enhanced by free exchange, by pointing out – as Boris Johnson does on the next page – how many government distortions still impede the working of globalisation.
The second part is to learn to stop worrying and love social change. No one wants instability, to be sure, but nor do we want stasis. The key here is to distinguish between an orderly society, in which crime and other forms of disorder are kept to a minimum, and a rigid society, in which order is achieved at the expense of social mobility. The fundamental difference between Conservatives and their opponents should not be (as the Left would have it) that Conservatives favour inequality, while they favour equality. The difference is that Conservatives favour social mobility, and believe that some measures adopted to promote equality – particularly large-scale redistribution of wealth – may have the unintended consequence of reducing mobility.
The third part of my solution is to reflect more deeply on how we think the state should work. One of the more troubling features of British Conservatism in recent years has been the tendency of the party's leaders to allow the Labour government to seize and retain the moral high ground on the issue of public spending and, to a degree, on taxation.
Emancipated by the financial crisis, the governments of the Left on both sides of the Atlantic are currently presiding over explosive increases in public expenditure, public debt and public employment, the short-run benefits of which they almost certainly overestimate, and the medium-term costs of which they almost certainly underestimate.
Yet Conservatives have no articulate response. It is high time they found one, and removed themselves from the horns of the trilemma.
Posted on: Tuesday, March 24, 2009 - 17:03
SOURCE: http://ehistory.osu.edu (4-1-09)
The demand that society legally recognize same-sex marriages is often called revolutionary. And so it is, but not in the way most people assume. The reason it is revolutionary is not because traditional marriage has always been “one man/one woman.” Nor is there anything historically unprecedented in societies accepting and validating same-sex relationships.
But the kind of same-sex relationships envisioned by most gays and lesbians who seek to wed is indeed unprecedented. The demand that individuals should be able to choose their partner solely on the basis of love, sexual attraction, and mutual interests is indeed a huge challenge to traditional marriage.
And it was heterosexuals who pioneered this revolution. Gays and lesbians have simply asked to join it.
The kind of marriage that we cherish today—the kind that opponents of same-sex marriage believe they are defending against desecration by gays and lesbians, and the kind that has inspired so many gays and lesbians who were once suspicious of the institution to now clamor for inclusion—represents a radical break with thousands of years of tradition. If modern Americans fully understood what traditional marriage actually entailed, few, we can be sure, would want any part of it.
The Many Types of Marriage
There is nothing unusual about forms of marriage that involve something other than one man and one woman.
In a majority of cultures throughout history, the most favored form of marriage was polygyny—one man and multiple women. We're not just talking about exotic island cultures or lost tribes in the African jungle. In 70 percent of more than 1,000 societies described in the Human Relations Area Files, polygyny is the preferred (though not necessarily the most frequent) form of marriage.
Polygyny is the family structure most often mentioned in the first five books of the Old Testament. It was common throughout ancient India, the Middle East, Africa, China, and many kingdoms in South America. The upper classes in several regions of what is now Europe also practiced it prior to the 7th century. A more recent study of almost 400 societies, which excluded smaller and less well-known samples, found that 60 percent of these contained significant numbers of polygynous marriages.
Polyandry—one woman and many men—has also been found in some societies. In Tibet and parts of India, Kashmir, and Nepal, a woman may be married to two or more brothers, none of whom can claim exclusive sexual rights to her.
Some societies have recognized marriages that didn’t even unite two live human beings. In China and the Sudan, when two sets of parents wanted to forge closer family ties through marriage, but no living spouse was available, they sometimes married off a child to the "ghost" of a dead son or daughter of the other family. Among the Bella Coola and Kwakiutl native societies of the Pacific Northwest, when two families wished to establish the trading ties that went with becoming in-laws but didn't have two sets of marriageable children available, they might draw up a marriage contract between a son or daughter and a dog belonging to the desired in-laws.
Nor is there anything revolutionary about cultures accepting same-sex relationships. In fact, the majority of cultures surveyed by anthropologists have accepted same-sex relationships under certain circumstances.
In ancient Greece, such relationships were regarded as purer and deeper than heterosexual bonds. The Greek philosopher Plato declared that love was a wonderful emotion, leading men to behave in honorable ways. But, he quickly explained, he was referring not to the love of women, “such as the meaner men feel,” but to the love of a man for another man....
Posted on: Monday, March 23, 2009 - 21:05
SOURCE: CNN (3-23-09)
... Traditionally, American politicians in times of crisis have resisted aggressive interventions by government into business which would tamper with managerial prerogatives and profits.
The political value of this strategy has been clear: It helps elected officials in the White House and Congress sell federal programs in a country stubbornly resistant to many kinds of government interventions in the private sector (though often happy with the interventions after they receive the benefits). It also dampens corporate opposition to government programs in moments when such programs are urgently needed.
In times of war, middle-of-the-road intervention has been the standard model of American governance. Presidents Woodrow Wilson in WWI and Franklin Roosevelt in WWII needed factories to shift quickly from producing goods for private markets to producing wartime materiel.
But Wilson and Roosevelt both avoided government-centered programs, instead relying on business to do the work for government through voluntary arrangements and financial incentives.
Business was allowed to make specific decisions about production and to retain profits (though Congress used progressive taxation to mitigate some of these benefits and Roosevelt issued a largely symbolic executive order to limit the after-tax income of the wealthy in 1942 which Congress overturned in 1943). Incentives were offered from the start of the mobilization to assist in the process.
Both presidents established wartime boards to manage production, boards headed by business leaders: Bernard Baruch for President Wilson and Donald Nelson for FDR.
As FDR's Secretary of War, Henry Stimson, noted in his diary in 1940, "If you are going to try to go to war, or to prepare for war, in a capitalist country, you have to make business make money out of the process or business won't work." The outcome of this arrangement was that business retained enormous power over the production process and executives made money.
In the 1940s, a small group of the nation's biggest corporations received the lion's share of military contracts. Wartime agencies were staffed by "dollar-a-year men" who were business executives temporarily working for the government in exchange for a small wage. According to The New Republic, however, altruism was not their primary motivation: "Concern for the war is secondary to self-interest and the jealous protection of their competitive positions."
During the 1930s, the profits from WWI became an issue that opponents of the war tried to use against FDR as he called for intervention overseas to stop the expansion of Nazi power. During the 1950s and 1960s, the middle-of-the-road model to handle Cold War programs produced immense anger toward the development of a "military-industrial complex" that reaped enormous profits from the fight against communism.
Americans have also depended on middle-of-the-road intervention in periods of economic crisis. One of the striking aspects of the New Deal was that during the worst economic crisis in the nation's history, FDR resisted directly taking over American business.
The National Industrial Recovery Act (1933), for instance, established voluntary codes of production without any mechanism of enforcement. Under General Hugh Johnson, the act depended on business voluntarily complying with the codes. Those doing so received a Blue Eagle sticker to place on their windows so that consumers would know to purchase their goods. The program was failing by 1934, before the Supreme Court ruled it unconstitutional, because businesses were not following the agreements.
The historian Alan Brinkley has written about how the sentiment for intervention into business that did exist faded even more after 1937, when liberals turned to government spending as the method to pump up the economy.
Now we have seen similar results emerge with the financial bailout. The administration quickly backed off from any proposals that could be characterized by their opponents as nationalizing the banks or as taking over the institutions that they were seeking to save.
As a result, like many of their predecessors, the White House and Congress allowed management the flexibility to make decisions, such as the bonuses, which have already come back to haunt them.
During the next round of negotiations, the administration and Congress might rethink their earlier approach, indeed the approach we have taken to economic intervention since the progressive era of the late 19th and early 20th centuries....
Posted on: Monday, March 23, 2009 - 20:54
SOURCE: LAT (3-23-09)
President Obama's $3.55-trillion budget tackles the nation's highest priorities while promising to cut deficits in the long run. Breathtakingly bold and refreshingly honest, the budget speaks louder than words about Obama's confident leadership. But, alas, it only makes more acute the dilemma he faces.
Last month, House Republicans voted unanimously against his stimulus plan, along with 37 Republican senators. Their solidarity turned Obama's bipartisan overtures into symbolic gestures with grave practical consequences. Obama has presented a budget for the next 10 years, but his programs already face an uncertain future unless he can repeat President Franklin D. Roosevelt's electoral triumph of 1934.
Currently, Republicans have neutralized the Democrats' 17-vote majority in the Senate by threatening to filibuster any measures they don't like. It takes a supermajority of 60 votes to end any filibuster, giving the 41 Republican senators a mighty weapon. Passage of the president's bill will depend on the votes of maverick Republicans and the Senate's two independents, assuming there are no Democratic defections or absences.
Add to this situation the historical norm that the president's party loses congressional members in off-year elections. A strong victory, such as Obama's last November, increases the chance of the winning party losing congressional seats. Without Obama's coattails, Democratic candidates will have more trouble winning. Given all this, it becomes obvious that time is short for fulfilling Obama's ambitious proposals for healthcare, energy independence, education and tax increases.
In similar circumstances, Roosevelt recognized the limited horizon he had, and he shared the same sense of urgency about moving the country in a new direction. Roosevelt's challenge was rendered scarier by the rise of Adolf Hitler. The road became even rockier for Roosevelt when his New Deal coalition began to come apart. Without full support from his party's progressives, his bills needed votes from his party's conservatives -- typically lawmakers of the solidly Democratic South, who had their own agenda.
An astute politician, FDR saw the days ahead bringing schisms among his supporters, comebacks from the opposing party and public disenchantment with the government's effectiveness. It was then that he looked beyond the politics of Congress and the Supreme Court to the fourth, informal branch of American government: the public. He would defy the odds of losing in the 1934 off-year election by carefully cultivating the ordinary men and women who had voted for him.
The first media-savvy president, Roosevelt initiated radio talks broadcast throughout the nation. Radio was then a new phenomenon in the U.S., and became the main provider of entertainment and news for many of the poor people who constituted the Democratic constituency.
In his "fireside chats," Roosevelt reassured a nervous people that New Deal measures were helping to get the country back on its feet. With a conversational tone, he created a folksy persona -- his favorite song was "Home on the Range" -- that conveyed calm, confident authority.
FDR kept tabs on the press by regularly canvassing hundreds of local papers. This took a strong stomach, because nine out of 10 of them often described him as a dictator intent on destroying the Constitution in order to plunge the U.S. into the same communist rat hole as the Russians. Turning their vitriol into good-humored joshing, he managed to infuriate his opponents and delight his friends. Where Obama tends to placate his critics, Roosevelt played them.
Roosevelt's opponents on the left were even more worrisome than those on the right, especially Sen. Huey Long of Louisiana. The wily "Kingfish" launched the Share-Our-Wealth Society in 1934 in a preliminary bid for the presidency. His idea of redistributing wealth by raising inheritance and income taxes had considerable appeal at a time when nearly 30% of American men were unemployed.
A demagogue with few scruples, Long was almost as effective as Roosevelt in reaching people through radio. But he lacked Roosevelt's political know-how, which was never better displayed than when he used the fears Long aroused to win support for Democrats.
Recognizing the public's yearning for some certainty in their lives, Roosevelt, in the summer of 1934, proposed a study group to explore a broad social security program based on employee and employer contributions. The president had shrewdly positioned his party between Long's crackpot schemes to soak the rich and the business community's adamant refusal to support a social safety net. Roosevelt counted on the virulence of his opponents to spread word of his plan. As they fulminated, he reassured Americans that he was actually strengthening their traditional institutions.
When the votes were counted that November, the Democrats had won nine new seats in the House and an even more astounding nine new Senate seats, giving the party a two-thirds, filibuster-proof control of Congress. The gain in seats and the margin of control made it an unprecedented victory, which has not been repeated since, although presidents Clinton and George W. Bush did garner some new members in the off-year elections of 1998 and 2002.
Does Roosevelt furnish a template for Obama? The two men share a lot. As president, both face the awesome task of reviving the economy. Obama's personal popularity outstrips support for his party, as did FDR's. Of necessity, Obama's hope for matching Roosevelt's successful record of reform and recovery is going to rest on his pulling off an electoral victory in 2010 like FDR's 76 years ago.
Posted on: Monday, March 23, 2009 - 14:31
SOURCE: NYT blog (3-20-09)
That is not the way the game is usually played. Even former President George W. Bush understood that. Speaking two days after Cheney to an audience of 2,000 people in Calgary, Canada, he was, predictably, asked about Cheney’s remarks and said of his successor:
“He deserves my silence. I love my country a lot more than I love politics. I think it is essential that he be helped in office.”
That’s better — or at least validates the norm of history.
On President Kennedy’s 86th day in office, April 16, 1961, a United States-backed and trained brigade of exiles tried to overthrow Premier Fidel Castro of Cuba by invasion at a place called the Bay of Pigs. It was, of course, a total disaster, one of the great embarrassments in American history. The 35th president deflected some criticism by publicly taking responsibility for the incompetence and stupidity of that adventure. But he also immediately began a series of “national unity” meetings, beginning with former Vice President Richard Nixon, his 1960 opponent, and former President Dwight Eisenhower, revered by much of the country for his military leadership during World War II.
Nixon was first, coming to the White House on April 20. “It was the worst experience of my life,” Kennedy told his old adversary. Then he asked what Nixon would do now. The answer was: “I would find a proper legal cover and then I would go in.”
The president said no to that; he was afraid the Soviet Union would retaliate where the western Allies were weakest, in Berlin. Nixon nodded, then said that whatever Kennedy decided, “I will support you to the hilt.”
Two days later, he invited Eisenhower for lunch at Camp David. Kennedy had never seen the presidential retreat, which Eisenhower had named for his grandson. They walked the paths of the 125-acre reservation and the old five-star general gave the young ex-Navy lieutenant the tongue-lashing of his life. The conversation, reconstructed from Eisenhower’s notes from the meeting, went something like this:
“No one knows how tough this job is until he’s been in it a few months,” Kennedy began.
“Mr. President,” answered Eisenhower, “If you will forgive me, I think I mentioned that to you three months ago.”
“I certainly have learned a lot since then.”
“Mr. President, before you approved this plan, did you have everybody in front of you debating the thing so you got the pros and cons yourself and then made the decision, or did you see these people one at a time?”
“Well, I did have a meeting … I just took their advice.”
“Mr. President, were there any changes in the plan that the Joint Chiefs of Staff had approved?”
“Yes, there were. We did want to call off one bombing sally.”
“Why was that called off? Why did they change plans after the ships were at sea?”
“Well, we felt it necessary that we keep our hand concealed in this affair …”
“Mr. President, how could you expect the world to believe that we had nothing to do with it? Where did those people get the ships to go from Central America to Cuba? Where did they get the weapons? Where did they get all the communications and all the other things they would need? How could you have possibly kept from the world any knowledge that the United States was involved? I believe there is only one thing to do when you get into this kind of thing — it must be a success.”
“I assure you that, hereafter, if we get into anything like this, it is going to be a success.”
“Well, I’m glad to hear that,” said Eisenhower.
It was time to meet the press. Reporters were lined up along the electric fence around the reservation. Kennedy began by saying: “I asked President Eisenhower here to bring him up to date on recent events and get the benefit of his thoughts and experience.”
Despite the criticism he delivered in private, Eisenhower said, “I am all in favor of the United States supporting the man who has to carry the responsibility for our foreign affairs.”
Kennedy helicoptered back to the White House, angrily stalking the Oval Office, cursing his military leaders and the Central Intelligence Agency, ending his tirade with: “How could I have been so stupid?”
He called in the chief architects of the Bay of Pigs invasion plan, the C.I.A. director Allen Dulles, and Richard Bissell, who planned the operation. He told Bissell: “In a Parliamentary system I would resign. In our system the president can’t and doesn’t. You and Allen must go.”
And so, Kennedy began the last eight of his first 100 days in the White House.
Posted on: Monday, March 23, 2009 - 13:08
SOURCE: NYT (3-23-09)
IN what may come to be the definitive line about our current economic crisis, Warren Buffett said on the CNBC program “Squawk Box” this month that the United States economy has “fallen off a cliff.”
The most trusted investor in history went on the air to talk, with characteristic candor and humor, about the horrendous truth we pretty much know, possibly in an effort to calm things down and point toward some answers we don’t yet know. He proceeded to give his views on what went wrong (“everybody thought house prices could go nothing but up ... so you had $11 trillion of residential mortgage debt built on this theory”), on people’s paralyzing fear and confusion (“We are in a very, very vicious negative feedback cycle .... I don’t want this to be the last line of the movie”), and on the absolute necessity of fixing the banks and taking clear, decisive action.
A look back at the handling of another financial crisis a full century ago underlines the point about decisive action. You just don’t want to take the wrong decisive action. Markets today are immeasurably more complex, global, fast-moving and regulated (a lot of good that did) than they were a hundred years ago, but the need for strong leadership has not changed.
In early 1906, the banker Jacob Schiff told a group of colleagues that if the United States did not modernize its banking and currency systems, its economy would, in effect, fall off a cliff — that the country would “have such a panic ... as will make all previous panics look like child’s play.”
Yet the country failed to reform its financial institutions, and conditions deteriorated steadily over the next 20 months. There was a worldwide credit shortage. The American stock market crashed twice. The young Dow Jones industrial average lost half of its value.
In October 1907, when a panic started among trust companies in New York and terrified depositors lined up to get their money out, Schiff’s dire prediction seemed about to come true. The United States had no Federal Reserve, the Treasury secretary did not have much political authority, and the president, Theodore Roosevelt, was off shooting game in Louisiana.
J. Pierpont Morgan, a 70-year-old private banker, quietly took charge of the situation....
Posted on: Monday, March 23, 2009 - 12:45
SOURCE: WSJ (3-21-09)
So great were the hopes for the launch of John F. Kennedy's presidency that even before his inauguration, the president-elect was griping about the pressure he felt to work magic. "I'm sick and tired of reading how we're planning another 'hundred days' of miracles," Kennedy complained to his chief aide, Ted Sorensen, as they composed the inaugural address. "Let's put in that this won't all be finished in a hundred days or a thousand."
JFK knew that the hundred-day yardstick for measuring presidential progress is as misleading as it is ubiquitous. The roundness of the number, though aesthetically seductive, is arbitrary; and while the short time span suggests swift, purposeful action, it really means that newcomers to the office will usually be too green to demonstrate true mastery.
Knowing this, Kennedy and Sorensen inserted into his speech its famous "thousand days" line. For good measure, they added an even bigger caveat, warning that the new administration probably wouldn't meet its goals "even perhaps in our lifetime on this planet." Talk about lowering the bar!
And yet if Kennedy hoped to lessen expectations for his first hundred days, he failed. By April, the hype continued apace. Sorensen was dispatched to draft a memo showing how Kennedy's accomplishments stacked up favorably next to those of Harry Truman and Dwight Eisenhower.
Given the crises that Barack Obama faces, he might do well to lower the bar himself. With April 30 looming, he has managed, to his credit, to pass a stimulus bill (albeit through rougher waters than he hoped), roll out a banking-crisis fix (with fewer details than Wall Street hoped) and propose a mortgage solution (with less money than everyone hoped). He's signed a few ballpoint-ready Democratic bills like the State Children's Health Insurance Program and the Lilly Ledbetter Fair Pay Act, and issued executive orders closing the Guantanamo Bay prison and overturning the anti-abortion "gag rule" for family-planning centers overseas.
A lot of people are still expecting more. In his speech before Congress last month, Obama promised initiatives to tap new sources of domestic energy, contain global warming, invest in education and toughen financial regulations -- not to mention the rather large matter of health-care reform. The hundred days is surely, as historian Arthur Schlesinger once said, a "trap."...
The main reason that the hundred days are an unreliable indicator of future performance is the same reason we watch them so closely: They constitute the period in which the public is just getting to know the new president, and in which the president is just getting to know his new job. New presidents tend to be clueless about governing. Even running a large state can't prepare them for the responsibilities, attention or demands to act quickly -- just as they need to find their footing. (FDR's term hardly defined his legacy; many of his greatest achievements came later.) Sizing up presidents based on their hundred days is like judging a rookie from his first cuts in spring training.
Posted on: Sunday, March 22, 2009 - 21:50
SOURCE: TomDispatch.com (3-22-09)
A block from my apartment, on a still largely mom-and-pop, relatively low-slung stretch of Broadway, two spanking new apartment towers rose just as the good times were ending for New York. As I pass the tower on the west side of Broadway each morning, one of its massive ground-floor windows displays the same eternal message in white letters against a bright red background: "Locate yourself at the center of the fastest expanding portion of the affluent Upper West Side."
Successive windows assure any potential renter that this retail space (10,586 square feet available! 110 feet of frontage! 30 foot ceilings! Multiple configurations possible!) is conveniently located only "steps from the 96th Street subway station, servicing 11 million riders annually."
Here's the catch, though: That building was completed as 2007 ended and yet, were you to peer through a window into the gloom beyond, you would make out only a cavernous space of concrete, pillars, and pipes. All those "square feet" and not the slightest evidence that any business is moving in any time soon. Across Broadway, the same thing is true of the other tower.
That once hopeful paean to an "expanding" and "affluent" neighborhood now seems like a notice from a lost era. Those signs, already oddly forlorn only months after our world began its full-scale economic meltdown, now seem like messages in a bottle floating in from BC: Before the Collapse.
And it's not just new buildings having problems either, judging by the increasing number of metal grills and shutters over storefronts in mid-day, all that brown butcher paper covering the insides of windows, or those omnipresent "for rent" and "for lease" signs hawking "retail space" with the names, phone numbers, and websites of real estate agents.
I hadn't paid much attention to any of this until, running late one drizzly evening about a month ago, and needing a piece of meat for dinner, I decided to stop at Oppenheimer's, a butcher shop only three blocks from home. I had shopped there regularly until a new owner came in some years ago, and then the habit slowly died. The store still had its awning ("Oppenheimer, Established 1964, Prime Meats & Seafood") and the same proud boast of "Steaks and Chops Cut to Order, Oven-ready roasts, Fresh-ground meats, Seasonal favorites," but you couldn't miss the "retail space available" sign in the window and, when I put my face to the glass, the shop's insides had been gutted.
Taken aback, I made my way home and said to my wife, "Did you know that Oppenheimer's closed down?" She replied matter-of-factly, "That was months ago."
Okay, that's me, not likely to win an award for awareness of my surroundings. Still, I soon found myself, notebook in hand, walking the neighborhood and looking. Really looking. Now, understand, in New York City, there's nothing strange about small businesses going down, or buildings going up. It's a city that, since birth, has regularly cannibalized itself.
What's strange in my experience -- a New Yorker born and bred -- is when storefronts, once emptied, aren't quickly repopulated.
Broadway in daylight now seems increasingly like an archeological dig in the making. Those storefronts with their fading decals ("Zagat rated") and their old signs look, for all the world, like teeth knocked out of a mouth. In a city in which a section of Broadway was once known as the Great White Way for its profligate use of electricity, and everything normally is aglow at any hour, these dead commercial spaces feel like so many tiny black holes. Get on the wrong set of streets -- Broadway's hardly the worst -- and New York can easily seem like a creeping vision of Hell, not as fire but as darkness slowly snuffing out the blaze of life.
A Stroll in the Neighborhood
Let me take you, then, on a little tour of the new face of my neighborhood. Along the ten blocks closest to my home, the banks (with one exception), the fast food restaurants (Subway, Dunkin' Donuts, Blimpie), and above all the chain drugstores that crowd onto successive blocks (Rite Aid, Walgreens, Duane Reade) still stand. It's the small places that seem to be dropping like flies.
So here we go up those subway steps at 96th where a branch of WaMu (Washington Mutual Bank, placed in receivership by the FDIC in September 2008 and quickly sold to JP Morgan) stands empty. Now, start walking up the east side of Broadway, past Citibank on 96th and the Bank of America at the corner of 97th, until you come to little Alpine Sound Electronics, or the shell of it anyway, where I used to buy my cheap, waterproof watches for my daily swim at the Y. Now it's gone, though an emphatic "sale, sale, sale, sale, sale" sign over the door is a reminder of its final moments.
Take another sec and check out the other side of the street, where at mid-block a canopy advertising "Moroccan & Indian Home Decoratives… Aromatherapy… Exotic Gifts" still stands, but with a "Store for Rent" sign in the window and a desolate interior -- a couple of ratty shelves, a single chair, a half-filled black garbage bag, and a broom. Right beside it is (or was) a tiny children's clothing store. Its striped awning now sports a gaping hole in its center as if it had been hit by a missile, though its window still says, "Made in New York City… enjoyed worldwide!" Not so much today.
But let's not tarry. Keep going past 98th, by that butchered butcher shop, but do note, next to it, another vacancy, the shell that housed a small wine bar and restaurant, Vinacciolo, that came and went. Only two long, bare, narrow tables remain on a floor scattered with trash.
Now, we're almost at 100th, passing those two towers with their unrented frontages and, on the east side of the street, the classic façade of the old Metro movie house, closed to build one tower, and still empty. The cracked glass of the ticket teller's booth backed by plywood gives the neighborhood that distinctive Last Picture Show feel.
Just above 100th on the west side of Broadway is the store once occupied by Sterling Optical. They moved more than two years ago (I followed them faithfully) and the metal security grill has remained in place ever since. Ditto the storefront next to it, empty but for a little hand-lettered sign on the door, "Fedex Please Knock Hard" -- it better be mighty hard! -- and a tiny "Zagat Rated 2006 Shopping Guide" decal on the window.
Well, you get the idea, if you haven't already experienced the equivalent wherever you live. At 101st, A & S Art/Framing ("custom framing and mirrors"), a sliver of a store, has closed up shop. Between 102nd and 103rd, Planet Kids is emptying out. ("After 18 years we are closing on March 31st...") On 103rd, the Royal Kabab & Curry restaurant has, like the optician, moved on to lower-rent digs without being replaced; and, on 105th, Tokyo Pop, a Japanese restaurant, all of whose wait staff mysteriously spoke English with French accents, has also disappeared, though its papered-over windows uniquely promise a "Pizzabar" in the Spring. (I'm not holding my breath.)
Actually, if you head in just about any direction, the toll is apparent. Go south on Broadway from 96th, for instance, and you pass the same proliferating patches of emptiness. At 93rd, the tiny storefront of the all-detective bookstore Murder Ink, which closed on the last day of 2006 (about the moment when this deepening recession officially began) remains unoccupied.
Further south, there are slaughtered neighborhood restaurants galore. Not surprisingly, even in food-mad New York, people are eating out less and our streets, except perhaps on a Saturday night, seem visibly less populated. Near the corner of 91st, Mary Ann's, a festive Tex-Mex spot, bit the dust; just before 90th, the upscale seafood restaurant Docks Oyster Bar shut its doors so recently that its red "restaurant" sign is still lit ("Docks thanks you all for your loyal patronage over the years but this restaurant is now closed…"); at the corner of 88th, in the spacious two-floor space that used to house Boulevard (on whose paper tablecloths my kids and I drew faces with restaurant-provided crayons), and then a dizzying succession of restaurants whose names escape me, the bar chairs are carefully stored upside down on the bar and a "For Rent" sign is in the window; and, on 77th, Ruby Foo's, a giant pan-Asian joint, described by Zagat's as "Disneyfied," has shut, too.
Only below 72nd street, where the neighborhood gets noticeably tonier, and the banks (TD, HSBC, Capital One, Chase, Bank of America) begin to breed and multiply, and the urban mall stores (Pottery Barn, Barnes & Noble, The Gap, Bed Bath & Beyond) proliferate, do the deaths end (except for a Circuit City branch at the corner of 67th that went down with that bankrupt chain).
Here, stores are still clean, well-lighted places, though a remarkable number of them sport signs that say: "save up to 50%," "up to 70% off…"
9/11, The Sequel
Let's not exaggerate. New York City is not downtown Elkhart, Indiana -- not yet anyway (although the other night on Amsterdam Avenue, just east of Broadway, I noted a block of 12 tiny storefronts, nine of which had been emptied). Yes, rents on avenues like Broadway remain sky-high and, these days, getting a bank loan if you're a small start-up is bloody murder, and the city's zoos are losing their state funding, the hospitals are getting rid of staff, the Metropolitan Museum of Art is having layoffs, the unemployment rate is rising fast, property values are sinking, mass transit riders are facing fare increases as well as major service cuts, and the Greater New York Orchid Society has canceled its annual show. Nonetheless, this global financial capital is still surfing the final modest wavelets of the tsunami of money that flowed through its veins in the good times (some of which continues to head "our" way, thanks to government bailout plans).
Still, as you walk past those patches of darkness, a thought almost can't help but form. For the last seven years, we've been waiting for 9/11, The Sequel, to arrive from Afghanistan or some similar place. The media has regularly featured fantasy scenarios in which Islamic terrorists sneak atomic bombs or "dirty bombs" into cities like New York and set them off. ABC's Charles Gibson even highlighted such a possibility in a Democratic presidential debate. ("I want to go to another question... The next president of the United States may have to deal with a nuclear attack on an American city. I've read a lot about this in recent days. The best nuclear experts in the world say there's a 30 percent chance in the next 10 years...") And the Bush administration claimed as one of its great accomplishments the prevention of a repeat of 9/11.
And yet, in a sense, as on September 11, 2001, maybe we were just looking the wrong way. After all, you might say that an economic dirty bomb did go off in downtown New York and this city (not to say, the nation and the world) has been experiencing a second 9/11 ever since, even if in slow motion.
In my neighborhood, back in those fateful September days in 2001, you could hear the sirens, see the jets streak overhead, catch the acrid smell of the towers and everything chemical in them burning, and like the rest of America, watch those apocalyptic-looking scenes of the towers collapsing in clouds of ash and smoke again and again. But if the look then was apocalyptic, the damage, however grim, was limited.
This time around there's no dust, no ash, no acrid smell, no sirens, no jets, and no brave rescuers either. And yet the effect might, sooner or later, be far more apocalyptic and the lives swallowed up far greater. This time, of course, the fanatical extremists were homegrown. Their "caves" were on Wall Street. They hijacked our economy and did their level best to take down our world.
And they may have come closer than most of us imagine. Alpine Sound and Oppenheimer, Tokyo Pop and Planet Kids, Docks and Ruby Foo's have all gone down (and more are surely headed that way). For the people who owned, or ran, or worked in them, unlike the survivors of the original 9/11, there will be no moving bios in the local papers, no talk of compensation, and no majestic memorials to argue about.
For the perpetrators, who have, at worst, gone home pocketing their millions, there will be no retribution. No invasions will be launched, no missiles shot into homes or hideouts. None of them will be pursued to their lairs, or kidnapped off the streets of New York, or from their palatial mansions, or apartments, or estates. None will be spirited to foreign lands to be imprisoned and tortured. None will be labeled "enemy combatants."
Quite the opposite, in 9/11, The Sequel, the U.S. government is willing to pay many of them and their institutions in the multi-billions for their time and further efforts.
In the second 9/11, all the pain and torture is in the neighborhood.
Posted on: Sunday, March 22, 2009 - 19:02
SOURCE: TheDailyBeast.com (3-17-09)
A specter is haunting the Obama administration, Congress, and just about every big bank in the land—the specter of populism. Not many Americans grasp how “credit default swaps” and “collateralized debt obligations” helped push the country into a deep recession, but they do know that the wise guys and gals who set up such arcana got obscenely rich, and they want to punish them for acting as if they should not be suffering like everyone else, or worse.
The spirit of populism is unlike the specter of communism famously invoked by Marx and Engels. It is ideologically promiscuous, homegrown, and nearly as old as the republic itself. The powers that be will try to bend it to their will or get stymied by an anger they cannot control.
Since at least the 1830s, when President Andrew Jackson fought an epic battle to shut down the “money power” embodied in the Second Bank of the United States, politicians and citizen-activists have spewed outrage toward elites who ignored, corrupted, or betrayed the common people. But left and right have seldom agreed about who those elites are and which “people” are rising up against their evil deeds.
The left traditionally trains its fire on corporate tycoons and the politicians they favor. During the Gilded Age, the primary villains were men like Carnegie and Rockefeller, McKinley, and Hanna. They sinned by driving small businesses out of the marketplace, busting unions, and constricting the currency to create a nation of debtors. During the Great Depression, the main targets were such fierce opponents of the New Deal as Henry Ford and Herbert Hoover. The car maker dispatched goons to beat up union organizers; while the grim-faced president responded to the crisis by denouncing radicals and urging the hungry to rely upon rugged individualism. Their spirit still lingers in the congressional leadership of the Grand Old Party.
But cultural resentments can also spur energetic populist revolts—and economic discontents are rarely separate from those based on style, religion, and status. During the 1930s, Father Charles Coughlin broadcast attacks on “international bankers” and “atheistic” Jews friendly to FDR—and drew a market share Rush Limbaugh would envy. Twenty years later, Senator Joseph McCarthy sided with the “real people” from rural areas and small towns “who are the heart and soul and soil of America.” He vowed to defend them against “the bright young men who are born with silver spoons in their mouths” who were “selling this nation out.”
McCarthy overreached, but he schooled his fellow conservatives well. During the half-century since his swift rise and humiliating fall, Republicans on the right have rarely stopped singing from the same populist hymnal. From Nixon’s praise of “Middle Americans” to Fox News' gleeful derision of John Kerry windsurfing, cultural populism helped the GOP win many an election and consistently put its opponents on the defensive.
Last fall, Sarah Palin tried to rearrange the familiar verses. But her praise of small-town folks “who grow our food, run our factories, and fight our wars” and her cute-tough sneers at “the permanent political establishment” and “Washington elite” earned her more ridicule than votes. With stock prices dropping and foreclosures abounding, the old bogeys of effete liberals and tax-eating bureaucrats had lost much of their power to scare, at least last year.
Yet a defeat for right-wing populism does not mean the left’s version is once again in command. The “people” no longer means what it did during the Great Depression. In the 1930s, blue-collar wage earners flocked to unions that promised “industrial democracy” and applauded FDR when he bashed his opponents as “economic royalists.” For most workers, Wall Street was an exotic neighborhood that few would ever have the cash to visit. Most Americans did not even pay federal income tax until World War II. For most of the nation’s history, the rhetoric of haves and have-nots had a strong claim on reality.
Since the 1950s, however, the U.S. has been a middle-class nation. It also gradually became a stock-owning one, as 401(k) plans and Internet day-trading knit an implicit partnership between financial wizards and nearly everyone with a decent job. As a consequence, in the current economic debacle, far more Americans think of themselves as cheated investors than as horny-handed captives of “the money power.”
That doesn’t, of course, lessen taxpayers’ anger at saving the bosses of Merrill Lynch or AIG from their venal errors—or paying them fat bonuses from the Treasury. But that rage is free-floating, untethered as yet to either a social movement or an insurgent party that together might ensure that such behavior never happens again.
Still, we should understand the public’s wrath instead of fear it. As the historian C. Vann Woodward wrote half-a-century ago, “One must expect and even hope that there will be future upheavals to shock the seats of power and privilege and furnish the periodic therapy that seems necessary to the health of our democracy.” Like the American dream itself, ever present and never fully realized, populism lives too deeply in our culture to be trivialized or replaced. And without it, the partisans in one of our oldest, most charged modes of debate would be left with little to say.
Posted on: Friday, March 20, 2009 - 17:37
SOURCE: Foreign Policy (3-1-09)
The international community needs to recognize a simple, albeit brutal fact: The Democratic Republic of the Congo does not exist. All of the peacekeeping missions, special envoys, interagency processes, and diplomatic initiatives that are predicated on the Congo myth -- the notion that one sovereign power is present in this vast country -- are doomed to fail. It is time to stop pretending otherwise.
Much of Congo's intractability stems from a vast territory that is sparsely populated but packed with natural resources. A mostly landlocked expanse at the heart of Africa, Congo comprises 67 million people from more than 200 ethnic groups. The country is bordered by nine others -- among them some of the continent's weakest states.
A local Kiswahili saying holds, "Congo is a big country -- you will eat it until you tire away!" And indeed, for centuries, this is precisely what Congo's colonial occupiers, its neighbors, and even some of its people have done: eaten away at Congo's vast mineral wealth with little concern for the coherency of the country left behind. Congo has none of the things that make a nation-state: interconnectedness, a government that is able to exert authority consistently in territory beyond the capital, a shared culture that promotes national unity, or a common language. Instead, Congo has become a collection of peoples, groups, interests, and pillagers who coexist at best.
Congo today is a product of its troubled history: a century of brutal colonialism, 30 years of Cold War meddling and misrule under U.S. ally Mobutu Sese Seko, and more than a decade of war following his ouster in 1997. That conflict, which embroiled much of southern Africa, brought rebel leader Laurent Kabila, a one-time revolutionary colleague of Che Guevara, to power. Kabila was assassinated just a few short years later, leaving his son, Joseph Kabila, in office in Kinshasa, Congo's ostensible capital.
The younger Kabila inherited a broken infrastructure and a tenuous national identity shaped by repression and patronage rather than governance and the supply of basic services. Despite winning internationally sponsored elections in 2006, he still struggles to rule over a territory one quarter of the size of the United States, where a nebulous sense of Congolese identity -- based on French, music, and a shared oppressive history -- has not translated into allegiance to the Congolese state. Innumerable secessionist attempts, including those instigated by his father, have turned Congo into ungovernable fiefdoms tenuously linked to the center. Kabila has few tools at his disposal. There is little in the way of a disciplined army and police force; they are more used to living off than serving the population. Like Mobutu before him, Kabila is dependent on patronage to remain in power and on revenue from aid flows and mining taxes....
Posted on: Friday, March 20, 2009 - 16:29