Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: American Prospect (3-12-09)
This crisis doesn't yet have a name. It has all the hallmarks of a depression, but people are understandably reluctant to use the D-word. So let me suggest one: The Great Collapse, since this was both a financial collapse and an ideological one.
This great collapse doesn't have to be a second Great Depression – if government does nearly everything right, and soon. And when we come out the other side, we could have a more decent and sustainable society.
But if government doesn't do more, and fast, this could be worse than the 1930s.
Why? Three big reasons:
Finance: A Doomsday Machine.
The financial system is in far worse shape than it was when the stock market crashed in October 1929. In the 1920s, we had a stock market bubble, mainly because people could play the market "on margin," borrowing to invest in stocks. There were also scams like the original Mr. Ponzi's. As in the present decade, the Federal Reserve helped to enable the game, with low interest rates and few rules.
But today, thanks to "securitization" of loans and the ability of insiders to create exotic and unfathomable financial instruments, the current speculative system makes buying stocks on margin look like child's play. In the aftermath of the crash of 2008, the process of sorting it all out and getting banks functioning again is something that markets simply cannot do.
We are not even clear who owns what. The wise guys on Wall Street invented a doomsday machine from which there is no market escape.
In 1929 when the stock market crashed, the banking system was relatively healthy. Bank customers played these speculative games and took the losses, not banks. This time, the banks drank their own Kool-Aid.
It took until the awful winter of 1932-1933 for the general depression to fully infect the banking system. Over 4,000 banks failed in that winter alone. But Roosevelt's cure -- deposit insurance and a temporary bank holiday to sort out good banks from bad -- quickly got the financial system up and running again. There were fewer bank failures after 1933 than in any year during the 1920s. Today, by contrast, the banking mess is still dragging down the real economy, with no effective cure in sight....
Posted on: Thursday, March 12, 2009 - 13:27
... All hell broke loose in May 1968, when the community control school board in Ocean Hill-Brownsville summarily fired eighteen white unionized educators. The liberal teachers union, led by Albert Shanker, who had marched with King in Selma and sent teachers to Freedom Schools in the South, went on a series of strikes for 36 days. At the time, it was the longest and largest teachers’ strike in American history. Black Power activists further discredited themselves when they embraced egregious anti-Semitism toward Shanker and the heavily Jewish teachers’ union. They praised a student’s poem that began, "Hey, Jew boy, with that yarmulke on your head/You pale-faced Jew boy–I wish you were dead."
What were liberals to do? They knew that it was wrong when white people fired black people without cause, and they knew it was wrong when right-wing business leaders attacked unionized employees. But what was one to think when blacks fired whites and the assault on labor came from the left?
Most New York City liberals, black and white, sided with the Black Power community control activists. But there were exceptions, including A. Philip Randolph, head of the Brotherhood of Sleeping Car Porters. Randolph was a bastion of the movement, according to [Tom] Sugrue "the best-connected and best-known man in black America" when he successfully pressured FDR to end segregation in American defense industries in 1941. In Ocean Hill-Brownsville, Randolph argued that no one should be fired by race; it didn’t matter who was doing the firing. He was joined by Bayard Rustin, a close aide to King, who organized the 1963 March on Washington and criticized community control as "the spiritual descendant of states’ rights."
As Sugrue observes, the community control effort in New York City and other places yielded no real gains in student achievement, nor was it particularly popular among rank-and-file blacks. "The problem was not one of governance, it was one of resources," he concludes. Sugrue draws a similar conclusion from efforts to integrate schools by race: Verda Bradley, the plaintiffs’ mother in the Milliken case, "did not believe that association with white students would help children like Ronald and Richard overcome their educational or ‘cultural’ deficiencies." Instead, she says, "we were upset because they weren’t getting as many materials as some other schools." Sugrue agrees heartily.
This seems like a plausible lesson to draw from the community control and school integration efforts, but it’s not empirically sound. In the 1977 Milliken II case, which Sugrue fails to mention, the Supreme Court ordered substantial extra funding for Detroit schools as an answer to the failure to integrate in Milliken I. But as UCLA’s Gary Orfield has noted, the funding for parent involvement programs, special reading initiatives, better teacher training, and the like yielded no significant benefits. Other cities–like Washington, D.C., and Newark, New Jersey–have outspent their suburban counterparts, with few positive results to show for it.
What are we to make of these findings? That money doesn’t matter in education, or that blacks need to sit next to whites in order to learn? Neither. Instead, a long line of research shows that while money matters a great deal in education, people matter more, and that poor kids of all colors do better in middle-class environments. Sugrue is right to focus on "resources," but he construes the term too narrowly as per-pupil funding. Having classmates who have big dreams, value academic achievement, and don’t disrupt class is an important "resource." So is having a cadre of parents in the school community who volunteer in class and know how to hold school officials accountable. So are excellent teachers, many of whom won’t teach in high-poverty schools because they believe they won’t get much parental support and are worried about their physical safety. For all these reasons and others, separate schools for rich and poor, even when equally funded, are inherently unequal. Detroit schools are inferior not because they have "too many" black students but because they have extreme concentrations of poverty. While black students saw no academic gains in Boston when they integrated with poor and working-class whites, low-income students given the chance to live in and attend schools in affluent white suburbs of Chicago under the Gautreaux program saw substantial gains.
The reality of economic class is also important to understanding the battles Sugrue describes over affirmative action in higher education and employment. In the 1960s, the Civil Rights movement faced a crossroads: With the passage of legislation to outlaw future discrimination in employment and education, what should be done to remedy the legacy of centuries of brutal discrimination against blacks? The issue split the Civil Rights community. Of course, Black Power advocates had no qualms about violating the principle of nondiscrimination when it favored blacks, but many mainstream leaders also made strong arguments that, given the history of this country, it was necessary to temporarily discriminate in favor of blacks to set things right. As early as 1963, Sugrue notes, Whitney Young of the Urban League began pushing for reparations or "compensatory" programs to remedy the nation’s history of discrimination. He called for a "Marshall Plan for the Negro" and the hiring of "Negroes because they are Negroes."
Oddly, Sugrue fails to mention the alternative view, espoused not only by Randolph and Rustin, but by King himself. King struggled with the argument advanced by Whitney Young and others, but he ultimately rejected it. In his 1964 book, Why We Can’t Wait (and in 1967 testimony before the Kerner Commission), King called for "compensatory consideration," noting, "if a man is entered at the starting line in a race three hundred years after another man, the first would have to perform some impossible feat in order to catch up with his fellow runner." But instead of calling for a special program for blacks, as Young had, King called for a color-blind Bill of Rights for the Disadvantaged: "While Negroes form the vast majority of America’s disadvantaged, there are millions of white poor who would also benefit from such a bill." King continued, "It is a simple matter of justice that America, in dealing creatively with the task of raising the Negro from backwardness, should also be rescuing a large stratum of the forgotten white poor." King knew that class-based approaches would be colorblind but not blind to history; that race-specific programs would disrupt the progressive coalition with whites; and that on their merits, America needed a broader Poor People’s Campaign to root out inequality. (By contrast, the Urban League had long called for an alliance between blacks and white employers and opposed unionism.)
Sugrue doesn’t grapple with King’s argument–he simply ignores it, and in doing so ignores a crucial distinction among Northern white critics of the movement. What Sugrue fails to grasp is that there was a sizable subset of whites–think of New York City teachers who were strong supporters of King–who felt betrayed when Black Power activists called for hiring and firing based on race, whether in schools or offices. Nowhere does Sugrue distinguish between those Northern whites who were hostile to black advancement generally and those who objected to changing the rules about nondiscrimination. Nowhere does he distinguish between those whites who were offended by black separatists and those whites who never wanted integration in the first place. And yet when we consider the legacy of the Civil Rights struggles, it is absolutely essential that we keep such distinctions in mind.
Barack Obama’s presidency raises a number of interesting questions for Sugrue and others who take his view of the Northern Civil Rights movement. Clearly, Obama does not agree with the type of proposition advanced by Sugrue, namely that the political costs of embracing Black Power are negligible. Obama’s association with Rev. Jeremiah Wright was the single biggest threat to his campaign, and Obama wisely distanced himself from his former mentor. Indeed, had Obama taken Sugrue’s racial pessimism to heart, he never would have contemplated running for President at all. One has to wonder what Sugrue, who spends 500 pages minimizing the difference between Southern and Northern white attitudes, makes of the fact that John McCain trounced Obama among Southern whites by 38 percentage points but ran roughly even among whites in the rest of the country.
If Obama was right to ignore the type of racial pessimism that pervades Sweet Land of Liberty, he is also smart to reject Sugrue’s undiluted support for racial preference programs. The issue is likely to resurface next year, when the Supreme Court may consider a challenge by conservatives to the use of race in admissions at the University of Texas at Austin. Whereas Sugrue exhibits no concerns about racial preferences, Obama has been torn. During the campaign, he generally supported race-based affirmative action, but he also suggested that his own economically privileged daughters do not deserve affirmative action preferences, and that low-income whites do. In practice, this is an important concession, because 86 percent of blacks at selective universities currently come from middle- or upper-class backgrounds....
Posted on: Wednesday, March 11, 2009 - 20:47
SOURCE: TomDispatch.com (3-10-09)
How come they get to be the hawks? And we get to be the doves? A hawk is a noble bird. A dove. Well, basically it's a pigeon. The sort of bird that, in New York City anyway, messes your building's window sills, is always underfoot, and, along with the city's rats, makes a hearty lunch for the red-tailed hawks which now populate our parks.
Even a turkey would be less of a turkey than a dove. We get to carry that olive twig -- okay, they call it a "branch" -- around in our beaks, but you can bet your bippy that they get the olives, or, more likely, the opportunity to trample the olive groves into oil.
They get to swoop and prey. We get to pace the sidelines, cooing our complaints. Their ideas -- it never matters how visibly dumb they are -- get tried. Ours never do. And when theirs fail miserably, they get to recalibrate and try again. We never get to try once.
That's because it's well accepted that they are "realists" and we are "dreamers," or "utopians," or maybe, like most doves, vegans. If you're not addicted to force (and so failure), you're simply not a part of the grand scheme of things, of the world as it is.
They get hundreds of billions of dollars to play with. We don't get bus fare to Washington. Oh, and then, at about the point when everything they've planned for has gone to hell, they suddenly turn to us and, claiming we're just so many naysayers, ask belligerently what the hell we'd do now. What's our plan anyway?
And to make matters worse, even though they have a dismal record when it comes to predicting what their plans will do, they don't hesitate to explain to us with complete confidence just what sort of catastrophes our ideas will surely lead to. If we force them to withdraw from such-and-such a country in such-and-such a way, we'll be responsible for nothing short of "genocide," or ensure that a nuclear weapon goes off in an American city, or worse. And the media believes them, despite the fact that they've been proven wrong time and again, and so gives them carte blanche as "experts."
I'm talking, of course, about the U.S. military's top brass (uniforms and all those medals are just so imposing!), the key civilians in the Pentagon, the rest of the national security establishment, the hordes of think-tank strategists in our capital, and the political leaders who go with them. Talk about failing upwards! Despite everything, hawks rule; doves never even get the chance to take off. And as the novelist Kurt Vonnegut used to say, so it goes.
Force as the Solution
And now for a tad of history...
After the attacks of September 11, 2001, those few who suggested that the appropriate response might be intensive, determined global police action, not the loosing of the might of the U.S. military on Afghanistan, were derisively hooted from the room. It was so obvious that an invasion was not only a necessity, but couldn't fail against the ragtag Taliban and their al-Qaedan allies, not given the military might of the planet's "sole superpower." Even now, when it comes to that invasion "lite" and the subsequent occupation of Afghanistan from which unending disaster ensued, no mea culpas have been offered; nor does anyone in the mainstream pay the slightest attention to those who worried about, or warned against, such an approach.
Nor was serious attention paid when, before the invasion of Iraq, millions of people worldwide poured into the streets of global cities to say loud and clear: Don't do it! It'll be a catastrophe!
Instead, they did it. It was a catastrophe and both the antiwar crowds and the critics of that moment have been largely forgotten -- those who weren't simply discredited -- while the enthusiasts for the invasion, military and civilian, now often transformed into "critics" of how it and its aftermath were handled, remain the "experts" on what the U.S. should do next. Counterintuitive as it might seem, they are the ones whose assessments still count -- and that's par for the course.
Once the invasion was over, doves said, okay, at least don't occupy the country long term. Don't build massive bases. Get out while you can -- and quickly. Of course, no one who mattered paid the slightest heed to such wrong-headedness in the wake of such a historic "victory." And so it went. And so it goes.
In our world as it is, force remains the essential arbiter. And when its application leads to catastrophe, the response is... simply more of the same.
Consider this conundrum logically. On the one hand, you have a method that, in our moment, has failed the United States repeatedly. On the other, you have something largely untried, an attempt to settle problems without resorting to force or, at least, with minimal force or the use of force as a genuine last resort in defense of nation, kin, and self. Yet, their efforts and our money go only into developing better ways of using force, and ever-more-powerful and eerie ways of delivering it.
Or have I missed a sudden proliferation of peace task forces and think tanks in Washington? Has anyone seen the suggestion, first made in 1792 by signer of the Declaration of Independence Benjamin Rush, and more recently by Congressman Dennis Kucinich, for the establishment of a cabinet-level Department of Peace go anywhere -- other than into the bottom drawer where the dossier on Kucinich's sighting of a UFO is stored?
On the one hand, failure; on the other, the unknown. You would think that, every now and then, the "opposites" principle the character George on Seinfeld applies to his failed life would hold. As Jerry Seinfeld tells him: "If every instinct you have is wrong, then the opposite would have to be right."
In Washington, though, what our former Secretary of Defense called the "known knowns" are invariably preferred, and so war rooms, not peace rooms, prevail and, as in Afghanistan today, military commanders remain our ultimate experts for whom every day is a potential do-over.
Force as Religion
In these last years in Washington, force became something close to an American religion. The Bush administration's top officials were all fundamentalists in their singular belief in the efficacy of force. In fact, they arrived convinced that an all-powerful, techno-wondrous military, unrivaled on the planet, left them with the ability to project force in ways no other power ever had. When it came to remaking the world, anything seemed possible.
What this meant was that an extreme version of military fundamentalism went hand-in-hand with an extreme version of economic fundamentalism. Today, both of these fundamentalisms are collapsing, even if a pared down version of the military half of the equation is anything but dead.
In those same years, Americans also began to genuflect before the idea of our military in ways previously unimaginable. They pledged their unending support for "our troops," now commonly referred to as "warriors," who were repeatedly hailed as the bravest, most valiant, most successful fighters around, part of the most awesome military ever. It -- and they -- simply could do no wrong. Given this faith, when things did go wrong, mistakes would never be blamed on the military.
As a result, while actual American soldiers were sent halfway across the planet in a distinctly unreverential way on their third, fourth, and fifth tours of duty (with few here giving much of a damn), Americans treated the idea of those "warriors" and their "mission" with ritualistic fervor.
A cold-eyed look at the record of the U.S. military in these last years, however, tells quite a different tale. It's no small thing, after all, that U.S. military actions in two disastrous wars managed to burnish the reputation of one of the uglier fallen dictators on the planet and pave the way for the return, as a national resistance force, of a brutish, retrograde, failed regime almost universally rejected by its own people when it fled in November 2001. I'm speaking, of course, about Iraq's Saddam Hussein and the Taliban of Afghanistan. Worse yet, the ever greater application of force, including recently the repeated firing of missiles from CIA-operated drone aircraft into the Pashtun borderlands of Pakistan, has resulted in the spread of the Taliban, religious extremism, terrorism, and war into the heartland of Pakistan, a nuclear-armed country now being destabilized.
What makes all this more remarkable is that, unlike the Soviet Union in Afghanistan in the 1980s, twenty-first century America had no impressive enemies to face in September 2001. In losing its brutal Afghan War, the Soviets confronted a superpower that was more than its match -- us. In Afghanistan today, it's estimated that the Taliban consists of but 10,000-15,000 relatively lightly armed guerrillas. The Iraq insurgency was probably only marginally larger than that at its height. Al-Qaeda, with a capability for major operations every couple of years, was even less impressive, despite the 9/11 televisual spectacular it put on.
You would have to go back to Spain at the beginning of the nineteenth century to find the match for this moment. Then, the most advanced military in Europe, Napoleon's army, an imperial force advancing (like the American military in recent years) under the banner of liberty, ran into a meat grinder of an insurgency from the Sunni fundamentalists of that day -- enraged Catholic peasants, often led by their priests. (If you want to know what that was like, check out Goya's unforgettable series of prints, The Disasters of War.)
In Iraq, over nearly six years, the U.S. military has recalibrated so many times it's dizzying. Who now recalls the "revolution in military affairs" that created the "lite," high-tech military which launched a "decapitation" campaign that killed plenty of Iraqi civilians but left all of that country's leaders with their heads still firmly on their shoulders; or the "shock and awe" campaign, which mainly awed Washington -- and that was before the occupation, the Sunni insurgency, and a civil war took root, after which tactical changes came and went with names like "get tough," "oil spot" and "ink blot," the "Salvador option," "clear and hold," and "the surge" as well as the "clear, hold, and build" counterinsurgency strategy which is now supposedly being transferred to Afghanistan.
Today, Iraq, still one of the most dangerous places on the planet, is far quieter than at the height of the civil violence of 2005-2006 and so the "surge," overseen by Generals David Petraeus and Ray Odierno, is said in Washington to have worked, even if it hasn't succeeded in resolving the underlying ethnic, political, and religious tensions let loose by the American invasion. A recent article on the inside pages of the New York Times, however, offers a somewhat different perspective on the effectiveness of military force in Iraq in these last years. Little aid, Times journalist Timothy Williams reports, is now available to Iraq's estimated 740,000 widows, most made so, it seems, by years of war and violence; and that figure, he indicates, may be an undercount, given the chaos in which that country remains.
If you were capable of adding to the dead husbands of those hundreds of thousands of "war widows," the dead wives, dead sisters, dead daughters, dead grandmothers and grandfathers, as well as the children who died thanks, in one way or another, to the violence of those years, not to speak of the large group of dead young men who were not yet married, you would surely have a staggering figure, a toll of perhaps a million or more Iraqis from an estimated prewar population of perhaps 26 million. That level of slaughter might qualify in scale as near genocidal. (It's worth adding that, as in the Vietnam era so many decades ago, mainstream critics of antiwar critics continue to regularly suggest that any kind of "precipitous" withdrawal of American troops would almost certainly result in a genocidal slaughter, even as such a slaughter has taken place with the troops there.)
If the staggering numbers of dead civilians in Iraq's post-2003 killing fields, and those who are still dying, are a measure of Washington's "success," it's the success of the undertaker.
Taking Options Off the Table
Let's face it, the U.S. is addicted to force, and when force fails to achieve its purposes (for failure, too, is addictive), yet more force is applied in marginally different ways under radically different names.
Now much of Washington and the media have indeed reached a consensus that the Bush administration's use of force was a disaster of the first order. As a result, they have generally concluded that, in Iraq, we must be especially careful not to stop applying it too quickly lest we destabilize what's left of that country and, in Afghanistan, that achieving "stability" calls for the deployment of significantly more forces which, of course, will use significantly more force.
In Iraq, where President Obama is indeed talking about a withdrawal that would remove all U.S. forces by the end of 2011, we also know, thanks to Thomas Ricks's latest book The Gamble, that America's top generals, including Centcom commander Petraeus and General Odierno, the top commander in Iraq, believe we'll still be fighting in that country in 2015. In the meantime, the general who commands U.S. forces in Afghanistan, David McKiernan, is already talking about years more of fighting at surge levels -- and suggesting that yet more U.S. troops will be needed. ("I think... that this is not a temporary force uplift, that it's going to need to be sustained for some period of time. I can't give you an exact number of -- the year that it would be. But I've said I'm trying to look out for the next three to four or five years.")
In the meantime, the Obama administration is hoping to find some extra help by calling together a regional conference of interested countries, possibly including Iran, and by using the military to negotiate with and peel off "moderate" Taliban backers, while it sends in at least 17,000 more troops. This is what passes for new foreign policy thinking in Washington.
In the meantime, the Afghan chain of command has been further militarized. It now stretches from retired Marine General Jim Jones, the new national security advisor, through Centcom commander Petraeus and Afghan commander McKiernan to a soon-to-retire Army general, Karl Eikenberry, who reportedly will be appointed U.S. ambassador in Kabul. Meanwhile, in southern Afghanistan, as well as along the Pakistani border, peace and stabilization will involve the further application of force with results that shouldn't surprise us.
To summarize: They can be wrong a hundred times and when they are, they get to try every cockamamie scheme and call it anything they want. We don't even have names for whatever peace strategies might be used. And while Iran is, however grudgingly, however imperially, being invited to the Afghan table, antiwar activists and critics, no matter how on the mark they might have been, remain the equivalent of an American Hamas.
On the other hand, if you've been a "hawk" and a pundit, or one of those retired generals who talked us, however ineptly, through our latest wars (like the TV financial analysts who, in mid-meltdown, were still calling on us to buy more stocks and assuring us of the solidity of A.I.G. and Citigroup), you can't be wrong often enough to be asked to leave the table at which the Great Game is played.
Oh, and with this in mind, a small tip for prospective "doves" within the Obama administration: Be careful not to be too on the mark in your analysis or, at least, too loud about proclaiming it. On this subject, history is a suicide bomber and it's coming for you. After all, the worst thing in any administration is to be a dove and be right.
As David Halberstam memorably wrote in his history of the Vietnam War, The Best and the Brightest, of hawkish future Secretary of State Dean Rusk, "So [he] was once again promoted (the best people, who had correctly predicted the fall of China, would see their careers destroyed, but Dean Rusk, who had failed to predict the Chinese entry into the Korean War, would see his career accelerate.) There had to be a moral for him here: if you are wrong on the hawkish side of an event you are all right; if you are accurate on the dovish side you are in trouble."
Leaving the Comfort Zone
Let's be clear here. In our world, any application of imperial force is part of the problem, not part of the solution. It doesn't work. We can't afford it. It's not in "the national interest." The last seven years have made this abundantly clear for those who care to look.
Let's be clear on this, too: If we keep sending military people in to solve our problems, they will, not surprisingly, turn to military solutions. Whatever lip service they offer to diplomacy and other possible paths, they will, in the end, prefer force by whatever label. It's what they know. However uncomfortable its results, it's still their comfort zone.
That's why the American president is commander-in-chief -- exactly so that military men aren't left to "solve" our problems for us.
Let's be clear on this as well: Nobody knows what antiwar solutions would make sense, much less succeed, since so little effort or money or time or experimentation has gone into them, but we know a lot about what force can't do in our world.
Wouldn't it make sense to put a small percentage of the long-term effort and money that the Pentagon now profligately invests in force and the means to deliver it into strategies for peace, and into the de-escalation of the use of force as a solution (and of the global imperial mission that goes with it)? Shouldn't somebody consider, for instance, whether the principle found in so many individual martial arts -- that defense, and even striking reserves of power, can be found not in meeting force with blunt force, but in giving way before force -- might apply to more collective situations? Don't such groups as the Taliban and al-Qaeda feed off of, thrive and recruit off of, military action against them as well as the human destruction and the attention that goes with it?
Isn't it time for us to begin to take force off that "table" on which, officials in Washington always insist, lie "all options," but especially smash-mouth ones? Isn't it time to suggest that there can be no national interest when it comes to military action in Iraq or Afghanistan, only an imperial interest? Isn't it time to suggest that, as bad as things are, as little as we know how to do anything else, simply fighting on in Iraq or Afghanistan until 2015 or 2020, as our economic system collapses around our ears, can't be a solution to anything?
Decades ago, after visiting American troops in Vietnam, singer Johnny Cash was asked by a reporter whether that didn't make him a hawk. "No, no, that don't make me a hawk," he responded. "But I said if you watch the helicopters bring in the wounded boys, and then you go into the wards and sing for 'em and try and do your best to cheer 'em up, so they can get back home, it might make you a dove with claws." Later, he would call that image "stupid." Maybe that's because he didn't go all the way. Maybe he meant "a falcon of peace."
Posted on: Tuesday, March 10, 2009 - 21:43
SOURCE: Sandbox, a blog run by Martin Kramer (3-7-09)
[Prof. Kramer, the author of Ivory Towers on Sand: The Failure of Middle East Studies in America, was a full-time tenured academic at Tel Aviv University. He is now a senior fellow at the Adelson Institute for Strategic Studies and at the Olin Institute at Harvard.]
How important has resentment of Israel been to Al Qaeda's terrorism? Here is one side of the argument, by an American who knows Saudi Arabia well:
The heart of the poison is the Israel-Palestinian conundrum. When I was in Saudi Arabia, I was told by Saudi friends that on Saudi TV there were three terrorists who came out and spoke. Essentially the story they told was that they had been recruited to fight for the Palestinians against the Israelis, but that once in the training camp, their trainers gradually shifted their focus away from the Israelis to the monarchy in Saudi Arabia and to the United States. So the recruitment of terrorists has a great deal to do with the animus that arises from that continuing and worsening situation.
And here is the opposing view, by an American who knows the Kingdom equally well:
Mr. bin Laden's principal point, in pursuing this campaign of violence against the United States, has nothing to do with Israel. It has to do with the American military presence in Saudi Arabia, in connection with the Iran-Iraq issue. No doubt the question of American relations with Israel adds to the emotional heat of his opposition and adds to his appeal in the region. But this is not his main point.
So now you've heard two sides of the debate. Who made the first statement? Charles "Chas" Freeman, former U.S. ambassador to Saudi Arabia and the Obama administration's nominee to head the National Intelligence Council (NIC). Who made the second statement? Charles "Chas" Freeman, former U.S. ambassador to Saudi Arabia and the Obama administration's nominee to head the National Intelligence Council (NIC).
The first quote dates from January 2004, the second from October 1998. The difference between them is 9/11, when it became the Saudi line to point to Israel's conflict with the Palestinians as the "root cause" of the September 11 attacks. The initial promoter of this approach in the United States (well before Walt and Mearsheimer) was Saudi billionaire Prince Alwaleed. "At times like this one," Alwaleed announced a month after 9/11, "we must address some of the issues that led to such a criminal attack. I believe the government of the United States of America should re-examine its policies in the Middle East and adopt a more balanced stance towards the Palestinian cause." That statement led then-mayor of New York Rudy Giuliani to return a $10 million check Alwaleed had just presented to him for a special "Twin Towers" relief fund.
Since 9/11 Freeman hasn't repeated his 1998 assessment ("nothing to do with Israel"), instead sticking with his Saudi-pleasing spin of 2004 ("the heart of the poison is the Israel-Palestinian conundrum"). It's not hard to figure out why. When the 9/11 Commission interviewed him in 2003, it noted that his position as president of the Middle East Policy Council "requires regular trips to the Persian Gulf for fundraising. While there, he meets with many senior Saudi officials." In 2006, Freeman finally went the extra mile, offering this explanation for 9/11:
We have paid heavily and often in treasure for our unflinching support and unstinting subsidies of Israel's approach to managing its relations with the Arabs. Five years ago, we began to pay with the blood of our citizens here at home.
Freeman was now touting precisely the sort of nonsense he had previously dismissed out of hand. And he hit paydirt for doing it: within months, Prince Alwaleed wrote a check to Freeman's Middle East Policy Council for $1 million. Here is a photo of Freeman, supplicant, visiting Alwaleed in the latter's Riyadh HQ.
Does Freeman really believe that Israel's actions caused Bin Laden's terror? Who knows? He's put forward two completely contradictory explanations. One would like to believe that in his heart of hearts, he still knows what he knew in 1998, that Bin Laden's "campaign of violence against the United States, has nothing to do with Israel." One would like to believe that in 2006, he was cynically shilling for the Saudis when he blamed 9/11 on "our unflinching support and unstinting subsidies of Israel's approach." Because if he wasn't just cynically shilling, he's gone off the rails. (Actually, there is a third Freeman explanation for 9/11, so bizarre that I don't know quite how to categorize it. Parse this: "What 9/11 showed is that if we bomb people, they bomb back.")
If Freeman's gone off the rails, he obviously shouldn't be taken out of mothballs to coordinate U.S. intelligence. But that's so even if he was just cynically shilling. “An ambassador," said Sir Henry Wotton, "is an honest man sent abroad to lie for his country.” In America, an ex-ambassador is all too often an honest man hired from abroad to lie to his own country. Freeman may have an impeccable record of past service, just as his old buddies attest. But if the National Intelligence Council and its products are to earn the respect of the American people, the NIC chair cannot be suspected of ever having deliberately twisted the truth into something else for our consumption, especially on a crucial issue of national security and at the behest of foreign interests.
Chas Freeman doesn't pass that test.
Posted on: Tuesday, March 10, 2009 - 21:42
SOURCE: Daniel Pipes website/Jerusalem Post (3-11-09)
With Binyamin Netanyahu, head of the Likud Party, about to become Israel's next prime minister, one wonders whether he will stick to his more controversial campaign promises – not his pledge to confront the Iranian threat, which is widely backed, but promises such as ending Hamas control of Gaza or keeping the Golan Heights.
Two indicators suggest what may lie ahead: (1) the general pattern of the four Likud prime ministers since 1977 and (2) specifically, Netanyahu's own record as one of those four.
Levi Eshkol (p.m. 1963-69) once acknowledged the deceit of Israeli politics: "I never promised to keep my promise!" In this spirit, three out of the four Likud leaders campaigned right and governed left, breaking their campaign promises not to retreat from territories Israel seized in 1967.
- Menachem Begin (p.m. 1977-83) was elected in 1977 on a nationalist platform that included annexing parts of the West Bank; he instead removed all troops and civilians from the Sinai Peninsula.
- Yitzhak Shamir (p.m. most of 1983-92) ran on a platform against giving land to Arabs and kept his word.
- Netanyahu (p.m. 1996-99) promised to retain the Golan Heights but nearly traded away that territory; opposed the Oslo accords but ceded more control in the Hebron and Wye accords to the Palestinian Authority.
- Ariel Sharon (p.m. 2001-06) won the 2003 elections arguing against a unilateral withdrawal from Gaza, then did exactly that, withdrawing all troops and civilians.
Surveying Likud's history, Nicole Jansezian notes with irony at Newsmax that "While Palestinian, American and European leaders worry how Israel's shift to the right will negatively impact the peace process, perhaps the only ones who need to fear an Israeli right-wing government is the Israeli right wing."
Shamir's opinion of Netanyahu plummeted after watching his actions as prime minister, seeing him by 1998 as willing to do just about anything "to continue to be elected and to hold on to the seat of prime minister." I went through a similar process of disillusionment, celebrating Netanyahu's accession in 1996 but becoming so soured on his lack of principles that I reluctantly preferred his Labor opponent in the 1999 elections.
What now, as Netanyahu prepares to take office again? Neither his party's history, nor his own biography, nor his character, nor murmurs coming out of Israel suggest that he will keep his electoral promises. Indeed, Netanyahu already flunked his first test: after 65 of Israel's 120 members of parliament informed President Shimon Peres that they supported Netanyahu for prime minister, Peres on Feb. 20 gave Netanyahu a chance to form a government.
Netanyahu proceeded to ditch those allies in favor of forming a "national unity" government with leftist parties, notably Kadima and Labor. He even announced that his biggest mistake in 1996 had been not to form a government with Labor: "In retrospect, I should have sought national unity, and I'm seeking to correct that today." Kadima and Labor appear to have decided to go into the opposition, foiling Netanyahu's plans. But that he preferred a coalition with the left reveals the lightness of his campaign statements.
Along these lines, when asked by an interviewer, "You're not the right-wing hawk they describe in the papers?" Netanyahu proudly recalled the betrayal of his promises in the 1990s: "I'm the person who did the Wye agreement and the Hebron agreement in the search for peace."
On the Golan Heights, diplomacy has apparently begun. U.S. Secretary of State Hillary Clinton says the importance of Syria-Israel talks "cannot be overstated." Despite Netanyahu's ostensibly rejecting these negotiations, a close aide observed that a breakthrough with Damascus offers a way to curry favor with the Obama administration and Netanyahu would expect Washington in return "to give him a break with the Palestinians."
Insiders assure me Netanyahu has matured and I hope they are right. But a Likud leader observed while watching the coalition talks, "Bibi is selling everything out to the coalition partners. He doesn't care about us. He only cares about himself." Similarly, Netanyahu's opponents expect him to pursue his personal agenda: Yaron Ezrahi, a political scientist at Hebrew University, says Netanyahu has little compunction "in sacrificing an ideological position as long as it keeps him in power."
Even as I hope to be pleasantly surprised, familiar patterns do make me worry.
Posted on: Tuesday, March 10, 2009 - 21:28
SOURCE: Informed Comment (Blog run by Juan Cole) (3-9-09)
The Pakistani Taliban are not going to take over the Pakistani government. That worry doesn't keep me up at night. They are small, and operate in a rugged, remote area of the country. They can set off bombs and be a destabilizing force. But a few thousand tribesmen can't take over a country of 165 million that has a large urban middle class and a highly organized, professional army.
In contrast, the increasingly rancorous conflict between the left of center, largely secular Pakistan People's Party and the right of center, big-landlord Muslim League, has the potential to tear the country apart.
So here is some important background. Pakistan is made up of four ethnically and linguistically based provinces, Sindh, Baluchistan, the North-West Frontier, and Punjab. Punjab is Pakistan's most populous province, with 55 percent of the population and much of the country's wealth.
The PPP, led by the Bhutto family of Sindh, has a national organization and won seats from all over the country in the parliamentary election of Feb. 2008. But its firmest base is Sindh, which is a province divided into a very poor rural Sindhi population and the big urban port of Karachi, which is mainly populated by Urdu-speakers whose families came from India at Partition in 1947. Karachi's politics is now dominated by the MQM, an Urdu-speakers' secular nationalist party, which has developed an alliance with the Pakistan People's Party.
Baluchistan only has 5 percent of the country's population, and is vast, rugged and arid. It may have a lot of natural gas but who knows? Right now it is not a big player. The North-West Frontier is populated by Pushtuns, organized as somewhat egalitarian clans. They have been most deeply affected by the wars in Afghanistan, and a movement of Pakistani Taliban is active there, though most Pushtuns are not fundamentalists or militants.
So Punjab is the real prize for a politician, sort of the California of Pakistani politics, being rich agriculturally but also having a dynamic urban sector.
In last year's elections, the PPP took Sindh and was able to find political allies in Baluchistan and the North-West Frontier. But the Muslim League took Punjab. Shahbaz Sharif, the brother of the Muslim League's leader, Nawaz Sharif, was the Chief Minister of Punjab.
PPP leader Asif Ali Zardari initially sought an alliance with the Muslim League against then-dictator Pervez Musharraf, and pledged that Shahbaz would remain Chief Minister of Punjab even though the PPP became the dominant party in the federal parliament.
A conflict developed between Nawaz Sharif and PPP leader Asif Ali Zardari over the deposed chief justice Iftikhar Chaudhury. Dictator Musharraf had dismissed Chaudhury in spring of 2007 for opposing some of his policies. Pakistan's massive legal establishment began holding rallies and demanding that the chief justice be reinstated, which he was in summer 2007. Musharraf was under pressure from Washington to become a civilian president. But he found out that fall that the supreme court would not allow this transition because the constitution requires that a military man have been out of the service for 2 years before becoming president. So Musharraf just dismissed the whole supreme court, including the recently reinstated Chaudhury, and appointed a new court, which sycophantically recognized him as president.
When he was allowed to come back to Pakistan from exile in Saudi Arabia, Nawaz Sharif, who had been overthrown as prime minister in 1999 by Musharraf, began demanding that Iftikhar Chaudhury and the old, dismissed, supreme court be reinstated.
After the PPP won the parliamentary elections, its leader, Zardari, declined to reinstate Chaudhury. Zardari was afraid that the chief justice might reinstate the corruption charges against him, which had been amnestied by Musharraf.
Zardari was elected president last September. The conflict between him and the Muslim League simmered along.
But just last week, the supreme court dismissed Shahbaz Sharif as chief minister of Punjab, and barred him and Nawaz Sharif from running for office. Some suspect the court of acting at President Zardari's behest.
The Sharif brothers say that this court is in any case illegitimate and refuse to recognize its rulings, since it is the fruit of a poisoned tree, i.e., the arbitrary creature of a desperate military dictator 18 months ago.
The attorneys are also still angry over the failure of Zardari to reinstate Chaudhury and the others.
So on March 15, the Muslim League (which is more conservative landlord than religious fundamentalist, despite the name) is organizing a "long march" on parliament to protest the current supreme court and the recent decisions it issued against the Sharifs.
On the hustings, Nawaz Sharif said that the only thing that could save Pakistan now was a revolution, and announced that he had "raised the standard of rebellion."
An adviser to the Interior Ministry (equivalent to our Homeland Security) then came out yesterday and warned that the Sharifs could be charged with sedition if they talk like that.
So now you have people talking about the danger of a repeat of the partition of Pakistan into Bangladesh and West Pakistan in 1971. I presume the Muslim League would get Punjab and the PPP would get the other three provinces.
For Pakistan's two major civilian parties, who only 7 months ago rid the country of a military dictator, to go mano a mano at each other like this is potentially tragic. If they destabilize the country, they could tempt the military to come back out of the barracks and make yet another coup. Short of that, there could be faction-fighting in villages and cities.
Pakistan is a nuclear state, so this degree of instability is especially worrying. The danger is not a take-over by the Taliban, but rather a coup (led by whom of what views?) or blood in the streets.
Meanwhile, dictator-in-retirement Musharraf blames the Pakistan Muslim League (N) -- the "N" stands for Nawaz -- for the crisis.
The Taliban are small potatoes compared to this clash of titans.
Posted on: Tuesday, March 10, 2009 - 14:39
Until around 1960 this cluster dominated much public discourse, as it does not today. Happily, Jones and Cox don’t waste any of their thirty-five pages revisiting the overdone analysis of reasons for their relative decline in size, status, and noise. Old stuff. The new stuff here is their set of findings about clergy voices and actions today (as of last August, that is). While the mainliners have enemies, mainly among conservative Protestants and think-tanks on the right, they go about their work in thousands of vital congregations and more struggling ones. Those enemies like to portray them as ideological leftists; Clergy Voices does not find them so. The word “diffuse” shows up in the reports. They have voices in public affairs, but rarely and mildly try to project or enforce social justice “dogma.” Some see their limits as a result of lay reaction to leftism, but current members are not massively assaulted with radical preachments and policies.
Politicians who would organize and exploit them, as they do some other religious groups, would have difficulty doing so; constituencies vary too much by denomination, region, social class, and height of boundaries that might be used to keep members in and others out. Their members may have strong social justice commitments, but they blend them with those in other religions or in the secular order. Yes, half call themselves “liberal,” because they are not afraid of the label, but a third are “conservative.” Over half are Democrat-“leaning” and one-third “claim a Republican affiliation.” No surprise here: More than three-quarters want the federal government to do more on the social problems front, especially in respect to environmental and health care issues. They fall into the “church-state separation” camp, and far more are worried about public officials who are too close to religious leaders than about those who are too far.
Four out of five speak up on hunger and poverty issues but—and this fits the stereotype—only one-fourth “often discussed the issues of abortion and capital punishment.” They are friendlier than not to gay and lesbian people, and a majority supports their rights. Clergy? Ninety-three percent are still white, eighty percent male, only twenty-nine percent believe in biblical inerrancy, almost eighty percent say they are strongly interested in politics, but most don’t preach on specific legislative or candidacy themes. They and their members pitch in on other than directly political causes and prefer broad-based works of mercy through voluntary associations in church and beyond it. On the large screen, most “are firmly opposed to the war in Iraq and most think Israel has to make greater concessions to achieve Middle East peace.” That, in our reading, is the solitary issue that prompts editorial and talk-show talk. They are generally for control of guns. Maybe that’s a clue to the reasoning of those who attack them: Taking on guns, they attack what may be America’s real religion.
Find information on the sponsoring agency of the survey at www.publicreligion.org; the survey itself is available at http://www.publicreligion.org/research/?id=167.
Posted on: Tuesday, March 10, 2009 - 01:50
SOURCE: http://www.ourfuture.org (blog) (3-9-09)
The following resolution and argument was prepared at the invitation of the Yale Political Union (YPU). An organization composed of Yale student political groups from across the political spectrum, the YPU is the oldest collegiate debating society in America. Presented on February 25, 2009, the resolution passed by a margin of 2 to 1.
In his 1939 book – "It Is Later Than You Think: The Need for a Militant Democracy" – Max Lerner proffered: “The basic story in the American past, the only story ultimately worth the telling, is the story of the struggle between the creative and the frustrating elements in the American democratic adventure.”
With those words in mind, I move that: “Americans Should Embrace their Radical History.” And to second the resolution, I call upon a voice from 1930, one of America’s finest voices, the voice of Franklin Delano Roosevelt, a man destined to become the greatest president of the twentieth century.
Looking back on 10 years of conservative-Republican presidential administrations and what they had wrought – an intensifying economic crisis and spreading human misery that would come to be known as the Great Depression – FDR, who was then the Governor of New York State, said: “There is no question in my mind that it is time for the country to become fairly radical for a generation.”
And do we not see what Roosevelt saw then?
We have experienced three decades of conservative ascendance and power. Three decades, in which well-funded conservative movements, and ambitious and determined political and economic elites, secured power and subordinated the public good to corporate priorities, enriched the rich at the expense of working people, hollowed out the nation’s economy and public infrastructure, and harnessed religion and patriotism to the pursuit of power and wealth. In short, we have endured thirty years of rightwing political reaction and class war from above intended to undo or undermine the progressive advances of the 1930s and 1960s.
Plus, if all that were not enough, we have suffered eight years of a presidency – the presidency of George W. Bush – marked not only by the tragedies of 9/11, war in Iraq, Hurricane Katrina, and the collapse of an interstate highway bridge in Minnesota, but also by assaults on our civil liberties, the denigration of human rights, breaches in the wall separating church and state, tax cuts for the wealthy, a campaign to privatize Social Security, continued corporate attacks on labor unions, and the pursuit of a politics of fear and loathing – all of which has not only led us to the brink of economic and social catastrophe, but also effectively placed the American dream and the nation’s exceptional purpose and promise under siege.
We clearly see the consequences of conservative rule or, more accurately, misrule – not to mention, liberal deference to it.
I therefore urge this assembly to resolve that, “We Americans should embrace our radical history” – and as FDR himself averred – “make the nation radical for a generation.”
I do so not only because the circumstances we confront demand a radical response, but also because to do otherwise would be to deny who we are. Our shared past calls on us to do so. Our own historical longings urge us to do so. And Americans yet to be await our determination in doing so.
Let us start by recalling our history and reminding ourselves who we are – a by no means simple or easy task. For as ruling classes have been ever wont to do, America’s own powers that be have regularly sought to control the telling of the past in favor of controlling the present and the future.
I could take you through a long list of New Right initiatives – from Ronald Reagan to George W. Bush – intended to determine the shape and content of American memory, consciousness, and imagination. But let’s just consider the popular little volume and video – Rediscovering God in America – authored and produced by one of America’s smartest and most prominent conservatives, former Speaker of the House, Newt Gingrich. Therein, Gingrich, a PhD in History, takes us on a walking tour of Washington DC – a walking tour in which he guides us around the Mall to discuss both the monuments and the figures they memorialize.
Sounds nice, right? But there’s more to it. Along the way Gingrich presents a narrative of U.S. history that attributes America’s founding, survival, and progress to Divine will, to our unceasing faith in and devotion to God, and to our having sustained God’s and religion’s presence in the public square.
Fair enough, you might say. However, after bizarrely and vehemently warning that “There is no attack on American culture more destructive and more historically dishonest than the secular left’s relentless effort to drive God out of the public square,” Gingrich not only discounts or ignores the fact that most of the leading Founders were deists not Christians and that – in one of the most revolutionary acts of the age – they wrote a “Godless Constitution” which provided for the separation of church and state. He also somehow neglects to mention that those originally most determined to assure that separation included not just the usual suspects, but also Christian evangelicals.
Nevertheless – with all due respect to God and the faithful among us – we must remember who we are, for as Wilson Carey McWilliams proffered twenty-five years ago: “A people’s memory sets the measure of its political freedom. The old times give us models and standards by which to judge our time; what has been suggests what might have been and may yet be. Remembering lifts us out of bondage to the present, and political recollection calls us back from the specialization of everyday existence, allowing us to see ourselves as a people sharing a heritage and a public life.”
So let us not forget that we are the descendants of Revolutionaries – of men and women who, inspired by an immigrant working-class pamphleteer, Thomas Paine, through words such as “The sun never shined on a cause of greater worth,” “We have it in our power to begin the world over again,” and “These are the times that try men’s souls,” not only turned their colonial rebellion into a war for independence, but also transformed themselves into a nation of citizens, not subjects; endowed their new nation with exceptional purpose and promise; and launched a world-historic experiment in extending and deepening freedom, equality, and democracy.
Let us not forget that we are the descendants of generations of radicals – of men and women, native-born and immigrant, who struggled not only to realize the American dream, but also to expand the “We” in “We the People.” Recognizing the contradictions between the nation’s ideals and realities – and rejecting the notion that the American experiment had reached its limits – evangelicals, workingmen’s advocates, freethinkers, slaves and abolitionists, suffragists, populists, labor unionists, socialists, anarchists, and progressives, respectively, dissented from their established churches; pressed for the rights of workingmen; insisted on the separation of church and state; resisted their masters; demanded an end to slavery; campaigned for the equality of women; challenged the power of property and officialdom; and together made the nineteenth century an age not only of growth, expansion, conflict, and the accumulation of capital, but also of militant democracy.
And let us not forget that we are the children and grandchildren of America’s most progressive generation, the men and women who confronted the Great Depression and the Second World War – the men and women who not only made the “We” in “We the People” all the more inclusive, but also subjected big business to public account and regulation; empowered government to address the needs of working people; organized labor unions; fought for their rights; established Social Security; expanded the nation’s public infrastructure; refurbished its physical environment; and defeated the tyrannies of German fascism and Japanese imperialism.
And you yourselves are the children of a generation who – for all of our many faults and failings – marched for civil rights, pursued the vision of a Great Society, challenged cultural prohibitions and inhibitions, pushed open institutional doors for women and people of color, and protested an imperial war in Southeast Asia. Admittedly, we made mistakes, regrettable mistakes. But we also made America better and more promising in the process.
Finally, let us never forget that we are the descendants of Americans who – confronting seemingly overwhelming crises in the 1770s, 1860s, and 1930s and ‘40s – not only rescued the United States from division, defeat, and devastation, but also succeeded, against great odds and expectations, in extending and deepening freedom, equality, and democracy further than they had ever reached before.
Still, I do not argue that we “should embrace our radical history” merely because we owe it to past generations to do so – though that in itself is a good, strong, and compelling reason to do so.
I further contend that we should embrace our radical history because we owe it to ourselves – and, ultimately, to Americans yet to come – to do so.
As our greatest democratic poet Walt Whitman rightly saw it: “There must be continual additions to our great experiment of how much liberty society will bear.”
Or, even better, as the progressive journalist Henry Demarest Lloyd put it a century ago – in words that I believe you will immediately grasp: “The price of liberty is something more than eternal vigilance. There must also be eternal advance. We can save the rights we have inherited from our fathers only by winning new ones to bequeath our children.”
Those words do speak to you – don’t they? You know why? Because you are Americans – and no less so than any previous generation of Americans, you – all of us – remain radicals at heart.
Yes, the likes of Newsweek editor Jon Meacham tell us that “America remains a center-right nation.” And yes, former Reagan speechwriter and now Wall Street Journal columnist Peggy Noonan has very graciously reminded us that our newly-inaugurated President Obama “would be most unwise to rouse the sleeping giant that is conservatism.” But such talk ignores or denies what we ourselves feel and have been feeling for some time….
While we may not yet fully recognize it, we ourselves continue to feel the radical impulse and democratic imperative that generations of Americans, through their struggles, passed on to us – or better said, endowed or imbued us with. Truly, we never stopped feeling them.
Ask yourselves this: Why was it that in the midst of the seemingly most conservative political era since the 1920s, Americans passionately sought to recall, honor, celebrate, and engage America’s most revolutionary and progressive generations – the nation’s Founders and the so-called Greatest Generation and its greatest leader, FDR?
Most of you are probably too young to remember the mid-1990s explosion of interest in the likes of Washington, Franklin, Adams, Jefferson, Madison, Hamilton, and yes, Paine – an explosion of interest that editors and academics alike somewhat dismissively referred to as “Founders’ Chic.”
And you may also be too young to remember the even grander explosion of interest in FDR and the young men and women of the Great Depression who went on to fight the Second World War – an explosion of interest that turned books like Tom Brokaw’s The Greatest Generation into bestsellers and films like Steven Spielberg’s Saving Private Ryan into blockbuster hits; that made television series such as HBO’s Band of Brothers and Ken Burns’ The War major events; and that instigated innumerable popular gatherings around the country. Indeed, an explosion of interest that led us to erect two new grand monuments in the very heart of the nation’s capital: one to Franklin Roosevelt and the other to the 16,000,000 veterans of World War II.
But even if you do remember those developments, you may not have critically considered what they represented. And you would not have been alone in not doing so.
Consider the phenomenal interest in the Greatest Generation and its greatest leader. While commentators marveled at its scale and intensity, they never seemed to grasp the most profound meaning of it all. Discussing the New Deal as merely a massive program of economic recovery and the Second World War as just a series of vast military struggles – and describing Americans’ expressions of admiration and affection as if it were all one big farewell party – mainstream media folk never really appreciated or acknowledged either the radical-democratic achievements of the men and women of the 1930s and 1940s or the radical-democratic anxieties and yearnings that motivated the popular desire to thank, honor, and celebrate the generation that was passing away.
In fact, many a conservative – after complaining that FDR didn’t even deserve a monument – used the interest in and admiration for the Greatest Generation as an opportunity to attack the Sixties Generation for challenging the nation’s political and cultural order and opposing the war in Vietnam. And sadly enough, leftists did little better. They either belittled the attention to the wartime generation as nothing more than nostalgia, media hype, and the commercialization of the past or – in a somewhat paranoid fashion – charged that government and media were orchestrating a campaign to eradicate the nation's "Vietnam syndrome" in favor of new "imperial adventures."
Such critics – right and left – never really considered the connection between what Americans were experiencing and what they might actually have been trying to say and do. We, however, should not fail to consider it.
Recall that in November 1992, Americans – despite the nation’s victories in the very long Cold War and the very brief Gulf War – turned out the Republican incumbent George H.W. Bush in favor of Democrat Bill Clinton, the presidential candidate who not only emphasized “change,” but also promised to address the needs of middle- and working-class families by, among other things, investing in the nation’s already crumbling public infrastructure, protecting the environment, and establishing a system of universal national health care.
Of course, if Americans truly were expecting renewed liberalism, they were to be sadly disappointed, for Clinton quickly betrayed those who had worked to place him in office by making his first priority the passage of the North American Free Trade Agreement, an initiative proposed by Republicans and promoted by big corporations.
And we know what happened next. In the wake of NAFTA’s passage and the death of the promised progressive endeavors, Republicans, led by Newt Gingrich, took control of both houses of Congress for the first time in forty years.
Change and growth ensued, but not always or exactly the sort hoped for in 1992. In addition to learning of “ethnic cleansing” in the Balkans and genocidal civil wars in Africa, Americans witnessed accelerating globalization, persistent corporate “downsizing,” the further deregulation of capital and privatization of public goods and services, the steady erosion of the nation’s industrial base and decay of its public infrastructure, continuing assaults on labor, increasing concentration of wealth, intensifying material insecurities, the termination of Aid to Families with Dependent Children, the growth of illegal immigration, virulent “culture wars,” the emergence of rightwing militias, foreign and domestic terrorist attacks, burnings of black churches, killings at family-planning clinics, the impeachment of a president, and a quite possibly stolen presidential election in 2000.
Never get nostalgic about the 1990s!
Politicians and pundits of every sort described Americans as deeply divided, angry, and cynical, and Americans surely had substantial cause to feel that way. And yet they did not – at least not in the fashion asserted by all the media talk and images.
More serious studies showed that while Americans felt anxious, resentful, and even pessimistic, they continued to subscribe to both the “American creed of liberty, equality, and democracy” and the “melting pot theory of national identity.” They also continued to believe – even while recognizing that Americans had far from always lived up to them – that those very ideals and aspirations defined what it meant to be an American.
In other words, Americans still possessed a shared understanding of and commitment to the nation’s historic purpose and promise – though they did wonder seriously about its prospects and possibilities.
What politicians and pundits missed – or tried to obscure – about the popular desire and effort to reconnect with the Founders and the Greatest Generation was that Americans were doing exactly what Americans have always done when they sense that the American dream and the nation’s historic purpose and promise are in jeopardy.
Almost instinctively, they were looking back – back to those who originally and most powerfully expressed what it meant to be an American – most particularly to those who, facing crises themselves, made the United States radically freer, more equal, and more democratic in the process.
Even after thirty years of conservative and corporate rule – even after concerted efforts to make us forget, or at least confuse us about, our history and what it has to say to us – we, too, not only yearn to redeem America’s purpose and promise. We also find ourselves looking back and reaching out to America’s Revolutionary and radical pasts. The task, however – a task made all the more urgent by the crisis we face – is to embrace them. And perhaps we are not so far from doing just that…
In fact, maybe the resolution before us is not as fantastic as it seems… For if we look closely, we might well see that Americans are already reaching out to grab hold of and embrace their radical history. We might well see that instead of simply saying “We Americans should embrace our radical history,” we should actually be leaning into it and saying: “YES, We Americans should embrace our radical history.” Or – to quote a recently popular refrain – “Yes, we can.”
Now don’t get me wrong. I am not calling Barack Obama a radical. I’ll leave that to Rush Limbaugh or Sarah Palin (or to her smarter body-double, Tina Fey). Nevertheless, something critical, something progressive – and possibly even radical – seems to be happening.
Think back five weeks – to January 20th – to the inauguration of our new president. Inaugurations are always historic occasions, especially when one party replaces another. But this time it was historic in an even grander sense, for Americans had elected a black man to their nation’s highest office.
Of course, racism persists. But the day that Barack Obama took the oath of office was not simply a break with the past. It was truly a day of transcendence.
Looking from the Mall up to the Capitol – either standing there in the cold or watching on television – Americans, not only African Americans, but all Americans, had reason to take pride and even shed tears of joy.
And yet, perhaps there was even more going on than that – that is, more than the talking-head politicians, pundits, and presidential scholars pointed out to us.
Here’s what I mean…
Shift the vantage point and look out on the Mall from the Capitol as our new president did. Now if Newt Gingrich – or the Reverend Rick Warren – were talking to us, they would tell us that we were witnessing the American people assembled together in the presence of the Almighty.
But I saw something else that day – and maybe many of you did, too. I saw something that made me think that as much as Obama’s ascendance to the presidency represented a radical break with the past, it also represented something oh-so-very American, and yet again, in that very way, something also truly radical and truly promising.
I saw two million Americans gathered together amidst monuments and memorials that testify not so much to God’s beneficence – or, at least, not to that alone – but all the more to our persistent aspirations and perennial efforts to extend and deepen freedom, equality, and democracy.
I saw two million Americans – in all their wonderful diversity – celebrating their democratic lives, peering into the future with hope and expectation, and pressing up against monuments and memorials that render nothing less than a grand narrative of revolution and radicalism.
There they were – there we were – standing beneath a monument to a man who led a revolutionary army; chaired a constitutional convention that announced to the world that here in the United States “We the People” rule; and served as the first president of a pioneering democratic republic.
There they were, standing before a memorial to the man who wrote the words declaring “all men are created equal.”
There they were, standing in front of a monument to the man who – leading the Union through a bloody Civil War – proclaimed a “new birth of freedom” and called on his fellow citizens to devote themselves to assuring that “government of the people, by the people, for the people, shall not perish from the earth.”
There they were, standing by a memorial to the man who – in the very toughest of times – articulated our grandest and most radical aspirations in terms of four essential freedoms: “Freedom of speech and expression… Freedom of worship… Freedom from want… Freedom from Fear…”
And closer in, there they were at a memorial to our parents and grandparents, Americans who, in their many millions, fought and labored for those Four Freedoms.
One could almost hear Marian Anderson singing “God Bless America” and Martin Luther King, Jr., pronouncing “I Have a Dream,” from the steps of the Lincoln Memorial.
And if that were not enough, we actually heard our new president essentially calling them all forth to stand with us. He spoke of our revolutionary and radical pasts. He spoke of America’s continuing purpose and promise. And he spoke of what we needed to do by reciting the words that Washington ordered read to his troops on that cold and fateful Christmas eve in 1776 – words of Thomas Paine from his revolutionary pamphlet, The Crisis: "Let it be told to the future world... that in the depth of winter, when nothing but hope and virtue could survive ... that the city and the country, alarmed at one common danger, came forth to meet [it].”
Again, I am not saying that Obama himself is a radical – Hell, he used Paine’s words, but never mentioned Paine’s name!
But really, the point isn’t whether Obama is or isn’t a radical. It’s that we ourselves need to be.
Only then might we make him the great democratic president that we require. And even more crucially, only then – in the best of our traditions – might we redeem America’s purpose and promise and make an even greater nation for ourselves and for those who follow us.
We have much to do. In addition to repairing the damage to the Constitution of the past eight years, we must enact the Employee Free Choice Act, establish universal health care, re-appropriate the wealth appropriated from working people, invest in new technologies, refurbish our public spaces and national infrastructure, democratize corporations, and pursue a New Deal on immigration.
Propelled by the memory and legacy of those who came before us, the yearnings and aspirations we ourselves feel, and the responsibility we have to those yet to come, we can pursue not only recovery and reconstruction, but also the making of a freer, more equal, and more democratic America.
So – leaning into it, and saying it as I should have said it to begin with – I call on this House to join me in resolving that “We Americans SHOULD EMBRACE our radical history.”
Posted on: Monday, March 9, 2009 - 23:00
SOURCE: Christian Science Monitor (3-9-09)
The American newspaper is dead. Long live the American newspaper!
OK, so reports of the demise of daily journalism are a bit premature. But you can't open up the newspapers today without reading news about the papers. Declining circulation and advertising revenues have forced newsrooms to trim their staffs, which means less real reporting. A few city papers have closed – the most recent victim was Denver's 150-year-old Rocky Mountain News – while others fill their pages with fluff pieces or wire-service stories. Put simply, it's getting too expensive to gather news.
So here's a novel idea: Let's get university professors to do it. For real. And, best of all, free of charge.
Remember, most professors aren't paid for what they write now. When I publish an article in an academic journal, I don't earn a cent. But I also don't engage more than a handful of readers, mainly fellow specialists in my own field.
It wasn't always that way. A hundred years ago, many of the leading lights in the social sciences and the humanities wrote for the popular press. If we want to revive the press – as well as our own struggling disciplines – we might look to their example.
Consider Robert E. Park, founder of the "Chicago School" of sociology and one of the most prominent intellectuals of the early 20th century. After earning his PhD in 1904 from the University of Heidelberg, in Germany, Park became secretary and press agent of the Congo Reform Association. Park's muckraking magazine articles exposed Belgium's vicious atrocities in the Congo, helping to turn world opinion against the colonial regime of King Leopold.
Park returned to academia in 1914, when he was hired by the University of Chicago. But he never saw a bright line between his new professorial duties and his old journalistic ones. Indeed, Park insisted, a sociologist should be "a kind of superreporter" who covers "long-term trends" rather than "what, on the surface, merely seems to be going on."
In my own field, history, top scholars also cultivated lay audiences: most notably, the husband-and-wife team of Charles and Mary Beard produced bestselling textbooks alongside a broad sheaf of magazine and newspaper articles. Ditto for the new discipline of anthropology, where Franz Boas and his students – especially Ruth Benedict and Margaret Mead – wrote regularly for the popular press.
Today, with the press itself in peril, we need to do the same. Economists could report on the recession, of course, providing on-the-ground analyses of bank failures, housing foreclosures, and more. Biologists could cover climate change and other environmental issues, English professors could write about the book and film industries, and anthropologists could send dispatches from faraway lands.
At the professional schools, news-gathering opportunities would be even greater. Law professors could cover knotty questions before the Supreme Court, ranging from the detention of suspected terrorists to church-state separation. Medical school professors could describe the latest advances in patient treatment, architecture scholars could write about design, and professors of education could report on school reform.
So what would be in it for them? Right now, nothing. The way you get ahead in academia is to write for other academics, period. But we can change that, too.
Suppose that 30 or 40 prominent research universities issued a joint statement, urging their faculty to publish in popular venues – and promising to consider such articles in promotion and salary decisions. Believe me, you'd see more and more professors writing for the newspaper.
To be sure, some faculty would continue to turn up their noses at it. As the historian Patricia Limerick has quipped, these professors resemble the people nobody wanted to dance with in high school; as a defense mechanism, they pretend that they never wanted to dance in the first place.
But I think plenty of academicians would want to dance, if the academy rewarded it. And it would be good for their disciplines, too. These are tough times for the social sciences and humanities, especially, which need to justify their budgets to already-strapped state legislatures and donors. What better way to prove your worth to the public than to write for it?
Professors won't be a panacea for newspapers, of course. Many of us don't know how to write for lay readers, first of all, so we'll have to learn. But we have a lot to teach, too, about nearly every subject that a paper might cover. And did I mention that we'll work for free?
Posted on: Monday, March 9, 2009 - 22:50
SOURCE: http://popecenter.org (1-15-09)
Since the mid-1980s, I have taught a standard survey of literature course to undergraduates in California, Michigan, and most recently upstate New York. This course introduces canonical texts, from Homer’s Odyssey to early medieval texts such as Beowulf or the Icelandic sagas, and sometimes later works. Over the years, I have chronicled what I believe to be a broad retreat from genuine literacy into a new, orally based “post-literacy” of emotion-driven mentality, egocentrism, “presentism,” and logical obtuseness. This retreat will have serious consequences for our society.
This three-part essay will describe my observations, based on the written responses of my students on exams. My course is a general education requirement that most students must take, usually in their freshman or sophomore year. Frequently, it serves as a prerequisite for other courses in English or the humanities, and where I currently teach, it is required for most education majors. In sum, this course offers a useful occasion for the general observation of undergraduates.
The Way They Used to Be
Even in the mid-1980s, student interest in literature was low. I was a teaching assistant and teaching fellow at U.C.L.A.—a first-tier branch of a world-class state university. Except for a few English majors, however, most students saw the course as an obstacle to be hurdled or, better yet, circumvented. Poetry-averse engineering majors and haughty pre-law types volubly asserted the unfairness and inconvenience of having to study Shakespeare or Cervantes. Many read the assigned books desultorily and quite a few disdained to read any of them at all. Obsessively clever, they figured out ways to cheat on the quizzes that I imposed to keep them to the reading schedule. When it came to writing a discursive examination, the consequences of “blowing off the course” tended to manifest themselves dramatically. Instead of specific allusions and meaningful argument, one collected blue book after blue book of vapid generality, half-remembered lecture phrases, and boilerplate rhetorical devices learned (or half-learned) in high school.
In the main, however, students used competent language. They completed their sentences in grammar not too defective, and they deployed vocabulary more or less at an adult level. And in those days one still saw students actually reading books, even if they were not the books assigned in their classes. I recall a moment when it seemed that every frat-boy on campus was lugging around the paperback of The World According to Garp. (I don't know why.)
As inexplicable as the Garp enthusiasm was, it stands out in contrast with the situation today. Reading is no longer a casual activity for students, and there appears to be a correlation between the dominant student attitude to reading and the level of student competence in writing.
Adults know what propels the descent: proliferating electronic media, video games, an ideologically inspired de-emphasis of rigorous learning at all levels of education, and a pervasive attitude of entitlement that students now absorb into their deficient souls the way babies drink nourishment from a mother’s breast. Flashing lights and three-minute “rap” songs stultify cognitive development. MTV, that bastion of the youth audience, nowadays specializes less in the music video than in the “reality show,” with its endless, formless palaver among “twenty-somethings” confined in a house.
These models of comportment are definitely oral rather than literate. A number of publications over the last decade, such as Mark Bauerlein’s The Dumbest Generation, have remarked on the phenomenon of a noticeable restriction of cognitive range in college undergraduates. What Bauerlein sees, I see: young people cut off from any elevated sense of who they are, frozen in the “cool” indifference of pop-culture, largely confined to the restrictions of the present moment, and hostile to maturity.
Until about five years ago, in a sustained spasm of unjustifiable hope, I regularly asked students in all my classes to write down the titles of the last five or ten books that they had read voluntarily or, if not voluntarily, then under compulsion in high school or college. In the very first years of my career, in California, a few students wrote down half a dozen titles, often including Huckleberry Finn and Catcher in the Rye or maybe a science fiction novel by Robert A. Heinlein. The occasional ferocious Ayn Rand follower turned up who had memorized long passages from Atlas Shrugged.
For ten years, the list has been non-existent. The only books that most students have read are the politically correct parables that nowadays figure in the high school curriculum in place of what used quaintly to go by the name of the “classics.” If, at seventeen, I had taken Maya Angelou’s I Know Why the Caged Bird Sings to represent “literature,” I might have developed no interest in books, either. Thus student reading ability remains extraordinarily low even when, in college, the instructor figures out, as I have, how to cajole them into doing it. Students typically cannot make reliable or secure statements about characters or describe events in the story or, much less, frame an interpretation of this or that legend or saga. (I say “typically” to allow for the exceptions.) But by and large, even when today’s representative undergraduate has painfully “read” Beowulf, he has less to say about it than his faking counterpart of 1987, and what he says he says in a version of written English that hardly ascends above a level of sub-literacy....
Posted on: Sunday, March 8, 2009 - 22:05
SOURCE: Frontpagemag.com (3-4-09)
After 9/11, the social-democratic political philosopher, Michael Walzer, asked the readers of Dissent magazine a tough question: “Can there be a decent Left?” His essay was in reality an appeal for its creation, since Walzer was smart enough to realize that so many who spoke in the name of the Left that horrific year were anything but. But now, so many years later, little has changed. If anyone has any doubts about this, there is no better place to start than Jamie Glazov’s important new book, United in Hate.
Glazov discusses both the philosophical underpinnings of the leftist world-view and the current form it’s taking in the U.S. Starting from the premise that existing reality in democratic America has to be destroyed and that “the enemy of my enemy is my friend,” large segments of the left today seek to forge an alliance with America’s enemies, once the Communist world, now the forces of radical Islam. Glazov traces and seeks to analyze the causes of this movement from the left’s support of “the red flag of proletarian revolution” to that of the “black flag of Islamic jihad.”
In many cases, Glazov shows how the same people who once sang the praises of Stalin as an anti-fascist leader now praise Islamic terrorists who seek to attack the West. While many learned from 9/11 that the West had real and very dangerous enemies, major figures of the once pro-Soviet Left apparently felt rejuvenated, viewing the attack on the twin towers as the revenge of the masses for American oppression of the Third World. For these people, Glazov writes, 9/11 was a “personal vindication,” since they saw “only poetic justice in American commercial airplanes plunging into American buildings packed with people.”
Now, they hoped that the project they thought had failed -- the replacement of democratic capitalism with a revolutionary socialist society -- might again have legs. Somehow the fact that radical Islam seeks to return the world to the seventh century as the basis of the social order does not seem to faze them. Now, at least, they had a movement they could support which would enable them to realize the goal they once had -- the destruction of capitalism and the collapse of the United States.
Hence, we find that Tom Hayden, who in the 1960s supported victory for Ho Chi Minh’s forces and praised Communist Vietnam as a "rice-roots democracy," traveled to London to meet with and embrace the Iraqi advocate of terror Muqtada al-Sadr; the left-wing MP George Galloway traveled to Syria and offered public support to every enemy of the United States, from America’s opponents in Iraq to Fidel Castro and Hugo Chavez. One must first ask what leads those who benefit from every freedom the West affords to endorse its most totalitarian opponents, and to falsely depict them as freedom fighters -- as the filmmaker Michael Moore once called Saddam’s fighters in Iraq.
The answer is provided by Glazov in a chapter that takes off from the kind of philosophical analysis the popular longshoreman author Eric Hoffer became famous for in the 1950s. Hoffer explored the question: What makes some people true believers? I recall that the Left of the time excoriated him as a simpleton who tried to deflect Americans from supporting opposition movements by unjustly describing their members as fanatics. Glazov will do well if his critics compare him to Hoffer, who in fact was quite effective in exposing the hypocrisies of the pro-Soviet fellow-travelers.
Like Hoffer’s Western pro-communists, when the truth emerges about what life is like under fundamentalist Islam, many leftists publicly deny the reports -- as much of the Left did when it first learned about the horrors Pol Pot inflicted on Cambodia. Privately, Glazov argues, the Leftist approves of what has taken place. Indeed, Glazov emphasizes, the violence and horror is what “attracts him in the first place.” They know that without it, the eventual earthly paradise will never be born. After all, as the Bolsheviks used to argue, “you can’t cook an omelet without breaking eggs.”
Glazov mentions the response of the once famous pro-Soviet and later Maoist apologist Anna Louise Strong, who was herself for a while under suspicion of being an American agent. Glazov notes that Strong was undisturbed by the arrests and deaths of her own friends in the Stalinist purges. I once attended a talk of hers in New York City, to which she arrived after Stalin kicked her out of the USSR and condemned her as a spy. Most of the Left viewed her as a traitor, and few people went to her appearance. I was one of them. What Strong told the audience -- and I will never forget this -- was that she viewed herself as part of a group of flies in the way of an ongoing train. The train had to get to the station; it did not matter if some of them were killed as the train moved towards its destination. At least Strong was consistent. Even when she herself was falsely accused of being an American agent -- she justified all that had been said against her as necessary for the revolution’s success. Or as Glazov explains this phenomenon: “Because believers consider themselves higher life forms, their inferiors become not only expendable, but necessary waste.”
Having established how the Left thinks about the world, Glazov proceeds to document the Old and New Left’s support of Marxist tyrannies, from Lenin’s and then Stalin’s Soviet Union to Mao’s China, Castro’s Cuba, and the Sandinistas’ Nicaragua in the 1980s. It is a familiar story, but one that unfortunately still needs retelling, so that new generations learn the lessons so many seem to have forgotten (or perhaps never learned). One always finds surprise and shock at how so many supposedly learned and respected thinkers became willing dupes of those who were among the 20th Century’s major monsters and killers. One feels embarrassed to read of the stupidities uttered by the likes of George Bernard Shaw, Bertolt Brecht, Walter Duranty and others. And yet, when times change and even the leftist remnant acknowledges that their ancestors were perhaps naïve -- although of course correct to defend the idea of the Soviet revolution -- they repeat the pattern when it comes to the new tyrannical regimes they support.
The New Left took pride that it saw through the myths of Soviet Communism. It could not say the same when it came to the new regimes modeled on the old Communist ones. “The New Leftists of the sixties and seventies renewed,” Glazov writes, “…the Left’s tradition of venerating and visiting death cult tyrannies.” A diverse group including Noam Chomsky, Norman Mailer, Simone de Beauvoir, Shirley MacLaine and others fell in love with the regimes they favored: Cuba, China and Nicaragua. Many forget that the late Susan Sontag, acclaimed by so many as one of our greatest cultural figures and intellectuals, wrote that “the Cuban revolution is astonishingly free of repression.” On Vietnam, Sontag wrote that “Vietnam offered the key to a systematic criticism of America.” When Shirley MacLaine went to Mao’s China -- one of the first to make a pilgrimage to the once forbidden land -- she found a sense of purpose that was missing for her in America. No one quarreled, she wrote, and “it slowly dawned on me that perhaps human beings could be taught anything.”
But Glazov’s last two sections, on Islamism and the new romance of today’s far Left with terrorism, are by far the most important part of his book. Here, he addresses the seeming anomaly of how atheistic radicals have taken to supporting fundamentalist Islamic radicals. Building upon the work of Paul Berman, Laurent Murawiec and others, Glazov shows how “fascism and communism were centrally involved in the birth and development of Islamism.” Indeed, he reveals that its roots lay in Marxist-Leninist philosophy, which had a great influence on radical Islam’s founding fathers. Although the Leninists did not share Islamism’s religious component, Islamism did share with “the secular totalitarianisms the impulse to create an earthly paradise by washing the slate clean with human blood.”
Glazov’s chapter, “To Hate a Woman,” is simply chilling. Even those who already know how Islamists treat women will be horrified about the details that Glazov has amassed. Demonstrating how Islamists believe theologically in women’s inferiority, Glazov shows how violence against women takes place from the very moment of a female’s birth. When it is time for marriage, he explains how allowing Muslim men to take many wives “minimizes the ability of a couple to nurture a deep emotional connection.” And of course, the man can easily shatter the marital bond, thus keeping women living in a state of fear. I learned from his book, for example, that 90 per cent of Pakistani wives had been beaten or sexually abused for offenses like cooking a bad meal, and that television shows describe how best to beat a woman for different offenses. All this, he writes, is part of a culture “rife with feelings of humiliation, shame, impotence, emasculation and rage, combined with the rejection of earthly joy and pleasure.” The result is the glorification of a culture of death that leads many to willingly choose death as their final service to Allah.
Finally, Glazov addresses the strange alliance between the far Left today and the forces of radical Islam and terror. As Michael Moore explained, opponents of the occupation of Iraq “are the REVOLUTION, the Minutemen and their numbers will grow -- and they will win.” Noam Chomsky traveled to Lebanon in 2006 to cheer Hezbollah, telling its terror leader Nasrallah that George W. Bush was the world’s real top “terrorist,” and the U.S. one of the “leading terrorist states.” One George Washington University philosophy professor explained that 9/11 was justified because the terrorists “sought to defeat…our arrogance, our gluttonous way of life, our miserliness toward the poor…the soulless pop culture…and a domineering attitude that insists on having our own way.”
No one, of course, is more straightforward than the left-wing British MP George Galloway, who said that a Muslim-Leftist alliance “is vitally necessary…because the progressive movement around the world and the Muslims have the same enemies…the Zionist occupation, American occupation, British occupation of poor, mainly Muslim countries.” So the Left, despite its roots in secular Marxism, sees Muslim radicalism, as Glazov writes, “as a valiant form of ‘resistance’ against American imperialism and oppression.” And further uniting them is the shared belief in the new anti-Semitism and hatred of the Jew, masking itself as simply anti-Zionism and opposition to Israel’s “imperialist” policies.
Jamie Glazov’s book is a major contribution to understanding the world we live in today, and the nature of the leftist opposition to those who see the need to confront the terrorists and defend democracy. It is an essential book for our time and, as former CIA Director R. James Woolsey writes in his introduction, a “courageous and illuminating book.” Buy it for your liberal friends, who, more than anyone else, need to learn its lessons.
Posted on: Sunday, March 8, 2009 - 17:52
SOURCE: NYT (3-7-09)
IN the depth of winter of 1913, at the height of pre-Lenten carnival, the Vienna Bankers Club gave a Bankruptcy Ball at the opulent Blumensaal hall. Some ladies appeared as balance sheets, displaying voluptuous debits curving from slender credits. Others came as inflated collateral: faux enhancements amplified the bust or upholstered the posterior. As for the gentlemen, thin ones were costumed as deposits, fat ones as withdrawals. Sooner or later everybody repaired to the debtor’s prison — the restaurant of the Blumensaal.
Here mortgage certificates made pretty doilies for Sachertortes. Ornamented with the bailiff’s seal, eviction and foreclosure notices were colorful centerpieces, each topped by a bowl of whipped cream. If you wrote your waiter an I.O.U., he would pour you a flute of Champagne. Dancing and merriment continued until 5 a.m., when, suddenly, the orchestra leader stopped his men in the middle of the “Emperor Waltz.” He announced that since the musicians hadn’t been paid, there would be no more music, good morning, good luck, goodbye....
The very week of the Bankruptcy Ball, Archduke Franz Ferdinand, successor to the throne, met with Emperor Franz Josef to urge disengagement from Albania. This, he pleaded, would help facilitate a rational dialogue with Albania’s neighbor — militant, Austria-hating Serbia. The hater-in-chief there was Col. Dragutin Dimitrijevic, the head of Serbian military intelligence, who was organizing a secret terrorist squad to be loosed against the Habsburg realm.
At the same time, Vienna was incubating in its own streets some of the century’s prime virtuosos of violence. One of them was active close to the imperial palace, Schloss Schönbrunn, where the emperor had received his heir. An elegant building on Schönbrunner Schlossstrasse housed young Josef Stalin, dispatched by Lenin to explore the empire’s explosive nationalities situation. It was during Stalin’s weeks in Vienna that he initiated his lethal feud with young Leon Trotsky, who, a few streetcar stops away, was publishing the original Pravda. All this while on the other side of town young Adolf Hitler was seething obscurely, painting postcards for a living....
“Austria,” said Karl Kraus, who was Habsburg Austria’s H. L. Mencken, “is the laboratory for the apocalypse.” What would he say about America today?
Posted on: Sunday, March 8, 2009 - 16:20
SOURCE: Frontpagemag.com (3-6-09)
For the past year, there's been a concerted push within the U.S. government to ban frank talk about the nature of the Islamist enemy. It began with the Department of Homeland Security, then moved to the National Counter Terrorism Center and the departments of State and Defense. Already in May 2008, I heard an excellent analysis of the enemy by Deputy Assistant Secretary of Defense Thomas Mahnken in which he bizarrely never once mentioned Islam or jihad.
I've been wondering how this change in vocabulary actually occurs: is it a spontaneous mood shift, a group decision, or a directive from on high?
One answer has now surfaced in a directive from Jennifer Janin, head of the Urdu service at the Voice of America, instructing her staff as follows:
Islamic terrorists: DO NOT USE. Instead use simply: terrorist.
Islamic Fundamentalism/ Muslim Fundamentalists: AVOID.
Islamist: NOT NECESSARY.
Muslim Extremists: NOT NECESSARY. Extremist serves well.
Urdu, a dialect of Hindustani written in Arabic script, is found mainly in Pakistan and India and is spoken almost exclusively by Muslims; it is the mother tongue of about 70 million people. One can understand why euphemisms appeal insofar as VOA competes for market share with other news outlets and wishes not to insult or alienate Muslims. But VOA is not a commercial station with a bottom line and shareholders.
In her defense, Janin might argue that she is merely picking up on Barack Obama's emphasis on "respect" for Muslims, but there is no public indication that "respect" means pretending that Islam is not a central public issue facing Americans. Indeed, on occasion, Obama has been very clear that it is. A pungent example came one year ago in Philadelphia, on March 18, 2008, when, in the course of a major speech, Obama repudiated as "profoundly distorted" the "view that sees the conflicts in the Middle East as rooted primarily in the actions of stalwart allies like Israel, instead of emanating from the perverse and hateful ideologies of radical Islam."
"Perverse and hateful ideologies of radical Islam"? It does not get much stronger than that. One wonders how might Janin's new regimen translate this – probably as the"perverse and hateful ideologies of radical extremism," which is both inaccurate and unworthy of a credible news service.
(2) This latest directive fits a pattern of problems with U.S. government-funded programming to the Middle East. Two earlier cases come to mind: a 1991 scandal concerning the pro-Saddam tilt of VOA's reports from Baghdad and the 2007 resignation of Larry Register from Al-Hurra television for promoting anti-American and anti-Israeli views. Could someone instruct the Voice of America staff, once and for all, that its mission is not to flatter its audience nor to pursue ratings for their own sake but honestly to convey American mainstream views to the outside world?
(3) And while we're at it, could someone remind VOA employees that there's a lively debate in the United States about radical Islam; for a change, how about VOA covering this rather than smothering it under the Islamist line? In 2006, Meredith Buel of VOA robotically took a Council on American-Islamic Relations press release and rewrote it as a VOA news item; for the gory details, see my weblog entry, "Voice of America – CAIR's Shill." And the DHS document that started the whole euphemizing campaign, "Terminology to Define the Terrorists: Recommendations from American Muslims," relied on an unidentified "broad range of Muslim American community leaders and scholars" that has the hallmarks of CAIR & Co. Hey, VOA, repeat after me: "We work for the American people, we are not a subsidiary of CAIR."
Posted on: Friday, March 6, 2009 - 14:39
SOURCE: Japan Focus (2-12-09)
We have some good news, and we have some bad news.
First the good news. A little common sense is breaking out both within and about Japan. Within Japan, an election seems almost certain for the month of May. Comfy in his captain's chair, Prime Minister Aso is at present testing his 19% level of public support and -- with helpful advice from Mr. "structural reform" Takenaka Heizo -- is hoping to ride out the storm and serve out his legal tenure. Japanese constitutional law requires that an election be called by September, and both Aso and Takenaka apparently believe that by then there will be some kind of recovery of the economy both globally and within Japan. They also seem to be betting that Aso’s ¥2 trillion cash payout to residents (about ¥12,000 per taxpayer) will boost support, even though upwards of 70 percent of poll respondents apparently see it as wasteful (which it is).
Aso’s wishful thinking is not shared by the LDP elders. They are, therefore, pressing him to call an election shortly after the budget is passed by the Diet. They understand that, following a 35% drop in exports in December (year on year), a 2009 recovery is unlikely. And they show signs of grasping that the current crisis is almost certain to be far worse come September. The LDP clearly worry that an electoral loss is almost certain at present, and could turn into a perfect storm, particularly if Aso's "straight on till September" obstinacy holds sway. The February 8 Nikkei makes note of their pressure on Aso, adding that if he ignores it he risks being thrown overboard. So one bit of good news is that the LDP elite know they have no choice but to go to the people quickly. This could mean that Japan’s weak and unimaginative leadership, confronting an historic crisis with a paucity of ideas, may come to an end. If so, one can only hope that the LDP will not simply be followed by more old boys bereft of ideas and leadership skills.
Another note of good news is found in Martin Fackler’s February 6 New York Times article, proclaiming that "Japan's big-works stimulus is lesson." There has been a great deal of discussion about what the United States might learn from Japan's experience of dealing with the collapse of the bubble economy, but most of it has been less than illuminating. Beyond the obvious lesson that the financial and fiscal authorities ought to act much faster than the Japanese did (and note that the Japanese authorities' slowness is often exaggerated), a few pertinent lessons have made it into the public debate. The article notes one of the major lessons is that the size of the fiscal stimulus needs to be huge, and that the spending needs to be sustained "until recovery takes firm root." Moreover, another lesson is that spending should not be willy-nilly. Japan bought itself numerous failed resorts, empty airports, and silent concert halls, in addition to the so-called roads and bridges to nowhere. The huge fiscal stimulus packages of the 1990s not only helped raise the public debt to 180% of Japan's GDP (by far the largest of any big OECD country), but they also imposed onerous burdens on local governments for projects that provided little basis for new economic growth.
The New York Times article points out that spending on education and social services would have delivered far more "bang for the buck than infrastructure spending," but that is not all. Imagine the benefits Japan would be enjoying now had it invested heavily in renewable energy projects, where it sadly lags behind countries such as Germany as well as -- surprise, surprise -- the United States. Indeed, under the Koizumi regime, in 2004, the subsidy for solar power purchases was axed in a striking display of market fundamentalist overreaction to the state's misguided stimulus policies of the 1990s. Japan's focus on old-style infrastructure also bolstered political lobbying for more of the same. All those skewed incentives contributed to the severity of the current downturn, since the workforce and domestic economy were not retrained and focused towards sustainable growth areas. As the article notes, citing an emeritus professor of public finance at Shimane University, "in hindsight, Japan should have built public works that address the problems it faces today, like aging, energy and food sources... This obsession with building roads is a holdover from an earlier era."
So the good news is that we may be approaching the end of an era, both in the governance of Japan as well as in the notions of how the state can most effectively stimulate the economy. The latter is a particularly important lesson for the United States, which at present is embroiled in a debate over whether to use tax cuts to try and resuscitate the unsustainable economy that collapsed, or to emphasize smart spending to stimulate the economy and shift its industrial base. If the smart emphasis becomes common sense in Washington, it will become common sense in Tokyo as well, as it is already emerging among the Japanese Democrats. Even PM Aso has of late been paying lip service to the idea of a Japanese-style “green new deal.” The real deal in Washington, centering on support for renewables, the “smart grid,” and other modern infrastructure, will encourage the same in Tokyo. This could mean that Japan will have to learn from the Americans the right lessons from its own policy failures during the years of waste as well as its “structural reform” road to unsustainable export dependence.
But now the bad news. As the very astute blogger Tobias Harris notes in his February 8 comments entitled "twenty years of crisis," Japan might be on the edge of some desperate and potentially dangerous measures. After the collapse of the asset bubble and the descent into deflation, Paul Krugman argued that the Japanese central bank ought to "credibly promise to be irresponsible." Krugman's advice was that Japan needed to adopt an inflationary target in order to pull itself free of deflation and a liquidity trap (the latter obtains when nominal interest rates have already been cut to near zero to counter a recession, yet the financial system and actors in the real economy still do not respond with increased lending, investment, and consumption). Krugman's advice to aim at inflation, even recklessly so, has been deeply controversial within Japan. Advising that the present regime, not particularly renowned for fiscal and financial probity, be let loose seems a recipe for disaster to many observers.
Notwithstanding these concerns, Krugman's advice remains attractive to some domestic actors. As Harris notes, the failure of Japan to escape its "lost decade" (indeed, its "lost twenty years") has seen a group of 15 LDP Diet members form "the Diet members league to consider the issuance of interest-free government bonds and government money." They want the government to expand the money supply by a whopping ¥50 trillion, bypassing the Bank of Japan, so as to fund new rounds of economic stimulus. The group understands that there is a risk of hyperinflation, but they argue "we are facing hyper-deflation, so we need a policy to create hyper-inflation. We have to do something to undermine the central bank and government's credibility or else we won't be able to halt the yen's rise. So, while we know this is drastic medicine, we will do it."
As Harris notes, these people have clearly read Krugman. But have they read the more recent contributions from Krugman? In his February 6 opinion column for the New York Times, entitled "On the Edge," Krugman argues for smart public spending. He goes on the attack against the absurd proposals for tax cuts coming from Senate and House Republicans in Washington. The defeated presidential candidate, John McCain, went so far as to propose the fiscal stimulus be scaled back to $420 billion and be focused on tax cuts. This is in contrast to the advice of his unofficial economic adviser during the campaign, Mark Zandi of Moody's Economy.com, who of late has argued before congressional committees for a stimulus in excess of $1 trillion with a focus on such items as food stamps, aid to the states, and other spending. Like Zandi, Krugman warns that "even if a major stimulus bill does pass the Senate, there is a real risk that important parts of the original plan, especially aid to state and local governments, will have been emasculated." Krugman derides the Republicans' focus on tax cuts as "hackneyed political theater" and warns that "Washington has lost any sense of what's at stake -- the reality that we may well be falling into an economic abyss, and if we do it will be very hard to get out again."
In short, Krugman is arguing for smart spending, and nowhere does he urge the financial authorities to, as it were, whip up inflation now. Let's cross our fingers, and hope for a whole lot of learning, fast in Tokyo and Washington.
Posted on: Friday, March 6, 2009 - 01:10
SOURCE: Japan Focus (2-8-09)
"Japan is an economy that is almost certainly producing well below its productive capacity - that is, the immediate problem facing Japan is one of demand, not supply. And it gives every appearance of being in a liquidity trap - that is, conventional monetary policy appears to have been pushed to its limits, yet the economy remains depressed. What can be done?"
So wrote Paul Krugman in his 1998 analysis of Japan's prolonged economic crisis, in which he argued that to escape its liquidity trap, it was necessary for "the central bank to credibly promise to be irresponsible - to make a persuasive case that it will permit inflation to occur, thereby producing the negative real interest rates the economy needs."
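The logic of that prescription is easier to see with the standard Fisher relation, added here purely as an illustrative gloss rather than anything drawn from Krugman's own text: the real interest rate is roughly the nominal rate minus expected inflation, so once the nominal rate is pinned near zero, only higher expected inflation can push the real rate below zero.

\[ r \approx i - \pi^{e}, \qquad i \approx 0 \;\Rightarrow\; r \approx -\pi^{e} < 0 \ \text{whenever}\ \pi^{e} > 0. \]

That is the sense in which a credible promise of future inflation can deliver the negative real rates a liquidity-trapped economy needs.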
Four years later, then-Fed governor Ben Bernanke, in his noteworthy speech on deflation, echoed Krugman, arguing "We conclude that, under a paper-money system, a determined government can always generate higher spending and hence positive inflation" — but then concluded that Japan's prolonged crisis was not the result of ineffective monetary policy but reluctance on the part of political actors to implement structural reforms, reforms that, Krugman argued, may be necessary but would do little to "induce people to spend more."
A decade after Krugman, Japanese authorities are once again stuck trying to stimulate aggregate demand.
Economist Ikeda Nobuo, in considering Japan's sluggish aggregate demand, concludes, like Bernanke, that the high liquidity preference of Japanese households — nearly half of Japanese household assets are held in cash or low-risk, low-return bank accounts — is a response to inefficient Japanese companies. Pessimistic that the Japanese government is capable of stimulating Japanese demand in the absence of foreign demand for Japanese goods, Ikeda concludes that the lost decade never ended: Japan is in the midst of a "lost twenty years."
A group of fifteen LDP lawmakers, however, has decided to embrace Krugman's solution, calling for the government to run the printing presses, expanding the money supply by 50,000 billion yen independently of the Bank of Japan in order to finance further economic stimulus. The "Diet Members League to consider the issuance of interest-free government bonds and government money" will hold its first meeting on Tuesday, with upper house member Tamura Kotaro chairing. Former finance ministry official Takahashi Yoichi will address the inaugural meeting.
The group's motto is essentially "desperate times call for desperate measures." Acknowledging that running the presses has positive and negative consequences, the group believes that the economic situation is sufficiently dire to merit the risk of hyperinflation. Quoted in an FT article, Mr. Tamura clearly has read his Krugman: "We are facing hyper-deflation, so we need a policy to create hyper-inflation. We have to do something to undermine the central bank and government’s credibility or else we won’t be able to halt the yen’s rise. So, while we know this is drastic medicine, we will do it."
The new study group has drawn the opposition of senior LDP and cabinet officials. Yosano Kaoru, minister without portfolio for economic and fiscal policy, suggested that the government should issue more bonds instead of printing currency. Shirakawa Masaaki, governor of the BOJ, insisted that the policy would do precisely what its proponents intend, namely undermine the credibility of his bank and the health of the currency. The heads of the LDP's factions were equally critical of the proposal, with Tsushima Yuji, the eponymous head of the Tsushima faction, likening the proposal to the Enten Ponzi scheme.
In other words, this is yet another policy upon which the LDP is divided and unsure of how to proceed, yet another sign of the governing party's flailing about in hope of finding some way to save itself (and Japan).
Not being an economist, I cannot say whether this group's proposal is appropriate. After years of weak domestic aggregate demand, it may be that drastic inflation is the only way to make the Japanese spend at home — or demand less liquid assets with higher returns — no matter how politically risky it is for the LDP. Pensioners, already angry at the government, will presumably be no less angry as they watch inflation erode their fixed incomes.
But however appropriate the inflationary proposal, it may be beyond the power of the Aso government to implement. Whatever legitimacy the LDP-Komeito government had is now in tatters. The prime minister's latest misstep is to call for the revision of the postal privatization scheme, the very basis of the government's parliamentary supermajority. Mr. Aso questioned whether it is appropriate to divide the postal system into four companies and stated that now is the time to revise the privatization scheme. He blithely stated that this position is wholly unrelated to the 2005 election that gave the government its mandate. Public opinion may have changed since 2005, but to backtrack now makes a mockery of democratic legitimation. If the government wants to retreat from a policy that was critical to returning the coalition to power, it should go to the people and ask for approval to change course. (Nakagawa Hidenao made this argument at his blog, as did DPJ member Nagashima Akihisa.)
Naturally LDP proponents of postal reform have been quick to criticize the prime minister for his naked attempt to find some issue that will rescue his sinking government. (It bears noting that Mr. Aso has spoken of preserving the quality of postal services, a major concern of the public when it comes to privatization.) Koizumi confidant Takebe Tsutomu was perhaps the most succinct in his criticism: "What nonsense! I wish he would be more discreet in his speech."
In short, the prime minister is gradually losing whatever ability he has to rally his party and the public to an agenda. He is incapable of setting priorities or taking decisive action. The halls of power echo with rumors of plots to unseat him.
It is unlikely that this shiftless prime minister is capable of deciding on so risky and decisive a policy as proposed by the new league.
This prime minister, his party, and his government are bereft of authority and legitimacy — and they appear determined to drag Japan into the abyss with them.
As MTC so eloquently observes, one hopes Secretary Clinton will take heed of the stench of decay when she visits Japan later this month.
Posted on: Friday, March 6, 2009 - 01:01
SOURCE: WSJ (3-4-09)
The central questions these days are how severe the U.S. economic downturn will be and how long it will last.
The most serious concern is that the downturn will become something worse than the largest recession of the post-World War II period -- 1982, when real per capita GDP fell by 3% and the unemployment rate peaked at nearly 11%. Could we even experience a depression (defined as a decline in per-person GDP or consumption by 10% or more)?
The U.S. macroeconomy has been so tame for so long that it's impossible to get an accurate reading about depression odds just from the U.S. data. My approach uses long-term data for many countries and takes into account the historical linkages between depressions and stock-market crashes. (The research is described in "Stock-Market Crashes and Depressions," a working paper Jose Ursua and I wrote for the National Bureau of Economic Research last month.)
The bottom line is that there is ample reason to worry about slipping into a depression. There is a roughly one-in-five chance that U.S. GDP and consumption will fall by 10% or more, something not seen since the early 1930s.
Our research classifies just two such U.S. events since 1870: the Great Depression from 1929 to 1933, with a macroeconomic decline of 25%, and the post-World War I years from 1917 to 1921, with a fall of 16%. We also assembled long-term data on GDP, consumption and stock-market returns for 33 other countries, sometimes going back as far as 1870. Our conjecture was that depressions would be closely connected to stock-market crashes (at least in the sense that a crash would signal a substantially increased chance of a depression).
This idea seems to conflict with the oft-repeated 1966 quip from Paul Samuelson that "The stock market has predicted nine of the last five recessions." The line is clever, but it unfairly denigrates the predictive power of stock markets. In fact, knowing that a stock-market crash has occurred sharply raises the odds of depression. And, in reverse, knowing that there is no stock-market crash makes a depression less likely.
Our data reveal 251 stock-market crashes (defined as cumulative real returns of -25% or less) and 97 depressions. In 71 cases, the timing of a market crash matched up with a depression. For example, the U.S. had a stock-market crash of 55% in 1929-31 and a macroeconomic decline of 25% for 1929-33. Likewise, Finland had a stock-market crash of 47% for 1989-91 and a macroeconomic fall of 13% for 1989-93. We found that 30 of the cases with both a crash and a depression were also associated with wars. In fact, World War II is the worst macroeconomic event of the period, with strong U.S. wartime economic growth as an outlier....
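To make the predictive-power claim concrete, a rough back-of-the-envelope reading of those raw counts (illustrative arithmetic only, not the formal estimates in the paper, which also condition on the depth of the crash and treat the war-related cases separately) runs as follows:

\Pr(\text{depression} \mid \text{crash}) \approx \frac{71}{251} \approx 0.28,
\qquad
\Pr(\text{crash} \mid \text{depression}) \approx \frac{71}{97} \approx 0.73.

In other words, most depressions in the sample were accompanied by a crash, and a crash raises the frequency of a subsequent depression well above its unconditional rate; the roughly one-in-five figure for the U.S. cited above comes from the paper's own estimates rather than from these raw ratios.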
Posted on: Wednesday, March 4, 2009 - 23:42
SOURCE: Daily Mail (UK) (3-4-09)
The decision to award an honorary knighthood to Senator Edward Kennedy shows Britain at its most masochistic, New Labour at its most cynical and - if he accepts it - Kennedy at his most hypocritical.
To bestow such a distinction on a man who has spent almost all his adult life profoundly opposed to the United Kingdom's best interests also makes a mockery of the honours system.
Ever since Patrick Kennedy (Ted's Irish great-grandfather) set foot on Noddle Island, Boston, on April 21, 1849, the family has nursed a deep resentment against the country that they blame for forcing them out of County Wexford during the Great Potato Famine.
Ted Kennedy's father, Joe, who had made his money from bootlegging in the Prohibition era, served as American ambassador to London from 1938 to 1940. As the U.S. envoy, he was an unrelenting appeaser and as unhelpful to Britain as it was possible to be in those perilous days, believing that Adolf Hitler was going to win the war.
Derided as a coward and known as 'Jittery Joe' for panicking when bombs were falling, Kennedy saw his term as ambassador end abruptly, along with his political ambitions, during the Blitz in November 1940, when he remarked: 'Democracy is finished in England.' Within a month he was forced to resign.
Over all matters concerning Ireland, the Kennedys have taken a pro-Nationalist line that has been deeply antagonistic to the Union of Great Britain and Northern Ireland. That is why it is absurd for Gordon Brown to make this award, in the words of its official citation, 'for services to U.S.-UK relations and to Northern Ireland'...
Posted on: Wednesday, March 4, 2009 - 21:10
SOURCE: Frontpagemag.com (3-3-09)
Was I the only one rubbing my eyes in disbelief yesterday, as the Egyptian government hosted an "International Conference in Support of the Palestinian Economy for the Reconstruction of Gaza"?
Husni Mubarak of Egypt addresses the Gaza donors' conference.
The larger donations included a Gulf Cooperation Council contribution of $1.65 billion over five years and a U.S. government pledge of $900 million from the American taxpayer (of which $300 million will go for Gaza rebuilding).
Husni Mubarak of Egypt, Nicolas Sarkozy of France, Silvio Berlusconi of Italy, Ban Ki-moon of the United Nations, Amr Moussa of the Arab League, and Mahmoud Abbas of the Palestinian Authority gave speeches.
Why my disbelief at this spectacle? I wonder whether those eminentoes and worthies really believe that warfare in Gaza is a thing of the past and that the time for reconstruction is nigh.
They must not read dispatches from southern Israel, which report the daily warfare that continues there. Take a representative news item from Yedi'ot Aharonot, dated February 28, "Experts: Grads in Ashkelon were advanced":
the two Grad rockets that landed in Ashkelon Saturday morning[, Feb. 28,] were new and improved models, capable of greater destruction than those usually fired from Gaza. One of the rockets hit a school in the southern city, and succeeded in penetrating the fortification used to protect it from projectiles. … The Grad rockets that hit Ashkelon were two of only five or six locally manufactured 170 mm rockets ever fired at Israel, experts say. The rarely used rockets have a range of 14 km (8.6 miles) and are capable of massive damage, evident from the destruction witnesses described on the scene of Saturday's attack.
Sderot house hit by a rocket. (photo by Meital Ohayon)
In an official protest to the United Nations, Israel's ambassador, Gabriela Shalev, noted that "there have been nearly 100 rocket and mortar attacks from the Gaza Strip" since the ceasefire of January 18, or more than two per day. These attacks have been increasing in number, with 12 rockets fired at Sderot on March 1 alone.
Responding to these attacks, the Israeli cabinet resolved on March 1 that "should the firing from the Gaza Strip continue, it would be met by a painful, sharp, strong and uncompromising response by the security forces." Prime Minister-designate Binyamin Netanyahu echoed this bellicosity, reportedly telling a European leader that he would not sacrifice Israel's security "for a smile."
(Saudi foreign minister Saud Al-Faisal, in unexpected agreement, noted that rebuilding Gaza would be "difficult and fool-hardy, so long as peace and security do not prevail" there.)
What the hell are the donor countries doing, getting in the middle of an on-going war with their high-profile supposed reconstruction effort? My best guess: this permits them subtly to signal Jerusalem that it better not attack Gaza again, because doing so will confront it with a lot of very angry donor governments – including, of course, the Obama administration.
Adding to the surreal quality is a blithe disregard for Israel's security needs. Consider the attitude of Douglas Alexander, international development secretary for Britain's Labour government, who pledged £30 million of his taxpayers' funds to rebuild houses, schools, and hospitals in Gaza. "There is a desperate need for tough restrictions on the supply of goods to be relaxed," he said, demanding next that "Israel must do the right thing and allow much-needed goods to get through to those men, women and children who continue to suffer."
That's very humanitarian of Mr. Alexander, but he willfully ignored Israeli expectations that Hamas will confiscate steel, concrete, and other imported construction materials to build more tunnels, bunkers, and rockets. After all, Hamas appropriated prior deliveries intended for civilians, and so blatantly that even the usually docile United Nations Relief and Works Agency protested.
Husni Mubarak might warn Hamas not to treat the donors' pledges as a "conquest of war," but it will assuredly do precisely that. U.S. Rep. Mark Kirk (Republican of Illinois) got it right: "To route $900 million to this area, and let's say Hamas was only able to steal 10 percent of that, we would still become Hamas' second-largest funder after Iran."
So, under the cheery banner of building, in Clinton's words, "a comprehensive peace between Israel and its Arab neighbors," donor states are not only defying Israel to protect itself from rocket fire but also funneling matériel to Hamas.
Is this ignorance or mendacity? I suspect the latter; no one is that dumb.
Posted on: Tuesday, March 3, 2009 - 20:20
SOURCE: Philadelphia Inquirer (3-3-09)
Hey, kids, study hard! The job you save may be your own.
So said President Obama during his recent rousing speech to a joint session of Congress, noting that more than three-quarters of our fastest-growing occupations require more than a high school degree.
Obama, quite admirably, is putting his money where his mouth is. His proposed budget includes $46.7 billion for education - a 13 percent hike from last year - mostly devoted to expanding access to college.
As an educator myself, I should be elated: Education is suddenly "hot" again. But Obama's speech left me cold, too, by casting education in purely vocational terms - as a route to better-paying work. He omitted its other important role: to unite us as citizens. Especially when times get tough, we need schools to bring us together.
That's why Horace Mann called them "common" schools, after all. Pleading with antebellum Americans to establish state systems of education, Mann and other reformers stressed social integration.
"I want to see the children of the rich and poor sit down side by side on equal terms, as members of one family - a great brotherhood," a Mann ally wrote in the 1830s. Unless the young nation developed closer "bonds of sympathy," he added, it would be "rent asunder by distrust, envy, and all hateful passions."
Most of all, schools needed to teach Americans how to govern themselves, reformers argued. Regardless of future occupation, everyone was going to become a citizen. So everyone should learn the skills and habits of democratic life: reason, tolerance, and mutual respect.
To be sure, schools could prepare people for work and citizenship. But, as John Dewey reminded us, writing 70 years after Mann, there is a tension between the two goals. One is economic, oriented toward personal mobility and advancement; the other is social, highlighting our shared duties and responsibilities.
And when we stress education for occupation, ironically, we also risk holding people back. If the job of school were simply to ready people for jobs, Dewey warned, so-called manual workers wouldn't require much schooling at all. By putting them on vocational tracks, we put a virtual ceiling on how far they could go.
"The dominant vocation of all human beings at all times is living - intellectual and moral growth," Dewey wrote in 1916. "To predetermine some future occupation for which education is to be a strict preparation is to injure the possibilities of present development."
But that's precisely what happened. Just as Dewey predicted, our schools became enormous sorting machines, separating the so-called white-collar workers from the blue-collar ones. Are you going to become a doctor or lawyer? Then the college-prep track is for you. A factory laborer? Take a few vocational classes or simply drop out of school - and start laboring.
But today, as Obama correctly emphasized, it's much harder to get a job with decent pay without formal education. "In a global economy where the most valuable skill you can sell is your knowledge, a good education is no longer just a pathway to opportunity; it is a prerequisite," Obama said.
He went on to urge every American to commit to at least one year of higher education or career training. In the future, it would seem, we all will wear white collars to work.
But we won't, of course. And that's the elephant in the classroom. America will always need someone to pick up the trash, bus the tables, and pave the roads. What will become of these people in the brave new world of knowledge? And, most of all, what will our schools do for them?
If we think of education strictly in vocational terms, the answer is pretty obvious: nothing. You don't need formal schooling to become a busboy. But you do need it to become a citizen.
So, even as our schools prepare more and more people for the knowledge economy, let's also make sure that they teach us how to think and care about each other. Someone is going to lose in the great vocational sweepstakes, that's for sure. The only question is what the rest of us will do about it.
Posted on: Tuesday, March 3, 2009 - 20:03
SOURCE: NYT (3-1-09)
... Most Americans think of the filibuster (if they think of it at all) through the lens of “Mr. Smith Goes to Washington” — a minority in the Senate deeply disagrees with a measure, takes to the floor and argues passionately round the clock to prevent it from passing. These filibusters are relatively rare because they take so much time and effort.
To reduce deadlock, in 1917 the Senate passed Rule 22, which made it possible for a supermajority — two-thirds of the chamber — to end a filibuster by voting for cloture. The two-thirds majority was later changed to three-fifths, or 60 of the current 100 senators.
In recent years, however, the Senate has become so averse to the filibuster that if fewer than 60 senators support a controversial measure, it usually won’t come up for discussion at all. The mere threat of a filibuster has become a filibuster, a phantom filibuster. Instead of needing a sufficient number of dedicated senators to hold the floor for many days and nights, all it takes to block movement on a bill is for 41 senators to raise their little fingers in opposition.
Historically, the filibuster was justified as a last-ditch defense of minority rights. Under this principle, an intense opposition should be able to protect itself from the tyranny of the majority. But today, the minority does not have to be intense at all. Its members have only to disagree with a measure to kill it. Essentially, the minority has veto power.
The phantom filibuster is clearly unconstitutional. The founders required a supermajority in only five situations: veto overrides and votes on treaties, constitutional amendments, convictions of impeached officials and expulsions of members of the House or Senate. The Constitution certainly does not call for a supermajority before debate on any controversial measure can begin.
And fixing the problem would not require any change in Senate rules. The phantom filibuster could be done away with overnight by the Senate majority leader, Harry Reid. All he needs to do is call the minority’s bluff by bringing a challenged measure to the floor and letting the debate begin....
Posted on: Monday, March 2, 2009 - 22:31