Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Guardian (10-15-07)
I actually don't begrudge Coulter. The woman is a genius. She understands just how vapid, brainless and driven by right-wing bias and cliché is the American political discourse, and she exploits this opening for everything it's worth. Nothing about Coulter is necessary, of course; there is already plenty of hate being spewed and weak people being mocked, marginalized and dismissed as inferior to white Christian conservative Americans. But the fact that she manages to create these mini-tempests in proverbial teapots every time she releases a new book - most of which are not even really books - and the mainstream media continues to play its same Pavlovian role is usefully instructive of just how craven and conservative our culture has become.
I know it's Abe Rosenthalish/Daniel Patrick Moynihanian to quote myself. But I've been making the same point about Coulter for more than half a decade. In 2002, I wrote in the Nation:
"It's degrading to have to write about Coulter again. As a pundit, she is about on a par with Charles Manson, better suited to a lifelong stay in the Connecticut Home for the Criminally Insane than for the host's seat on Crossfire. Her books are filled with lies, slander and phony footnotes that are themselves lies and slanders. Her very existence as a public figure is an insult to our collective intelligence ... . I should really be writing about the campaign by neocon chickenhawks to intimidate Howell Raines and the New York Times on Iraq. But fortunately, John Judis and Nick Confessore have taken responsibility for that, leaving me to the less ominous but more baffling phenomenon of the bestselling Barbie-doll terrorist-apologist, who continues to be celebrated by the very media she terms 'retarded' and guilty of 'mass murder' while calling for their mass extinction by the likes of her ideological comrade Timothy McVeigh."
I try to ignore her. But here we are again. This time she's calling for the conversion of the Jews. As with whether John Edwards really is "a faggot," or whether we should be killing Arab children (or New York Times journalists) en masse, it's too stupid a discussion in which even to engage for anyone with the slightest bit of self-respect. But it does play with her base, which is significant in and of itself. And personally, I'm looking forward to seeing how well she survives this one. After all, didn't anybody tell Annie we control the media?
This might just be the end. Boruch Hashem.
Posted on: Tuesday, October 16, 2007 - 17:27
SOURCE: FrontpageMag.com (10-16-07)
"Would You Buy a Used Hawk From This Man?" runs the title of an Oct. 15 Newsweek smear of presidential candidate Rudy Giuliani, suggesting that the mayor's advisors, "some of the Bush era's most assertive neoconservatives," represent George W. Bush retreads. The article even quotes a foreign policy analyst accusing Giuliani of "out-Bushing Bush."
Ever in lock-step, Time magazine's blog likewise asserted last week that Giuliani's "message seems to be that Bush's policies for the region have worked pretty well, so let's have more of the same."
How odd. Actually, the opposite should be apparent about Newsweek's six featured advisors – Norman Podhoretz, Martin Kramer, Peter Berkowitz, Nile Gardiner, Robert Kasten, and myself. First, we collectively had many disagreements with Bush administration policies and, second, we lacked impact on them. In other words, the real story is Giuliani's fresh start in foreign policy, joined by a cast unconnected to the current president's successes and failures.
Consider my own divergence from Bush administration policy. My writings and spoken statements over the past seven years have criticized the handling of Iraq, the war on terror, democratization, and (especially) the Arab-Israeli conflict.
Iraq: Iraq policy is too ambitious, I time and again asserted, starting in April 2003. Under the slogan "Stay the course – but change the course," I developed a third position between the administration and its critics, one that called for getting foreigners quickly out of the business of running Iraq and coalition troops out of the cities and into the Iraqi countryside and borders. I ridiculed the massive U.S. embassy in Baghdad. I urged that elections be delayed and that authority be turned over to a democratically-minded Iraqi strongman.
War on terror: I inveighed against the euphemistic and inaccurate term, "war on terror," arguing for the need to (1) identify the enemy correctly and (2) develop a clear set of goals to defeat it. I praised the improvements that culminated in Bush's statement of August 2006 that Americans are "at war with Islamic fascists," but then I rued his more recent retreat from naming the enemy.
Democratization: When the president first announced the goal of increasing political participation in the Middle East, I applauded, even as I warned against the overly abrupt replacement of tyranny with democracy, urging that the process be done slowly and cautiously. Noting that the actual implementation empowered Islamists, I assigned it a failing grade.
Arab-Israeli conflict: I have objected to nearly every aspect of the current administration's policy in this theater, condemning Bush's landmark June 2002 speech for rewarding terrorism, rejecting his embrace of a Palestinian state, and warning after his reelection in 2004 of "potentially the most severe crisis ever in U.S.-Israel relations." I have predicted the forthcoming Annapolis round of negotiations will fail and worry about the damage they will inflict.
Despite these differences, I twice voted enthusiastically for George W. Bush, am proud to have been his nominee in 2003, and predict historians will rate his presidency a success. But presenting Rudy Giuliani and his advisors as Bush administration clones is nonsense. News magazines might consider doing some research before spouting off.
And finally, some thoughts about the "neoconservative" label, bandied about by Newsweek and left-wing critics: As Irving Kristol, sometimes called the godfather of neoconservatism, points out, the term has evolved since its first appearance in the early 1970s and today is characterized by three features:
***In economics, a low tax, risk-friendly approach with the goal of achieving growth;
***In social issues, a favorable attitude toward strong, growing, and moral state power; and
***In foreign policy, a patriotic, anti world-government approach that comes to the aid of fellow democratic states.
I somewhat fit this triad, agreeing with the first and third prongs but not the second, where I lean libertarian. This ambiguity led me in 2005 to observe that I could never quite figure out whether or not I am a neoconservative – while noting that others long ago had apparently decided the matter for me. "Journalists use 'neoconservative' to describe me, editors include my writings in a neoconservative anthology, critics plumb my views for insight into neoconservative thinking, and event hosts invite me to represent the neoconservative viewpoint."
That said, if the term currently requires having supported George W. Bush's Middle East-related policies, then I am not a neoconservative.
Posted on: Tuesday, October 16, 2007 - 17:13
SOURCE: Britannica Blog (10-15-07)
Why is new Republican candidate Fred Thompson doing so well in the polls despite a remarkably subpar performance on the campaign trail? The simple reason is that he is a vote for “none of the above.”
Republicans usually have had an anointed candidate to nominate for president. In recent years, the two Bushes, Bob Dole, and Ronald Reagan were the clear favorites from start to finish. This year, however, there is no heir apparent to the Republican throne.
Beyond Thompson, the three other leading Republican candidates in the field all have fatal flaws that will likely sink their presidential hopes. Rudy Giuliani is too liberal on social issues and his personal life is a mess. Many Christian conservatives will balk at voting for Mitt Romney because of his religion. His record as governor of Massachusetts is also suspect for conservatives despite his desperate lurch to the Right. Conservatives have never liked John McCain and he is but a shadow of his former self on the campaign trail.
Thus Republican conservatives, who dominate primary voting, are desperately seeking a candidate. Right now, Thompson seems to be their best choice. But he too is a flawed candidate who likely cannot be nominated. He lacks energy, focus, and punch. He looks tired and distracted. His message is nothing more than a string of platitudes. It’s not clear that he has the fire in the belly to be president.
For the first time in more than fifty years, there is a real chance that the Republicans will not have selected a nominee at convention time – which comes late this time, on September 1. If so, party leaders will be looking for an alternative to the flawed crew of declared candidates.
The Republicans’ obvious choice is General David H. Petraeus. With the Keys to the White House pointing to a Republican defeat, it is in the interest of the GOP to act unconventionally and shake up the presidential race. General Petraeus is their best bet to achieve this goal.
The administration has built up Petraeus to almost godlike status. He is almost universally known, well spoken, and has no political record for opponents to attack. MoveOn.org hates him, which is a big plus for most Republicans. Bush operatives have groomed him as a potential dark horse nominee, knowing full well the problems with the current field of candidates. Even Democrats in Congress fawned over the general when he testified on the surge in Iraq last month.
If this scenario sounds improbable, think again. Until recently it was commonplace for the Republican Party to nominate military men. They nominated General Dwight Eisenhower in 1952 and Theodore Roosevelt, the hero of the Spanish-American War, in 1904. William Howard Taft, the 1908 and 1912 nominee, was not a military commander but had served as Roosevelt’s Secretary of War. In the nineteenth century the Republicans nominated generals Ulysses S. Grant, Rutherford B. Hayes, Benjamin Harrison, James Garfield, and William McKinley. Grant, Eisenhower, and Taft had not run for office before.
In 1948, the Republican Party flirted with the nomination of General Douglas MacArthur. But the imperious war hero lacked the common touch. General Petraeus, however, does not suffer from MacArthur’s megalomania.
The Republicans are looking for a savior in 2008. They haven’t found him yet. Don’t be surprised if they return to party tradition and nominate General Petraeus.
Posted on: Tuesday, October 16, 2007 - 02:14
SOURCE: Christian Science Monitor (10-15-07)
"We'll have the Indians on the warpath all the time, eager for scalps to dangle at their belts."
That's what a Cleveland sportswriter wrote in 1915, celebrating the new name of the city's baseball team. Previously called the "Naps," in honor of Hall of Famer Nap Lajoie, the team had recently traded Lajoie. So it needed a new name, and "Indians" was born.
So, alas, was Chief Wahoo.
Chief Wahoo is the Indians' mascot, a grotesque caricature grinning idiotically through enormous buck teeth. You can see him during this week's American League Championship Series between the Indians and the Boston Red Sox. He's a reminder of the days when whites regarded native Americans as savages on the warpath, with scalps dangling from their belts. And it's time for him to go.
How can we profess equality of all Americans, then mock the first Americans in our sports teams? Remember, Cleveland isn't the only culprit here. Take a look at the mascot for the Washington Redskins, and you'll see what I mean.
One defense comes from the Cleveland Indians' official history, which claims that the team was renamed in 1915 to memorialize Louis Francis Sockalexis, the first American Indian in the major leagues. So the team's name – and, by extension, its mascot – serves to honor native Americans, not to demean them.
There's one problem with the story: It's not true. Mr. Sockalexis had died just two years before, in 1913, but his name didn't figure in talks on renaming the team. What's more, it's irrelevant. Suppose the team had indeed been named to remember Sockalexis, who played for the old Cleveland Spiders from 1897 to 1899. That still wouldn't justify the use of Chief Wahoo, who bears little resemblance to Louis Sockalexis – or to anyone, really, aside from a shared racist image of the savage Indian.
But in one way, Sockalexis remains deeply relevant. Like African-American trailblazer Jackie Robinson, Sockalexis faced brutal slurs from opposing players. In the stands, meanwhile, fans would perform "war whoops" and dances when Sockalexis played. When alcoholism ended his brief major league career, sportswriters reported that he had succumbed to an inherent "Indian weakness."
By holding on to Chief Wahoo, then, we don't do any honor to Louis Sockalexis. Instead, we perpetuate precisely the hatred and prejudice that he encountered.
To be fair, Cleveland team officials have been working to replace Chief Wahoo with a cursive, feather-shaped "I." And the mascot in the stands is no longer a clownish "Indian," but a pink-and-yellow creature named Slider. But Chief Wahoo remains on the team's hats, which is probably the most prominent place of all.
And he's not the only Indian sports-team caricature, of course.
In 2001, the US Commission on Civil Rights called for an end to the use of native American images and team names by non-Indian schools. Four years later, the NCAA barred 18 "hostile and abusive" mascots from postseason tournaments. But more than 100 colleges and universities still maintain a native American mascot.
At the professional level, of course, we have the Atlanta Braves (who can forget the "Tomahawk Chop"?) as well as the Washington Redskins. By any measure, the headdress-clad "Redskin" is every bit as offensive as the cartoonish Chief Wahoo.
And here someone might respond that Chief Wahoo is, well, just a cartoon: it doesn't reflect the sentiments of anyone today, and only an overly sensitive, politically correct liberal could object to something so innocuous.
In 1947, the Washington Post rejected black demands to remove "Little Black Sambo" from elementary school textbooks. Remember Sambo? To African-Americans, this simple-minded and thick-lipped figure embodied the worst elements of antiblack caricature. But to the Post, he was just a storybook character – and a harmless one, at that. "To insist that Negroes be given equal rights with other citizens is one thing," the Post editorialized. "To insist that their particular sensibilities entitle them to exercise a kind of censorship is quite another."
But the issue involved more than just black attitudes, as one local African-American leader replied. Instead, he argued, it affected everyone. If white children absorbed the Sambo story of black-as-buffoon, they would never regard African-Americans as truly equal.
So when you watch the Cleveland Indians on television this week, watch your kids as well. Ask yourself what the image of Chief Wahoo teaches them about Native Americans. And ask yourself if you can live with the answer.
Posted on: Monday, October 15, 2007 - 20:14
SOURCE: Harvard University website (10-12-07)
... A number of inaugural veterans – both orators and auditors – have proffered advice, including unanimous agreement that my talk must be shorter than Charles William Eliot’s – which ran to about an hour and a half. Often inaugural addresses contain lists – of a new president’s specific goals or programs. But lists seem too constraining when I think of what today should mean; they seem a way of limiting rather than unleashing our most ambitious imaginings, our profoundest commitments.
If this is a day to transcend the ordinary, if it is a rare moment when we gather not just as Harvard, but with a wider world of scholarship, teaching and learning, it is a time to reflect on what Harvard and institutions like it mean in this first decade of the 21st century.
Yet as I considered how to talk about higher education and the future, I found myself – historian that I am – returning to the past and, in particular, to a document I encountered in my first year of graduate school. My cousin Jack Gilpin, Class of ’73, read a section of it at Memorial Church this morning. As John Winthrop sat on board the ship Arbella in 1630, sailing across the Atlantic to found the Massachusetts Bay Colony, he wrote a charge to his band of settlers, a charter for their new beginnings. He offered what he considered “a compass to steer by” – a “model,” but not a set of explicit orders. Winthrop instead sought to focus his followers on the broader significance of their project, on the spirit in which they should undertake their shared work. I aim to offer such a “compass” today, one for us at Harvard, and one that I hope will have meaning for all of us who care about higher education, for we are inevitably, as Winthrop urged his settlers to be, “knitt together in this work as one.”
American higher education in 2007 is in a state of paradox – at once celebrated and assailed. A host of popular writings from the 1980s on have charged universities with teaching too little, costing too much, coddling professors and neglecting students, embracing an “illiberalism” that has silenced open debate. A PBS special in 2005 described a “sea of mediocrity” that “places this nation at risk.” A report issued by the U.S. Department of Education last year warned of the “obsolescence” of higher education as we know it and called for federal intervention in service of the national interest.
Yet universities like Harvard and its peers, those represented by so many of you here today, are beloved by alumni who donate billions of dollars each year, are sought after by students who struggle to win admission, and, in fact, are deeply revered by the American public. In a recent survey, 93 percent of respondents considered our universities “one of [the country’s] most valuable resources.” Abroad, our universities are admired and emulated; they are arguably the American institution most respected by the rest of the world.
How do we explain these contradictions? Is American higher education in crisis, and if so, what kind? What should we as its leaders and representatives be doing about it? This ambivalence, this curious love-hate relationship, derives in no small part from our almost unbounded expectations of our colleges and universities, expectations that are at once intensely felt and poorly understood....
Posted on: Monday, October 15, 2007 - 15:10
SOURCE: Blog (10-13-07)
This represents people from the far left, the blatantly anti-Israel, the Republican right, and the left of center.
Why then all the attention to Abe Foxman's position? I have been inundated with emails critical of Abe Foxman and the ADL. So many of them are so overtly antisemitic that I have not posted them.
Various towns in Massachusetts want to drop the ADL's anti-prejudice programs because of its stand on the Armenian genocide. Are they also going to condemn Jimmy Carter when he comes to town? Why does he get a free pass?
By the way, Carter also refuses to call what is happening in Darfur a genocide. But Israel practices apartheid? What am I missing here?
And why all the talk about the Jewish Lobby controlling foreign policy when 7 of the 8 Jewish members of the House Foreign Affairs Committee voted FOR the resolution? Did these Representatives not get the message? Were they missing the day the Lobby handed out its marching orders?
Something is out of whack here.... seriously so.
Posted on: Sunday, October 14, 2007 - 21:15
SOURCE: Montreal Gazette (10-13-07)
Former U.S. vice-president Al Gore's Nobel Peace Prize disproves one of the more depressing observations about the United States. The glittering Great Gatsby novelist of the 1920s, F. Scott Fitzgerald, once said, bitterly, "There are no second acts in American lives."
However, after bottoming out in frustration when the presidency eluded him in 2000, Gore has enjoyed a triumphal second act as an environmental activist.
Gore's unique triple play of snaring an Emmy Award, an Oscar and now the Nobel Peace Prize will inevitably resurrect talk of a run for the presidency in 2008. But Gore - and American voters - beware: The skills required to succeed as president and to win a Nobel Prize are not only different but contradictory, especially these days.
Americans yearn for a president with George Washington's rectitude and Pope John Paul's certitude. Celluloid and video presidents often have displayed those qualities, most recently in the character Martin Sheen created on NBC's West Wing, President Josiah "Jed" Bartlet. Bartlet is not only honest and principled; he, like Gore, is a Nobel laureate, having won his in economics before becoming governor of New Hampshire and then president.
Alas, the real world usually leaves the Nobel geniuses in academia, the saints in temples and crusaders on the outside agitating for change within. The Al Gore who won the Nobel Peace Prize was very different from the Al Gore who lost the presidency by judicial fiat. Gore as environmentalist is passionate, daring, funny, even if a bit wooden in his Inconvenient Truth lectures.
Candidate Gore was lethargic, hypercautious, and so wooden that the few jokes he made had to do with his resemblance to those old-fashioned cigar-store Indians.
Few remember how awkward Gore was in 2000 - when he should have triumphed by a wide margin, and did not even win his home state. Few remember that in late October 2001, with Americans still reeling from the Sept. 11 terrorist attacks, few of Gore's own people missed having him in the Oval Office.
After America's Afghanistan invasion ousted the Taliban and sent Osama bin Laden to the hills, 14 of 15 Al Gore supporters the New York Times interviewed had little good to say about Gore's leadership ability. Most Democrats insisted on speaking anonymously, but they pointed to Gore's tendency to micromanage, to be a know-it-all, to get mired in complexity, and to lack George W. Bush's black-and-white clarity when communicating with the people. Similarly, the current Democratic front-runner, Hillary Rodham Clinton, has overcome her tendency to be stiff in public and rigid in her policy preferences, demonstrating a remarkable and vindicating fluidity so far.
Many Americans also will be able to read the signals coming from Oslo loud and clear. Gore's prize is well-deserved for his activism, for his commitment and for his ability to get millions around the world thinking about our stewardship of this planet.
His success in popularizing the concept of everyone's "carbon footprint," which became the buzzword of this past year, is admirable and important. But the Nobel Prize committee's delight in selecting candidates who are Bush's rivals is not likely to win Gore many votes in America's heartland, even among many fed up with Bush.
At a time when America's international standing is so low, it is important to have Europeans' favourite Americans like Jimmy Carter and Gore wined, dined and prized on the continent. But being a hit in the castles of Europe - or even the seminar rooms and newsrooms of North America - has limited utility during a bruising presidential campaign across all 50 United States.
A healthy democracy needs consensus-building politicians who know how to compromise and Martin Luther King-style "creative extremists" who don't.
Gore has demonstrated great skill as an activist - and is doing noble work, challenging us all to build greater lives by living more modestly. But even in this age of celebrity politics, not all fame is transferable.
Gore should not let the accolades of Hollywood and Oslo go to his head. He should continue his great campaign for a clean future while watching the bruising, intensive, soul-searing, often dirty campaign for president of the United States go on without him.
Posted on: Sunday, October 14, 2007 - 18:19
SOURCE: LAT (10-14-07)
Victory in Iraq and Afghanistan is vital to U.S. national security, and we must spend whatever it takes to win in both places. The $190 billion requested for this year is still less than 1.5% of our gross domestic product, a small burden given the enormity of the stakes. We are in a desperate war against terrorists who have vowed to destroy us, yet our military remains about the same size as it was in the 1990s.
America's top priority for weakening Islamist terror groups should be to defeat Al Qaeda in Iraq, which is the increasingly important offshoot of Al Qaeda in Afghanistan. It cannot be allowed to grow stronger. Already, Al Qaeda has used the Soviet failure in Afghanistan and U.S. retreats from Somalia and Lebanon as proof of the strength of its ideology. Sheik Hassan Nasrallah, head of Lebanese Hezbollah (whose agents are supporting Shiite extremists in Iraq), has said that the U.S. will withdraw in shame from Iraq as we did from Vietnam. We must not allow that prediction to come true.
Some people say we must get out of Iraq immediately because our presence there serves only to recruit more people to the ranks of the radical Islamists. But honestly, the presence of American forces in any numbers in a Muslim land can serve as a recruiting tool. It doesn't matter to the terrorists if there are 160,000 Americans in Iraq or 160 -- the propaganda about "U.S. occupation" will be just the same. It does matter if they can claim to have defeated us again.
Other critics would abandon Iraq and shift resources to Afghanistan. Current efforts to fight Al Qaeda inside Afghanistan must be stepped up. But how would we actually rout Al Qaeda from its bases in the tribal areas of Pakistan? Shall we invade Pakistan, a nuclear weapons state with 125 million people? Using American forces to defeat Al Qaeda in Pakistan would be extremely difficult and dangerous, but we are already defeating Al Qaeda in Iraq. It makes no sense to abandon a critical effort in Iraq that is going well to start a riskier campaign from scratch in Pakistan.
Moreover, Iraq is a potentially wealthy country in the heart of the Middle East; Afghanistan is an isolated land with few resources and central to nothing. Al Qaeda would happily trade Afghanistan for Iraq -- indeed, it has done so, funneling its own resources into Iraq to fight us where we are strongest. Ceding either Iraq or Afghanistan to them would be a tragic mistake.
Posted on: Sunday, October 14, 2007 - 18:10
SOURCE: Informed Comment (Blog) (10-12-07)
Turkey has been the strongest ally that the United States has had in the Middle East since the end of WW II. The Marshall Plan started with Northern tier states like Turkey and Greece. Turkey joined NATO and was a key player in the American victory in the Cold War. As a secular government, Turkey stood against the rising tide of Muslim radicalism. To the extent that Turkey is moderating its long-term secular militancy, and moving toward fair elections, it may be providing a model for a moderate, democratic Middle East. Its economy is growing rapidly, and foreign investment is in the billions. Turkey is, in short, almost everything the US could have asked for in the Middle East.
But the Bush administration has, during the past five years, increasingly thrown away this asset, and now is in danger of losing a close and valued ally altogether. It is unclear what US interests are served by this repeated and profound damage inflicted by Washington on Turkey, or what Ankara ever did to us that we are treating them so horribly. (The dismissive treatment in some ways began when the US promised Turkey $1 bn in aid to offset the damage to its economy of the Gulf War in 1990-1991, but then Congress formally decided by the mid-1990s to renege on the pledge. No one has ever explained why we stiffed them.)
The threat of a Turkish hot pursuit of PKK guerrillas into Iraqi Kurdistan is starting to have an effect on Kurdistan's economy and stability. Inflation is high and some Turkish businesses that had won bids to operate in the Kurdistan Regional Government (KRG) area are going back home in fear of trouble. Getting banks to underwrite economic enterprises is getting harder, which could result in a slowdown for Iraqi Kurdistan. This area was the last in Iraq not to be hit hard by instability, but tensions are growing.
Imagine what things look like from a Turkish point of view. Remember that Turkey is a NATO ally, that it stood with the US during the Korean War (in which its troops fought), during the Cold War, and during Bush's war on terror. Turkey gives the US military facilities, including the Incirlik Air Force base, through which large amounts of materiel for the US forces in northern Iraq flows.
First, the Bush administration insisted on invading Iraq and overthrowing the secular Iraqi government. It thereby let the Salafi Sunni and the Shiite fundamentalist genies out of the bottle and created vast instability on the southeastern border. It would be as though a US ally had invaded Mexico and inadvertently unleashed a Marxist peasant rebellion against San Diego. Secular Turkey already felt itself menaced by the Shiite ayatollahs of Iran and by the rising Salafi and al-Qaeda trends, and the US made everything far worse.
Then, the US gave the Kurdistan Regional Government control over the Kirkuk police force and unleashed Kurdish troops on the Turkmen city of Tal Afar. (The Turks look on Iraq's 800,000 Turkmen as little brethren, over whom they feel protective, and don't want them dominated by Kurds.)
The Kurds promptly announced their aspiration of annexing 3 further provinces, or at least big swathes of them, including the oil province of Kirkuk, and including substantial Turkmen populations. Not only was that guaranteed to cause violence with the Arabs and Turkmen, but it would give Kurdistan a source of fabulous wealth with which it could hope to attract Kurds in neighboring countries to join it, a la German Unification after the fall of the Berlin Wall - except that this unification would dismember several other countries.
Then the Kurdistan Regional Government gave safe haven to 3,000 to 5,000 Kurdish guerrillas from eastern Anatolia in Turkey who have been killing Turks and blowing up things, reviving violence that had subsided in the early zeroes. Despite the US military occupation of Iraq, Washington has done nothing to stop what Turkey sees as terrorists from going over the border into Turkey and killing Turks. Turkish intelligence is convinced that the camps in Iraqi Kurdistan are key to weapons provision for the PKK, and that funding is coming from Kurdish small businessmen in Western Europe.
PKK guerrillas have just killed 13 Turkish troops on Sunday and in the past few weeks have killed 28 altogether. If guerrillas were raiding over the border into the United States and had killed 28 US troops I think I know what Washington's response would be.
Then the US Congress abruptly condemned modern Kemalist Turkey for the Armenian genocide, committed by the Ottoman Empire, provoking Ankara to withdraw its ambassador from Washington. I have long held that Turkey should acknowledge the genocide, which killed hundreds of thousands and displaced more hundreds of thousands. The Turkish government could then point out that it was committed by a tyrannical and oppressive government-- the Ottoman Empire-- against which the Kemalists also fought a long and determined war to establish a modern republic. I can't understand Ankara's unwillingness to distance itself from a predecessor it doesn't even think well of--the junta of Enver Pasha and the later pusillanimity of the sultan (the capital is in Ankara and not Istanbul in part for this very reason!).
But no dispassionate observer could avoid the conclusion that the Congressional vote condemning Turkey came at a most inopportune time for US-Turkish diplomacy, at a time when Turks were already raw from watching the US upset all the apple carts in their neighborhood, unleash existential threats against them, cause the rise of Salafi radicalism next door, coddle terrorists killing them, coddle the separatist KRG, and strengthen the Shiite ayatollahs on their borders.
The Congressional vote came despite the discomfort of elements of the Israel lobby with recognizing the mass killing of Armenians as a genocide. Andrew E. Mathis explains Abraham Foxman's intellectually bankrupt vacillations on this issue. Foxman and others of his ideological orientation have been forced grudgingly to back off their genocide denial in the case of the Armenians by a general shift in opinion among the American public, and his change of position may have removed any fears among congressional representatives that the Israel lobby would punish them for their vote. (Turkey and Israel have long had a strong military and diplomatic relationship, which the Israel lobby had earlier attempted to preserve by lobbying Congress on Turkey's behalf with regard to some issues. But the Israel lobby is now split between pro-Kurdish factions and pro-Turkish factions, and the pro-Kurdish ones appear to be winning out. Richard Perle and Michael Rubin of AEI are examples of the pro-Turkish neoconservative strain in the Israel lobby. They are losing.)
In 2000, 56% of Turks reported in polls that they had a favorable view of the United States. In 2005 that statistic had fallen to 12%. I shudder to think what it is now.
Posted on: Friday, October 12, 2007 - 19:18
SOURCE: TomDispatch.com (10-11-07)
Duane Schattle doesn't mince words. "The cities are the problem," he says. A retired Marine infantry lieutenant colonel who worked on urban warfare issues at the Pentagon in the late 1990s, he now serves as director of the Joint Urban Operations Office at U.S. Joint Forces Command. He sees the war in the streets of Iraq's cities as the prototype for tomorrow's battlespace. "This is the next fight," he warns. "The future of warfare is what we see now."
He isn't alone. "We think urban is the future," says James Lasswell, a retired colonel who now heads the Office of Science and Technology at the Marine Corps Warfighting Laboratory. "Everything worth fighting for is in the urban environment." And Wayne Michael Hall, a retired Army brigadier general and the senior intelligence advisor in Schattle's operation, has a similar assessment: "We will be fighting in urban terrain for the next hundred years."
Last month, in a hotel nestled behind a medical complex in Washington, D.C., Schattle, Lasswell, and Hall, along with Pentagon power-brokers, active duty and retired U.S. military personnel, foreign coalition partners, representatives of big and small defense contractors, and academics who support their work gathered for a "Joint Urban Operations, 2007" conference. Some had served in Iraq or Afghanistan; others were involved in designing strategy, tactics, and concepts, or in creating new weaponry and equipment, for the urban wars in those countries. And here, in this hotel conference center, they're talking about military technologies of a sort you've only seen in James Cameron's 2000-2002 television series Dark Angel.
I'm the oddity in this room of largely besuited defense contractors, military retirees, and camouflage-fatigue-clad military men at a conference focused on strategies for battling it out in the labyrinthine warrens of what urbanologist Mike Davis calls "the planet of slums." The hulking guy who plops down next to me as the meeting begins is a caricature of just the attendee you might imagine would be at such a meeting. "I sell guns," he says right off. Over the course of the conference, this representative of one of the world's best known weapons manufacturers will suggest that members of the media be shot to avoid bad press and he'll call a local tour guide he met in Vietnam a "bastard" for explaining just how his people thwarted U.S. efforts to kill them. But he's an exception. Almost everyone else seems to be a master of serene anodyne-speak. Even the camo-clad guys seem somehow more academic than warlike.
In his tour de force book Planet of Slums, Davis observes, "The Pentagon's best minds have dared to venture where most United Nations, World Bank or Department of State types fear to go… [T]hey now assert that the ‘feral, failed cities' of the Third World -- especially their slum outskirts -- will be the distinctive battlespace of the twenty-first century." Pentagon war-fighting doctrine, he notes, "is being reshaped accordingly to support a low-intensity world war of unlimited duration against criminalized segments of the urban poor."
But the mostly male conference-goers planning for a multi-generational struggle against the global South's slums aren't a gang of urban warfare cowboys talking non-stop death and destruction, and they don't look particularly bellicose either, as they munch on chocolate-chip cookies during our afternoon snack breaks in a room where cold cuts and brochures for the Rapid Wall Breaching Kit -- which allows users to blast a man-sized hole in the side of any building -- are carefully laid out on the tables. Instead, these mild-mannered men speak about combat restraint, "less than lethal weaponry," precision targeting, and (harking back to the Vietnam War) "winning hearts and minds."
The Men of Urban Warfare
Take Dr. Russell W. Glenn, a thin, bespectacled RAND Senior Policy Researcher who looks for all the world like some bookish college professor Hollywood dreamed up. You'd never guess he went to the Army's airborne, ranger, and pathfinder schools and is a veteran of Operation Desert Storm. You'd also never suspect that he might be the most prolific planner for the Pentagon's century-long slum fight of tomorrow.
In Planet of Slums, Davis notes that the RAND Corporation, a non-profit think-tank established by the Air Force in 1948, has been a key player in pioneering the conceptual framework that has led to the current generation of what's called, in the jargon of this meeting, "urban operations" or, more familiarly, UO. Glenn, it so happens, is their main man in the field. He travels the planet studying counterinsurgency warfare. Of late, he's been to the Solomon Islands, where an island rebellion occurred in the late 1990s, the Philippines, where an insurgency has been raging for decades (if not since the U.S. occupation at the dawn of the twentieth century), and, of course, Iraq. He's co-authored well over 20 UO studies for RAND including, most recently, "People Make the City": Joint Urban Operations Observations and Insights from Afghanistan and Iraq (publicly available in 86-page executive summary form) and the still-classified A Tale of Three Cities: Analyzing Joint Urban Operations with a Focus on Fallujah, Al Amara, and Mosul.
On the technological front, the Pentagon's blue-skies research outfit, the Defense Advanced Research Projects Agency (DARPA) sent its grandfatherly-looking deputy director, Robert F. Leheny, to talk about such UO-oriented technology as the latest in unmanned aerial vehicles (UAVs) and sense-through-walls technologies that allow troops to see people and objects inside buildings. While Leheny noted that 63% of DARPA's $3 billion yearly budget ($600 million of it dedicated to UO technologies in the coming years) is funneled to industry partners, DARPA is only a part of the story when it comes to promoting corporate assistance in this 100-year-war growth area.
The largest contractors in the military-corporate complex are already hard at work helping the Pentagon prepare for future urban occupations. Raytheon, L-3 Communications, and Science Applications International Corporation (SAIC) -- the 5th, 7th, and 10th largest Pentagon contractors last year, taking in a combined $18.4-plus billion from the Department of Defense -- have all signed Cooperative Research and Development Agreements with the U.S. Joint Forces Command, according to Berry "Dan" Fox, the Deputy Director of Science and Technology at its Joint Urban Operations Office.
As you might imagine, smaller contractors are eager to climb aboard the urban warfare gravy train. At the conference, Lite Machines Corporation was a good example of this. It was vigorously marketing a hand-launched, low-flying UAV so light that it resembled nothing more than a large, plastic toy water rocket with miniature helicopter rotors. The company envisions a profitably privacy-free future in which urban zones are besieged by "swarms" of such small UAVs that not only peek into city windows, but even invade homes. According to a company spokesman, "You could really blanket a ground area with as many UAVs as you want…. penetrate structures, see through a window or even break a window," in order to fly inside a house or apartment and troll around.
DARPA's Leheny also extolled hovering UAVs, specifically the positively green-sounding Organic Micro Air Vehicle which brings to mind the "spinners" in Blade Runner or, even earlier in blow-your-mind futuristic movie history, V.I.N.CENT from Disney's The Black Hole. This drone, Leheny noted, has "perch and stare" capabilities that allow it to lie in wait for hours before fixing on a target and guiding in extended-line-of-sight or beyond-line-of-sight weapons. He also described in detail another DARPA-pioneered unmanned aerial vehicle, the WASP -- a tiny, silent drone that spies on the sly and can be carried in a soldier's pack. Leheny noted that there are now "a couple hundred of these flying in Iraq."
In addition to endless chatter about the devastated "urban canyons" of Iraq and Afghanistan, the specters of past battleground cities -- some of them, anyway -- were clearly on many minds. There were constant references to urban battle zones of history like Stalingrad and Grozny or such American examples as Manila in 1945 and Panama City in 1989. Curiously neglected, however, were the flattened cities of Germany and Japan in World War II, not to speak of the bombed-out cities of Korea and Vietnam. Perhaps the Korean and Vietnam Wars weren't on the agenda because "restraint" and "precision" were such watchwords of the meeting. No one seemed particularly eager to discuss the destruction visited on the Iraqi city of Fallujah either -- three-quarters of its buildings and mosques were damaged in an American assault in November 2004.
During James Lasswell's presentation, he was quite specific about the non-Fallujah-like need to be "very discriminate" in applying firepower in an urban environment. As an example of the ability of technology to aid in such efforts, he displayed a photo of the aftermath of an Israeli strike on a three-story Lebanese building. The third floor of the structure had been obliterated, while the roof above and the floors below appeared relatively unscathed. In an aside, Lasswell mentioned that, while the effort had been a discriminating one, the floor taken out "turned out to be the wrong floor." A rumble of knowing chuckles swept the room.
Fighting in the City of Your Choice, 2045
Discrimination, it turned out, didn't mean legal constraint. Speakers and conference-goers alike repeatedly lamented the way international law and similar hindrances stood in the way of unleashing chemical agents and emerging technologies. Microwave-like pain rays and other directed energy weapons -- such as the Active Denial System, which inflicts an intense burning sensation on victims -- were recurring favorites of the gathering. During their PowerPoint presentation, the men from Lite Machines, for instance, showed a computer rendering of their micro-UAVs attacking an unarmed crowd gathered in a town square with a variety of less-than-lethal weapons like disorienting laser dazzlers and chemical gases (vomiting and tear-gas agents), while a company spokesman regretfully mentioned that international regulations have made it impossible to employ such gases on the battlefield. Undoubtedly, this was a reference to the scorned Chemical Weapons Convention, which has been binding for the last decade.
RAND's Glenn similarly brought up the possibility of reassessing such international conventions and overcoming fears that chemical weapons might fall into the "wrong hands." Saddam Hussein was his example of such "wrong hands," but the hands responsible for Abu Ghraib, Mahmudiyah, Hamdania, Haditha, or the invasion of Iraq itself -- to find non-existent banned weapons -- seemed to give him no pause.
While the various speakers at the conference focused on the burgeoning inhabitants of the developing world's slum cities as targets of the Pentagon's 100-year war, it was clear that those in the "homeland" weren't about to escape some of its effects either. For example, back in 2004, Marines deploying to Iraq brought the Long Range Acoustic Device (LRAD) with them. A futuristic non-lethal weapon alluded to multiple times at the conference, it emits a powerful tone which can bring agonizing pain to those within earshot. Says Woody Norris, chairman of the American Technology Corporation, which manufactures the device: "It will knock [some people] on their knees." That very same year, the LRAD was deployed to the streets of the Big Apple (but apparently not used) by the New York Police Department as a backup for protests against the Republican National Convention. In 2005, it was shipped to "areas hit by Hurricane Katrina" for possible "crowd control" purposes and, by 2006, was in the hands of U.S. Border Patrol agents. In that same year, it was also revealed that the Los Angeles County Sheriff's Department had begun testing the use of remote-controlled surveillance UAVs -- not unlike those now operating above Iraqi cities -- over their own megalopolis.
When it came to the "homeland," conference participants were particularly focused on moving beyond weaponry aimed at individuals, like rubber bullets. Needed in the future, they generally agreed, were technologies that could target whole crowds at once -- not just rioters but even those simply attending "demonstrations that could go violent."
Other futuristic UO concepts are also coming home. According to Dan Fox of the Joint Urban Operations Office, the Department of Justice, like the military, is currently working on sense-through-wall technologies. His associate Duane Schattle is collaborating with the U.S. Northern Command (NORTHCOM) -- set up by the Bush administration in 2002 and whose area of operations is "America's homefront" -- on such subjects as "sharing intelligence, surveillance, and reconnaissance, command and control capabilities." He also spoke at the conference about developing synergy between the Departments of Defense and Homeland Security in regard to urban-operations technologies. He, too, expressed his hope that microwave weapon technology would be made available for police use in this country.
A specific goal of DARPA, as a slide in deputy chief Leheny's presentation made clear, is to "make a foreign city as familiar as the soldier's backyard." This would be done through the deployment of intrusive sensor, UAV, and mapping technologies. In fact, there were few imaginable technologies, even ones that not so long ago inhabited the wildest frontiers of science fiction, that weren't being considered for the 100-year battle these men are convinced is ahead of us in the planet's city streets. The only thing not evidently open to discussion was the basic wisdom of planning to occupy foreign cities for a century to come. Even among the most thoughtful of these often brainy participants, there wasn't a nod toward, or a question asked of, the essential guiding principle of the conference itself.
With their surprisingly bloodless language, antiseptic PowerPoint presentations, and calm tones, these men -- only one woman spoke -- are still planning Iraq-style wars of tomorrow. What makes this chilling is not only that they envision a future of endless urban warfare, but that they have the power to drive such a war-fighting doctrine into that future; that they have the power to mold strategy and advance weaponry that can, in the end, lock Americans into policies that are unlikely to make it beyond these conference-room doors, no less into public debate, before they are unleashed.
These men may be mapping out the next hundred years for urban populations in cities across the planet. At the conference, at least, which ones exactly seemed beside the point. Who could know, after all, whether in, say, 2045, the target would be Mumbai, Lagos, or Karachi -- though one speaker did offhandedly mention Jakarta, Indonesia, a city of nine million today, as a future possibility.
Along with the lack of even a hint of skepticism about the basic premise of the conference went a fundamental belief that being fought to a standstill by a ragtag insurgency in Iraq was an issue to be addressed by merely rewriting familiar tactics, strategy, and doctrine and throwing multi-billions more in taxpayer dollars -- in the form of endless new technologies -- at the problem. In fact, listening to the presentations in that conference room, with its rows of white-shrouded tables in front of a small stage, it would not have been hard to believe that the U.S. had defeated North Korea, had won in Vietnam, had never rushed out of Beirut or fled Mogadishu, or hadn't spent markedly more time failing to achieve victory in Afghanistan than it did fighting the First and Second World Wars combined.
To the rest of the world, at least, it's clear enough that the Pentagon knows how to redden city streets in the developing world, just not win wars there; but in Washington -- by the evidence of this"Joint Urban Operations, 2007" conference -- it matters little. Advised, outfitted, and educated by these mild-mannered men who sipped sodas and noshed on burnt egg rolls between presentations, the Pentagon has evidently decided to prepare for 100 years more of the same: war against various outposts of a restless, oppressed population of slum-dwellers one billion strong and growing at an estimated rate of 25 million a year. All of these UO experts are preparing for an endless struggle that history suggests they can't win, but that is guaranteed to lead to large-scale destruction, destabilization, and death. Unsurprisingly, the civilians of the cities that they plan to occupy, whether living in Karachi, Jakarta, or Baghdad, have no say in the matter. No one thought to invite any of them to the conference.
This article first appeared on www.tomdispatch.com, a weblog of the Nation Institute, which offers a steady flow of alternate sources, news and opinion from Tom Engelhardt, a longtime editor in publishing, the author of The End of Victory Culture, and a fellow of the Nation Institute.
Posted on: Friday, October 12, 2007 - 18:57
SOURCE: Nation (10-10-07)
The People's Republic of China is an unusual place; as someone who teaches about China and regularly travels there, I wouldn't want it any other way. It would be boring to lecture about a country lacking distinctive elements, and spending time in China would be far less interesting if I weren't continually struck by ways that it differs from other countries, including this one. It is also true that China is sometimes treated unusually by the international community. Given the size of its population, for example, it is often courted and feared in ways that smaller countries are not.
But I'm distressed by the tendency of so many Americans to assume that everything that goes on in China and everything about the treatment it gets is exotic and unusual. Often things that happen in or involve China are normal--even routine--and we can understand them without factoring in esoteric cultural traits or thinking of the country as a place that, in the global arena, always mysteriously gets handled with kid gloves.
Take the Olympics. To read some American commentaries, you'd think that the Games have virtually never been hosted by human rights-abusing authoritarian regimes--with Berlin in 1936 and Moscow in 1980 the only previous anomalies. According to this line of thinking, the decision for China to host the Games, despite the fact that the country is led by the regime responsible for the June 4, 1989, massacre, proves that it consistently gets coddled.
But the list of authoritarian states that have hosted the Olympics is actually fairly long. Militaristic Japan got the go-ahead for the 1940 Games (though these were ultimately canceled). In 1968 the Olympic Games were held days after a massacre of students in Mexico (then a one-party state). In 1988, when Seoul got the nod to host the Games, South Korea was run by the same autocrats with blood on their hands from the Kwangju Massacre in 1980.
And consider the current furor over product safety and piracy. Here the situation is fairly typical for a country at a certain stage of capitalist development, yet China somehow is regarded differently.
Troubling forms of corruption are endemic to China, making it easy for well-connected people to get away with flouting copyright and product safety laws. Still, as American historian Stephen Mihm notes in a recent essay published in the Boston Globe, chalking up the piracy and product problems to China's unique features is "a tempting way to see things, but wrong."
That's because what's "happening halfway around the world may be disturbing, even disgraceful, but it's hardly foreign," Mihm writes. "A century and a half ago, another fast-growing nation had a reputation for sacrificing standards to its pursuit of profit, and it was the United States.... American factories turned out adulterated foods and willfully mislabeled products. Indeed, to see China today is to glimpse, in a distant mirror, the 19th-century American economy in all its corner-cutting, fraudulent glory."
Something else that's "hardly foreign" but has been treated as exotic is Mattel executive Thomas Debrowski's apology to China for the country's factories taking all the heat initially for his company's recalls. Debrowski's point was simple: even though Chinese factories shouldn't use lead paint (a banned substance), fewer toys were recalled for this reason than for containing magnets harmful to toddlers if swallowed. In producing those, Chinese factories just faithfully followed flawed designs that were made in the USA.
In America, people and corporations trying to move beyond an unpleasant moment routinely make public apologies. Don Imus and his employers did this, for example, after the radio host insulted a group of female athletes. And the president of Duke University recently apologized for not doing more to support members of the school's lacrosse team when they were being investigated as a result of rape charges that were eventually dropped.
And yet, to explain Debrowski's actions, many commentators felt compelled to invoke esoteric concepts and terms. The company had "kowtowed" to China (a term conjuring up exotic images of feudal obeisance). Debrowski had apologized in a very public way, a business professor claimed in a widely circulated AP report, because "saving face...is very important in the Chinese culture."
Funny, when Imus and his employers made their apologies, this wasn't referred to as "kowtowing" to anyone. Nor have the very public actions taken by Duke's president been said to reveal an American cultural obsession with face-saving.
Aha, a skeptic might interject--there's still something culturally distinctive about the way Chinese officials recounted Debrowski's apology. To protect the regime's reputation, they claimed that Mattel had assured the world that China and its factories were completely blameless.
That was misleading but hardly exotic. Can Americans really claim without blushing that spin control intended to "save face" and deflect criticism from government failings is unknown here? Doesn't our President keep presenting reports of minimal progress amidst cascading disaster in Iraq as proof his policies are working?
To borrow Mihm's phrasing, when the Chinese regime acts in "disturbing" or "disgraceful" ways, we should by all means speak out against this behavior and try to change it. But when we do so, it would be disingenuous to pretend that China's behavior is exotic. Often, the things we want Beijing to stop doing--recklessly using fossil fuels, mistreating ethnic groups living in frontier regions, and so on--are much like things that America is doing or used to do, much as we wish now that it hadn't.
The Chinese regime deserves to be chastised for the shameful way it continues to prop up the thuggish rulers of Burma, including moving to block a forceful UN censure, just because they happen to be allies. But what Beijing is doing is not so different from what Washington did in 1980 at the time of the Kwangju Massacre, when the protesters being mowed down were Korean rather than Burmese and the thugs behind the killings were our allies, not China's.
We should try to work from a novel starting point whenever we want to criticize China--or indeed when we want to praise it or simply try to understand it. Namely, assume that despite its unusual size, distinctive history and other things that set it apart and make it anomalous (such as being run by a Communist Party that has embraced elements of capitalism), China has many features that are familiar, not exotic.
It's been nearly thirty years since Washington and Beijing normalized diplomatic relations. Isn't it time we finally normalized the way we think and talk about China?
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Friday, October 12, 2007 - 18:30
SOURCE: Jerusalem Post (10-11-07)
"We are all Keynesians now," Richard Nixon famously asserted just as the economic theories of John Maynard Keynes fell into disrepute. Likewise, one could have said with similar confidence in 1989, as Israel's existence reached wide acceptance, "We are all Zionists now." No longer.
Count the ways Israel is under siege: from Iranians building a nuclear bomb, Syrians stockpiling chemical weapons, Egyptians and Saudis developing serious conventional forces, Hizbullah attacking from Lebanon, Fatah from the West Bank, Hamas from Gaza, and Israel's Muslim citizens becoming politically restive and more violent.
World-wide, professors, editorialists, and foreign ministry bureaucrats challenge the continued existence of a Jewish state. Even friendly governments, notably the Bush administration, pursue diplomatic initiatives that undermine Israeli deterrence even as their arms sales erode its security.
Let's suppose, however, that the country muddles through these many problems. That leaves it face to face with its ultimate challenge: a Jewish population increasingly disenchanted with, even embarrassed by, the country's founding ideology, Zionism, the Jewish national movement.
As developed by Theodor Herzl (1860-1904) and other theoreticians, Zionism's call for a sovereign Jewish state fit the political context and mood of its time. If Chinese, Arabs, and Irish sought to establish a national state, why not Jews?
Indeed, especially Jews, for through nearly two millennia they had paid the greatest price of any people for their political weakness, having been expelled, victimized, persecuted and mass murdered as none other. Zionism offered an escape to this tragic history by standing tall and taking up the sword.
From its inception, Zionism had its share of Jewish opponents, ranging from the Haredim (Ultra-Orthodox) to nostalgic Iraqis to Reform rabbis. But until recently, these were marginal elements. Now, due to high birth rates, the once-tiny Haredi community constitutes 22 percent of Israel's current first-grade class; add to this the roughly equivalent number of Arab first-graders, and a sea change in Israeli politics can be expected about 2025.
Worse for Israel, Jewish nationalism has lost the near-automatic support it once had among secular Jews, many of whom find this nineteenth-century ideology out of date. Some accept arguments that a Jewish state represents racism or ethnic supremacism, others find universalist and multi-cultural alternatives compelling. Consider some signs of the changes underway:
- Young Israelis are avoiding the military in record numbers, with 26 percent of enlistment-age Jewish males and 43 percent of females not drafted in 2006. An alarmed Israel Defense Forces has requested legislation to deny state-provided benefits to Jewish Israelis who do not serve.
- Israel's Attorney General Menachem Mazuz has up-ended the work of the Jewish National Fund, one of the pioneer Zionist institutions (founded in 1901) by determining that its role of acquiring land specifically for Jews cannot continue in the future with state assistance.
- Prominent Israeli historians focus on showing how Israel was conceived in sin and has been a force for evil.
- Israel's ministry of education has approved school books for third-grade Arab students that present the creation of Israel in 1948 as a "catastrophe" (Arabic: nakba).
- Avraham Burg, scion of a leading Zionist household and himself a prominent Labor Party figure, has published a book comparing Israel with 1930s Germany.
- A 2004 poll found only 17 percent of American Jews call themselves "Zionist."
To top it off, Arabs are moving these days in the opposite direction, reaching a fever pitch of ethnic and religious bellicosity.
As a Zionist myself, I watch these several trends with foreboding about Israel's future.
I console myself by recalling that few of today's problems were evident in 1989. Perhaps in 2025, Zionism's prospects will again brighten, as Westerners generally and Israelis specifically finally awake to the dangers posed by Palestinian irredentists, jihadists, and other extremist Middle Easterners.
Posted on: Thursday, October 11, 2007 - 22:01
SOURCE: NYT Magazine (9-23-07)
The last Supreme Court term, which ended in June, was the stormiest in recent memory, with more 5-to-4 decisions split along ideological lines than at any time in the court's history. In a series of controversial cases about abortion, racial integration in schools, faith-based programs and the death penalty, the court's four more conservative justices prevailed, with Justice Anthony M. Kennedy providing the crucial fifth vote. The four more liberal justices were often moved to dissent in unusually personal and vehement terms. "It is my firm conviction," Justice John Paul Stevens wrote in the case striking down race-based enrollment policies in public schools, "that no Member of the Court that I joined in 1975 would have agreed with today's decision." According to the gossip among Supreme Court law clerks, the level of tension among the justices is higher than at any point since Bush v. Gore in 2000.
Not long after beginning his tenure as chief justice in 2005, John G. Roberts Jr. announced publicly that he would try to promote unanimity and collegiality on the court. During his first months on the job, the court managed to achieve his goal, issuing a series of 9-to-0 opinions. But this past term, the court's first full one with Justice Samuel A. Alito Jr., the brief period of harmony abruptly ended: the percentage of 5-to-4 decisions in which the four liberals were together in dissent rose to 80 percent, up from 55 percent in the 2004 term. For the foreseeable future, the court seems likely to be polarized, with the conservative bloc ascendant and the liberal bloc embattled.
Justice Stevens, the oldest and arguably most liberal justice, now finds himself the leader of the opposition. Vigorous and sharp at 87, he has served on the court for 32 years, approaching the record set by his predecessor, William O. Douglas, who served for 36. In criminal-law and death-penalty cases, Stevens has voted against the government and in favor of the individual more frequently than any other sitting justice. He files more dissents and separate opinions than any of his colleagues. He is the court's most outspoken defender of the need for judicial oversight of executive power. And in recent years, he has written majority opinions in two of the most important cases ruling against the Bush administration's treatment of suspected enemy combatants in the war on terror -- an issue the court will revisit this term, which begins Oct. 1, when it hears appeals by Guantánamo detainees challenging their lack of access to federal courts....
Posted on: Thursday, October 11, 2007 - 21:21
SOURCE: Britannica Blog (10-10-07)
More than half a century has passed since the United States deposed the only democratic government Iran ever had. As militants in Washington urge a second American attack on Iran, the story of the first one becomes more urgently relevant than ever. It shows the folly of using violence to try to reshape Iran.
If the United States had not sent agents to depose Prime Minister Mohammad Mossadegh (right) in 1953, Iran would probably have continued along its path toward full democracy. Over the decades that followed, it might have become the first democratic state in the Muslim Middle East, and perhaps even a model for other countries in the region and beyond.
Before great powers take far-reaching decisions that can reshape the world, their leaders normally consider the lessons of history. Any serious discussion about modern Iran, and certainly any debate about whether the United States should intervene there, must include an assessment of what happened after the last intervention. In 1953, eager to achieve short-term goals, the US launched an operation that brought calamity on both Iran and itself. Some in Washington, however, reject the idea that this history has any relevance to the present era. They believe that this time, the United States can attack Iran and emerge triumphant.
Attacking Iran now, however, would turn that country’s oppressive leaders, who are now highly unpopular at home, into heroes of Islamic resistance; give them a strong incentive to launch a violent counter-campaign against American interests around the world; greatly strengthen Iranian nationalism, Shiite irredentism and Muslim extremism, thereby attracting countless new recruits to the cause of terror; undermine the democratic movement in Iran and destroy the prospects for political change there for at least another generation; turn the people of Iran, who are now among the most pro-American in the Middle East, into enemies of the United States; require the United States to remain deeply involved in the Persian Gulf indefinitely, forcing it to take sides in all manner of regional conflicts and thereby make a host of new enemies; enrage the Shiite-dominated government in neighboring Iraq, on which the US is relying on calm the violence there; and quite possibly disrupt the flow of Middle East petroleum in ways that could wreak havoc on Western economies.
These two countries are not fated to be enemies forever. In fact, they share many strategic goals and may even be seen as potential allies. Both desperately want to stabilize Iraq and Afghanistan. Both detest radical Sunni movements like al-Qaeda and the Taliban. Both, for different reasons, seek to assure a steady supply of petroleum to Western markets. Iran’s oil industry is in a parlous state and needs tens of billions of dollars in investment; the United States has huge reserves of capital and a voracious appetite for oil.
A new American approach to Iran should be based on direct, bilateral, and unconditional negotiations. Beyond that, it is in the urgent interest of the United States to promote all manner of social, political and economic contacts with Iranians. In a new climate, American businesses would no longer be forbidden to trade with Iran, but encouraged to do so. Rather than tightly restricting the number of visas issued to Iranians, the US would do the opposite: invite as many Iranians as possible to the United States, and flood Iran with Americans.
Unlike other countries in its neighborhood, Iran has been advancing toward democracy since adopting its first constitution more than a century ago. Iranian constitutions have not always been observed, and Iranian elections have not always been fair. Over this long period, however, the Iranian people have developed a deep understanding of what democracy means. Many thirst for it. There is more fertile ground for democratic change in Iran than in almost any other Muslim country.
Some in Washington argue that any new regime in Iran would be an improvement over the repressive and xenophobic mullahs. They are dangerously mistaken. An attack on Iran might well throw that country into chaos like that which has enveloped Iraq. In such an anarchic environment, there would be no central authority to control violent radicals. Most frighteningly, those radicals might include enraged nuclear technicians and scientists. The chance that Iranians might use their technological know-how to pass weapons of mass destruction on to terrorist groups would be far greater after an attack than it is now.
Bombing nuclear facilities in Iran — assuming they could all be found and destroyed — would be at best a temporary solution. It would almost certainly lead to the emergence of more terrifying threats than those Iran poses today. As the director of the International Atomic Energy Agency, Mohamed ElBaradei, likes to point out, buildings can be attacked and destroyed, but “you cannot bomb knowledge.”
By violently pushing Iran off the path to democracy in 1953, the United States created a whirlpool of instability from which undreamed-of threats emerged years later. A long American campaign of isolation, pressure and threats has produced no change in Iran’s behavior. Continuing it will mean a steady increase in tension that some in Washington believe should culminate in a military attack. Such an attack would usher in another era of upheaval in Iran and the surrounding region, this time with the overlay of nuclear-tinged terror.
Operation Ajax, as the CIA plot to depose Prime Minister Mossadegh was code-named, brought immeasurable tragedy to Iran, contributed to the rise of anti-American terror and, in the end, greatly weakened the security of the United States. Few episodes of 20th-century history more perfectly epitomize the concept of “blowback.” Today, as anti-Iran rhetoric in Washington becomes steadily more strident, it is more urgent than ever for Americans to understand how disastrous the last US attack on Iran turned out to be. They might also ponder the question of what moral responsibility the US has to Iran in the wake of this painful history.
Posted on: Wednesday, October 10, 2007 - 22:57
SOURCE: Global Research in International Affairs (GLORIA) Center (10-9-07)
Working on new material for the seventh edition of the Israel-Arab Reader, a documentary work that I edit along with Walter Laqueur, reminds me that there is nothing like examining old material as a way to gain new insights.
This edition updates the book whose current contents ended in 2000 with the failure of the peace process. The most important developments since then are basically the renewed intifada; Israeli withdrawals from southern Lebanon and the Gaza Strip; the growing direct involvement of Hizballah and Iran, including the 2006 war; and Hamas' triumph over Fatah in the 2006 elections and in seizing control of the Gaza Strip.
But you know all that already. What is really interesting is to see some fascinating themes that tell us so much about both the conflict and Middle East politics.
A very important theme is whether there is any operational plan, any real ability, to implement needed reforms and achieve goals: Compare the speech made by Mahmoud Abbas to the Palestinian Legislative Council when he took power as prime minister on April 29, 2003, with one made by Prime Minister Ehud Olmert immediately after the Lebanon War, on August 28, 2006.
Abbas gives an impressive talk. While blaming all Palestinian problems on Israel and the occupation, he also puts the emphasis on the need for internal reform. The Palestinians have behaved so impressively, he claims, that the world has decided they are worthy of a state.
To achieve that goal, however, the Palestinians must put their own house in order. Their government must provide security for its citizens, ensure that the security services operate according to law, and be disciplined. The government will not allow--to the contrary, it will strictly prevent--interference by the security forces in the lives, affairs and business of citizens except within the limits permitted by the law.
On and on he goes, discussing the need for economic development, higher living conditions, and freedom. To Israelis, he pledges that his government will oppose terrorism. What is telling here is that literally not a single thing he talks about was ever implemented. We aren't just talking about success here. Abbas never even tried. From today's perspective, with Fatah driven out of the Gaza Strip and so much discredited, all these promises seem most ironic.
One doesn't have to be an admirer of Olmert to see the difference in attitude, for he is reflecting his society in this case. He explains: Even if the overall balance is positive, we cannot ignore the failures, we must not cover them up, we must not overlook anything. We do not have time. We must act quickly [to]…fix everything that must be fixed….
Obviously, Olmert wants to avoid taking the blame or suffering the consequences of these shortcomings. But his government, and the military especially, has honestly examined the mistakes and made many changes. And if they don't do the necessary reforms, their elected successors will do so. The difference here is a society which takes self-criticism and implementation seriously and one that does not, perhaps cannot, do so.
A second interesting theme is whether human life is valued. Nasrallah and the leaders of Iran, Syria, and Hamas extol martyrdom. They will win, they claim, because they love death and their enemies love life.
A word of advice: when you hear someone bragging about how wonderful it is to be a martyr, be assured that those people will lose. For the object of war is to survive and win. The martyrdom egomaniacs are obsessed with being heroes no matter what the cost. But those who love martyrdom also love the martyred cause, the glorious defeat toward which their strategy is leading them. The Palestinian movement's history provides the best possible example of this syndrome.
In contrast, Olmert said: “It is true that they suffered heavier losses, but this does not console us over the loss of one soldier, one person who was killed, one citizen who died.” But caring about your own people—even if the other side sees that as weakness—is a far greater weapon in building a strong, successful society than treating your citizens as dispensable pawns for the glory of deity and dictator.
A third theme is the definition of victory. Nasrallah's stance here is shockingly, well, suicidal. He stated: “If the resistance survives, this will be a victory. If its determination is not broken, this will be a victory….[A refusal] to accept any humiliating terms…will be a victory. If we are not militarily defeated, this will be a victory.”
In short, as long as you keep fighting and refuse to acknowledge defeat you have won even if you destroy your own society. What this means in practice is that the Arab side has been able to sustain the conflict for 60 years at the cost of social progress, higher living standards, stability, and freedom. This is a disastrous victory which ensures that the inability to win becomes ever more certain.
Israeli leaders, in contrast, define victory in terms of security for their citizens, control over strategic territory, deterrence, and other very specific goals which actually produce some benefit. Moreover, they can either be met or measured against the need for changes in strategy and methods.
There are reasons why some societies succeed and others fail, why some causes triumph and others don't. When such differences are attributed to conspiracy or Western meanness; when defeat is treated as a virtue; when being the underdog forgives all vices and the highest praise is to be regarded as a victim, these reasons are disparaged. And that, of course, ensures that all the bad notes of history play on into a new century.
Posted on: Wednesday, October 10, 2007 - 21:32
SOURCE: http://www.delawareonline.com (10-7-07)
Sept. 30 saw a rare display of Iraqi-American unity in Baghdad: The U.S. embassy as well as scores of Iraqi politicians joined forces in condemning a U.S. Senate resolution to impose a federal state structure on all parts of Iraq.
In general, there was agreement that the proposal, which had been introduced by Sen. Joseph Biden, constituted gross interference in Iraqi internal affairs. Iraq already has a specific and very elaborate procedure for deciding the federalism issue, but both the timeline (nothing will start until April 1, 2008) and the size and number of the future federal entities (to be decided by popular referendums on the basis of grass-roots initiatives) are clearly at variance with the Senate's proposal of an "international conference" intended to accelerate and simplify matters.
For once, it seemed as if the Bush administration and the Iraqis were united in stressing the virtues of a unified Iraq capable of recovering from sectarian distrust.
There was one anomaly in this picture of Iraqi-American unity of purpose: The main forces that pulled together to condemn the Senate's decision were mostly from parties that are being largely ignored by the Bush administration. They included Sadrists, the Fadila party, independents and Daawa members of the United Iraqi Alliance, the Tawafuq bloc, and secular groups like Iraqiyya and National Dialogue Front. All in all, they made up a strong Shiite-Sunni alliance accounting for more than a simple majority in Iraq's parliament.
By way of contrast, all of Washington's principal allies in Iraq were absent. The Kurds enthusiastically welcomed the Senate decision, and the Islamic Supreme Council of Iraq wavered in its response, probably understanding some obvious parallels between the Senate proposal and their own scheme for a Shiite region, but also sensing that public opinion was blowing in a different direction.
Could the message from Baghdad have been any clearer? Is there now any doubt as to where the real center in Iraqi politics is located? The reactions in Baghdad to the Senate decision show clearly that there is in fact a majority of Iraqi politicians who are prepared to work for the Bush administration's goal of a unified and non-sectarian Iraq, staunchly independent from its neighbors while at the same time at peace with them.
Posted on: Wednesday, October 10, 2007 - 21:22
SOURCE: FrontpageMag.com (10-10-07)
[Henry Mark Holzer, Professor Emeritus at Brooklyn Law School, is an appellate lawyer who specializes in constitutional law. You can contact him via his website: www.henrymarkholzer.com.]
Upon entering the august chamber of the Supreme Court of the United States during oral argument, one immediately sees the nine justices, almost regal in their black robes in front of a huge velvet curtain.
Among his colleagues—Chief Justice Roberts and Associate Justices Stevens, Kennedy, Scalia, Souter, Ginsburg, Breyer, and Alito—sits Associate Justice Clarence Thomas.
Chief Justice Roberts’s father was an executive with Bethlehem Steel. Justice John Paul Stevens’s father was a lawyer. Justice Antonin Scalia’s father was a professor of romance languages. Justice Anthony Kennedy’s father was a lawyer. Justice David Souter’s father was a banker. Justice Ruth Bader Ginsburg’s father was a businessman. Justice Stephen Breyer’s father was a lawyer. Justice Samuel Alito’s father was a high school teacher. Given their family circumstances and upbringing, it is not surprising that eventually they would be successful candidates for the Supreme Court of the United States.
In contrast, Justice Clarence Thomas was nine when he first met his father, whose “firm, shameless voice…carried no hint of remorse for his inexplicable absence from our lives.” In his recently published autobiography, My Grandfather’s Son, the author writes, “I saw him for the second time after I graduated from high school.”
Thomas’s mother had been born out of wedlock. She worked in a rural factory shucking oysters and picking crabs, and as a domestic servant.
I knew none of this, and little else, about Clarence Thomas’s personal life until reading his revealing, evocative autobiography. However, before that, albeit in another connection, I knew him quite well—even though we have never corresponded, spoken, or met.
I knew Justice Thomas through his opinions, written as an Associate Justice of the Supreme Court of the United States, which are the subject of my own book The Supreme Court Opinions of Clarence Thomas, 1991-2006.
My research for that book consisted solely of reading and analyzing some 350 of those opinions. The Supreme Court Opinions of Clarence Thomas is thus, in effect, a judicial biography.
But while reading Justice Thomas’s opinions on most of the important provisions of the Constitution (and many federal statutes) I couldn’t help wondering about the man, not the justice, who for now 16 terms has been making statements such as:
- “The Court’s evident belief that it is qualified to pass on the ‘[m]ilitary necessity’…of the Commander in Chief’s decision to employ a particular form of force against our enemies is so antithetical to our constitutional structure that it simply cannot go unanswered.”
- “The Necessary and Proper Clause is not a warrant to Congress to enact any law that bears some conceivable connection to the exercise of an enumerated power.”
- “As serious as the [majority’s] disregard for history, is its disregard for well-established principles of statutory construction. The Court chooses not only the harshest interpretation of a criminal statute, but also the interpretation that maximizes federal criminal jurisdiction over state and local officials.”
- “Today’s decision, while protecting jurors, leaves defendants with less means of protecting themselves. * * * In effect, we have exalted the right of citizens to sit on juries over the rights of the criminal defendant, even though it is the defendant, not the jurors, who faces imprisonment or even death.”
- “In short, the view that the Establishment Clause precludes Congress from legislating respecting religion lacks historical provenance, at least based on the history of which I am aware.”
- “Yet today the fundamental principle that ‘the best test of truth is the power of the thought to get itself accepted in the competition of the market’…is cast aside in the purported service of preventing ‘corruption,’ or the mere ‘appearance of corruption.’ * * * Apparently, the marketplace of ideas is to be fully open only to defamers…nude dancers…pornographers…flag burners…and cross burners.”
- “Whether embodied in the Fourteenth Amendment or inferred from the Fifth, equal protection is not a license for courts to judge the wisdom, fairness, or logic of legislative choices.”
- “This deferential shift in phraseology [from “public use” to “public purpose”] enables the Court to hold, against all common sense, that a costly urban-renewal project whose stated purpose is a vague promise of new jobs and increased tax revenues, but which is also suspiciously agreeable to the Pfizer Corporation, is for ‘public use.’ I cannot agree. If such ‘economic development’ takings are for a ‘public use,’ any taking is, and the Court has erased the Public Use Clause from our Constitution….”
- “In my view, a use of force [by prison guards] that causes only insignificant harm to a prisoner may be immoral, it may be tortious [an actionable civil wrong], it may be criminal, and it may even be remediable under other provisions of the Federal Constitution, but it is not cruel and unusual punishment.”
- “…as a Member of this Court, I am not empowered to help petitioners and others similarly situated. My duty, rather, is to decide cases agreeably to the Constitution and laws of the United States. * * * And like Justice Stewart [dissenting in Griswold v. Connecticut], I can find neither in the Bill of Rights nor in any other part of the Constitution a general right of privacy…or as the Court terms it today, the ‘liberty of the person both in its spatial and more transcendent dimensions’….”
- And, making as his own words those of ex-slave Frederick Douglass: “[I]n regard to the colored people, there is always more that is benevolent, I perceive, than just, manifested toward us. What I ask for the negro is not benevolence, not pity, not sympathy, but simply justice. The American people have always been anxious to know what they shall do with us…I have had but just one answer from the beginning. Do nothing with us! Your doing with us has already played the mischief with us. Do nothing with us! If the apples will not remain on the tree of their own strength, if they are worm-eaten at the core, if they are early ripe and disposed to fall, let them fall! And if the negro cannot stand on his own legs, let him fall also. All I ask is, give him a chance to stand on his own legs! Let him alone!…[Your] interference is doing him positive injury.”
Time and again I wondered what the formative influences were that shaped the man who sat with dignity and confidence among his judicial peers, judges of such different backgrounds and upbringings.
Neither I nor the rest of America—the legal profession and laymen alike—have to wonder any longer. In My Grandfather’s Son Clarence Thomas the man, not the justice, has candidly told us how he became who he is today.
Clarence Thomas’s life may have taken him to the heights of the Supreme Court of the United States, but it began inauspiciously.
He was born in 1948. His place of birth, Pinpoint, Georgia, “too small to be called a town,” was a 25-acre peninsula on a tidal salt creek. The lives of the hundred-or-so inhabitants “were a daily struggle for the barest of essentials: food, clothing, and shelter.” Medical care was sparse, if available at all. Thomas’s home was a “shanty,” lacking a bathroom or running water. There was but a single light bulb. Newspapers stuffed into cracks were supposed to keep out the winter cold.
Because Thomas “had no idea that any other life was possible,” his days in Pinpoint, though “uncomplicated and unforgiving,” were “idyllic.” From reading the author’s descriptions of life there—“skipping oyster shells on the water,” catching fish—one gets the sense of Huck Finn amidst “Negro” rural poverty. It is a testament to Thomas’s sense of balance that he can recall those days fondly, despite how materially disadvantaged he objectively was.
At the age of six, Thomas and his brother moved to Savannah, into a single room occupied by his mother, a far cry from his home today in Northern Virginia. He characterizes where he lived, on the second floor of a tenement, as “the foulest kind of urban squalor.”
In the mid-Twentieth Century—not in Lincoln’s log cabin days—the future lawyer, federal administrative official, and Associate Justice of the Supreme Court, today not a stranger to the finery of the White House, lived with running water on only the floor below, an outdoor toilet with a cracked and rusty bowl and a rotten wooden seat, and the stench of raw sewage emanating from a broken pipe.
The youth who would one day break bread with the president of the United States could not afford sugar for his breakfast cornflakes. While his mother and brother slept in the room’s only bed, Clarence Thomas, who would one day own a 40-foot motor home, slept in a chair that “was too small, even for a six-year-old.”
The only source of heat was a kerosene stove, but on his mother’s paltry earnings they couldn’t afford to light it very often, and the child who today has more than satisfied all his material needs “was cold most of the time, cold and hungry.” Indeed, Thomas says, “Never before had I known the nagging, chronic hunger that plagued me in Savannah.” In a sentence of touchingly evocative prose, Thomas writes that it was “[h]unger without the prospect of eating and cold without the prospect of warmth—that’s how I remember the winter of 1955.”
The following summer the situation improved. Thomas’s mother found a two-bedroom apartment, which had a stove and refrigerator. “The outdoor toilet didn’t leak,” and Clarence had his own bed.
Although he doesn’t dwell on the impact his early childhood had on him, Thomas reveals and implies enough for the reader to form a pretty clear picture. The virtual non-existence of his father took a toll. “Idyllic” or not, his life in Pinpoint was a struggle for survival. Mere existence in Savannah was “hell,” and there his mother “worked to stay alive and keep us alive, nothing more.”
This chapter of Clarence Thomas’s life changed later in the summer of 1955 when his mother unceremoniously announced he and his brother were going to live with their grandparents. Two grocery bags were all that was required to pack the children’s earthly belongings, and off they went.
Thomas surmises that the main reason for the move was that his mother “simply couldn’t take care of two energetic young boys while holding down a full-time job that paid only ten dollars a week”—especially since “she refused to go on welfare.” His absent father made no contribution to Clarence and his brother’s care.
It was that move, Thomas becoming his “grandfather’s son,” laid on top of the imprints from his earlier years in Pinpoint and Savannah, that influenced Clarence Thomas’s life materially, spiritually, psychologically, and in every other way.
From what the author writes about his grandfather, Myers Anderson, “an ill-educated, modestly successful black man in the Deep South,” it’s clear that Thomas could have written an entire loving book about him. “In every way that counts, I am my grandfather’s son. I even called him Daddy because that was what my mother called him...He was dark, strong, proud, and determined to mold me in his image...He was the one hero in my life. What I am is what he made me.”
This is truly so, and it is in the next section of My Grandfather’s Son—writing about family history, his grandfather’s background, and the incredible material change in his and his brother’s circumstances—that the author provides context for the story of his growing up and describes the influence of Myers Anderson, “the greatest man I have ever known.”
This window into the young Clarence Thomas who became Associate Justice Clarence Thomas, provided by this part of My Grandfather’s Son, is utterly fascinating and candidly revealing. Regrettably, space considerations don’t allow me to do more here than merely touch on the most important facts.
While living with his grandfather and grandmother, Clarence Thomas actually had two lives.
One was in the city, where he attended a Catholic school run by Irish immigrant Sisters. His description of the institution uses words like “neat and clean,” where the students “were required to pick up trash, empty wastebaskets, sweep floors, and clear blackboards.” Classes were “orderly,” and corporal punishment was normal. The nuns treated all the students with respect. Importantly, “[t]he sisters also taught us that God made all men equal, that blacks were inherently equal to whites, and that segregation was morally wrong.” (My emphasis.)
Life in the city with his grandparents emphasized education (“I was never prouder than when I got my first library card”), discipline (“[W]e were never to ‘spute’ his word”), and hard work helping Daddy deliver fuel oil (“My fingers grew numb from the cold”).
Thomas’s other life with his grandparents was in the country, on an abandoned 60-acre farm that had been in his grandfather’s family for generations. On Christmas Day 1957, Daddy announced that they—he, Clarence, and his brother—were going to build a house there. According to the author, “[b]y springtime we’d finished building a simple four-room house, and we spent the summer building garages, a barn, and other facilities, putting up fences, and clearing the surrounding land with axes and bush hooks. Friends and family members had helped us lay the cinder blocks and put on the roof, but we did all the rest of the work ourselves, screening the porch and installing a secondhand tub, sink, and toilet….”
Clarence Thomas was now all of 10 years old.
After that, Thomas spent every summer there—“a place of torment, and salvation”—doing tasks that are difficult to imagine of a Supreme Court justice.
One task led to the next. Up before sunrise. Cutting trees, clearing land, laying fence, cutting grass, feeding animals, driving tractors, planting crops, spreading fertilizer, weeding fields, picking corn, cutting sugarcane, skinning animals, cleaning fish, throttling chickens, and, yes, slaughtering hogs (the details of which, as a vegan, I could have done without). All without gloves (Daddy considered it a weakness), and under the brutal and unrelenting Georgia sun—plagued by hot air in which swirled hordes of gnats, mosquitoes, and flies.
Was Daddy some kind of a Simon Legree?
On the morning the Thomas brothers moved into their grandfather’s house, he informed them that “[t]he damn vacation is over”—which caused Clarence to think “of the filthy outdoor toilet behind [his mother’s] old tenement and [try] to figure out what vacation he [Daddy] was talking about.”
From now on there would be “manners and behavior” and “rules and regulations.” “Our first task,” Thomas writes, “was to get a good education so that we could hold down a ‘coat-and-tie job,’ and he wouldn’t listen to any excuses for failure. ‘Old Man Can’t is Dead—I helped bury him,’ he said time and again.” (My emphasis.)
Daddy, Thomas writes, “loomed over us like a dark behemoth, instilling fear and demanding absolute adherence to all his edicts, however arbitrary they might appear to be.” But all in aid of one relentless goal: instilling in his grandsons independence, discipline, knowledge, and self-esteem.
About the farm, the old man cannily explained years later “that he’d decided to build a house and cultivate the family land in order to keep [Clarence’s brother] and me off the streets of Savannah during the hot weather months when nobody bought fuel oil [and thus there was no work to keep the boys occupied and out of trouble].”
Clarence Thomas’s early years in Pinpoint, in Savannah, and on the farm, as revealed in My Grandfather’s Son, are, as he acknowledges, what essentially formed him. The lessons and experiences of his childhood—the hurt of an absent father, the cost of a broken family, the desperation of rural poverty, the despair of urban squalor, the benefits of iron discipline, the self-esteem gained from hard work, the necessity of inculcated values—would be with him all the way to the Supreme Court.
In Catholic high school he studied hard, delivered fuel oil, and slept little. He experienced how “the peculiar institution of slavery had evolved into the peculiar institution of segregation,” and became aware of the civil rights movement that was beginning to swirl around him. Soon switching to a seminary where he was one of only two black students (and later the only one), he won a Latin prize, was instilled with academic discipline, and for a while suffered race-based insults and indignities.
Graduation summer found Thomas working as a janitor, groundskeeper, and general handyman.
In the fall he began studies at a religious college, but doubts arose about his vocation partly because of the Catholic Church’s unacceptable position on racial discrimination. When Martin Luther King was shot and a fellow student said “I hope the son of a bitch dies,” “[h]is brutal words finished off my vocation—and my youthful innocence about race.”
Thomas left the college, and told Daddy. “I had broken my promise [not to quit], and my failure to live up to my word became a burden on my conscience that I have never escaped.”
Myers Anderson threw his grandson out of the house: “I want you to leave,” he said. “Today, this day.” Thomas writes: “I fumbled for something more to say. Would he help me with college? ‘I’m finished helping you,’ he said. ‘You’ll have to figure it out yourself. You’ll probably end up like your no-good daddy or those other no-good Pinpoint Negroes.’ The set of his jaw and the steel in his voice left no doubt that his word was final. My life and fate were in my hands.”
Broke, Thomas moved in with his mother, found a job as a proofreader in a paper bag factory, endured racial insults, and started down the road to racial radicalization.
He writes of how the assassination of Robert Kennedy somehow crystallized his fear of white America, making him remember the frustrations and humiliations Daddy had suffered. He writes of the “rage that threatened to burn through the masks of meekness and submission behind which we hid our true feelings. It was like a beast that lay in wait to devour us.” In one of the most open, and probably most difficult, passages to have written, Thomas says:
I lost my battle with the beast in the summer of 1968. It isn’t hard to see why. My family, my faith, my vocation, the heroes who inspired me: all had been taken from me. Once they had helped keep the beast at bay. Now it slipped its leash and began to consume me from within. I began to fear that I would never climb out from the crushing weight of segregation. No matter how hard I worked or how smart I was, any white person could still say to me, “Keep on trying, Clarence, one day you will be as good as us,” knowing that he, not I, would be the judge of that. The more injustice I saw, the angrier I became, and the angrier I became, the more injustice I saw, not only at [the factory where he worked] but everywhere I worked.
Thomas saw Daddy as a victim of that injustice. His grandfather was religious, honest, patriotic, hard working. He had struggled to shelter his family, clothe them, and put food on the table. “Daddy didn’t complain,” Thomas writes, “but I couldn’t accept the way the white man had treated him. Somehow, some way, he and the others like him had to be avenged.”
Clarence Thomas had come a long way from playing barefoot in the bubble that was Pinpoint, to being a black college student determined to avenge the wrongs done to the Negro race in America.
The following fall he enrolled at Holy Cross College, obsessed over social problems, especially race, left the church, earned good grades, and began thinking about law school.
In passages about how some of his black classmates were in over their heads academically at Holy Cross, we can see the genesis of his affirmative-action jurisprudence, as applied in his Grutter dissent, where he rails against do-gooders who use underqualified blacks as guinea pigs for liberal academic social experiments.
Affirmative action was not the only thing Thomas began to notice about racial issues. He disagreed with a plan of black students to live separately. He wondered why the administration gave in to the plan with such alacrity. He began, perhaps inchoately, to realize that there were race hustlers out there, playing their own game. He questioned racial entitlements. Although, in his words, “[t]he beast of rage kept gnawing at my soul…the more I saw of radicalism, the more I doubted it had any answers to offer me…As much as I hated the injustices perpetrated against blacks in America, I couldn’t bring myself to hate my country, then or later.”
Participation in a “demonstration” that ended in tear gas was a catharsis, and Thomas’s radical days were about over.
In the rest of his time at Holy Cross Thomas studied voraciously, soaking up knowledge like the proverbial sponge, and through introspection and rigorous honesty let go of the rage.
His expanding intellect now stimulated by the works of Ayn Rand, Richard Wright, Ralph Ellison, and others, Thomas was accepted to Harvard Law School. He declined, perhaps because it was too conservative (!), in favor of Yale, which was smaller and perhaps more liberal.
Although Thomas considered himself far left-of-center and reluctantly voted for the “too conservative” George McGovern, some of classmate John Bolton’s (yes, that John Bolton) conservative arguments began to sink in.
Indeed, Thomas began to realize that he was being used by Yale: “in the years following Dr. King’s assassination, affirmative action (though it wasn’t yet called that) had become a fact of life at American colleges and universities, and before long I realized that those blacks who benefited from it were being judged by a double standard. As much as it stung to be told that I’d done well in the seminary despite my race, it was far worse to feel that I was now at Yale because of it.”
In My Grandfather’s Son, Thomas writes of how in his last semester at Yale Law School he realized that no job offers were forthcoming: “Now I knew what a law degree from Yale was worth when it bore the taint of racial preference. I was humiliated….” (My emphasis.) (Today, the Yale degree reposes in his basement, adorned with a 15-cent price sticker he removed from a cigar package.) Thomas says to this day that going to Yale was a “mistake.”
Eventually, Clarence Thomas was hired by then-Missouri Attorney General, later United States Senator, John Danforth. Although neither of them could have known it then, Clarence Thomas, the unstoppable son of his grandfather, was headed for the Supreme Court of the United States.
On the way there, the future justice’s life was full of highs and lows.
- Passing the Missouri bar examination, the first time out.
- Arguing an appeal before the Missouri Supreme Court only a few days after being admitted to practice. (Regrettably, Justice Thomas tells the reader only about his and the appellant’s lawyer’s sartorial splendor.)
- Learning, as a state criminal-appeals attorney, not “to assume that whites were responsible for all the woes of blacks, and [he] stopped throwing around the word ‘oppression’ so carelessly. [He] also grew more wary of unsupported generalizations and conspiracy theories, both of which had become indispensable features of radical argument.”
- Finding and returning a stranger’s lost wallet containing $600, when he himself was virtually broke, and as a result reaching a “defining moment: my needs, however great they might be, didn’t convert wrong to right or bad to good. That man’s wallet wasn’t mine, no matter how much I needed the money, or how rude he happened to be.”
- Ameliorating his racial beliefs while feeling alone in them, until finding Tom Sowell’s Race in America. “I felt like a thirsty man gulping down a glass of cool water. Here was a black man who was saying what I thought—and not behind closed doors, either, but in the pages of a book that had just been reviewed in a national newspaper. Never before had I seen my views stated with such crisp, unapologetic clarity: the problems faced by blacks in America would take quite some time to solve, and the responsibility for solving them would fall largely on black people themselves.” (Thomas promptly bought six copies of Race in America.)
- Relocating to Washington, D.C., after leaving a corporate law job with Monsanto in Missouri, and joining the senatorial staff of John Danforth.
- Meeting Tom Sowell and Walter Williams.
- Registering to vote in Maryland—and as a Republican, so he could vote for Ronald Reagan. “It was a giant step for a black man, but I believed it to be a logical one. I saw no good coming from an ever larger government that meddled, with incompetence if not mendacity, in the lives of its citizens, and I was particularly distressed by the Democratic Party’s ceaseless promises to legislate the problems of blacks out of existence. Their misguided efforts had already done great harm to my people, and I felt sure that anything else they did would compound the damage.”
- Meeting, fortuitously, a labor-relations lobbyist for the U.S. Chamber of Commerce, Virginia Bess Lamp, whom he married in 1987 and calls “a gift from God.”
- Leading two of his EEOC staffers (Ken Masugi and John Marini) “in discussions of the natural-law philosophy with which the Declaration of Independence, America’s first founding document, is permeated…We debated at length the implications of natural-law thinking, and speculated on how it might apply to contemporary political discussions. These arguments stimulated my mind in a way that no discussion of current events could possibly hope to equal.”
- Accepting nomination to, and being confirmed for, a seat on the United States Court of Appeals for the District of Columbia Circuit.
- Unsuccessfully trying to sell his blood to raise money while studying for the bar exam.
- In his first law job, living from paycheck to paycheck, with usually only about $10 left over, and thus having repeatedly to borrow money to tide over his family of three.
- A bank foreclosing on one of his student loans.
- Finally recognizing that Daddy’s “hardness had hardened my own heart. Eventually the chasm that separated us became too wide to cross. It is my fault, not his, that I never tried to bridge it. Only in the very last months of Daddy’s life did we share a solitary embrace, and by then it was too little, too late. Not a day passes that I don’t wish I had thrown open my arms sooner to that good man. Not until he was gone did I know how wrong I’d been to turn away from his love.”
- Discontent with his marriage, causing him to drink even more than he had formerly, and forcing him to conclude that in order to be happy he had to dissolve it. “I left my wife and child. It was the worst thing I’ve done in my life, worse even than going back on my promise to Daddy that I would finish my seminary studies and become a priest. I had broken the most solemn vow a man can make, the one that ends…as long as you both shall live. I still live with the guilt, and always will.”
- Having a difficult time as the assistant secretary for civil rights in the Department of Education “because of the public’s perception of the Reagan administration’s racial attitudes.”
- The close-in-time deaths of his grandfather and grandmother, while Thomas was trying to cope with his constant personal financial problems and reform the dysfunctional Equal Employment Opportunity Commission, of which he had recently become chairman. “Things kept on going from bad to worse. Running EEOC was a Sisyphean struggle: every time we put out one fire, another one started. My bills piled up, often unopened. I was nearly evicted from my apartment more than once….”
From the EEOC, Thomas became a judge on the D.C. Circuit. One day, about 15 months later, Thomas was secretly taken to the White House via a tunnel from the Treasury Department, and “escorted to a windowless office and left by myself for a few hours…As I waited, I tried to think of a way to convince President Bush to choose someone else [to replace Justice Thurgood Marshall on the Supreme Court]. The obvious reasons were my relative youth and inexperience—I’d just turned forty-three…and had been on the Court of Appeals for only fifteen months—but I knew these were mere excuses. Neither then nor at any other time did it occur to me that I could not do the work of a Supreme Court justice. I’d spent my whole life coping with one challenge after another, and I knew I could handle this one as well, the same way I’d learned Latin, passed the Missouri bar exam, briefed and argued numerous cases, and straightened out EEOC. The problem was that I still didn’t know whether I wanted to spend the rest of my life as a judge, and I was sure that I didn’t want to run the confirmation gauntlet again.”
A few days later President Bush called: “Judge, we’re still thinking about this Supreme Court thing. Could you come up to Kennebunkport tomorrow to have lunch with me and talk about it?” Thomas, apparently still ambivalent, went. That day, the president announced Clarence Thomas’s nomination.
As the President introduced me to America, I thought of my wife, my grandparents, and all the other people who had helped me along the way, especially the nuns of St. Benedict the Moor and St. Pius X. Then my thoughts drifted from those who had made this day possible to those who would now try to undo it. I recalled the ants I had watched as a child on the farm, building their hills one grain of sand at a time, only to have them senselessly destroyed in an instant by a passing foot. I’d pieced my life together the same way, slowly and agonizingly. Would it, too, be kicked callously into dust?
Most readers of this review know the answer to the question Clarence Thomas asked himself that sunny day in Maine, and I will not discuss it here except to make two points.
The first is that, as his book shows beyond doubt, the interest groups, politicians, and individuals who sought to defeat then-Judge Thomas’s nomination exposed themselves, by their contemptible conduct, as bigots, frauds, liars, and enemies of the democratic process.
Indeed, while their conduct was contemptible, they themselves were and remain, beneath contempt. That goes doubly for Anita Hill, a perjurious ingrate who willingly allowed herself to be used as the tool of corrupt forces, and who should have been disbarred for lying to the Senate Judiciary Committee, which constitutes unethical professional conduct. If for no other reason, and there are many others, My Grandfather’s Son needs to be read for the true story of Hill’s mendacious assault on a decent man who had more than once been her benefactor.
Second, as both his book and the public record make clear, during the Senate hearings on his confirmation Clarence Thomas was knowledgeable, responsive, dignified, and especially brave. Brave, because at the end he told the Judiciary Committee, the Senate, the American public, and the world that, given what had been done to him:
The Supreme Court is not worth it. No job is worth it. I am not here for that. I am here for my name, my family, my life, and my integrity. I think something is dreadfully wrong with this country, when any person, any person in this free country would be subjected to this…This is a circus. It is a national disgrace. And from my standpoint, as a black American, as far as I am concerned, it is a high-tech lynching for uppity blacks who in any way deign to think for themselves, to do for themselves, to have different ideas, and it is a message that, unless you kowtow to an old order, this is what will happen to you, you will be lynched, destroyed, caricatured by a committee of the U.S. Senate rather than hung from a tree.
Before ending this review I want to mention a disclosure that Justice Thomas makes in his book, something that probably only a few people knew before My Grandfather’s Son was published—something that, hopefully, will lay to rest for all time vicious, indeed racist, allegations that have dogged Justice Thomas for years.
After I had been on the Court for about five years, I raised the topic of my nomination with Boyden Gray [counsel to the President] over lunch. He had always been candid with me, so I asked him a straight question, knowing that he would give me an equally straight answer: was I picked because I was black? Boyden replied that in fact my race had actually worked against me. The initial plan, he said, had been to have me replace Justice Brennan [who had retired before Justice Marshall retired] in order to avoid appointing me to what was widely perceived as the Court’s “black” seat, thus making the confirmation even more contentious. But Justice Brennan retired earlier than expected, and everyone in the White House agreed that I needed more time on the D.C. Circuit in order to pass muster as a Supreme Court nominee.
There is much more to say about My Grandfather’s Son. It is honest, touchingly introspective, and extremely self-revelatory. Much of Justice Thomas’s prose is poetic. His autobiography—warts and all—shows Clarence Thomas to be a fine human being, doting father, loving husband, patriotic American, and brave man. Indeed, when one finishes the justice’s autobiography, one feels as if one is saying goodbye to a new friend, with whom one has traveled the long road from Pinpoint, Georgia, to the marble halls of the United States Supreme Court.
And at that Court, one wonders whether in his chambers Clarence Thomas sometimes turns to a proudly displayed bust of his grandfather and shares a thought with him—and then hears his grandfather reminding him, yet again, that “Old Man Can’t is Dead—I helped bury him.”
Posted on: Wednesday, October 10, 2007 - 19:02
SOURCE: WaPo (10-7-07)
Among so much about American politics that can impress or depress a friendly transatlantic observer, there's nothing more astonishing than this: Why on Earth should Sen. Hillary Rodham Clinton be the front-runner for the presidency?
She has now pulled well ahead of Sen. Barack Obama, both in polls and in fundraising. If the Democrats can't win next year, they should give up for good, so she must be considered the clear favorite for the White House. But in all seriousness: What has she ever done to deserve this eminence? How could a country that prides itself on its spirit of equality and opportunity possibly be led by someone whose ascent owes more to her marriage than to her merits?
We all, nations as well as individuals, have difficulty seeing ourselves as others see us. In this case, I doubt that Americans realize how extraordinary their country appears from the outside. In Europe, the supposed home of class privilege and heritable status, we have abandoned the hereditary principle (apart from the rather useful institution of constitutional monarchy), and the days are gone when Pitt the Elder was prime minister and then Pitt the Younger. But Americans find nothing untoward in Bush the Elder being followed by Bush the Younger.
At a time when Americans seem to contemplate with equanimity up to 28 solid years of uninterrupted Bush-Clinton rule, please note that there are almost no political dynasties left in British politics, at least on the Tory side. Admittedly, Hilary Benn, the environmental secretary, is the fourth generation of his family to sit in Parliament and the third to serve in a Labor Party cabinet. But England otherwise has nothing now to match the noble houses of Kennedy, Gore and Bush. ...
Posted on: Tuesday, October 9, 2007 - 20:00
SOURCE: Altercation (Blog) (10-9-07)
Look, ladies and gentlemen, either medical care saves lives and prevents illness or it doesn't. I'd argue that it does, and I think even George W. Bush might agree. Granting that, poor children who have access to it are less likely to die from serious sickness and less likely to contract various preventable diseases and maladies if they do have access to such care. If they don't have such access, they will more likely "get sick and die." This strikes me again as a statistical certainty and again, if you could get Mr. Bush to give a straight answer on the question, I don't see how he could disagree either. Now, given that we know what the result will be of refusing to allow states to cover more poor children with health care -- and remember, these are the children who are most vulnerable to sickness in the first place -- that there will be more sickness and death on the part of these same uncovered children, just what are Mr. Bush's own stated reasons for vetoing the program? They can be found in Bush's own words, here and here, and they all involve the prevention of what he fears will be a slippery slope to "socialized medicine" to which he objects entirely and unashamedly on ideological grounds.
Note that I do not claim and never said that George W. Bush wants poor kids to get sick and die, per se. I don't think he does. I said only that he prefers this to signing the SCHIP bill, and in doing so, demonstrated his commitment to his own stated (but rarely followed) ideology....
Posted on: Tuesday, October 9, 2007 - 18:19
SOURCE: Letter to the editor of the NYT (10-9-07)
David Brooks has gracefully summarized the Burkean conservative temperament, but his analysis is flawed. There is absolutely no connection between Edmund Burke, who wrote at the end of the 18th century, and contemporary American conservatives. Their historic roots go back to the beginning of the 20th century and the passage of legislation protecting children, women, workers and consumers from the dangerous excesses in business practices in the era of the robber baron.
Then Republicans wrapped themselves in the rhetoric of freedom to repel interference with their aggressive profit-seeking. The religious conservatives migrated to the Republican Party much later in the 1960s, when many Americans abandoned strict sexual mores in the name of a different kind of freedom.
Hardly any American then or now would side with Burke in his famous exchange with Thomas Paine. Burke, revealing his deep respect for monarchy, lamented the treatment of Marie Antoinette by the French revolutionaries, to which Paine, in reference to the suffering of the French people, replied, “He pities the plumage, but forgets the dying bird.”
Posted on: Tuesday, October 9, 2007 - 17:21