Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: http://electronicintifada.net (6-18-07)
The Gaza Strip is a little more than two percent of Palestine. This small detail is never mentioned when the Strip is in the news, nor has it been mentioned in the present Western media coverage of the dramatic events unfolding in Gaza in the last few weeks. Indeed, it is such a small part of the country that it never existed as a separate region in the past. Gaza's history before the Zionization of Palestine was not unique, and it was always connected administratively and politically to the rest of Palestine. Until 1948 it was, for all intents and purposes, an integral and natural part of the country. As one of Palestine's principal land and sea gates to the world, it tended to develop a more flexible and cosmopolitan way of life, not dissimilar to other gateway societies in the Eastern Mediterranean in the modern era. This location near the sea and on the Via Maris to Egypt and Lebanon brought with it prosperity and stability, until that life was disrupted and nearly destroyed by the Israeli ethnic cleansing of Palestine in 1948.
Between 1948 and 1967, Gaza became a huge refugee camp, restricted severely by the respective Israeli and Egyptian policies: both states disallowed any movement out of the Strip. Living conditions were already harsh then, as the victims of the 1948 Israeli policies of dispossession doubled the number of inhabitants who had lived there for centuries. On the eve of the Israeli occupation in 1967, the catastrophic nature of this enforced demographic transformation was evident all over the Strip. This once pastoral coastal part of southern Palestine became, within two decades, one of the world's densest areas of habitation, without any adequate economic infrastructure to support it.
The first twenty years of Israeli occupation at least allowed some movement outside an area that had been closed off as a war zone in the years 1948 to 1967. Tens of thousands of Palestinians were permitted to join the Israeli labor market as unskilled and underpaid workers. The price Israel demanded for this slavery market was a total surrender of any national struggle or agenda. When this demand was not met, the 'gift' of laborers' movement was denied and abolished. All the years leading up to the Oslo accord in 1993 were marked by an Israeli attempt to construct the Strip as an enclave, which the Peace Camp hoped would be either autonomous or part of Egypt, and which the Nationalist camp wished to include in the Greater Eretz Israel it dreamed of establishing instead of Palestine.
The Oslo agreement enabled the Israelis to reaffirm the Strip's status as a separate geo-political entity -- not just outside of Palestine as a whole, but also cut apart from the West Bank. Ostensibly, both the Gaza Strip and the West Bank were under the Palestinian Authority, but any human movement between them depended on Israel's good will, a rare Israeli trait that all but disappeared when Benjamin Netanyahu came to power in 1996. Moreover, Israel held, as it still does today, the water and electricity infrastructure. Since 1993 it has used, or rather abused, this possession to ensure, on the one hand, the well-being of the Jewish settler community there and, on the other, to blackmail the Palestinian population into submission and surrender. The people of the Gaza Strip have thus vacillated over the last sixty years between being internees, hostages and prisoners in an impossible human space.
It is within this historical context that we should view the violence raging today in Gaza and reject the reference to the events there as a campaign in the 'war against terror,' an instance of Islamic revivalism, further proof of al-Qaeda's expansionism, a seditious Iranian penetration into this part of the world or another arena in the dreaded Clash of Civilizations (I have picked here only a few of the many labels frequently used in the Western media to describe the present crisis in Gaza).
The origins of the mini civil war in Gaza lie elsewhere. The recent history of the Strip -- sixty years of dispossession, occupation and imprisonment -- inevitably produced internal violence such as we are witnessing today, just as it produced other unpleasant features of life lived under such impossible conditions. In fact, it would be fair to say that the violence, and in particular the internal violence, is far less than one would have expected given the economic and social conditions created by the genocidal Israeli policies of the last six years.
Power struggles among politicians who enjoy the support of military outfits are indeed a nasty business that victimizes the society as a whole. Part of what goes on in Gaza is such a struggle, between politicians who were democratically elected and those who still find it hard to accept the verdict of the public. But this is hardly the main struggle. What unfolds in Gaza is a battle between America's and Israel's local proxies -- most of whom are unintentionally such proxies, but who nonetheless dance to Israel's tune -- and those who oppose them. The opposition that has now taken over Gaza did so, alas, in a way that one would find very hard to condone or cheer. It is not Hamas' Palestinian vision that is worrying, but rather the means it has chosen to achieve it, which we hope will not take root or be repeated. To its credit, one should openly say that the means used by Hamas are part of an arsenal that enabled it in the past to be the only active force that at least tried to stop the total destruction of Palestine; the way that arsenal is used now is less credible and hopefully temporary.
But one cannot condemn the means if one does not offer an alternative. Standing idle while the American-Israeli vision unfolds -- strangling the Strip to death, cleansing half of the West Bank of its indigenous population and threatening the rest of the Palestinians, inside Israel and in the other parts of the West Bank, with transfer -- is not an option. It is tantamount to "decent" people's silence during the Holocaust.
We should not tire of mentioning the alternative in the 21st century: BDS -- Boycott, Divestment and Sanctions -- as an emergency measure, far more effective and far less violent, in opposing the present destruction of Palestine. And at the same time we should talk openly, convincingly and efficiently of creating the geography of peace: a geography in which abnormal phenomena such as the imprisonment of a small portion of the land would disappear. In the vision we should push forward, there will no longer be a human prison camp called the Gaza Strip, where some armed inmates are easily pitted against each other by a callous warden. Instead, that area would return to being an organic part of an Eastern Mediterranean country that has always offered its best as a meeting point between East and West.
Never before has the twofold strategy of BDS and a one-state solution shone so clearly, in the light of the Gaza tragedy, as the only alternative way forward. If any of us are members of Palestine solidarity groups, Arab-Jewish dialogue circles or part of civil society's effort to bring peace and reconciliation to Palestine, this is the time to put aside all the false strategies of coexistence, road maps and two-state solutions. They have been, and still are, sweet music to the ears of the Israeli demolition team that threatens to destroy what is left of Palestine. Beware especially of Diet Zionists or Closet Zionists, who recently joined the campaign, in Britain and elsewhere, against the BDS effort -- like those enlightened pundits who used liberal organs in the United Kingdom, such as The Guardian, to explain to us at length how dangerous the proposed academic boycott of Israel is. They have never expended so much time, energy or words on the occupation itself as they did in the service of the ethnic cleansing of Palestine. UNISON, Britain's largest public service trade union, must not be deterred by this backlash, and it should follow those brave academics who endorsed the debate on the boycott, as should Europe as a whole: not only for the sake of Palestine and Israel, but also if it wishes to bring closure to the Holocaust chapter in its history.
And a final small portion of food for thought. There are quite a few Jewish mothers and wives in the Gaza Strip -- some sources within Gaza say up to 2,000 -- married to local Palestinians and parents to their children. There are many more Jewish women who have married Palestinians in the Palestinian countryside: an act of desegregation that both political elites find difficult to admit, digest or acknowledge. If, despite the colonization, occupation, genocidal policies and dispossession, such harmonies of love and affection were possible, imagine what could happen if these criminal policies and ideologies were to disappear. When the Wall of Apartheid is removed and the electric fences of Zionism dismantled, Gaza will become once more a symbol of Fernand Braudel's coastal society, able to fuse different cultural horizons and offer a space for new life instead of the war zone it has become in the last sixty years.
Posted on: Thursday, July 5, 2007 - 18:04
SOURCE: NY Post (7-5-07)
THE arrest of seven doctors in the attempted British terror bombings has shocked many people. Sadly, it shouldn't.
All seven are Muslims working at government-financed hospitals, their salaries paid by the British taxpayer. Dr. Muhammad Hanif practiced at Halton Hospital in Runcorn, Cheshire; Dr. Muhammad Asha, at the North Staffordshire NHS Trust's University Hospital.
So can doctors be terrorists? Can people who are financially well-off be terrorists? Absolutely. It is ideology, after all, that turns people into terrorists - not suffering.
Indeed, the No. 2 leader of al Qaeda is Dr. Ayman al-Zawahiri - who was previously a stalwart of the Muslim Brotherhood in his native Egypt. Zawahiri made the connections that led to his role in al Qaeda when he went to Afghanistan in 1980 to provide medical care for jihadists fighting the Soviets. Later, he was a key architect of the 1998 bombings of the U.S. embassies in Kenya and Tanzania, which killed hundreds of innocents, and of the 9/11 attacks.
Other Islamist MDs include two recent leaders of the Palestinian terror group Hamas. Abdel Aziz Rantisi graduated first in his class in pediatrics from an Egyptian university; he succeeded the slain cleric Ahmad Yassin as Hamas' chief in March 2004 - only to perish himself a month later. After him came Mahmoud al-Zahar (who had helped found Hamas in 1987), an Egyptian-trained surgeon who today is the most powerful man in the Gaza Strip.
Even before the rise of Islamism, there were secular doctor-terrorists. Dr. George Habash, a founder and for three decades leader of the Popular Front for the Liberation of Palestine (PFLP), is still active, though now enjoying semi-retirement in Syria. The PFLP was one of the most bloodthirsty terrorist groups in history, with exploits such as a 1978 attack at Orly Airport in Paris and the 2002 shooting of five people (including a mother and three children) in Itamar on the West Bank.
Dr. Waddi Haddad was Habash's sometime partner before forming an organization specializing in terrorist operations, the Popular Front for the Liberation of Palestine-External Operations. Among its more spectacular actions was a 1977 hijacking of a Lufthansa flight en route from Mallorca to Frankfurt.
Consider, too, the kind of people who become doctors - relatively intelligent, well-organized, hard-working. These are valuable skills both in leading terrorist groups and in carrying out operations.
Yes, we in the West expect the study of medicine to produce humanists - men and women who view all life as sacred, dedicated to broad service for humanity.
But it is also an intellectual endeavor - exposing one to other intellectual currents in the surrounding world. And in much of the Muslim world, the strongest currents have been the various extremisms that promote terrorism.
And the doctrines of radical Arab nationalism and Islamism (like the one that motivated those totalitarians working in concentration camps six decades ago) view their enemies as sub-human. Toward them, those trained in healing are quite willing to become doctors of death.
Posted on: Thursday, July 5, 2007 - 17:19
SOURCE: NYT (7-4-07)
IT’S just a red stake stuck in an anonymous spread of pasture 20 miles north of Belle Fourche, S.D., a rodeo town of about 5,000 inhabitants. But it is also the geographical center of the United States of America, as defined by the National Geodetic Survey in 1959. Or at least it is for now.
To find it, says Teresa Schanzenbach, executive director of the town’s chamber of commerce, “you have to go into a ditch, cross a barbed-wire fence and maneuver amongst the cactus and cow pies.” So in August the center of the nation is to be moved 20 miles south, and an eye-catching granite monument will be unveiled in Belle Fourche itself so that visitors can see it more easily.
This may seem like a high-handed way to treat both geography and the United States itself. Certainly the implications reach well beyond Belle Fourche. Is the balance of the nation going to be affected? Will there be a seismic tilt towards Canada? And can we be sure that the center won’t shift again? History certainly suggests that it will — and within the foreseeable future.
The event that made Belle Fourche the focal point of the nation’s land mass was the admission of Hawaii and Alaska in 1959. Never have the frontiers of the United States remained fixed for so long....
... [B]efore the inhabitants of Belle Fourche invest too much money in a permanent monument, they should consider whether the age of expansionism is indeed over. In particular they might cast a wary glance at the Security and Prosperity Partnership of North America, which was created in 2005 by President Bush and counterparts in Mexico and Canada....
History is resuming its normal course. It may be slower than invasion or purchase, but the regulations and agencies needed to enforce them will pull Canada and Mexico within the reach of United States jurisdiction as effectively as any means that Seward envisioned. Meanwhile, the citizens of Belle Fourche would be well advised to make the new geographical center of the United States transportable. It may eventually need to travel to somewhere near Omaha.
Posted on: Wednesday, July 4, 2007 - 13:51
SOURCE: Life During Wartime (7-4-07)
Joshua Brown, co-director of the New Media Lab, is also Executive Director of the Center for Media and Learning/American Social History Project (ASHP) at the CUNY Graduate Center. Before assuming those duties in 1998, Brown served as CML/ASHP's Creative Director, supervising the conception and production of the organization's varied and award-winning "old" and new media projects since 1981. Among his past and current ASHP/CML credits are: co-director of the ten-part "Who Built America?" documentary series; co-author and visual editor of the Who Built America? textbook and CD-ROMs; creative director of Liberty, Equality, Fraternity, a digital history of the French Revolution; co-executive producer of History Matters, a U.S. history Web site; and co-executive producer and co-writer of The Lost Museum, a Web-based 3-D exploration/archive of P. T. Barnum's American Museum.
Brown, who holds a Ph.D. in American history from Columbia University, is the author of Beyond the Lines: Pictorial Reporting, Everyday Life, and the Crisis of Gilded Age America (University of California Press, 2002), co-editor of History from South Africa: Alternative Visions and Practices (Temple University Press, 1993), and author of numerous articles and essays on visual culture and U.S. history. In addition, his cartoons and illustrations appear in popular and scholarly publications as well as digital media. Brown received Columbia University's 1994 Bancroft Dissertation Prize as well as grants from the American Council of Learned Societies and the National Endowment for the Humanities.
Posted on: Wednesday, July 4, 2007 - 10:16
SOURCE: New York Sun (7-3-07)
When Dwight D. Eisenhower dedicated the Islamic Center in Washington, D.C., in June 1957, his 500-word talk effused good will ("Civilization owes to the Islamic world some of its most important tools and achievements") even as the American president embarrassingly bumbled (Muslims in the United States, he declared, have the right to their "own church"). Conspicuously, he included nary a word about policy.
Exactly fifty years later, standing shoeless, George W. Bush rededicated the center last week. His 1,600-word speech also praised medieval Islamic culture ("We come to express our appreciation for a faith that has enriched civilization for centuries"), but he knew a mosque from a church – and he had more on the agenda than flattery.
Most arresting, surely, was his statement that "I have invested the heart of my presidency in helping Muslims fight terrorism, and claim their liberty, and find their own unique paths to prosperity and peace." This cri du coeur signaled how Mr. Bush understands to what extent actions by Muslims will define his legacy.
Should they heed his dream "and find their own unique paths to prosperity and peace," then his presidency, however ravaged it may look at the moment, will be vindicated. As with Harry S Truman, historians will acknowledge that he saw further than his contemporaries. Should Muslims, however, be "left behind in the global movement toward prosperity and freedom," historians will likely judge his two terms as harshly as his fellow Americans do today.
Of course, how Muslims fare depends in large part on the future course of radical Islam, which in turn depends in some part on its understanding by the American president. Over the years, Mr. Bush has generally shown an increased understanding of this topic. He started with platitudinous, apologetic references to Islam as the "religion of peace," using this phrase as late as 2006. Early on he even lectured Muslims on the true nature of their religion, a presumptuous ambition that prompted me in 2001 to dub him "Imam Bush."
As his understanding grew, Mr. Bush spoke of the caliphate, "Islamic extremism" and "Islamofascism." What he euphemistically called the "war on terror" in 2001, by 2006 he referred to with the hard-hitting "war with Islamic fascists." Things were looking up. Perhaps official Washington did understand the threat, after all.
But such analyses roused Muslim opposition and, as he approaches his political twilight, Mr. Bush has retreated to safer ground, reverting last week to decayed tropes that tiptoe around any mention of Islam. Instead, he spoke inelegantly of "the great struggle against extremism that is now playing out across the broader Middle East" and vaguely of "a group of extremists who seek to use religion as a path to power and a means of domination."
Worse, the speech drum-rolled the appointment of a U.S. special envoy to the Organization of the Islamic Conference, directing this envoy to "listen to and learn from" his Muslim counterparts. But the OIC is a Saudi-sponsored organization promoting the Wahhabi agenda under the trappings of a Muslim-only United Nations. As counterterrorism specialist Steven Emerson has noted, Bush's dismal initiative stands in "complete ignorance of the rampant radicalism, pro-terrorist, and anti-American sentiments routinely found in statements by the OIC and its leaders."
Sitting in the audience at the Islamic Center on June 27, 2007, three senior Bush administration staffers wore makeshift hijabs: Fran Townsend (far left), Assistant to the President for Homeland Security and Counterterrorism, NSC Senior Director for European Affairs Judy Ansley (left), and Under Secretary of State for Public Diplomacy and Public Affairs Karen Hughes (right).
In brief, it feels like "déjà vu all over again." As columnist Diana West puts it, "Nearly six years after September 11 — nearly six years after first visiting the Islamic Center and proclaiming ‘Islam is peace' — Mr. Bush has learned nothing." But we now harbor fewer hopes than in 2001 that he can still learn, absorb, and reflect an understanding of the enemy's Islamist nature.
Concluding that he has basically failed to engage this central issue, we must instead look to Mr. Bush's potential successors, hoping they will return to his occasional robustness and again take up those difficult concepts of Islamic extremism, Shariah, and the caliphate. Several Republicans – Rudy Giuliani, Mitt Romney, and (above all) Fred Thompson – are doing just that. Democratic candidates, unfortunately, prefer to remain almost completely silent on this topic.
Almost thirty years after Islamists first attacked Americans, and on the eve of three major attempted terrorist attacks in Great Britain, the president's speech reveals how confused Washington remains.
This article is reprinted with the permission of Daniel Pipes. It first appeared in the New York Sun.
Posted on: Tuesday, July 3, 2007 - 18:56
SOURCE: The Politico (7-2-07)
It may be admirable to model oneself on a successful figure from the political past. Who wouldn't want to write like Lincoln, transform policy like FDR and inspire his followers like Reagan and Debs? But to equal the accomplishments of one's idol is a far more difficult task.
History, after all, has a way of changing the terrain and rules of the political game, frustrating those who think they know how to win it.
During George W. Bush's first presidential campaign, Karl Rove told reporters he aimed to be the Mark Hanna of the 21st century. Like Hanna, the wealthy Cleveland industrialist who died in 1904, Rove would create a durable, conservative Republican majority by appealing to groups that had previously leaned Democratic -- particularly Hispanics and white Catholics.
For Rove, the folksy Bush would serve as the modern parallel to the genial William McKinley, Hanna's protégé and close friend. Both men were skillful, according to Rove, at bridging partisan divides and represented a softer, more compassionate image of their party than did the ideologues who preceded them.
During Bush's first term, Rove seemed to have a decent chance of achieving his objective. The president's resolute response to the terrorist attacks of September 2001 helped guide the GOP to victory in the 2002 midterm election and convinced most Americans to back the invasion of Iraq months later.
Although that support had dwindled by 2004, Bush was still a more credible commander in chief than John F. Kerry was, and his opposition to abortion and gay marriage helped him win the Catholic vote over his Catholic adversary.
The reversal of fortunes the president and his party have since endured has been widely blamed on the botched adventures in Iraq and New Orleans, as well as on the stink of corrupt dealings on high. But Rove has been unable to devise a way out of the mess, and his failure to follow the example of Hanna is part of the reason.
Like Rove, Hanna was a lightning rod for his Democratic critics. Hanna's cartoon portrait as a fat, pompous ass dressed in a tight suit checked with dollar signs -- created in 1896 by Homer Davenport of the New York Journal -- remains one of the more familiar images from that era.
But Hanna was one plutocrat who knew how to build coalitions with the common folk. He engineered McKinley's nomination in 1896 with the slogan "The People Against the Bosses." Then, during the general election campaign, he and his candidate wooed working-class voters with a pledge to guarantee "a full dinner pail" based on high tariffs that boosted their wages as well as the profits of their industrial employers.
In 1900, Hanna became the first president of the National Civic Federation, a group that tried to mediate conflicts between capital and labor and included AFL leader Samuel Gompers and social reformer Jane Addams on its board.
Hanna also persuaded most Republicans to ignore or downplay prohibition, the key "social issue" of the day. During the Gilded Age, the GOP's alliance with temperance activists, especially in the Midwest, had limited its appeal to many European immigrants.
But neither McKinley nor the two Republican presidents who followed him tried to outlaw the liquor business, which enabled them to win the votes of many German-Americans and Irish-Americans, key voting blocs whose members were fond of their saloons and beer gardens.
What is more, the master strategist did not work only behind the scenes. Hanna won three separate Ohio elections to the U.S. Senate, which -- in the days before popular elections to that body -- testified to his firm hold over the state legislature.
Hanna didn't live to see the fruition of his grand design. But using the pragmatic approach he pioneered, Republicans won seven of the nine presidential elections from 1896 to the Great Depression and controlled both houses of Congress for all but eight years during that span. The party was dominant or competitive everywhere outside the Deep South.
Rove knows this history as well as the handful of scholars who write about it do. Yet the strategy he has urged on his president and party is opposite that of Hanna. Instead of broadening his GOP's base in a divided nation, Rove has relied on Big Business and the Christian right to rally their loyalists.
Instead of enacting programs that could have given substance to the slogan of "compassionate conservatism," he led Republicans to pass big tax cuts, oppose a hike in the minimum wage and, until recently, deny the facts about global warming.
A strict adherence to conservative gospel buoyed Republicans as long as the administration seemed a reliable defender of national security. But that polarizing stance has made them vulnerable as Bush's incompetence has outweighed his toughness. It would have been shrewder to imitate Hanna's method of triumphing through compromise.
Granted, Rove's ambition is harder to fulfill. It may not be possible for any political party to achieve the domination Hanna's GOP enjoyed for more than a generation. The percentage of independent voters is far higher now, and it's harder to enforce partisan discipline at a time when candidates raise funds for themselves and can't avoid taking a stand on every controversial issue.
But the affable Rove has all but ensured that he won't match the feat of his grimmer role model. And Bush's successor, whoever he or she may be, will likely repudiate his brand of politics instead of seeking to emulate it.
Posted on: Tuesday, July 3, 2007 - 18:46
SOURCE: Special to HNN (7-3-07)
I saw New Orleans
Saw the people left for dead
I heard every bald faced lie
You politicians said
I've seen it for myself
And you can't fool your sight
Well we'd better make a change
And we'd better start tonight
Mavis Staples, "My Own Eyes," from the 2007 album We'll Never Turn Back
In America's long and painful struggle against racial injustice, certain visual images have assumed powerful historic significance because they highlight, in simple and dramatic form, the cruelty and inhumanity of America's treatment of its black population. One of these, coming at the very dawn of the post-war Civil Rights movement, was the battered and swollen face of 14-year-old Emmett Till, put on display in a Chicago church after his mangled body was recovered from a river near the Mississippi town where he was cruelly murdered for comments he made to a white woman in a small country store. To the tens of thousands of mourners who walked by Till's casket, and the millions who saw his face on the cover of Jet magazine, Till's brutally battered visage was a visual indictment of a Jim Crow regime willing to resort to unspeakable acts of violence to intimidate African Americans and deprive them of their rights and dignity.
Eight years later, in the spring of 1963, Birmingham public safety commissioner Bull Connor provided another indelible image of America's brutal treatment of its Black population when he unleashed water hoses and police dogs on a peaceful group of African American high school and middle school students marching to desegregate Birmingham's downtown business district. One image in particular, of a German Shepherd biting the stomach of a 15-year-old Black male demonstrator, was disseminated by wire services around the world and has come to symbolize, to billions of people worldwide, the cruelty and irrationality of America's system of racial segregation, which forced many African-Americans to live as second-class citizens.
Now, thanks to Hurricane Katrina, the world has been given a new and equally haunting image of racial inequality in America: that of tens of thousands of poor Black people left stranded amidst rising flood waters outside the New Orleans Superdome or on the roofs of houses. That you could have so many people abandoned by their government, in the richest nation in the world, after a national catastrophe everyone knew was coming, was shocking in itself. But that nearly all of these people were Black and poor showed that 40 years after the Civil Rights movement, the nation was still profoundly segregated by race and class.
The reaction of George Bush to this catastrophe would forever tarnish his presidency. Having starved the key federal agencies responsible for flood prevention and emergency management, and having sent the bulk of the National Guard to fight the war in Iraq, Bush had no emergency response team in place capable of rescuing people quickly when the levees broke in New Orleans. But equally significantly, Bush displayed no empathy or emotional identification with the tens of thousands of Black people and poor people stranded in New Orleans. He refused to change his plans and take personal charge of relief efforts until five days after the levees broke, and seemed perplexed that the images of desperation from the flooded city were provoking worldwide outrage.
Blinded by his own conviction that responsibility for environmental issues and the relief of poverty was best left in private hands or devolved onto local governments, Bush failed to see that most Americans -- even those who voted Republican -- expected their federal government to take the lead after a natural catastrophe the same way it did in the face of a terrorist attack. They also expected that relief efforts would be timely, efficient and competently led, and would rescue people based on level of need, not on their economic status, racial background or ability to pay. Even to many conservatives, it was embarrassing to have America paraded before the world as a racist, classist nation so reliant on private transportation in an emergency that middle-class white residents escaped to safety while black and poor people were left to die.
As in the case of the photos of Emmett Till and of the Birmingham teenage protesters attacked by hoses and dogs, the visual images of New Orleans's stranded citizens provided an inescapable reminder -- and for some a wake-up call -- of how powerful racial divisions in America were, and how much cruelty those divisions could inspire.
But they also served to demystify the Bush presidency and the entire Republican philosophy that had been ascendant for the past 25 years. Because what took place in New Orleans, unlike what took place in Webb, Mississippi, or Birmingham, Alabama, was not the result of actions by private individuals and government officials designed to preserve white supremacy, but the result of a systematic weakening of public institutions by conservatives who claimed their motives were non-racial. If you eviscerate the Army Corps of Engineers and the Federal Emergency Management Agency; if you refuse to fund public transportation, public housing, and public health care; if you have no plan to help indigent people in case of an emergency, then when a crisis like Katrina takes place, you will see, WITH YOUR OWN EYES, how truly divided and unequal American society has become.
And your eyes won't be lying. In a recent New York Times column entitled "Enough is Enough" (June 30, 2007), Bob Herbert pointed out that what Katrina revealed in New Orleans is only one example of a problem that is truly national in scope:
"Have you looked at the public schools lately? Have you looked at the prisons? Have you looked at the legions of unemployed blacks roaming the neighborhoods of big cities across the country? These jobless African-Americans, so many of them men, are so marginal in the view of the wider society, so insignificant, so invisible, they aren't even counted in the government's official jobless statistics."
Now that we have seen these divisions, what are we going to do? I, for one, am for following the lead of Mavis Staples, whose new album "We'll Never Turn Back" is the most inspiring collection of freedom songs I have heard in the last ten years:
We'd better make a change
And we'd better start tonight
Posted on: Tuesday, July 3, 2007 - 18:45
SOURCE: NYT (7-3-07)
A PROMINENT American once said, about immigrants, “Few of their children in the country learn English... The signs in our streets have inscriptions in both languages ... Unless the stream of their importation could be turned they will soon so outnumber us that all the advantages we have will not be able to preserve our language, and even our government will become precarious.”
This sentiment did not emerge from the rancorous debate over the immigration bill defeated last week in the Senate. It was not the lament of some guest of Lou Dobbs or a Republican candidate intent on wooing bedrock conservative votes. Guess again.
Voicing this grievance was Benjamin Franklin. And the language so vexing to him was the German spoken by new arrivals to Pennsylvania in the 1750s, a wave of immigrants whom Franklin viewed as the “most stupid of their nation.”
About the same time, a Lutheran minister named Henry Muhlenberg, himself a recent arrival from Germany, worried that “the whole country is being flooded with ordinary, extraordinary and unprecedented wickedness and crimes. ... Oh, what a fearful thing it is to have so many thousands of unruly and brazen sinners come into this free air and unfenced country.”...
Scratch the surface of the current immigration debate and beneath the posturing lies a dirty secret. Anti-immigrant sentiment is older than America itself. Born before the nation, this abiding fear of the “huddled masses” emerged in the early republic and gathered steam into the 19th and 20th centuries, when nativist political parties, exclusionary laws and the Ku Klux Klan swept the land.
As we celebrate another Fourth of July, this picture of American intolerance clashes sharply with tidy schoolbook images of the great melting pot. Why has the land of “all men are created equal” forged countless ghettoes and intricate networks of social exclusion? Why the signs reading “No Irish Need Apply”? And why has each new generation of immigrants had to face down a rich glossary of now unmentionable epithets? Disdain for what is foreign is, sad to say, as American as apple pie, slavery and lynching.
That fence along the Mexican border now being contemplated by Congress is just the latest vestige of a venerable tradition, at least as old as John Jay’s “wall of brass.” “Don’t fence me in” might be America’s unofficial anthem of unfettered freedom, but too often the subtext is, “Fence everyone else out.”
Posted on: Tuesday, July 3, 2007 - 16:34
SOURCE: Informed Comment (Blog) (7-3-07)
In his son's administration, CIA covert operative Valerie Plame's name was leaked to the press to punish her husband, Ambassador Joseph Wilson IV, for having blown the whistle on how the White House depended on fraudulent information about Iraq buying yellowcake uranium from Niger.
In the wake of the leak, this is how Bush responded:
On Sept. 29, 2003, the White House, through spokesman Scott McClellan, said: "If anyone in this administration was involved in it, they would no longer be in this administration."
We now know that Vice President Richard Bruce Cheney was highly "involved" in the effort to out Plame, as was Bush's political adviser, Karl Rove. By the lights of the 2003 statement, both should have been fired when this came out.
Cheney and Rove were running a vigorous campaign from inside the White House with that aim. Libby was acting at their behest.
So why hasn't Bush fired Cheney and Rove?
He flip-flopped and changed the grounds for firing someone in his administration to their having "committed a crime."
After Philip Agee revealed the identities of CIA operatives, Congress made outing agents a crime. But the law was vaguely worded, so you had to know the individual was a covert operative to be punished under the statute.
Some analysts have attempted to defend them on the grounds that they did not know that Plame was undercover, or that their repeated attempts to get journalists to out Plame were largely unsuccessful and she was actually inadvertently outed by then undersecretary of state Richard Armitage.
But they were trying to out her. Isn't trying to commit a crime a crime in itself? And, Rove did talk to Bob Novak, who was the reporter willing to leak.
As for the claim that they didn't know she was undercover: I don't think that is an excuse, since they were in a position to ask the CIA and settle the question. Wouldn't a prudent person have double-checked that item before dialing Judy Miller's number?
So Rove and Cheney claim not to have "broken the law." But that doesn't change how heinous their action was. They are, in George H.W. Bush's words, "the most insidious of traitors."
This flip-flopping on the grounds for which high White House officials would be fired allowed Bush to keep Rove and Cheney around even though they were clearly "involved" in the leak. In essence, his flip-flop was itself a way of commuting their sentences of unemployment.
As for Libby, he was convicted of lying to a grand jury and obstructing the special counsel's investigation. Since Fitzgerald could never have gotten to the bottom of whether crimes had been committed as long as key figures like Libby lied to him, Libby's crime was grave. The commutation of his sentence is a great injustice. It is not the first a Bush has committed.
Iran-Contra criminal Elliott Abrams, now a deputy national security adviser to Bush, essentially committed the same crime as Libby, though he only pled guilty to withholding information from Congress.
Abrams was pardoned by George H. W. Bush, and then his son hired him. Congress, which should have been permanently outraged at having been misled by Abrams, gave him a pass. A far-right Likudnik, he has been handling Palestine issues for Bush!
So, like father like son.
Except for that vision thing about the insidious traitors.
Basically, in Bushworld, high government officials are above the law, including all international law and most domestic. America is not nearly as much fun if you aren't rich.
Groupnewsblog says things about this matter in a language I could only aspire to.
For suggested action we all can take, see FireDogLake.
Posted on: Tuesday, July 3, 2007 - 15:51
SOURCE: Excerpt from a three-part series at TomDispatch.com (6-25-07)
... The very post-Vietnam détente-restraint of most of [Jimmy] Carter's advisors -- and the President's own inner hawkishness -- opened the way for his presidency to become (contrary to conventional wisdom) a precedent-setting period for covert intervention. And Gates, as [National Security advisor Zbigniew] Brzezinski's hard-line staff officer for Soviet affairs, and later his personal outer-office assistant in the White House West Wing, was at the center of it all.
In his 1996 memoir, he would write contemptuously (and, in the case of Secretary of State Vance, falsely), "Because Vance was unwilling to use diplomatic leverage against the Soviets, and [Secretary of Defense] Brown and others wanted no part of U.S. military involvement in the Third World, their standoff gave Brzezinski an enormous opportunity to put forward covert action -- which was under the purview of the NSC -- as a means of doing something to counter the Soviets."
Gates and Brzezinski promptly impressed upon Carter that, "It is his CIA," as Gates described it. Within weeks of his inauguration, at the urging of the national security advisor and his Soviet affairs specialist, the new president approved the first covert actions inside the USSR. These operations were aimed at inciting religious discontent in Soviet Central Asia by smuggling in tens of thousands of Korans, as well as radical Islamic literature. In that and other actions to come, it would be Jimmy Carter who first fanned Islamic fundamentalism -- which would have devastating consequences in our own era.
By July 1979 -- less than two weeks after the Sandinista rebels took power from the 43-year Somoza-dynasty dictatorship in Nicaragua, a long favored Washington client in Central America -- they would begin mounting the first covert actions against the popular, and populist, new regime in Managua, as they would soon be shoring up a ruling oligarchy that faced a mounting leftist insurgency in neighboring El Salvador.
There would be similar interventions and intrigues in the Horn of Africa, on the Arabian Peninsula, and elsewhere, always justified by the Soviet (or proxy Cuban) menace. "On the march" was the way both Gates and his boss were fond of describing the communist hordes. The result would be a rash of secret wars, assassinations, terrorist acts, and manifold corruptions around the world by the administration of the "human rights" president. Moreover, these acts preceded, sometimes by several years, the vaunted covert actions of the Reagan regime, which were often only continuations of Carter policy, in some cases even on a lesser scale. "Jimmy Carter was the CIA's first wholly owned subsidiary," an Agency operative would boast to a friend later, "and the beauty of it was that so few people, even on the inside, ever knew it."
Nowhere would their penchant for the covert prove more fateful than in the remote Hindu Kush. To an already seedy history of American covert intervention there, they now added their own bloody chapter.
At the behest of Pakistan, Communist China, and the Shah of Iran (and their intelligence services), the CIA had begun offering covert backing to Islamic radical rebels in Afghanistan as early as 1973-1974. The explanation for this was that the right-wing, authoritarian regime of Mohammed Daoud, then in power in Kabul, might prove a likely instrument of Soviet military aggression in South Asia. This was a ridiculous pretext. Daoud had always held the Russians, his main patron when it came to aid, at arm's length, and had savagely purged local communists who supported him when, in 1973, he overthrew the Afghan monarchy. For their part, the Soviets had not shown the slightest inclination to use the notoriously unruly Afghans and their ragtag army for any expansionist aim.
Support for the anti-Daoud religious insurgents, far more anti-American than the Kabul regime, actually served distinctly local interests. The Pakistanis and Iranians wanted to fend off Afghan irredentism on their disputed borders and Pakistan was eager to secure a pliant regime in Kabul on its western flank as it faced rival India in the East. The Nixon administration casually supported these aims in deference to its clients with little or no thought for the Afghans, a policy-atrocity which would be repeated for the next quarter-century.
All the backing ceased, however, after an abortive rebel uprising in 1975, as Daoud launched his own détente policy with Iran and Pakistan. Then, in April 1978, his blundering crackdown on Afghanistan's small communist party provoked a successful coup by party loyalists in the army. This happened in defiance of a skittish Moscow which had stopped earlier coup plans. Aware of these facts, Vance's State Department coolly adopted a wait-and-see attitude toward the new regime.
But with predictable alarm bells ringing in Iran, Pakistan, and Russophobic China, Carter's covert interventionists at the NSC saw an irresistible "opportunity," as Gates put it, "to counter the Soviets." Three weeks after the Kabul coup, Brzezinski was in Beijing discussing, among other matters of state in his Kissingeresque debut as a diplomat, the "Soviet peril" in Afghanistan.
Gates's memoir dutifully notes the ensuing stream of bland speculations by the CIA's Soviet analysts about what the Soviets might do next in their tortured relationship with a faltering, needy, yet independent Afghan communist regime. But he spares us the covert actions the CIA carried out, amid a stream of memos Brzezinski and he sent Carter about the Soviet "threat" in South Asia -- an intervention kept secret from their hated rival, Secretary of State Vance, and the rest of the government.
By summer 1978, the old insurgent training camps in Pakistan were open again and thronged with Islamic radicals. They were eager to fight a regime pushing land reform and education for women, while establishing a secular police state. By fall 1978, more than a year before Soviet combat troops set foot in Afghanistan, a civil war, armed and planned by the U.S., Pakistan, Iran, and China, and soon to be actively supported, at Washington's prodding, by the Saudis and Egyptians, had begun to rage in the same wild mountains of eastern Afghanistan where U.S. forces would seek Osama bin Laden a little more than twenty-three years later.
In April 1979, with arms and agitators paid for by the CIA and Pakistani intelligence (the Shah fell in January, ending SAVAK's role), a radical Islamic uprising in Herat in western Afghanistan led to the slaughter of thousands on both sides, including more than 200 Russian military and civilian advisors and their families. Even so, the Soviets stoutly refused to intervene militarily. They even made their refusal absolutely plain to Washington by pointedly conducting telephone conversations with the Afghan leadership en clair for the Americans to intercept. But Gates, Brzezinski, and Carter were having none of it in what had become a deliberate plot to "suck" the Russians into Afghanistan.
The old Great Game was now in cynical full swing. In the sort of mad plan not even Rudyard Kipling could have imagined, they plotted to personally "give the Soviets their Vietnam," as Brzezinski was fond of saying.
The ceaseless machinations and bloody civil strife culminated, of course, in the December 1979 Soviet invasion. The Politburo had resisted it for more than a year and hesitated, even at the eleventh hour. It is, by any measure, one of the more dramatic, and chilling, stories in the annals of world politics. By now, Brzezinski and Gates had essentially created a new foreign policy for the United States and put it into action in secret with few co-authors and no parallel.
By the time they and their co-conspirators are through, a course will have been set that will take the Afghans into a nightmare universe in which a million-and-a-half of them will die, millions more will become homeless (in what the UN will call "migratory genocide"), and, for more than a quarter-century, their country will be a continuing catastrophe beyond any other in the history of nation-states. In part, it is his own work that Gates now faces as secretary of defense....
Posted on: Tuesday, July 3, 2007 - 13:20
SOURCE: Weekly Standard (7-2-07)
The new strategy for Iraq has entered its second phase. Now that all of the additional combat forces have arrived in theater, Generals David Petraeus and Ray Odierno have begun Operation Phantom Thunder, a vast and complex effort to disrupt al Qaeda and Shiite militia bases all around Baghdad in advance of the major clear-and-hold operations that will follow. The deployment of forces and preparations for this operation have gone better than expected, and Phantom Thunder is so far proceeding very well. All aspects of the current strategy have been built upon the lessons of previous successful and unsuccessful Coalition efforts to establish security in Iraq, and there is every reason to be optimistic about its outcome.
The first phase of the new strategy unfolded over five months--between the president's announcement of the "surge" on January 10 and the arrival of the last of the five additional Army brigades and Marine elements in early June (though critical enablers for those combat forces have only just arrived). As the new units entered Iraq, commanders began pushing forces already in the theater forward from their operating bases into outposts in key neighborhoods of Baghdad and elsewhere. The purpose of these movements was to establish positions within those key neighborhoods and to develop intelligence about the enemy and relationships of trust with the local communities.
Also during this first phase, additional Iraqi security forces were deployed to Baghdad in accordance with a plan developed jointly by the U.S. and Iraqi military commands. All of the requested units were provided. The Iraqi military has just completed its second rotation of units into Baghdad; as before, all of the designated units arrived, and they were generally closer to being fully manned than in the first rotation.
The new U.S. troops have increased the available combat power in Iraq by about 40 percent, from 15 brigades to the equivalent of 21 brigades. Generals Petraeus and Odierno allocated only two of the additional Army brigades to the capital. The other three Army brigades and the equivalent of a Marine regiment they deployed in the surrounding areas, known as the "Baghdad belt." There, under the guise of Operation Phantom Thunder, they are now working to disrupt the car-bomb and suicide-bomb networks that have been supporting al Qaeda's counter-surge since January.
But this second phase is designed primarily to support the clearing and holding operations in Baghdad itself, which will continue for many months. It is those operations that are meant to bring lasting security to Iraq's capital and thus create the space for political progress.
The United States has not undertaken a multiphased operation on such a large scale since the invasion, so it is unsurprising that many commentators are confused about how to report and evaluate what is going on. Indeed, the current effort differs profoundly from anything U.S. forces have tried before in Iraq. As Coalition forces begin the attempt to establish sustainable security in Baghdad and its environs, it is worth reviewing past major combat operations in Iraq, since their clear lessons have informed planning for the current, much larger campaign....
Posted on: Monday, July 2, 2007 - 19:23
SOURCE: http://commonsense.ourfuture.org (6-30-07)
In his opinion restricting the ability of public school districts to use race to determine which schools students can attend, Chief Justice Roberts wrote for a plurality joined by Scalia, Thomas, and Alito: "Before Brown, schoolchildren were told where they could and could not go to school based on the color of their skin."
If I were a high school teacher and young Johnny Roberts wrote this on an exam on civil rights history, I would give him an "F." The idea that the Chief Justice of the Supreme Court could cough up such a ludicrous hairball is evidence of a nation gone mad with amnesia. Or, if you prefer, a conservative intellectual class that knows the history full well, and has simply let itself lie.
Do educated people really need this explained to them? It wasn't merely "before Brown" that "schoolchildren were told where they could and could not go to school based on the color of their skin." It was long, long after the Supreme Court's unanimous decision in Brown v. Board of Education of Topeka - for the next seventeen years at least.
I mean, do I really have to explain this? In 1955, the year after Brown, the Supreme Court specified the compliance language for the first decision: Southern school districts would have to comply "with all deliberate speed."
They did not comply at all. Instead, the region staged a self-conscious movement of "Massive Resistance." Nearly every Southern congressman signed a manifesto pledging to defy the Court by "all lawful means." In Virginia, senator and former Klansman Harry Flood Byrd's minions pushed through the state assembly an order to close any school under federal court order to integrate. And in 1957 in Little Rock -- well, has Justice Roberts never heard of this?
Since most Dixie municipalities had one school district for whites and another entirely separate district for blacks, and simply did nothing, the federal courts ruled in 1964 that all "dual school districts" not already under court order to desegregate would have to file desegregation plans with the Department of Health, Education, and Welfare. Congress was able to help in 1965, after the passage of the Elementary and Secondary Education Act provided the first serious federal funding to local school districts. Since the 1964 Civil Rights Act had provided that no segregated public institution could get federal funds, this was, finally, a chance to punish the vast, vast majority of Southern school districts that -- read this carefully, Justice Roberts -- were still, 11 years after Brown outlawed it, telling schoolchildren where they could and could not go to school based on the color of their skin.
By that point only 6 percent of Southern schoolchildren attended classes with children of another race. How did we know? Because the federal government counted.
In 1966, HEW published guidelines specifying that schools with no black students or staff would have to show evidence of "significant progress"; those with 4 to 5 percent black students or staff would have to triple that number within the '66-'67 school year; those with 8 to 9 percent would have to double them.
How did the South respond? By openly defying the law, in the same manner as a criminal who, told to halt by police, simply runs as fast as he can in the other direction....
Posted on: Monday, July 2, 2007 - 15:31
SOURCE: Guardian (6-27-07)
The historian Richard Hofstadter once wrote that third parties in the modern United States are like honeybees - they sting and then they die.
But not every candidate who runs for president outside the Demopublican oligopoly has left a significant mark on the body politic. Most, to paraphrase Muhammad Ali, floated like butterflies away from the main event and were soon forgotten.
As Michael Bloomberg prepares to enter the 2008 race as an independent, he might reflect on his forerunners whose historical impact was slight as well as those who, although they lost, did much to transform the political landscape.
The club of influential losers is ultra-exclusive. It includes Theodore Roosevelt, who finished second as a Progressive in 1912 and whose platform presaged that adopted by his distant cousin Franklin in the 1930s. George Wallace belongs in it too, thanks to his American Independent campaign in 1968 which showed conservatives how to run as moralizing populists. Wallace won five states and 46 electoral votes and almost deprived Richard Nixon of an electoral-college majority.
But other alternative candidates, even those who gained an impressive number of votes, proved to be essentially ephemeral figures. In 1924, Robert LaFollette, running as a left reformer, won close to 17% of the vote but succeeded only in dividing the weak opposition to the Republican juggernaut.
Most of the 5.7 million votes independent John Anderson drew in 1980 would probably have gone to President Jimmy Carter. But Ronald Reagan would still have scored an easy victory that year, and Anderson's studied moderation was an empty vessel whose sinking few Americans regretted. For all the attention they received, neither of Ross Perot's well-publicized campaigns in the 1990s won a single electoral vote, and his eccentric personality overshadowed his advocacy of a balanced budget and curbs on free trade.
What did TR and Wallace have that their fellow also-rans lacked? Both were prominent, charismatic political figures before they launched their campaigns; each could depend on an army of loyal volunteers to carry his banner and echo his words, even after his electoral defeat.
In addition, both men spoke to widespread grievances - against corporate malfeasance in 1912, liberal hypocrisy and failure in 1968 - which neither major party was addressing in a forceful way. In both years, the prospect of a close election also stirred hope, or alarm, that the winner would have to broker a deal with the third-party nominee.
Michael Bloomberg's incipient candidacy may make the 2008 election a ferociously competitive one. But until quite recently, few Americans outside New York knew anything about the billionaire mayor, and he will thus have to recruit a legion of volunteers almost from scratch.
The Anderson example also suggests that the appeal of soft-spoken centrism may wither in the fury of partisan conflict. In the summer of 1980, the congressman's poll rating topped 20%. But as the campaign wore on and voters started really focusing on the choices before them, he had no idea how to stop millions of voters from "coming home" to the major parties.
When Carter refused to engage in a three-sided TV debate, Anderson could not call on the broad and ardent support necessary to force the incumbent to change his mind.
Michael Bloomberg has the money and brains to run a formidable campaign. There are certainly big causes he could embrace - fixing the nation's public schools, for example - that could gain the pragmatic, big-city mayor a sympathetic hearing. But vague talk about transcending partisan gridlock will grow stale very quickly. And if he can't recruit a passionate, grievance-filled following, he is unlikely to change history.
Posted on: Sunday, July 1, 2007 - 21:57
SOURCE: LAT (6-28-07)
WHAT ARE schools for?
For the last decade, I've taught a history course with that title at New York University. My students and I examine the different purposes that Americans have assigned to public schools, including:
A. to teach the great humanistic traditions of the West;
B. to develop the individual interests of the child;
C. to promote social justice;
D. to prepare efficient workers.
Over the last four centuries, Americans have struggled to balance these goals — and many others — in their schools. To Supreme Court Justice Clarence Thomas, however, there's only one right answer:
E. to instill discipline and obedience.
That's what Thomas wrote this week in his strange concurring opinion in Morse vs. Frederick, better known as the "BONG HiTS 4 JESUS" case. A banner with those words was unfurled by senior Joseph Frederick outside his Alaska high school, and he was suspended.
Ruling 5 to 4 in favor of the principal who censored the banner, the court decided that the school's interest in discouraging drug use outweighed the student's free-speech rights. But Thomas went further, insisting that the student had no right to free speech in the first place and that the history of American education proves it.
He's wrong. Simply put, the accurate history in Thomas' opinion is not relevant. And the relevant history that he recounts is not accurate.
Let's start with what he got right. As he correctly asserts, America's first schools primarily promoted discipline. "Early public schools were not places for freewheeling debates or exploration of competing ideas," Thomas wrote. The mostly male teaching force in the early 1800s brooked little or no dissent, often whipping children who challenged adult authority.
True enough. But so what? Here's the part that Thomas leaves out. From the very birth of the common school system in the 1830s, the strict discipline that he celebrates came under fire from a host of different Americans. The most prominent champion of common schools, Horace Mann, warned teachers against excessive force and the suppression of students' natural inclinations.
That's one reason Mann and his generation backed the hiring of female teachers, who were seen as more kind, tolerant and nurturing. (The other reason was that schools could pay them less.) By 1900, roughly three-quarters of American teachers were women.
The early 20th century would bring another burst of change to American schools, centered on the question of democracy. To reformers like John Dewey, schools based on strict discipline — and its pedagogical companion, rote memorization — could never give citizens the skills they needed to govern themselves. Instead of fostering mindless obedience, then, schools needed to teach children how to make up their own minds — that is, how to reason, deliberate and rule on complex political questions.
To be sure, plenty of Americans still wanted teachers to bring the kids to heel. And it's fair to ask whether schools today promote the kind of inquiry that Dewey envisioned.
The point is not that Dewey was "right" or that everyone agreed with him. Rather, history teaches us that Americans have always disagreed on the proper goal for schools.
None of this debate appears in Thomas' opinion, which gets cut off just when things get interesting. To Thomas, American educational history seems to end at the start. Our first schools aimed to instill discipline, he wrote, so that's what schools should do.
Worse, Thomas assumes that the schools succeeded in this task. "Teachers commanded," he wrote, "and students obeyed." But this command melted away in recent years, Thomas claims, when courts invented specious student rights — and "undermined the traditional authority of teachers to maintain order in the public schools."
Here's the part of Thomas' opinion that would be relevant — if it were true. But it's not. Yes, teachers tried to establish strict order and discipline in early American schools. As often as not, however, they failed.
Consider the 1833 memoir of Warren Burton, a New Hampshire minister. When faced with a particularly cruel teacher, Burton writes, his classmates revolted. They tackled the teacher, carried him outside and threw him down an icy hillside.
The theme appears in other memoirs and especially in fiction from the 19th century, which depicts unruly students — usually boys — challenging or mocking teacher authority. Think of Tom Sawyer lowering a cat by a string to snatch his bald teacher's wig. Such stories resonated with Americans because they understood — in ways Thomas does not — the chaos and violence that pervaded so many public schools.
So Thomas can spare us the nostalgia. Our schools were never the paragons of discipline he imagines. And pretending otherwise simply diverts us from the big question, which will never have a single answer:
What are schools for?
Posted on: Sunday, July 1, 2007 - 21:56
SOURCE: Nation (7-9-07)
For as long as I can remember, there's been a generally accepted story about the recent history of Democratic Party fortunes, a neat little morality tale that goes something like this: The New Deal majority fell apart when the party was taken over by forces outside the mainstream of American life. Getting blindsided by Reaganism was the party's just deserts. And if Democrats wanted the country back, they would just have to learn to become mainstream again.
For as long as I can remember, liberals have been complaining about awkward, self-conscious attempts to recover this "mainstream" sensibility and how they have paradoxically weakened the party. They forced Democratic politicians to become obsessed with polls. That, in turn, boxed Democrats into an identity the public--the mainstream--found the most off-putting of all: Democrats became timid. They couldn't pursue a bold public agenda because they were too hemmed in by polls. Very recently, among progressives, a new dictum has emerged: Hug close to the polls, worship the polls, be the polls.
Trends in Political Values and Core Attitudes: 1987-2007, a massive twenty-year roundup of public opinion from the Pew Research Center for the People and the Press, tells the story. Is it the responsibility of government to care for those who can't take care of themselves? In 1994, the year conservative Republicans captured Congress, 57 percent of those polled thought so. Now, says Pew, it's 69 percent. (Even 58 percent of Republicans agree. Would that some of them were in Congress.) The proportion of Americans who believe government should guarantee every citizen enough to eat and a place to sleep is 69 percent, too--the highest since 1991. Even 69 percent of self-identified Republicans--and 75 percent of small-business owners!--favor raising the minimum wage by more than $2.
The Pew study was not just asking about do-good, something-for-nothing abstractions. It asked about trade-offs. A majority, 54 percent, think "government should help the needy even if it means greater debt" (it was only 41 percent in 1994). Two-thirds want the government to guarantee health insurance for all citizens. Even among those who otherwise say they would prefer a smaller government, it's 57 percent--the same as the percentage of Americans making more than $75,000 a year who believe "labor unions are necessary to protect the working person."
It's not just Pew. In the authoritative National Election Studies (NES) survey, more than twice as many Americans want "government to provide many more services even if it means an increase in spending" as want fewer services "in order to reduce spending." According to Gallup, a majority say they generally side with labor in disputes and only 34 percent with companies; 53 percent think unions help the economy and only 36 percent think they hurt. A 2005 NBC News/Wall Street Journal poll found that 53 percent of Americans thought the Bush tax cuts were "not worth it because they have increased the deficit and caused cuts in government programs." CNN/Opinion Research Corp. found that only 25 percent want to see Roe v. Wade overturned; NPR/Kaiser Family Foundation/Harvard found the public rejecting government-funded abstinence-only sex education in favor of "more comprehensive sex education programs that include information on how to obtain and use condoms and other contraceptives" by 67 percent to 30 percent. Public Agenda/Foreign Affairs discovered that 67 percent of Americans favor "diplomatic and economic efforts over military efforts in fighting terrorism."
Want hot-button issues? The public is in love with rehabilitation over incarceration for youth offenders. Zogby/National Council on Crime and Delinquency found that 89 percent think it reduces crime and 80 percent that it saves money over the long run. "Amnesty"? Sixty-two percent told CBS/New York Times surveyors that undocumented immigrants should be allowed to "keep their jobs and eventually apply for legal status." And the gap between the clichés about what Americans believe about gun control and what they actually believe is startling: NBC News/Wall Street Journal found 58 percent favoring "tougher gun control laws," and Annenberg found that only 10 percent want laws controlling firearms to be less strict, a finding reproduced by the NES survey in 2004 and Gallup in 2006.
You suspected it all along. Now it just might be true: Most Americans think like you. Nearly two-thirds think corporate profits are too high (30 percent, Pew notes, "completely agree with this statement...the highest percentage expressing complete agreement with this statement in 20 years"). Almost three-quarters think "it's really true that the rich just get richer while the poor get poorer," eight points more than thought so in 2002.
If only there was an American political party that unwaveringly reflected these views, as a matter of bone-deep identity. You might think it would do pretty well. Which leads to the aspect of the Pew study that got the most ink: "Political Landscape More Favorable to Democrats," as the subtitle put it. When you compare Americans who either identify themselves as Democrats or say they lean toward the Democrats with Republicans and Republican leaners, our side wins by fifteen points, 50 percent to 35, the most by far in twenty years. As recently as 2002 it was a tie, 43 to 43.
Plunge below the surface, however, and this stirring tale becomes disconcerting. Yes, again and again, the views of independents track the views of Democrats--more so, in fact, with every passing year. Pew says it's "striking" that 57 percent of independents think government should aid more needy people even at the price of higher debt. In 1994 it was only 39 percent. When asked their opinion of statements like "Business corporations make too much profit," independents answer the same way as Democrats: about 70 percent agree. On questions like "Are you satisfied with the way things are going for you financially?" the chart is amazing: Republicans, independents and Democrats clustered together at 65 and 64 percent in 1994. But Republicans have increasingly answered that question in the affirmative--81 percent in 2007. Meanwhile, the lines for independents and Democrats headed down, down, down, nearly in lockstep, to 54 percent today.
Pew says independents are thinking like Democrats, and that fewer and fewer want much to do with the Republican Party. In 1994 independents gave the GOP a 68 percent approval rating; now only 40 percent do. And the percentage of people who call themselves Republicans has dropped from 29 percent in 2005 to 25 percent today. But these people are not signing up as Democrats. The proportion of those who call themselves Democrats has held steady, in the lower 30s.
Here's a riddle: What's an "independent"? More and more, it's an American who holds positions we associate with Democrats but who refuses to call himself by the name. Why? Part of the reason is that people say to themselves, "If only there was a party that thought like me--that was for harnessing the power of government to help the needy and protect the middle class; for reining in business excess; for fighting overseas threats through soft power instead of reckless force." But they don't find today's Democrats answering to the description. A Washington Post/ABC News poll published in early June proved it on Iraq: It heralded the emergence of what might be called "antiwar independents," who'd like nothing more than to find a party determined to end the war but don't see enough difference between Congressional Republicans and Democrats for the latter to earn their loyalty. Fueled, the Post suspects, by the failure of Congress to change course in Iraq, independents gave Congressional Democrats a 49 percent approval rating in April but only 37 percent in June.
The pattern--Democrats losing because they don't look enough like Democrats--is nothing new: During the 2002 election Democrats did such a poor job of selling themselves as better protectors of middle-class interests that Greenberg Quinlan Rosner Research found only 34 percent of voters saw a difference between Democrats and Republicans on prescription drug benefits to seniors. That year, when the party was handed a once-in-a-generation shot to prove itself as a protector against runaway greed (the corporate accounting scandals), DNC chair Terry McAuliffe called the swindling firm Global Crossing a "great company."
I suspect there's another reason, however, one much more easily fixed. There is a famous Washington story, perhaps apocryphal, about jovial, "all politics is local" Tip O'Neill. After his first run for local office, O'Neill was gabbing with a neighbor, perhaps someone he grew up with, with whose family his was entirely interlaced in that Boston, Irish Catholic way. He asked if she had voted for him. She answered, "No." Shocked, Tip demanded to know why. "Because you never asked," she replied.
Democrats make a similar mistake these days: They rarely ask the public to vote for them as Democrats. The trend was obvious by the 2006 season, for those who cared to see: The same Pew numbers that now show a 50-35 Democratic/Democratic-leaners advantage over Republicans had the advantage at 47-38 in 2006. Candidates would have earned a premium just slapping the label "Democrat" on their TV ads, but most didn't do it. That fall writers and readers of the website MyDD.com ran an ad watch. Some Democratic commercials failed to mention any of the issues. Bush's war was a disaster; Bush's government was a crony-infested sinkhole; under Bush, the middle class was having a hard time--these would have been immense burdens for GOP candidates. Other ads, though, were even more frustrating: They mentioned those issues--but never used the label "Democrat."
It could have been a virtuous circle, a matchless teachable moment: Voter identification with the positions articulated could have translated into a party identification that independents hadn't been inclined to feel before--a crucial party-building function. But that's just not how the Democratic consultancy class thinks. Their habits were set when they were blindsided by the Reagan presidency and the rise of popular conservatism ("It helped convince me that the national Democratic Party drag was such that good candidates were carrying an albatross around their necks with the words Democratic Party written on it when they went into elections," Will Marshall of the Democratic Leadership Council once said). Democratic leaders, scarred by the 1980s and frozen in the strategies of the 1990s, have repeatedly squandered the opportunities presented by the increasingly liberal sympathies of voters.
Of course, slapping a graphic reading "Democrat for Congress" on ads or reforming the vague shame some powerful Washington Dems feel toward their party--or even turning Democratic Congress members overnight into tough advocates for bringing the troops home from Iraq--may not be enough to bring election day tallies in line with the party's fifteen-point advantage in lean and identification. It's a problem with many moving parts. The stubborn oxen on TV and in the establishment media who tell the American people how to think are part of the problem too.
The commentariat tells itself a little fairy tale. As a new report from the Campaign for America's Future (my employer, though I'm solely responsible for the ideas in this essay) and Media Matters for America points out (The Progressive Majority: Why a Conservative America Is a Myth), when the GOP took over Congress in 1994, the New York Times front page claimed, "The country has unmistakably moved to the right." It hadn't; for an excellent study showing this wasn't so, see Ronald Rapoport and Walter Stone's Three's a Crowd, which shows how Newt Gingrich's Contract With America was tailored as an appeal to Perot voters, then retroactively spun as a mandate for conservatism. Ten years later, when Bush beat Kerry by three points, Katie Couric asked on Today, "Does this election indicate that this country has become much more socially conservative?" It was a rhetorical question, for the establishment had set the conclusion in stone long before. Three weeks before the 2006 election Candy Crowley of CNN said Democrats were "on the losing side of the values debate, the defense debate and, oh yes, the guns debate." After election day, Bob Schieffer of CBS said, "The Democrats' victory was built on the back of more centrist candidates seizing Republican-leaning districts." (Tell that to my favorite Democratic House pickup, Carol Shea-Porter, a former social worker who won a New Hampshire seat after getting kicked out of a 2005 presidential appearance for wearing a T-shirt reading Turn Your Back on Bush.) John Harris of the Washington Post, now of The Politico, said, "This is basically not a liberal country." Concludes the Media Matters/Campaign for America's Future report, "Democratic victories are understood as a product of the Democrats moving to the right, while Republican victories are the product of a conservative electorate."
The media have always been this stubborn, even when the conclusions they reached were 180 degrees reversed. In 1964, after Lyndon Johnson swamped Barry Goldwater, pundits said conservatism was dead as a force in American politics, and continued in that arrogant vein for years, ignoring plentiful evidence of the conservative upsurge. They were no less empirically impaired after they were shocked into making the pivot, and they won't turn again until they're forced, kicking and screaming, when the evidence finally becomes overwhelming and undeniable.
An important corollary of the media fairy tale is that the Democrats can't embody the will of the people. As an editorial in the Los Angeles Times explained in 2004, Kerry lost because of his party's "God gap." Once more, the data won't cooperate: A declining constituency--the devout--is treated as if it were booming. Pew shows that the number of people who "completely agree" that "prayer is an important part of my daily life" is down six points in the past four years. The number who "never doubt the existence of God" is down eight over the same period. The Barna Group likewise reports, "There has been a 92% increase in the number of unchurched Americans in the last thirteen years"--a population of 75 million, which is growing: According to the Pew report, "This change appears to be generational in nature, with each new generation displaying lower levels of religious commitment than the preceding one." America, of course, is a religious country--but 19 percent of those born after 1976 are atheists or agnostics, or claim no religion, compared with 5 percent of those born before 1946. Yes, social conservatives are a loud component of our body politic. But the numbers peaked long ago. Pew measures social attitudes via six questions, such as whether schools should have the right to fire gay teachers and whether AIDS is God's punishment for sexual immorality. In 1989 about half of respondents answered at least four of those six questions conservatively. Now, a mere 30 percent do.
Just who are these iniquitous citizens? People who identify themselves as secular or unidentified with a religious tradition represent about 5 percent of Republicans and 11 percent of Democrats. They are a downright heathenish 17 percent of independents. The Pew report has a chart of three descending trend lines of those who answer the social-values questions conservatively. The line for independents is less socially conservative than for Democrats. DLC types love to talk about "swing voters," a group often taken to largely overlap with "independents." Say party centrists, they just don't trust the Democrats--that "God gap." So Democratic candidates are supposed to wear their piety on their sleeve if they ever hope to creep over 51 percent in an election. The centrists are wrong. Independents are the most secular portion of the electorate.
Of course, the media business also has interests. Those interests happen to coincide with those in our party--the Democratic Leadership Council is the most notorious--who have been fighting since the 1980s to make the party more friendly to corporations. The two ostensibly nonconservative cable news channels look more and more like loss leaders for giant corporations eager to signal to other giant corporations that they won't do anything to harm them. There is little other rational explanation for why a network like CNN Headline News keeps on a spittle-flecked right-wing ranter like Glenn Beck (he got less than 60,000 viewers in the 25-54 demographic one recent Tuesday); or in a gentler, more culturally mediated way, why cable news gravitates toward ostensibly nonconservative commentary that posits an ineluctable social conservatism of the electorate as the reason the GOP is the country's natural governing party.
We may not be able to get the media to understand that this is the most favorable climate for liberalism in a generation. But I do know a class of people we might have a better chance of influencing: Democratic politicians--especially Democratic presidential candidates. But what I'd like to say is a paradox, given what I've been arguing above: Don't pay too much attention to polls, no matter how favorable they may be to the kind of politics you'd like to see. Not just because it keeps you from leading but because it can keep you from winning.
More and more I find myself telling a story I consider the key to understanding modern American political history: that of Ronald Reagan's 1966 California gubernatorial campaign. His expensive, top-drawer consultants had hired a company formed by psychology PhDs who promised that Reagan's would be the first campaign run "as a problem in human behavior." Many liberal interpreters of Reagan's career have pointed to this to suggest that he was plastic, or a pawn, or a manipulator of voters. Not so. In fact, he was the opposite. One of the first things he did was tell all these fancy pollsters to shut up. In his early, exploratory campaigning, he'd been attacking the insolence of insurgent Berkeley students--who "should have been taken by the scruff of the neck and thrown out of the university once and for all." His consultants told him to knock it off, pointing to their data: Berkeley didn't even show up as an issue. Reagan threw the polls back in their faces: "Look, I don't care if I'm in the mountains, the desert, the biggest cities of the state, the first question is: 'What are you going to do about Berkeley?' And each time the question itself would get applause."
Reagan followed his heart, of course, made Berkeley his signature issue and thumped Edmund Brown in one of the greatest upsets in modern political history (even though the establishment media hated his conservatism then more than they hate our liberalism now, and even though Republican elites were more unmistakably ashamed of the GOP "brand" than DLCers are of the Democratic one now). The technical lesson in this story is that longitudinal polls like Pew's are inherently incomplete. They derive their value from asking exactly the same questions over time, even though the banquet of issues people care about always changes. A politician who goes into battle believing polls can teach him "the issues" is fighting in a static world, which is not the world we live in.
But the more profound lesson is that the greatest politicians create their own issues, ones that no one knew existed. Was the mood in California favorable for Reagan's conservative message in 1966? Obviously, or else Reagan wouldn't have won; he wasn't a magician. But he was--yes--a great communicator, confident of his gifts. By listening and interacting with ordinary people, and sniffing out where his own sense of right and wrong dovetailed with what he heard, he divined a certain inchoate mood. It had to do both with a fear of breakdown of the social order and resentment of liberal elites. Finding those frequencies sounding via the trope of "Berkeley," he was able to turn that mood into a political appeal. In that regard, his pollsters could only hurt him. All they knew was that Berkeley wasn't an "issue."
That's the danger of even the best polling: its power to smother intuitive leaders in the cradle. The Pew poll and all the others can only point to the modern electorate's anxieties--anxieties that have something to do with a sense of breakdown in the economic order, and with resentment of conservative elites. But what story can Democratic politicians weave to repair them? None that they are telling yet. All I know is that to sound the right frequencies, we need candidates who know when to tell their pollsters to stuff it.
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Sunday, July 1, 2007 - 21:54
SOURCE: Nation (6-22-07)
Start with Secretary of Defense Robert Gates and Senator Carl Levin (D-MI), Chairman of the Senate Armed Services Committee. Gates announced on June 10 that he would not reappoint General Peter Pace as chairman of the Joint Chiefs of Staff. "At this moment in our history," Gates said, it was not in the interests of General Pace, the nation, and--oh-so-predictably--"our men and women in uniform" (today's last refuge for scoundrels) to have the "divisive ordeal" of confirmation hearings. "The focus of his confirmation process would have been on the past, rather than the future," Gates lamented.
Gates betrays his old CIA apparatchik origins, with his now-snug fit into the Bush-Cheney presidency that believes history is best ignored. Remember, he aided in consigning to the ashcan the Iraq Study Group's report with its painful reconstruction of four years of failure. Gates must be without irony for he served as a member of that group, but now he has reaffirmed that the only thing to learn from history is to forget it.
Gates and the President understandably are eager to avoid any inquiry into the origins and conduct of the Iraq invasion, war and occupation. But what about Congress? Alas! Senate Democrats are poised to continue their familiar, comfortable role as enablers of the Administration's dodges. Levin is as ahistorical as the Defense Secretary, when he echoed Gates, saying that Pace's confirmation hearing "would have been a backward-looking debate about the last four years."
But exactly! History counts; it matters. How often must we remember Lincoln's injunction that "we cannot escape history"? When we better understand what we have done, then perhaps our future course can be better informed. How will we know where to go if we fail to understand how we arrived at this point?
President Bush and Gates have settled on the nomination of Admiral Michael G. Mullen, currently Chief of Naval Operations and a member of the Joint Chiefs, to succeed Pace. Does he not share some responsibility with Pace for the past four years? Or was his service merely window dressing? Will Congress take a convenient pass, and refuse an inquiry into the Joint Chiefs' performance--with Admiral Mullen as a supposedly fully-functioning member?
Gates predictably described Mullen as a man with "vision, strategic insight, and [an] integrity to lead." Mullen's friends describe him as a "pragmatist"--that special Washingtonian stamp of approval. Mullen has said he weighed all options on the surge. But we have no word as to how and why; we have no knowledge that Mullen made his views known to either his peers, his underlings, or his superiors, including the President. Gates and the President do not know whether this man will fully support, or even enlarge, the current Iraq policies? That's hard to believe.
Levin's committee--including Senator Hillary Clinton, assuming she appears and reads the material--surely must question Admiral Mullen about those policies, and what he proposes to do now. They can heed John F. Kennedy's warning of exactly forty-five years ago. "The great enemy of the truth is very often not the lie--deliberate, contrived, and dishonest, but the myth--persistent, persuasive, and unrealistic," Kennedy warned. "Belief in myths allows the comfort of opinion without the discomfort of thought."
Now we have a new myth generating: We must remain in Iraq to "keep the peace." The Administration hints at various modes of maintaining our presence in the area, all underpinned by the example of our fifty-year "presence" in Korea.
The Mullen hearings are expected to begin as early as July; they should offer an opportunity for the Democrats to fully question the Iraq adventure--and what is to come. Perhaps Mullen might expand on his May 2007 Pearl Harbor speech, when he said, "the enemy now is basically evil and fundamentally hates...the democratic principles for which we stand.... This war is going to go on for a long time. It's a generational war." George W. Bush has to love this man.
For Levin and his colleagues a simple question begs a simple, direct answer. Committee Member Senator James Webb (D-VA) has said we must get out, but how we leave is of profound importance. So, what options can they offer other than the President's dire alternative of the fifty-year occupation, à la Korea? Democratic presidential candidates, ever reluctant to alienate the military and fearful of talk radio's right-wing pseudo-macho howls, sound like they are trying to revive the absurd "enclave strategy" from the Vietnam years. Candidate Bill Richardson has noted that other Democratic candidates differ among themselves on how many troops to leave behind. "I would leave zero troops. Not a single one," Richardson said recently. Senator Levin might ask Admiral Mullen to respond to that.
Mullen's nomination hearings should be an opportunity to debate our future course. How will we achieve troop withdrawal, or is withdrawal merely troop redeployment? President Bush repeatedly has said we will leave Iraq if we are asked. Does Mullen believe that--or is the more accurate, inevitable parallel the German occupation of Europe, meaning we will leave only when we are forced out?
Reprinted with permission from the Nation. For subscription information call 1-800-333-8536. Portions of each week's Nation magazine can be accessed at http://www.thenation.com.
Posted on: Sunday, July 1, 2007 - 21:32