Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Washington Times (12-20-11)
Mr. Pipes (www.DanielPipes.org) is president of the Middle East Forum and Taube distinguished visiting fellow at the Hoover Institution of Stanford University. © 2011 by Daniel Pipes. All rights reserved.
The formal end of the U.S. war in Iraq on Dec. 15 enhanced neighboring Iran as a major, unpredictable factor in the U.S. presidential election of 2012.
First a look back: Iran's mullahs have already had one opportunity to affect American politics, in 1980. Their seizure of the U.S. embassy in Tehran for 444 days haunted President Jimmy Carter's reelection campaign and – thanks to such developments as yellow ribbons, a "Rose Garden" strategy, a failed rescue operation, and ABC's America Held Hostage program – contributed to his defeat. Ayatollah Khomeini rebuffed Carter's hopes for an "October surprise" release of the hostages and twisted the knife one final time by freeing them exactly as Ronald Reagan took the presidential oath.
Today, Iran has two potential roles in Obama's reelection campaign, as disrupter in Iraq or as target of U.S. attacks. Let's look at each of them:
Who lost Iraq? Although George W. Bush's administration signed the status of forces agreement with the Iraqi government, stipulating that "All the United States Forces shall withdraw from all Iraqi territory no later than December 31, 2011," Obama's decision against keeping a residual force in Iraq made the troop withdrawal his choice and his burden. This puts him at risk: should things go badly in Iraq in 2012, he, not Bush, would take the blame. Iran's supreme guide, Ali Khamene'i, in other words, can make Obama's life miserable.
Khamene'i has many options: He can exert more control over those many Iraqi leaders who are Shiite Islamists with a pro-Iranian outlook, some of whom even lived in exile in Iran. For example, Prime Minister Nouri al-Maliki fits this mold. The Iranians can also influence Iraqi politics via the country's intelligence services, which they have already substantially penetrated. Or, with those tens of thousands of U.S. troops now gone from Iraq's eastern border, they can move Iranian troops at will into Iraq and engage in mischief of their choosing. Finally, they can support proxies like Muqtada al-Sadr or dispatch terrorist agents.
In 1980, the Iranians manipulated the American political process with hostages; in 2012, Iraq is their plaything. Should Iran's rulers decide to make trouble before Nov. 6, the Republican candidate will blame Obama for "losing Iraq." Given Obama's long opposition to the war, that will sting.
(Alternatively, the Iranians can shift gears and make good on their threat to close the Strait of Hormuz, imperiling the 17 percent of world oil that passes through that waterway and thereby creating global economic instability.)
The mullahs chose to harm a weakened Democrat in 1980 and could do so again; or, they could decide that Obama is more to their liking and desist. The key point is that the troop withdrawal hands them extra options. Obama may well rue not having kept the troops there until after the elections, which would have allowed him plausibly to claim, "I did my best."
Bomb Iranian nukes? Almost two years ago, when Obama still held a threadbare popular plurality among Americans of +3 percent, I suggested that a U.S. strike on Iranian nuclear facilities "would dispatch Obama's feckless first year down the memory hole and transform the domestic political scene" to his benefit. With one action, he could both protect the United States from a dangerous enemy and redraw the election contest. "It would sideline health care, prompt Republicans to work with Democrats, make netroots squeal, independents reconsider, and conservatives swoon."
As Obama's popularity has sunk to -4.4 percent and the elections loom less than a year away, his incentive to bomb Iran has substantially increased, a point publicly discussed by a colorful range of figures, both American (Sarah Palin, Pat Buchanan, Dick Cheney, Ron Paul, Elliott Abrams, George Friedman, David Broder, Donald Trump) and not (Mahmoud Ahmadinejad, Fidel Castro). Health care, employment, and the debt offer the president little solace, the Left is disappointed, and the independent vote is up for grabs. Current skirmishes over sanctions and drones could be mere distraction; an attack on Iranian facilities would presumably take place in the first half of 2012, not too self-evidently close to the U.S. elections.
In conclusion: Khamene'i and Obama can each make trouble for the other. If they do, Iran and Iraq would play outsized roles in the presidential contest, continuing in their unique thirty-year role as the tar babies of American politics.
Posted on: Thursday, December 22, 2011 - 17:07
SOURCE: CNN.com (12-16-11)
Andrew J. Bacevich is Professor of International Relations and History at Boston University. This post is one of four from the Council on Foreign Relations in response to the question, Was the Iraq War worth it?
As framed, the question invites a sober comparison of benefits and costs - gain vs. pain. The principal benefit derived from the Iraq War is easily identified: as the war's defenders insist with monotonous regularity, the world is indeed a better place without Saddam Hussein. Point taken.
Yet few of those defenders have demonstrated the moral courage - or is it simple decency - to consider who paid and what was lost in securing Saddam's removal. That tally includes well over four thousand U.S. dead along with several tens of thousands wounded and otherwise bearing the scars of war; vastly larger numbers of Iraqi civilians killed, maimed, and displaced; and at least a trillion dollars expended - probably several times that by the time the last bill comes due decades from now. Recalling that Saddam's weapons of mass destruction and alleged ties to al-Qaeda both turned out to be all but non-existent, a Churchillian verdict on the war might read thusly: Seldom in the course of human history have so many sacrificed so dearly to achieve so little....
Posted on: Thursday, December 22, 2011 - 16:37
SOURCE: National Review (12-22-11)
Victor Davis Hanson is a classicist and historian at the Hoover Institution, Stanford University, and the author, most recently, of The End of Sparta, a novel about ancient freedom.
Two terrible September days sum up the first decade of the new American millennium.
The first, of course, was Sept. 11, 2001. Osama bin Laden’s suicide terrorists that morning hit the Pentagon, knocked down the World Trade Center, killed 3,000 Americans, and left in their wake 16 acres of ash in Manhattan and $1 trillion in economic losses. Two invasions, into Afghanistan and Iraq, followed — along with a more nebulous third "war on terror" against Islamic radicalism generally.
America was soon torn apart over both the causes and the proper reaction to the attacks. The Left often cited America’s foreign interventions and Middle East policies as provocations. And it soon bitterly opposed the war in Iraq, and even more adamantly decried the antiterrorism protocols that followed 9/11.
The Right countered that only unwarranted hatred of the U.S. prompted the carnage. The best way, then, to prevent more Islamic terrorism was to go on the offensive abroad against regimes that sponsored terrorism, whether the Taliban or Saddam Hussein. New security protocols and laws at home were likewise needed to prevent another major terrorist onslaught.
But a decade later, the unforeseen had happened...
Posted on: Thursday, December 22, 2011 - 13:01
SOURCE: CS Monitor (12-21-11)
Jonathan Zimmerman is a professor of history and education at New York University. He is the author of “Small Wonder: The Little Red Schoolhouse in History and Memory” (Yale University Press).
“He is distinctly not a man,” wrote one defender of hazings in 1915, describing the typical college freshman, “and the fraternity must take up the task of character shaping where the parents left off or never began.” Hazing, he added, “is a means of determining what a man possesses, whether he has a streak of ‘yellow’ or whether he has stamina.”
Universities struck back with anti-hazing regulations; in statehouses, hazing was banned as well. Today, 44 states have laws prohibiting the practice. From the very start, however, these rules were always observed in the breach. Boys would be boys, hazing advocates said, and no bureaucrat or legislator could stop them.
“What’s the matter with K.U.?” wrote a University of Kansas graduate in 1910, blasting the school’s new restrictions on hazing. “The authorities seem to think that the University is a school for namby-pambies and Lizzie boys.... Young men of talent and energy will not go to a school which bears so close a resemblance to a female seminary.”
They did keep going to universities, of course – and they kept hazing each other. Men at historically black colleges such as Florida A&M got in on the act, too, adding a new rationale: Hazing would toughen young African Americans for the long freedom struggle ahead. At the 1997 convention of Alpha Phi Alpha, the same fraternity that enrolled Martin Luther King, Jr., civil rights warrior Andrew Young joked that his beatings at the hands of the Ku Klux Klan were nothing compared to the ones administered by his brothers at APA....
Posted on: Wednesday, December 21, 2011 - 15:13
SOURCE: LA Review of Books (12-21-11)
LARB contributor and China matters specialist Jeffrey Wasserstrom offers multiple tiers of recommendation for year-end reading.
'Tis the season for best books lists, which—to invoke a Chinese saying—sprout up like bamboo shoots after a spring rain. Just in case somebody asked, I was prepared to offer my own: 2011's best books on recent Chinese political and cultural developments. No one asked. And while I could go ahead and simply post my list anyway, it feels a bit late for holiday book buying. What might be more useful—especially for readers without a lot of time during the holidays—is to highlight some of the notable short form and long form news reports, reviews, and commentary pieces from the last year. The result is a Top 10 list that I hope will give readers an enlightening overview of how a variety of writers have been addressing the major events, trends and phenomena in the world’s most populous country. And if you have the time to plunge into a book just now, or are looking for that last-minute gift, I’ll mention one published in the last couple of years to pair with the article in question. (I’ve reviewed many of these myself, for venues such as TIME and the Asian Review of Books, but in each case, I’ll point readers to a review by someone else, so that they get a perspective different from mine about the work’s value.)
Though the list is eclectic, a fair number of the titles address collective struggles for change or profile individuals known for speaking out against abuses of power. This perspective is not coincidental (see my own China in the 21st Century: What Everyone Needs to Know). Though TIME’s decision to name “The Protester” its 2011 person of the year was likely not driven by events in China—this was, after all, the year of Arab Spring and Occupy Wall Street—dissent is always a crucial theme when discussing China and its future....
1. “The Han Dynasty”: New Yorker staff writer Evan Osnos’s profile of one of China’s most interesting and hard to categorize intellectuals: Han Han. (Abstract only; full article behind a paywall.)
The subject of this piece, Han Han, made his name as a novelist and racecar driver, but he is now probably most influential as a blogger with a massive following who has become increasingly political in recent years, writing posts that are often scrubbed away by censors soon after they appear—but not before being shared and reposted. Pair it with New Yorker contributor Zha Jianying’s book, Tide Players: The Movers and Shakers of a Rising China, which limns the varied political choices being made by entrepreneurs and intellectuals, some of whom, like Han Han, are neither dissidents in the classic sense nor unquestioning supporters of the status quo (for more on the book, see David Pilling’s review).
2. “A View on Ai Weiwei’s Exit”: Australian Sinologist Geremie Barmé’s essay, inspired by the famous gadfly figure’s detention last spring.
The piece, accompanied by photographs of the artist’s work, offers a deeply informed look at how exactly Ai Weiwei’s art and political stances have developed in recent years. Pair it with the artist’s own commentaries and online posts, published in book form earlier this year (for more about that volume, see Los Angeles Review of Books contributor Alec Ash’s “The Last Rant," a review of the compendium).
Posted on: Wednesday, December 21, 2011 - 15:04
SOURCE: Salon (12-20-11)
Michael Lind is the author of "The Next American Nation: The New Nationalism and the Fourth American Revolution" and "The American Way of Strategy."
“In lapidary inscriptions a man is not upon oath,” Samuel Johnson remarked. Even so, claims that the world has lost a major thinker and great writer in the late Christopher Hitchens go beyond the mild flattery that is appropriate in obituaries and call for correction. The rule de mortuis nil nisi bonum does not apply to those who take part in public life or public debate; their deaths provide the most appropriate occasions to evaluate their significance and their legacies.
My assessment of Christopher Hitchens is not colored by any personal conflict with him. On the contrary, my few interactions with Hitchens were friendly. In 1995 he wrote a favorable review of my first book, “The Next American Nation,” in the New York Times Book Review, and thereafter invited me to drinks at a Washington bar several times. Some claim that he was a fascinating conversationalist, but as I recall he showed no interest in ideas and preferred to peddle gossip about politicians and journalists and authors, until I found opportunities to excuse myself. Gossip, like alcohol, is safely consumed only in small quantities.
He invited me to a dinner at his Washington apartment, where he introduced me to his friend Sidney Blumenthal, the journalist who had become an aide in the Clinton White House. Blumenthal and I discovered that Hitchens was remarkably ignorant of American history for someone who earned money writing about American politics. We spent much of the evening explaining the differences between Whigs and Jacksonians to the British expatriate, and I was not surprised that reviewers found his later book on Tom Paine to be riddled with mistakes. That particular evening ended with Hitchens cornering me at the door on the way out with a boozy harangue about how he was going to come to the defense of David Irving, a right-wing British author who had been denounced as a Holocaust denier. I was grateful to escape....
But though he played one on TV, Hitchens was not an intellectual, if the word has any meaning anymore. Those known by the somewhat awkward term “public intellectuals” can be based in the professoriate, the nonprofit sector, or journalism. They can even be politicians, like the late Daniel Patrick Moynihan. But genuine intellectuals, as distinct from mere commentators or TV talking heads, need to meet two tests.
First, intellectuals need to produce some substantial works of scholarship, literature or rigorous reporting, distinct from the public affairs commentary for which they may be best known to a broad public. If you do nothing but review other people’s work or write brief columns or blog posts, it is easy to appear to be much smarter and erudite than you really are.
Second, genuine intellectuals ground their interventions in public debate in some coherent view of the world. A dedication to rigorous and systematic reasoning, wherever it may lead, is what distinguishes intellectuals from lobbyists or partisan spin doctors who change their views according to the demands of a special interest or a party. It also distinguishes them from mere “contrarians” — the term Hitchens used to describe himself — who attract publicity by taking controversial stands according to their whims....
Posted on: Wednesday, December 21, 2011 - 11:20
SOURCE: Financial Times (UK) (12-16-11)
The writer is former professor of modern British history at Cambridge. His new book, Mr Churchill’s Profession, will be published by Bloomsbury in 2012.
Nearly everyone agrees that the crisis in Britain’s relations with the European Union has a historic dimension. Some proclaim it all goes back to the Maastricht Treaty 20 years ago, while others talk of the end of 40 years’ work, ever since Britain’s accession to the European Community. If Britain is now alone, then popular invocations of Churchill take us back to alleged parallels more than 70 years ago. Whether this is the end of 200 years of history, as Jonathan Powell argued in the FT ("Cameron has betrayed 200 years of history", December 12), continues to agitate readers of these pages. Such perspectives, however, may be far too short.
After all, it is Charlemagne who is celebrated for establishing a certain idea of Europe. Crowned Emperor by the Pope in 800, he certainly put his stamp on half a continent. Yet though his Carolingian dynasty achieved mastery, its origins had been relatively humble. For the Carolingians had risen under their predecessors, the Merovingians, essentially by doing the dirty work for them, with no grander title than that of major domo or "mayor of the palace". The Merovingian kings were garbed in the trappings of monarchy, but it was the Carolingians who shrewdly grasped the levers of power, as the mighty Bishop of Rome had to recognise.
Just a fable for continental Europeans? The insight that it is not formal titles and institutions that confer power, but power that commands institutional recognition, transcends time and frontiers. Perhaps David Cameron forgot this when he headily suggested, at Brussels last week, that his veto would deny the other 26 leaders any access to the political and judicial organs of the European Union. For if 26 national leaders are intent on co-operation – a big if, perhaps – they will surely find the means of implementing their will, through shadow institutions that could soon become the real institutions. The Berlaymont in Brussels, the imposing official headquarters of the European Union, might become a mausoleum, its dusty corridors paced, like Merovingian monarchs, by lonely British prime ministers, puzzled at how quiet the old palace has become...
Posted on: Tuesday, December 20, 2011 - 18:42
SOURCE: LA Times (12-18-11)
David Greenberg is associate professor of history and of journalism and media studies at Rutgers University and the author of several works of political history, including "Nixon's Shadow: The History of an Image."
We Americans pride ourselves on our religious pluralism and toleration. Although presidents do feel obliged to end every speech with the title of an Irving Berlin song ("God Bless America"), by and large they adhere to the Founding Fathers' ideal of separation of church and state. But each year the custom of White House Christmas cards stands as an exception to this general rule.
Should the president and first lady really be issuing messages to celebrate a religious holiday that not all Americans celebrate? Strictly speaking, probably not, even if the costs are picked up by the political parties. Yet the practice has never incurred the wrath of the American Civil Liberties Union. That's probably because, since the beginning, these messages have usually taken on an inclusive, if not bland, character — one that manages to respect the holiday season and simultaneously to give scant offense.
According to Mary Evans Seeley's "Season's Greetings from the White House," the key work on White House Christmas celebrations, presidential holiday messages originated with Calvin Coolidge. In 1923, Coolidge's first winter in office, Middlebury College, in his home state of Vermont, donated a 60-foot fir tree that was installed on the Ellipse, south of the Treasury Building, and illuminated in a public ceremony. In subsequent years, Coolidge — an unsung pioneer in the use of radio and mass media — became not only the first president to light a Christmas tree in public but also the first one to deliver a Christmas message over the radio and the first to issue a written statement, which many newspapers across the country reprinted.
Although issued on a Christian holiday, Coolidge's statement was, notably, mainly secular in nature. The 1920s witnessed cultural wars as fierce as those that have racked the country since the 1960s — over immigration (even then), Prohibition, the Ku Klux Klan — and Coolidge, though a conservative Republican who was a pious Christian in private, sought to maintain an ecumenical tone. Although the vague reference to "a Savior" gave his message a mild Christian cast, what the president called for was not any specific religious belief but "a state of mind" that cherished "peace and goodwill."...
Posted on: Tuesday, December 20, 2011 - 13:34
SOURCE: The Root (12-20-11)
Zaheer Ali is a doctoral student in history at Columbia University, researching 20th-century African-American history and religion.
Recent discussions of poverty have revealed themselves to be, in fact, coded conversations about race. When Newt Gingrich talks about poor kids having no work ethic and Donald Trump agrees, they discuss poor kids interchangeably with black or inner-city youths. For years politicians, policy wonks and others have used "disadvantaged," "underprivileged," "inner-city," "urban" and "poor" as code words for black and brown people.
This is not just a polite effort to avoid explicit mentions of race; it is an attempt to link African Americans to these characteristics, constructing a pathological view of black America. Poverty is, according to this view, a problem confined to the black community, the result of cultural pathologies. This view reached its ultimate expression last week with Gene Marks' much refuted "advice column" for poor black kids that was published in Forbes.
Our national conversations about poverty -- so entangled with race in unspoken ways -- have rendered the white poor invisible and the black poor pathological, and undermined our attempts to gain majority support for anti-poverty programs. Led to believe that the poor are "other people's problems," a significant portion of Americans have come to view social welfare programs designed to assist the poor as attempts at wealth redistribution -- not just across class lines but across the unspoken, coded racial lines.
If white America would come face-to-face with white poverty, it would realize that these anti-poverty programs are needed in their communities, too. And we would move beyond a view of poverty as the pathology of a specific racial or ethnic group. Would white people casually accept Newt Gingrich telling them that their children have no work ethic and need to start cleaning school bathrooms?...
Posted on: Tuesday, December 20, 2011 - 13:20
SOURCE: Bloomberg Echoes (12-16-11)
Louis Hyman is an assistant professor of history at Cornell University and the author of "Debtor Nation: A History of America in Red Ink."
It is commonly opined, in high school history classes and at backyard barbecues, that government spending in the run-up to World War II "got us out of the Depression." This narrative conveys the sense that the end of the Great Depression was both accidental and necessarily belligerent.
But exactly how World War II got us out of the Depression is generally ignored -- even though it provides a lesson at odds with the accepted interpretation.
The war did provide a unique demand for entirely new industries. While airplanes had only been incidentally important in World War I, it was believed that they would be decisive in the 1940s. The problem, for the U.S. military, was that it had only a few planes, and fewer resources to construct them with. The government couldn't simply go to the market and buy some planes; it had to create the market. And it did.
In a genius marriage of finance and policy, the government founded the Defense Plant Corporation, or DPC, in 1940. The DPC was run by a committee of public-minded businessmen from all stripes of commerce: William Knudsen, who had helped organize Ford's production line and then became the president of General Motors Co.; Donald Nelson, a vice president at Sears, Roebuck & Co.; and Ralph Budd, president of the Chicago, Burlington and Quincy Railroad, to name a few....
Posted on: Sunday, December 18, 2011 - 14:32
SOURCE: NYT (12-15-11)
Samuel J. Redman is a historian in the Regional Oral History Office of the Bancroft Library at the University of California, Berkeley.
NEWT GINGRICH caused controversy recently with his unusual suggestion that schools "ought to get rid of the unionized janitors, have one master janitor and pay local students to take care of the school. The kids would actually do work, they would have cash, they'd have pride in the schools, they'd begin the process of rising."
While the notion that unionized school janitors are draining our economy is woefully misguided, Mr. Gingrich, the former House speaker who is seeking the Republican presidential nomination, is onto something when he says we should find ways to hire unemployed young people to maintain our educational infrastructure. And we can - by reviving the National Youth Administration.
Founded in 1935, the N.Y.A. aided over four million people between the ages of 16 and 25 in the midst of the Depression, providing desperately needed stipends to students while also working to improve and maintain the infrastructure of places like schools, universities and museums. High schools around the nation hired students to help maintain school grounds and athletic fields - not unlike Mr. Gingrich's proposal. Other students found temporary work at museums, earning money while helping preserve and organize priceless collections of artifacts. These jobs were not terribly glamorous - some were downright tedious - but in the climate of the Great Depression, students and other underemployed youths were grateful for the steady pay.
Recent oral histories collected by the Regional Oral History Office of the Bancroft Library at the University of California, Berkeley, throw light on the practical difference the N.Y.A. made in the lives of those it enrolled. Jack W. Rosston, for instance, told of working on campus at Berkeley through an N.Y.A. program. "For my freshman year, I got $10 a month," he explained. "That paid for my transportation to Berkeley from San Francisco."
Posted on: Friday, December 16, 2011 - 12:46
SOURCE: Guardian (UK) (12-13-11)
Philip F. Rubio is a retired postal worker and an assistant professor of history at North Carolina A&T State University. His second book, There's Always Work at the Post Office: African American Postal Workers and the Fight for Jobs, Justice and Equality (2010), won the 2011 Rita Lloyd Moroney Award for scholarship on the history of the American postal system.
The unthinkable now threatens the US Postal Service: bankruptcy. With no relief forthcoming from Congress, the USPS hopes to save itself by planning the closing of about 3,700 post offices next year, along with 252 mail processing centers.
Around 120,000 postal workers will lose their jobs with another 100,000 positions going unfilled. Saturday delivery will be gone, and first-class letter delivery will be slowed. Some historic postal buildings, including those with New Deal-era murals, have already been sold off.
How could this venerable institution founded in 1775, which ran deficits for most of its existence as the "US Post Office Department", face a possible shutdown?
Don't blame the internet. Online communications and transactions have cut into first-class mail use, but have also helped generate mail volume, particularly parcels. The internet is not the source of USPS red ink, although it provides a popular narrative for those who have wanted to privatise the USPS for years and are using the current crisis to push that agenda. The USPS still delivers 40% of the world's mail, and has done so without any taxpayer subsidies for 40 years....
Posted on: Friday, December 16, 2011 - 12:40
SOURCE: NYT (12-15-11)
Kirk W. Johnson, a former reconstruction coordinator in Iraq, founded the List Project to Resettle Iraqi Allies.
ON the morning of May 6, 1783, Guy Carleton, the British commander charged with winding down the occupation of America, boarded the Perseverance and sailed up the Hudson River to meet George Washington and discuss the British withdrawal. Washington was furious to learn that Carleton had sent ships to Canada filled with Americans, including freed slaves, who had sided with Britain during the revolution.
Britain knew these loyalists were seen as traitors and had no future in America. A Patriot using the pen name “Brutus” had warned in local papers: “Flee then while it is in your power” or face “the just vengeance of the collected citizens.” And so Britain honored its moral obligation to rescue them by sending hundreds of ships to the harbors of New York, Charleston and Savannah. As the historian Maya Jasanoff has recounted, approximately 30,000 were evacuated from New York to Canada within months.
Two hundred and twenty-eight years later, President Obama is wrapping up our own long and messy war, but we have no Guy Carleton in Iraq. Despite yesterday’s announcement that America’s military mission in Iraq is over, no one is acting to ensure that we protect and resettle those who stood with us....
Posted on: Friday, December 16, 2011 - 11:12
SOURCE: The National Interest (12-13-11)
Benny Morris is a professor of history in the Middle East Studies Department of Ben-Gurion University of the Negev. His most recent book is One State, Two States: Resolving the Israel/Palestine Conflict (Yale University Press, 2009).
According to many liberals and left wingers, Israel's democracy is under attack and under threat, with politicians currently promoting a series of laws that will curtail press freedom, left-wing NGO activities and the independence of the Supreme Court. One commentator, while agreeing that Israel is still a liberal democracy, defined what the right-wing coalition government, led by Benjamin Netanyahu, is doing as "nibbling away" at the foundations of Israeli democracy.
Meanwhile, last summer's mass protests calling for "social justice"— cheaper food and housing, higher wages and jobs for the unemployed—have petered out. The small tent encampments that sprang up in major towns now are cleared out, while most of the proposals of the Trachtenburg Committee, designed to alleviate some of the more prominent ills, are mired in the governmental bureaucracy and in Knesset committees. Nothing practical has yet resulted from the protests, partly because the Defense Ministry is steadfastly refusing to cut its budget—accounting for some 20 percent of state spending—in light of the growing Arab Islamist and Iranian threats to the country.
In an unprecedented public appearance last week, Dorit Beinish, the outgoing president of the Supreme Court, denounced the bills currently under deliberation in the Knesset and the cabinet as an incitement against the court and as jeopardizing Israel's democratic values....
Posted on: Thursday, December 15, 2011 - 19:17
SOURCE: Project Syndicate (12-13-11)
Harold James is Professor of History and International Affairs at Princeton University and Professor of History at the European University Institute, Florence. He is the author of The Creation and Destruction of Value: The Globalization Cycle.
LONDON – At the just-concluded European Union summit, British Prime Minister David Cameron vented decades of accumulated resentment stemming from his country’s relationship with Europe. Europeans were appalled at how the last-minute injection of finicky points about bank regulation could stymie what was supposed to be a breakthrough agreement on the regulation of EU countries’ budgets. Cameron’s supporters in Britain cheered and portrayed him as a new Winston Churchill, standing up to the threat of a vicious continental tyrant.
The United Kingdom’s view of Europe has always been both emotional and ambiguous. A Conservative government wanted to join the European Economic Community in the early 1960’s, but was rejected by French President Charles de Gaulle. The General mocked the British ambition with a rendition of Edith Piaf’s song about an English aristocrat left out on the street, “Ne pleurez pas, Milord.” In the end, Britain came in from the cold, but British leaders always felt that they were not quite welcome in the European fold.
At two critical moments in the past, a British “no” had a decisive impact on European monetary developments. In 1978, German Chancellor Helmut Schmidt and French President Valéry Giscard d’Estaing proposed an exchange-rate arrangement – the European Monetary System (EMS) – to restore stable exchange rates in Europe. Initially, the Germans and the French negotiated trilaterally, with the UK, in meetings that were slow, cumbersome, and unproductive....
Posted on: Thursday, December 15, 2011 - 19:14
SOURCE: San Francisco Chronicle (12-13-11)
Tony Fels is an associate professor of history at the University of San Francisco.
The Occupy Wall Street movement has reached a tactical dead end. This much has been apparent for weeks. Remarkably, its fundamental message of seeking a more equitable distribution of wealth in America has not been lost, despite the extremism and eccentricity of the protests.
The message clearly resonates with a majority of Americans, as many polls have indicated. The problem is not that the movement needs a sharper focus or a more detailed list of demands. Social movements do not have to make policy, much less write legislation. They simply need to articulate the strength of feeling in the population for a change of course.
The more all-embracing its message, the better. But how can the latent sentiments that so many Americans feel today for a return to the principles of fairness and equality of opportunity be expressed in all their fullness?
An analogy might be found in the movement to end the war in Vietnam in the 1960s. Protests against the war were started by small minorities of radicals among students, religious figures and draft-age youth. But in 1967 an umbrella organization calling itself the National Mobilization Committee to End the War in Vietnam (nicknamed the "Mobe") formed to sponsor huge marches against the war in New York, Washington, D.C., and other cities....
Posted on: Thursday, December 15, 2011 - 19:12
SOURCE: The Atlantic (12-13-11)
Kevin M. Levin is a Civil War historian and history educator based in Boston. He is the author of the forthcoming book, Remembering the Battle of the Crater: War as Murder and can be found online at Civil War Memory.
Americans were exuberant in 1961 at the prospect of the upcoming Civil War centennial celebrations. For southerners, it was a chance to unfurl Confederate battle flags and ponder the character and heroism of such iconic figures as Robert E. Lee and Thomas "Stonewall" Jackson. Families could watch as reenactors brought to life memorable battles such as First Manassas and Gettysburg, where lessons could be taught about the common bonds of bravery and patriotism that animated the men on both sides. There would be no enemies on the battlefields of the 1960s.
So where are we now, as we make our way to the end of the first full year of the Civil War sesquicentennial? Well, if you were to listen to the mainstream media, Americans could not be more divided over the central issues of the Civil War. The standard narrative pits northerners against southerners and blacks against whites. Spend enough time with Fox News, MSNBC, or CNN and you'll hear about almost daily controversies surrounding the public display of the Confederate flag. The pessimistic tone of these reports belies an important truth: the very fact that we can have these debates at all reflects how far we've come in the past 50 years.
When our grandparents geared up for celebrations in the early 1960s, the nation's collective memory was still dominated by the Lost Cause narrative. In this version of events, which started to gain popularity right after the Civil War, southern gentlemen fought valiantly against a much stronger (and less scrupulous) northern army, and their aim was to protect states' rights and an old-fashioned way of life. Slaves were portrayed as contented and loyal when they were discussed at all; the real tragedy of the war was seen as the brother-against-brother divide between white men....
Posted on: Thursday, December 15, 2011 - 17:57
SOURCE: NYT (12-13-11)
Scott Farris is the author of “Almost President: The Men Who Lost the Race But Changed the Nation.”
Eleven years ago today, Al Gore, for one important moment, was the most powerful man in our republic. The day before, the Supreme Court, in a 5-4 vote, halted the partial recount of presidential ballots in Florida that Gore had requested. But that did not mean the 2000 presidential election was over. George W. Bush could not declare victory until Gore conceded defeat.
This is our protocol in every presidential election, whether the results are clear on election night or weeks later. Our democratic political system works only when the losers give their consent to be governed by the winners. The first signal that this consent is granted comes with the losing candidate’s concession. At this moment, following a hard-fought election where passions have run high, the concession begins the process of reuniting an intensely divided country. Yet this vital service to the nation provided by losing presidential candidates is seldom appreciated....
In many countries, losing candidates do not peacefully accept defeat, and their obstinacy leads to political chaos, riots and sometimes civil war. Gore understood the risks to America from a prolonged dispute over an unresolved election....
Winning the presidency does not guarantee the winner will leave a great mark upon history; the office has certainly had its share of non-entities. Many losing candidates, though, helped bring into being political dynamics that still define our politics. Men like Henry Clay, William Jennings Bryan, Thomas Dewey, Barry Goldwater and George McGovern have created, transformed and realigned our political parties. Losing campaigns typically are the first to break barriers and expand participation. These include the first Catholic to be nominated for president, as well as the first woman and the first Jew to be named as vice presidential nominees....
Posted on: Wednesday, December 14, 2011 - 08:52
SOURCE: Commentary Magazine (12-12-11)
Max Boot is a senior fellow at the Council on Foreign Relations. He received his M.A. in history from Yale University.
Newt Gingrich has created a lot of waves by saying the Palestinians are an "invented people."
Is Newt right? As Jonathan Tobin noted, he is historically accurate. There was no widespread sense of Palestinian nationhood until the last few decades. In fact, there was such widespread apathy among the Palestinians that the PLO initially had little luck in mobilizing a revolt against Israeli rule. Arabs in Israel proper have been largely peaceful to this day. Even in the West Bank and Gaza Strip there was no widespread uprising until the First Intifada in 1987....
But the fact that Palestinian identity is largely an invention and has not existed for all time hardly makes the Palestinians unique. All national identity is to some extent invented. Britain, France, Italy, Germany, the United States: all are artificial entities that had to be forged over time. The process of state formation in the last three was relatively recent—the U.S. did not come into existence until 1776 and was arguably not a truly unified nation until 1865; Italy and Germany were created at roughly the same time. Britain and France are older, but they still had to be forged out of regional identities—the process of turning “Burgundians” and “Normans” into Frenchmen took centuries....
The real issue now is not whether the Palestinians should have a state—there seems close to universal agreement on that score, now—but at what pace and on what terms....
Posted on: Tuesday, December 13, 2011 - 15:27
SOURCE: NYT (12-10-11)
Robert A. Slayton is a professor of history at Chapman University and the author of “Empire Statesman: The Rise and Redemption of Al Smith.”
WITH Mitt Romney, a member of the Mormon church, quite possibly heading toward the Republican nomination, Americans may be faced with a presidential aspirant whose faith many find strange and troublesome. It would not be the first time that has happened, and during a previous campaign the response was pretty nasty.
By any measure, Alfred E. Smith, the Democratic candidate against Herbert C. Hoover in 1928, had a formidable record. Growing up poor, Al (as everyone except The New York Times called him) left school at age 12 to go to work, at jobs that included a stint in the Fulton Fish Market.
An outgoing lad with a fine speaking voice, he gravitated to street-corner politics and on to Tammany Hall. Smith started as an unknown member of the New York State Assembly and rose to become speaker. In 1911, he led the investigation of the Triangle shirtwaist factory fire and sponsored the subsequent reform legislation that influenced fire codes nationwide.
Like Mr. Romney, Smith was the governor of a northeastern state. He served four nonconsecutive terms beginning in 1919, and a good argument can be made that Al was the greatest chief executive in the history of New York State, where he created the precursor of the New Deal.
So Smith should have been an impressive candidate, but the electorate had several problems with him. Voters reacted to his equivocal stance on Prohibition, to his Irish heritage, even to his New York roots. Their foremost objection by far, however, was to his religion: Smith was a devout Roman Catholic....
Posted on: Monday, December 12, 2011 - 16:43