Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Huffington Post (9-27-12)
William Astore teaches History at the Pennsylvania College of Technology. He retired from the Air Force as a lieutenant colonel in 2005, having taught at the Air Force Academy as well as the Naval Postgraduate School. He writes regularly for TomDispatch.com and History News Network.
There's little question that unmanned aerial vehicles or drones are helping to save the lives of U.S. and NATO troops in places like Afghanistan while aiding in the killing of terrorist suspects in regions largely inaccessible to ground troops.
But the bigger question is whether drones are in any way decisive to the war effort. Put bluntly, are they helping us to win wars, or are they essentially prolonging wars that are ultimately unwinnable?
So far, it appears that drones aren't decisive. They're merely instrumental. They're instrumental in keeping us in a losing cause. They keep our military's casualty rate at a "sustainable" level, low enough so as not to rankle the folks back home, while they give us an illusion of progress in the sense of a body count of suspected militants killed.
But is sustainability a good thing if you're sinking deeper and deeper into a quagmire? Is killing "militants" a good thing if in the process you alienate and terrorize the people, turning them against you and sowing the dragon's teeth of further militant action and more war?
Think here of the Vietnam War. Had we had drones in the skies over the Ho Chi Minh trail, surely we'd have seen with greater clarity the Viet Cong (VC) and North Vietnamese Army (NVA) coming. Surely we'd have killed more VC while losing fewer U.S. troops, at least in the short term. But would this technological advantage have translated into victory in Vietnam? Or would we have sunk even deeper into the Vietnam quagmire, bugging out not in 1975 but in 1985?
In our military's general embrace and praise of drones, we need to be careful not to lose sight of the larger realities of war. New weapons that keep us in a losing cause longer are nothing to praise. Weapons that generate resentment and blowback among peoples we say we're trying to win to our side are nothing to embrace.
In our eagerness to lower the immediate cost of war to ourselves, we may very well be elevating the long term costs of war both to ourselves and to others. But our military has great difficulty seeing this precisely because our focus is so tactical, so focused in the weeds. Much like our numerous drones, we focus on small slices of the battlefield, losing sight of the bigger picture, the larger battlespace, the reality that winning a war requires something more than a discrete set of killing operations.
Drones, in other words, are reinforcing the U.S. military's tendency to favor tactics and short-term expediency at the expense of strategy and long-term effectiveness. How long will it be before drones become merely a crutch that keeps us hobbling along in a losing cause? Or are we already there?
Posted on: Saturday, September 29, 2012 - 17:45
SOURCE: KatzEyeView (Blog) (9-28-12)
Mark N. Katz is a professor of government and politics at George Mason University, and is the author of Leaving without Losing: The War on Terror after Iraq and Afghanistan (Johns Hopkins University Press, 2012).
Washington, DC: I was in the audience for the address given by Yemeni President Abd Rabuh Mansour Hadi today at the Woodrow Wilson International Center for Scholars in Washington, DC. He emphasized that Yemen is working with the U.S. and others to combat Al Qaeda in the Arabian Peninsula and other Islamic terrorists, that Yemen is implementing the Gulf Cooperation Council (GCC)-sponsored agreement (which Washington supports) for a democratic transition, and that his new administration (in office only since February 27, 2012) is working to establish peace and security among Yemen’s many disparate groups. As he did in his speech to the UN General Assembly, President Hadi criticized the Assad regime in Syria. He also accused Iran of interfering in Yemen’s internal affairs, noting that the Yemeni government had rounded up five Iranian spy rings and was in the process of rounding up a sixth.
After the address, he took four questions from the audience before calling an end to the session. In answer to three of them—one on why he didn’t put former President Saleh on trial, another on why he had appointed so many members of Islah (an Islamist party) to office, and a third on what he was doing to ensure the progress of Yemeni women—he indicated that he was acting (or not acting) in accord with the provisions of the GCC-sponsored democratic transition agreement. When asked about the efficacy of the U.S. drone missile attacks on terrorist targets in Yemen, however, the President offered a spirited defense of them.
The U.S. Government was undoubtedly well pleased with everything President Hadi had to say; indeed, his address appeared especially designed to please it. Still, it seems doubtful that Yemen will become completely democratic under the plan sponsored by the GCC, since none of its member governments (Saudi Arabia, Kuwait, Bahrain, Qatar, the United Arab Emirates, and Oman) is democratic or even aspires to be. Further, blaming Iran for Yemen’s problems risks drawing attention away from, and not addressing, their primary causes: Yemen’s deep-seated poverty and internal divisions. Tehran certainly did not create these.
Compared to the problems currently being experienced by Egypt, Libya, Bahrain, and Syria, the political transition in Yemen appears to have gone rather well so far. But things have a way of going badly in Yemen. Concerted efforts on the part of the new Yemeni government, Yemen’s GCC neighbors, and the West will be needed to make sure that they do not.
Posted on: Saturday, September 29, 2012 - 17:17
SOURCE: The Daily Beast (9-28-12)
Peter Beinart, senior political writer for The Daily Beast, is associate professor of journalism and political science at City University of New York and a senior fellow at the New America Foundation. His new book, The Crisis of Zionism, was published by Times Books in April 2012.
Benjamin Netanyahu loves history. And he loves deriding his critics for not understanding it as well as he does. In his address to the United Nations General Assembly last year, he attacked journalists for their weak grasp of past events, calling for a “press whose sense of history extends beyond breakfast.”
But while Netanyahu’s sense of history may extend beyond breakfast, he doesn’t remember events the way most historians do. Take his comments in this year’s U.N. speech, delivered yesterday, about the cold war. In his argument for why the United States and other world powers should draw a “clear red line” specifying when Iran’s nuclear progress would trigger military action, Netanyahu approvingly cited NATO, whose charter “made clear that an attack on one member would be considered an attack on all.” According to Bibi, “NATO’s red line helped keep the peace in Europe for nearly half a century.”...
Posted on: Saturday, September 29, 2012 - 15:32
SOURCE: New Yorker (9-25-12)
Jill Lepore, a staff writer, has been contributing to The New Yorker since 2005. She is the David Woods Kemper ’41 Professor of American History at Harvard University.
George Romney was fifty-nine when he ran for reëlection as Michigan’s governor, in 1966. In this half-hour television special (see a clip above or the full-length version below), he explains his policies and plans for the state. (I came across the film in the records of Campaigns, Inc., in the California State Archives, while researching a piece on the history of political consulting.)
George Romney’s oldest son is now sixty-five. On television, he and his father look and speak uncannily alike. What they say, though, is strikingly different. Romney Republicanism in 2012 could hardly be more different from Romney Republicanism in 1966. The difference, of course, isn’t so much a family story as it is a story about the G.O.P.
Like Mitt, George started out as a businessman. Beginning in 1939, he was the head of the Automobile Manufacturers Association. In 1954, he became president of American Motors. He was committed to public education; in the nineteen-fifties, he ran a Detroit public-school-improvement citizens’ committee. He ran for governor as a moderate Republican in 1962. Two years later, he refused to support the Presidential campaign of Barry Goldwater, calling Goldwater conservatism “extremist.”...
Posted on: Thursday, September 27, 2012 - 16:09
SOURCE: The New Republic (9-25-12)
Michael Kazin’s most recent book is American Dreamers: How the Left Changed a Nation. He teaches history at Georgetown University and is co-editor of Dissent.
Which past president stood up most stalwartly for the anti-tax, anti-welfare, anti-union principles that animate today’s conservative movement? Of course, most activists on the right would confer that honor on Ronald Reagan. However, the only Republican chief executive with a major airport named after him often governed in ways that would place him on the tiny left fringe of today’s GOP: Reagan raised income and payroll taxes, increased federal spending on domestic programs as well as the military, and avoided attacking labor unions in the private sector.
Some on the right speak kindly of Calvin Coolidge. But those who praise “Silent Cal” for cutting taxes on the rich are understandably mute about his fondness for the Ku Klux Klan, his racist attitudes toward all non-“Nordic” races, and his contempt for women who dared to drive cars, ride horses, or engage in politics. Moreover, Coolidge was no union-basher. In 1926, he signed a landmark bill which established collective bargaining for railroad workers, then a key sector of the labor force.
Ironically, the White House occupant who best represented the views that now dominate the American right was a Democrat: Grover Cleveland, the only Democratic president from the eve of the Civil War until Woodrow Wilson’s election in 1912. When Cleveland, a rotund New Yorker, was first elected in 1884, his party’s base was remarkably similar to that of the GOP today: white Southerners from all classes and white workers everywhere who did not belong to unions. The Democrats’ standard-bearer also expressed doubt that any “sensible and responsible” woman would ever want to vote....
Posted on: Thursday, September 27, 2012 - 15:49
SOURCE: Philadelphia Inquirer (9-25-12)
Terrance Williams was sexually abused by the two men he killed, according to his lawyers. He was poorly represented at his trial, where jurors never heard about these circumstances. And the widow of one of his victims wants Williams' death sentence commuted.
But those aren't the strongest arguments for sparing the life of Terrance Williams, who is scheduled to be executed on Oct. 3. The best reason is the simplest: Capital punishment is inherently wrong, no matter the circumstances. And Philadelphians should understand that better than anybody else.
That's because the movement to abolish the death penalty in America began right here, in the City of Brotherly Love. On March 9, 1787 - just a few months before the drafting of the Constitution - Philadelphia physician and patriot Benjamin Rush delivered a stinging rebuke to capital punishment at a lecture in the home of another famous local patriot, Benjamin Franklin.
Part of Rush's argument would be familiar to us today: The death penalty won't deter the most vicious criminals. But his real concern - repeated until his own death in 1813 - was the effect of capital punishment on the rest of us....
Posted on: Thursday, September 27, 2012 - 15:45
SOURCE: Informed Comment (Blog) (9-27-12)
Juan Cole is the Richard P. Mitchell Professor of History and the director of the Center for South Asian Studies at the University of Michigan. His latest book, "Engaging the Muslim World," is just out in a revised paperback edition from Palgrave Macmillan. He runs the Informed Comment website.
Paul Ryan knows nothing about the world or foreign affairs. I presume he may have been abroad at some point somewhere. I don’t know. As is usual in American politics, however, cavernous ignorance is no bar to holding forth, if it is not in fact a qualification. After all, ignorance is compatible with untainted national chauvinism (he would say patriotism), whereas if you actually know something it is harder mindlessly to wave the flag.
Paul Ryan has been attacking President Obama’s foreign policy as weak and resembling that of Jimmy Carter during the Iran hostage crisis, and as ‘blowing up in our faces.’
But is it true that Obama’s foreign policy is ‘blowing up in our faces’? And how could that have been prevented? Remember, Ryan’s running mate, Mitt Romney, began calling for Hosni Mubarak to step down on Feb. 1, 2011....
What could Obama have done to keep Mubarak in power, if that is what Ryan is advocating? Some honest journalist should ask Ryan if he believes the US government should have encouraged the Egyptian military to shoot down the Egyptian youth who were demonstrating, and how many exactly it would have been legitimate to massacre in that way....
Posted on: Thursday, September 27, 2012 - 15:37
SOURCE: PJ Media (9-22-12)
Michael Ledeen is the Freedom Scholar at the Foundation for Defense of Democracies.
We are somewhere between the old bipolar Cold War world, which we generally understood, and to which most of the rest of the world had adjusted, and something else, we know not what. The consequent chaos and uncertainty are remarkable, especially after the long peace and stable international environment that followed the Second World War. That is the underlying cause of an entire generation of leaders who either are obviously unable to master their challenges, or are applying ideological models from the recent past that are embarrassing anachronisms. Leaders like Putin, or Obama, or Mursi, for example, trot out various explanations for their behavior, but the explanations don’t “fit” the real conditions they are dealing with.
To be sure, there’s a comic side to the spectacle: the oracular class confidently laying out scenarios, most of which will be proven wrong in very short order. Can you count the number of times you have read a presumably well-informed prediction of an Israeli assault on Iran? Or stories of clandestine U.S. military actions inside the Islamic Republic? Or detailed analyses of the Iranian nuclear program, based on IAEA data, at least some of which, the head of the Iranian program has said, were lies?
So the oracles are unreliable, just as we should expect. Whenever some captain of industry asks me to do a “risk assessment” for him, I tell him to save his money for blackjack or craps; the odds are better than betting on my crystal ball, or anyone else’s....
Posted on: Thursday, September 27, 2012 - 15:34
SOURCE: National Review (9-27-12)
At the end of World War II, the Americans and the British ruled by traditional right, or heavily influenced, or occupied, or sustained by force of arms righteously exercised, almost all the world except what was under the hobnailed jackboot of Stalin’s Red Army (largely supplied by the United States as it was). The masses of the world were generally uneducated and didn’t speak English, but most of their local leaders did. Latin America admired the U.S., as Roosevelt’s Good Neighbor Policy gave back a good deal of sovereignty and ended the practice of having the United Fruit Company and other American corporations deploy the U.S. Marines around Central America. Most of the Latin American countries joined the war effort, if only to be in at the founding of the United Nations, and the whole hemisphere — except for Canada, a dominion of the British Commonwealth and autonomous but in close alliance with Great Britain — sheltered under the shield of the Monroe Doctrine, which under its more vigorous espousers had purported to authorize the U.S. to intervene anywhere in Latin America for almost any reason....
It would be foolish and futile to become too misty-eyed about the era of Anglo-American paramountcy. The multipolar world, with generally declining patches of poverty and illiteracy, is a better place, and it is much preferable to be concerned about the antics of almost stateless terrorists, horrifying though their activities sometimes are, than to have to worry about immense nuclear arsenals in the hands of the Great Powers and on trip wires of massive retaliation and Mutually Assured Destruction....
No sane person would suggest that the U.S. try to resurrect that degree of enforced respect, but it is becoming so routine to watch the burning of American flags, and it is so inadequate for America to respond to attacks on its embassies and the murder of a distinguished ambassador in Libya with milquetoastish platitudes from an administration that has laboriously accepted the legion of acts and conditions it has declared to be “unacceptable,” that it is time, in Monty Pythonese, for “something completely different.”...
Posted on: Thursday, September 27, 2012 - 15:28
SOURCE: National Review (9-27-12)
Victor Davis Hanson is a classicist and historian at the Hoover Institution, Stanford University, and the author, most recently, of The End of Sparta.
There was only one presidential debate in 1980 between challenger Ronald Reagan and President Jimmy Carter. Just two days before the October 28 debate, Carter was eight points ahead in the Gallup poll. A week after the debate, he lost to Reagan by nearly ten percentage points.
Reagan’s debate quip, “there you go again,” reminded voters of Carter’s chronic crabbiness. Even more devastating was Reagan’s final, direct question to American voters: “Are you better off than you were four years ago?” No one, it seemed, could muster a “Yes!”
Yet there was more to the 1980 campaign than the final game-changing debate rhetoric — and some of the details are relevant to 2012....
Posted on: Thursday, September 27, 2012 - 15:20
SOURCE: Huffington Post (9-25-12)
Steven Conn, editor of the forthcoming To Promote the General Welfare: The Case for Big Government (Oxford University Press USA/September 14th 2012), is Professor and Director of Public History at Ohio State University.
The religious world was rocked this past week by a tiny piece of ancient papyrus which suggests that Jesus had a wife. Well, really... who among us thought that a nice Jewish boy who went into his father's business was going to stay single forever??!!
The controversy that this scholarly discovery is already causing and will continue to cause isn't simply about the specifics -- did Jesus marry? did he have a female disciple? -- but rather about the role historical research plays in confirming or refuting traditional religious verities. For well over a century, science hasn't been the only threat to religious literalism; history has too.
For example, when the epic of Gilgamesh was discovered and translated in the mid-19th century, many Christians and Jews were perturbed to find that it presents versions of Noah's flood and the Garden of Eden almost identical to those found in the Bible. Only Gilgamesh pre-dates that text, which makes Genesis seem an act of plagiarism rather than divine inspiration.
The threat that historical research poses for religious truth is even greater for religious traditions of more recent vintage. Which means history has always been an existential threat to Mormonism, since so much of Mormon theology is founded upon events that are historically verifiable -- or more to the point, refutable.
Forget the Mormon magic underwear -- after all, I'm not sure that believing in divine boxers requires any more credulity than believing in transubstantiation. A leap of faith is just that, and for the leapers it matters. But the Mormon story is predicated on a history that we know to be demonstrably, categorically wrong.
For starters, Joseph Smith claimed that the golden plates which were revealed to him were written in a "reformed" Egyptian hieroglyphic, and we know that there is no such language.
The actors in the Bible all belong to societies evident in the historical and archaeological record. The Egyptians, the Babylonians, the Romans, the Jews all left written and other records. No one can doubt that they actually existed.
The Book of Mormon, on the other hand, describes civilizations of "Nephites" and "Lamanites" living in North America, both groups having migrated -- who knows how -- from the Near East. When asked where the remains of those Nephite cities were, Mormons have pointed to the great Indian mounds found in the Ohio River valley and elsewhere. We know that those mounds were built by a variety of indigenous people including the Adena and Hopewell cultures, but archaeologists have yet to turn up any Nephites.
Worse, Mormon history posits that the Lamanites, an evil race which destroyed the Nephites, are the ancestors of Native Americans, though there is no linguistic or genetic connection between Native Americans and the peoples of the Near East. Never mind that in the Mormon world Indians are cast as cosmological villains.
Contemporaries of Joseph Smith in the 1830s and '40s accused him and his church of a variety of frauds and deceptions. From the very beginning, therefore, the institutional Mormon church has guarded its history very jealously and regarded historians with a great deal of suspicion. That hasn't changed much over the years. Elder Boyd Packer, for example, attacked any Mormon who might "write history as they were taught in graduate school rather than as Mormons." A police investigation into a controversy involving historical documents "revealed the church's hierarchy to be obsessed with stopping any tampering with the church's official accounting of the past."
Article VI of our Constitution declares that "no religious test shall ever be required as a qualification to any office or public trust under the United States," and it seems un-American to challenge a candidate's religious beliefs, or to insist that he or she worship one god rather than another.
But in the case of Mitt Romney and his Mormonism, do we have a right to know what he believes about the past, about the history of North American settlement, the creation of the nation and the founding of his church?
Romney, of course, is no desultory Mormon. His is one of the founding families of the religion, he has been one of its bishops and he has contributed enormous sums to the church (though until we see all those tax returns we won't know just how much). What would we think about a president whose understanding of history is contradicted by history itself?
Posted on: Thursday, September 27, 2012 - 15:06
SOURCE: CNN.com (9-18-12)
Julian Zelizer is a professor of history and public affairs at Princeton University. He is the author of "Jimmy Carter" and of the new book "Governing America."
(CNN) -- Politics has returned to the water's edge. In the past week, we've seen how international events can suddenly dominate a political campaign, at least for a few days.
The uproar in the Middle East over a YouTube video that featured anti-Islamic messages triggered widespread protests. An attack on the U.S. consulate in Libya left four people dead, including U.S. Ambassador Christopher Stevens. Protests in Egypt were equally intense as some people in a crowd of more than 2,000 scaled the walls of the U.S. Embassy. Protests spread to 20 other countries in the Middle East and beyond. In Tunis, protesters destroyed a school run by Americans, while in Afghanistan protesters torched an effigy of Obama, the U.S. leader once hailed as the president who would repair America's image in the world, and watched it burn.
Added to this brew was Israeli Prime Minister Benjamin Netanyahu's warning that, in his mind, President Obama has refused to be tough enough with Iran on its nuclear program and the prospect that Israel could take military action against Iran on its own....
Posted on: Thursday, September 27, 2012 - 15:00
SOURCE: Yale Global (9-24-12)
Jeff Wasserstrom is the author of China in the 21st Century: What Everyone Needs to Know; co-editor of Chinese Characters: Profiles of Fast-Changing Lives in a Fast-Changing Land; Asia editor of the Los Angeles Review of Books; Chancellor's Professor and chair of the History Department at the University of California, Irvine; and editor of the Journal of Asian Studies. He can be reached at @jwassers.
IRVINE: Four years ago, when the Beijing Olympic slogan was “One World, One Dream,” global audiences were wowed by a Chinese spectacle that began with a quote from Confucius describing the pleasure of welcoming friends from afar. Now, the sounds coming from China and grabbing our attention are not spirited drumming but angry chanting about settling scores.
It’s worth comparing the recent street actions in China, triggered by an ongoing dispute over control of specks of land known as the Diaoyu Islands in Chinese, the Senkaku Islands in Japanese, with the mesmerizing gala of the Beijing Games. The two spectacles offer a striking study in contrasts – and intriguing parallels.
Let’s start with contrasts.
The 2008 spectacle, choreographed by filmmaker Zhang Yimou, was held in one locale and, though filled with historical allusions, included no nods to Japanese invasions or direct references to Chairman Mao. Today’s demonstrators, marching through streets across China, carry portraits of Mao and refer continually to past atrocities committed by Japanese soldiers....
Posted on: Tuesday, September 25, 2012 - 15:41
SOURCE: Chronicle of Higher Ed (9-17-12)
Michael D. Gordin is a professor of history at Princeton University. He is the author of The Pseudoscience Wars: Immanuel Velikovsky and the Birth of the Modern Fringe, due out in October from the University of Chicago Press.
The term "pseudoscience" gets thrown around quite a bit these days, most notably in debates about the dominant consensus on anthropogenic climate change. Say "pseudoscience," and immediately a bunch of doctrines leap to mind: astrology, phrenology, eugenics, ufology, and so on. Do they have anything in common? Some posit unknown forces of nature, others don't. Some are advocated by outsiders to the scientific community, while others have been backed by the elite. And the status of each can fluctuate over time. (Astrology, for example, was considered an exemplary field of natural knowledge from antiquity through the Renaissance.)
For millennia, philosophers have attempted to erect a boundary between those domains of knowledge that are legitimate and those that are anything but—from Hippocrates' essay on "the sacred disease" (epilepsy) to editorials decrying creationism. The renowned philosopher Karl Popper coined the term "demarcation problem" to describe the quest to distinguish science from pseudoscience. He also proposed a solution. As Popper argued in a 1953 lecture, "The criterion of the scientific status of a theory is its falsifiability." In other words, if a theory articulates which empirical conditions would invalidate it, then the theory is scientific; if it doesn't, it's pseudoscience.
That seems clear enough. Unfortunately, it doesn't work. Epistemologists present several challenges to Popper's argument. First, how would you know when a theory has been falsified? Suppose you are testing a particular claim using a mass spectrometer, and you get a disagreeing result. The theory might be falsified, or your mass spectrometer could be on the fritz. Scientists do not actually troll the literature with a falsifiability detector, knocking out erroneous claims right and left. Rather, they consider their instruments, other possible explanations, alternative data sets, and so on. Rendering a theory false is a lot more complicated than Popper imagined—and thus determining what is, in principle, falsifiable is fairly muddled....
Posted on: Tuesday, September 25, 2012 - 10:20
SOURCE: NY Review of Books (9-12-12)
Diane Ravitch won the Daniel Patrick Moynihan Prize of the American Academy of Political and Social Sciences in 2011 for her “careful use of social science research for the public good.” (July 2012)
According to most news reports, the teachers in Chicago are striking because they are lazy and greedy. Or they are striking because of a personality clash between Mayor Rahm Emanuel and union president Karen Lewis. Or because this is the last gasp of a dying union movement. Or because Emanuel wants a longer school day, and the teachers oppose it.
None of this is true. All reports agree that the two sides are close to agreement on compensation issues—it is not money that drove them apart. Last spring the union and the school board agreed to a longer school day, so that is not the issue either. The strike is a clash of two very different visions about what is needed to transform the schools of Chicago—and the nation.
Chicago schools have been a petri dish for school reform for nearly two decades. Beginning in 1995, they came under tight mayoral control, and Mayor Richard Daley appointed his budget director, Paul Vallas, to run the schools; Vallas set out to raise test scores, open magnet schools and charter schools, and balance the budget. When Vallas left to run for governor (unsuccessfully), Daley selected another non-educator, Arne Duncan, who was Vallas’s deputy and a strong advocate of charter schools. Vallas had imposed reform after reform, and Duncan added even more. Duncan called his program Renaissance 2010, with the goal of closing low-performing schools and opening one hundred new schools. Since 2009, Duncan has been President Obama’s Education Secretary; in that role he launched the $5 billion Race to the Top program, which relies heavily on student test scores to evaluate teacher quality, to award merit pay, and to close or reward schools; it also encourages the proliferation of privately managed charter schools....
Posted on: Tuesday, September 25, 2012 - 10:00
SOURCE: LA Times (9-22-12)
Thomas V. DiBacco is a historian and professor emeritus at American University in Washington.
Wednesday marks the 52nd anniversary of the first televised presidential candidate debate, between John F. Kennedy and Richard M. Nixon. It's a dubious distinction. Although there's every indication that debates matter in voter selection of a candidate, such rhetorical confrontations are poor indicators of future leadership.
By any reasonable standard, debates are won on form and rarely on substance. In the pre-microphone age of American politics, he who had the booming voice had the decided edge, as evidenced in the 1858 U.S. Senate race debates in Illinois between Stephen Douglas and Abraham Lincoln.
Lincoln had a "shrill, piping, squeaking and unpleasant" voice, according to his law partner. Douglas, after 15 years in Congress, was the more skilled public speaker, energetically roaming the platform with gestures and colorful language sure to capture the attention of local audiences. After seven debates, Douglas won the Senate seat. And had debates continued at the presidential level two years later, Douglas might well have emerged as the 16th president...
Posted on: Monday, September 24, 2012 - 13:20
SOURCE: NYT (9-21-12)
Simon Sebag Montefiore is the author of “Stalin: The Court of the Red Tsar” and “Jerusalem: The Biography.”
MASHA GESSEN, a Russian journalist, was recently fired for refusing to cover President Vladimir V. Putin’s hang-glider flight with migrating cranes, an exploit that was much mocked. Last week, she received an unexpected phone call, which she recounted in a blog post for The International Herald Tribune, this newspaper’s global edition. “My phone rang. ... I listened to silence for two minutes.” Finally: “Don’t hang up. I will connect you.” Frustrated, she shouted: “Do you want to introduce yourself?” A famous voice replied: “Putin, Vladimir Vladimirovich. I heard you were fired and that I unwittingly served as the reason for it.” He invited her to meet....
Educated Russians would have spotted similarities between this call and earlier Olympian interventions into the lives of writers by Romanov and Communist autocrats, illuminating rituals of Russian leadership and the relationship between power and art. This tradition flatters the writer in a culture where literature has special prestige. But the surprise also promotes the cult of the unpredictable czar who moves, like God, in mysterious ways.
Eighty-two years ago, Mikhail Bulgakov, novelist and playwright, had been fired from Soviet theaters, his works banned, when his phone rang: “Comrade Bulgakov? ... Please hold. Comrade Stalin will speak to you.” Then the famous voice began, “I apologize ... we shall try to do something for you.” Afterward Bulgakov phoned the Kremlin: was it a prank? It was Stalin. Soon, the theater employed Bulgakov again....
Posted on: Monday, September 24, 2012 - 10:01
SOURCE: NYT (9-23-12)
Eliza Griswold is a senior fellow at the New America Foundation and the recipient of a Guggenheim fellowship.
On June 4, 1963, less than a year after the controversial environmental classic “Silent Spring” was published, its author, Rachel Carson, testified before a Senate subcommittee on pesticides. She was 56 and dying of breast cancer. She told almost no one. She’d already survived a radical mastectomy. Her pelvis was so riddled with fractures that it was nearly impossible for her to walk to her seat at the wooden table before the Congressional panel. To hide her baldness, she wore a dark brown wig.
“Every once in a while in the history of mankind, a book has appeared which has substantially altered the course of history,” Senator Ernest Gruening, a Democrat from Alaska, told Carson at the time.
“Silent Spring” was published 50 years ago this month. Though she did not set out to do so, Carson influenced the environmental movement as no one had since the 19th century’s most celebrated hermit, Henry David Thoreau, wrote about Walden Pond. “Silent Spring” presents a view of nature compromised by synthetic pesticides, especially DDT. Once these pesticides entered the biosphere, Carson argued, they not only killed bugs but also made their way up the food chain to threaten bird and fish populations and could eventually sicken children. Much of the data and many of the case studies that Carson drew from weren’t new; the scientific community had known of these findings for some time, but Carson was the first to put them all together for the general public and to draw stark and far-reaching conclusions. In doing so, Carson, the citizen-scientist, spawned a revolution.
“Silent Spring,” which has sold more than two million copies, made a powerful case for the idea that if humankind poisoned nature, nature would in turn poison humankind. “Our heedless and destructive acts enter into the vast cycles of the earth and in time return to bring hazard to ourselves,” she told the subcommittee. We still see the effects of unfettered human intervention through Carson’s eyes: she popularized modern ecology....
Posted on: Friday, September 21, 2012 - 15:34