Roundup: Talking About History
This is where we excerpt articles about history that appear in the media. Among the subjects included on this page are: anniversaries of historical events, legacies of presidents, cutting-edge research, and historical disputes.
Sally Satel, a psychiatrist, scholar at the American Enterprise Institute, and co-author of the forthcoming One Nation Under Therapy, in the NYT (March 5, 2004):
... just as the press has spent a year comparing the invasion of Iraq to Vietnam, it has begun drawing parallels between today's troops and Vietnam veterans, who are believed to suffer from a high rate of war-related psychiatric disorders.
But as we try to help the soldiers of Operation Iraqi Freedom meld back into society, it would be a mistake to rely too heavily on the conventional wisdom about Vietnam. What is generally put forth as an established truth — that roughly one-third of returnees from Vietnam suffered psychological problems — is at best highly debatable.
That much-cited estimate comes from the National Vietnam Veterans Readjustment Study, released in 1990 by the Veterans Administration. It concentrated on post-traumatic stress disorder, a psychiatric condition marked by disabling painful memories, anxiety and phobias after a traumatic event like combat, rape or other extreme threat. It found that 31 percent of soldiers who went to Vietnam, or almost one million troops, succumbed to post-traumatic stress. The count climbed to fully half if one included those given the diagnosis of "partial" post-traumatic stress disorder.
On closer inspection, however, these figures are shaky. After all, only 15 percent of troops in Vietnam were assigned to combat units, so it is odd that 50 percent suffered symptoms of war trauma. True, noncombat jobs like driving trucks put men at risk for deadly ambush, but Army studies on psychiatric casualties during the war found the vast majority of cases referred to field hospitals did not have combat-related stress. Rather, most were sent for medical attention because of substance abuse and behavioral problems unrelated to battle.
Moreover, during the years of the most intense fighting in Vietnam, psychiatrists reported that psychiatric casualties numbered between 12 and 15 soldiers per thousand, or a little more than 1 percent. If the 1990 readjustment study is correct, the number afflicted with diagnosable war stress multiplied vastly in the years after the war. Again, it does not add up.
How to explain the postwar explosion in Vietnam cases? The frequently proffered answer is that the start of the disorder can be delayed for months or years. This belief, however, has no support in epidemiological studies. And consider the striking absence of delayed cases in long-range studies like that of people affected by the Oklahoma City bombing. Such studies have found that symptoms almost always develop within days of the traumatic event and, in about two-thirds of sufferers, fade within a year.
It is worth noting that the concept of delayed post-traumatic stress was introduced in the early 1970's by a group of psychiatrists led by Robert Jay Lifton, an outspoken opponent of the war. They decided that many former soldiers suffered what was called post-Vietnam syndrome — marked by "alienation, depression, an inability to concentrate, insomnia, nightmares, restlessness, uprootedness and impatience with almost any job or course of study" — and that this distinguished veterans of Vietnam from those of any other war.
While there was little data to back up the existence of this delayed syndrome, the image of the veteran as a walking time bomb was a boon to the antiwar movement, which used it as proof that military aggression destroys minds and annihilates souls. Yes, some veterans suffered the crippling anxiety of chronic post-traumatic stress disorder. But the broad-brush diagnosis of post-Vietnam syndrome also served political ends. ...
George Shadroui, in FrontPageMag.com (March 5, 2004):
Until the mid-1960s, it could be argued, the National Book Awards, a much-heralded and sought-after honor, offered fair recognition of great writing across different perspectives and genres: poetry, fiction, non-fiction, history, etc. But increasingly since the 1960s, the awards have been an exercise in political as much as literary judgment.
To refresh memories, during the 1950s and into the early 1960s a host of writers were honored who crossed the political and cultural spectrum. James Dickey, Wallace Stevens, John Crowe Ransom and Robert Penn Warren won in the arena of poetry. Walker Percy and William Faulkner took honors for their fiction. All were men of arguably conservative sensibilities, even if not notably political. Prominent liberals also were recognized, among them Rachel Carson, George Kennan, Ralph Ellison, Archibald MacLeish and William Shirer. Whether one agreed or disagreed with the choices, there was ecumenical representation that suggested the absence of political or cultural litmus tests.
Things began to change in the 1960s as the radical left began mobilizing against American power and the Vietnam War. In 1966, Arthur Schlesinger Jr. won for his adoring portrait of the short-lived Kennedy administration, A Thousand Days. In 1968, liberal icon George Kennan was honored for his memoirs. In 1969, Norman Mailer's Armies of the Night, an anti-Vietnam War memoir, captured a National Book Award.
As we head into the 1970s, Erik H. Erikson is honored in 1970 for his work on Gandhi and his non-violent techniques, a politically correct stance in the midst of the anti-war movement. Other winners included former communist and leftist Lillian Hellman for her memoir, An Unfinished Woman; James MacGregor Burns for his flattering biography of Franklin Roosevelt, Roosevelt: The Soldier of Freedom; Joseph Lash for Eleanor and Franklin; and Frances FitzGerald for Fire in the Lake, a book that glorified the Viet Cong.
This trend toward liberal and leftist perspectives continued over the next two decades. Winners included Murray Kempton, Peter Gay, Joyce Carol Oates, Allen Ginsberg, Adrienne Rich, Irving Howe, Schlesinger again for another Kennedy portrait, Robert Jay Lifton, Peter Matthiessen, Malcolm Cowley, Barbara Tuchman, Susan Sontag, Edmund Morris, Ronald Steel, Victor Navasky of the Nation magazine, Thomas Friedman of the New York Times, Alice Walker, Alan Brinkley, etc.
Not only are many of these folks left or liberal in their views, so is the subject matter. We find biographies of Norman Thomas, the Kennedys, Huey Long, Walter Lippmann, FDR and Eleanor, Lyndon Johnson, etc. There are multiple studies on slavery and Western complicity in that sordid business. There are critical accounts of the dropping of the atomic bomb at the end of World War II, and of the failures of the American system to deal with seemingly intractable social ills, but very little that celebrates Western contributions to freedom and democracy. No prominent conservative or Republican is even the subject of a winning book, save Theodore Roosevelt, though a number of authors are honored in the non-fiction categories for their critical reviews of the U.S. role in Vietnam -- FitzGerald and Mailer being joined by Gloria Emerson for Winners and Losers, Neil Sheehan for A Bright Shining Lie, and James Carroll for An American Requiem.
As we move into the 1990s, virtually every winning nonfiction book was written by a liberal or noted leftist: Orlando Patterson, at Harvard, for his book Freedom (1991); Gore Vidal for his collection of essays, United States (1993); Tina Rosenberg for The Haunted Land (1995); Carroll for An American Requiem (1996); Edward Ball, former Village Voice writer, for Slaves in the Family (1998); John Dower for his book, Embracing Defeat: Japan in the Wake of World War II (1999); and Robert Caro for Master of the Senate: The Years of Lyndon Johnson (2002).
The question that must be asked of the NBA and those publishers who often drive nominations is this: were all of these books of such high quality that they necessarily closed the door on those written by conservatives? Emerson's book on Vietnam, for example, took a major hit not from the right but from fellow liberal Garry Wills, who criticized Emerson in the New York Review of Books in 1977 for inserting herself too often into the Vietnam story she was trying to tell.
Wrote Wills: "But Emerson herself shows a groupie tendency for all those connected with the war ... sees malice where there was little, and saintliness where there was little, and has no mind at all for sorting out various kinds of mindlessness on both sides." Not the type of endorsement one expects for a National Book Award winner.
The skeptic might ask why it matters, particularly if the work honored is of high quality. The answer is obvious. Prestigious awards mean more attention, resources, and venues for discussing history, and ultimately more notoriety, which means greater book sales, more readers and more power in the debate over ideas.
And many of those who have won have not been shy about using their status to pronounce on the issues of the day. We all know that Mailer and Vidal are actively baiting the Bush administration, and they are hardly alone. James Carroll and John Dower, to pick two relatively recent winners, also have been given platforms for criticizing the Iraq war effort. In both cases, their credibility as witnesses to history is built upon their work and the recognition it has received.
Carroll, whose memoir on Vietnam is highly critical of the United States, has nothing good to say about the war in Iraq either. Writing in the Boston Globe in September 2003, Carroll opened with this riveting proclamation: "The War is Lost." He goes on: "The Bush administration's hubristic foreign policy has been efficiently exposed as based on nothing more than hallucination. High-tech weaponry can kill unwilling human beings, but it cannot force them to embrace an unwanted idea." Carroll then argues: "Sooner or later, the United States must admit that it has made a terrible mistake in Iraq, and it must move quickly to undo it."
It has to be observed that by most historical measures liberating 50 million people in a matter of months, with American losses totaling in the hundreds, would be called a remarkable success. Remember, we lost 3,000 people in 90 minutes on 9/11. Many of the war's opponents certainly predicted much worse: thousands of dead soldiers, the unleashing of weapons of mass destruction, regional instability, etc. Instead, rogue regimes (Iran, Libya, Syria and North Korea) are beginning to cooperate on issues of arms proliferation and weapons of mass destruction, and known terrorists are sending missives lamenting the refusal of the United States to withdraw because our troops are suffocating the terrorist effort. Yes, the war continues and the terrorists are clearly undaunted by holy traditions (witness the most recent attacks in Baghdad and Karbala) that our critics insist we respect.
Dower, another scholar on the left, also is called upon to issue an indictment of Bush. Writing in the Manchester Guardian in November 2002, Dower argues that efforts to compare post-World War II Japan with Iraq are absurd. He observes that not a single attack was launched against American occupation forces in Japan after the cessation of hostilities in 1945, and he argues, rightly, that this would not be the case in Iraq. Then again, Japan had just been through almost a decade of war and had been totally defeated and threatened with decimation. So he is right that comparisons are not always instructive, but we might reach different conclusions as to why.
This past December, writing in the Los Angeles Times, Dower takes Bush to task again for trying to make the case that Iraq, like Japan and Germany, could be rebuilt and transformed. "What are we to make of this murky use of history?" he asks. "The truth is that what is happening in Iraq presents a stunning and fundamental contrast to what took place in occupied Japan and Germany over a half century ago, and not a positive one."
Dower may be correct that an attempt to compare the situations has marginal value, but was Bush trying to draw a direct parallel? One thinks not. The situation in Iraq differs from the situation in Japan and Germany, but for many reasons Dower does not explore. The United States did not firebomb Baghdad the way the Allies did Dresden, for example. Is Dower complaining about that? We also did not drop an atomic bomb on Tikrit, though surely that would have reduced resistance from Al Qaeda and hard-core Baathists. Nor does Dower inform readers that in post-war Germany attacks against American troops continued for several years.
Sheehan, author of A Bright Shining Lie, took America to task in the 1980s for its arrogance in dealing with Vietnam. In an interview with Harry Kreisler, Sheehan argues: "We didn't understand the Vietnamese whom we were allegedly helping, and we didn't understand the Vietnamese we were fighting." He observes that the United States failed to grasp the historical enmity between China and Vietnam, a regional tension that transcended allegiance to communism. However, it remains true that despite the Sino-Soviet split, or the nationalism of the Vietnamese, both China and the Soviet Union were providing major support for the communists in the north and the insurgency in the south.
An article by Fox Butterfield (himself a National Book Award winner) that appeared in the New York Times magazine in the early 1980s demonstrated that, even if U.S. war policy was not correct, many of the assumptions of anti-war activists were flawed. Few leftists in the "peace" movement would have conceded that a North Vietnamese victory would result in millions of liberated Vietnamese fleeing in boats or being imprisoned or killed in reeducation camps, though no one is likely to win many awards for pointing this out.
On another front, after the collapse of the Soviet system, Tina Rosenberg went to Eastern Europe to document the aftermath of that historic transformation. One might have hoped she would focus on the obvious atrocities perpetrated by communist regimes. Instead, she wound up, in part, romanticizing communism and its leadership and portraying the anti-communist sentiments of the new governments as a major problem confronting the newly liberated states.
In the final pages of her NBA-winning book, The Haunted Land, she writes: "Fascism espouses repugnant ideas, but communism's ideas of equality, solidarity, social justice, an end to misery, and power to the oppressed are indeed beautiful ... Communism is lofty and grand, but human beings are flawed creatures, unwilling to pay communism the tribute of sacrifice it demands."
Here we have clear evidence of Arthur Koestler's claim that the poetry of communism remains seductive even as the reality of it leads to the nightmare of totalitarianism. But even the alleged poetry of communism is overstated. Reread The Communist Manifesto. It is not a hymn to brotherhood but a declaration of war on all save the working class, and particularly on those who do not embrace underlying Marxist assumptions. In her NBA acceptance speech, Rosenberg, who is now an editorial writer for the New York Times, thanked Mikhail Gorbachev for making her work possible but did not mention the United States or its role in liberating Eastern Europe (much less Ronald Reagan).
One could go on this way. Indeed, scanning the list of winners over three decades, it is difficult to find more than a smattering of conservative writers, historians or thinkers. Robert Nozick was honored in 1975 for Anarchy, State, and Utopia, and Henry Kissinger in 1980 for White House Years. You could argue that Tom Wolfe, who won for The Right Stuff, has conservative leanings. Certainly, he has shown up at National Review's anniversary dinners and has had some fun at the expense of radicals. Likewise, though Fox Butterfield writes for the liberal New York Times, his book, China: Alive in the Bitter Sea, has been rightly praised for its honest portrait of life under communist rule. Carlos Eire, who won for his book, Waiting for Snow in Havana, did not hesitate to observe during his acceptance speech that he would be imprisoned for his writings were he still in Cuba. Bill Buckley won not for his political commentary but for his mystery, Stained Glass. David Horowitz and Peter Collier managed a nomination for their book on the Rockefellers, but that was before their turn to the right.
This obvious political slant does a disservice to the marketplace of ideas and the notion of basic fairness. After all, if Gore Vidal deserved a National Book Award for his collection of essays, why not William F. Buckley, Jr. for his collection of speeches, Let Us Talk of Many Things, which traces the political and cultural landscape of our nation over four decades? If communist apologist Lillian Hellman's memoir was instructive, what about Horowitz's memoir, Radical Son, which documents his fascinating intellectual journey?
Other notable books by conservatives come to mind that not only did not win, but did not even warrant a finalist mention: Michael Novak for The Spirit of Democratic Capitalism, George Gilder for Wealth and Poverty, Charles Murray for Losing Ground and Stephen and Abigail Thernstrom for America in Black and White. These works were ground-breaking studies dealing with some of the most difficult issues of our day and surely compared with other books NBA judges chose to honor. While we are at it, how is it that the judges of the National Book Award have managed to overlook entirely Allan Bloom, Thomas Sowell, William Safire, George Will, Russell Kirk, and others?
Christopher Lasch is an interesting case. He has always been a good read, whether writing as a leftist back in the 1960s or 1970s, or as a reformed leftist in the 1980s and early 1990s. Though he won the National Book Award for The Culture of Narcissism, which was certainly a worthy effort, it could be argued that his most expansive and impressive book was The True and Only Heaven: Progress and Its Critics. Why did he not win or even get nominated again after embracing a more nuanced position toward progressive ideologies? That he had already won is no argument against it. After all, David McCullough and Schlesinger have both won twice.
One possible explanation could be the judges. Though I was unable to obtain a complete list over the years (the names are not readily shared or available), a review of the judges since 2001 turned up a disproportionate representation of writers with alternative or left-to-liberal perspectives: Richard Rodriguez, a liberal commentator for the Public Broadcasting System; Christine Stansell, a Princeton professor whose history of the Bohemians includes a romantic portrait of John Reed, Emma Goldman and other communists; Alex Kotlowitz, whose book, There Are No Children Here, documents life in a Chicago housing project; and Terry Tempest Williams, a prominent pro-environment activist from Utah. Others included Gail Buckley, author and daughter of entertainer Lena Horne; Mary Karr, a professor at Syracuse University; Jonathan Kirsch, book editor of the Los Angeles Times; Lawrence Jackson, professor at Emory University; and Michael Kinsley, one of liberalism's leading lights. The only judge with any clear connections to conservatism was Terry Teachout, who has written for the Wall Street Journal and Commentary and who edited a collection of Whittaker Chambers's journalism.
Whatever the talents of these writers, and in some cases they are significant, it can still be fairly observed that most are hardly immersed in the political or cultural mainstream of America. Is there any question that the institutional embrace of counterculture or alternative viewpoints amounts to a bias against those who celebrate business ingenuity, free enterprise or the dominant traditions of Western culture?
Unfortunately, favoritism in the direction of the liberal/left perspective is not limited to the National Book Awards. The folks over at the University of Georgia who bestow the Peabody Awards have recognized Bill Moyers on eight occasions for his television work, including The World of Ideas and an MX missile debate he hosted in 1980. An on-line search of the Peabody winners discloses that Bill Buckley's Firing Line, though it raised intellectual and policy discussion to the highest levels for over three decades, never won. Surely a man considered one of our great public intellectuals should have been recognized for his lifetime contribution to debate and policy analysis, don't you think? Oprah Winfrey has been.
Here is the raw fact. Since the 1960s, of the 90 writers (give or take a couple) honored by the NBA in nonfiction categories for books dealing with historical, political, or political-culture matters, only three or four could be called conservative. More than 60 have had clear ties to leftist/liberal causes or concerns. In fact, it is difficult to find a conservative who has made the finalist list in the past 20 years. This is even less intellectual diversity than one finds on the faculties of American college campuses. It seems as though we have become so accustomed to an exclusionary culture that no one even notices anymore.
Yong Tiam Kui, in the New Straits Times (Malaysia) (March 3, 2004):
In his famous travelogue Description of the World, Italian traveller and explorer Marco Polo claimed that he reached China with his father and uncle in 1275, met Kublai Khan and impressed him so much that the Mongol emperor made him a special emissary. He said he was sent on missions throughout China and governed the city of Yangzhou for three years before returning to Venice in 1295.
Scholars now suspect that Marco Polo never actually went to China. They believe that he could have gained his knowledge of the Middle Kingdom from Arab or Persian guidebooks.
This is because, while he described some features of Chinese society, such as porcelain and the use of coal and paper money, he neglected to mention many others.
For instance, he did not mention tea, chopsticks, foot-binding, the Chinese writing system, woodblock printing and the Great Wall.
Furthermore, no reference to Polo has ever been found in the Yuan Shih (the official history of the Mongol Yuan dynasty).
It would have been extremely unusual for Chinese historians, who meticulously recorded everything for posterity, to have neglected to mention so important a personage as Polo if he had really served as an envoy of the emperor and been governor of an important city like Yangzhou.
What's more, he also failed to learn Chinese or pick up even a few Chinese or Mongol place-names despite having supposedly lived in China for 17 years.
In 1998, National Geographic photographer Michael Yamashita set out to settle once and for all whether Polo was a fraud. Using Polo's book as a travel guide, he spent the better part of two years retracing the footsteps of the Venetian explorer.
After retracing the entire route, Yamashita is utterly convinced that Polo's account was based on first-hand experience.
"I wanted to see for myself how accurate he was. What really struck me was how accurate and detailed he was and how much one man contributed to geographic knowledge. There is no doubt in my mind that he did go to China," says Yamashita in an interview conducted recently in Taipeh, Taiwan.
Khang Hyun-sung, in the South China Morning Post (March 2, 2004):
Contradictory historical claims by China and South Korea over an ancient kingdom that ceased to exist 1,300 years ago have led to an unlikely alliance between the two Koreas, with important implications for a future reunified Korea.
The warrior kingdom of Koguryo, which for centuries straddled most of the Korean peninsula and a considerable part of what today is northeastern China, was dragged into the public spotlight last year by a Beijing-backed study that concluded it was historically an integral part of China.
The claim has been fiercely rejected by South Korea and the country's academics, who consider Koguryo (37 BC-AD 668) to be one of the founding kingdoms of the country and the basis for the word "Korea".
"It is an indisputable historical fact that Koguryo is the root of the Korean nation and an inseparable part of our history," South Korean Foreign Minister Ban Ki-moon said.
"We will sternly and confidently deal with any claims or arguments harming the legitimacy of our rights," he said after the Chinese Institute of Social Science released the report as part of its Northeast Asia project.
The issue has also struck a raw nerve with North Korea, which is lobbying to have Koguryo temples placed on Unesco's world cultural heritage list of historical sites. That bid has been blocked by Beijing, which is presenting Koguryo tombs in northeast China as substitutes.
In recent months, the North Korean media has published several broadsides against claims by Chinese academics that Koguryo was a vassal state that maintained a tributary relationship with China.
Koguryo fended off successive waves of invading Chinese armies of the Sui and Tang dynasties to defend its sovereignty, noted North Korea's official newspaper Minju Joson.
South Korea has also responded by setting up a research institute with the task of supporting the territorial and cultural sovereignty of the kingdom.
The historical dispute goes beyond accusations of historical revisionism and highlights long-term, potentially competing geopolitical ambitions for the area.
The decision by Egyptian President Anwar al-Sadat to remove the Soviet military presence from his country during the summer of 1972 has often been viewed as the first step on the road to the October War the following year. By removing the Soviet presence, it has been argued, Sadat was also removing the major obstacle preventing him from engaging in another war with Israel.(1) Though Sadat insisted at the time that the expulsion of the Soviets was simply a result of the growing differences between Moscow and Cairo,(2) and while others have argued that their removal was a direct result of the Soviet-American detente,(3) it seemed clear that since Moscow was opposed to risking its new relationship with the United States by supporting Egypt in another war with Israel, Sadat had no choice but to ask for their departure.
In Washington, American officials were reportedly "shocked" to learn of Sadat's announcement. Henry Kissinger later recalled that Sadat's decision came as a "complete surprise to Washington," and he quickly met with the Soviet ambassador to dispel any notion that the United States had colluded with the Egyptians in reaching this end.(4) President Nixon, similarly, hurried a letter to Leonid Brezhnev, claiming the United States had "no advanced knowledge of the recent events in Egypt," and assured the Soviet leader that the United States would "take no unilateral actions in the Middle East" as a result of the recent developments.(5)
Early scholarly treatment of Sadat's decision to remove the Soviet military presence has generally fallen in line with this official account. William B. Quandt, for example, argued that the expulsion of the Soviet advisors came at a "curious" time in Washington, since Nixon was preoccupied with an election campaign and would not risk his lead in the polls "by embarking on a controversial policy in the Middle East."(6) In his study of the Soviet-Egyptian relationship, Alvin Z. Rubinstein also concluded that "as far as can be determined Sadat consulted no one; his decision was his own."(7)
More recently, scholars have placed the expulsion in the context of Soviet-American relations rather than in the deteriorating relationship between Egypt and Russia. In Raymond L. Garthoff's view, it was the agreements reached between the United States and the Soviet Union during the 1972 Moscow Summit, which effectively put the Arab-Israeli conflict on the backburner, that became the "last straw" for Sadat.(8) Henry Kissinger reached similar conclusions in his 1994 study Diplomacy, in which he argued that "the first sign that [detente] was having an impact came in 1972 [when] Egyptian President Anwar Sadat dismissed all his Soviet military advisors and asked Soviet technicians to leave the country."(9)
Without archival evidence, however, several questions surrounding Sadat's decision to expel the Soviet military presence from Egypt still remain: To what extent did the United States have prior knowledge of Sadat's intentions? Did the United States work with Sadat in seeking the removal of the Soviets? And was the expulsion of the Soviet military presence from Egypt really the first step to the October War, as some have argued, or was it simply the easiest way for Sadat to tell the United States that he was prepared to take Egypt in a new direction?
New material emerging from American archives and summarized in this article suggests that Sadat's decision to remove the Soviet advisors was hardly the surprise that American officials later claimed it to be. Documents now declassified from State Department and National Security Council files, as well as numerous hours of recorded conversations between President Nixon and his senior foreign policy advisors, show that as early as May 1971, over a year before the expulsion of the Soviet advisors, American officials were well aware of Sadat's intentions and worked aggressively to ensure the removal of the Soviet presence from Egypt. Throughout the summer of 1971, these sources show, the Nixon administration took numerous steps to help Sadat remove the Soviet military presence from his country. We now know, in fact, that Nixon's decision to suspend the supply of aircraft to Israel at the end of June, and his decision to aggressively press for the reopening of the Suez Canal as part of an interim agreement between Egypt and Israel, had as much to do with getting the Soviets out of Egypt as with finding a long-term peace agreement between Egypt and Israel.
Just as important, though, these new sources demonstrate that the expulsion of the Soviet military presence had very little to do with preparing Egypt for another war with Israel. For Sadat, the decision to remove the Soviets was one he had clearly made from the earliest days of his presidency, not only to move much closer to the West but also to avoid another war with Israel, which he knew Egypt would undoubtedly lose.
Emily Eakin, in the NYT (Feb. 28, 2004):
In February 1766, taken aback by the violent reaction to the Stamp Act, its latest attempt to impose taxes on the restive American colonies, Britain summoned Benjamin Franklin to Parliament in London. The interview, which lasted several hours, was less than friendly. The Americans, Franklin reminded his interrogators, were voracious consumers of British goods, buying them at a rate that far exceeded the colonies' staggering population growth. But this lucrative spending habit, he warned, should not be taken for granted.
The colonists could either produce necessities themselves or do without, he testified. As for "mere articles of fashion," he said, they "will now be detested and rejected."
A month later the Stamp Act was repealed. And American trade in British goods — valued at more than a million pounds a year — continued at a galloping pace. But Franklin's words represented a turning point in the struggle for independence, says T. H. Breen, the William Smith Mason professor of American history at Northwestern. Americans, he argues, had discovered a political weapon without which the Revolution might not have been successful: consumerism.
Is it possible that a signature attribute of contemporary America — and a trait for which it is frequently criticized — lay at the heart of its most inspiring foundational achievement? This is the startling implication of Mr. Breen's new book, "The Marketplace of Revolution: How Consumer Politics Shaped American Independence," published earlier this month by Oxford University Press. In his account, the self-sufficient yeoman farmer of Jeffersonian lore is nowhere to be found. Even before America was a nation, Mr. Breen insists, it was a society of consumers.
Deceptively simple, his argument goes like this: two and a half million strong and scattered along 1,800 miles of coastline, the colonists had little in common besides a weakness for what Samuel Adams derisively termed "the Baubles of Britain." When Britain imposed stiff taxes on this appetite for stuff — without granting any political representation — Americans responded with an ingenious invention with instant and widespread appeal: the consumer boycott. By the time the First Continental Congress was convened in September 1774, transforming mass consumer mobilization into a successful political rebellion was a relatively straightforward task.
Or, as Mr. Breen, 61, explained in a recent telephone interview: "Every predictive model that one could have put forward at the time indicated that the colonies, should they beat the British, would have broken into 13 separate entities. Yet somehow enough colonists found enough common cause to make war on what was the strongest military power in the world. How did they create the bond of political trust so that if one city protested or resisted the British, the rest said, 'We'll stand with you'? It was this great swelling of consumer experience that was the transformative element."
It sounds far-fetched, possibly scandalous: pinning Americans' success in the war for independence even partly on their common experience in the marketplace. Moreover, the notion seems to contradict the long-standing assumption among scholars that lofty ideas elegantly expressed — and a brisk trade in political pamphlets and newspapers — were sufficient to unite the public behind the revolutionary cause.
But in keeping with the latest academic trends, intellectual history is out and material culture is in. Consumerism in particular is a hot topic in American studies these days, and Mr. Breen's book comes garnished with encomiums from senior members of his field, including Joseph J. Ellis, the Pulitzer Prize-winning author of "Founding Brothers" (Knopf, 2001), who calls it "the most original interpretation of how the American Revolution happened to appear in print in the last 50 years."
Slave labour, beatings, sexual abuse, fear and isolation were the norm for thousands of “Verdingkinder”, or "discarded children", who were given away or sold as cheap labour until the 1950s.
Historian Marco Leuenberger told swissinfo that the time has come for reappraisal of this dark episode.
Leuenberger was ten years old when his father first told him of his childhood as a discarded child. His father, too, was aged ten when he had to endure the daily grind of getting up at 5am and working until late into the night.
Inspired by his father and thousands of children like him, Leuenberger in 1991 embarked on a huge research project to explore this dark chapter in Switzerland's history.
The discarded children were usually orphans, illegitimate, or from the poorest families, and they were either given away or sold to farmers.
“Most of these children were used as cheap labour, exploited physically or even sexually abused,” Leuenberger concludes in his study.
Leuenberger and other historians are calling for a nationwide research project to be carried out into the trade in “Verdingkinder”, while many of these former child labourers are still alive.
swissinfo: Were children given away or sold throughout Switzerland?
Marco Leuenberger: Yes, especially in German-speaking Switzerland in the Protestant cantons, though also in Catholic areas. It also happened in [French-speaking] canton Vaud. It is also known that children from [Italian-speaking] canton Ticino were sent to work as chimney sweeps in northern Italy.
swissinfo: How many of these ‘Verdingkinder' were there?
M.L.: For years, the trade involved more than 10,000 children [every year]. But it's very difficult to come up with an estimate because there is no evidence available prior to 1820. There were also lots of children who were traded without the knowledge of the local authorities.
swissinfo: How did Swiss authorities manage this child trade?
M.L.: Poor families were forced to register with their local authority every year. It was then decided whether all the family members were [adequately] provided for. Authorities in the 19th century had the right to separate the poorest families.
There were no criteria that [farmers] had to fulfil to receive a “Verdingkind”. They only had to prove that they needed more cheap workers.
David Clark Knowlton, associate professor of anthropology at Utah Valley State College, in the Salt Lake Tribune (Feb. 29, 2004):
The controversies stemming from the University of Utah History Department's attempt to fill the position vacated by the untimely death of Dean L. May raise issues of consequence to all of us. While the members of that department have to make the often-difficult decision of whom to hire, all of us have a horse in this race.
The Salt Lake Tribune paraphrased a professor in the department as saying that "anyone who studies the history of the American West can teach Utah history." It also quotes the department chair, Eric Hinderaker, on the need to "differentiate between Mormon history and Mormon studies."
Of course we in Utah, no matter our religion or ethnicity, have an interest in how the history of our state and region is taught in the state's flagship university. The extent of our interest is informed by discussions underlying the perspectives of the aforementioned scholars.
So-called Western history has tended to be the hegemonic story of the expansion of the dominant U.S. society over the territory of the American West. As a result, until recently, as Professor Patricia Limerick of the University of Colorado notes, it has not taken account of other peoples inhabiting the region and their stories. Thus it did not grasp the story of Utes, Paiutes, Shoshones, Cheyenne, Hopis, Navajos, etc., in their struggles with the national economic and political project.
It did not include Hispanics, even though this land was part of Mexico and numerous people were forcibly joined to the U.S. when their homes became part of this country.
It has tended not to see half of the mainstream population: women.
Similarly, Western history tends to be a discourse with a missing center, a doughnut, to paraphrase Indiana historian Jan Shipps. It has ignored one of the most densely populated areas of the region along with its people's social dynamics and concerns. Both Shipps and Limerick claim it substantially ignores the Mormons.
The Mormon culture region does present some complications. The main one may stem from the fact that Mormons are both a people and a religion. Thus someone who can decipher this increasingly important corner of the American tapestry must have skills that join religious history with the more standard approaches of social, political and economic analysis.
For too long, Mormon history has been relegated to a ghetto called Mormon studies. Mormons are an increasingly important component of our national society. The influence of The Church of Jesus Christ of Latter-day Saints is growing in the U.S. and abroad. Its members are prominent in many fields of endeavor and, especially, in national politics. Even American history is becoming increasingly difficult to narrate without some understanding of Mormons.
Unlike standard Western history, Mormon history requires international skills. Whether because of the migration of large numbers of Welsh, English and Danes (on which Brigham Young depended for survival and which gave this region a distinctive population) or because more than half of all Mormons now live in other countries, a Mormon historian must have an awareness of international issues.
Standard Western history is simply not an adequate disciplinary base from which to study or teach about this people. Mormon history is broader than Western history. Furthermore, Western history increasingly needs Mormon history to come out of its ghetto and broaden the story of the American West.
This matters to all of us. It is not simply something for academics to argue about in dusty journals or at some academic coffeehouse.
A quality public university such as the U. has a dual mission. It must house representatives of the range of academic concerns that crisscross the globe with abandon. It also, for reasons of political survival if for no other, must represent the life and concerns of its host society in its research and teaching.
To be sure, the U. continues a tradition of positively engaging its local hosts, as I experienced when I was a visiting professor there. However, much still needs to be done.
It is not my place to intervene in affairs internal to the U.'s history department. However, as the discussion has become public, it is appropriate to weigh in. I feel strongly that Dean May should not be replaced by a mainstream Western historian. Such hegemonic history would break a relationship between the history department and its hosts. This is not about religion. It is about good, professional history and its relationship with a people.
Gearóid Ó hAllmhuráin, in emediawire (March 2004):
On March 17, Irish people all over the world celebrate the patron who won the souls of their ancestors over fifteen hundred years ago. Celebrated in America since at least 1737, St. Patrick is still one of the most illustrious of the uncanonised saints in the Christian pantheon. An icon to schools and dance halls, public buildings and street names all over the world, his feast day marks a focal point in the cultural calendar of over forty million Americans who claim Irish heritage. And yet, despite the festivities and kitsch, there is much concern in the Irish-American community that modern-day ‘traditions' more often perpetuate derogatory cultural stereotypes of the Irish. As colorful parades make their way down international thoroughfares from Dublin to San Francisco, from Montserrat to Western Australia, the historic figure of Patrick himself remains mysterious. However, behind the crass frivolity of green beer and seasonal shamrocks lies a compelling humanitarian message of goodwill and social justice which is as valid today as it was fifteen centuries ago when it was first penned by a little-known Roman cleric. ...

One of Patrick's greatest adversaries was Coroticus, a British king who pillaged the north of Ireland and carried off thousands of Patrick's converts, ‘the chrism still fragrant on their foreheads.' Patrick tried to have Coroticus condemned by the British bishops, hoping that isolation and excommunication would soften his resolve. The outcome of Patrick's appeal is unknown; however, the saint seems clearly frustrated by the indifference of the British hierarchy. Basking in the dying embers of the Roman Empire, this clerical intelligentsia was asked: ‘can it be that they do not believe that we have received one baptism, or that we have one God and Father? Is it a shameful thing in their eyes that we have been born in Ireland?'
Contrary to popular opinion, Patrick was not the first Christian missionary to work in Ireland. Trade relations between Ireland and Gaul (modern France) exposed the island to new spiritual influences. Likewise, the barbarian invasions of Roman Gaul during the early fifth century sent a stream of refugees and scholars to southern Britain and Ireland. Early Irish historians have argued incessantly as to whether there were two Patricks or one. There is also doubt as to when Patrick arrived in Ireland: whether it was in 432 or 456. It is certain, however, that he did arrive. This fact is confirmed by Patrick's own writings and especially in his Confessio, considered to be the first contemporary text in Irish history. This private document was one of the key sources used by his biographers Muirchú and Tíreachán in the second half of the seventh century.
Although Ireland lay beyond the civilizing influence of Rome, the Latin scholarship which Patrick brought to the Gaelic-speaking Irish took less than two centuries to usher in its own Golden Age of learning. By the early Middle Ages, Irish literati plied their scholarship from Iceland to Kiev and from Iona to the Mediterranean.
In this modern era, it is worth noting that both Catholic and Protestant traditions in Ireland embrace Patrick’s philosophy. Perhaps now, as the country enters a new era of peace and prosperity, we might revisit that philosophy again. After centuries of what the Ulster poet Michael Longley referred to as the ‘abnormality of cultural apartheid’ in Ireland, it behooves Irish people everywhere to embrace the message of the abducted slave who could so easily have chosen to remain indifferent to their voices over fifteen hundred years ago.
From the official newsletter of the Nixon Library & Birthplace (March 1, 2004):
There's yet another “new Nixon.”
Nixon Milner, born to Todd and Lisa Milner, arrived at 9:30 p.m. on December 18, 2003. With jet-black hair and deep blue eyes, this healthy newborn is one of Sydney, Australia's newest citizens.
The Nixon name is no coincidence. Todd, an equity dealer, and Lisa, an accountant, indeed named their baby after America's 37th President.
The admiration the Milners share for President Nixon stems from Todd's American roots and an appreciation of U.S. history. “I think Richard Nixon was President during a very interesting as well as a very difficult time,” said Todd Milner. “It amazes me how many people are not aware, or conveniently overlook, his foreign policy achievements.”
The Milners are proud of their son's chosen name and feel strongly about RN's legacy. “I think that President Nixon was a tremendous, tenacious politician with good intentions at heart,” said Todd. “He certainly played an important role in history, and I regard him highly as a President, and for his achievements in office.”
Todd views RN's historic trip to China and his 1968 presidential victory after losing the governorship of California as his greatest accomplishments. He also admires the late President for the way he “pulled himself back up after his resignation to contribute to society as an author and speaker.”
Since little Nixon's birth, the Milners have found that most people comment on what an original name they have chosen for their son. They look forward to telling their son about his namesake.
Lisa, born and raised in Sydney, met Todd through a mutual friend.
While Todd and Lisa have no immediate plans to travel to the U.S., they do plan on visiting when little Nixon is old enough to appreciate the trip. They hope to make it to Yorba Linda to introduce Nixon Milner to President Nixon and his legacy.
By the way, little Nixon's not the first Milner named after the President. Todd and Lisa used to have a pet poodle named Milhous.
Laura Miller, in Salon (March 1, 2004):
Most Americans -- at least, the ones who aren't addicted to the History Channel -- know about the bombing of Dresden in 1945 from Kurt Vonnegut's bestselling novel "Slaughterhouse-Five," based on Vonnegut's own experiences as a prisoner of war. The attack is still a touchstone for the moral perils of war. Frederick Taylor, a British historian whose new book on the subject goes on to challenge much of what we think we know about the bombing, describes the conventional understanding thus: "Dresden was the unforgivable thing our fathers did in the name of freedom and humanity, taking to the air to destroy a beautiful and, above all, innocent European city. This was the great blot on the Allies' war record, the one that could not be explained away."
"Slaughterhouse-Five" came out in 1969, a time when many Americans were wondering just how much carnage could be justified by the trumpeted ideals of democracy and freedom. Like Joseph Heller's"Catch-22,""Slaughterhouse-Five" is a book set during World War II that was read in the light of Vietnam. It wasn't the first time Dresden was seen as a proxy. Taylor writes that not long after the war's end, and certainly before that,"Dresden became one of the most well-placed pawns on [a] virtual propaganda chessboard." There is the real Dresden and the Dresden of legend. Taylor makes what is by all appearances a good-faith effort to excavate the former by digging through the many layers of the latter. His"Dresden: Tuesday, Feb. 13, 1945" aims to be the last word on the subject, though it's sure to be argued about for years to come.
Alisa Solomon, in the Village Voice (Feb. 25-March 2, 2004):
In a gesture that consolidates the 1990s culture wars, the post-9-11 chill on dissent, and the relentlessness of hawkishly pro-Israel lobbying, the U.S. House voted unanimously last fall to establish an advisory board to monitor how effectively campus international studies centers serve "national needs related to homeland security" and to assess whether they provide sufficient airtime to champions of American foreign policy. Currently the Senate Committee on Health, Education, Labor, and Pensions is considering a parallel provision for its upcoming higher education reauthorization bill. The bill will likely go to the floor in March.
Though it's just a few paragraphs in an arcane piece of routine legislation reauthorizing a relatively small amount of money to what's called "area studies," the advisory board provision represents an ominous offensive against academic freedom and oppositional views. For decades now, since the end of the McCarthy period that saw countless academics expelled from the classroom for their views and international research controlled by a Cold War agenda, the critical assault on left-leaning professors has been launched from books, articles, websites, and media broadcasts—unpleasant enough for the people targeted, but still the stuff of discourse. Even the creepy post-9-11 list of 40 profs accused by the American Council of Trustees and Alumni of giving comfort to America's adversaries turned out to have no teeth.
But the very possibility of legislation sounds old alarms anew. Even if the measure does not make it past the Senate—ranking Democrats on the panel don't expect it to get much traction—the very idea of ideological feds inspecting campus lecture halls takes the culture wars to a perilous new level.
The seven-member advisory board—which would include two appointees "from federal agencies that have national security responsibility"—would oversee the country's 118 international studies centers. This year, they shared about $95 million under Title VI of the Higher Education Act. Centers may use the funds only for graduate student fellowships, language instruction, and lectures and other public programs. They do not hire faculty or offer courses—traditional departments such as art history or political science do that. The centers then involve local faculty from across the disciplines who have expertise in such areas as Latin America, Russia, Africa, and East Asia. Only 17 of the nation's international studies centers focus on the Middle East—covering the Arab countries, Turkey, Israel, and Iran—but no one doubts that they are the intended targets of the legislation.
"The priority of those behind this is defending Israel from any criticism," says Zachary Lockman, director of the Hagop Kevorkian Center at New York University."They understand that universities are one of the few places where debate and argument take place that cannot be heard in the media or anywhere else."
Indeed, the most vociferous critics of the centers have been three right-wing Zionist think-tankers: Stanley Kurtz of the Hoover Institution and a columnist for the National Review Online; Martin Kramer, whose screed Ivory Towers on Sand: The Failure of Middle East Studies in America was published by the Washington Institute for Near East Policy; and Daniel Pipes of the Middle East Forum, whose website Campus Watch posted "dossiers" on professors whom Pipes deemed to hold unacceptable views on Islam, Palestinian rights, and U.S. or Israeli policy. Students were urged to send in reports on teachers who made any dubious remarks.
Chester E. Finn, Jr., on the website of the Fordham Institute; from the foreword to A Consumer's Guide to High School History Textbooks (Feb. 2004):
Within days of the terrorist attacks of September 11, 2001, major textbook publishers began scrambling to revise their high school history texts to include information about 9/11. An understandable, even commendable impulse, but it went badly. Because these hasty updates or supplements had to be written by early 2002 in order to be included in the 2003 editions of the textbooks—publishing timelines are nothing if not long and slow—by the time they reached classrooms just about all the information in them was obsolete.
Far more troubling, because textbook publishers bend over backwards not to offend anybody or upset special interest groups, the 9/11 information, like so much else in today's history texts, was simplified and sanitized. The reader would scarcely learn that anybody in particular had organized these savage attacks on innocent Americans and citizens of 80 other nations, much less why. The impression given by most textbooks was more like "a terrible thing happened"—reminiscent of the two-year-old gazing upon the shards of his mother's shattered glass vase and saying "It broke."
I've dubbed such verb usages the "irresponsible impersonal" voice and, regrettably, they're more norm than exception in U.S. history textbooks. As with the vase breaking, things happen in these books (though not necessarily in chronological order), but not because anybody causes them. Hence, nobody deserves admiration or contempt for having done something incredibly wonderful or abominably evil. No judgments need be made. (To judge, after all, might upset a person or group who disagrees with the judgment or dislikes the way it makes them or their ancestors look.) The result: fat, dull, boring books that mention everything but explain practically nothing; plenty of information but no sorting, prioritizing, or evaluating; and a collective loss of American memory.
World-history texts present similar problems. It's hard to name a culture or era that doesn't turn up somewhere in these sprawling compilations, but no real story is told. There's no thread, no priorities, no winnowing of the important from the trivial, the history-shaping from the incidental. It's as if a car's chrome trim and speaker system were equivalent to its chassis and engine.
Why does this matter? Some successful countries (Japan and Singapore come to mind) get by fine with slender curriculum guides rather than enormous textbooks. That's because their teachers are subject-matter experts in fields like history and, when supplied with guidance about what state or national standard-setters deem most important, can easily generate their own lessons and find their own materials. They don't depend on textbooks except as reference works.
That's not true in the United States, where few history teachers ever learned much history themselves. More than half of high school history teachers did not major or even minor in history in college. Instead, most studied education or psychology or sociology, perhaps with a focus on "social studies education." As a result, teachers charged with imparting essential information to young Americans about the history of their country and world must rely heavily on the textbooks available to them—often textbooks that teachers themselves had little to do with selecting. Because these texts end up serving as students' primary sources of information, it's vitally important that they be accurate and interesting, and that they establish a narrative of events with a strong sense of context. They must tell "the main story" without neglecting lesser stories that form part of an accurate picture of the past. What they must not be is sprawling, drab assemblages of disjointed information in which everything matters equally and nothing is truly important.
How many—if any—of today's textbooks live up to that obligation? Unfortunately, few independent reviews of textbooks have been conducted to help answer that question. Hence, we at the Thomas B. Fordham Institute, as part of our broader effort to strengthen history education and breathe new life into the moribund field of social studies, judged that it was time to look closely at widely used high school-level textbooks in American and world history. In the spirit of being constructive as well as critical, we judged that a competent appraisal would provide practical help to educators tasked with the selection of history texts, to parents concerned about their children's education, to policymakers, and even to publishers eyeing improvements in their products.
Jennifer Barnett Reed, in the Arkansas Times (Feb. 27, 2004):
We all know the elementary-school stories of the Underground Railroad: tales of Harriet Tubman's daring trips south to lead more slaves to freedom, of kindly white Northern abolitionists hiding fugitives in concealed cellar rooms, ushering them down secret passageways toward freedom.
But those stories, it turns out, are only the pale, one-dimensional, mythologized version of who and what made up the Underground Railroad.
The truth is a much richer epic: giant, sprawling, ever-shifting, the unwritten histories of slaves who ran and slaves who stayed behind, free blacks in the North, Native Americans in the West, and otherwise ambivalent whites moved to single acts of assistance.
The Underground Railroad was vaster and less formal than the memoirs of conductors suggest. Its routes started in the South, ushering slaves north to free states and Canada, but also west to Indian Territory and California, south to Mexico and the Spanish colony of Florida, aboard New Orleans steamships bound for the Caribbean, and onto New England whalers headed for the frigid waters off the coast of Alaska.
The National Park Service has been trying to recover and preserve this larger story since the mid-1990s. Its Network to Freedom project redefines the Underground Railroad broadly as "resistance to slavery through escape and flight" - in short, anything slaves did or used to steal away to freedom - and recognizes it as the first major chain of events in the fight for civil rights.
"This project is different from anything else the Park Service has done," said James Hill, head of the Network to Freedom region that includes Arkansas."There's not a land base or any one particular site at this point."
The Network is set up like the National Register of Historic Places, Hill said. It includes sites connected with the Underground Railroad - homes, churches, even the Great Dismal Swamp in North Carolina and Virginia - as well as educational programs and research facilities. There are over 125 so far in 25 states and the District of Columbia.
Ferreting out Underground Railroad sites has been easy in some Northern states (Ohio alone has 24 so far), Hill said. But they tell only part of the tale.
"To tell the story accurately you have to go down South, where people were leaving from," he said.
And that means Arkansas, where the Underground Railroad is a more elusive beast. A few histories of slavery in the state mention runaways, but no one's ever really sat down and studied the subject in detail.
Enter Charles Bolton, UALR professor and Arkansas history expert. Bolton has signed on with the National Park Service to do a three-year study of the state's fugitive slave phenomenon: how many and who they were, where they ran from and where they were headed, how they traveled and who might have helped them.
"Chances are [runaway slaves] were relying on other slaves and free blacks, instead of white abolitionists, in the South," Hill said."And the truth is, if it weren't for the desire for freedom of African-American slaves, there wouldn't have been any Underground Railroad."
Daniel Pipes, in the NY Sun (Feb. 24, 2004):
Here's a prime example, one that involves me personally, of how the radical Left and the Islamists, those new best friends, readily deceive.
It has to do with a proposed piece of U.S. legislation passed by the House, the "International Studies in Higher Education Act of 2003," known familiarly as H.R. 3077, and awaiting action by the Senate. H.R. 3077 calls for the creation of an advisory board to review the way in which roughly US$100 million in taxpayer money is spent annually on area studies, including Middle East studies, at the university level.
This board is needed for two reasons: Middle East studies are a failed field and the academics who consume these funds also happen to allocate them — a classic case of unaccountability. The purpose of this subsidy, which Congress increased by 26% after 9/11, is to help the American government with exotic language and cultural skills. Yet many universities reject this role, dismissing it as training "spies."
Martin Kramer pointed to the need for Congressional intervention in his 2001 book, Ivory Towers on Sand. Stanley Kurtz picked up the idea and made it happen in Washington, testifying at a key House hearing in June 2003.
My role in promoting this advisory board? Writing one favorable sentence on it eight months ago, based on the expectation that the board would create some accountability and help Congress carry out its own intent. While hoping the Senate passes H.R. 3077, I have otherwise done nothing to praise or lobby for this bill.
Well, that's the record. But why should mere facts get in the way? Seemingly convinced that turning H.R. 3077 into my personal initiative will help defeat it in the Senate, leftist and Islamist organizations have imaginatively puffed up my role.
- The American Civil Liberties Union accuses me of "enlisting the aid of the government" to impose my views on academia.
- The American-Arab Anti-Discrimination Committee titles its alert "Academic Freedom Under Attack by Pipes and Big Brother."
- The Council on American-Islamic Relations states that I am "actively pushing" for the advisory board.
This deception prompted campus newspapers — for example, at Columbia, CUNY, Swarthmore, and Yale — to link me to the bill, as have city newspapers such as the Berkshire Eagle and the Oregonian, Web sites, and listservs.
What these folks missed is my skepticism about the advisory board's potential to make a major difference. It is important symbolically and it can throw light on problems. But odds are it won't be able to solve them thoroughly.
I say this because, unlike comparable federal boards, this one has only advisory, not supervisory, powers. It also has limited authority, being specifically prohibited from considering curricula. Professors can teach politically one-sided courses, for example, without funding consequences. More broadly, such federal boards generally do too little. I have sat on two others and found them cumbersome bureaucratic mechanisms with limited impact.
Will a new board improve things? Sure. But Congress should consider more drastic solutions. One would be to revoke the post-9/11 $20 million annual supplement for area studies at universities, using this money instead to establish national resource centers focused on the global war on terror. These would usefully combine area expertise with a focus on militant Islam.
A second solution would be to zero out all government allocations for area studies. This step would barely affect the study of foreign cultures at universities, as the $100 million in federal money amounts to just 10% of the budget at most major centers, funds those centers could undoubtedly raise from private sources. But doing this would send the salutary message that the American taxpayer no longer wishes to pay for substandard work.
Either step would encourage younger scholars to retool in an effort to regain public trust and reopen the public purse.
If the advisory board is not the ideal solution, it is the best to be hoped for at the moment, given the power of the higher-education lobby. I am ready to give H.R. 3077 a chance. But should the board not come into existence or fail to make a difference, I'll advocate the better solution — defunding — and work to spread these ideas among the public and in Congress. My opponents will then learn what happens when truly I am "actively pushing" for Congress to adopt a measure.
This article is reprinted with the permission of Daniel Pipes; it first appeared in the New York Sun.
Clarence Page, in the Chicago Tribune (Feb. 25, 2004):
Black History Month was never intended to make people uncomfortable--unless maybe they ought to be.
Nevertheless, despite the best of intentions, a misunderstanding of what the month is all about can sometimes lead to a whopper of an embarrassment.
That's sort of what happened recently at Connecticut's Suffield High School when a group of sociology students decided to hang posters around the school to promote April as "White History Month."
Shortly after they were nabbed, the five students explained to their upset principal, Thomas Jones, that, alas, it was all a misunderstanding, according to The Hartford Courant. The students had been assigned to "explore the effect of rumors." They decided the posters would be a real nifty way to do that. Needless to say, their experiment triggered a lot of rumors, especially among the school's small but understandably alarmed black student population.
The principal scolded the white students for their insensitivity and turned them over to a teacher who reportedly specializes in civil rights and cultural sensitivity issues. In this way, the school at large was able to turn the incident into what one school board official called a "teachable moment," an opportunity to educate both offenders and the offended about differences in how the world looks through each other's eyes.
Good for them. No long-term harm done, I hope. This particular high school poster flap is the most embarrassing incident related to Black History Month that I can recall since early 2001. That was when then-Virginia Gov. Jim Gilmore revoked a proclamation declaring May to be "European Heritage and History Month." The governor had learned to his deep dismay that the request for the commemoration had come from a white separatist group headed by former Ku Klux Klan leader David Duke. Such an embarrassment.
But you don't need to be a Klansman, past or present, to ask "Why don't we have a White History Month?" I've heard that question quite a few times over the years. So have other black people I know. Some of us have come up with a list of appropriate responses to it, such as:
1."Because every month is white history month."
2."Because white history has not been lost, stolen or suppressed over the years as much as black history has."
4."History is taught so poorly in our schools these days that maybe we should have a white history month."
5."That's right. I said, `Yo' mama'!!!"
Now, now. We should all try to manage our anger at such moments. Such encounters reveal precisely what Black History Month was intended to remedy: an ignorance about history--black and otherwise. That's why I oppose so-called "political correctness." We need more dialogue, not less....
When the late black scholar Carter G. Woodson dreamed up what was then called Negro History Week in 1926, he too dreamed of the day when it no longer would be needed.
He imagined a day when every student's education would include such African-American figures as Crispus Attucks, who died in the Boston Massacre; Matthew A. Henson, who co-discovered the North Pole with Robert Peary; and Benjamin Banneker, the pioneer scientist who helped conduct the first survey of Washington.
It was important, Woodson felt, that African-Americans understand that we had more to our history than our victimization. In fact, there was a much greater all-American story to be told in how mightily many of our ancestors had triumphed despite adversity.
Woodson imagined a day when the contributions of people from various races, ethnicities and, for that matter, genders would be taught fairly and properly. Then Americans might move more swiftly toward a society where such differences would no longer matter.
Unfortunately, history seems to be given such a low priority in today's schools that I sometimes wonder whether Woodson's dream day is slipping further away.
Jonathan Thompson, in the London Independent (Feb. 22, 2004):
Simon Schama, one of Britain's best-known historians, has accused fellow academics of making the subject too dull.
Professor Schama is calling for a return to a "golden age" of historians of the calibre of Gibbon, Macaulay and Carlyle. He says modern-day historians - with a few notable exceptions - have lost the ability to inspire the public with tales of the past in the same way as their predecessors.
"History's adventure has become a bit lost," said Professor Schama. "It's not as explosive or exciting as it used to be. What we need to recover is our reckless literary courage." He blamed the subject's demise on the "juggernaut of academic history" which is obsessed with scientific data and obsessive footnotes rather than good storytelling.
On the eve of his new BBC television series, Historians of Genius, Professor Schama holds up the great historians of the 18th and 19th centuries as examples of how history should be written. Three of these - Thomas Babington Macaulay, author of The History of England, Edward Gibbon, author of The Decline and Fall of the Roman Empire, and Thomas Carlyle, who wrote The French Revolution - are dealt with in depth during the series, which begins on BBC4 tomorrow evening. Professor Schama won great acclaim for two lengthy and highly popular television series on the history of Britain.
"Macaulay was the first person to really bring history to the mass reading public," said Professor Schama. "His writing had great propulsion and he was very proud of the fact that he could entertain people without making anything up.
"Carlyle, too, was an exhilarating read - a fantastic dramatist. It was as though he kicked open a window and dumped you in the very room you were reading about."
But Professor Schama, who teaches at New York's Columbia University, said today's books pale in comparison with those of the Victorian era. Although he conceded that narrative British history does have some proponents - notably Antony Beevor and Niall Ferguson - these are the exception rather than the rule.
"Popular narrative history was never entirely lost. It was just condescended to by the juggernaut of academic history which seemed to dictate how and to whom you wrote."
Professor Schama's comments look certain to provoke argument within academic institutions, but he received support last night from another high-profile historian - David Starkey, the Cambridge don and television presenter.
"Undoubtedly, academic history is deadly," said Dr Starkey. "A lot of books have become rarely animated footnotes. In fact, they should really be written upside down, with the footnotes at the top and a drip of text underneath. Footnotes aren't new, but what is new is our worship of them."
Dr Starkey, whose series on Elizabeth I and the six wives of Henry VIII helped win him the title of Britain's highest-paid TV presenter, added: "When I was at university, writing a readable book was seen as the height of frivolity: academics were taught to write for each other. There's this entire lack of public presence. Things like revisionism have led to utter trivialisation in the name of scholarship."
Professor Niall Ferguson, whose recent Channel 4 series on the British Empire led to him being nicknamed "The Errol Flynn of British history", was more guarded on the perceived crisis. "Happily, most of my colleagues understand that to reach a mass audience, one must make certain sacrifices", said Professor Ferguson, who recently left Oxford University to teach in New York. "For example, sacrifice of footnotes and of long historiographical introductions. And one must also strive to write rather more fluently - indeed grippingly - than is usual in the academic world."
Despite Professor Schama's criticism, record numbers of history books were sold in the UK last year. According to figures from The Bookseller, sales in 2003 totalled £32m - or 3 per cent of the total market.
Last night, the president of the Royal Historical Society, Professor J L Nelson, denied claims that history isn't as good as it used to be. "History is very much more diverse in the things it covers now," she said. "There are more people studying history - it's more popular than ever.
"People do write in a different style nowadays. There's not such a large vocabulary, but that doesn't mean we're not writing interesting things."
But Bettany Hughes, presenter of The Spartans and one of the new generation of television historians, was less convinced: "There is definitely a danger that academics will simply sharpen their wits on each other. Throughout the 1970s, history became increasingly scientific, focusing on data, process and analysis rather than on comment and argument. There was almost a denial that you could be passionate about your subject.
"We need to reinforce the idea that it is the study of the living rather than the study of the dead."
Dru Sefton, for the Newhouse News Service (Feb. 2004):
Rosie the Riveter -- the collective nickname evokes images of American women going from kitchens to factories in a home-front effort to win World War II, then relinquishing those jobs to returning soldiers after fighting ceased.
But the reality is far more nuanced, shaped by a massive propaganda effort, entrenched gender and race issues, and the need to move into a postwar consumer economy.
"Generalizations assume that all Rosies were the same, with the same motivations," said Sherna Berger Gluck, author of"Rosie the Riveter Revisited: Women, the War and Social Change."
Not so, Gluck said. Black women, for instance,"had a chance to earn very good money for the first time, and they wanted to keep those jobs."
The experiences of women who toiled stateside will be part of the new Rosie the Riveter/WWII Home Front National Historical Park, under construction in Richmond, Calif. The park, which focuses on the entire home-front experience during the war, is gathering stories and artifacts for its Rosies section. Assistance in the effort is provided by sponsors including Ford Motor Co. (www.ford.com/go/rosie).
The 150-acre park on San Francisco Bay north of Oakland is at the site of four Henry Kaiser Shipyards that produced 747 ships, more than any other domestic wartime complex. A Ford assembly plant was also there; workers built nearly 50,000 Jeeps and 90,000 tanks.
The home-front effort "was an enormously important part of the war story," said Judy Hart, park superintendent. "It was a time of great progress in equal opportunity, mixed with wide and deep discrimination. It was working through, on a personal, one-on-one basis, layers of prejudice and preconceived ideas, that so deeply and permanently changed America forever."
The memorial will strive to capture the diversity of the Rosies. "A story this rich and nuanced is best told by the individual," Hart said. "Once visitors listen to a number of stories, they will understand just how varied the experience was."
The number of employed women went from 12.8 million in 1940 to 13.9 million in '42, then 15.6 million in April '43 and 17.7 million in July '43, according to 1943 War Manpower Commission figures. Later statistics showed the peak in 1944 at 18.4 million -- more than 35 percent of the work force.
And each Rosie had "very different experiences, depending on where she was in her life cycle," said Gluck, director of the Oral History Program at California State University, Long Beach. She spent four years listening to Rosies before her book, a collection of reminiscences, came out in 1987.
After the war, "Some stayed in the work force, some got married, some went back home, others went home but returned to work in the 1950s," Gluck said.
It's hard to get precise figures, Gluck said, but it's estimated that half the women working on aircraft production in Los Angeles during the Korean War were former Rosies.
Ron Chernow, in the NYT (Feb. 22, 2004):
As the Democratic primaries reach a critical stage, partisan spirit is running high, and the presidential campaign is already verging on blood sport. George Washington's birthday today serves as a reminder of how presidents can transcend politics and embody the national spirit.
From the time he was recruited as commander in chief in 1775, Washington personified the often tenuous hope of unity among the 13 fractious colonies. With most of the early patriot blood spilled in Massachusetts, the second Continental Congress wanted a Southern general who could lend a national imprint to the struggle. Washington shed his Virginia identity and forged a Continental Army that tutored its green recruits into thinking of themselves as Americans.
It is impossible to assess Washington's career without stumbling over the words "unity" and "unanimity" at every turn. He was unanimously chosen as president of the Constitutional Convention in Philadelphia, where he presided with customary tact. Since it was assumed that Washington would be the first president, his taciturn but resolute presence reconciled many skittish delegates to the vast powers invested in the executive branch. Twice in a row, in 1789 and 1792, the Electoral College elected him president by a unanimous vote, confirming his status as a political deity who seemed to hover above the petty feuds of lesser mortals.
Nevertheless, Americans today tend to take George Washington for granted. He seems less soulful than Lincoln, less robust than Theodore Roosevelt, less charismatic than Franklin Roosevelt. His bloodless image as a remote, Mount Rushmore of a man — partly a byproduct of the craggy face recreated endlessly by Gilbert Stuart — has worked to obscure the magnitude of his achievement. Too often Washington seems a dull, phlegmatic figure, wooden if worthy, whose self-command stemmed from an essential lack of inner fire.
In fact, Washington was a strong-willed, hot-blooded personality. "I wish I could say that he governs his temper," a rich Virginian told Washington's mother when George was 16 years old. "He is subject to attacks of anger and provocation, sometimes without just cause." The young man mastered his wayward emotions by reading history, studying deportment, and learning how to dance and dress smartly. Like other founders, Washington was an ambitious, insecure provincial, committed to a strenuous regimen of self-improvement.
Over time, Washington would retreat behind an iron mask of self-control. Alexander Hamilton, his chief aide for four years during the Revolution, glimpsed the well-concealed inner man and found him unbearably moody and irritable. As with many passionate but guarded personalities, Washington sometimes burst out unexpectedly in anger....
This prodigious self-restraint enabled Washington to rise above the sectional strife that threatened to tear the 13 states apart. He adopted a detached, even cryptic facade to resist association with any particular faction or interest. In a noisy world of blustering politicos, he possessed the "gift of silence," as John Adams phrased it. Washington articulated his secret succinctly: "With me it has always been a maxim rather to let my designs appear from my works than by my expressions." ...
In his farewell address in 1796, Washington warned against "the common and continual mischiefs of the spirit of party." By this point, however, it was abundantly clear that the two-party system was here to stay. During his single-term presidency, John Adams, a nominal Federalist, tried in vain to perpetuate the notion of a president above party labels. When his successor, Thomas Jefferson, was inaugurated, he intoned famously, "We are all Republicans, we are all Federalists" — a neat rhetorical flourish that thinly disguised his status as the first president to head a political party.
Ever since, the occupants of the White House have experienced an uneasy tension between their role as party leader and as president of all of the people. George Washington never doubted which role should come first.
David C. Unger, in the NYT (Feb. 15, 2004):
ISRAEL'S brief history falls into two periods. Four heroic wars shaped its first quarter-century. Defeat in any could have brought the end of the Jewish state. Yet Israel emerged victorious from all of them, each time extending, even if only temporarily, the amount of territory under its control. The second period, still under way, has been less dangerous but more frustrating as Israel has struggled to translate military strength and territorial gains into real security and diplomatic recognition by its Arab neighbors.
The Arab-Israeli war of October 1973 forms the hinge between these two periods. Never did the prospect of Israel's destruction seem more imminent than in the first days after Egypt and Syria's devastatingly effective surprise attack. Israel rallied its reserves, recovered its losses and added new territory along the critical Golan frontier. But Egypt, a country humiliated and demoralized by Israel's crushing victory in June 1967, regained enough of its pride to pursue peace. A direct line runs from the Egyptian Army's crossing of the Suez Canal in 1973 to Anwar Sadat's historic visit to Jerusalem four years later, and then on to the first formal peace treaty between Israel and an Arab state. Underscoring those connections, Islamic opponents of that treaty assassinated Sadat at an October 1981 military parade marking the anniversary of the 1973 war.
Sadat's example challenges the lifelong belief of Ariel Sharon, Israel's current prime minister, that Arab nations will not make peace with Israel unless they have been so thoroughly beaten and humiliated that they internalize the certainty of defeat. But in the absence of extraordinary leaders like Sadat, recovered Arab pride is no sure formula for peace. Consider President Hafez al-Assad of Syria, whose armies did as well as Egypt's during the early days of the Yom Kippur War, but who never followed Sadat's path to peace. The lessons of the war point in no single direction, but they have much to teach Israelis, Arabs and all who yearn for a comprehensive Middle East peace.
Two new books re-examine the events and lessons of the war. Abraham Rabinovich, the author of "The Yom Kippur War," is an American who moved to Jerusalem in 1967 and covered the 1973 war for The Jerusalem Post. Howard Blum, who has written "The Eve of Destruction," is a contributing editor of Vanity Fair. Both aim to knit together the military, strategic and political levels of the war much as Michael B. Oren's "Six Days of War" did for the June 1967 war. Neither fully equals Oren's magisterial achievement. But these authors are working with more difficult material. Sixteen days of grueling back-and-forth combat is not as inherently dramatic as six epochal days in which large Arab armies seemed to melt miraculously before Israeli arms. On their own terms, both of these books offer lively and informative accounts of a pivotal conflict.