Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: NYT (2-27-10)
The Islamic Solidarity Games, the Olympics of the Muslim world, which were to be held in Iran in April, have been called off by the Arab states because Tehran inscribed “Persian Gulf” on the tournament’s official logo and medals.
It’s a small but telling controversy. It gives the lie to the idea of the Islamic world as a bloc united by religious values that are hostile to the West. It also offers clues as to how the United States and its allies should handle two of their most urgent foreign policy matters: the Iranian nuclear program and the Israeli-Palestinian conflict....
In this history of a single body of water, one sees a perfect example of the so-called Islamic Paradox that dates from the seventh century. For although the Prophet Muhammad took great pains to underscore the equality of all believers regardless of ethnicity, categorically forbidding any fighting among the believers, his precepts have been constantly and blatantly violated.
It took a mere 24 years after the Prophet’s death for the head of the universal Islamic community, the caliph Uthman, to be murdered by political rivals. This opened the floodgates to infighting within the House of Islam that has never ceased. Likewise, there has been no overarching Islamic solidarity transcending the multitude of parochial loyalties — to one’s clan, tribe, village, family or nation. Thus, for example, not only do Arabs consider themselves superior to all other Muslims, but inhabitants of Hijaz, the northwestern part of the Arabian Peninsula and Islam’s birthplace, regard themselves as the only true Arabs and tend to be highly disparaging of all other Arabic-speaking communities.
Nor, for that matter, has the House of Islam ever formed a unified front vis-à-vis the House of War (as Muslims call the rest of the world). Even during the Crusades, the supposed height of the “clash of civilizations,” Christian and Muslim rulers freely collaborated across the religious divide, often finding themselves aligned with members of the rival religion against their co-religionists. While the legendary Saladin himself was busy eradicating the Latin Kingdom of Jerusalem, for example, he was closely aligned with the Byzantine Empire, the foremost representative of Christendom’s claim to universalism.
This pattern of pragmatic cooperation reached its peak during the 19th century, when the Ottoman Empire relied on Western economic and military support to survive. (The Charge of the Light Brigade of 1854 was, at its heart, part of a French-British effort to keep the Ottomans from falling under Russian hegemony.) It has also become a central feature of 20th- and 21st-century Middle Eastern politics....
Beyond the customary lip service about Western imperialism and “Crusaderism,” most other Muslim countries would be quietly relieved to see the extremist regime checked. It’s worth noting that the two dominant Arab states, Egypt and Saudi Arabia, have been at the forefront of recent international efforts to contain Iran’s nuclear ambitions....
Posted on: Sunday, February 28, 2010 - 16:09
SOURCE: LA Times (2-28-10)
For centuries, historians, political theorists, anthropologists and the public have tended to think about the political process in seasonal, cyclical terms. From Polybius to Paul Kennedy, from ancient Rome to imperial Britain, we discern a rhythm to history. Great powers, like great men, are born, rise, reign and then gradually wane. No matter whether civilizations decline culturally, economically or ecologically, their downfalls are protracted.
In the same way, the challenges that face the United States are often represented as slow-burning. It is the steady march of demographics -- which is driving up the ratio of retirees to workers -- not bad policy that condemns the public finances of the United States to sink deeper into the red. It is the inexorable growth of China's economy, not American stagnation, that will make the gross domestic product of the People's Republic larger than that of the United States by 2027.
As for climate change, the day of reckoning could be as much as a century away. These threats seem very remote compared with the time frame for the deployment of U.S. soldiers to Afghanistan, in which the unit of account is months, not years, much less decades.
But what if history is not cyclical and slow-moving but arrhythmic -- at times almost stationary but also capable of accelerating suddenly, like a sports car? What if collapse does not arrive over a number of centuries but comes suddenly, like a thief in the night?
Great powers are complex systems, made up of a very large number of interacting components that are asymmetrically organized, which means their construction more resembles a termite hill than an Egyptian pyramid. They operate somewhere between order and disorder. Such systems can appear to operate quite stably for some time; they seem to be in equilibrium but are, in fact, constantly adapting. But there comes a moment when complex systems "go critical." A very small trigger can set off a "phase transition" from a benign equilibrium to a crisis -- a single grain of sand causes a whole pile to collapse.
Not long after such crises happen, historians arrive on the scene. They are the scholars who specialize in the study of "fat tail" events -- the low-frequency, high-impact historical moments, the ones that are by definition outside the norm and that therefore inhabit the "tails" of probability distributions -- such as wars, revolutions, financial crashes and imperial collapses. But historians often misunderstand complexity in decoding these events. They are trained to explain calamity in terms of long-term causes, often dating back decades. This is what Nassim Taleb rightly condemned in "The Black Swan" as "the narrative fallacy."
In reality, most of the fat-tail phenomena that historians study are not the climaxes of prolonged and deterministic story lines; instead, they represent perturbations, and sometimes the complete breakdowns, of complex systems.
To understand complexity, it is helpful to examine how natural scientists use the concept. Think of the spontaneous organization of termites, which allows them to construct complex hills and nests, or the fractal geometry of water molecules as they form intricate snowflakes. Human intelligence itself is a complex system, a product of the interaction of billions of neurons in the central nervous system.
All these complex systems share certain characteristics. A small input to such a system can produce huge, often unanticipated changes -- what scientists call "the amplifier effect." Causal relationships are often nonlinear, which means that traditional methods of generalizing through observation are of little use. Thus, when things go wrong in a complex system, the scale of disruption is nearly impossible to anticipate.
There is no such thing as a typical or average forest fire, for example. To use the jargon of modern physics, a forest before a fire is in a state of "self-organized criticality": It is teetering on the verge of a breakdown, but the size of the breakdown is unknown. Will there be a small fire or a huge one? It is nearly impossible to predict. The key point is that in such systems, a relatively minor shock can cause a disproportionate disruption.
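[Editorial illustration, not part of the excerpt: the sand-pile image Ferguson invokes is usually traced to the Bak-Tang-Wiesenfeld model of self-organized criticality. Below is a minimal Python sketch of that model; the grid size, toppling threshold and number of grains dropped are arbitrary assumptions chosen for illustration. Grains land one at a time at random cells; any cell reaching four grains topples, shedding one grain to each neighbor, so statistically identical inputs produce anything from no avalanche at all to a cascade across the whole grid.]

```python
# A minimal sketch of the Bak-Tang-Wiesenfeld sandpile model of
# self-organized criticality (illustrative; all parameters are assumptions).
import random

SIZE = 20        # grid dimension (arbitrary choice for the sketch)
THRESHOLD = 4    # a cell topples once it holds this many grains

grid = [[0] * SIZE for _ in range(SIZE)]

def drop_grain():
    """Drop one grain at a random cell; return the avalanche size."""
    x, y = random.randrange(SIZE), random.randrange(SIZE)
    grid[x][y] += 1
    topples = 0
    unstable = [(x, y)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < THRESHOLD:
            continue
        # Topple: shed one grain to each neighbor; edge grains fall off.
        grid[i][j] -= THRESHOLD
        topples += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < SIZE and 0 <= nj < SIZE:
                grid[ni][nj] += 1
                if grid[ni][nj] >= THRESHOLD:
                    unstable.append((ni, nj))
    return topples

# Let the pile organize itself toward the critical state, then compare
# the typical response to a single grain with the extreme response.
sizes = [drop_grain() for _ in range(20000)]
print("median avalanche:", sorted(sizes)[len(sizes) // 2])
print("largest avalanche:", max(sizes))
```

Run it and the point of the passage shows up in the numbers: most grains cause little or nothing, while an occasional, identical grain sets off a system-wide collapse, with no "typical" avalanche size in between.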
Any large-scale political unit is a complex system. Most great empires have a nominal central authority -- either a hereditary emperor or an elected president -- but in practice the power of any individual ruler is a function of the network of economic, social and political relations over which he or she presides. As such, empires exhibit many of the characteristics of other complex adaptive systems -- including the tendency to move from stability to instability quite suddenly.
The most recent and familiar example of precipitous decline is the collapse of the Soviet Union. With the benefit of hindsight, historians have traced all kinds of rot within the Soviet system back to the Brezhnev era and beyond. Perhaps, as the historian and political scientist Stephen Kotkin has argued, it was only the high oil prices of the 1970s that "averted Armageddon." But this did not seem to be the case at the time. The Soviet nuclear arsenal was larger than the U.S. stockpile. And governments in what was then called the Third World, from Vietnam to Nicaragua, had been tilting in the Soviets' favor for most of the previous 20 years.
Yet, less than five years after Mikhail Gorbachev took power, the Soviet imperium in central and Eastern Europe had fallen apart, followed by the Soviet Union itself in 1991. If ever an empire fell off a cliff, rather than gently declining, it was the one founded by Lenin.
If empires are complex systems that sooner or later succumb to sudden and catastrophic malfunctions, what are the implications for the United States today?...
Posted on: Sunday, February 28, 2010 - 14:25
SOURCE: The Nation (2-25-10)
Last summer Robert Proctor, a Stanford professor who studies the history of tobacco, was surprised to receive court papers accusing him of witness tampering and witness intimidation, along with a subpoena for his unfinished book manuscript. Then in January he got another subpoena, this one for three years of e-mails with a colleague, and also for his computer hard drive. Attorneys for R.J. Reynolds and Philip Morris USA are trying to get him barred from testifying in a Florida court as an expert witness on behalf of a smoker with cancer who is suing the companies.
Proctor hadn't tampered with any witnesses; all he had done was e-mail a colleague at the University of Florida asking about grad students there who were doing research for Big Tobacco's legal defense. But he's had to hire his own lawyers and spend days in depositions, defending himself from the charges. He told me he had recently spent "sixteen hours under oath, twelve lawyers in a room overlooking San Francisco Bay, a million dollars spent on deposing me and going after these e-mails."
There's a reason Big Tobacco would like to keep Proctor out of the courtroom. He's one of only two historians who currently testify on behalf of smokers with cancer--while forty historians have testified on behalf of the tobacco industry. In 1999 Proctor became the first historian to testify against Big Tobacco, and over the past ten years he has testified in fifteen cases. He's published several books, including Cancer Wars: How Politics Shapes What We Know and Don't Know About Cancer (1995), and in his co-edited book, Agnotology: The Making and Unmaking of Ignorance (2008), he examines "the tobacco industry's efforts to manufacture doubt about the hazards of smoking." He's also a fellow of the prestigious American Academy of Arts and Sciences.
The harassment of Proctor by Big Tobacco's law firms reflects the new landscape of litigation over the health hazards of smoking. In the previous chapter of this long-running story, forty-six state attorneys general reached a master settlement of $246 billion with Big Tobacco in 1998 as compensation for states' expenditures on cancer caused by tobacco. The next year the Clinton Justice Department filed a federal lawsuit, U.S. v. Philip Morris et al., which was decided in 2006 by Judge Gladys Kessler in federal district court in Washington. She ruled that for fifty years the tobacco companies had "lied, misrepresented and deceived the American public...about the devastating health effects of smoking." In late February both sides asked the Supreme Court to review that case.
Meanwhile, plaintiffs' attorneys were working on a national class-action suit, Engle v. R.J. Reynolds, on behalf of smokers with cancer. But Florida's Third District Court of Appeal limited the suit to Florida smokers, and in 2000 jurors awarded the class $145 billion, the largest punitive damage jury award in US history. In 2006 the Florida Supreme Court let the jury's findings against the companies stand but threw out the award, dissolved the class and said each case had to be tried separately. As a result, there's a lot of tobacco litigation going on in Florida right now--potentially 9,000 lawsuits. In one of the first of those "Engle progeny" cases, a Fort Lauderdale jury in November awarded Lucinda Naugle $300 million. Proctor is scheduled to testify in another.
In these cases, history has become a key component in the tobacco attorneys' defense strategy. In the past, when smokers with cancer sued for damages, the companies said they shouldn't have to pay, because there was a "scientific controversy" about whether smoking causes cancer. But in recent years they have given up that argument and now argue something like the opposite: "everybody knew" smoking causes cancer. So if you got cancer from smoking, it's your own fault.
To persuade juries, they need historians--experts who, for example, can testify that newspapers in the plaintiff's hometown ran articles about the health hazards of smoking in the 1940s or '50s or '60s, when he or she started. So Big Tobacco has been spending a lot of money hiring historians--and is stepping up the harassment of Proctor.
The charges of witness tampering and witness harassment concerned history grad students at the University of Florida who had been hired to do research for Big Tobacco by Gregg Michel, a historian at the University of Texas, San Antonio. Proctor learned about the grad students from Michel's deposition. (Michel did not respond to requests for an interview.) "I e-mailed a colleague at the University of Florida asking about this," Proctor said--Betty Smocovitis, a historian of science. "She wrote back and said she was horrified. Said it couldn't be true. Then she found that it was."
The next thing Proctor knew, tobacco attorneys were telling a court in Florida last June that Proctor, simply by e-mailing his colleague, had engaged in an "unethical" campaign of "intimidation," seeking "to malign and harass graduate students who serve as research assistants." As a result, one of the students who had been asked by the department chair about the job had "voiced doubts whether she should continue working" for Big Tobacco. Proctor's e-mail, they told the court, therefore constituted an "improper" effort to "influence, interfere or intimidate" a witness for the defendants.
They also subpoenaed Smocovitis, hoping to get her to say that Proctor had been threatening to "out" the grad students in question. At her deposition, she told me, she told tobacco attorneys that "Robert Proctor never said he would name names, and I don't believe he ever intended to. He's not out to get grad students." She recalled that during a break in her deposition, when the tobacco attorneys "saw they were not getting what they wanted from me about Proctor, they screamed across the table, 'We're going to get him. He's never going to testify again!'"
In the end, the judge ordered Proctor to hand over the e-mails--all ten of them. Nothing improper was found, no witness tampering or intimidation, and the tobacco attorneys dropped the issue--for a while.
In August, when attorneys for R.J. Reynolds subpoenaed Proctor's unpublished work-in-progress, a history of global tobacco, The Chronicle of Higher Education said the subpoena had "major implications for scholars and publishers." Ordinarily litigants are entitled to have everything relevant to prepare their case, and the tobacco attorneys said they needed Proctor's manuscript. Proctor replied that forcing him to release his unfinished manuscript would violate his academic freedom, his privacy rights and his freedom of speech. The Florida court agreed with him in a November ruling; the judge held that an author has a constitutional right to choose when and where his writings are published. (In that ruling the judge cited a 1985 Supreme Court ruling that Harper & Row's right to control publication of Gerald Ford's memoirs superseded the First Amendment right of a magazine to publish excerpts without authorization--the loser in that case was The Nation.) But the fact remains that Proctor was forced by R.J. Reynolds attorneys to spend time and money fighting harassment-by-subpoena.
And it's not over yet, according to the plaintiffs' attorney, William Ogle. If R.J. Reynolds loses a jury verdict in the trial at which Proctor will testify, the company will almost certainly appeal, on the grounds that it should have been given the book manuscript. "So the issue will be litigated again in the court of appeals," Ogle said. "Then they could take it to the Supreme Court of Florida, and to the US Supreme Court." And since cases are being argued all over the state, "they could raise it again in Daytona Beach, Tampa, Fort Lauderdale or Miami--anywhere Proctor is scheduled to testify."
The same legal filing that accused Proctor of witness tampering also argued that he had "already caused a mistrial...by gratuitously injecting...racial slurs into his testimony to impugn defendants." That's another example of the tactics practiced by tobacco lawyers. Proctor was the leadoff witness in the first of the "Engle progeny" cases in Florida, the follow-up to the class-action suit with the $145 billion verdict. On the stand Proctor began to explain racism in tobacco marketing. He started to say that the companies had marketed products called Nigger-Head Tobacco and Nigger-Hair Tobacco--brands that existed as late as the 1960s. But a Philip Morris attorney, objecting that Proctor had injected racial slurs into the courtroom, demanded a mistrial--and got it. The judge ruled that Proctor's utterance of those words was "prejudicial."
If Proctor had been found to have engaged in witness tampering or witness intimidation in the case of the Florida grad students, he would probably not work again as an expert witness. Then there would be only one historian left who testifies against Big Tobacco: Louis Kyriakoudes.
Kyriakoudes, who has faced a similar campaign of harassment and intimidation, is in a more vulnerable position than Proctor. He's not a full professor or a fellow of the American Academy of Arts and Sciences; he's an associate professor of history at the University of Southern Mississippi. He's published one book and is writing a second, Why We Smoked: Culture, History, and the North American Origins of the Global Cigarette Epidemic. He's also published many articles in scholarly journals--notably, research about tobacco advertising and about historians as tobacco experts.
Kyriakoudes was also harassed over the University of Florida grad student researchers. His offense: sending Proctor the deposition--which is public information--in which Proctor found the names of the students. Tobacco attorneys told a judge in Broward County that this was grounds for excluding Kyriakoudes as an expert witness. The judge rejected that motion in October. But, Kyriakoudes told me, "since last January I've been deposed by the other side at least seven or eight times." The tobacco attorneys' strategy, it appears, is to make it so time-consuming for him to continue that he will conclude it's not worth it. And it's had an effect: "I've cut back a lot of what I've been doing," Kyriakoudes told me in mid-February. "They hit me pretty hard, making it difficult to do my research. So I've pulled out of cases. I cut back to one or two trials a year. Harassment is effective."
One more historian has testified against Big Tobacco: Allan Brandt. But he testified only once. His 2007 book, The Cigarette Century, won several awards. Brandt is now dean of the Graduate School of Arts and Sciences at Harvard and a professor of the history of medicine and the history of science. He has not testified in a case since U.S. v. Philip Morris in 2003. When I asked why, he said, "That case appealed to me because it was the United States bringing a case against all the tobacco companies, a case on behalf of the American public, a historic case." But "it's enormously time-consuming and labor-intensive to testify," he said. And, as he explained in his book, "I had no interest in becoming an expert witness.... I did not want my scholarship to be dismissed as 'advocacy.'"
Brandt changed his mind, he explains in his book, after he saw the arguments offered by historians working for the tobacco companies, people like Lacy Ford of the University of South Carolina, who "had published no research at all" on the subject. That left Brandt with a feeling of "disgust." (Ford declined to comment for this story.) And he was "appalled" at the defense of tobacco companies offered by Kenneth Ludmerer, a historian of medicine at Washington University in St. Louis, who was an expert for Philip Morris. Brandt considered Ludmerer's testimony to be bordering on "historical malpractice" because he "has never published on the history of tobacco, on lung cancer, on the impact of tobacco on health, or on the industry's claims about smoking and health." So Brandt agreed to testify for the government in U.S. v. Philip Morris.
When I asked Ludmerer about Brandt's criticism of his testimony in U.S. v. Philip Morris, he replied, "Where is civility in this country? These ad hominem attacks are injurious. I had coronary artery bypass surgery in 2005. I'm sure a lot of the disease came from tension from the comments people made about my testimony. I've never done anything other than serve the public interest." He added, "I was hoping the tobacco industry would lose." But then why did he testify for the industry? "I considered it honorable to stand up for doing history properly," he answered. I asked how much he had been paid by Big Tobacco for working as an expert witness. "Maybe $500,000," he said. (Patricia Cohen of the New York Times reported in 2003 that he had earned "more than $550,000.")
Brandt decided not to testify in any other cases because, he said, "I found my time on the stand highly frustrating." The cross-examination and the media coverage left him feeling "a bit bruised." And in the meantime, Bill Clinton, whose Justice Department brought the suit, had left office and the new Bush administration ordered the government trial team to reduce its claim for damages from $280 billion to $10 billion--a tremendous victory for Big Tobacco, which was celebrated on Wall Street. (In late February the Obama Justice Department asked the Supreme Court to restore the $280 billion penalty.)
Nevertheless, Judge Kessler's 2006 decision was a monumental one: the tobacco companies "suppressed research, they destroyed documents, they manipulated the use of nicotine so as to increase and perpetuate addiction...and they abused the legal system in order to achieve their goal--to make money." Brandt felt vindicated but unhappy that the claims for remedies had been vastly scaled back by the Bush White House.
Brandt, Kyriakoudes and Proctor are proud of their work and let everyone know about it, while those on the other side never mention their work for Big Tobacco on their faculty websites or online CVs. Lacy Ford doesn't, and neither does Michael Schaller at the University of Arizona or Kenneth Ludmerer at Washington University. James Kirby Martin's CV at the University of Houston website says he has "consulted on various historical-related product liability and health issues" but doesn't say which products, or which side, he has worked for.
As the University of Florida events demonstrate, a lot of the actual research for the tobacco attorneys is done not by their historian experts but by grad student assistants. Birte Pfleger was one. She was working on her dissertation at the University of California, Irvine (where I teach), in 2002 when an e-mail was circulated from John Snetsinger, a professor at California Polytechnic, San Luis Obispo, seeking a research assistant and offering $25 an hour. At the time, Pfleger told me, that "sounded like an awful lot of money." She took the job.
The assignment was the standard one: find articles in the local newspapers about the dangers of smoking, starting in 1950. "We found ads that said smoking was glamorous and sexy and fun, but he said he didn't want those," she remembered. "He just wanted articles that said smoking was bad for your health." At that point, she recalled, "we started wondering who he was and what he was doing with this. We asked him, but he never really explained it."
I figured out what case Pfleger had been working on and told her about it: a lawsuit against Philip Morris brought by Betty Bullock of Newport Beach, who was dying of lung cancer and eventually won a big punitive damages award. "I would not have done the work if I had known what it was for," Pfleger then said. "I'm relieved that the jury rejected the tobacco industry's argument." (Snetsinger did not respond to interview requests.)
In Bullock's trial, for which Snetsinger had been deposed in 2002, the jury awarded her an awesome $28 billion. That set a record as the single largest judgment against Philip Morris. The company appealed, and the court reduced the $28 billion to $28 million. The company appealed that too, but a California appeals court concluded in 2006 that "Philip Morris's misconduct was extremely reprehensible" and that "the vast 'scale' and 'profitability'" of the misconduct justified an award of $28 million--to Betty Bullock's daughter Jodie, since Betty had died of smoking-related causes in 2003. The company appealed again on another issue and won a retrial in 2009, which ended recently with an award of $13.8 million. Philip Morris attorneys have said they will appeal that verdict as well.
Is it true that "everybody knew" in the 1950s and '60s that smoking could kill you? A consensus of medical opinion had formed by the mid-1950s that smoking caused lung cancer, as Allan Brandt shows in The Cigarette Century. But the tobacco industry denied that fact and did everything it could to create doubt about the health effects of smoking. It paid doctors and scientists to say there was "no proof" and suggested through advertising that smoking was glamorous and sexy, rebellious yet deeply American. In the late 1940s, for example, "More Doctors Smoke Camels" was a ubiquitous print ad. Despite the surgeon general's 1964 report that smoking causes cancer, the Marlboro Man indelibly linked smoking and masculinity-in-the-mountains for a generation of Americans.
A centerpiece of Big Tobacco's defense strategy is the argument that smoking is voluntary, and thus it's your own fault if you get cancer. That neglects the problem that nicotine is addictive, and poses another issue for historians--what did the tobacco companies know about addiction, and when did they know it? As Brandt's book documents, the companies knew that nicotine was described as addictive by many scholars in the 1940s. Nicotine creates a physical dependency; trying to quit leads to classic symptoms of withdrawal, including anxiety, depression and craving for the missing chemical. But the tobacco companies denied that smoking was addictive. When teens started smoking in the 1950s and '60s--the people now dying of lung cancer who are suing Big Tobacco--they didn't make an informed choice based on knowledge of nicotine addiction. And later, when they had trouble quitting, many followed the advice of the companies and switched to "lite," "low tar" or filter cigarettes--which are also hazardous.
Given the deception practiced by Big Tobacco, how are the historians who work for tobacco attorneys able to blame the smokers? As they admit under cross-examination by plaintiffs' attorneys, in their "research," they fail to examine the most important source of information on the history of smoking: the archives of the tobacco manufacturers and their public relations firms, which are readily available online at tobaccodocuments.org, as required by the 1998 settlement in the state attorneys general lawsuit. These materials document industry efforts to suppress information about cancer and smoking and, in Kyriakoudes's words, to "secretly sponsor disinformation."
In a major research paper published in the international peer-reviewed journal Tobacco Control, Kyriakoudes examined the testimony of eighteen experts in twenty-seven trials. He found that the tobacco companies' historians "present a history of the cigarette in which the tobacco industry all but ceases to exist." Research in archives is the hallmark of historical scholarship. The court testimony of Lacy Ford, James Kirby Martin and Michael Schaller, along with that of Nixon biographer Joan Hoff of Montana State, Southern historian Robert Jeff Norrell of the University of Tennessee, Knoxville, and the rest, Kyriakoudes concluded, "fails to meet basic professional standards of scholarship."
Of course, some historians have refused to work for Big Tobacco, on the grounds of those same scholarly standards. One is Richard Abrams of the University of California, Berkeley, an expert on government-business relations. He said that when tobacco attorneys from the firm Arnold & Porter approached him fifteen years ago, "I told them that tentatively I was sympathetic to their position for the post-1965 period, but I wasn't sure about before that--so I needed to get into their records to see what they were telling the public. They said, 'You can't see our archive, but we'll send you stuff.' I said, 'If you're going to put me on the stand as an expert witness, I can't say I had access only to what you chose to send me.' They still wouldn't let me see their archives, so I said forget it."
Why, over the past fifteen years, have forty historians wanted to help Big Tobacco? I asked a dozen historians on Kyriakoudes's and Proctor's lists. Virtually all declined to be interviewed, including Otis Graham, emeritus at the University of California, Santa Barbara; Elizabeth Cobbs Hoffman of San Diego State; and Terry Parssinen of the University of Tampa, who was Big Tobacco's expert in the recent Fort Lauderdale case where the jury awarded the smoker with cancer $300 million.
Michael Parrish of the University of California, San Diego, did agree to talk about it. He said he had worked on five cases, the last in 2003, and isn't doing it anymore. "For doing research, I charged $110 an hour," he told me. "If I was deposed, it was $250 an hour. If it went to trial, $400 an hour. I didn't do it out of love for the tobacco industry." But, he added, he hadn't done it just for the money: "I was a smoker for twenty years and quit. I felt there had to be a little more personal responsibility there, instead of [plaintiffs] putting all the blame on the tobacco companies."
But money seems to be the main inducement--at least that was the pitch when Michael Schaller invited me to work as an expert for the tobacco companies in 2005. He called it "a lucrative consulting opportunity." (I declined.)
Historians earn big money working for Big Tobacco: Stephen Ambrose, who taught at the University of New Orleans and was famous for writing bestsellers about D-Day, Lewis and Clark, and Eisenhower as a World War II general, was asked in a deposition why he was testifying for the companies. His answer was brief: "for compensation." Tobacco companies paid him $25,000 for just one case in 1994, according to Laura Maggi in The American Prospect. (Ambrose, a smoker, died of lung cancer in 2002, when he was 66.)
But don't plaintiffs' attorneys also have big money to hire their own historian experts? The jury award in California's Bullock case, for example, was $28 billion. Proctor told me he has made an average of about $40,000 a year over the twelve years he has worked as an expert witness. Kyriakoudes told me he made $75,000 last year. "I testified in seven trials, all in Florida," he said.
Forty historians have testified for Big Tobacco; only three have testified against--why the disparity? Two factors help explain it. First, the tobacco attorneys many years ago organized the recruitment of historians and coordinated the creation of a common body of research. Kyriakoudes wrote in his article for Tobacco Control that in 1984, "the industry's law firms formed the Special Trial Issues Committee," whose task, according to a memo to Brown and Williamson, was to develop witnesses who "will also explain" to juries that Americans' decisions to smoke cigarettes were "wholly unrelated" to industry "promotion or coercion." Plaintiffs' attorneys, in contrast, typically work as single practitioners and thus can't come close to matching the organization and coordination of the other side.
They also have nothing like the money Big Tobacco pays its law firms. The reasons were explained by Michael Piuze, the Los Angeles attorney who won the $28 billion verdict in the Bullock case. When it comes to the harm caused by smoking, he said, Big Tobacco is unique. "In most product liability litigation--auto manufacturing or pharmaceuticals--there may be one lawsuit for every 50,000 customers," Piuze said. "But tobacco companies kill or seriously injure one in two of their customers." (That is the standard scientific view, endorsed by the American Cancer Society and the World Health Organization.) Thus they can't possibly pay for the damage they have caused. "So the industry decided in the 1950s on a scorched-earth litigation policy. They would never give up. Never settle. If they ever lost a case, they would appeal. Forever. That's the way it still is. The message to the plaintiffs' bar is clear: don't screw with us, or you'll be sorry. We will break you financially."
"There are 38 million people who live in California, and there is one tobacco case pending in California," says Piuze. "In the entire history of the state there have been eight tobacco trials. That's one side of the ledger. On the other side, 37,000 people die of tobacco-related causes in California every year. That's 100 every day. Have they been successful with their litigation strategy? You better believe it."
Posted on: Friday, February 26, 2010 - 15:47
SOURCE: CNN (2-22-10)
When Sen. Evan Bayh announced that he would step down from the Senate, he said that Congress had become a dysfunctional institution. "I love helping our citizens make the most of their lives, but I do not love Congress," Bayh lamented.
Bayh is not the only politician or pundit to issue this warning in recent months. There has been an abundance of proclamations that Congress no longer works....
But we must not blame it all on the institution and downplay the human failures of leadership either. At this point, Democrats must start to question two aspects of their performance in 2009. The first has been the White House strategy of allowing Congress to dictate the timing and substance of legislation. The second has to do with Sen. Harry Reid and his inability to keep his caucus united and to move major bills despite leading a sizable majority....
The reality is that passing legislation through Congress has never been easy. The nostalgia for better times is a constant refrain. During much of the 19th century, Congress was legendary for looking more like a boxing ring than a site of distinguished debate.
During the progressive era, a series of powerful House speakers caused grief for presidents as they obstructed legislation. President Teddy Roosevelt complained about the tyrannical powers of Speaker Joe Cannon, known as the "Czar" of the House, who controlled committee assignments and manipulated procedures that gave him the power to block proposals for government expansion. One observer said: "There is room for saying Cannon is even more powerful than the president of the United States."
Between the 1930s and 1970s, Southern Democrats dominated Congress by relying on the power of committee chairmanships. Mississippi Sen. James Eastland, chairman of the Senate Subcommittee on Civil Rights, proudly boasted that he had special pockets "put into my pants" in which he carried "those bills around in my pockets everywhere I went and every one of them was defeated."...
The complaints about heightened party polarization and a supermajority Senate have been standard since the 1970s. The trends that resulted in the current process were not invented in 2008. Rather, they were the result of long-term changes, such as the movement of moderates (Southern Democrats and liberal Northeastern Republicans) out of their respective parties, as well as the impact of the 24-hour news cycle that came with cable television....
Congress also created a national security state during the early Cold War. Even since the 1970s there have been big breakthroughs, such as the Americans with Disabilities Act (1990) and the reorganization of homeland security under President George W. Bush....
Sen. Chuck Schumer announced his support for using the budget reconciliation process, which prohibits a filibuster, to pass specific sections of health care reform. While Republicans have been comfortable using all Senate procedures this year, including the threat of a filibuster and holds on presidential appointments, Democrats have been much more hesitant to use this equally legitimate procedure.
But this announcement is just a start. There is a need for much more introspection about why Reid has had so much trouble reaching the kind of intra-party deals between moderates and liberals that Speaker Nancy Pelosi has achieved in the House, and whether President Obama is willing to undertake a different approach to dealing with the legislature.
If Democrats ignore these tough questions, focusing only on the flaws in procedures, they will find themselves in bad shape going into 2010.
Posted on: Thursday, February 25, 2010 - 21:38
SOURCE: Foreign Policy (2-22-10)
Over the course of the disastrous 20th century, inhabitants of the liberal democratic world in ever-increasing numbers reached this conclusion: War doesn't pay and usually doesn't work. As recounted by historian James J. Sheehan in his excellent book, Where Have All the Soldiers Gone?, the countries possessing the greatest capability to employ force to further their political aims lost their enthusiasm for doing so. Over time, they turned away from war.
Of course, there were lingering exceptions. The United States and Israel have remained adamant in their determination to harness war and demonstrate its utility.
Europe, however, is another matter. By the dawn of this century, Europeans had long since lost their stomach for battle. The change was not simply political. It was profoundly cultural.
The cradle of Western civilization -- and incubator of ambitions that drenched the contemporary age in blood -- had become thoroughly debellicized. As a consequence, however willing they are to spend money updating military museums or maintaining war memorials, present-day Europeans have become altogether stingy when it comes to raising and equipping fighting armies.
This pacification of Europe is quite likely to prove irreversible. Yet even if reigniting an affinity for war among the people of, say, Germany and France were possible, why would any sane person even try? Why not allow Europeans to busy themselves with their never-ending European unification project? It keeps them out of mischief.
Washington, however, finds it difficult to accept this extraordinary gift -- purchased in part through the sacrifices of U.S. soldiers -- of a Europe that has laid down its arms. Instead, successive U.S. administrations have pushed, prodded, cajoled, and browbeaten European democracies to shoulder a heavier share of responsibility for maintaining world order and enforcing liberal norms.
In concrete terms, this attempt to reignite Europe's martial spirit has found expression in the attempted conversion of the North Atlantic Treaty Organization (NATO) from a defensive alliance into an instrument of power projection. Washington's aim is this: take a Cold War-inspired organization designed to keep the Germans down, the Russians out, and the Americans in, and transform it into a post-Cold War arrangement in which Europe will help underwrite American globalism without, of course, being permitted any notable say regarding U.S. policy.
The allies have not proven accommodating. True, NATO has gotten bigger -- there were 16 member states 20 years ago, 28 today -- but growth has come at the expense of cohesion. Once an organization that possessed considerable capability, NATO today resembles a club that just about anyone can join, including, most recently, such military powerhouses as Albania and Croatia.
A club with lax entrance requirements is unlikely to inspire respect even from its own members. NATO's agreed-upon target for defense spending, for example, is a paltry 2 percent of GDP. Last year, aside from the United States, exactly four member states met that goal.
The Supreme Allied Commander in Europe -- today, as always, a U.S. general -- still presides in splendor over NATO's military headquarters in Belgium. Yet SACEUR wields about as much clout as the president of a decent-sized university. He is not a commander. He is a supplicant. SACEUR's impressive title, a relic of World War II, is merely an honorific, akin to calling Elvis the King or Bruce the Boss.
Afghanistan provides the most important leading indicator of where Washington's attempt to nurture a muscle-flexing new NATO is heading; it is the decisive test of whether the alliance can handle large-scale, out-of-area missions. And after eight years, the results have been disappointing. Complaints about the courage and commitment of NATO soldiers have been few. Complaints about their limited numbers and the inadequacy of their kit have been legion. An immense complicating factor has been the tendency of national governments to impose restrictions on where and how their forces are permitted to operate. The result has been dysfunction...
Posted on: Thursday, February 25, 2010 - 21:37
SOURCE: The Jewish Chronicle (2-25-10)
Political Islam is partly the product of a cultural fusion between European and Islamist traditions of Jew-hatred. Nazism's Arabic-language propaganda aimed at the Middle East during the Second World War indicates that a crucial chapter in the history of that fusion took place in Berlin during the war.
It was then and there that the highest-ranking officials of the Nazi regime, including Hitler and officials in the Foreign Ministry and the SS, had a meeting of hearts and minds with pro-Nazi Arab and Muslim exiles such as Haj Amin el-Husseini (the Grand Mufti of Jerusalem) and Rashid Ali Kilani, the former head of a short-lived Iraqi regime. Throughout the Second World War, their collaboration led to thousands of hours of short-wave Arabic-language radio broadcasts to the Middle East....
In their Arabic-language broadcasts, the Nazis stated that Zionists had started the Second World War in order to establish a Jewish state and dominate the Arab world. In its propaganda aimed at Germans at home, the Nazi regime publicly asserted that it was "exterminating" the Jews of Europe. In the Arabic-language broadcasts to the Middle East, announcers called on listeners to take matters into their own hands and "kill the Jews" in the Middle East themselves. Rather than translations of speeches by Hitler or Goebbels, it was a fundamentalist reading of the Koran that was crucial for justifying Jew-hatred to Muslim listeners. Husseini and others asserted that Jews had been the enemies of Islam from its inception. They presented Zionist goals in Palestine as only the latest of the Jews' efforts to destroy Islam....
...For the Islamists, Israel's creation was proof that Nazism's conspiracy theory about vast Jewish power had been correct all along. After 1945, the ideological poison unleashed by the Nazi regime did not die. Rather, it mutated, changed its cultural context and merged with a distinctly different cultural tradition. The history of the seven decades of that mutation and evolution remains to be written. When it is, the chapter on wartime Berlin will be an important one.
Posted on: Thursday, February 25, 2010 - 21:25
SOURCE: OpEdNews (2-25-10)
It has been about three decades since self-identified "conservatives" pulled off the remarkable feat of turning liberal into a four-letter word. Now they're trying to do the same to progressive.
At the recent CPAC convention in Washington, television screaming head Glenn Beck wrote the word "Progressivism" on his traveling chalkboard and proclaimed, to howls of approval, "This is the disease. This is the disease in America... It is progressivism."...
Mitt Romney's contribution was "liberal neo-monarchists." Michele Bachmann was... Michele Bachmann.
It's time to fight back in an appropriate manner.
I have for several years been urging progressives to stop calling these people "conservatives," which they plainly are not, and instead to call them what people who long for the bad old days of the Robber Barons, William McKinley, and Calvin Coolidge really are: regressives....
So, from this day forward, whenever you find yourself about to say or write the word "conservative," replace it with the accurate term, "regressive."
Posted on: Thursday, February 25, 2010 - 21:10
SOURCE: Daily Beast (2-24-10)
Now that President Obama has confronted some harsh political realities over health-care reform and has seen his public-approval ratings come back down to earth, he can do many things to set his administration on a fresh course. He might begin, though, by heeding some lessons of history. Like too many unsuccessful presidents, he has surrounded himself in the Oval Office with a select coterie of campaign loyalists from Chicago politics and his former Senate office. As Politico editor John Harris reported Jan. 22, this inner circle consists of “romantics” who are enveloped by pettiness, grandiosity, and hero worship left over from the 2008 primaries and general election.
Unable to shift out of campaign mode, Harris writes, the president’s confidants are driven by “a basic attitude toward Clinton-style governance [that] is hostile,” even though one member of the Cabinet is named Clinton and an array of veterans of the Bill Clinton administration are on Obama’s staff—including White House Chief of Staff and former Chicago Rep. Rahm Emanuel, who, according to a recent Financial Times report, “treats Cabinet principals like minions.” By seeing their own world starkly, as the stage for a dramatic struggle of world-historic, “transformational” proportions, these would-be saints close to the president have embraced fantasies of transcendence that have yielded needless factionalism within the Democratic Party and inside Obama’s administration: Blue Dogs versus liberals, idealists versus pragmatists, as well as, evidently, dueling bands of White House insiders.
History, alas, is filled with examples of insular White House palace guards undermining a president’s political survival as they seek to shield him from influences other than themselves....
...When Lincoln became president, he freed himself from the Illinois politicos who had paved the way for his nomination and election and instead sought intelligence from a broad array of office holders and military men. He did not forget the men who had elected him—he named his campaign manager, David Davis, to the Supreme Court in 1862—but neither did Lincoln cloister himself within a White House inner circle. Nor was Lincoln afraid to dump appointees. In 1862, he forced the resignation of his politically influential but not very capable secretary of War, Simon Cameron, and replaced him with the unlikely Edwin Stanton, the attorney general in the previous Democratic administration whose contemptuous opinion of Lincoln was well-known. Nevertheless, Lincoln and Stanton forged a superbly effective partnership.
If the president will not shake up his inner circle, he at least ought to start expanding it and talking seriously with a host of people well outside his comfort zone, much as Lincoln and Reagan—as well as Franklin D. Roosevelt, who operated through a constantly changing cast of characters—did before him. Staff changes may not be enough to reverse the looming legislative mess over health-care reform, or even win back the support he has lost among traditional Democratic voters in time for the midterm elections. And timing in these matters is delicate. But only the president is indispensable. If a staff shakeup—an obvious measure employed by all successful presidents—does not prove sufficient, it is certainly necessary, and it will become inevitable sooner rather than later if the president is to achieve much of anything and to prevent the unmaking of both his own administration and his party.
Posted on: Thursday, February 25, 2010 - 14:43
SOURCE: CNN (2-25-10)
After the fall of South Vietnam in 1975, U.S. Col. Harry Summers remarked to his North Vietnamese counterpart, "You know you never defeated us on the battlefield." After a moment, the North Vietnamese officer replied: "That may be so, but it is also irrelevant."
Although that blunt exchange took place nearly 35 years ago, it's still worthy of close consideration in light of America's wars in Iraq and Afghanistan.
Americans did win their battles in Vietnam, but, as the outcome of the war made clear, raw battlefield prowess did not lead to victory. Why? Because the war there was not for Americans to win or lose. It was a Vietnamese war....
... [W]e must examine the old chestnut of the American understanding of the Vietnam War: the idea that we simply backed the wrong Vietnamese, that no matter what, the South was never viable. South Vietnam certainly had its share of problems -- endemic corruption, endless political infighting, and a sense of nationalism so weak it couldn't hold back a massive uprising. For all of its failings, however, South Vietnam fought long and hard against difficult odds.
In South Vietnam, during 20 years of war, more than 200,000 people were killed in battle, as many as 1 million civilians died and 1.5 million fled the fallen country as refugees. This is not the story of a nation that did not fight and was simply doomed to defeat. Indeed, South Vietnam's government and its military, though troubled, were arguably much more functional than those of the regimes in Iraq and Afghanistan today....
The Army of the Republic of Vietnam was created to fight alongside its American sponsors in crushing battles of annihilation -- battles that, when all else failed, relied on air- and artillery-delivered firepower to save the day. What is often forgotten is that the plan worked remarkably well, as long as American forces and American support were close at hand.
But the South Vietnamese military was never meant to fight on its own. Once American troops had gone back to "the world" and U.S. funding for the continuation of the war in Vietnam dried up, the doom of South Vietnam was clear....
Today, Americans are urging the Afghans to fight alongside a major U.S. advance in Helmand Province. With the aid of capable U.S. advisers, and as the recipients of overwhelming U.S. logistical and firepower support, the Afghans will, no doubt, achieve some notable success.
But one day, possibly soon, American forces will withdraw, and -- good intentions notwithstanding -- American funding for continued operations in Afghanistan will dry up. We must ask ourselves: Is the state and military we are helping to create in Afghanistan "Afghan" enough to survive very long after our withdrawal?
Posted on: Thursday, February 25, 2010 - 13:46
SOURCE: NYT (2-24-10)
Vice President Joe Biden complains that he is being driven crazy because so many people are betting on America’s demise. Reports of it are not just exaggerated; they are, he insists, ridiculous. Like President Obama, he will not accept “second place” for the United States. Despite the present crippling budget deficit and the crushing burden of projected debt, he denies that the country is destined to fulfill a “prophecy that we are going to be a great nation that has failed because we lost control of our economy and overextended.”
Mr. Biden was referring in particular to the influential book “The Rise and Fall of the Great Powers” by Paul Kennedy, a British historian who teaches at Yale. Published in 1987, the book argues that the ascendancy of states or empires results from the superiority of their material resources, and that the wealth on which that dominance rests is eroded by the huge military expenditures needed to sustain national or imperial power, leading inexorably to its decline and fall. The thesis seems a tad schematic, but Professor Kennedy maintains it with dazzling cogency. In any debate about the development of the United States, one would certainly tend to side with the detached historian rather than the partisan politician.
All too often, however, students of the past succumb to the temptation to foretell the future. For reasons best known to himself, for example, the eminent British historian A. J. P. Taylor predicted that the Second World War would reach its climax in the Spanish port of Vigo. Equally preposterous in its way was Francis Fukuyama’s claim that the conclusion of the cold war marked the end of ideological evolution, “the end of history.”
When indulging his own penchant for prophecy, Paul Kennedy too proved sadly fallible. In his book, he wrote that Japan would not stagnate and that Russia, clinging to Communism, would not boom economically by the early 21st century. Of course, Professor Kennedy did not base his forecasts on runes or entrails or stars. He weighed the available evidence and extrapolated from existing trends. He studied form, entered suitable caveats and hedged his bets. In short, he relied on sophisticated guesswork. However, the past is a map, not a compass. It charts human experience, stops at the present and gives no clear sense of direction. History does not repeat itself nor, as Arnold Toynbee would have it, does it proceed in rhythms or cycles. Events buck trends. Everything, as Gibbon said, is subject to “the vicissitudes of fortune.”
Still, history is our only guide. It is natural to seek instruction from it about the trajectory of earlier great powers, especially at a time when the weary American Titan seems to be staggering under “the too vast orb of its fate.” This phrase (loosely taken from Matthew Arnold) was used by the British politician Joseph Chamberlain to depict the plight of his nation in 1902. The country had indeed suffered a severe setback during its South African war and its global supremacy was under threat from mighty rivals in the United States and Germany. Yet the British Empire was at its apogee.
Paradoxically, the larger great powers grow, the more they worry about their vulnerability. Rudyard Kipling wrote this elegy to the empire, of which he was unofficial poet laureate, to mark its most spectacular pageant, Queen Victoria’s Diamond Jubilee in 1897.
Far-called, our navies melt away;
On dune and headland sinks the fire:
Lo, all our pomp of yesterday
Is one with Nineveh and Tyre!
Aptly quoting these lines exactly a century later, when Britain gave up its last major colony, Hong Kong, this newspaper’s editorial page noted that the queen’s empire had been relegated to the history books; the United States had become the heir to Rome.
Now doom-mongers conjure with Roman and British analogies in order to trace the decay of American hegemony.
In so doing they ignore Gibbon’s warning about the danger of comparing epochs remote from one another. It is obviously possible to find striking similarities between the predicament of Rome and that of Washington (itself modeled on classical lines, incidentally, because it aspired to be the capital of a mighty empire). Overstretch is common to both, for example: Rome defended frontiers on the Tigris, the Danube and the Rhine; America’s informal empire, controlled diplomatically, commercially and militarily, girdles the globe.
But the differences are palpable...
Posted on: Thursday, February 25, 2010 - 09:50
SOURCE: The American Interest (2-21-10)
At the tea parties here in glamorous Queens we make sure we serve genuine Devonshire clotted cream with the scones and we keep our pinkies carefully extended while lifting the delicate porcelain cups to our lips, but a very different kind of Tea Party has my friends in the upscale media and policy worlds gravely concerned. To hear them talk, all the know-nothings, wackadoo birther wingnuts, IRS plane bombers, Christian fundamentalists out to turn the US into a theocracy, the flat earthers and the racists have somehow joined together into a force that is as politically formidable as it is morally and intellectually contemptible. These Tea Partiers, I am frequently told, are ‘reactionaries’. They long for an older, safer and whiter America — a more orderly place where their old fashioned values were unchallenged, one in which ethnic minorities weren’t in their faces, gays weren’t demanding acceptance, and in general life looked more like “Ozzie and Harriet” and less like “South Park.”
I’m sure that description fits some of the people at some of the Tea Parties, but I think it misses the point. Yes, the Tea Partiers represent something very old in American life and in some ways they want a return to traditional American values, but the traditional American value that inspires them the most is the value of revolutionary change. The Tea Party movement is the latest upsurge of an American populism that has sometimes sided with the left and sometimes with the right, but which over and over again has upended American elites, restructured our society and forced through the deep political, cultural and institutional changes that from time to time the country needs and which the ruling elites cannot or will not deliver.
That doesn’t mean that everything populists want works out. Andrew Jackson’s war against the Second Bank of the United States caused a depression in the short term and then left the country with a lousy, crash-prone financial system for the next eighty years. His immensely popular Indian Removal Act that sent the eastern Indian tribes to Oklahoma was no triumph of justice and compassion. And while a later generation of populists gave women the vote, it also brought in Prohibition....
Today in the United States many of our core institutions are fundamentally out of sync with reality: they cost more than we can pay but they don’t do what we need. We have colleges our people cannot afford — and that often leave graduates without a basic grounding in either the history of our civilization or the practicalities of contemporary life. We have a health system that we cannot pay for and which fails to cover enough people. We have a public school system which has been failing too many of our children for far too long, costs unconscionably large amounts of money considering its poor performance — and vested interests block necessary reforms. Our federal, state and local governments are locked into an employment system and mode of organization that we cannot pay for — and that does not do the job. Our retirement system is a time bomb and all our political class can do is watch the fuse burn. We cannot regulate our financial industry effectively — and we cannot live without a financial system that remains innovative and dynamic. We are fighting a global conflict whose name we dare not speak against an enemy we do not know how to defeat; and in a world that is more volatile and fluid than it has been since World War Two, we are very far from any kind of national consensus (or even thoughtful conversation) about what our priorities and strategies should be....
My guess would be that the Tea Party movement is part of a very big wave. The link between a business-driven agenda of modernization and reform and a populist agenda for empowerment, deregulation and attacks on privileged professions (which are also costly economic bottlenecks) is what, historically, has driven many of the populist movements that change the face of the country. That was true in the Jacksonian era, and again during the progressive era and the New Deal, when the desires of a left-of-center populism meshed with corporate needs for a stronger national framework of policy and regulation. It was true when the Republican Party pushed through the wave of changes and restructurings in the 1860s that ushered in the rise of the national industrial economy. It is equally true of the right-of-center populism that now seems to be taking shape, and potentially this movement could have the kind of impact on the country that the original Jacksonians did.
What this means for conventional politics is harder to predict. American populism is notoriously turbulent and unstable. As populist energy shifted from the pro-slavery Democrats in the twenty years before the Civil War, different movements like the Know-Nothings and the Free Soilers rose and fell until the new Republican Party harnessed northern and midwestern populist sentiment together with the nationalist vision of the rising industrial and railroad interests. (Abraham Lincoln wasn’t just the leader of a populist political revolt; he was a railroad corporate lawyer who combined populist politics with Henry Clay style nationalist economic ideas.) Crackpots and wackadoos often surface and achieve some temporary notoriety before the sorting out process of political exposure and debate winnows out the leaders from the loudmouths. At the moment the Tea Party seems to be more at the fermentation stage; the movement is still finding its feet and in terms of both program and personnel the new populists are still getting their act together....
Posted on: Wednesday, February 24, 2010 - 17:41
SOURCE: Foreign Policy (3-1-10)
Ever since the U.S. Strategic Bombing Survey cast doubt on the efficacy of aerial bombardment in World War II, and particularly after its failure to bring victory in the Vietnam War, air power has acquired a bad reputation. Nowadays, killing enemies from the skies is widely considered useless, while its polar opposite, counterinsurgency by nation-building, is the U.S. government's official policy. But it's not yet time to junk our planes. Air power still has a lot to offer, even in a world of scattered insurgencies.
Military aviation started off splendidly in 1911, when the Italians pioneered aerial bombing in Libya. But since then it has often been a great disappointment because the two overlooked conditions of success in 1911 have been absent: the barrenness of the Libyan desert, which allowed aviators to see their targets very clearly, and the total lack of an enemy air force or anti-aircraft weapons that could interfere with their attacks.
Through all the wars since, the 1911 rules have held. Aerial bombing works very well, but only if the enemy must move in open, arid terrain and has no air force or effective anti-aircraft weapons. These conditions emphatically did not apply to World War II until the very end. And Vietnam was full of trees, as well as brave men: hence the failure of tactical bombing in the south, while the strategic bombing of the north was strongly resisted and there were too few good targets anyway....
What about Afghanistan? Do the 1911 rules work there? The expert consensus again seems to be no. And yet the Taliban, for all their martial virtues, are still a few centuries removed from having an air force capable of engaging U.S. fighter-bombers -- which fly too high for hand-held anti-aircraft weapons -- and even in that most mountainous of countries, Taliban fighters must cross open, arid terrain to move from one valley to the next.
Most unfortunately, having so often greatly overestimated air power in the past, the United States is now disregarding its strategic potential, using it only tactically to hunt down individuals with remotely operated drones and to support ground operations, mostly with helicopters, which are the only aircraft the Taliban can shoot down. The commanding general, Stanley McChrystal, understandably concerned about the political blowback from errant bombings widely condemned both inside and outside Afghanistan, has put out the word that air power should be used solely as a last resort. He intends to defeat the Taliban by protecting Afghan civilians, providing essential services, stimulating economic development, and ensuring good government, as the now-sacrosanct Field Manual 3-24 prescribes. Given the characteristics of Afghanistan and its rulers, this worthy endeavor might require a century or two. In the meantime, the FM 3-24 way of war is far from cheap: President Barack Obama is now just about doubling the number of U.S. troops by sending another 30,000, at an average cost of $1 million per soldier per year, to defeat perhaps 25,000 full-time Taliban....
Posted on: Wednesday, February 24, 2010 - 16:58
SOURCE: InsiderIowa.com (2-23-10)
After thirteen months on the job, President Obama’s renewed call to “move forward in a bipartisan fashion” faces an increasingly skeptical and entrenched opposition. As he prepares to convene a highly publicized peace conference concerning the health care impasse, Republican leaders openly suspect the meeting is merely a diversion to rally public opinion on his behalf.
Perhaps the President is summoning his inner James K. Polk. The eleventh president, another determined chief executive with an extraordinarily ambitious agenda, earned an enduring reputation as a crafty political operator and, perhaps more notably, left his mark on the presidency through his remarkable capacity for accomplishment.
President Polk spent his first two years in office assiduously attempting to fulfill his expansionist campaign promises of 1844. After splitting the difference with Great Britain on Oregon, Polk turned to the Southwest. Intent on separating Mexico from two coveted territories, New Mexico and California, the President feigned conciliation while actually maneuvering toward armed conflict. Polk dispatched an envoy, John Slidell, to negotiate a settlement with Mexican diplomats, but cleverly structured the undertaking in a way that virtually precluded any chance of success.
Following the anticipated collapse of the Slidell mission, the President prepared a message for Congress asking for a declaration of war against Mexico for refusing to negotiate. Before he could deliver the formal request, however, the President received welcome news. In addition to the diplomatic slight, a Mexican cavalry detachment had engaged an American unit of dragoons along the Rio Grande—placed there, of course, by Polk to provoke just such a reaction.
Polk quickly rewrote his message, reporting to Congress that “war exists, and, notwithstanding all our efforts to avoid it, exists by the act of Mexico herself.” We had pursued every effort to arrive at a peaceful settlement, Polk lamented to his national audience, but “our cup of forbearance had been exhausted.”...
Back to the Future. Even as the parties prepare, ostensibly, to negotiate healthcare in good faith at the Blair House, Republicans suspect President Obama of orchestrating an alternative plan to achieve his goal by more violent means. Republicans breathlessly warn the President and his party that a healthcare victory gained through the unusual legislative remedy of “reconciliation” will cost them dearly....
If President Obama chooses to walk the path of James K. Polk, how might things turn out?...
Posted on: Wednesday, February 24, 2010 - 13:25
SOURCE: National Review (2-24-10)
The first year of the Obama administration has been a vertiginous pile of confusions and contradictions. In hunting for a theme in its decision making, we might start with Obama’s relation to his predecessor.
THE WORLD WAR II ANALOGY
George Bush, a purported conservative, ran up deficits reaching in aggregate $2.5 trillion; therefore I, Barack Obama, a liberal, can legitimately exceed that figure by a factor of three or four. That seems to be the thinking of the present administration. And its common defense of the massive new deficit is the historical analogy that it will snap us out of the recession in the same way that deficit spending during World War II lifted us out of the Great Depression.
Even many supporters of the new stimuli confess that the Depression was not cured by the New Deal, but rather by the strong demand for goods and services brought on by the war that followed. So the new mega-Keynesians describe their current remedies in terms not of 1933–39, but of 1941–45.
But even if one were to accept the questionable assumption that our current recession is anything like the downturn of the 1930s (10 percent unemployment versus 25 percent), we forget that what allowed us to manage the high levels of incurred debt was the rebound after 1945, when U.S. manufacturing, natural resources, and expertise met much of the industrialized world’s postwar demand until the wrecked economies of Europe, Russia, and Japan rebounded. Yet in the current weak recovery, we certainly will not be paying back our borrowed trillions by exporting to a needy world already well supplied by Europe, Japan, Korea, and China.
Bottom line: We have no easy means to create the wealth necessary to pay back the unprecedented trillions we now owe — and we have no accurate historical parallel to guide us through these upcoming years of unsustainable levels of indebtedness, other than perhaps a Greece or Argentina writ large....
Posted on: Wednesday, February 24, 2010 - 11:00
SOURCE: China Daily (2-21-10)
Jeffrey N. Wasserstrom is a Professor of History at the University of California, Irvine, and the author, most recently, of “China in the 21st Century: What Everyone Needs to Know” (Oxford University Press, April 2010).
[Note: an earlier version of this essay appeared on the “History News Network” website.]
There was a lot of debate in the American press late last year over whether Barack Obama’s first trip to China was a success, but there was a consensus on one thing: it was best understood in light of the visits to Beijing that other American presidents have made since the seventies.
We agree. But we do not just mean the 1970s, which is as far back as other commentators tended to go. We think that a look back to the 1870s has at least as much to tell us....
The significance of the 1870s lies in the fact that Ulysses S. Grant was on a world tour that included a stop in China as that decade came to a close. And although by the time Grant did his globetrotting he was a former occupant of the White House, he did many of the same things in China that Nixon, Obama, and all the recent Commanders-in-Chief in between have done while in office.
Grant met with some of China's leaders (including Li Hongzhang) to discuss U.S.-China relations. He expressed the hope for a future in which closer ties would develop between our “young” country and the “ancient” one across the Pacific. And he even went to see the Great Wall and reflected on its meaning. In his case, inspired by a comment about the landmark made by William H. Seward (a member of Abraham Lincoln's storied "team of rivals" who made it to China earlier in the 1870s), Grant mused on the fact that as much labor was probably required to construct it as had been expended on building all of America's railroads, which at the time were considered marvels of state-of-the-art engineering....
One such striking parallel relates to large-scale global events. The most important of these in Grant’s day were World’s Fairs, and just three years before his trip to China, he had presided over the first of these ever held outside of Europe. That spectacle, the 1876 Philadelphia Centennial Exhibition, signaled America’s entry into the charmed circle of thoroughly modern lands. It was a wake-up call to established powers such as France and Great Britain that the ground was shifting under their feet--though the impressive showing American industry had made at the Crystal Palace Exhibition of 1851, the very first World's Fair, had made some European leaders take notice even earlier.
The Olympics are now as important as World’s Fairs once were. And when Obama went to China, it was his Chinese counterpart, Hu Jintao, who was fresh from presiding over a grand spectacle, the 2008 Beijing Games, which was the first event of its kind held on Chinese soil. Like the Philadelphia World's Fair, it was viewed by many as an impressively produced gala. And like that International Exhibition, it was interpreted as symbolizing that a familiar geopolitical landscape had been altered, and would soon perhaps be transformed further....
These kinds of switches should not surprise us, for, the great contrasts relating to political systems aside, China is now much like the U.S. was then. This is a point that others have made before us, including American historian Stephen Mihm, whose essay "A Nation of Outlaws" (Boston Globe, August 26, 2007) traces parallels between the way an era of "exuberant capitalism" played out on our side of the Pacific over a century ago and the way it is now unfolding in China.
Posted on: Tuesday, February 23, 2010 - 19:17
SOURCE: China Beat (Blog) (2-23-10)
In January, we marked the end of our second year online. China Beat has changed a lot during that time, and will be changing more in the coming weeks and months as China Beat’s new editor, Maura Cunningham, takes the helm. It’s been my pleasure to have been founding editor of China Beat, and as I transition to a new role at the blog (I will now join the ranks of the blog’s consulting editors), I wanted to look back at how China Beat has developed since January 2008—for new readers and for readers who have been with us since the beginning.
How did China Beat get started?
China Beat grew out of conversations between Ken Pomeranz and Jeff Wasserstrom, both professors of Chinese history at the University of California, Irvine (UCI). In fall 2007, they began to talk about the likely focus on China during its Olympic year, and felt there was a need to bring more scholars of China into those media discussions. Inspired by other historian bloggers, for instance the big crew at History News Network and Juan Cole’s Informed Comment, they thought that a blog might be one way to start getting those voices into the mix. At the time, I was a Ph.D. candidate at UCI, and when Jeff and Ken began to draw up a list of potential regular contributors to the blog (mostly academics in various disciplines, but also a couple of writers from outside of the academy) I was one of the people they approached. As the only local who had previously dabbled in blogging, I volunteered to get the venture up and running. At that point, we all envisioned that the blog would be self-perpetuating—that the “editor” would be doing little more than ensuring that the blog stayed online and that the group of 20 or so contributors would regularly generate and post their own content.
As it turned out, however, we quickly began soliciting new content from contributors outside that initial group to respond to current events or to address specific topics. Very quickly, we were operating much more like a standard magazine than a group blog—soliciting pieces, editing submissions, and heavily moderating the content that appeared online. Jeff and Ken, as the blog’s founders and “Consulting Editors,” not only contribute posts but also play an important role in recruiting contributors and in brainstorming with me (and, more recently, Maura) about new directions the blog could or should move in. I kept the blog running day-to-day; I also wrote or pulled together the many byline-less posts from the author “China Beat” (those posts have, in recent weeks, largely shifted to Maura’s responsibility).
What was China Beat’s primary goal and how has it changed over time?
China Beat’s primary goal was to counter the steady refrain of reports on air pollution, Chinese nationalism, and other Western media tropes that were the stock in trade for many daily journalists, and even more so for newscasters, in the run-up to the Olympics. That isn’t to say that those weren’t important stories—just that we felt that there were more complicated, interesting ways to tell those stories (as well as the many other stories that were overlooked). And we knew that we could draw on a network of scholars of China who were rarely, if ever, tapped by journalists and Western media as “China experts,” despite the fact that these people had valuable, critical knowledge to contribute to the discussion. We also thought we would find willing collaborators among some non-academics (such as China-based freelance writers and journalists) with an interest in offering beyond-the-headlines views of the PRC.
In addition, we worried that the media was relying too heavily on a small group of “China experts” (some of whom were wonderfully perceptive but others of whom had rather limited knowledge of China), and these voices were dominating the discussion, sometimes to the detriment of Western understandings of China. We wanted to feature opinions and perspectives on contemporary China that were grounded in cutting-edge scholarship, that were historically contextualized, and that were informed by what was actually happening on the ground in China. Most importantly, we hoped to convey the diversity and heterogeneity of China (intellectually, socially, culturally, demographically, etc.) rather than trying to boil China down to simple sound bites (which often ended up being “China’s scary,” “China’s impervious to change,” or “China’s becoming just like the U.S.”).
We can claim some successes in this regard—China Beat contributors like Susan Brownell and Caroline Reeves were contacted to comment on contemporary events as a result of posts they wrote for us, and writings that first appeared at China Beat have been reprinted (and sometimes translated), reaching broader audiences at publications from Shanghaiist and Japan Focus (where some of our posts have run in expanded formats, after skillful editing by Mark Selden there) to Huffington Post and Poland’s Gazeta Wyborcza. But there is still a lot of work to do to shift how China is framed in popular discussions in the US and elsewhere, and it will take more than just the work of China Beat to accomplish it.
That’s one important reason we think of ourselves as part of a broader network of China-interested writers, and why we celebrate when one of China Beat’s contributors has work appear elsewhere. That is really our second mission—to draw attention to quality writing on China. That is a pretty standard goal for a blog, and one that we share with our colleagues at China Digital Times, Danwei, and others. To that end, we are always looking to bring new voices into the discussion. Our special approach is that we focus on voices that are coming from the academy, but we’ve always had contributors who are outside it as well, from Leslie T. Chang (part of the original group Ken and Jeff lined up for the blog) to Xujun Eberlein (a more recent addition), whose works are, like those of our academic contributors, grounded in research and careful analysis. Some of our early contributors even had a foot in both academia and journalism—like Susan Jakes, a former reporter for Time and current graduate student of Chinese history at Yale, and Howard French, the former Shanghai bureau chief for The New York Times and current professor at Columbia School of Journalism.
What is the institutional structure for China Beat and how does that shape its content?
China Beat was initially built around a group of contributors at the University of California, Irvine (not only Jeff, Ken, and me, but also Yong Chen, Guo Qitao, Nicole Barnes, Pierre Fuller, Jennifer Liu, Shi Xia, Miri Kim, Chris Heselton, and others), but quickly grew beyond that. Even so, the blog’s focus on thinking of China in the world reflects some of the particularities of how China is studied at UCI—which has a vibrant and friendly cross-disciplinary community of China scholars (like Dorie Solinger in Political Science, Wang Feng and Su Yang in Sociology, Hu Ying and Bert Scruggs in East Asian Languages and Literatures, and many others across campus) as well as being the center of an innovative approach to the study and teaching of world history that was pioneered by two China scholars—Ken Pomeranz and R. Bin Wong (now at UCLA). In addition, UCI hosts a lively community of writers centered around the UCI MFA in Writing program and the International Center for Writing and Translation (an organization that funded a memorable weeklong visit to campus by novelist and essayist Pankaj Mishra, a longtime friend of the blog).
Despite the strong influence of UCI on China Beat, the blog has no official affiliation with the university—we receive no regular financial support from it (though some campus entities have helped us put on local events, like the recent campus reading by Peter Hessler, an original member of the blogging team) and none of us are paid for our work at China Beat. This limited support means that, over time, we have scaled back the interactive features of the blog; last year, for example, we eliminated reader comments (we were spending increasing amounts of time moderating reader feedback and spam). We do invite readers to submit more traditional “letters to the editor” (they can be sent to our email address, email@example.com), and we have run several of these submissions as stand-alone commentaries at the blog in recent months.
As the blog has grown over the past two years, we have incorporated contributors from many other institutions—not just in the United States but around the world—and each brings a unique perspective and frame of analysis to China Beat. As a result, though our UCI roots were important in shaping the blog in its early days, we now see ourselves as reflecting a broader conversation among China specialists and writers who seek to reach a wider audience.
What is China Beat’s future?
As I mentioned above, we are going through some personnel changes. I have recently accepted a position in History and Asian Studies at Penn State, and will begin my position there in fall 2010. In anticipation of my changing status, we brought Maura Cunningham on as associate editor in fall 2009; she will now transition into the post of editor of the blog and I will move into the role of consulting editor, continuing to contribute posts and be involved, as Ken and Jeff have been, in coordinating and recruiting content.
At a less functional level, China Beat’s future is hard to predict. Practically, we intend to keep on as we have—featuring quality writing about China and drawing attention to good stuff that appears elsewhere—but we also recognize that the blog is a platform that continues to evolve. Blogs are now recognized as an important component of the media landscape (and now look more like magazines than the navel-gazing personal sites that were the granddaddies of the form), but the technology is not standing still. We want to continue to reach new audiences where and how they read. To do that, and yet retain the deep and critical analysis we think is a vital part of the intellectual project, is an exciting challenge. We love our print books enormously (we are historians, after all, and we were delighted to see a print book, China in 2008: A Year of Great Significance, bring the sensibility of and some material from China Beat into bookstores and Amazon.com), but we aren’t afraid of the changes that are inevitably coming to how we teach, learn, discuss, and, ultimately, think. China Beat is just one way that many of us are experimenting with how to reshape (or perhaps revitalize) the role of the academic as public intellectual.
Posted on: Tuesday, February 23, 2010 - 19:01
SOURCE: The End is Coming (Blog) (2-23-10)
An anonymous e-mail to the authorities recently revealed that the sale and use of heroin at Oxford’s Christ Church College are “rampant”. Analysts suggest that this is no surprise, and that it is even part of a “decades-long drug culture” that has run wild at Oxford and other high-level institutions of learning since the 1960s. One would think that, with its reputation as one of the most prestigious higher-education institutions on earth, Oxford University would be the very last place where the large-scale use and sale of heroin would become a problem. A little further study suggests otherwise.
Oxford comprises some 36 colleges, but none is more renowned than Christ Church. Founded during the reign of Henry VIII in 1546, it admits few students yet has produced over a dozen leaders of the UK and countless important politicians, economists and businessmen; no other Oxford college comes close. On another note, Christ Church also provided settings for Lewis Carroll’s Alice in Wonderland and for the film adaptations of J. K. Rowling’s Harry Potter series. It began its life as a cathedral of Henry VIII’s new and defiant Anglican Church and has apparently not lost its attraction to controversy in the twentieth century and beyond.
As for the illicit substance in question, heroin (or its constituent parts, anyway) has been around far longer than you might think. From the opium poppy, grown today in Afghanistan and in South-East Asia’s “Golden Triangle”, the sap has been harvested and smoked since at least the eighteenth century, when the drug became widely known in the West. In the early 1800s, chemists distilled opium’s narcotic properties into an even more powerful sedative and anesthetic, morphine. A miracle cure, it was used to treat headaches, ulcers, alcoholism and…opium addiction. Later on, a German company called Bayer (still the maker of Aspirin) refined the drug further into diacetylmorphine. That was not a very sexy brand name, so the company tested the substance on a few subjects, who reported feeling heroic (before, presumably, sinking into a drug-induced stupor); it was accordingly marketed as “Heroin”, a name it still carries even though Bayer, incidentally, no longer sells the product. In the twentieth century heroin became illegal; its production has since been refined, and prices are much lower than they once were. That being said, it remains one of the most expensive street drugs.
Making its way from pharmacies to the underground scene, to street dealers and finally into schools internationally, heroin became a staple drug in the post-Woodstock 1970s and ’80s. In America’s inner-city schools it remained relatively little used compared with marijuana, LSD and cocaine, the affordable and available drugs of choice; it proved far more popular at Ivy League universities and upper-class institutions such as Christ Church, in an era when drugs opened the “doors of perception” and offered a way into hippie culture and the search for personal enlightenment. Thus, with its heavier price tag, black tar heroin came with a certain level of sophistication and became a rite of passage for some at Oxford, much as a beer might at a Canadian college. The supposedly harmless use of heroin in the ’70s gradually became full-blown, devastating addiction in the ’80s, ’90s and ’00s.
This relatively underground use at Christ Church exploded into public view in 1986, when student Olivia Channon was found dead of a heroin overdose. Not only was she a promising young student seemingly on the fast track to a successful career; she was the daughter of Paul Channon, a Cabinet minister in Margaret Thatcher’s Conservative government. The initial uproar asked how such a promising and seemingly responsible young adult studying at Oxford could use such a drug, but investigators soon found that at Christ Church “there are quite a few people on drugs”.
Incidentally, Ms. Channon had been found in the bed of another Christ Church student and, indeed, another heroin user: Count Gottfried von Bismarck, great-great-grandson of Germany’s founding “Iron Chancellor”, Otto von Bismarck. He was expelled in 1986 for hosting the alcohol- and heroin-fuelled party, but stayed around and became known for the extravagant and hedonistic functions he would hold. Another of aristocratic Europe’s best and brightest, he finally passed away in 2007 at the age of 44, his body riddled with hepatitis B, hepatitis C and HIV; the hourly injections of cocaine he had taken that morning led to what looks like a voluntary overdose.
These are but two of the most tragic stories suggesting that something is amiss at Christ Church, at Oxford and at prestigious higher-education institutions across the Western world. Whatever the cause, be it the stress that comes with media attention, family expectations and the responsibility of studying at such an institution, these students from rich and dynastic families seem no different from students at other post-secondary schools where drug traffic is concerned. If anything, they seem worse.
From the sexual revolution to the doors of perception to recreation and addiction, drugs are today as much a part of studying at Oxford and other such universities as the outrageous tuition fees and prestigious diplomas they dispense. The next time someone says they graduated from Harvard, Cambridge, or Yale, congratulate them on surviving without a $1000-a-week drug habit.
Posted on: Tuesday, February 23, 2010 - 17:39
SOURCE: CuttingEdgeNews (2-22-10)
In recent days, Israeli Ambassador to the United States Michael Oren was heckled relentlessly and interrupted vociferously by members of the University of California at Irvine’s Muslim Student Union. Such negation of civility, discourse and decorum, which was noisily and gleefully celebrated by still other members of this group, is often defended by solemn-sounding references to United Nations resolutions.
This case was no exception. In a subsequent statement, the Muslim Student Union said it opposed having university departments sponsor a speaker representing a country that “is condemned by more UN Human Rights Council resolutions than all other countries in the world combined”—which is, in fact, the case. Those who use this type of argument rely on the halo effect of the United Nations, which is held, implicitly or explicitly, to embody “international opinion,” a term that can be invoked with reverential awe to dignify a bad, dishonest argument.
So let's tell the truth: the U.N. is not a democratic body. It represents governments, not societies, and it consists mainly of unrepresentative governments. The U.N. Human Rights Council cited by the university's Muslim Student Union is a case in point: Non-democratic African and Asian regimes exercise an unbreakable controlling majority of 26 of its 47 seats.
It is these dictatorships that set the council's agenda and determine its vote—and thus decide what constitutes “international opinion” as cited by the Muslim Student Union....
“International opinion,” in short, is whatever a consensus of tyrannies says it is.
It follows that whatever a majority of U.N. member states declare can, at best, only incidentally reflect what their societies think, if it does at all. And what most people think about other countries or foreign policy in any case may bear little relation to the facts....
This state of affairs obliges us to be guided by this golden rule: Disbelieve anyone who appeals to “international opinion” or its imagined embodiment in the consensus of this or that United Nations organ to burnish his argument. As for why democratic governments and societies continue the damaging practice of investing moral authority in “international opinion,” that is a subject for serious study—and correction.
Posted on: Tuesday, February 23, 2010 - 14:01
SOURCE: Philadelphia Inquirer (2-23-10)
Let's suppose a student walks through the halls of her high school carrying a big banner denouncing one of her teachers. I'd be OK with the school's confiscating the banner, and I bet you would be, too.
So should we let her post similar remarks on the Internet?
Last week, a Florida judge said yes. And, unlike most of my fellow liberals, I think he was wrong.
If we really care about protecting free speech, we need to teach our kids some basic principles of civility. And sometimes that means we have to restrict their speech, even on the Web....
The Florida case began in 2007, when a high school principal suspended senior Katherine Evans for creating a Facebook page that vilified her English teacher. "Ms Sarah Phelps is the worst teacher I've ever met!" Evans wrote. "To those select students who have had the displeasure of having Ms Sarah Phelps, or simply knowing her and her insane antics: Here is the place to express your feelings of hatred."...
Last week, U.S. Magistrate Judge Barry Garber ruled that the case could go forward. "Evans' speech falls under the wide umbrella of protected speech," Garber wrote. "It was an opinion of a student about a teacher, that was published off-campus ... and was not lewd, vulgar, threatening, or advocating illegal or dangerous behavior."
But it was rude, boorish, and ill-mannered. It showed just how little Evans has learned about civil discourse, which requires a set of shared values: reason, tolerance, and decency. And when we forsake these ground rules, we lose our ability to communicate - literally, to "make common" - with each other....
All of the recent decisions rest on the Supreme Court's landmark 1969 ruling Tinker v. Des Moines, which allowed students to wear black armbands to school to protest the Vietnam War. As the court famously pronounced, students do not "shed their constitutional rights to freedom of speech or expression at the schoolhouse gate." To suppress student speech, the court added, officials must show that it would "disrupt the work and discipline of the school."...
So why not let the kids have their fun? The answer lies elsewhere in the Tinker decision, which insisted that public schools must promote "a robust exchange of ideas." That means they also must show kids how to engage in such dialogue: with clarity, patience, and respect for one's adversary.
Otherwise, our much-vaunted "exchange of ideas" devolves into a crude shouting match. That's why we should punish students' Internet attacks on school employees, which echo the worst aspects of our debased popular culture.
Isn't it time we taught children a better way to talk? Their very freedom depends on it.
Posted on: Tuesday, February 23, 2010 - 11:48
SOURCE: National Review (2-22-10)
The clash between China and Google is the first shot in what could be a long war. Why should this be the case when, as Fareed Zakaria notes, the U.S. and China “have powerful reasons to cooperate with one another”?
Those reasons, however powerful, might not be enough. The trouble is that what Americans think of as basic liberty, something that’s as necessary to the good life as the air we breathe, China regards as imperialism....
In America, we take “civil society” — a sphere of activity where free and equal citizens may do and say what they please — for granted. Since that is our experience, we tend to think that it’s simply how the world is supposed to be. Yet individual liberty, personal privacy, and the rights to assemble, to lobby, and to exchange thoughts and beliefs are particular ideas, even if they are universal in scope. To provide just one example: In China, there are five officially recognized religions. Members of other religious groups are not free to practice openly....
China understands America very differently than we do. To them, the U.S. private sector and the U.S. government comprise one all-encompassing nation. Hence the “campaign for the uncensored free flow of information” is a “U.S. campaign,” and they see Google as a tool of the U.S. government. To them, the U.S., through Google, is engaged in “Internet warfare.” To allow information to flow freely, without monitoring by the government, is to knuckle under to Western imperialism. In short, China does not acknowledge the distinction between state and society that is fundamental in America....
We face, in short, a clash of regimes. Values and institutions that are fundamental in and essential to the United States, and that make the U.S. what it is, are incompatible with values and institutions that are fundamental in and essential to China, and that make it what it is....
As China becomes an ever more important player in world affairs, the clash between the liberal regimes of the West and China will continue. The conflict will not end until either the U.S. or China changes fundamentally. Ultimately, the very existence of the Chinese regime, as it is currently constituted and as it understands itself, is irreconcilable with the idea of a truly private sector, one with freedoms of expression and religion. As the regime sees it, China cannot accept the free flow of ideas, information, and goods and still be China.
Posted on: Monday, February 22, 2010 - 16:06