Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: NY Review of Books (11-17-05)
You do not take a person who, for years, has been hobbled by chains and liberate him, bring him up to the starting line of a race and then say, "you are free to compete with all the others," and still justly believe that you have been completely fair.... It is not enough just to open the gates of opportunity. All our citizens must have the ability to walk through those gates.... We seek not just legal equity but human ability, not just equality as a right and a theory but equality as a fact and equality as a result.
The other argument, which is reflected in recent Supreme Court decisions and is currently much heard, is based on the assumption that racial and ethnic diversity among "elites"— relatively well-off people who have some degree of responsibility for others, whether private or public—is beneficial to society and its institutions. Prominent among those who defend affirmative action, for example, are spokesmen for the American military who lent conspicuous support to the University of Michigan's side in the 2003 Supreme Court case. Since the enlisted ranks are disproportionately black and Latino, discipline and morale are presumably inspired by having the same groups represented among the officers, including those of the highest rank. Corporations that deal with a multicultural and multiracial clientele, sometimes on an international scale, find obvious advantages in being represented by people who reflect the racial and ethnic diversity of those with whom they are doing business. Many large corporations practice affirmative action voluntarily even when there is no significant pressure from the government.
In higher education the diversity argument takes a slightly different form. Racially and ethnically heterogeneous student bodies are said to create an appropriate educational environment for students who will encounter many different kinds of people when they go out into the world. Faculties, moreover, must be diverse if they are to provide inspiration and suitable "role models" for minority students.
Clearly affirmative action has had its greatest success in producing more diverse elites, particularly in the much-heralded emergence of a substantial African-American middle class, something that never existed before. But as the sociologist William Julius Wilson has argued for many years, this process of embourgeoisement has been accompanied by the equally substantial growth of "the truly disadvantaged," the economically marginal black inhabitants of the urban ghettos. Since the advent of affirmative action in the 1960s, the overall differences between blacks and whites have changed very little with respect to average incomes, property holdings, and levels of educational attainment. What is new is the gulf that has opened in the black community between the middle class and the lower or "under" class.
Affirmative action originated as a pragmatic response by those in the federal government responsible for enforcing the fair employment provisions of the Civil Rights Act of 1964. The Equal Employment Opportunity Commission (EEOC) set up under the act lacked the staff to investigate most individual claims of discrimination in employment. It also lacked legal authority to act effectively on behalf of the complainants. As a result, the only way that the EEOC could begin to do its job was to request government contractors to provide statistics on the racial composition of their labor force. If blacks (and by the 1970s other minorities as well) were underrepresented among the workers relative to their percentage of the local population, the EEOC set numerical goals for minority recruitment sufficient to correct this disproportion. Employers were then required to make "good faith efforts" to meet "quotas" for black workers. If they didn't hire more blacks, they risked losing contracts. The professed aim was equal opportunity, not racial favoritism; but the paradox that bedeviled the program from the start was that it appeared to require preferential means to reach egalitarian ends.
After its fitful beginnings during the Johnson administration, affirmative action took a dramatic turn under Richard Nixon, whose administration put into effect a controversial plan to integrate Philadelphia's construction trades. Historians have concluded that the Philadelphia Plan of 1969–1970, which set firm racial quotas for hiring for one industry in one city, was a political ploy. It was designed by the Nixon Republicans to cause friction between two of the principal constituencies of the Democratic Party— organized labor, which opposed the plan because of the threat it posed to jobs under its control, and African-Americans, who had overwhelmingly supported the Democrats since the New Deal. At the same time, Nixon was trying to appeal to Southern whites by doing little to enforce desegregation, especially in the schools.
When rising opposition to the war in Vietnam became the critical issue for his administration in 1970 and 1971, and hard hats like the construction workers of Philadelphia were in the forefront of those opposing the anti-war protesters, the Philadelphia Plan was quietly shelved. From then on, Republicans were, for the most part, strongly opposed to affirmative action and benefited from the backlash against it, attacking the Democrats as the "party of quotas" because of their continued support for the policy.
Affirmative action received legal sanction in 1971 when the Supreme Court ruled in Griggs v. Duke Power Co. that discrimination in employment could be subject to affirmative action even if it were not intentional or motivated by prejudice. The Court found that the standardized aptitude tests given by the company to employees prevented blacks from moving to higher-paying departments. Such requirements could be "fair in form," the Court said, but they could still be described as "discriminatory in operation" if they had an "adverse impact" on blacks. Hence the EEOC was legally entitled to set goals for increasing minority employment and to require periodic reports on the progress being made on fulfilling such goals by any of the 300,000 firms doing business with the federal government....
Posted on: Monday, November 7, 2005 - 17:16
SOURCE: Chronicle of Higher Education (11-4-05)
But the problem runs deeper. Disasters and crises in American history have, in fact, rarely produced any fundamental changes in economic or social policy. Natural disasters, like hurricanes, floods, earthquakes, and fire, are invariably local events, leaving too much of the nation unscathed to generate any broad-gauged shift in understanding or ideology. Moreover, the most readily adopted policy changes have involved technical issues, like the approval or revision of fire codes, the earthquake-proofing of new buildings, and the raising of the height of levees. The most far-reaching policy decision after the San Francisco earthquake of 1906 was to flood the beautiful Hetch Hetchy Valley to guarantee San Francisco a more reliable source of water.
Poverty, however, is not a technical issue, but a deep, structural problem that implicates our values, our economic institutions, and our conception of the proper role of the state. There are fixes, but no quick fixes — and no fixes that will not cost something to at least some other members of our society. Understandably, thus, there has always been resistance to government actions, such as increasing the minimum wage, that might aid the poor; and that resistance has long been grounded both in self-interest and in willful blindness of a type that does not succumb to relatively brief crises.
Nowhere is that more evident — or more relevant — than in the history of responses to the panics and depressions that have been a prominent feature of American economic life for almost two centuries. (We called them "panics" until the early 20th century, when widespread recognition that capitalist economies had business cycles led to the use of more reassuring words like "depression," "downturn," and, still later, "recession.")
As American society grew more industrial and urban in the 19th century, and as markets increasingly shaped the ability of its populace to earn a living, the impact of business-cycle downturns broadened and deepened. The long post-Civil War downturn of the 1870s, for example, toppled millions of people into destitution or near-destitution; arguably the first depression of the industrial era, it created widespread distress that prompted pioneering efforts to count the unemployed, while also contributing to outbursts of working-class violence in 1877. Less than a decade later (1884-86), the country lived through another downturn, followed in the 1890s by the most severe depression of the 19th century. The Panic of 1893 (which lasted until 1897) sharply lowered wages, while making jobs exceedingly scarce in many industries. It also prompted the first major protest march on Washington, a national movement of the unemployed led by Ohio's populist leader Jacob S. Coxey.
Yet many, if not most, Americans resisted the notion that millions of their fellow citizens were genuinely in need of aid. New York's distinguished Protestant cleric Henry Ward Beecher famously commented in 1877 that a man could easily support a family of eight on a dollar a day: "Is not a dollar a day enough to buy bread with? Water costs nothing; and a man who cannot live on bread is not fit to live." In the spring of 1894, after the dreadful winter that sparked the formation of Coxey's Army, Daniel B. Wesson, of Smith and Wesson, observed that "I don't think there was much suffering last winter." Others insisted that if men and women were destitute, it was because they were improvident or they drank: "Keep the people sober that ask for relief, and you will have no relief asked for," said Henry Faxon, a businessman from Quincy, Mass. Even Carroll D. Wright, who supervised the first count of the unemployed in American history in 1878 (and a few years later became the first head of the U.S. Bureau of Labor Statistics), expressed the view that most of the men who lacked jobs did not "really want employment."
Private charities, as well as what were called "overseers of the poor" (overseers? the language itself is telling), did, of course, provide some relief to the victims of economic crisis. Yet they did so grudgingly and warily, often insisting that recipients perform physical labor in return for food and fuel, while also (particularly with women) conducting home interviews to verify that applicants for aid were of good moral character. In many states, the sole legislative response to the depressions of the late 19th century was the passage of "anti-tramp" laws that made it illegal for poor workers to travel from place to place in search of work.
Over time, to be sure, the recurrence of panics, coupled with the learning gradually acquired by charity officials and dedicated antipoverty activists like settlement-house workers, did contribute to more-hospitable attitudes toward the poor. They also gave birth to new policy ideas (like unemployment insurance) that might help alleviate the problem of poverty. Such ideas, which began to gain a bit of traction in the early 20th century, were grounded in the supposition that American society had a responsibility to help not only the destitute, but also the poor: the millions of men and women who worked hard, and as steadily as they could, but lived in substandard conditions, a short distance from dire, material need.
Resistance to such proposals, however, did not vanish overnight: Suspicions that the poor were "unworthy" persisted, as did a reluctance to expand the role of government. Of equal importance, crisis conditions (as business-cycle theorists pointed out) did come to an end, lessening the visible urgency of the problem. That was particularly true during the first three sharp depressions of the 20th century, in 1907-8, 1913-14, and 1920-22. The second of those led to the drafting of unemployment-insurance bills in several states, but by the time the bills were ready for legislative action, the economy was picking up, and interest had ebbed. In 1921 a charming, eccentric organizer named Urbain Ledoux put together sensational anti-unemployment demonstrations in both New York City and Boston, including mock auctions of unemployed "slaves" on the Boston Common. He garnered enough front-page attention to earn a meeting with President Warren G. Harding and an invitation to attend the President's Conference on Unemployment — which turned out to be a bureaucratic vehicle that effectively delayed action long enough for the economic crisis to come to an end, with no new policies put into place.
The Great Depression and the New Deal, of course, constitute the most dramatic and far-reaching exception to the pattern of a crisis causing (and revealing) poverty while yielding no basic changes in policy. As always, the exception sheds some light on the dynamics that produce the rule. The Great Depression gave rise to significant shifts in policy not just because the downturn was severe but also — and more important — because of its unprecedented length. There were few innovations in public responses to poverty and unemployment during the early years of the 1930s (no one, after all, knew that this depression would turn out to be the Great Depression), and the era's most durable, systematic legislation came more than two years after Franklin D. Roosevelt took office in 1933. The Social Security Act (with its provisions for unemployment and old-age insurance) and the Wagner Act (which strengthened the right of workers to join unions) were passed only in 1935, the same year that the pioneering work-relief programs of the Works Progress Administration were launched; the Fair Labor Standards Act, mandating a federal minimum wage, was not enacted until 1938, nearly a decade after the stock-market crash.
The greatest economic crisis in American history, thus, did not produce a quick turnaround in antipoverty policy, even though the ideas eventually put into effect had been circulating among progressives for decades. The recognition that millions of people were suffering (inescapable as early as 1931) was not enough to produce action. What was also needed was time for political movements to build and to generate a leadership with the political will to take action.
That involved not just the election of Roosevelt in 1932 (which was primarily a repudiation of Herbert Hoover) but also the Share-the-Wealth movement of Louisiana's own Huey P. Long, the election of many left-leaning Democrats to Congress in 1934, and Roosevelt's overwhelming re-election in 1936. It was at his second inaugural, in January 1937, that Roosevelt famously referred to "one-third of a nation ill-housed, ill-clad, ill-nourished" and committed his administration to dealing with the "tens of millions" of citizens "who at this very moment are denied the greater part of what the very lowest standards of today call the necessities of life."...
Posted on: Friday, November 4, 2005 - 21:34
SOURCE: Newsweek (11-7-05)
Presidents fall into second-term slumps for different reasons. More important for President Bush is how they get out of them. Roosevelt gained an unprecedented third term by convincing the country that he alone was equipped to shield it against the growing threats from Hitler and imperial Japan. Ronald Reagan shook off the albatross of Iran-contra by joining Mikhail Gorbachev to wind down the cold war. Straining to survive the Monica Lewinsky mess, Bill Clinton boasted that he was responsible for the longest economic expansion in American history. ("Dow Jones, not Paula Jones.")
If a majority of the public thought the Iraq war were going well, Bush might naturally turn—as second-term presidents often do—to foreign affairs, climbing aboard Air Force One to pursue his aim of expanding democracy throughout the Middle East and the world.
During his last 18 months in office, Eisenhower flew to Asia, Europe and Latin America and deployed his war hero's popularity to seek new friends for America while trying to improve relations with Moscow. By the time Ike left office, most Americans had forgotten their anger over losing the space race to the Soviets.
Truman and Johnson would have loved to use foreign policy to boost their sagging popularity. LBJ was privately desperate to make the first presidential trip to the Soviet Union (even after Nixon's election to succeed him) and show Americans once and for all that he was no warmonger.
But as their second terms ground to a close, Truman and Johnson both sadly realized that the public was focused on the news from the battlefront. And so long as these beleaguered war presidents remained in office, that news never got better.
Historians sometimes view presidents very differently from the way the public did at the time. Sometimes they don't. Hoping for vindication by history [as would happen in the case of Truman but not in the case of Nixon]....
Posted on: Friday, November 4, 2005 - 18:09
SOURCE: Los Angeles Times (11-3-05)
It is just a coincidence, but fortuitous nevertheless, that the Democrats forced the Senate into a special secret session to discuss how we got into the war in Iraq during the same week that we finally learned the nation was deliberately misled about the famous "Tonkin intercepts" that helped lead us into Vietnam more than 40 years ago.
What worried the Democrats about Iraq turns out to be exactly what happened in Vietnam. We know now, thanks to one brave and dogged historian at the National Security Agency, that after the famed Gulf of Tonkin "incident" on Aug. 4, 1964 -- in which North Vietnam allegedly attacked two American destroyers -- National Security Council officials doctored the evidence to support President Johnson's false charge in a speech to the nation that night of "open aggression on the high seas against the United States of America."
In fact, no real evidence for those attacks has ever been found. The entire case rested on the alleged visual sightings of an inexperienced 23-year-old sonar operator. Nevertheless, Johnson took the opportunity to order the bombing of North Vietnam that night and set the nation inexorably on a path toward the "wider war" he promised he did not seek. And administration bigwigs never admitted publicly that they might have acted in haste and without giving contradictory signals their proper weight.
On the contrary, military and national security officials scrambled wildly to support the story. The media cooperated, with lurid reports of the phony battle inspired by fictional updates like the one Johnson gave to congressional leaders: "Some of our boys are floating around in the water."
The new study apparently solves a mystery that has long bedeviled historians of the war: What was in those famous (but classified) North Vietnamese "intercepts" that Defense Secretary Robert McNamara was always touting to Congress, which allegedly proved the attack took place? Until recently, most assumed that McNamara and others had simply misread the date on the communications and attributed conversations between the North Vietnamese about an earlier Tonkin incident on Aug. 2, 1964, (when the destroyer Maddox was briefly and superficially under fire) to Aug. 4, the day of the phony attack. But, according to the New York Times, NSA historian Robert J. Hanyok has concluded that the evidence was deliberately falsified: there were translation mistakes that were not corrected, intelligence that was selectively cited and intercept times that were altered.
In revealing the story Monday, the Times reported that Hanyok's efforts to have his classified findings made public had been rejected by higher-level agency policymakers who, beginning in 2003, "were fearful that it might prompt uncomfortable comparisons with the flawed intelligence used to justify the war in Iraq."
And rightly so. The parallels between the Tonkin episode and the war in Iraq are far too powerful for political comfort. In both cases, top U.S. national security officials frequently asserted a degree of certainty about the alleged actions and capabilities of an adversary that could not possibly be supported by the available evidence. In both cases, it's possible that the president might have been honestly misguided rather than deliberately deceptive -- at least at first. But in neither case would anyone admit the possibility of an honest mistake.
Posted on: Friday, November 4, 2005 - 12:33
SOURCE: Commentary (11-2-05)
To commemorate Commentary magazine's 60th anniversary, and in an effort to advance discussion of the present American position in the world, the editors asked a number of thinkers to consider a statement and four questions.
In response to a radically changed world situation since the Islamist attacks of 9/11, the United States under George W. Bush has adopted a broad new approach to national security. The Bush Doctrine, as this policy has come to be known, emphasizes the need for preemption in order to "confront the worst threats before they emerge." It also stresses the need to transform the cultures that breed hatred and fanaticism by—in a historically stunning move—actively promoting democracy and liberty in the Middle East and beyond. In the President's words, "We live in a time when the defense of freedom requires the advance of freedom."
This sweeping redirection of policy has provoked intense controversy, especially but not only over its practicality, and especially but not only over its application to Iraq. At issue as well are the precise nature of the threats faced by the United States and the West, the specific tactics adopted by the Bush administration in meeting them, American capabilities and staying power, relations with traditional allies, the larger intentions and moral bona fides of U.S. foreign policy, and much else besides. Opinion on these matters is divided not only between the Left and the Right in political and intellectual life but, quite sharply, among American conservatives themselves.
1. Where have you stood, and where do you now stand, in relation to the Bush Doctrine? Do you agree with the President's diagnosis of the threat we face and his prescription for dealing with it?
2. How would you rate the progress of the Bush Doctrine so far in making the U.S. more secure and in working toward a safer world environment? What about the policy's longer-range prospects?
3. Are there particular aspects of American policy, or of the administration's handling or explanation of it, that you would change immediately?
4. Apart from your view of the way the Bush Doctrine has been defined or implemented, do you agree with its expansive vision of America's world role and the moral responsibilities of American power?
As the editors note, the Bush Doctrine consists of two parts, preemption and democracy, both of them far-reaching in their implications. Yet their scope is different. Preemption specifically concerns the most aggressive tyrannies and radical groups. Democracy primarily concerns one region, the Middle East. The two require separate consideration.
The United States and other democratic governments have historically relied not on preemption but on deterrence to stave off enemies. Deterrence signals, "Don't harm us, or you will pay dearly." It has many successes to its credit, notably in the cold war. But deterrence also has significant drawbacks; it is slow, passive, and expensive. Worst, if it fails, war follows. That happens when a tyrant is not intimidated (Hitler) or when the deterrent threat is not clearly enough articulated (Kim Il Sung, Saddam Hussein).
Several recent changes render deterrence less adequate than in the past. For one thing, the demise of the Soviet Union means that no preeminent enemy power exists to restrain the hotheads, for example in North Korea. For another, the proliferation of weapons of mass destruction raises the stakes; a U.S. President cannot afford to wait for American cities to be destroyed. And for a third, the spread of Islamist terror networks renders deterrence ineffectual, there being no way to retaliate against al Qaeda.
Responding to these changes, President Bush in June 2002 added a second policy option, that of preemption. Americans, he announced, are not prepared to wait for deterrence to fail and war to start. "We must take the battle to the enemy, disrupt his plans, and confront the worst threats before they emerge." U.S. security, Bush said, requires Americans "to be forward-looking and resolute, to be ready for preemptive action when necessary to defend our liberty and to defend our lives."
Preemption is to be deployed in unusual cases, against enemies of a particularly vicious and ephemeral sort. According to a draft Pentagon document, "Doctrine for Joint Nuclear Operations," the military is preparing guidelines for commanders to receive presidential approval to use nuclear weapons to preempt a WMD attack or to destroy enemy stockpiles of WMD.
To date, preemption has been used only once: in the March 2003 war against Saddam Hussein. It most likely would be brought into service a second time against Iran or North Korea.
I have endorsed preemption, both in the abstract and as applied to the Iraqi dictator. But in doing so, I am aware of its special difficulties: error is likely, and uncertainty is inescapable. That three Arab states tightened a noose around Israel in 1967 did not prove they intended to attack it. That Saddam Hussein had a WMD infrastructure still left his plans ambiguous.
These difficulties place special responsibility on a government that preempts. It must act in as transparent a manner as possible, without guile. It must first establish the validity of its actions to its own citizenry. Second, because Americans heed so much what others think, the opinion of the targeted country's population also matters, as does the opinion of other key countries.
In this regard, the Bush administration has fared poorly, convincing only half of Americans and far fewer among most other peoples, including Iraqis and Britons. Should preemption be invoked against Iran or North Korea, public diplomacy would need to be a far higher priority.
When it comes to spreading democracy, the Bush administration breaks no conceptual ground. Since its own war of independence, the United States has inspired others by its example, and its government has consciously promoted democracy since World War I. What is novel today is the interventionist quality of this policy and its application to the Middle East.
Concerning the latter, it is notable that in November 2003, the President referred to what had been an enduring, consensual, bipartisan policy as "sixty years of Western nations excusing and accommodating the lack of freedom in the Middle East." In fact, that emphasis on stability resulted from a recognition of Middle East exceptionalism—that, unlike elsewhere in the world, popular attitudes in this region were deeply anti-American, and distinctly more so than the attitudes of the region's emirs, kings, and presidents. Such a situation naturally led Washington to conclude it had best work with dictators, lest democracy bring radicalized governments to power.
This fear was entirely reasonable, as the 1978 revolution in Iran established and as the Algerian elections of 1991 confirmed. But, setting aside such apprehensions, Bush now insisted that Middle Easterners would, no less than other peoples, benefit from democracy and mature through it. He drew direct comparisons with American success in sponsoring democracy in Europe and Asia.
I cheered this change in direction when it was announced, and still do. But here, too, I find the implementation flawed. The administration is trying to build democracy much too quickly. A mere 22 months, for example, passed between the overthrow of Saddam Hussein and elections for the prime minister of Iraq; in my view, the interval should have been closer to 22 years.
Haste ignores the historical record. Democracy has everywhere taken time, and especially so when it builds on a foundation of totalitarian tyranny, as in Iraq. As I wrote in April 2003:
Democracy is a learned habit, not instinct. The infrastructure of a civil society—such as freedom of speech, freedom of movement, freedom of assembly, the rule of law, minority rights, and an independent judiciary—needs to be established before holding elections. Deep attitudinal changes must take place as well: a culture of restraint, a commonality of values, a respect for differences of view and a sense of civic responsibility.
As for the editors' final question, although Americans have no moral obligation to sponsor freedom and prosperity in the rest of the world, it does make for an excellent foreign-policy goal. The more the world enjoys democracy, the safer are Americans; as other free peoples prosper, so do we. The bold aim of showing the way, however, requires a cautious, slow, and tempered policy. The Bush administration has a visionary boldness but not the requisite operational caution.
Posted on: Wednesday, November 2, 2005 - 15:01
SOURCE: Tribune Media Services (10-31-05)
Amnesty International, Human Rights Watch and other global humanitarian groups recently expressed criticism over the slated trial of the mass murderer Saddam Hussein. Such self-appointed auditors of moral excellence were worried that his legal representation was inadequate. Or perhaps they felt the court of the new Iraqi democracy was not quite up to the standards of wigged European judges in The Hague.
Relay those concerns to the nearly 1 million silent souls butchered by Saddam's dictatorship. Once they waited in vain for any such international human-rights organization to stop the murdering. None could or did.
Now these global watchdogs are barking about legalities — once Saddam is in shackles thanks solely to the American military (which, too, is often criticized by the same utopian-minded groups). The new Iraqi government is sanctioned by vote and attuned to global public opinion. Saddam Hussein was neither. So Amnesty International can safely chastise the former for supposed misdemeanors after it did little concrete about the real felonies of the latter.
We've seen many examples of this in-your-sleep moralizing. U.N. Secretary-General Kofi Annan pronounced from on high that the American effort to remove Saddam was "illegal" — this after moral paragons in the Security Council like China and France chose not to sanction the enforcement of their own resolutions.
Annan presided over a callous, scandalous oil-for-food program that starved millions with the connivance of international financial players, among them his own son. Again, it is easier to grandstand on television than curb illicit profits or be firm with a killer in the real world.
Europeans especially demand heaven on earth. The European Union is now pressuring the United States to turn over its exclusive control of the Internet, which it invented and developed, to the United Nations. So far the Americans, so unlike a Saudi Arabia or China, have not blocked users from net access, and freely adjudicate the World Wide Web according to transparent protocols.
That would never be true of the United Nations. If Iran or Zimbabwe were to end up on the Human Rights Commission, then they would be equally qualified to oversee the computers of millions of Americans. The same European elites who nitpick the United States about its sober stewardship of the Internet would be absolutely impotent once a China or Syria began tampering with millions logging on.
We see still more in-your-sleep moralizing when it comes to the topic of global warming. The heating up of the planet — and the American rejection of the Kyoto Protocol that was supposed to arrest it — is a keen source of anti-Americanism, especially in Europe....
Posted on: Tuesday, November 1, 2005 - 21:34
SOURCE: Ottawa Citizen (11-1-05)
Canada's decisions to stay out of the Iraq war and the American missile defence program were both driven by overwhelming opposition in Quebec -- views not shared to the same degree in the rest of Canada, Jack Granatstein told the annual Ottawa conference of the Canadian Defence & Foreign Affairs Institute.
"French-Canadians have largely shaped our defence and foreign policy since 1968," said Mr. Granatstein, a former director of the Canadian War Museum.
"If it's bad policy to let Canadian Jews or Canadian Muslims have undue influence on Canadian policy toward Israel, it's similarly bad policy to let French-Canadians determine Canadian foreign policy," he said, adding: "Is that too strong, to say it that way?"
Allowing Quebecers to determine Canada's policy on Iraq and missile defence damaged our national interest, Mr. Granatstein said, since our economy depends on trade with the U.S.
"We're extremely vulnerable when the United States is unhappy with us," he said.
"In neither case was there any leadership from Ottawa to try to persuade Quebec that the economy, their jobs, their pocketbook might actually matter more than whether or not Canada supported the U.S. in Iraq or supported ballistic missile defence."
Letting Quebec set the foreign policy agenda also strains national unity, argued Mr. Granatstein, who noted that polls showed a 40-percentage-point difference in support for Canadian involvement in Iraq between Alberta and Quebec at one point.
Posted on: Tuesday, November 1, 2005 - 18:58
SOURCE: NY Sun (11-1-05)
"Iran's stance has always been clear on this ugly phenomenon [i.e., Israel]. We have repeatedly said that this cancerous tumor of a state should be removed from the region."
No, those are not the words of Iran's president, Mahmoud Ahmadinejad, speaking last week. Rather, that was Ali Khamenei, the Islamic Republic of Iran's supreme leader, in December 2000.
In other words, Ahmadinejad's call for the destruction of Israel was nothing new but conforms to a well-established pattern of regime rhetoric and ambition. "Death to Israel!" has been a rallying cry for the past quarter-century. Mr. Ahmadinejad quoted Ayatollah Khomeini, the Islamic Republic's founder, in his October 26 call for genocidal war against Jews: "The regime occupying Jerusalem must be eliminated from the pages of history," Khomeini said decades ago. Mr. Ahmadinejad lauded this hideous goal as "very wise."
In December 2001, Ali Akbar Hashemi Rafsanjani, a former Iranian president and still powerful political figure, laid the groundwork for an exchange of nuclear weapons with Israel: "If a day comes when the world of Islam is duly equipped with the arms Israel has in possession, the strategy of colonialism would face a stalemate because application of an atomic bomb would not leave anything in Israel but the same thing would just produce minor damages in the Muslim world."
In like spirit, a Shahab-3 ballistic missile (capable of reaching Israel) paraded in Tehran last month bore the slogan "Israel Should Be Wiped Off the Map."
The threats by Messrs. Khamenei and Rafsanjani prompted yawns but Mr. Ahmadinejad's statement roused an uproar.
The U.N. secretary-general, Kofi Annan, expressed "dismay," the U.N. Security Council unanimously condemned it, and the European Union condemned it "in the strongest terms." Prime Minister Martin of Canada deemed it "beyond the pale," Prime Minister Blair of Britain expressed "revulsion," and the French foreign minister, Philippe Douste-Blazy, announced that "for France, the right for Israel to exist should not be contested." Le Monde called the speech a "cause for serious alarm," Die Welt dubbed it "verbal terrorism," and a London Sun headline proclaimed Ahmadinejad the "most evil man in the world."
The governments of Turkey, Russia, and China, among others, expressly condemned the statement. Maryam Rajavi of the National Council of Resistance of Iran, a leading opposition group, demanded that the European Union rid the region of the "hydra of terrorism and fundamentalism" in Tehran. Even the Palestinian Authority's Saeb Erekat spoke against Mr. Ahmadinejad: "Palestinians recognize the right of the state of Israel to exist, and I reject his comments." The Cairene daily Al-Ahram dismissed his statement as "fanatical" and spelling disaster for Arabs.
Iranians were surprised and suspicious. Why, some asked, did the mere reiteration of long-standing policy prompt an avalanche of outraged foreign reactions?
In a constructive spirit, I offer them four reasons. First, Mr. Ahmadinejad's virulent character gives the threats against Israel added credibility. Second, in subsequent days he defiantly repeated and elaborated on his threats. Third, he added an aggressive coda to the usual formulation, warning Muslims who recognize Israel that they "will burn in the fire of the Islamic umma [nation]."
This directly targets the Palestinians and several Arab states, but especially neighboring Pakistan. Just a month before Mr. Ahmadinejad spoke, the Pakistani president, Pervez Musharraf, stated that "Israel rightly desires security." He envisioned the opening of embassies in Israel by Muslim countries like Pakistan as a "signal for peace." Mr. Ahmadinejad perhaps indicated an intent to confront Pakistan over relations with Israel.
Finally, Israelis estimate that the Iranians could, within six months, have the means to build an atomic bomb. Mr. Ahmadinejad implicitly confirmed this rapid timetable when he warned that after just "a short period … the process of the elimination of the Zionist regime will be smooth and simple." The imminence of a nuclear-armed Iran transforms "Death to Israel" from an empty slogan into the potential premise for a nuclear assault on the Jewish state, perhaps relying on Mr. Rafsanjani's genocidal thinking.
Ironically, Mr. Ahmadinejad's candor has had positive effects, reminding the world of his regime's unremitting bellicosity, its rank anti-Semitism, and its dangerous arsenal. As Tony Blair noted, Mr. Ahmadinejad's threats raise the question, "When are you going to do something about this?" And Mr. Blair later warned Tehran with some menace against its becoming a "threat to our world security." His alarm needs to translate into action, and urgently so.
We are on notice. Will we act in time?
This article is reprinted with the permission of Daniel Pipes; it first appeared in the New York Sun.
Posted on: Tuesday, November 1, 2005 - 17:09
SOURCE: NYT (11-1-05)
Predicting that Patrick J. Fitzgerald's investigation will go no further than the alleged perjury of I. Lewis Libby Jr., David Brooks invokes the great historian Richard Hofstadter's characterization of a "paranoid style in American politics" to describe Democrats who conclude that Mr. Libby's lies obfuscated the administration's intention to go to war in Iraq.
But Mr. Hofstadter's focus in that essay was the contemporary right wing, exemplified by Joseph McCarthy and Robert Welch, founder of the John Birch Society. For Mr. Hofstadter, a central characteristic of the paranoid style is the conviction that the nation faces a hostile and conspiratorial world and the absence of sensible judgments about how to respond to that threat.
"Respectable paranoid literature not only starts from certain moral commitments that can be justified to many non-paranoids but also carefully and all but obsessively accumulates 'evidence,' " Mr. Hofstadter wrote. The goal is "to prove that the unbelievable is the only thing that can be believed." Senator McCarthy waved fictional lists of Communist agents.
Is not "paranoid" more appropriate as a description of those who misled the nation into war on the "evidence" of weapons of mass destruction?
Posted on: Tuesday, November 1, 2005 - 16:44