Roundup: Media's Take
This is where we excerpt articles from the media that take a historical approach to events in the news.
Murray Friedman, director of the Feinstein Center for American Jewish History at Temple University in Philadelphia, in FrontPageMag.com (Feb. 13, 2004):
Early this year, I was a panelist in a program at a leading Episcopalian Church in Philadelphia. The topic was the public policy postures of the various faith groups. When the subject got around to Israel and the Intifada, I noted that many within this upper-class group seemed hostile to Israel. The acts of suicide bombers, some felt, were a response to the imperialistic designs of the Jewish State. “What alternatives do the Arabs have?” one member of the audience asked.
The incident underlined my feeling that there has been a marked shift on the part of the Left with regard to many issues of concern to Jews, especially Israel. In the period following World War II, Jews were aligned with liberal church groups and others on the Left in the fight to end poverty and gain greater equality for society's disadvantaged, as well as in their mutual support for the State of Israel. In recent years, however, the Left no longer stands by the side of Jews in Middle East struggles; the Right, in fact, has emerged as a more reliable ally to Israeli interests. How did this shift come about? And what does it portend for the future?
The beginnings of the shift can be traced to the racial disorders of the 1960s and the transformation of the civil rights movement into a race revolution. As racial upheavals in major American cities spread across the land, reaching a crescendo of violence following the murder of Martin Luther King, Jr. in Memphis in 1968, a new group of African-American leaders arose who challenged the integration strategies of King and other Black moderates. They argued the civil rights gains achieved by King did not reach down deeply enough into the smoldering ghettos of urban America, and that new approaches must be tried. The radicals, including Malcolm X, Stokely Carmichael (who later called himself Kwame Toure) and James Forman of the Student Non-Violent Coordinating Committee (SNCC), urged separation from their erstwhile white allies and demanded “Black Power.” They called also for African-Americans to identify with the struggle of colored peoples throughout the world against colonial imperialism. In this new paradigm, Israel came to be seen—and portrayed—as an outpost of Western imperialism in the Middle East. All this came to a head following Israel's stunning victory in the Six Day War in 1967.
The date for the split between the Left and Israeli interests can almost be set precisely. Concerned about the impending fragmentation within the Left, a number of “progressives,” including Martin Peretz, publisher of the New Republic, convened a “Conference for a New Politics” at Chicago's Palmer House hotel over Labor Day weekend 1967. The meetings quickly became a fiasco. Peretz, who had funneled hundreds of thousands of dollars into the civil rights and peace movements, was not allowed to speak, even though he was on the event's steering committee. The conference keynoter, Martin Luther King, Jr., was jeered by black militants shouting, “Kill whitey!” Along with Peretz, he stormed out. The conference went on to adopt a number of resolutions, the most troublesome of which condemned the “imperialist Zionist war.” The Palmer House conference marked the last serious effort to forge a national, interracial coalition of the Left.
Other collisions further highlighted the Left's meltdown. In 1968, under a plan developed by the WASP-led Ford Foundation, devastating school strikes took place in New York City, as Black militants seized control of the community-controlled Ocean Hill-Brownsville school district in Brooklyn and fired thirteen Jewish teachers. In the 1970s a furor broke out over the use of racial preferences, or quotas, as critics called them, in university and professional school admissions. And in 1979, Andrew Young, King's chief aide who had been appointed by President Carter to serve as American ambassador to the UN, was forced to resign following his meeting with a PLO official in New York City.
Before 1967, most mainline Protestant religious groups—a term used for the United Methodist Church, the Evangelical Lutheran Church in America, the Episcopal Church, the Presbyterian Church (U.S.A.), and the United Church of Christ, among others—had backed Israel. The creation of a Jewish State was seen as atoning for the Holocaust and as part of a progressive ideology. Israel's military success, however, transformed its image from an embattled and isolated state into an occupying power, a vehicle of Western colonialism. In July 1967, the Executive Committee of the General Board of the National Council of Churches released a statement concentrating mainly on the plight of Palestinian refugees. Deplorable as the problem was, the statement ignored its context. The refugees had been urged to leave during the war by invading Arab nations who promised they could return as soon as the war was won. Mainline church groups seemed unaware that unlike Israel, which had taken in significant numbers of Jewish refugees from Arab countries following its creation, Arab countries have used the Palestinians as pawns in their efforts to destroy the Jewish state.
While these church bodies maintain that they continue to endorse Israel's right to exist and support the end of suicide bombings, they have been critical of the military measures taken by the Jewish State to protect its citizens and have urged an Israeli withdrawal from the occupied territories. A statement by the United Methodist Council of Bishops in May 2002, for example, deplored the disproportionate use of force by the Israelis, assuming that the all-out war currently underway can somehow be fought without casualties to innocent people. The governing body of the Lutheran church in August 2001 went so far as to call for the U.S. government to withhold military aid to Israel.
“There is an ambivalence [among] Lutheran churches as to just how productive it would be to have speakers not willing to see both sides of an issue,” Del Leppke, a convener of the Middle East Working Group for the Chicago branch of the Evangelical Lutheran Church in America, declared recently. With mainline Protestant groups clearly in mind, Church historian Martin Marty has pointed out, “Being anti-Israel has become part of the anti-Establishment gospel, the trademark of those who purport to identify with the masses, the downtrodden and the Third World.”
It is not just mainline Christian churches that have joined in highly charged criticism of the tactics employed by Israel in its war on terrorism and the Intifada. A number of Left-leaning Jews have identified with these criticisms as well. The main function of a new organization, Jews for Peace in Palestine and Israel, has been organizing rallies backing the PLO. “There are many American Jews who are flat-out embarrassed by the fact that the prime minister of Israel is guilty of war crimes,” the group's executive director has said. Like the mainline churches, most of these pro-Palestinian Jewish groups maintain that they continue to remain supporters of the Jewish State. However, figures such as Noam Chomsky and Norman Finkelstein have denounced Israel harshly. In a December 2001 speech in Beirut, Lebanon, Finkelstein compared Israeli behavior to “Nazi practices” during World War II.
A leading figure here has been Rabbi Michael Lerner, editor of Tikkun and long-time supporter of the Israeli Left. Lerner likes to argue that he is pro-Israel and pro-Palestinian and seeks to avoid rhetoric demeaning to Israel, but his actions and associations show otherwise. In 2002, he announced the creation of the Tikkun community, a multi-issue national organization “of liberal and progressive Jews,” to help bring about, among other things, broader concessions by the Israelis (meaning giving up territory) to gain peace. Two months after its founding, however, Lerner along with militant Black activist and Princeton professor Cornel West (described as its co-chair) took out a full-page Tikkun community ad in the N.Y. Times attacking the Jewish State's “oppressive occupation of the territories” and congratulating Israeli reservists who said they would not serve there. The ad, which said nothing about Palestinian terrorism, featured a cartoon of a hook-nosed, disreputable-looking Jew. Israel was described as a “Pharaoh,” while Israeli troops were likened to Nazis blindly “following orders” in “a brutal occupation” that violated international law and human rights.
Complaints against anti-Israel bias on the part of the liberal media have increased since the outbreak of the second intifada in 2000. Terrorists are often described as “militants” in leading newspapers like the N.Y. Times and the Philadelphia Inquirer. NPR, America's foremost publicly funded radio network, has been charged frequently with exhibiting a subtle, Left-wing bias. Unsupported and anecdotal Palestinian charges of Israeli misconduct are routinely aired without any balance or counterpoint. Thus, NPR's Peter Kenyon devoted an entire “Morning Edition” segment on January 9 of this year to the grievances of Palestinians in Nablus following Israeli military action there. Among other things, Israel was accused of demolishing houses, killing “unarmed bystanders,” damaging “ancient walls and streets,” delaying Palestinian firefighters as they tried to save “burning buildings,” and wrecking water and sewer pipes in the city. Kenyon did not provide a single Israeli speaker to convey the necessity for operations in Nablus to disrupt the city's terrorist violence, which has produced a quarter of all Palestinian suicide bombers in the last three years.
It has been on college campuses, however, where the Left is most deeply entrenched and has contributed most heavily to anti-Israel and anti-Jewish sentiments. For example, “Die Jew, die, die, die, die, die. Stop living, die, die, DIE! Do us all a favor and build yourself [an] oven,” was an expression found recently in a student newspaper at Rutgers. A tenured professor at Georgetown asks, “How have Judaism and Jews, and the international forces all permitted Zionism to become a wild, destructive beast capable of perpetrating atrocities?” In addition to such harsh rhetoric, radical professors at many upper-class universities have cooperated with Arab students to urge their institutions to divest from the “apartheid” state of Israel.
In contrast, much of the support Israel has received in recent years has come more from conservative groups and the Right. Even as mainline Protestant groups began to shift ground following the Six Day War, Israel's victory dramatically intensified its positive image among many evangelical leaders. The latter began to call increasingly for greater U.S. support for the Jewish State. For the variously estimated forty to sixty million evangelicals, Israel's success in 1967 was seen as a sign of God's favor. Critics charge such support has more to do with their theology—that the second coming of Jesus will be linked to the return of Jews to the Holy Land. While this may influence some, a poll taken by the International Fellowship of Christians and Jews released in 2002 indicated more than half supported Israel because it is a democracy and an important U.S. ally. Besides, as the late Holocaust historian Lucy Dawidowicz asked, “Why should Jews care about the theology of a ‘fundamentalist preacher’ who speaks with no authority as to God's intentions? And what did such ‘theoretical abstraction’ matter when the preacher is vigorously pro-Israel?”
As a result, beginning in the 1980s, a number of Jewish bodies, including the American Jewish Committee, began reaching out to evangelicals. In 1983, Yechiel Eckstein, a young Orthodox rabbi who once worked for the ADL, founded the International Fellowship of Christians and Jews in Chicago in an effort to cement ties with evangelicals. Claiming, “True Christians are the Jews' best friends,” Eckstein also inaugurated the Center for Judeo-Christian Values in Washington, D.C. The group sought to find common religious ground and establish moral standards and a greater sense of personal accountability in society. Significantly, Senators Joseph Lieberman (D., Conn.) and Dan Coats (R., Ind.), widely seen as more centrist or conservative in their respective parties, were the organization's original co-chairs. In 2002, Eckstein reported American evangelicals had quietly given over $100 million over the previous seven years in humanitarian assistance for needy Jews world-wide, including resettlement costs, housing, food, and medical aid.
In what has been perhaps the most astonishing development, in the last two or three years, we have witnessed a significant shift by Jewish leaders in their response to the Christian Right. Just a few years earlier they had attacked it as anti-Semitic and criticized it for engaging in missionary activity among Jews. But in the summer of 2002, a regional branch of the Zionist Organization of America in Chicago honored Christian Coalition head Pat Robertson at its annual Salute to Israel Dinner. And on May 2, 2003, the Anti-Defamation League, which had sharply criticized Robertson and other Christian Right leaders in a widely commented-upon 1994 pamphlet, took out an ad in the Los Angeles Times and N.Y. Times featuring Ralph Reed, former spokesman for the Christian Coalition. Reed called Israel's continued survival “proof of God's sovereignty.”
Criticized for this, ADL head Abe Foxman remained unrepentant, saying: “I am proud to have Ralph Reed as a friend and as an advocate on Israel.” Foxman was only sorry, he added, that politically liberal Christians tended to be weaker in their support for Israel.
Meanwhile, the seeds of the Left's criticism of Israel that came to light at the New Politics convention in 1967 have continued to sprout. In 1991, after a period of relative calm, black-Jewish tensions heightened again with the rioting of local African Americans against Chasidic Jews living in Crown Heights in Brooklyn (set off by a traffic accident that took the life of a black child and culminating in the murder of a Chasidic scholar). In the off-year congressional elections in 2002, two African-Americans in the House of Representatives in Washington widely seen as anti-Israel, Reps. Earl Hilliard (D., Ala.) and Cynthia McKinney (D., Ga.)—the latter given to conspiracy theories about Jews—lost their seats in the Democratic primaries following an intense campaign against them by pro-Israel elements. Conflicts between blacks and Jews were exacerbated also when a significant number of the members of the Black Caucus in the House voted against a pro-Israel resolution or recorded themselves as merely present when it came up for a vote.
Significantly, even as the Left has become less reliable, support for Israel has grown within the political Right in Congress and elsewhere. Early in July of last year, Tom DeLay, the House majority leader (and a leading evangelical) visited Israel and addressed the Knesset. He pledged continued backing for the Jewish State and opposition to President Bush's “roadmap” for peace if it meant coercing Israel into making concessions that would harm its security.
The President himself, unlike his father, has given many signs of his strong backing for Israel. He has refused to meet or permit government officials to meet with Yasir Arafat, the head of the Palestinian Authority, who Bush feels has been unreliable as a peace partner. His strongest statement was delivered in a landmark address on the Middle East in the White House Rose Garden on June 24, 2002. In it, he declared that the Palestinians would only achieve their goal of statehood if they initiated “new leadership, new institutions, and new security arrangements.” He urged Palestinians to “elect new leaders, leaders not compromised by terror” and indirectly accused Arafat—he did not use his name—of leading an authority that was rife with “official corruption.” Last month, the Jewish Telegraphic Agency reported that the Bush Administration, overcoming some reluctance, was preparing a brief for a hearing before the International Court of Justice in The Hague in support of Israel's decision to erect a West Bank security barrier, a measure widely criticized by the Left.
What, finally, can be said about the shifts described here? While Jews can still be characterized as liberals and will continue to vote heavily for Democratic candidates, it is a chastened liberalism at best. The safety and security of Israel and the war against terrorism remain central to Jewish political behavior today. As former New York Mayor Ed Koch, a life-long Democrat who continues to disagree with much of the Republican domestic program, wrote in the Forward on January 9, 2004, “President Bush has earned my vote because he has shown the resolve and courage necessary to wage the war against terrorism.” “I am prepared, as an American and a Jew, to make the well-being of Israel my primary concern,” Gary Rosenblatt, the highly respected editor of the New York Jewish Week, recently said, “believing that a government that protects a democratic ally in danger shows the greatest understanding and compassion for human rights and values.” A poll released by the American Jewish Committee covering the period from November 25 to December 11 suggests that this view may be gaining ground in the Jewish community. It showed that while Jews are still predominantly Democrats, Bush would get 31 percent of the Jewish vote in a matchup with most of the aspiring Democratic candidates, with the exception of Senator Lieberman. This is a figure about three times greater than in the national election in 2000. As the Left continues to waffle or worse with regard to what Jews feel to be their most fundamental concerns, we may well see the beginnings of the long-predicted Jewish shift to the Right.
Daniel Henninger, in the WSJ (Feb. 13, 2004):
The Democrats, from day one of Terry McAuliffe's year-long nomination rondo, wanted a liberal who would be cast in their own likeness. They never wanted a moderate like Joe Lieberman, a Democrat trying to come to grips with the new political century--its security dangers, efficient global markets and a ragged domestic culture. Mr. Lieberman and those who share his views are secondary Democrats. They don't count. The Democrats who pick the winners in their party's primaries also choose its political course. They are the Primary Democrats. To oppose George W. Bush and his politics, the Primary Democrats want a candidate shaped as they were shaped in the late 1960s and the hard political battles they waged in the succeeding 30 years.
The Primary Democrats danced a few rounds with Howard Dean, whose rage-at-the-machine temperament recalled their own best memories way back when. They have since settled on John Kerry, and properly so. John Kerry, in his person and career, exists today as the embodiment of Democratic Party politics from 1968 to this moment. For Primary Democrats, he is their perfect vessel.
These Democrats opposed the Vietnam War, and like Mr. Kerry, that event serves as sextant in their political journey. Primary Democrats regard their active and successful opposition to Vietnam as moral affirmation of their world view, which holds, more as a matter of belief than principle, that any American foreign policy not of their making is too aggressive, morally suspect and wholly wrong.
It doesn't matter that the iconic president bearing Mr. Kerry's initials (as a young man, Mr. Kerry dated Jackie Kennedy's half-sister, Janet Auchincloss) sent the U.S. into Vietnam on a flying carpet of moral certainty. Or that the political commitment to repulse communism in Vietnam, a commitment that troubled Mr. Kerry as he departed in 1968 for heroic service in the war and revulsed him when he left, was set by Lyndon Baines Johnson. Primary Democrats, for reasons that await the tools of psychoanalysis, believe Vietnam was "Nixon's war." After winning Iowa's caucuses, Mr. Kerry volunteered, "I stood up and fought against Richard Nixon's war in Vietnam."
The Republican Nixon's too-ardent anticommunism, they came to believe, was the provenance for Ronald Reagan's wrongful spending on the communist "threat." So it followed that Primary Democrats would then resist Ronald Reagan on Grenada, Nicaragua and installing Pershing missiles in Europe. As senator, Mr. Kerry held hearings into Ollie North and the Iran-Contra connection. In the same Iowa interview just last month, Mr. Kerry described that effort in the words used in the 1980s by all Primary Democrats: "I stood up and fought against Ronald Reagan's illegal war in Central America."
John Kerry was present at the creation of the moral and intellectual voyage of post-1960s Democrats. He helped map its course. He testified in 1971 against the Vietnam War as a young veteran before the Senate Foreign Relations Committee. He appeared as an antiwar spokesman on "60 Minutes" and "The Dick Cavett Show." John Kerry was a celebrity among Primary Democrats as Bill Clinton never was during this important period. As a Southern governor, Mr. Clinton learned about the inevitable left-right compromises of public policy in ways that rarely tainted the austere ideological experience of Mr. Kerry in the liberal northeast and Washington. (This may well disadvantage Mr. Kerry in the election.)
We have in George Bush a president for whom the formative event of his political life is not Vietnam and the years after but September 11, a catastrophic attack on American soil by an organized global enemy. With his doctrine of pre-emption for threats to U.S. security, his destruction of the Taliban and overthrow of the Hussein regime in Iraq, Mr. Bush has largely broken free of the political period that shaped John Kerry's career. Mr. Bush argues that he is dealing with a world and enemy that has not previously existed. But with Iraq, 30 years of Primary Democratic belief instinctively reappears as resistance, led again by John Kerry. If George Bush's sense of right purpose flows directly from September 11, 2001, so too does many Democrats' from what John Kerry was doing and thinking in 1968 in the Mekong Delta.
Mr. Bush would do well, if he has not already, to revisit the histories of this period. Through the years that John Kerry was personally helping form--and represent--the cognitive gestalt of modern Democratic voters, Mr. Bush was in business. But the Democrats who came to maturity around 1968 spent those years deepening their beliefs and baptizing younger adherents, who filled the streets of San Francisco and elsewhere to oppose "George Bush's illegal war in Iraq."
Michael Chapman, editorial director of the Cato Institute (Feb. 13, 2004):
The Third Way - a mix of capitalism and welfare-state socialism - is the dominant political philosophy in Europe and parts of Asia. Britain's Tony Blair is a Third Way-er, as is America's Bill Clinton. Unfortunately, in many ways, so is President George Bush. His brand of Third Way politics is called "compassionate conservatism." The Democratic Leadership Council terms its version "tolerant traditionalism."
In either case, the tack is the same: steady, incremental steps toward more governmental control over society through myriad laws, regulations and taxes. The state doesn't take over private industries outright - it intervenes, to cover where the free market supposedly fails. As it does so, the state makes things worse, so it intervenes more to "fix" the problems it created, paving the way for more "reforms" - and problems. This interventionism is evident with health care, but it's also clear in the way that Bush and Congress treat other industries.
On Dec. 8, 2003, Bush signed a bill passed by the GOP-controlled Congress that provides prescription drug benefits to people on Medicare, the national health insurance program for seniors. The cost of this law, which (further) subsidizes prescription drug coverage, is now pegged by the White House at $534 billion over 10 years, up from the administration's $400 billion estimate last November. Some analysts project the cost at more than $1 trillion over 10 years.
Given the history of Medicare and other government interference in the health care market, $1 trillion is credible. Medicare is government intervention in the health care industry through subsidies, which boost demand and raise costs, and regulations, which further boost costs. With subsidies, someone else is paying most of the bill. Therefore, neither patients, doctors, hospitals, nor politicians have an incentive to control costs. As costs go up, prices go up. Then patients and doctors complain and politicians blame "greedy" drug companies and "ruthless" health care providers. So, their solution? More subsidies. And it starts all over again.
Medicare, launched in 1965, was the fruit of some 50 years of lobbying by national health insurance advocates, including Harry Truman and Theodore Roosevelt. "The program was created as part of a larger plan to create a government-financed national health care system," reports Sue A. Blevins in her book, "Medicare's Midlife Crisis." "Incremental steps were taken in 1965 toward that goal, including the establishment of Medicare Part A, Medicare Part B, and Medicaid, the government program for low-income individuals of all ages."
In 1965, the government estimated that the cost of Medicare Part A (hospital coverage) would grow to a mere $9 billion by 1990. Wrong. The program ended up costing $66 billion in 1990. The government mis-estimated by more than 600 percent. Total program costs reached $221.8 billion in 2000. Today, Medicare regulations fill more than 130,000 pages. And now you can tack on all the regs, rules, costs, demands, and higher prices that the new prescription drug "reform" will add. Look for Congress to fix this reform in a few years.
At least, one may argue, President Bush and Congress did not enact a national health care plan, as Bill and Hillary Clinton tried to do. But, as Blevins explains, Medicare is a national health care program, 39 years old and getting fatter by the day. The prescription drug reform is just further intervention, another step down the road toward a broader plan: a third way.
"There are middle-of-the-roaders who think they have been successful when they have delayed for some time an especially ruinous measure," said economist Ludwig von Mises of third way interventionism. "They are always in retreat. They put up today with measures which only 10 or 20 years ago they would have considered as undiscussable. They will in a few years acquiesce in other measures which they today consider as simply out of the question."
In 1982, President Reagan called for dismantling the Department of Education. In 1995, the GOP-controlled House approved a budget that called for eliminating three Cabinet departments: Education, Commerce and Energy. Now, nine years later, a GOP-controlled Congress and a "compassionate conservative" president have boosted spending on education to $57 billion, a 70 percent increase since 2002. Neither Bush nor Congress has any active plans to scrap any departments or agencies.
In the 1990s, the GOP Congress tried to eliminate the National Endowment for the Arts. Failing that, they capped its funding. This year, Bush has called for boosting the NEA's budget 15 percent, to $139.4 million. There are nearly 8,000 pork projects in the Omnibus spending bill, and corporate welfare-more government intervention in the marketplace-continues, to the tune of $100 billion.
Welcome to Washington. Welcome to the Third Way.
Noemie Emery, in the Weekly Standard (Feb. 9, 2004):
CLOSE YOUR EYES on some days, and you can almost believe it: You're back somewhere in the mid-1980s, 1984 to be precise. At least from the Democrats' side of the aisle. There it all is: The Republican president denounced as a dunce and a dangerous cowboy; the left on a tear against corporations and tax cuts; and the vast, murky war against a dangerous enemy, which Republicans think of as a crusade against evil and Democrats think is a sham. Magically, the three intervening elections--1992, 1996, and 2000--appear to have vanished, as have their protagonists: Bill Clinton is gone, as is the George W. Bush of 2000, gone in the moment he learned, on live cameras, that tower number two had been hit. We are back now in Reagan country, with deep divisions, big issues, deep feelings, big wars. And quite a few things are familiar. This is the way they equate.
1. THE PRESIDENT. Now, as in those days, there is a Republican president, a man of the West, detested in Europe and deeply despised by the base of the Democrats, who are driven to distraction by his mere presence. He is looked down on by them as a dupe or dullard, and portrayed, as Richard Wirthlin, Reagan's favorite pollster, once put it, as "dumb, dangerous, and a distorter of facts." Reagan was described also, by professional crony Clark Clifford, as an "amiable dunce." Bush should be so fortunate as to have the word amiable invoked in this way by his foes. Instead, he is widely regarded by liberals as swaggering, arrogant, clueless, vindictive, and mean. Opinion differs as to whether he is an evil political mastermind, surrounded by similar knaves and connivers, or merely an empty suit dressed up and guided by others (in which case the "evil genius" description is used to describe his counselor Karl Rove).
Despite all of this, or perhaps owing to it, Bush is nonetheless liked by the rest of the country, which gives him high marks for his leaderly qualities. Leadership and national security are his best issues. His weakest one seems to be the environment. Although his beloved ranch is run on the greenest of principles, the greens turn him down three to one. Repeatedly, they claim he has poisoned the air, poisoned the water, and is feeding small children a diet of arsenic. For these reasons, and others, they long to destroy him, and are united with the rest of the left in this great cause. "Ronald Reagan has provided all the unity we need," Gary Hart said at the 1984 Democratic convention. "Not one of us is going to sit this campaign out. You have made the stakes too high." But not high enough to impress most Americans, who remained less than outraged by the president.
Deep tranches of rage did not produce general anger with Reagan. Thus far, they have failed to do so with Bush....
Simon Jenkins, in the London Times (Feb. 11, 2004):
Your starter for ten. What is the difference between a sadistic oil-rich Arab dictator who must be backed and feted by the West and a sadistic oil-rich Arab dictator who must be bombed and sanctioned into submission? Answer: none.
The lucky dictator in the 1980s was Saddam Hussein and today it is Colonel Muammar Gaddafi. The unlucky dictator in the 1980s was Gaddafi and the unlucky one today is Saddam. In the 1980s the Americans and British were selling Saddam materials for his weapons systems. We knew he was massacring civilians with them. During that time American planes took off from British bases to assassinate Gaddafi in his Tripoli palace. The planes were no more accurate than a similar mission to kill Saddam last year. Dozens of civilians died, including one of Gaddafi's children, but not the target.
Had Gaddafi died in 1986, his death would have been hailed as a triumph against terrorism. Had Saddam been killed then, it would have been seen as a blow to stability and anti-fundamentalism in the Gulf region. Twenty years later neither Saddam nor Gaddafi had changed in their essentials. Both tyrants had aged and become less of a menace to the world. Gaddafi had stopped sponsoring terrorists. Saddam had let the UN destroy his weapons stockpiles. Both still killed their enemies, suppressed opposition and impoverished their peoples.
Yet now it is Saddam whose death is sought by the West and Gaddafi who is hailed by Tony Blair as "courageous and statesmanlike"....
The past month has been astonishing. Another dictator, General Pervez Musharraf of Pakistan, has admitted that his nuclear weapons team have been supplying material to every rogue state in the world, including Iran and North Korea. This has been in defiance of supposedly fierce controls on nuclear dissemination and under the nose of Western intelligence so obsessed with finding non-existent Iraqi bombs that it neglected real Pakistani ones.
Musharraf runs a repressive regime that harbours the Taleban forces out to topple the Kabul regime of Hamid Karzai. The historian of the Taleban, Ahmed Rashid, reports in this month's New York Review of Books that conditions in southern Afghanistan remind him of ten years ago. He is now seeing "history repeat itself, in some respects worse than before". The Taleban is fuelled by unprecedented opium money, released by the US-backed warlords. And what does Musharraf do? He leaves the Taleban in peace and grants a state pardon to his nuclear salesmen, knowing that the West dares not abandon him....
How should we react to a Western foreign policy that is so promiscuously cynical? The answer might be with a weary sigh: it was ever thus. Young diplomats are told that foreign policy is about interests, never morality. I see the recent turn of events as more optimistic. Mr Blair's crusade to save the world has strutted its bloodthirsty hour upon the stage. Its downfall was in being joined to America's search for punitive revenge after 9/11. Both crusade and revenge are now stumbling to a finish in the poppy fields of Afghanistan and the shanty towns of Iraq. We shall not see them again for a generation.
Frida Ghitis, author of The End of Revolution: a Changing World in the Age of Live Television, in the LAT (Feb. 11, 2004):
Just about the time that the White House announced plans for an investigation into faulty Iraq intelligence, my Cambodian friend Phead took me to visit one of the monuments to the victims of his nation's genocide. On the way to see the collection of human bones and skulls gathered from the killing fields of the Khmer Rouge, I asked Phead what he thought about the U.S.-led war in Iraq.
Phead, like other survivors of this country's almost incomprehensible tragedy, has plenty of reason to abhor war -- and to resent and distrust the United States. After all, Washington's Vietnam adventure provided the ferment for the Cambodian civil war that in the 1970s propelled to power the deranged regime of Pol Pot.
The U.S. had carpet-bombed Cambodia in an effort to root out Vietnamese fighters and their supply lines. By some accounts, the bombings killed more than 200,000 villagers. To this day, the scattering of unexploded American ordnance -- along with millions of land mines left by an assortment of armies -- continues to take limbs and lives.
In this atmosphere of chaos Pol Pot came to power, and in less than four years the Paris-educated leader and his followers pursued a Maoist utopia that pushed this country into a nightmare of terror, hunger and death. Other countries contributed to decades of bloodshed in Cambodia, but the main culprit was the demented Pol Pot and his Khmer Rouge followers....
Stopping the killing in Iraq has now become the argument of choice for defenders of that war. Politicians and historians will continue to debate the true reasons behind Washington's decision to target Hussein's regime. But standing in Cambodia's killing fields, one sees that what is inexcusable is doing nothing to stop genocide.
The U.S. track record on stopping mass murders remains unimpressive. The U.S. -- and the rest of the world -- has looked the other way while hundreds of thousands were killed, most recently in places like Rwanda and Sudan.
The experience of Cambodia -- and Iraq -- points to the need for a clear policy spelling out what is to be done when a twisted dictator sets out to destroy his own people.
Amity Shlaes, in the London Financial Times (Feb. 9, 2004):
John Kerry, Howard Dean, George W. Bush and Joe Lieberman all have something in common, and it is not merely that they spent January running for the US presidency. They are all Yale men.
But then Bush v Clinton was also a Yale v Yale event. A Yale graduate has occupied the Oval Office for a decade and a half now. Assuming Hillary Clinton (Yale Law School, 1973) is all her fans hope, the reign of Yale could stretch to 2012.
Observers argue that Yale's dominance reveals something shameful: moneyed dynasties rule the US. The fact that several of the politicians (both Bushes, John Kerry) belonged to a Yale senior society, Skull and Bones, seems to underscore the claim of exclusivity.
But we can also argue the opposite: that Yale's dominance today proves the value of adopting a conscious policy to effect meritocratic change.
This is a story that starts with old Yale, founded in 1701. That Yale enjoyed bright periods and distinguished graduates. But it also suffered long stretches of mediocrity, during which it was known principally for its peculiar rallying cry, "Boola, Boola". Compared with the University of Chicago after the second world war, for example - or the University of Wisconsin before it - Yale was not so exciting. The only president Yale produced for a century and a half was William Howard Taft - remembered by most Americans as the president so corpulent that he is reported to have got stuck in a White House bathtub.
Yale's problem was that it cared more about class than quality. The college excluded all qualified women, nearly all qualified blacks, many qualified Jews and some qualified Catholics. It routinely rejected pupils from public schools - the state schools of towns and cities - on principle. It lagged behind Harvard when it came to accepting outstanding students. Eugene Rostow, who later became Lyndon Johnson's under-secretary of state, was a Yale undergraduate in the 1930s. In a student publication, the Harkness Hoot, Rostow noted that there were no Jewish faculty members. This was a message to the serious Jewish student that "his academic ambitions can never be realised".
In the 1960s, however, two successive Yale presidents, A. Whitney Griswold and Kingman Brewster, set about making a new Yale. As Dan Oren writes in his book, Joining the Club, the pair hired Arthur Howe and R. Inslee Clark as admissions officers, who insisted that Yale must open its gates wider if it wanted to achieve greatness. By 1964, the share of freshmen admitted from public schools stood at 56 per cent, compared with 36 per cent in 1950.
In the early 1970s Yale admitted its first women to the college. The new arrivals were quicker and tried harder than the old Yale boys. Admissions policy became "need blind"; the university picked students first, then figured out how much financial support they required, and delivered much of it.
Michael Dobbs, a Washington Post reporter and the author of Saboteurs: The Nazi Raid on America (2004), in the Post (Feb. 8, 2004):
There are few more difficult issues for a democracy than how it metes out justice to its enemies in time of war. Over the coming weeks and months, as the Supreme Court hears a series of challenges to the Bush administration's proposed use of military commissions to try suspected terrorists, we will become spectators to an extraordinary constitutional drama.
For a preview of how the action is likely to unfold, consider what happened the last time the play was performed, 62 years ago. The setting: wartime Washington. The leading characters: a president determined to make an example out of a group of captured saboteurs; a gritty, Army-appointed defense lawyer intent on doing the best he can for his unpopular clients; nine Supreme Court justices struggling to balance the competing demands of law and war. These characters -- like their modern-day counterparts -- epitomized the American justice system to the rest of the world, and history has delivered a mixed verdict on their performance.
I became fascinated with the case of the Nazi saboteurs (who traveled to America by U-boat with the aim of blowing up factories, bridges and department stores) at about the time the planes crashed into the World Trade Center and the Pentagon on Sept. 11, 2001. The more I delved into the archives, the more I was struck by the parallels between then and now. When President Bush decided, two months after 9/11, to emulate President Franklin D. Roosevelt and establish military tribunals for alleged al Qaeda operatives, history appeared to be repeating itself....
There are differences, of course. Unlike the well-trained killers who destroyed the World Trade Center and fought with U.S. forces in Afghanistan, the Nazi saboteurs were the gang that couldn't shoot straight. They were captured, on American soil, before they got around to blowing anything up. Furthermore, the war on terrorism is a much more nebulous kind of war than World War II, which had a clear goal and clear enemies, whose ideological appeal faded with their physical overthrow. World War II ended when Hitler was defeated; the war on terrorism could go on forever.
But when it comes to the way we deal with captured foes, the similarities are evident enough to ask whether military commissions are compatible with American ideas of justice. As the Pentagon gears up for military tribunals at our Guantanamo Bay base in Cuba, and defense lawyers rehearse the arguments they will make before the Supreme Court at the end of March, it is as if everybody is slipping into pre-assigned roles. In some cases, the actors are reading from the very same text as their World War II predecessors.
The most obvious example of this phenomenon: the rules of procedure for the modern-day military commissions, which were copied almost verbatim from those that Roosevelt established for trying the Nazi saboteurs. The tribunals will consist of seven members. A two-thirds vote is sufficient to secure a verdict. The tribunals will not be required to abide by the cumbersome rules of evidence that are a feature of civilian trials, or even military courts-martial. Instead, the presiding officer can admit any evidence that, in his opinion, has "probative value to a reasonable person." (A bow to political correctness: In 1942, the phrase was "a reasonable man.") The appeals process is reduced to a review by the president or the secretary of defense.
David Zurawik, writing in the Baltimore Sun (Feb. 11, 2004):
Indecency on the airwaves has become such a hot-button issue since Janet Jackson's Super Bowl stunt that there will be two hearings today in Washington - and no shortage of politicians and regulators making pronouncements about the decline in broadcast standards as they promise reform.
But even as the TV networks race to delete images of nudity and sex from such prime-time dramas as ER and Without a Trace in an effort to show that they can police themselves, media historians and analysts say real, lasting change is unlikely. As dramatic as the pictures and soundbites coming out of Washington today might be, it will be mostly political posturing, the experts say, merely the latest movement in a dance between Hollywood and Washington that started with the Communications Act of 1934.
"It is absolutely political theater - especially on the part of Federal Communications Commissioner Michael Powell," said Douglas Gomery, resident scholar at the American Library of Broadcasting at the University of Maryland in College Park and co-author of Who Owns the Media?"These hearings are not going to result in any meaningful change in the kind of television that comes into our homes over the network airwaves."...
... Today's hearing in the House is on legislation that would fine stations 10 times the current amount for carrying material judged indecent by the FCC. Network officials, FCC commissioners and representatives of the National Football League will testify before the House panel today.
"With my bill multiplying FCC fines for indecency tenfold, networks will do more than just apologize for airing such brazen material, they will be paying big bucks for their offenses," said Republican Rep. Fred Upton of Michigan, who introduced the legislation before the Super Bowl flap.
At the Senate hearing, most eyes will be on Ernest Hollings of South Carolina, the ranking Democrat, who wants to link increased fines for indecency to a ban on violent programming between 6 a.m. and 10 p.m. Furthermore, Hollings wants the FCC to revoke the licenses of stations that air indecent or violent material during those hours. License revocation is the most severe penalty the FCC can impose. Powell and the four other FCC commissioners will go from the Senate hearing to Upton's House inquiry.
But analysts say there is little chance Hollings' ban on violence and call for license revocation will ever become law. And, while Upton's effort to increase maximum fines to $275,000 from the current $27,500 is expected to pass with President Bush backing it as a way to help parents protect children from unwanted media messages, it will result in little change...
...The one thing on which all media analysts agreed is the importance of seeing today's hearings and the Super Bowl fallout in the context of the larger issue of media consolidation on the part of companies like Viacom.
The flurry of self-censorship since the Super Bowl - a fairly transparent attempt by the networks to convince Washington that they can be trusted to self-regulate - does raise some First Amendment concerns. As John Wells, president of the Writers Guild and executive producer of ER, said last week, such actions could "have a chilling effect on the narrative integrity of adult dramas." But the FCC does not regulate cable, so HBO and the other channels can continue to do adult content no matter what happens in the hearings.
Furthermore, at this point, such First Amendment concerns are nothing compared to the rule changes championed by Powell last year that would allow a company to own TV stations that reach 45 percent of the households in the United States (up from 35 percent). The protests to that FCC action were so widespread that Congress responded with legislation capping ownership at 39 percent.
The battle continues, but Powell's credibility and image have been bloodied, and experts say his outrage over Jackson's bare breast is an attempt to shift the focus of the debate...
Economist Hal Varian, in the NYT (Feb. 12, 2004):
PRODUCTIVITY growth took a breather last quarter, slowing to 2.7 percent, after the previous quarter's torrid 9.5 percent growth. Still, by historical standards, 2.7 percent is a respectable number.
From 1948 to 1973, productivity grew at close to 3 percent annually, doubling the living standard in that period. Then came the dark age of productivity growth: from 1974 to 1994, it averaged only 1.4 percent a year. From 1995 to 2000, we had something of a productivity renaissance, with growth climbing to more than 2.5 percent a year.
When the economy started to slow down in 2000, many economists expected productivity growth to fall back under 2 percent. But contrary to these expectations, productivity has continued to grow strongly.
It is difficult to overstate the importance of productivity growth for the long-run health of the economy. Over the years, virtually all economic progress has come from productivity growth. An increase of half a percent a year can make a huge difference over 20 or 30 years.
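A quick check of the compounding arithmetic behind these two claims (an illustration added for clarity, not from Varian's column): growth at 3 percent a year over the 25 years from 1948 to 1973 roughly doubles output per worker, and even a half-point difference in the growth rate opens a wide gap over 30 years:

\[
(1.03)^{25} \approx 2.09, \qquad \frac{(1.025)^{30}}{(1.02)^{30}} \approx 1.16
\]

That is, an economy growing just half a percent faster ends up roughly 16 percent richer after three decades than one that does not.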
So it's pretty important to understand why productivity growth declined so sharply in the 70's and rebounded so strongly in the 90's.
Unfortunately, there is no consensus about why productivity growth slowed in the first place, though there is no shortage of theories. Various factors, including the 1973 oil price shock, the baby boomers' entry into the labor market, an increase in regulation and a slowdown in technological innovation, seem to have played a role.
But there is an emerging consensus about why productivity growth surged again in the mid-90's: most economists say information technology played a major role.
Rachel Sauer, in the Palm Beach Post (Feb. 4, 2004):
Oh, the deliciousness of scandal! The lurid details! The shock and outrage! The entertainment piled on entertainment!
Because entertainment without occasional scandal is just a little, well, boring. Having our envelopes pushed can engage us in art and entertainment in a way that quality and highbrow notions often can't. If nothing else, scandal keeps us looking, and talking.
So we're scandalized by Janet Jackson's Super Bowl halftime spectacle: a bare breast mixed into an entertainment extravaganza. We're tut-tutting and gossiping and theorizing.
And we're remembering when we were here before, at this place of scandal, when artists and entertainers did shocking things in the course of a performance and we couldn't stop talking. Let's take a stroll back, keeping in mind that we used to be easier to outrage.
At the May 29, 1913, debut performance of Igor Stravinsky's Rite of Spring in Paris, audience members were so upset by the work -- a violent ballet choreographed by Vaslav Nijinsky depicting fertility rites, set to Stravinsky's primitive, unsettling music -- that fistfights broke out in the aisles. Soon a riot erupted and police couldn't restore order.
In 1926, Mae West wrote and starred in a play called Sex, about a Montreal prostitute, that ran on Broadway for almost a year before New York City's deputy police commissioner raided the theater. West was charged with lewdness and corrupting youth and spent 10 days in jail.
Also in 1926, actress Clara Bow exuded such open sexuality in the movie Mantrap, as the supposedly predatory wife of a woodsman, that audience members couldn't hide their outrage.
Although there was no visible tongue, Greta Garbo gave John Gilbert the screen's first obviously open-mouth kiss in 1927's Flesh and the Devil. Fans' tongues wagged in response.
Audiences were indeed shocked when actress Jean Harlow asked, "Would you be shocked if I changed into something more comfortable?" in 1930's Hell's Angels.
Vladimir Nabokov's novel Lolita, about a middle-aged man who lusts mightily after and seduces his 12-year-old stepdaughter, was released in the United States in 1958, following its original 1955 release in France. Enjoying three years' worth of scandal in Europe -- the book was banned in Great Britain and France -- it sold more than 100,000 copies in its first three weeks of U.S. release. Critics loved it; the moral majority called it pornography.
It was a true rock 'n' roll moment when Elvis Presley sang Hound Dog on the June 5, 1956, Milton Berle Show. His wild, pelvis-thrusting dance style inspired outraged TV critics to decry the performance for its "appalling lack of musicality," "vulgarity" and "animalism." The Catholic Church issued a statement called "Beware Elvis Presley."
Molly Ivins, in her column for Creators Syndicate (Feb. 10, 2004):
Just for the record, since the record is in considerable peril. These are Orwellian days, my friends, as the Bush administration attempts to either shove the history of the second Gulf War down the memory hole or to rewrite it entirely. Keeping a firm grip on actual historical fact, all of it easily within our imperfect memories, is not that easy amid the swirling storms of misinformation, misremembering and misstatement. But since the war itself stands as a monument to what happens when we let ourselves get stampeded by a chorus of disinformation, let's draw the line right now.
According to the 500-man American team that spent hundreds of millions of dollars looking for Iraqi weapons of mass destruction, there aren't any and have not been any since 1991.
Both President Bush and Sen. Pat Roberts, chairman of the Senate Intelligence Committee, now claim Saddam Hussein provoked this war by refusing to allow United Nations weapons inspectors into his country. That is not true. Bush said Sunday: "I had no choice when I looked at the intelligence. ... The evidence we have discovered this far says we had no choice."
No, it doesn't. Last week, CIA director George Tenet said intelligence analysts never told the White House "that Iraq posed an imminent threat."
Let's start with the absurd quibble over the word "imminent." The word was, in fact, used by three administration spokesmen to describe the Iraqi threat, while Bush, Vice President Cheney and Secretary of Defense Donald Rumsfeld variously described it as "immediate," "urgent," "serious and growing," "terrible," "real and dangerous," "significant," "grave," "serious and mounting," "the unique and urgent threat," "no question of the threat," "most dangerous threat of our time," "a threat of unique urgency," "much graver than anybody could possibly have imagined," and so forth and so on. So, could we can that issue?
A second emerging thesis of defense by the administration in light of no weapons is, as David Kay said, "We were all wrong."
No, in fact, we weren't all wrong.
Bush said Sunday, "The international community thought he had weapons." Actually, the U.N. and the International Atomic Energy Agency both repeatedly told the administration there was no evidence Iraq had WMD. Before the war, Rumsfeld not only claimed Iraq had WMD but that "we know where they are." U.N. inspectors began openly complaining that U.S. tips on WMD were "garbage upon garbage." Hans Blix, head of the U.N. inspections team, had 250 inspectors from 60 nations on the ground in Iraq, and the United States thwarted efforts to double the size of his team. You may recall that during this period, the administration repeatedly dismissed the United Nations as incompetent and irrelevant. But containment had worked.
Nor does the "everybody thought they had WMD" argument wash on the domestic front. Perhaps the administration thought peaceniks could be ignored, but you will recall that this was a war opposed by an extraordinary number of generals. Among them was Anthony Zinni, who has extensive experience in the Middle East and who said, "We are about to do something that will ignite a fuse in this region that we will rue the day we ever started." After listening to Paul Wolfowitz at a conference, Zinni said, "In other words, we are going to go to war over another intelligence failure." Give that man the Cassandra Award for being right in depressing circumstances.
Marine Gen. John J. Sheehan was equally blunt. Any serving general who got out of line, like Army Chief of Staff Eric Shinseki, was openly dissed by the administration.
Michael Novak, in National Review (Feb. 2, 2004):
The news media, which constantly accuse the Bush administration of exaggerating the threat in Iraq, are constantly exaggerating the number of U.S. combat deaths. I first pointed this out last August [in www.nationalreview.com]. For a while, the exaggeration stopped, but early in January it recommenced. The round number "500" was apparently irresistible.
Yet as of January 15, exactly ten months after the war began on March 16, 2003, the official number of US combat deaths listed by the Defense Department was 343. Another 155 had died from non-hostile causes, including 100 in accidents and others from illness, etc. Since non-hostile causes are responsible for army deaths in peacetime as well as wartime, in bases at home as well as in war zones, many of the non-hostile deaths ought not to be counted as specific to Iraq, although of course a portion of them are.
These 343 (not 500) combat deaths, furthermore, need to be set in context. During 2003, the number of homicides in Chicago was 599, in New York City 596, in Los Angeles 505, in Detroit 361, in Philadelphia 347, in Baltimore 271, in Houston 276, and in Washington 247. That makes 3,202 deaths in only eight cities.
The least the media could do is print the number of combat deaths in Iraq in two columns. The first would show the number of days since the war began (as of January 15, 305). The second column might show the number of combat deaths as of the same date (343).
Gregory Kane, in the Baltimore Sun (Feb. 11, 2004):
RUN THAT ONE by me again, Thomas G. Duncan. I don't think I quite got it.
Duncan, for those of you who don't read The Sun every day, is a county councilman in Talbot County on the Eastern Shore. A quote of his appeared in Monday's edition of The Sun, in a story written by reporter Chris Guy. It seems Duncan has a problem with a statue of Talbot County's most famous resident, abolitionist and statesman Frederick Douglass, being placed on the courthouse lawn in Easton. That honor, Duncan feels, should only be for those who served in the armed forces, even "The Talbot Boys," who fought for the Confederacy.
Here's how Duncan summed it up, according to Guy:
"I think that ground is hallowed ground. People there either served or died for their country."
It's at this point that Duncan needs to run it by me again. A statue paying tribute to "The Talbot Boys" stands on the courthouse lawn. They most assuredly did not "serve their country." The country they served was the Confederate States of America - Maryland was never a member of that country, the last time I checked my history - which should not be confused with the United States of America.
Bret Stephens, editor in chief of the Jerusalem Post, in the WSJ (Feb. 11, 2004):
In Israel, where I live and work, suicide bombings are commonly understood by the foreign press as acts of desperation by a people who have lost all hope for a better future. Ease the economic hardships of Palestinians and end the occupation, so the thinking goes, and terrorism will be deprived of its motive.
It's a convenient notion, which more or less excuses mass murder as the deeds of men who have been robbed of their property, pride and patrimony. But is it right? What if suicide bombings aren't an act of despair at all but something approaching the opposite: a supreme demonstration of contempt for everything Westerners hold dear, not least life itself? What if, too, suicide bombers are no poor-man's F-16 but a robust expression of confidence that the Palestinians are infinitely more ruthless than Israelis in what amounts to a zero-sum game?
Lee Harris believes that these are exactly the sorts of questions that we should be asking today, and not only about the war in the Mideast. In "Civilization and Its Enemies" (Free Press, 231 pages, $26), he argues, brilliantly at times, that if you want to understand your enemy, you must understand him on his terms, not yours.
Take 9/11. Everyone from George W. Bush to Noam Chomsky agreed that the attacks were acts of war, even if they disagreed about exactly which political aims the acts were meant to further. But Mr. Harris takes a different view: 9/11, he says, was "a spectacular piece of theater."
"The targets were chosen by al-Qaeda not for their military value -- in contrast, for example, to the Japanese attacks on Pearl Harbor -- but entirely because they stood as symbols of American power universally recognized on the Arab street. They were gigantic props in a grandiose spectacle in which the collective fantasy of radical Islam was brought vividly to life."
Steven Aftergood, in Secrecy News, the newsletter of the Federation of American Scientists Project on Government Secrecy (Volume 2004, Issue No. 17, February 11, 2004):
It is often noted that espionage is an ancient enterprise with roots at least as old as the Bible.
But what is rarely if ever recalled is that intelligence oversight and accountability are *also* part of the Biblical record, and that the Deity imposed a severe penalty upon those who distorted intelligence and inflated threats.
A Washington Times op-ed writer today attempted to defend the CIA by citing the first half of the Biblical precedent.
"Some Americans find in the CIA a convenient scapegoat, failing to recognize that throughout history espionage has been used to protect peoples from their enemies. Ancient Israel had spies:
'Moses sent them to spy out the land of Canaan [to see] whether the cities they dwell in are camps or strongholds.'
(Numbers 13:17-19)," wrote Ernest W. Lefever of the Ethics and Public Policy Center in the Washington Times, Feb. 11, p. A18.
What Dr. Lefever failed to mention is that the spies sent by Moses came back with a hyped National Intelligence Estimate, with unhappy results.
"The land, through which we have gone, to spy it out, is a land that devours its inhabitants... and we seemed to ourselves like grasshoppers, and so we seemed to them." (Numbers 13: 32-33).
Only Joshua and Caleb dissented from this majority view.
Because they wittingly or unwittingly exaggerated the capabilities of the Canaanites, God sentenced the spies to death, displaying no judicial deference to the intelligence agencies.
"The men who brought an unfavorable report about the land died by a plague before the Lord," we are told.
"But Joshua son of Nun and Caleb son of Jephunneh alone remained alive, of those men who went to spy out the land." (Numbers
Russell Shorto, in the NYT (Feb. 9, 2004):
Acre for acre, Lower Manhattan may be the most historic piece of real estate in America. Here the Sons of Liberty plotted revolution, the Stamp Act Congress met to defy taxation without representation, colonists exchanged fire with British ships in the harbor, and General Washington and his officers celebrated their victory. The first president was inaugurated here, and Congress, meeting at Federal Hall, wrote the Bill of Rights. In one remarkable moment in time, Washington, Jefferson, Madison and Hamilton all lived and worked in these narrow streets. For two centuries, this tiny quadrant was New York — and the gateway to America for millions of immigrants.
It was also, of course, the site of the World Trade Center. Both the building of the twin towers and their destruction flow from that deep history, for the events that occurred here contributed not only to the nation's growth but to the rise in might of New York as global capital and lower-case world trade center.
It would seem natural, then, to connect the site to its past. But neither the winning design for the World Trade Center memorial nor any of the public conversation or press attention surrounding it have attempted to do so. A sense of history has been absent from the whole process.
Maybe that shouldn't be a surprise. The fact is, Lower Manhattan — which after all is America's financial capital, and has business to do — has never had the reflective, carefully tended atmosphere of Boston's historic center, Philadelphia's Independence Hall, or the memorials in Washington. Monuments that anywhere else would serve as a city's cherished heart are lost in the Wall Street shuffle. Many historic events that took place here don't even rate a plaque. Over the past three years, while working on a book about Manhattan's founding, I spent a lot of time in and around the historic sites of Lower Manhattan, and routinely encountered clusters of tourists zigzagging haphazardly through the area, guidebooks in hand, knowing that they were walking streets redolent of the past but having a hard time sniffing it out.
The reason for the city's offhand approach to its most elemental history goes all the way back to the beginning, and has to do with New York's unique development. While the colonies to the north and south were English, the population of New York, dating from its beginnings as the Dutch city of New Amsterdam, was mixed, "foreign." For all New York's power, the Brahmins in Boston and the planters of Virginia kept it at arm's length. New York's image of itself has always reflected this. From early on, the city ceded patriotic sentiment to others and put its energy into the present. New York, the feeling goes, is too big, too chaotic, too jazzed and hustling and busy to turn itself into a museum.
Editorial in the NYT (Feb. 11, 2004):
If President Bush thought that his release of selected payroll and service records would quell the growing controversy over whether he ducked some of his required service in the Air National Guard three decades ago, he is clearly mistaken. The payroll records released yesterday document that he performed no guard duties at all for more than half a year in 1972 and raise questions about how he could be credited with at least 14 days of duty during subsequent periods when his superior officers in two units said they had not seen him.
Investigative reporting by The Boston Globe, our sibling newspaper, revealed in 2000 that Mr. Bush had reported for duty and flown regularly in his first four Texas Guard years but dropped off the Guard's radar screen when he went to Alabama to work on a senatorial campaign. The payroll records show that he was paid for many days of duty in the first four months of 1972, when he was in Texas, but then went more than six months without being paid, virtually the entire time he was working on the Senate campaign in Alabama. That presumably means he never reported for duty during that period.
Mr. Bush was credited with 14 days of service at unspecified locations between Oct. 28, 1972, and the end of April 1973. The commanding officer of the Alabama unit to which Mr. Bush was supposed to report long ago said that he had never seen him appear for duty, and Mr. Bush's superiors at the Texas unit to which he returned wrote in May 1973 that they could not write an annual evaluation of him because he had not been seen there during that year. Those statements are so jarringly at odds with the payroll data that they demand further elaboration. A Guard memo prepared for the White House by a former Guard official says Mr. Bush earned enough points to fulfill his duty but leaves it unclear whether he got special treatment.
The issue is not whether Mr. Bush, like many sons of the elite in his generation, sought refuge in the Guard to avoid combat in Vietnam. The public knew about that during the 2000 campaign. Whether Mr. Bush actually performed his Guard service to the full is a different matter. It bears on presidential character because the president has continually rejected claims that there was anything amiss about his Guard performance during the Alabama period. Mr. Bush himself also made the issue of military service fair game by posturing as a swashbuckling pilot when welcoming a carrier home from Iraq. Now, the president needs to make a fuller explanation of how he spent his last two years in the Guard.
Greg Tate, in the Village Voice (Feb. 4-10, 2004):
We African Americans lead strange and conflicted lives at the movies. For this reason, the Internet was recently abuzz with calls by actor and self-described semiotician Erik Todd Dellums to boycott Cold Mountain, a Civil War film noticeably lacking in melanin content. Charles Frazier's novel hardly avoids African Americans as concertedly as the Anthony Minghella film starring Jude Law and Nicole Kidman. The versions share some key erasures, though—the opening scene, a re-creation of the legendary Battle of the Crater in Petersburg, Virginia, is perhaps the most egregious. On that July 1864 morning, Union soldiers exploded the ground underneath a drowsy Confederate regiment. Novel and film fail to mention how specially trained African American troops had been poised to attack the Crater (now a historical tour site) and the Southerners it swallowed. Historians claim that the African Americans were withdrawn due to fears of Northern political fallout if they were used as cannon fodder. Whatever, dude. Methinks the sight of armed African Americans freely picking off shocked and awed white Southern troops was too avant-garde for 1864. In any event, the upshot of the switch was that untrained white Unionists didn't flank the Crater as the brothers were trained to, but rushed in and got shot up like fish in a barrel. At which point all the bloods got thrown in as cannon fodder anyhow. The Confederates, already peeved at being sneak-attacked, lost it when they saw armed and uniformed men of African descent. One need only imagine the language they used. A military adviser on the film recalls Minghella shooting a scene in which a crazed Confederate soldier slaughters a wounded African American. The adviser believes the scene got cut because it was "too over-the-top" and "too painful." Minghella has similarly explained away the film's eschewing the immorality of slavery. Since that would entail having Nicole Kidman's snow-pure love object reflect on being a slave owner, one can see why. Once again liberal guilt goes belly-up in the guts sweepstakes.
As Abraham Lincoln's birthday approaches, Republicans around the nation gather together in country clubs and halls for their annual Lincoln Day banquets. Dressed in their finest, the loan officer of the branch bank sips his gimlet while the cross-wearing locksmith intones the virtues of home schooling to the party faithful gathered around the table. The rank and file eye each other warily, occasionally straining a smile during the dinner. After dessert a speaker arises who, after repeating some old Clinton jokes he remembers and praising the candidates strung along the front table as the only hope for America, will remind the assembly that Abraham Lincoln was the first Republican president and proclaim that the party of Lincoln is marching on to victory with his spirit guiding their footsteps.
Fortified with those words, the party stalwarts depart, carrying away the souvenir crepe paper centerpieces to their foreign-made cars. But behind the facade of the Republican party's claim to be the party of Lincoln is the unpleasant but undeniable truth that if Abraham Lincoln were alive today he would be a Democrat. History and his story, along with his views on government, strongly support it.
Abraham Lincoln was not in fact much of a Republican when he was alive. Lincoln was for most of his life a proud member of the Whig party, whose platform of harnessing the power of the government to invest in the public good is not greatly different in substance from that of modern mainstream Democrats. As the Whig party fell apart in the 1850s over the issue of the expansion of slavery into Kansas and Nebraska, Lincoln tried to hold the party together. In 1854 he ran for the state legislature identifying himself as an Anti-Nebraska Whig. During the campaign a group of Republicans named Lincoln to the central committee of the new party without his consent or knowledge. "I have been perplexed some to understand why my name was placed on that committee," he wrote.
In 1856 Lincoln, as recorded in his writings, reluctantly joined the Republican presidential campaign of John Fremont. In the campaign Lincoln preferred to be identified as a Fremont man or simply Anti-Nebraska, still finding the Republican label distasteful. The party was tainted in Lincoln's view by the alliance with the former members of the bigoted American or "Know-Nothing" party, whose anti-immigrant rhetoric foreshadowed the far right faction of present-day Republicans.
Lincoln ran on a Republican ticket only one time: his 1860 election as President. In his famous but unsuccessful campaign for the Senate against Stephen Douglas in 1858, Lincoln technically did not run as a Republican, although he was widely known as their candidate. At that time neither he nor Douglas appeared on the voters' ballots under the old system of indirect selection of Senators by state legislatures. Lincoln's last campaign, his re-election in 1864, was run under the banner of the Union Party, which brought together pro-Union Democrats, including his running mate Andrew Johnson, along with Republicans.
Although the names have remained the same, the parties have changed their principles and positions, in many ways flipping to the same degree that regions have flipped their party strengths in the last 150 years. Lincoln's reticence about the Republican party of his day would be more than matched by the sheer rejection the modern GOP would have for a Lincoln living in these times. Lincoln was a deeply devout and spiritual man but was not a churchgoer. On that basis alone the Christian Coalition, which exercises a disproportionate power within the Republican party, would effectively veto his chances for public office, distributing fliers in church parking lots denouncing him on the Sunday before the election, much as happened to John McCain. And what overblown scandal could Ken Starr have made out of Ann Rutledge?
A reincarnated Lincoln would relive part of his past life listening to the states' rights arguments contemporary Republicans use against any proposal to help working families. The man who created the Department of Agriculture would recoil at the anti-government diatribes of House Republicans. The president who levied an income tax on the wealthy would have been shocked at George W. Bush's disproportionate tax cut to the wealthiest one percent. The chief executive who believed in practical action to regulate the marketplace, such as standardizing railroad gauges across the country, would face a barrage of paranoia about big government from the right-wing think tanks and media. But above all, the president who did everything he could to avoid a war would not have sent Americans into battle on false or faulty pretenses based on slanted intelligence.
Everything Lincoln stood for, if stripped of its nineteenth-century labels, places him within the modern-day Democratic Party. The man who in 1858 spoke of the "eternal struggle" between right and wrong, the "two principles that have stood face to face since the beginning of time... The one is the common right of humanity and the other the divine right of kings," would not be a Republican today. He made his choice long ago. It would not have been for a party that has tarnished and misused his name, and could more accurately call itself the party of Richard Nixon or Trent Lott or Dick Armey or Tom DeLay or Rush Limbaugh, but not Abraham Lincoln.