Roundup: Media's Take
This is where we excerpt articles from the media that take a historical approach to events in the news.
Tim Rutten, writing in the LAT (Feb. 4, 2004):
[Mel] Gibson has allowed himself to be characterized as a Catholic and has reinforced that impression by seeking the Vatican's approval of his film and then publicizing a purported papal endorsement. Reams of sympathetic publicity continue to describe Gibson as "a devout Catholic."
In fact, he is not. Catholics belong to churches that recognize the pope as their religious leader. If you don't, you're not a Catholic. It's as simple as that. What Gibson would rather not discuss is his membership in a schismatic group that has appropriated various pious practices and sacramental rites from preconciliar Roman Catholicism, but which rejects the contemporary church's leaders and teachings. Among the most important of those teachings is a complete rejection of any interpretation of the Passion that attributes a particular or continuing responsibility for Christ's execution to the Jewish people.
Because Gibson attends -- indeed, finances -- a church that rejects such teachings, other questions arise: For instance, in an interview with commentator Peggy Noonan to be published in the forthcoming Reader's Digest, Gibson says, "My dad taught me my faith, and I believe what he taught me. The man never lied to me in his life."
Fair enough, but Hutton Gibson -- the filmmaker's father -- is a well-known Holocaust denier.
Reader's Digest declined to make a full text of the interview available to The Times, but in the brief promotional excerpt released this week, Noonan is quoted as asking, "You're going to have to go on the record. The Holocaust happened, right?"
Gibson replied, "Yes, of course. Atrocities happened. War is horrible. The Second World War killed tens of millions of people. Some of them were Jews in concentration camps. Many people lost their lives. In the Ukraine, several million starved to death between 1932 and 1933. During the last century, 20 million people died in the Soviet Union."
Abraham H. Foxman, national director of the Anti-Defamation League, responded that "reading this, I have to conclude that, at best, Mr. Gibson is ignorant and, at worst, he is insensitive. War was not the cause of the Holocaust; Jews died because of who they were. The Holocaust is different in kind from other historical tragedies because it's about people being slaughtered for who they were. Comparing it to the famine in the Ukraine, which was terrible, is nonetheless ignorant and insensitive."
Foxman also said he was "troubled by the cavalier way in which he treats the question of his father's influence. I respect people who respect their elders. But Mr. Gibson says his father never lied to him and yet he has been lying for years to the world about the Holocaust. Saying everything his father said is true puts him in a very strange position, since his father is a public Holocaust denier."
Walter V. Robinson, writing in the Boston Globe (Feb. 5, 2004):
A detailed Globe examination of the records in 2000 unearthed official reports by Bush's Guard commanders that they had not seen him for a year. There was also no evidence that Bush had done part of his Guard service in Alabama, as he has claimed. Bush's Guard appointment, made possible by family connections, was cut short when Bush was allowed to leave his Houston Guard unit eight months early to attend Harvard Business School.
Bush received an honorable discharge in 1973. The records contain no indication that Bush's commanding officers, one of them a friend, ever accused him of shirking his duty.
In an interview yesterday, Dan Bartlett, the White House communications director, asserted that Bush "fulfilled his military requirements." Bartlett acknowledged that Bush's "irregular civilian work schedule could have put strains on when he served, when he performed his duty."
Before the Globe report in May 2000, Bush's official biography reported erroneously that he flew fighter-interceptor jets for the Houston Guard unit from 1968 to 1973. In a 1999 interview with a military publication, Bush said that the values he learned as a pilot included "the responsibility to show up and do your job."
Most Democrats consider [Michael] Moore's accusation of desertion unsupportable.
Still, according to the records and interviews in 2000, Bush's attendance record in the Guard was highly unusual:
* Although he was trained as a fighter pilot, Bush ceased flying in April 1972, little more than two years after he finished flight school and two years before his six-year enlistment was to end, when he was allowed to transfer to an Alabama Air Guard unit. The records contain no evidence that Bush performed any military duty in Alabama. His Alabama unit commander, in an interview, said Bush never appeared for duty.
* In August 1972, Bush was suspended from flight status for failing to take his annual flight physical.
* In May 1973, Bush's two superior officers in Houston wrote that they could not perform his annual evaluation, because he had "not been observed at this unit" during the preceding 12 months. The two officers, one of them a friend of Bush and both now dead, wrote that they believed Bush had been fulfilling his commitment at the Alabama unit.
Two other officers, in interviews, offered a similar account of Bush's absence, saying they had assumed Bush completed his service in Alabama.
* Bush's official record of service, which is supposed to contain an account of his duty attendance for each year of service, shows no such attendance after May 1972. In unit records, however, there are documents showing that Bush was ordered to a flurry of drills - over 36 days - in the late spring and summer of 1973. He was discharged Oct. 1, 1973, eight months before his six-year commitment ended.
Through Bartlett, Bush insisted in 2000 that he had indeed attended military drills while he was in Alabama during 1972 and in 1973 after returning to his Houston base. At the time, Bartlett said Bush did not recall what duties he performed during that period.
Albert Lloyd Jr., a retired colonel who was the personnel officer for the Texas Air National Guard at the time, said in an interview four years ago that the records suggested to him that Bush "had a bad year. He might have lost interest, since he knew he was getting out."
Lloyd said he believed that after Bush's long attendance drought, the drills that were crammed into the months before Bush's early release gave him enough "points" to satisfy the minimal requirements to earn his discharge. At the time, Lloyd speculated that, after the evaluation of Bush could not be done, "his superiors told him, 'George, you're in a pickle. Get your ass down here and perform some duty.' And he did."
From NPR (Feb. 5, 2004):
The flu epidemic of 1918 ranks with the Black Death of the Middle Ages as one of the deadliest contagions of all time. The virus swept across the Earth, killing an estimated 20 million people in little over a year. In the United States, more than half a million people died from the illness between September 1918 and June 1919. To this day, no one knows why the virus was so deadly.
Called the Spanish flu, the illness started with aches and fever. As the disease progressed, its victims' faces turned dark, the soles of their feet blackened and they coughed blood. In days, sometimes hours, those infected essentially drowned, their lungs heavy, sodden and engorged with a thin, bloody liquid.
Nearly everyone caught the flu in 1918 in some form; 2.5 percent of its victims died, making the strain 25 times more deadly than any flu before or since.
The flu left behind many questions: why was it so deadly, why were young, apparently healthy people particularly affected, and why did it never reappear? Researchers continue to look for those answers, and two studies published this week in Science magazine shed new light on the killer strain's origins. Analysis of a protein coating the virus suggests it started out as an avian virus. Another study, in the Proceedings of the National Academy of Sciences, shows that combining pieces of the 1918 strain with a mouse flu virus results in a very lethal flu.
As NPR's Richard Knox reports, the findings trouble health officials who worry a similar scenario is developing in Asia. The region is currently battling a massive avian flu outbreak, which has infected hundreds of millions of birds and killed 16 people. So far, evidence suggests the virus isn't easily spread among humans. But health officials fear the bird virus might combine with a human flu virus, unleashing another potentially uncontrollable pandemic among people.
Joshua Micah Marshall, writing in the New Yorker (Feb. 2, 2004):
For leftist critics of America's role in the world, it has long been a baleful article of faith that the United States is an agent of “neo-imperialism,” exerting its power through global capital and through organizations like the World Bank and the International Monetary Fund. After September 11th, a left-wing accusation became a right-wing aspiration: conservatives increasingly began to espouse a world view that was unapologetically imperialist....
In “Empire,” which appeared last spring, the acclaimed historian Niall Ferguson presented the British Empire as a model of how to secure global stability, foreign investment for developing countries, and simple good government. “What the British Empire proved is that empire is a form of international government that can work—and not just for the benefit of the ruling power,” he wrote. Through more than three hundred slick, illustrated pages, Ferguson mapped the past onto the present, identifying the building blocks of Britain's empire with their contemporary American analogues. For Britain's gunboats, America's F-16s and Tomahawk missiles—always prepared to knock around troublemakers on the empire's periphery. For Britain's missionary and social-uplift societies, today's N.G.O.s. In place of Britain's long-running policing action against the slave trade, similarly high-minded campaigns against ethnic cleansing.
Why did the British imperium come to an end? The standard histories tell us about great-power rivalries, a diminishing technological gap between overlords and subjects, growing independence movements among the colonized. Some conservative scholars have suggested, however, that the British Empire fell apart because of war-induced impoverishment and national fatigue. Finally, they say, the Brits just lacked will. But in 2002 America had will in abundance, and more money and guns than the British had ever had. Ferguson was challenging us simply to face up to what we already were. In the closing pages of his book, he wrote, “Americans have taken our old role without yet facing the fact that an empire comes with it.” We were, in his view, an empire “that dare not speak its name . . . an empire in denial.”...
An “empire of bases” is what Chalmers Johnson calls it in his new book, “The Sorrows of Empire” (Metropolitan; $25). It is not, for him, an edifying spectacle. Much in Johnson's account is no different from what might be found in a host of other left-leaning critiques of American power, but the trajectory of his career sets him apart. For decades, Johnson, an Asia specialist, was one of those stock figures of the Cold War: the defense analyst and academic in constant orbit of the C.I.A. Then, late in his career, he began to reconsider his Cold War commitments, particularly in East Asia. The way America garrisoned allied countries like Japan and South Korea put him in mind of the de-facto empire that the Soviets had created in Eastern Europe. Once he made that turn, he never looked back. ...
President Clinton came to office intending to keep foreign entanglements to a minimum. That isn't what happened, of course. Despite dire predictions that every military engagement would lead to a quagmire, America found that it could strike with virtual impunity almost anywhere on the globe, and military forays became more common....
The trend was accelerated by changes in the structure of the military. The Pentagon had for decades divided the world into a series of regional commands—sometimes known as CINCdoms, after the acronym for commander-in-chief, the title held, until recently, by those who command them. (The last of these—CENTCOM, which covers the Middle East, Central and South Asia, and the Horn of Africa—was created in 1983.) But a reorganization of the Pentagon in 1986 vastly increased the power of the CINCs by having them report directly to the President as well as to the Secretary of Defense, unlike the chiefs of the military's four services, who report to civilian secretaries. By the late nineties, the officers who led these commands—men like General Wesley Clark, at the European Command; Marine General Anthony Zinni, at CENTCOM; and Admiral Dennis Blair, at Pacific Command—were far more powerful than the various ambassadors who conduct the nation's diplomatic business in the countries under each CINC's oversight. Johnson notes that when, in October, 1999, General Pervez Musharraf seized power in a bloodless coup in Pakistan, President Clinton called in protest and asked that his call be returned. Musharraf called Zinni instead. "Tony," Musharraf reportedly said, "I want to tell you what I am doing." So the trend hasn't been simply a militarization of foreign policy. It has also been a diplomatization of the American military. In the architecture of empire, the CINCs functioned like proconsuls or regional managers of Pax Americana, with plenty of money and guns and no little ingenuity.
If America, militarily unchallenged and economically dominant, indeed took on the functions of imperial governance, its empire was, for the most part, loose and consensual. In the past couple of years, however, neo-imperialism, this thing of stealth, politesse, and obliquity, has come to seem, so to speak, too neo. Especially as the war on terror began, hard-liners who were frustrated by Clinton's bumbling and hesitations saw no reason to deny that America was an imperial power, and a great one: how else to describe a country that had so easily vanquished Afghanistan, once legendary as the graveyard of empires? The only question was whether America would start running its empire with foresight and determination, rather than leaving it to chance, drift, and disaster.
Adam Clymer, writing in the NYT (Feb. 5, 2004):
Democrats who once rebelled at having their presidential choices dictated by big-city bosses seem to have cheerfully handed over that power to small-town Democrats in Iowa and New Hampshire. And this year New Hampshire may have subcontracted its role to Iowa.
How else can John Kerry's five victories, with about two-fifths of the total vote on Tuesday, be explained? After all, according to the National Annenberg Election Survey, only about one-third of prospective voters in those primary states said they knew enough about the Democratic candidates to make an informed choice. What the voters did know was that Mr. Kerry had won in Iowa and New Hampshire, and that polls showed that he was winning or gaining in their states. ...
Missouri may be the best example from Tuesday's primaries of the voters' choosing on the basis of front-runner status. Neither Mr. Kerry nor any other candidate had campaigned in the state until a week ago today, assuming that Dick Gephardt, the native son, had it locked up....
Oddly enough, it was Missouri that helped start the process that led to this spate of bunched-up primaries. At the boss-dominated 1968 Democratic convention in Chicago, Missouri's delegation was controlled, absolutely, by Gov. Warren Hearnes. Some of the meetings that had elected delegates were held in secret. One was held at night on a speeding bus. In that case it was impossible for supporters of Eugene McCarthy to participate.
In reaction to the behavior of Governor Hearnes and the other bosses, a party commission headed by George McGovern wrote the new rules that substantially survive today. It said its objective was to give Democratic voters a "full, meaningful and timely opportunity" to participate in the nomination process. Those rules encouraged states to hold primaries.
William Broad, writing in the NYT (Feb. 3, 2004):
In 1989, when the first President George Bush announced his plan to send American astronauts back to the Moon and on to Mars, he called the proposed space station "our critical next step in all our space endeavors." It would be a base in the weightlessness of space where big rockets would be assembled and blast off on voyages of exploration: "a new bridge between the worlds."
Now, with the outpost hurtling through space 240 miles above Earth and with 16 nations struggling to complete the most challenging engineering project of all time, the station has suddenly become a $100 billion dead end.
The current President Bush made no mention of it as a steppingstone in his speech on Jan. 14 reviving the call for missions to the Moon and Mars. Instead, he spoke of it as a site of biomedical research and an "obligation" that the United States had to help finish.
Mr. Bush gave no clear indication how, or whether, the United States planned to use the station after its prospective completion in 2010. With NASA focusing its efforts and its budget on the Moon and Mars, the station's prospects are uncertain.
"I'm worried that they're going to cut off the space shuttle before we have another vehicle that can fly," said Senator Bill Nelson, a Florida Democrat who is the only current member of Congress to have flown in space. "And that will drastically reduce space station use."
What happened? How did the station go from star to sideshow? Experts cite a litany of factors: cost overruns, design changes, new perceptions of technical risk after the shuttle disasters and shifting national priorities. For instance, orbital changes to accommodate Russia after the cold war made it harder to use the station as a launching pad.
The tale has no real bad guys, the experts say, but many false promises.
"It was always a steppingstone to the stars," said Dr. Howard E. McCurdy, a space historian at American University. "It was sold as all things to all people."
Dr. Alex Roland, a former NASA historian now at Duke University, said a moral of the story was that Congress and the public needed to work harder to hold the space agency accountable for its dreams.
"They keep getting trapped in their own rhetoric," he said. "They're willing victims of it. But as public policy it's a disaster because it feeds unrealistic expectations."
At the start of the space age, visionaries invariably saw outposts in Earth orbit as jumping-off points. Dr. Wernher von Braun, in a famous 1952 article, told of a huge inhabited wheel. "From this platform," he said, "a trip to the Moon itself will be just a step."
In 1968, Stanley Kubrick's movie "2001: A Space Odyssey" featured a giant outpost in Earth orbit that was a way station to the Moon and Jupiter.
Finally, after decades of fantasies, President Ronald Reagan proposed in 1984 that the United States actually build a space station. It too was envisioned as a hub for colonies on the Moon and Mars. For Mr. Reagan, the station also represented a way to challenge the Soviet Union. In the cold war, Moscow made human outposts a hallmark of its space activities.
But Congress did not vote construction money to pay for either Mr. Reagan's vision or that of the first President Bush. Not until 1993 did a new vision for space take shape, this one emphasizing harmony over rivalry. That September, President Bill Clinton announced that Russia had joined the station effort as a full partner. Its giant rockets were seen as a boon for the project and a good backup if the shuttles should again fail catastrophically, as the Challenger did in 1986.
"One world, one station," said Daniel S. Goldin, NASA's administrator at the time.
There was just one problem. For the Russian rockets to reach the grand unified station, the station would need a different orbit.
Shuttles flying out of Florida usually go into an orbit at an angle of 28.5 degrees to the Equator. The original station, meant to be built piecemeal as the shuttles carried up parts, was to have taken shape there.
But Russian rockets blast off from Kazakhstan, much higher on the globe than Florida. They cannot fly much lower than 51.6 degrees latitude without running the risk of dropping spent rocket stages, or astronauts during an emergency re-entry, on Mongolia or northern China. So the Clinton administration decided to erect the station at 51.6 degrees, hailing it as a "world orbit" accessible to all spacefaring nations.
Orbiting at 51.6 degrees, the new station could no longer act as the perfect jumping-off point for the Moon and beyond, experts said.
Ginia Bellafante, writing in the NYT (Feb. 1, 2004):
Americans have historically maintained a high regard for complementarity in public marriages. The impish are better served by upright spouses than by charmingly impudent ones; the ponderous enlivened by the light of heart; the swaggeringly confident (Franklin Roosevelt) humanized by the sheepishly less self-assured (Eleanor).
Mitigating Kennedy's overweening passion was his wife's remove. Hardly a union of opposites, the Clinton marriage failed to satisfy. Politicians, particularly those aiming for highest office, score a public relations coup when their partners are perceived as completing them.
Though the value of her unbridled volubility has been the subject of debate since her husband's entry into the presidential race, Teresa Heinz Kerry has unquestionably animated the Massachusetts senator's bid for election. A foil to his stiffness, she is the sort who is happy to tell an interviewer - as she did on CNN Tuesday night - that her husband learned of his victories in Iowa and New Hampshire while in the bathroom. In the first instance Ms. Heinz Kerry delivered the news while he was shaving, and in the second instance as he made his way out of the shower.
In her stylishness and social facility, Ms. Heinz Kerry bears at least some likeness to Grace Goodhue Coolidge, the wife of Calvin Coolidge and a woman with the flapper-era equivalent of an enormously high personality rating. As a vice-presidential wife in the early 1920's, she had already become one of the best loved figures in Washington. As first lady, in a coy act of protest against Prohibition law, she named her collie Rob Roy after the Scotch cocktail.
"Coolidge was so incredibly laconic," noted Carl Sferrazza Anthony, a historian of first ladies and their political roles. "His whole public persona was built around this idea of silent Cal.
"He was shy to the point of not really wanting to talk in public, and she had so much personality and warmth. He desperately leaned on her social ease, and she offset his peculiarly cranky way."
What is expected of someone bound by wedlock to a presidential contender has evolved as women's roles have changed. Though Eleanor Roosevelt addressed the Democratic National Convention in 1940, calling on her party to nominate her husband for a third term, she worked largely in the background, contrary to current assumption, during her husband's two previous bids for the White House.
Wives first assumed a highly visible role in presidential politics in 1960, at a time when middle-class women had become the country's most important consumer targets. On the occasion of his wife's funeral in 1993, Richard Nixon privately remarked that he felt unpopular on the campaign trail but always knew that "everybody liked Pat." Republicans, in fact, built a whole merchandising effort around Mrs. Nixon, distributing buttons and other paraphernalia that read "A Winning Team: Pat and Dick Nixon," or "I'm for Pat."
Jacqueline Kennedy wrote a column in the fall of 1960, distributed by the Democratic Party, in which she discussed health care, education and the atomic bomb. As the wife of the vice-presidential candidate, Lyndon Johnson, Lady Bird Johnson traveled 35,000 miles stumping for various Democratic contenders. In 1964, with her husband running for president, Mrs. Johnson embarked on a train called the Lady Bird Special and toured eight Southern states to seek support for her husband.
During the women's movement of the 1970's, the voting public paid increasing attention to what a candidate's wife could bring to the table. Rosalynn Carter independently promoted the welfare of the mentally ill during her husband's presidential campaigns. While immersed in the Iranian hostage crisis during his 1980 re-election effort, Jimmy Carter sent his wife as one of his surrogates to campaign in the New Hampshire primary.
Today, as the debate around Mrs. Dean indicates, spousal involvement seems a cultural imperative - as long as the spouse does not appear to be acting in service of her own ambitions.
Darrin M. McMahon, writing in the Boston Globe (Feb. 1, 2004):
HOWARD DEAN SPECULATES on National Public Radio that George W. Bush may have been warned of 9/11 "ahead of time by the Saudis." University professors imply with an air of sophistication that the war in Iraq was a plot to fill contracts for Halliburton. Radio shock-jocks rant against the machinations of the United Nations and the "New World Order." And the conservative pundit Ann Coulter makes the rounds of the talk shows with a book, "Treason," built on the claim that the vilification of Joseph McCarthy was the "greatest Orwellian fraud of our time." The man who warned famously of a "great conspiracy" of communists, it seems, was himself the victim of a plot by "liberals" to blacken his good name.
Hillary Clinton may have given up her talk about the "vast right-wing conspiracy." But there are plenty of others on both sides of the political divide anxious to continue the conversation. In today's popular culture and even the elite media, plots lurk behind every door.
Nor is the anxiety confined to the United States. Last month, the British government opened official inquests into the deaths of Princess Diana and Dodi Fayed, fueling ongoing speculation that the couple was murdered in a secret plot. In France and Germany, books by the once-mainstream political analyst Thierry Meyssan ("L'Effroyable Imposture" -- The Big Lie) and the former Social-Democratic cabinet minister Andreas von Bulow ("Die CIA und der 11 September") have climbed bestseller lists with their shocking revelations that 9/11 was a plot by rogue elements within the US government. Uncle Sam, they claim, framed Osama. Meanwhile, major media outlets throughout the Islamic world charge that Israel, or an international Jewish cabal, was behind the World Trade Center attacks and countless other nefarious deeds.
It is tempting just to laugh at these views, dismissing them as the ranting of a lunatic fringe or the naive cynicism of the overeducated. But they are simply too prevalent to be ignored. The clearing house www.conspiracy-net.com, one of the many websites devoted to the subject, boasts over "one thousand searchable conspiracies," from child abductions in Nigeria to the invention of AIDS in CIA laboratories to the real motivations behind President Bush's proposed mission to Mars.
Are we living in a golden age of conspiracy theory? And if so, what stands behind this apparent upsurge in global anxiety? Fortunately, no shortage of observers has turned their attention to such questions. As Syracuse University political scientist Michael Barkun writes in "A Culture of Conspiracy: Apocalyptic Visions in Contemporary America" (California), the latest in a recent spate of academic studies on the subject, "obsessive concern with the magnitude of hidden evil powers" is just what one might expect in a turn-of-the-millennium culture "rife with apocalyptic anxiety."
Editorial in the WSJ (Feb. 2, 2004):
Was John Kerry brainwashed? That's what we've been wondering as the new Democratic frontrunner struggles to explain his off-again-on-again-off-again support for confronting Saddam Hussein.
Some of our readers may recall that "brainwashing" is the word that turned the late George W. Romney into a footnote in American political history. In the summer of 1967, the Michigan Governor was the leading contender for the 1968 GOP Presidential nod. Then he told a Detroit television station that during a trip to Vietnam he had had "the greatest brainwashing that anybody can get" regarding the increasingly unpopular war. Romney was quickly laughed out of the race.
Now Mr. Kerry seems to be concocting his own Romney-like rationale for changing his mind on Iraq, specifically on weapons of mass destruction. Back in 1991, the Massachusetts Senator opposed President George H.W. Bush's U.N.-backed effort to drive Saddam from Kuwait. But on October 11, 2002 he nonetheless voted to give the current President Bush the unilateral authority "to use the armed forces of the United States as he determines to be necessary and appropriate."
So why did the Senator later vote against the $87 billion appropriation to finish the job in Iraq (and Afghanistan), while accusing Mr. Bush of pursuing a "cut-and-run" strategy? Well, he now claims, he was "repeatedly misled" about Iraq's weapons by Bush officials including Vice President Dick Cheney and Secretary of State Colin Powell. And he is demanding an investigation.
But is it really likely that this savvy Washington insider was hoodwinked? As an 18-year member of the Senate Foreign Relations Committee, he has spent plenty of time thinking about how to handle Iraq. He also had privileged and direct access to U.S. intelligence, the same data that led President Clinton into a military confrontation with Saddam in 1998, which was the same year "regime change" became stated U.S. policy after Mr. Kerry allowed the Iraq Liberation Act to pass the Senate with unanimous consent.
Presumably, similar intelligence played a role in Senator Kerry's declaration, in a speech on October 9, 2002, that "I will be voting to give the President of the United States the authority to use force -- if necessary -- to disarm Saddam Hussein because I believe that a deadly arsenal of weapons of mass destruction [our emphasis] in his hands is a real and grave threat to our security." If Mr. Kerry was misled into believing in such a threat, so were the likes of Bill and Hillary Clinton, Al Gore, Madeleine Albright and Senator Carl Levin, all of whom made similarly unequivocal statements on the matter.
Nor does it appear to have been any contrary evidence that started Mr. Kerry's drift back into the antiwar camp. Rather it was the sudden traction Mr. Dean was getting with his antiwar message that led Mr. Kerry in January 2003 to start accusing Mr. Bush of a "rush to war." These days Mr. Kerry has more or less adopted the entire Dean line, decrying as "fraudulent" a coalition that includes most of our key allies from World Wars I and II.
Mr. Kerry has an explanation for all this, sort of. He says Saddam should have been evicted from Kuwait but voted "no" on the first Gulf War to give the former President Bush more time to amass domestic support. He says his "yes" vote in 2002 was premised on this President Bush attracting more international help. He didn't, he told Rolling Stone, expect Mr. Bush to "f -- it up as badly as he did."
Hell, we might curse too if we felt obliged to offer up such a tortured rationale. We're not the only ones who've noticed. Washington Post columnist David Broder, no Republican shill, recently suggested to Mr. Kerry that it would be difficult for him to explain to voters that "your 'no' [in 1991] did not mean no, and your 'yes' [in 2002] did not mean yes."
The Occam's Razor explanation, it seems to us, is that the former Naval Lieutenant tacks with the political winds -- and not just over the course of years and months but of days. The liberal New Republic magazine recently republished two Kerry letters to the same constituent in 1991, one appearing to support the Gulf War, the other to oppose it.
We think Mr. Kerry knows full well that there was no Administration conspiracy to mislead anybody this time around. Intelligence on Iraq was indeed faulty, as weapons inspector David Kay told Senators last week. But Mr. Kay was emphatic that any mistakes were not because of Administration pressure. Meanwhile, the prior occupant of the White House continues to believe the WMD existed. The Portuguese Prime Minister says Mr. Clinton told him recently "he was absolutely convinced, given his years in the White House and the access to privileged information which he had, that Iraq possessed weapons of mass destruction until the end of the Saddam regime."
All of which raises the vital question of Mr. Kerry's constancy and character. In the Romney era, at least, some sort of consistency on matters of war and peace -- or at least a plausible explanation for a change of heart -- was considered a prerequisite for would-be commanders-in-chief. Shouldn't it still be today?
Susan Baer, writing in the Balt Sun (Jan. 23, 2004):
For all the changes that have revolutionized women's lives in the last half-century and recognized women as equal partners in marriages and in the workplace, any variation in the role of a presidential candidate's spouse, generally a wife, seems to set off its own sort of culture war.
Lewis L. Gould, a historian of first ladies and professor emeritus at the University of Texas, says the public's interest in seeing political wives is not so much tied up with views about feminism or women's roles. "It's more a statement about how involved with show business running for president has become," he says. "It's like a situation comedy. Everyone steps into their role. If you start to depart from the formula, there's all sorts of unease."
He says he has been struck in this campaign by how early in the primary process most of the wives have become active, noting that in 1960, Jacqueline Kennedy became involved - reluctantly - only after her husband had been chosen as the Democratic nominee.
"In the run-up to the first voting, people are now asking questions about the spouse," says Gould. "It's an indication of how integral a part of this process their presence has become.
"The general attitude is that running for president is so important, if a candidate has a truly intimate marriage, the wife would want to put aside her career and be a part of the campaign. There's an underlying traditionalism here that's very strong."
He wonders if the same would be true for the husband of a female presidential candidate.
Myra Gutin, another historian of first ladies and a communications professor at Rider University in New Jersey, says the spouses provide the windows into a candidate's personal life and character that voters demand.
"Having your spouse on the campaign trail is the most visible endorsement of your candidacy," she says. "It says to audiences, 'My wife or my husband is supportive of what I'm doing.' We like to know the character of a candidate, and we put his or her family in that particular basket."
The wives of the current crop of candidates have distinctive styles, from the invisible Steinberg, for whom corduroys and a sweater are said to be dressy, to the colorful Teresa Heinz Kerry, one of the nation's top philanthropists, her fortune estimated at $500 million.
With the exception of Steinberg, the wives of the top-tier Democrats have been active in their husband's campaigns, several of them outspoken women who are a far cry from the dutiful, smiling mannequin-variety of political spouses.
George Bisharat, a professor at the University of California's Hastings College of Law, writing in the LAT (Jan. 25, 2004):
Although Israel has claimed that Palestinians willingly abandoned Palestine after being urged to leave in radio broadcasts by Arab leaders, a review of broadcast transcripts by Irish diplomat Erskine Childers in 1961 revealed that Palestinians were exhorted by Arab leaders to stay, not leave their homes. In fact, Yigal Allon, commander of Palmach, the elite Zionist troops, and later Israeli foreign minister, launched a whispering campaign to terrorize Palestinians into flight.
Nor were we simply unintended victims of a war launched by the Arab states against Israel. As far back as the late 19th century, leaders of Political Zionism (the movement to create a Jewish state in Palestine) advocated "transfer" of the Palestinians, by force if necessary. In 1948, Jews owned only 11% of the land allocated by the United Nations to the Jewish state -- not enough for a viable economy. As David Ben-Gurion said in February 1948 before he became prime minister of Israel: "The war will give us the land. The concepts of 'ours' and 'not ours' are peace concepts only, and in war they lose their whole meaning."
Zionist leaders knew that an Arab minority of 40% would challenge the Jewish demographic dominance they sought. Hence, nearly half of the Palestinian refugees ultimately expelled were forced out before the Arab states attacked Israel in May 1948. Israeli historian Benny Morris documented 24 massacres of Palestinian civilians, some claiming hundreds of unarmed men, women and children, during subsequent fighting. Thousands more Palestinians were, like the residents of Majdal (now Ashkelon) -- a southern coastal city 15 miles north of the Gaza Strip -- chased across the border into Gaza after the armistice of 1949.
Palestine had to be "cleansed" of its native population to establish Israel as a Jewish state. Ironically, those who today protest that the return of the refugees would destroy Israel unwittingly confirm this viewpoint, for the refugees are simply the Palestinians and their offspring who would have become Israeli citizens had they not been exiled.
Israel's denial of responsibility for the refugees and rejection of their repatriation (intransigence that was condemned early on by a U.S. official as "morally reprehensible") is nearly as offensive as the original expulsion itself. Israel welcomed immigrant Jews from all over the world but shot Palestinians who tried to return to recover movable property, harvest the fruit of their orchards or reclaim their homes. Oxford professor Avi Shlaim concluded in his book "The Iron Wall" that "between 2,700 and 5,000 [Palestinian] infiltrators were killed in the period 1949-56, the great majority of them unarmed."
Nothing the Palestinians had done merited this treatment, something the international community has consistently recognized. A 1948 U.N. resolution recognizing the Palestinian right of return has been annually -- and almost unanimously -- reaffirmed ever since. The Palestinian right of return is also supported by Human Rights Watch and Amnesty International.
The two-state solution envisioned today would probably ameliorate the conditions of the one-third of the Palestinians living under Israeli military occupation in the West Bank and Gaza Strip. There, Palestinians face incessant military attacks that have demolished homes and orchards and killed an average of nearly 70 Palestinians per month over the last three years. A smothering matrix of closures, curfews and checkpoints restricts movement and has caused unemployment to soar to more than 70% and threaten Palestinian children with malnutrition. Meanwhile, Israeli settlers, shock troops in the grinding 36-year campaign to seize and colonize yet more Palestinian land, speed through the West Bank and Gaza Strip on "Jewish only" roads. The oppressive features of Israeli military occupation were entrenched long before Palestinians resorted in the mid-1990s to the desperate -- yet still indefensible -- tactic of suicide bombings to slow the colonizing juggernaut.
But this two-state solution would not address the concerns of 1.2 million Palestinians living in Israel as second-class citizens. Palestinian citizens there possess formal political rights -- that much Israel can afford after expelling most Palestinians in 1948. But these Palestinians have restricted access to land (most real property in Israel is owned by the state or the Jewish National Fund and is leased to Jews only). They are also forced to carry identity cards that brand them as non-Jews, and they cannot serve in the armed forces (the key to many benefits in Israeli society). Palestinian towns and villages are starved of resources, with many lacking connections to the country's electrical or water systems. Government policies, from immigration to family planning, are designed to counter the "demographic threat" Israelis fear in the higher birthrate of Palestinian citizens. Israeli law enshrines the principle that Israel is the "state of the Jewish people," and it lacks firm guarantees of the legal equality of all citizens.
Nor would the two-state solution fairly redress the rights of diaspora Palestinians -- permitting us only to return to a new, already overcrowded and underfunded "statelet" in the West Bank and Gaza Strip.
There is no bar to implementing the Palestinians' right of return. If there is room in Israel for a million Russian immigrants (including many non-Jews), there is room for those Palestinians who would elect return over other legal options. The sole obstacle is Israel's desire to maintain a "demographic balance" favorable to Jews.
James Burkee, writing in USA Today (Jan. 28, 2004):
This isn't the first time intelligence information -- or the public's lack of it -- has played a significant role in a presidential election. During the 1960 campaign, Republican Richard Nixon's Democratic challengers -- Sens. Stuart Symington of Missouri, Lyndon Johnson of Texas and John Kennedy of Massachusetts -- all campaigned on the so-called missile gap, the Soviet Union's perceived superiority in nuclear weaponry.
Bombastic Soviet Premier Nikita Khrushchev had capitalized on the shock value of Sputnik in 1957 by boasting that the U.S.S.R. was building missiles "like sausages." By harping on that "missile gap," Kennedy convinced many Americans that it was true: In 1960, 47% of Americans believed the Russians were ahead of the U.S. in missile and rocket production. The problem, historian Martin Walker says, was that "there was no missile gap, and Kennedy knew it."
Photographs taken by the CIA's super spy plane, the U-2, suggested that if there was such a gap, it favored the West. But President Eisenhower and Vice President Nixon could not challenge Kennedy's claims without acknowledging the existence of the U-2 and missions over Soviet territory. The Democratic challengers' strategy against Bush mirrors Kennedy's in 1960: attack from the right by saying the president has not been strong enough on homeland security and in postwar Iraq.
By manufacturing a "missile gap" and implying support for an invasion of Cuba by anti-Castro exiles, Kennedy seemed even more hawkish on defense than Nixon. The issue was decisive in a close race, which Kennedy won by fewer than 60,000 votes.
The danger of playing politics with intelligence was revealed in 1962, shortly after Kennedy took office, when the superpowers nearly came to blows over the placement of Soviet missiles in Cuba. Historian John Lewis Gaddis suggests that crisis may be traced to Kennedy's posturing during and after the election. "Khrushchev placed missiles in Cuba," Gaddis writes, "because he saw Kennedy as aggressive, not passive."
Today's Democratic challengers would do well to learn from the consequences of Kennedy's political ambition. By suggesting that President Bush lied about WMD and 9/11, the Democrats are, as was Kennedy, relatively safe: The Bush administration cannot prove the assertions incorrect without exposing U.S. intelligence.
But they play a dangerous game in running at Bush from the right, as Wesley Clark did on Jan. 10, when he asserted that "the two greatest lies" of the Bush presidency were that 9/11 could not have been prevented and that future attacks are inevitable. "If I'm president of the United States," Clark boasted, " . . . we are not going to have one of these incidents."
Someday the truth will come out, as it has about Kennedy. It may take decades, but one day historians will discover what the Bush and Clinton administrations knew and when they knew it.
In the interim, we can only hope the Democrats remember the lesson of 1960: Playing politics with intelligence is very risky business.
Elizabeth Bumiller, writing in the NYT (Jan. 25, 2004):
Historically, Americans have not voted out the commander in chief in the middle of war, which helps explain, Democrats say, why Mr. Bush used the grand stage of the State of the Union speech to underline the threat. ("And it is tempting to believe that the danger is behind us. That hope is understandable, comforting and false.") It is also why the president traced the two-year narrative of a war on terror and then rebutted those who questioned, as he put it, "if America is really in a war."
Historians say that Franklin D. Roosevelt would probably not have won a third term in 1940 had there not been the crisis in Europe and Hitler's invasion of France that June. "There were forces on the right who didn't like anything about the New Deal, he had not brought about economic recovery and a lot of people thought he had too much power," Mr. Kennedy said. "There's very little question he owes his third term, and his fourth as well, to the international crisis."
Similarly, in the Civil War election of 1864, Abraham Lincoln survived a challenge by George B. McClellan, the Democratic nominee and the general Lincoln had fired the year before. But it might have been otherwise had not General Sherman captured Atlanta two months before the election, turning Lincoln's fortunes around after a summer of devastating casualties. "Lincoln was elected on a tide of military success," said James M. McPherson, the Civil War historian. "But Lincoln and everybody else acknowledged that if the election had been held in August, it would have gone the other way."
Of course, unpopular wars have driven some presidents from office, like Lyndon B. Johnson, who chose not to run for re-election in 1968 because of his vulnerabilities over Vietnam. Harry S. Truman was so unpopular in 1952 because of the stalemate in Korea that he might not have won his party's nomination.
It is no surprise that the biggest fear of the current White House, short of another terrorist attack, is that Iraq will implode before the election. Barring that, political analysts say Mr. Bush is wise to wield his most powerful advantage against the opposition. In a New York Times/CBS News poll conducted just before the State of the Union, 68 percent, including majorities of both Democrats and independents, gave Mr. Bush high marks for the campaign against terrorism.
Frank Rich, writing in the NYT (Jan. 25, 2004):
Since its release, "The Fog of War" has generated plenty of debate on two fronts. Should Mr. McNamara, who freely admits to making errors about Vietnam but stops well short of outright contrition, rot in hell? The verdicts on his confessions in Mr. Morris's film range from mild praise (he's conceding fallibility, however belatedly) to utter rage (Roger Rosenblatt, on "The NewsHour," likened him to the self-justifying bureaucrats of Treblinka).
The greater debate has been over the degree to which the follies of Vietnam are now being re-enacted in Iraq. Though Mr. Morris started interviewing Mr. McNamara before 9/11 and his film never mentions current events, the implicit parallels between then and now are there for the taking. In the Johnson administration's deceptive hyping of the Gulf of Tonkin incident as a provocation to war, we see the Bush administration's deceptive hyping of the supposedly imminent threat of Saddam Hussein's weapons of mass destruction for the same purpose. In Mr. McNamara's stern warnings against waging war unilaterally and against trying to win the hearts and minds of a foreign land without understanding its culture first, we find historical lessons we didn't heed as we blundered into the escalating chaos of our "postwar" occupation of Iraq.
Such analogies can be pushed only so far, however, and Mr. McNamara refuses to draw them publicly, despite repeated badgering by interviewers like me to do so. But if it is inexact, not to mention wildly premature, to declare that Iraq is Vietnam, it is not too soon to mine a related and pressing resonance of the McNamara story. When President-elect John F. Kennedy appointed Mr. McNamara to his cabinet, he was lionized as the very model, indeed the very shiny new model, of the modern star business executive: famously, the first non-Ford to be president of the Ford Motor Company, the most brilliant of the 10 so-called Whiz Kids whom Ford had recruited en masse from the Air Force brain trust of World War II, and the first M.B.A. from Harvard Business School to ascend so high in government.
As a national role model at the dawn of Camelot, Robert McNamara was Dick Cheney, Donald Rumsfeld and, yes, Paul O'Neill before it was cool. He entered the cabinet as an exemplar of "American certitude and conviction" who could use "his rationality with facts" to intimidate bureaucratic dissenters, David Halberstam wrote in "The Best and the Brightest" in 1972, after Mr. McNamara had come to his bad end. Among Mr. McNamara's virtues, Mr. Halberstam wrote, was loyalty but "perhaps too much loyalty, the corporate-mentality loyalty to the office instead of to himself."
"The Price of Loyalty," Ron Suskind's new best-selling exposé of the inner workings of the Bush White House, reads like an as-told-to book by its principal source, Mr. O'Neill, a C.E.O./cabinet officer fired by another Texan wartime president. It casts the former treasury secretary in the same role of protagonist that Mr. McNamara plays in "The Fog of War." When Mr. O'Neill was first appointed, he was hailed for his successful tenure at Alcoa, where, like Mr. McNamara at Ford, he was prized for his humanistic concern with safety as well as his can-do resuscitation of a sinking bottom line. The parallels end there. Whatever one thinks of Mr. O'Neill's White House tenure, he is of footnote stature in American history, if that. And unlike Mr. McNamara, a loyal courtier to presidents to the bitter end and beyond, Mr. O'Neill hardly waited a moment before trashing George W. Bush.
Patrick Tyler, writing in the NYT (Jan. 25, 2004):
Since becoming prime minister in 1997, Tony Blair has proved the most successful and popular Labor leader of the last century, yet there are whispers in his own party that he could be out of office by Easter.
"If Blair is out next week, there probably won't be any tears outside his family because it is a ruthless system and it's the way the system works," said Iain McLean, professor of politics at Oxford University.
Whatever it is about British politics and the parliamentary system that delivers abrupt and surprising reversals of political fortune seems to be at work again. Mr. Blair has proved nimble at brilliant reinventions in the past, but the question remains: Can he outrun British democracy's penchant for fatigue, which has sometimes claimed leaders at the most surprising times?
In May 1945, Winston Churchill was at the apex of power after Germany's surrender. And with Franklin Roosevelt's death the month before, Mr. Churchill was the West's icon of allied victory.
Yet before the month was out, he was forced to resign as prime minister, and the Labor Party soon swept into office under Clement Attlee.
In 1990, Margaret Thatcher suffered a similar indignity after winning three terms for her party, presiding with Ronald Reagan at the burial of Communism in Europe and privatizing much of the British economy.
In August she was telling the first President Bush, with Churchillian verve, not to go "wobbly" in the face of Saddam Hussein's invasion of Kuwait. But by November she was political toast, overthrown by Tory rebels and replaced by John Major.
"I was visiting at Stanford University in 1990 and my colleagues had no idea" that Mrs. Thatcher was teetering, said Professor McLean.
"It was Thanksgiving, so there was no one around to explain,'' he added. "So I was briefly a pundit.''
Like Churchill and Lady Thatcher before him, Mr. Blair has set records for longevity and led his nation into war. He returned the Labor Party to a prominence it had not enjoyed in 101 years with two landslide victories. Large Labor majorities took control of the House of Commons, and almost every city, town and shire overturned the Tory supremacy in British life.
But his war leadership in Iraq has made him deeply unpopular with a large segment of the public, even though opinion polls still show that a plurality of Britons back his decision to go to war to remove Mr. Hussein.
Robert Kagan, senior associate at the Carnegie Endowment for International Peace, writing in the NYT (Jan. 24, 2004):
A great philosophical schism has opened within the West, and instead of mutual indifference, mutual antagonism threatens to debilitate both sides of the trans-Atlantic community. Coming at a time in history when new dangers and crises are proliferating, this schism could have serious consequences. For Europe and the United States to decouple strategically has been bad enough. But what if the schism over "world order" infects the rest of what we have known as the liberal West? Will the West still be the West?
It is the legitimacy of American power and American global leadership that has come to be doubted by a majority of Europeans. America, for the first time since World War II, is suffering a crisis of international legitimacy.
Americans will find that they cannot ignore this problem. The struggle to define and obtain international legitimacy in this new era may prove to be among the critical contests of our time, in some ways as significant in determining the future of the international system and America's place in it as any purely material measure of power and influence.
Americans for much of the past three centuries have considered themselves the vanguard of a worldwide liberal revolution. Their foreign policy from the beginning has not been only about defending and promoting their material national interests. "We fight not just for ourselves but for all mankind," Benjamin Franklin declared of the American Revolution, and whether or not that has always been true, most Americans have always wanted to believe that it is true. There can be no clear dividing line between the domestic and the foreign, therefore, and no clear distinction between what the democratic world thinks about America and what Americans think about themselves.
Every profound foreign policy debate in America's history, from the time when Jefferson squared off against Hamilton, has ultimately been a debate about the nation's identity and has posed for Americans the primal question "Who are we?" Because Americans do care, the steady denial of international legitimacy by fellow democracies will over time become debilitating and perhaps even paralyzing.
Americans therefore cannot ignore the unipolar predicament. ...
Right now many Europeans are betting that the risks from the "axis of evil," from terrorism and tyrants, will never be as great as the risk of an American Leviathan unbound. Perhaps it is in the nature of a postmodern Europe to make such a judgment. But now may be the time for the wisest heads in Europe, including those living in the birthplace of Pascal, to begin asking what will result if that wager proves wrong.
Guy Coq, author of a book about secularism in France, writing in the NYT (Jan. 30, 2004):
With France on the verge of passing a law that would prevent Muslim girls from wearing their head scarves in class, Americans are asking why the French are so attached to secularism....
The French word that is closest to secularism, laïcité, was invented in the late 19th century to express several ideas. Laïcité includes, foremost, tolerance. Tolerance had actually been around for a while. It was first instituted in 1598 under the Edict of Nantes, which allowed Protestants to practice their faith and ended our Wars of Religion. But the state and the Roman Catholic Church were so intertwined that tolerance wasn't enough. We had to take away the church's power to oppress minorities and make law.
For that, France had to go farther than other countries in separating matters of state and matters of religion. The most emphatic expression of this desire came in our Revolution of 1789. The French people didn't just depose a monarch; they also took aim at the Catholic Church's domination of society, stripping the church of its property and demanding that the clergy acknowledge the authority of the state.
In the century after the Revolution, however, the Catholic Church found ways to regain power. A concordat between the papacy and Napoleon in 1801 gave the church a privileged position as the majority religion of France. The church took control of education and provided priests as teachers. As monarchs, emperors and republics succeeded one another during the 1800's, the church inserted itself into politics by joining with forces that were enemies of the rights of man and the republican ideas of the Revolution.
The leaders of the Third Republic, in the 1880's, saw that for the republic to establish itself, it had to wrest control of the schools from the church. Prime Minister Jules Ferry founded the public school system, which barred priests as teachers and took over the job of transmitting common values and the sense of social unity -- in short, forming the citizens of the republic without reference to religion.
The next step, the ending of Napoleon's concordat, came in 1905. By separating church and state -- instituting a republic that was neutral toward all religions, and without a national religion -- France finally realized the aims of the Revolution. This is laïcité, and it has worked well.
But the laïcité of schools has been eroded by the intrusion of religious symbols, prompted by an excess of individualism, that philosophy so revered by Americans. The necessity of the law that Parliament will debate on Tuesday reveals the regrettable waning of this French tradition.
Peter Robinson, a fellow at the Hoover Institution, served as chief speechwriter to Vice President Bush and special assistant and speechwriter to President Reagan. He is the author of How Ronald Reagan Changed My Life (Regan, 2003). In the WSJ (Jan. 22, 2004):
The White House communications director, Dan Bartlett, reported on Tuesday that by the time the president left to deliver his State of the Union address at the Capitol, the speechwriters were "on about draft 30 of the speech." From a speechwriter who went through the ordeal a few times himself, here's a report card:
* The Fatuity Factor: By the time a State of the Union address is in its 10th or 12th draft, it's easy for the speechwriters to start composing sentences that don't actually mean anything. Perhaps because they passed through so many hands -- his speechwriting staff was the largest in recent years, perhaps in history -- President Clinton's State of the Union addresses are especially rich in examples of empty rhetoric. Consider this beauty from Mr. Clinton's 1996 address: "Now is the time for us to look to the challenges of today and tomorrow, beyond the burdens of yesterday."
President Bush? I listened closely, but in all 54 minutes I never heard him utter a single sentence that didn't mean at least a little something. This may seem an odd category in which to award a grade. But within the speechwriting brotherhood, it's important. Even at the worst moments, everyone on the Bush staff kept his head. Grade: A
* Make 'Em Laugh: Humor is tricky in a State of the Union address. A few laughs would help set the audience in the House chamber at ease. But the occasion is supposed to be august. In 1992, President George H. W. Bush joked that Speaker Tom Foley and Vice President Dan Quayle, positioned on the rostrum behind him, "saw what I did in Japan [the President, ill with the flu, had vomited at a state dinner] and they're just happy they're sitting behind me." The elder Bush may have gotten a laugh, but he sounded undignified.
One of the finest moments this time took place during the president's discussion of the war on terror. Turning to the argument that the rebuilding of Iraq should be internationalized, the president deadpanned.
"This particular criticism," he said, "is hard to explain to our partners in Britain, Australia, Japan, South Korea, the Philippines . . ." As Mr. Bush continued -- ". . . Thailand, Italy, Spain, Poland, Denmark, Hungary, Bulgaria, Ukraine, Romania, the Netherlands . . ." -- his audience began to laugh. Then the audience interrupted him with applause. And when he finally completed the litany of nations that have committed troops to Iraq -- ". . . Norway, El Salvador, and . . . 17 other countries . . ." -- the audience gave him an ovation.
The best use of humor in a State of the Union address I've witnessed. Grade: A+
* The Speech He Got Stuck With: State of the Union addresses often amount to not one but two speeches: the speech the president got stuck with, which sounds like a hodgepodge, and, somewhere inside it, the speech the president wanted to deliver, which sounds unified, authentic and complete.
How do chief executives get stuck with hodgepodges? For weeks, Cabinet secretaries, agency heads, chairmen of congressional committees, and members of the White House senior staff draw up lists of initiatives they insist the address must contain. Some of this material can be tossed out.
But a lot cannot. Speechwriters do their best to keep this portion of State of the Union addresses thematically unified. They always fail.
How was this portion of President Bush's address? Just fine. The president's own interest in the speech came and went -- he appeared a lot more intent on making his tax cuts permanent than on modernizing the electricity grid. But his delivery remained well-paced, the text itself craftsmanlike. And it isn't really the rhetoric in this portion of any State of the Union address that matters in any event. It's the dollars. By contrast with the spree over which George W. Bush has so far presided -- as this newspaper has pointed out, Mr. Bush has increased discretionary domestic spending more than any chief executive since Lyndon Johnson -- the hodgepodge of proposals the president advanced on Tuesday appears restrained. Grade: A
* The Speech He Wanted to Deliver: In 1992, President George H. W. Bush delivered one of the best speeches-within-a-speech in any State of the Union address, speaking with feeling about the end of the Cold War.
"[C]ommunism died this year," the elder Bush proclaimed. "There are still threats. But the long, drawn-out dread is over."
On Tuesday, President George W. Bush delivered a speech-within-a-speech of his own, devoting it to the war on terror. These first 25 minutes of his address proved beautifully written and powerfully delivered. "The work of building a new Iraq is hard, and it is right," the president declared. "And America has always been willing to do what it takes for what is right." Yet something was missing. Although the president provided a compelling defense of his actions in the 28 months since 9/11, he told us almost nothing about what comes next.
"[N]early two-thirds of [al Qaeda's] known leaders have now been captured or killed," the president stated. Did he mean to suggest that the war on terror is two-thirds over? If not, why not? At times the president spoke as if the war would end as soon as we caught "the remaining killers." At other times he spoke as if the war would continue until we had transformed the entire Arab world, remaking a region that "remains a place of tyranny and despair and anger." Which does he intend?
As he proved in his defiant address on Sept. 20, 2001, nine days after the terrorist attacks, George W. Bush knows how to sound Churchillian. In the State of the Union address, he should have told us whether the war on terror has reached the beginning of the end or only the end of the beginning. Grade: Incomplete
* "Good Enough": The president's failure to lay out our next objectives in the war on terror strikes me as serious. On the other hand, you can submit President Reagan's 1984 State of the Union address to the most minute scrutiny but find only the broadest hints about what he intended to do in a second term. Yet later that year he carried 49 out of 50 states -- and by the time he left office he had won the Cold War.
A pretty good speech is often good enough. Overall Grade: B+
Tony Quinn, co-editor of the California Target Book, a nonpartisan analysis of California legislative and congressional campaigns, writing in the LAT (Jan. 25, 2004):
Not since the 1928 elections have the Republicans retained control of both Congress and the White House. Now that 76-year-old record may be about to fall: President Bush is looking stronger with the economy picking up, and Republicans seem likely not only to hold on to control of both houses of Congress but also to increase their numbers.
A GOP gerrymander of Democratic districts in Texas will probably add six to eight Republicans to the House. Every other big state is so heavily gerrymandered that no other major changes are likely. This means the GOP majority in the House is expected to grow by at least half a dozen seats.
The Rothenberg Political Report says that six Democratic Senate seats are in danger of falling to Republicans (five in the South), while only three GOP-held seats are similarly vulnerable. The Democratic minority will almost certainly have fewer seats in the next Congress than it has today.
How did we ever get to this: The nation's historic-majority Democratic Party reduced to a declining minority, and the second-banana Republicans suddenly running everything?
A good place to start is 1928, because U.S. politics seems to run in roughly 60-year cycles.
From the Civil War until 1928, Republicans were dominant, building a coalition from Civil War veterans, farm states and the emerging West. From the election of Franklin D. Roosevelt in 1932 until 1994, Democrats held both houses of Congress for all but four years and the White House for most of that time, especially in the New Deal years. They also held the most governorships and a majority of state legislatures. The New Deal coalition of the Southern poor, ethnic and minority voters and urban liberals held together, more or less, for six decades.
The big change on the national stage occurred in 1994, when the GOP achieved dominance in the South and its border states, and not on traditional economic issues but on cultural ones. While the Democrats have held their own in the industrial North and New England, they have declined in the South and much of the West over the last 40 years to a point of near-extinction. "Angry white males" in their pickups with gun racks have brought about fundamental political change by doing nothing more than shifting their loyalties from Democrats, based on economics, to Republicans, based on culture and values.
With Democrats getting down to the serious business of choosing an opponent to Bush, they face a basic issue: Can they reverse the cultural alienation that has cost them so many of their former core supporters? Watching their candidates pander to every socially liberal interest group suggests they cannot; indeed, they do not even acknowledge their predicament.
The lesson of Al Gore's defeat in 2000 is not Florida, which he would have won had he just gotten as many votes as the Democratic candidate for the U.S. Senate there, but places like eastern Kentucky and West Virginia, which didn't stray from their New Deal roots until 2000.
A cultural conservatism grounded in religion and traditional values is embedded in the South and its border states and is now more important to their voters than economic issues. Among the best measures of how people voted in 2000 was church attendance. People who attended church regularly overwhelmingly supported Bush; those who didn't went for Gore.
The Democrats' problem goes beyond simply being irreligious. There's an undercurrent of hostility toward religion in the highest ranks of the party. The Democrats' dismissal of Bush's "faith-based initiative" is just one example of their hostility. Even on an issue like abortion rights, on which Democrats are with the majority of the public, intolerance of any dissent has alienated a mainstay of the New Deal coalition, Roman Catholics.
That's ironic, because the last time the Democrats were in as bad a shape as they are today, 1928, the issue that did them in was religion, specifically the nomination of the Catholic Al Smith for president. Then, the party seemed too religious; now its problem is no religion at all. The gamble Democrats made in 1928, nominating a Catholic for president, paid off handsomely in the half century after Smith, as legions of Irish, Italian and Eastern European Catholics joined Southern and Western Protestants to vote again and again for Democratic majorities. John F. Kennedy's Catholicism is now regarded as crucial to his election in 1960, and he remains the last non-Southern Democrat to win the White House.
Yet, Democrats seem to have forgotten how important religious identification was to their success. Many Catholics and conservative Protestants once regarded Republicans as country-club elitists with whom they had little or nothing in common. A Southern Baptist named Harry S. Truman won in 1948 with solid Catholic and fundamentalist Protestant support.
Today, churchgoing Catholic voters are leaving the Democratic fold with accelerating speed, as are many traditional Protestants. The crucial and growing Latino constituency is more religious and more Catholic than mainstream Democrats, and it could be next.
Brendan I. Koerner, writing in Slate (Jan. 28, 2004):
Despite winning both the Iowa caucus and the New Hampshire primary, John Kerry trails Howard Dean on the delegate scorecard. How can Kerry have fewer delegates than the man he's twice trounced at the polls?
The discrepancy is due to the early whims of some unpledged delegates, colloquially known as superdelegates. Of the 4,964 delegates who will attend the Democratic convention in Boston this July, the majority are obliged to support specific candidates in accordance with how their respective states voted during primary season. But there are 801 delegates who won't be bound by such customs. These superdelegates—typically congressmen, party leaders, and other political bigwigs—can support whomever they please at the convention. The delegate scorecard so far, then, takes into account that just over a quarter of the superdelegates have already expressed a public preference for one candidate or another, and Dean has been a more popular choice than Kerry among this elite.
The Democratic National Committee created superdelegates as part of a 1982 overhaul of convention rules. In response to the furor at the 1968 convention, where street protestors railed against the rarefied nature of politics as usual, the party had opted to turn the nominating process entirely over to delegates picked in the primaries and caucuses, rather than giving party elders a backroom say. But after dark horses George McGovern and Jimmy Carter won their respective nominations, party leaders worried that the populist approach encouraged "insurgent" candidates who would tend to lose more often than not—Carter's 1976 triumph notwithstanding. The superdelegates, then, were intended to stabilize the process. As political insiders, they could generally be expected to cast their lot with mainstream candidates favored by the Democratic hierarchy.