Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: NYT (1-6-06)
Mr. Sharon will also be known as the chief architect of the Likud Party's settlement drive in the occupied territories. His defeat, as prime minister, of the second Palestinian intifada will doubtless be carefully studied, once the hysteria and hype die down, as a model of a relatively clean, successful counterinsurgency.
But that is for the future. Meanwhile, Mr. Sharon's stroke has plunged Israel and the region into deep confusion.
Just a few days ago, there were a handful of certainties. All the polls indicated that in the coming Israeli general elections, scheduled for late March, Mr. Sharon's new Kadima Party would win handily, reinstalling him in the premiership. It was not clear how large a mandate he would enjoy or who would be his coalition partners. But a Sharon-led Israel was a certainty.
Another certainty was that his next term in office would be shadowed by the corruption investigation and charges that have already forced the resignation of his son, Omri Sharon, from the Knesset. But again, this scandal was not expected to be a coalition- or career-breaker: Israeli society has become too jaded, or simply faces too many existential problems, to give much weight to personal miscreancy.
Most important, there was a vague certainty that there would be further steps toward a pacification of Israel-Palestine and a separation of its two warring tribes into two relatively homogeneous states. Mr. Sharon had shown the way, courageously, remorselessly, six months ago with the uprooting of the Jewish settlements and the withdrawal of the Israel Defense Forces from the Gaza Strip. And he had shown the way, in defiance of often absurd and mendacious criticism by the Palestinians and their supporters, by pushing forward with the construction of the barrier - overwhelmingly a fence, not a wall - between the Arab West Bank and (Jewish) Israel more or less along the 1967 Green Line....
Posted on: Friday, January 6, 2006 - 12:05
SOURCE: WSJ (1-6-06)
Mr. Sharon has been intimately identified with every major event in that history. An infantry officer in the desperate battle for the Jerusalem corridor in the 1948 War of Independence, leader of the paratroopers in the 1956 Sinai campaign, he rose to the rank of general and commanded divisions in the Six Day War of 1967 and the 1973 Yom Kippur War. As a government minister, he was the architect of the Israeli invasion of Lebanon in 1982, and the primary force behind the settlement movement. With the sole exception of Shimon Peres, he has been a member of the Knesset longer than any other Israeli, and he remains unsurpassed in his ability to forge and maintain coalitions. He began his political career on the left, swung sharply right, and concluded in the center. Mr. Sharon, more than any single Israeli, represented the finest ideals of the Jewish state--its heroism, resilience and versatility--as well as many of its most controversial policies.
And, like Israel, Mr. Sharon was a ganglion of contradictions. The party he formed in 1977, Shlomzion, advocated negotiations with the Palestine Liberation Organization and the creation of a Palestinian state in territories captured by Israel in 1967.
Joining the Likud, however, then under the leadership of Menachem Begin, Mr. Sharon became an unremitting foe of the Palestinian organization and its leader, Yasser Arafat. A Palestinian state already existed, Mr. Sharon claimed, situated in a large part of what was formerly British Mandated Palestine and home to a large Palestinian majority--Jordan--and there was no need to establish another.
In the mid-1970s, he staunchly opposed the peace overtures of Egyptian President Anwar Sadat, and promoted the construction of Israeli settlements in the occupied Sinai Peninsula. But the same Sharon also uprooted settlements and withdrew Israeli troops from Sinai in 1982 to fulfill the terms of Israel's peace agreement with Egypt. Several Israeli leaders, Begin included, feared that Mr. Sharon posed a threat to the country's democracy. Nevertheless, when a state investigation found him morally culpable for the massacre of Palestinian civilians by Christian militiamen in Beirut's Sabra and Shatilla refugee camp, then-Defense Minister Ariel Sharon promptly complied with the commission's finding and resigned....
Posted on: Friday, January 6, 2006 - 11:52
SOURCE: Financial Times (UK) (1-5-06)
As Tony Blair and George W. Bush struggle desperately to nail down their legacies, Mr Blair appears to have one big advantage: he can pick the timing of his departure. Mr Bush, like all lame duck US presidents, has no option but to limp through to the finish line. The British prime minister can choose his moment to get out on top. But the lesson of history suggests something quite different. Mr Bush is the fortunate one to have no choice in this matter. It is Mr Blair and the Labour party who are now trapped.
When Enoch Powell, the former Conservative minister, said that all political careers end in tragedy, he was clearly not thinking of two-term US presidents. In the postwar era, Truman, Eisenhower, Reagan and Clinton all benefited from having to see out their second terms, no matter how bad things got. At least two of these (Truman and Clinton) would probably have been forced from office under a parliamentary system as a liability to their parties. The case of Truman should be of particular comfort to the present US administration. An inexperienced president who owed his office more to chance than any obvious ability, he seemed out of his depth early in his second term, despite having pulled off a plucky and unexpected election victory. His commitment to an unpopular and unwinnable war in Korea sent his approval ratings to record lows. Still, he had no choice but to plough on. His doggedness eventually won him the respect of the public and ultimately that of historians too. Mr Bush will do his utmost to follow this example.
Mr Blair has tried to fashion an escape route from Powell's iron law of parliamentary life by constructing a pseudo-presidential term limit for himself. But in attempting to retain control over the timing of his departure, all he has done is create the worst of both worlds. Labour is now stuck in a version of the executioner's paradox, which says that if you tell a prisoner you will execute them by surprise one morning over the following week (to spare them a sleepless night), you will find you cannot execute them at all. By Thursday evening it will no longer be a surprise, which rules Friday out; so by Wednesday evening it will no longer be a surprise, which rules Thursday out, and so on. The result is no one sleeps a wink. Mr Blair's announcement that he would serve a full term before standing down was intended to quell the speculation that would otherwise surround his every move but it has had the opposite effect. Everything Mr Blair does is enveloped in a miasma of second-guessing because no one believes he can see it through to the end. Were he to make it near to the end of this parliament, both his enemies and his supporters would cease to believe he had any intention of standing down. What the executioner's paradox shows is that you cannot set an end point to your ambitions and preserve your freedom of manoeuvre beforehand....
Posted on: Friday, January 6, 2006 - 04:14
SOURCE: NYT (1-5-06)
Is there some human failing that affects second-term presidents, like arrogance or sheer fatigue? To some degree, perhaps. But the main problem is not personal but institutional - or rather constitutional, as embodied by the 22nd Amendment limiting presidential tenure.
A second-term president will, in effect, automatically be fired within four years. Inevitably his influence over Congress, and even his authority over the sprawling executive branch, weaken. His party leadership frays as presidential hopefuls carve out their own constituencies for the next election. Whether the president is trying to tamp down scandal or push legislation, he loses his ability to set the agenda.
But whether or not a president has a diminished second term, the amendment barring a third term presents the broader and more serious question of his accountability to the people.
While political commentators analyze every twist in White House politics, while citizens follow dramatic stories of leaks, investigations and indictments, the one person who does not have to care is George W. Bush. In a sense, he has transcended the risks and rewards of American politics. He will not run again for office. The voters will not be able to thank him - or dump him.
And yet accountability to the people is at the heart of a democratic system.
There was nothing in the original Constitution of 1787 that barred a third or fourth term for presidents. That was why Franklin Delano Roosevelt could run again in 1940 and 1944, becoming the only president to serve more than two terms. And that was why, three years later, in 1947, after sporadic public debate, Republicans demanded presidential term limits and changed the Constitution.
With majorities in both chambers of Congress, Republicans, joined by Southern Democrats opposed to the New Deal, were able to push the 22nd Amendment through the House (after only two hours of debate!) and the Senate (after five days of debate). At the time, an amendment limiting presidents to two terms in office seemed an effective way to invalidate Roosevelt's legacy, to discredit this most progressive of presidents. In the House, one of the few Northern Democrats to vote with the majority was freshman representative John F. Kennedy, whose father had fallen out with Roosevelt. In the spring of 1947, as the historian David Kyvig noted, 18 state legislatures rushed to ratify the amendment, with virtually no public participation in the debate. By 1951, the required three-fourths of the state legislatures had ratified it.
While George Washington limited himself to two terms, it had never been his intention to create a precedent. Washington didn't want to die in office and have the succession appear "monarchical." But his primary reason for retiring was simply that after a lifetime of public service, he was bone-tired, desperate to return to the tranquillity of Mount Vernon. ...
Posted on: Friday, January 6, 2006 - 00:07
SOURCE: National Post (1-5-06)
Israel's prime minister, Ariel Sharon, has suffered a massive brain hemorrhage; at the very least, his long political career appears to be over. What does that mean for Israeli politics and for Arab-Israeli relations?
Basically, it signals a return to business as usual.
Since the State of Israel came into existence in 1948, two points of view on relations with the Arabs have dominated its political life, represented by (as they are presently called) Labour on the left and Likud on the right.
Labour argued for greater flexibility and accommodation with the Arabs; Likud called for a tougher stance. Every one of Israel's 11 prime ministers came from one of these two parties; not a single one came from the plethora of others. The two parties together suffered a long-term decline in popularity, but they jointly remained the pivots and kingmakers of Israeli electoral life.
Or so they did until six weeks ago. On Nov. 21, Sharon left Likud and formed his own party, called Kadima. He took this radical step in part because his views vis-à-vis the Palestinians had evolved so far from Likud's nationalist policies, as shown by his withdrawal of Israeli forces and civilians from Gaza during mid-2005, that he no longer fit there. Also, he had achieved such personal popularity that he attained the stature to found a party in his own image.
His move was exquisitely timed and enormously successful. Instantly, the polls showed Kadima effectively replacing Labour and Likud. The latest survey, conducted by "Dialogue" on Monday and published yesterday, showed Kadima winning 42 of the 120 seats in the Knesset, Israel's parliament. Labour followed with 19 seats and Likud trailed behind with a dismal 14.
Kadima's stunning success turned Israeli politics upside-down. The historic warhorses had been so sidelined that one could speculate about Sharon forming a government without even bothering to ally with one or the other of them.
Even more astonishing was Sharon's personal authority in Kadima; never had Israel witnessed the emergence of such a strongman. (And rarely have other mature democracies; Pim Fortuyn in the Netherlands comes to mind as another exception.) Sharon quickly lured to Kadima prominent Labour, Likud and other politicians who shared little in common other than a willingness to follow his lead.
It was a daredevil, high-flying, net-less, bravura, acrobatic feat, one that would last only so long as Sharon retained his magic touch. Or his health.
I was skeptical of Kadima from the very start, dismissing it just one week after it came into existence as an escapist venture that "will (1) fall about as abruptly as it has arisen and (2) leave behind a meager legacy." If Sharon's career is now over, so is Kadima's. He created it, he ran it, he decided its policies, and no one else can now control its fissiparous elements. Without Sharon, Kadima's constituent elements will drift back to their old homes in Labour, Likud, and elsewhere. With a thud, Israeli politics returns to normal.
Likud, expected to slip into a dismal third place in the March voting, stands to gain the most from Sharon's exit. Kadima's members came disproportionately from its ranks, and now Likud conceivably could, under the forceful leadership of Benjamin Netanyahu, do well enough to remain in power. Likud's prospects look all the brighter given that Labour has just elected a radical and untried new leader, Amir Peretz.
More broadly, the sudden leftward turn of Israeli politics in the wake of Sharon's personal turn to the left will stop and perhaps even be reversed.
Turning to Israeli relations with the Palestinians, Sharon made monumental mistakes in recent months. In particular, the withdrawal of all Israelis from Gaza confirmed for Palestinians that violence works, prompting a barrage of rockets on Israeli territory and an inflammation of the political temperature.
As Israel settles back to a more normal state, with no politician enjoying Sharon's outsized popularity, governmental actions will again come under closer scrutiny. The result is likely to be a less escapist and more realist set of policies toward the Palestinians and perhaps even some forward movement toward a resolution of the Israeli-Palestinian war.
Posted on: Thursday, January 5, 2006 - 10:50
SOURCE: NY Daily News (1-4-06)
[Richard Arum and Jonathan Zimmerman teach at New York University's Steinhardt School of Education. Arum is the author of Judging School Discipline: The Crisis of Moral Authority and Zimmerman is the author of Whose America? Culture Wars in the Public Schools, both from Harvard University Press.]
Sam Alito is a conservative. Eschewing judicial activism, Alito defers to the good-faith efforts of prosecutors, police officers, prison wardens, trial judges, and juries. Like newly confirmed Chief Justice John G. Roberts, Alito sees himself as an umpire rather than a player. Unless there's a clear and obvious violation of the rules, then, Alito will let the game go on.
That's the received wisdom about Supreme Court nominee Samuel A. Alito, whose confirmation hearings will start next week. And the received wisdom is correct, with one glaring exception: the governance of schools. In his 15 years as an appellate judge, Alito deferred to everyone but school boards, principals, and teachers. When it comes to education, in fact, Sam Alito is no conservative. Instead, he's a raging judicial activist.
Take his dissent in C.H. v. Oliva, in which a New Jersey school removed a child's poster of Jesus from a Thanksgiving display. Although the school returned the poster to the display the very next day, Alito said courts should entertain a lawsuit by the child's parents. "Public school authorities may not discriminate against student speech based on its religious content," Alito wrote. "Recognition of this important principle would not interfere with the operation of the public schools."
We disagree. Over the past four decades, courts have shown a disturbing penchant for meddling with the day-to-day operations of schools. As a result, principals and teachers have lost a good deal of the moral authority that they once possessed. In too many instances around the country, activist judges--not schools--determine what's best for American children.
And Sam Alito has been more activist than most. Examining all of Alito's rulings and dissents in cases related to educational practice, we found that Alito was more than twice as likely as the average appellate judge to side with challengers to American schools. And in cases involving K-12 public school discipline and student expression, Alito favored challengers five times out of eight.
In Saxe v. State College Area School District, for example, Alito struck down a school anti-harassment policy that had been designed to protect gay students, among others. Why? Because conservative parents said the policy would limit students' ability "to speak out about the sinful nature and harmful effects of homosexuality." In Pope v. East Brunswick Board of Education, meanwhile, Alito censured school officials for refusing to award official recognition to a student Bible club.
The common theme here is an extreme skepticism--bordering on cynicism--about school officials' motives, competence, and judgment. Of course principals and teachers must remain sensitive to the free-speech rights of all students, including right-wing Christians. But we also need to give these adults enough respect and authority to uphold these freedoms and to balance them against other, equally important values....
Posted on: Wednesday, January 4, 2006 - 23:55
SOURCE: NY Sun (1-3-06)
In Baden-Württemberg, Heribert Rech of the ruling Christian Democratic Union party has overseen the administering of a 30-topic loyalty test for applicants to become naturalized citizens. Following an intensive and sophisticated study of Muslim life, the Baden-Württemberg government developed a manual for the naturalization authorities explaining that applicants for citizenship must concur with the "free, democratic, constitutional structure" of Germany.
Because survey research finds that 21% of Muslims living in Germany believe the German constitution irreconcilable with the Koran, the written yes-no questions of yesteryear are history for Muslim applicants for citizenship. As of January 1, 2006, immigration officers who suspect Islamist leanings are instructed to probe further. Personal interviews will now last an hour or two and will be given to an estimated half of naturalization applicants.
The questions amount to a summary of Western values. What do you think of democracy, political parties, and religious freedom? What would you do if you learned about a terrorist operation underway? Views of the attacks of September 11, 2001, are a "key issue," the director of the alien registration office in Stuttgart, Dieter Biller, said: Were Jews responsible for it? Were the 19 hijackers terrorists or freedom fighters? Finally, nearly two thirds of the questions concern gender issues, such as women's rights, husbands beating wives, "honor killings," female attire, arranged marriages, polygyny, and homosexuality.
Responding to critics, the Interior Ministry denies discrimination against Muslims, insisting on the need to find out whether the applicants' expressed views on the German constitution correspond to their real views. Applicants who pass the test and are granted citizenship could later lose that citizenship if they act inconsistently with their "correct" answers.
Adding extra requirements of Muslim applicants for citizenship is not unique to Germany; in Ireland, for example, male candidates are made to swear they will not marry more than one wife.
The second initiative originates in Lower Saxony, where the interior minister, Uwe Schünemann, also a CDU member, has stated he would consider making radical Islamists wear electronic foot tags. Doing so, he says, would allow the authorities "to monitor the approximately 3,000 violence-prone Islamists in Germany, the hate preachers [i.e., Islamist imams], and the fighters trained in foreign terrorist camps." Electronic tags, he suggested, are practical "for violence-prone Islamists who can't be expelled to their home countries because of the threat of torture" there.
The electronic tagging of terror suspects is also not unprecedented. In Britain, the method has been used since March 2005 and, after a glitch-plagued start, it has been applied to ten suspects with reasonable success. In Australia, counterterrorism measures implemented last month permit tagging for up to a year.
But Mr. Schünemann's proposal goes well beyond these applications, tagging not just potential terrorists, but also "hate preachers" who break the law not by personally engaging in violence but by articulating beliefs that encourage others to commit it. Tagging them breaks new conceptual ground by aggressively going to the ideological source of violence.
It has potentially large implications. If hate preachers are tagged, why not the many other non-violent Islamists who also help create an environment promoting terrorism? Their ranks would include activists, artists, computer gamers, couriers, funders, intellectuals, journalists, lawyers, lobbyists, organizers, researchers, shopkeepers, and teachers. In short, Mr. Schünemann's initiative could lead ultimately to the electronic tagging of all Islamists.
But electronic tags reveal only a person's geographic location, not his words or actions, which matter more when dealing with imams and other non-violent cadres. With due allowances for personal privacy, their speech could be recorded, their actions videoed, their mail and electronic communications monitored. Such controls could be done discreetly or overtly. If overt, the tagging would serve as a modern scarlet letter, shaming the wearer and alerting potential dupes.
The Schünemann proposal points to the urgent need to develop a working definition of Islamism and Islamists, plus the imperative for the authorities to explain how even non-violent Islamists are the enemy.
Messrs. Rech and Schünemann have presented two bold tactics for the defense of the West, premised in each case on an understanding that culture and ideas are the real battleground. I salute their creativity and courage. Who will next adapt and adopt these initiatives?
Posted on: Tuesday, January 3, 2006 - 12:27
SOURCE: THE AUSTRALIAN (1-2-06)
There is a dark history to the collection of Aboriginal human remains. The collectors in the 19th and early 20th centuries robbed recent graves and burial sites. Aborigines feared to die because of the prospect that their bones would not be properly interred but be taken off to museums and laboratories. Scientists measured Aboriginal skulls on live and dead bodies to demonstrate the alleged inferiority of the Aboriginal race.
The move to repatriate human remains is prompted by the very proper recognition that amends must be made for these barbarities. However, there is still a scientific interest in human remains, not to demonstrate racial difference but to understand the common story of the evolution and history of humankind.
With the advance in scientific inquiry, new discoveries can be made by the examination of human remains that have long been held in museums; and as techniques are further refined, yet more discoveries will be possible.
Robert Foley, the director of the Centre for Human Evolutionary Studies at Cambridge University, writes: ''In the last decade alone it has become possible to extract DNA from ancient bone and thus, for the first time, be able to say something about prehistoric people's genetic make-up. Other new techniques allow us to reconstruct, through minute chemical traces, the way someone's diet may have changed as they grew, even to say how old they were when they were weaned.
''The history of human health and disease, as well as patterns of growth and diet, is etched in the skeletons that have been recovered from around the world.''
The movement of Aboriginal people to this continent and their survival here is a central part of the saga of the spread of humankind across the globe.
The museum is well placed to advance a new vision: that Aboriginal and settler societies constitute together the history of humankind on this continent; how both societies have used the land and have been shaped by it and how for better or worse it is the common inheritance of all who live here.
Viewed in this way, there is no prehistory of Australia; the Aboriginal human remains are fully part of Australian history, research into which is listed as one of the statutory functions of the museum.
Posted on: Tuesday, January 3, 2006 - 08:41
SOURCE: The Toronto Star (1-3-06)
Across the country, voters had had enough. The Liberals had displayed an unacceptable arrogance - and for far too long. They had rammed through a controversial bill to build a pipeline; shown wanton disregard for taxpayers' money; and introduced a less than satisfactory budget. Their leader looked old and tired.
When Prime Minister Louis St. Laurent called an election in the spring of 1957, the Liberals had been in power since William Lyon Mackenzie King had defeated R.B. Bennett's Conservatives in 1935. They believed that they were the "natural governing party" and entitled to rule for as long as they deemed fit.
True, the Liberals did not have a minority government in 1957. Yet that same Liberal arrogance at the forefront of Canadian politics for the past 12 years under Jean Chretien and Paul Martin - not to mention the sponsorship scandal and the Liberal belief that they know how to spend taxpayers' money better than the citizens who earn it - is at the heart of the current election campaign.
Today's Liberals, however, have one thing going for them that their counterparts in 1957 did not: a Conservative leader who has yet to capture the imagination and trust of the voters.
As they embarked on the 1957 campaign, St. Laurent and the Liberals ignored two key factors: the public's demand for change and the persona of the new Conservative leader, John Diefenbaker.
It was not the Liberal decision to support an American-controlled company - it planned to build a pipeline for Alberta gas to be sent to Ontario consumers - that eventually cost the party victory in the June 10, 1957 election. Rather, it was that the Liberals forgot the sacred public trust that is at the root of governing in Canada.
As prominent Ottawa journalist Blair Fraser wrote a few months before the vote, "Political historians may well conclude that the Liberals fell, not because of any one policy, and certainly not a pipeline policy of which the average voter knew little and cared less, but because they failed to observe the proper limits of power."
As for Diefenbaker, he was a charismatic force on the campaign trail.
Whereas St. Laurent appeared weary and was portrayed as "yesterday's man," Diefenbaker, although he was 62 years old (St. Laurent was 75), seemed fresh and honest.
He railed against the Liberal "dictatorship" and the party's "mockery" of Parliament during the pipeline debate. More important, he had a vision about a "new national policy" and "one unhyphenated Canada" that would restore the country's sagging spirit.
Current Conservative Leader Stephen Harper is no Diefenbaker. He lacks Dief's passion and oratorical skills. But Harper can also learn a great deal from Diefenbaker's performance during the 1957 election.
Posted on: Tuesday, January 3, 2006 - 08:29
SOURCE: PBS NewsHour Interview (1-2-06)
For a historical look at how executive powers have been wielded and their impact on governance, we're joined by Richard Norton Smith, director of the Abraham Lincoln Presidential Library and Museum in Springfield, Ill.; Ellen Fitzpatrick, a professor of history at the University of New Hampshire; and Andrew Rudalevige, professor of political science at Dickinson College and author of "The New Imperial Presidency: Renewing Presidential Power after Watergate".
Professor Rudalevige, either in method or degree are we in new territory with the Bush assertions of executive power?
ANDREW RUDALEVIGE: Well, I think we've pushed beyond old boundaries, but that is not to say these boundaries were very clearly drawn at any time throughout history. After Sept. 11, President Bush certainly was very aggressive in pushing forward an agenda of unilateral executive power arguing that this was justified, of course, by the crisis and also authorized by Congress, which after all after Sept. 11 passed a very broad resolution. Still, even before Sept. 11, the Bush administration had argued very forcefully that executive power had been reined in too far. I think you could argue that argument went too far, that presidential power throughout the post-war era anyway and even earlier than that had grown beyond what the framers certainly had in mind.
RAY SUAREZ: Richard Norton Smith, an aggressive agenda of unilateral power?
RICHARD NORTON SMITH: An aggressive agenda that no doubt the White House would say was a response to events beyond anyone's imagination, and in some ways the justification being put forth has its roots in Lincoln's Doctrine of Necessity 140 years ago.
Remember it was Lincoln who suspended habeas corpus, one of the most cherished rights of a free people, during the Civil War. He famously said, although it was sometimes necessary to amputate a limb to save a life, it was never wise to end a life to save a limb. In other words, the Doctrine of Necessity gave the president inherent power to temporarily suspend a clause of the Constitution in order to put down a rebellion by those who would trash the entire document.
Now that, obviously, can be a slippery slope and that is part of the debate that has been going on as long as there has been a Constitution and of which this is the latest chapter.
RAY SUAREZ: Professor Fitzpatrick, the president, in a recent news conference, cited Article 2 of the Constitution as the platform he was standing on, more or less, for these authorities, for his privileges in doing the kinds of things he was doing in wartime. What does Article 2 say and does he have a good case?
ELLEN FITZPATRICK: Well, the Constitution gives the president power as executive and as Commander in Chief, and in the Federalist Papers, Madison made the point that one of the reasons for transferring power from the state governments to the central government was to protect against foreign enemies, to really protect the security of the country. But originally the notion of being Commander in Chief was explicitly divided from the notion that the president would decide when it was appropriate to wage war; that power very explicitly went to the Congress. So, the Constitution is very clear on this point.
What has happened historically, however, in times of war is that the Congress has ceded authority -- enormous authority in many cases -- to the president to take on powers during times of war because of this unusual situation and the importance of protecting the security of the nation. I think the historical context for what is going on now is really the expansion in the post-World War Two period of the national security state. It is the idea of national security, the doctrine of national security, that is being invoked now to justify a range of actions on the part of the Bush administration in the context of a proclaimed war on terror that has no endpoint.
RAY SUAREZ: Well, you talked about how the Constitution expressly gives Congress the power to make war, but the United States' flag and its forces have gone a lot of places in the last 50 years without a declaration of war.
ELLEN FITZPATRICK: Yes.
RAY SUAREZ: Is this part of a general erosion of Congressional power?
ELLEN FITZPATRICK: It is certainly the case that very few military interventions in American history have, in fact, come about as a result of a congressional declaration of war. And increasingly in the post-World War Two period, what the idea of national security does is to put the United States in a state of permanent military readiness. It redefines foreign policy problems as threats to the security of the nation, and it turns foreign policy goals or aspirations into necessities for the nation's survival. Once that happens, what we've seen in the last 50-plus years is that repeatedly the president invokes that idea, that doctrine, to take unilateral action and to consult the Congress belatedly, if at all.
RAY SUAREZ: Professor Rudalevige, is this a zero-sum game, that if the executive is gaining power it is almost, by definition, Congress that is losing it?
ANDREW RUDALEVIGE: Well, I don't buy that quite. Over time you have seen certainly the growth of a large American state, something that didn't exist in the 18th century: not only the military establishment that Professor Fitzpatrick talked about, but also a large executive establishment in domestic affairs, a regulatory state. And as that grows, the government's power grows and the practical necessity of centralized leadership grows. The president has, in fact, been that person. Really, under our Constitutional system, he is the only focal point of national leadership. But, that said, Congress needs to be defining the goals of that leadership, right? Where should it be going? In what direction should the country be going?
ELLEN FITZPATRICK: Yes.
ANDREW RUDALEVIGE: Congress has been, I think, asleep at the switch in recent years. The imperial presidency can really only be empowered by an invisible Congress and so, right now, I think we do have a zero-sum game. That's not a necessity.
RAY SUAREZ: Richard Norton Smith, is this a game of ebbs and flows as well? When the crisis passes, when the war is over, other forces in the country reassert their power and chip away at the expanded presidency?
RICHARD NORTON SMITH: Actually, that is what's been the case for most of our history, Ray. If you look at the 19th century after the Mexican-American War and the Polk presidency, the presidency was in many ways downsized, perhaps tragically, in the years before the Civil War. There was a reaction after the Civil War to the enormous concentration of personal power that had flowed to the White House, and you had a series of relatively weak presidents. After World War One, in the 1920s, likewise, there was this reversal of power -- a lot of it back to Congress, a lot of it back to the states and the localities.
Picking up on what my colleagues have said, this is a game of Constitutional see-saw. What changed that dynamic really was the permanent Cold War. Just a few years after we witnessed the surrender of Japan and Nazi Germany, we undertook a whole different kind of foreign policy. It was called Communist containment. Harry Truman fought a war in Korea that he called a police action, not a war. Truman and Eisenhower and other presidents used the CIA and other instruments to overthrow hostile governments around the world -- all in the name of protecting the United States from this constant menace. It was, in effect, a permanent state of siege. And I do agree with Ellen, there are some parallels with the current war on terror.
But if I can say something heretical, we historians are complicit, to some degree. You know, Teddy Roosevelt a hundred years ago had what he called his Stewardship Theory of the presidency, which said the president could do anything the Constitution did not explicitly prevent him from doing. And for most of the last century, most historians have almost reflexively celebrated the powerful, strong, energetic, agenda-setting, Congress-dominating president, and, to some degree, the chickens have come home to roost.
Posted on: Tuesday, January 3, 2006 - 02:27