Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Foreign Policy Research Institute (FPRI) (1-1-09)
Fifteen years ago, Samuel P. Huntington published, first as an article (“The Clash of Civilizations?,” Foreign Affairs, Summer 1993) and then as a book (The Clash of Civilizations and the Remaking of World Order, Simon and Schuster, 1996), his famous argument about the clash of civilizations. The clash that he was referring to was the clash between the West—Western civilization—and the rest. Of the rest, he considered that the greatest challenges to the West would come from the Islamic civilization and the Sinic, or Confucian, civilization. These challenges would be very different because these civilizations were very different. But together they could become a dynamic duo that might raise very serious challenges to the West.
Since Huntington published his thesis, these challenges have indeed occurred. We all know about the conflict that we have with at least the Islamic extremists, the Islamists within the Islamic civilization—not only with Al Qaeda since 9/11, but then in Iraq and now once again in Afghanistan. There is also the clash with the Chinese civilization. This takes a very different form, being much more a competition over “smart power” and “soft power” (more on which below) than over hard power. But this is also a conflict that may become more intense in the next generation. More recently, Huntington wrote a book about the clash within the West itself, and especially within the United States: Who Are We? The Challenges to America’s National Identity (Simon and Schuster, 2004). That is the struggle that I will address here—the internal conflicts over America’s identity.
These internal conflicts have been maturing over the course of some forty years. One might date them to 1968, the most pronounced year of the famous decade of the 1960s. One could thus call them a forty-year war; a decade or so ago, some people called them the “culture wars.” However, although this war has been going on for a long time, with truces and modi vivendi between the contending parties from time to time, we are now in a new stage because of the current world economic crisis. This is becoming very important in terms of the clash of civilizations within America itself.
For many years we have been governed by a free-market ideology. This is a distinctive product of the West in general, and more specifically of the Anglo-Saxon or Anglo-American tradition in economics and government and in the relationship between them; it has been especially pronounced in the United States and its Republican Party in the past few decades. This particular product of the West—the free-market ideology—is now under serious assault, not only from a counter ideology—i.e., the belief in extensive government regulation—but indeed from economic reality. A good argument can be made that what has recently happened to the world economy was a result of the free market being carried to an extreme.
This ongoing economic debate was refracted through the political elections of 2008. As a result, we now have the most dominant one-party government in America since 1964, when Lyndon Johnson won a handsome landslide and was supported by a heavily Democratic Congress. That Democratic dominance in government shortly issued in substantial social and political changes, most obviously the Great Society programs and a variety of other initiatives that greatly altered America and produced what is seen as the radical 1960s. There is a very good chance that the new Democratic administration and Congress will also make very significant changes. One expression of the West, i.e., the free market, will likely be confronted and put down by another expression of the West, i.e., extensive government regulation, something that has developed in Western societies over the last several generations. But there may be a better way to go than either of those two extremes, something that combines the best of the West and the best of these two extremes, while eliminating their dangerous consequences. I will call this Liberty under Law—the free market as refracted into something deeper—liberty—and government regulation as redefined into something much better—law....
Posted on: Friday, March 20, 2009 - 16:25
SOURCE: Austin American-Statesman (3-19-09)
During the economic crisis of the 1930s, two prominent institutions of higher learning introduced programs of study designed to prepare students for lives in a grim world. Neither institution used pre-professional courses, accountability measures, standardized testing or statistics about student earning outcomes. The University of Chicago introduced its great books program and, later, a sequence of courses in the humanities and the social, physical and natural sciences. The University of Texas started its Plan II program of rigorous courses in literature, philosophy, society, arts, mathematics and natural sciences.
Since 2004, I have traveled yearly as a national lecturer for the Phi Beta Kappa Society, our nation's oldest and most prestigious college honor society. Since 1776, membership in PBK has been one of the highest honors that can be conferred on undergraduate liberal arts and science students.
In the past 25 years, however, as American undergraduate education has emphasized more and more pre-professional "career-credentialing," students at about 100 institutions with PBK chapters fail to appreciate what becoming a member signifies for them and for our country. A survey identified one reason for their attitudes: "low prestige for academic achievement at the institution." What else do we expect when we promote career training, starting salaries and self-interest over cultivation of the mind, creativity and civic virtues?
The society's aim has stayed the same since 1779: to bring together the best minds across academic disciplines so they can communicate without reserve on matters of speculation with that freedom of enquiry which ever dispels the clouds of falsehood by the radiant sunshine of truth. Nowadays, such aims might seem old-fashioned or dangerously liberal.
In our nation, states, cities and towns, education, from pre-school through Ph.D., is influenced by special interest groups. These include zealots who put creationism on the same level as the biological science built on 130 years of research since Charles Darwin; visionaries who pay students for grades; and elected officials who hold schools, teachers and students accountable by evaluating majors according to the salaries earned by their graduates and who distort education into a quantifiable process of dispensing and acquiring rote knowledge.
In 20 years of guest-teaching in elementary and secondary schools, I have not met one teacher or administrator who thinks devoting precious classroom time to preparation for the Texas Assessment of Knowledge and Skills is helpful in educating their students. When the results are used to assign ratings to schools that will make or break their funding, the outcome is even worse. More time is lost in prepping students and, as The Dallas Morning News reported in June 2007, in Texas alone, "tens of thousands of students cheat on the TAKS test every year, including thousands on the high-stakes graduation test." Then there are the documented cases of schools and school districts themselves cheating to survive in the funding contest.
My own travels and talks for PBK have persuaded me all is not lost. First, PBK is not about politics. PBK chapters are places where scholars and students with different political outlooks come together and express ideas openly. I have been there and done that.
Second, professors who are devoted to PBK are active all across the country. They teach their students with contagious enthusiasm, as I have seen firsthand at the University of Arkansas, Eastern Illinois University and Jacksonville State University in Alabama, and at smaller jewels like Hendrix College in Arkansas, Gustavus Adolphus and St. Olaf in Minnesota, and Roanoke College in Virginia.
We don't need more evaluation standards, standardized tests, unreadable reports, cash bribes for students or politically calculated ideas from highly partisan government committees. Let's get out of the way of teachers at all levels who believe what Phi Beta Kappa stands for: "Love of learning will captain the ships of our lives." Their students will do the rest. All of our boats will rise.
Posted on: Thursday, March 19, 2009 - 19:12
SOURCE: Politico.com (3-19-09)
This week, President Barack Obama found himself in a difficult position. At the same time that the president has been trying to build support for his financial bailout program, the media reported that American International Group had paid out huge executive bonuses.
Americans on Main Street, who are struggling to survive, don’t want to see stories about the executives whose companies are receiving taxpayer dollars enjoying generous financial rewards. There is no way that such a story can play well politically.
Obama’s response was swift. The president asked, “How do they justify this outrage to the taxpayers who are keeping the company afloat? This isn’t just a matter of dollars and cents. It’s about our fundamental values.” He has promised to do everything in his power to stop these kinds of payments from continuing. Implicit in his statement is a threat to executives that if they want to continue receiving government assistance, they will have to crack down on this kind of behavior.
Is this kind of rhetoric risky for the president? Some opponents will certainly point to his words as a form of class warfare that Americans don’t like, a line of attack on Obama that began with Joe the Plumber.
But recent American history is filled with examples of presidents who lashed out against bad corporate behavior at the same time that they were trying to save capitalism from severe downturns or self-destruction. President Theodore Roosevelt was a famous “trust-buster” who called for federal regulation of the corporate world. He distinguished between “bad trusts” and “good trusts,” arguing that economic concentration was inevitable but that certain kinds of behavior had to be abolished so the modern corporate economy would remain strong. Roosevelt’s goal was not to end the concentration of economic power but to eliminate abuses.
Franklin D. Roosevelt made similar arguments during the 1930s. The goal of the New Deal was to save capitalism from the Great Depression and avoid the kind of economic and political systems that were taking shape in Europe. FDR often lashed out against capitalists who refused to play by the rules and who continued to resist any sacrifice. In 1935 and 1936, feeling pressure from the populist campaigns of Huey Long, who was assassinated in 1935, and Father Charles Coughlin, Roosevelt denounced corporate and financial titans. During his nomination acceptance speech in Philadelphia in the summer of 1936, FDR railed against a “small group” who had “concentrated into their own hands an almost complete control over other people’s property, other people’s money, other people’s labor — other people’s lives.” FDR won the 1936 election by a landslide.
In 1961, John F. Kennedy called on executives in the steel industry, as well as steel unions, to voluntarily hold down prices and wages during negotiations of their new contract. Under pressure from the president, the unions agreed. Steel executives agreed as well, but then reneged in April 1962 when the companies revealed that they would raise their prices by 3.5 percent. Roger Blough, chairman of U.S. Steel, refused to change his mind after a meeting with the president. Kennedy blasted Blough and his colleagues, telling advisers: “My father told me businessmen were all pr---s, but I didn’t really believe he was right until now.” The administration mounted a campaign against the steel executives that included investigations into possible collusion, leaking Kennedy’s statement about businessmen (cleaning it up to read “sons of b----es”) and issuing public statements in which Kennedy said Americans were facing a dangerous time in the Cold War. The administration also noted that unions had agreed to sacrifice for the national good. One by one, the steel companies relented and agreed to hold down prices. Public opinion overwhelmingly approved of Kennedy’s actions. The president followed up with positive statements about the corporate world.
History suggests that the public would support Obama in his effort to condemn bad capitalist practices while trying to demonstrate his strong support for private enterprise. Indeed, it has become very clear that this is exactly what he is trying to do with his economic recovery plan: inject sufficient government funds and impose regulations so private markets can work again.
Posted on: Thursday, March 19, 2009 - 19:09
SOURCE: Philadelphia Inquirer (3-19-09)
Imagine a drug that made American teenagers think and talk even more about the timeless concerns of adolescence: who's cool, who's cute, and who's going out with whom. Then imagine that millions of teens were taking this drug every day.
Actually, you don't have to. The drug already exists, and it's called MySpace. There's a competitor drug, too, known as Facebook.
Between one-half and three-quarters of American teens already have a profile on an Internet social network, where they spend hours per week - nobody really knows how many - sharing pictures, gossip, and jokes. And we should all be worried about that, although not for the reasons you might suspect.
Newspapers keep reminding us about "online predators" and other malfeasance on the Net, which makes us miss the digital forest for the trees. In this medium, the chief danger doesn't come from depraved adults. It's much subtler than that, and it comes from teenagers themselves - specifically, from their insatiable desire to hang out with each other.
And the key word here is insatiable. After all, teens have always wanted to hang out with each other. But the Internet lets them do it 24/7, transforming the social world of adolescence into an omnipresence.
Consider last year's MacArthur Foundation report on "digital youth," which confirmed that most teens communicate online with kids they already know, and that they're doing so more than ever. "Young people use new media to build friendships and romantic relationships, as well as to hang out with each other as much and as often as possible," the report found.
As the teenagers would say, "Duh!" Then they would ask, "What's the problem with that?"
Nothing, really, except for what it replaces: solitude. Once you're "always on," as the kids describe it, you're never alone.
That means you're less likely to read a book for pleasure, to draw a picture, or simply to stare out the window and imagine worlds other than your own. And as any parent with a teenager could testify, you're also less likely to communicate with the real people in your immediate surroundings. Who wants to talk to family members when your friends are just a click away?
True, many teens do communicate with strangers on the Net. But adolescents are also very adept at sniffing out "creepy" adults, a threat that has been vastly overblown by media reports.
Consider all of the ink spilled over Lori Drew, the Missouri woman who used a phony MySpace account to trick a teenager into believing that Drew was a male suitor. When the fake suitor dumped the teen and she committed suicide, you would have thought every kid in America was somehow in danger.
They're not - at least not from strangers. Although 32 percent of American teens say they have been contacted online by someone they don't know, just 7 percent report feeling "scared or uncomfortable" as a result, according to the Pew Research Center.
And when teens do feel hurt by something on the Internet, it usually comes from - surprise! - other adolescents at their schools. About one-third of teenagers say they have been the target of "online bullying," such as threatening messages or embarrassing pictures. But two-thirds of teens say bullying is more likely to happen offline. The Internet just makes it easier to do - and harder to escape.
If social networking sites had existed when I was a kid, I would have used them every bit as much as my teenage daughter does. With my own Facebook or MySpace page, I would have focused even more on all of the natural worries that permeated my adolescence: Am I cool? Am I cute? Will my peers like me? And it would have taken me a lot longer to become an adult.
So what should today's adults do in the face of this new challenge? We can try to limit our teenagers' computer time, of course, but that's probably a lost cause by now.
The better solution, as always, comes from the kids themselves. Teens around the country have started a small online movement against social networking sites, trying to make them seem uncool. My best friend's daughter just took down her Facebook page, for example, insisting that the site is "for losers."
So pass the word to every teen you know: Social networking is for losers. Just don't tell them I said so.
Posted on: Thursday, March 19, 2009 - 18:32
SOURCE: CNN (3-18-09)
As the budget debate heats up, Republicans are warning of socialism in the White House and claiming that Democrats are rushing back to their dangerous tonic of big government.
Speaking to the Conservative Political Action Conference, Rush Limbaugh warned that "the future is not Big Government. Self-serving politicians. Powerful bureaucrats. This has been tried, tested throughout history. The result has always been disaster."
On CNN, former Vice President Dick Cheney said he is worried that the administration is using the current economic conditions to "justify" a "massive expansion" in the government.
After the past eight years in American politics, it is impossible to reconcile current promises by conservatives for small government with the historical record of President Bush's administration. Most experts on the left and right can find one issue upon which to agree: The federal government expanded significantly after 2001 when George W. Bush was in the White House.
The growth did not just take place with national security spending but with domestic programs as well. Even as the administration fought to reduce the cost of certain programs by preventing cost-of-living increases in benefits, in many other areas of policy -- such as Medicare prescription drug benefits, federal education standards and agricultural subsidies -- the federal government expanded by leaps and bounds. And then there are the costs of Afghanistan and Iraq.
Federal spending stood at about $1.9 trillion in 2000, when Democrat Bill Clinton ended his presidency. In his final year in office, Bush proposed to spend $3.1 trillion for fiscal year 2009. President Obama's budget proposal for fiscal 2010 is $3.6 trillion.
Nor can Republicans blame a Democratic Congress for these trends. Much of the expansion took place between 2002 and 2006, when Republicans controlled both Congress and the White House. The Weekly Standard's Fred Barnes was writing about "big government conservatism" back in 2003.
Two years later, the right-wing Cato Institute published a report noting that total government spending had grown by 33 percent in President Bush's first term, lamenting that "President Bush has presided over the largest overall increase in inflation-adjusted federal spending since Lyndon B. Johnson."
There were some areas where Bush backed off government cuts because programs were too popular, like Social Security. In other areas, like federal education policy and prescription drug benefits, the president seemed enthusiastic about bigger government.
Bush and Cheney also embraced a vision of presidential power that revolved around a largely unregulated and centralized executive branch with massive authority over the citizenry. This was a far cry from the days of Ohio Sen. Robert Taft, a Republican who constantly warned about the dangers of presidential power to America's liberties.
After the 2008 election, Cheney was not apologetic. He explained that "the president believes, I believe very deeply, in a strong executive, and I think that's essential in this day and age. And I think the Obama administration is not likely to cede that authority back to the Congress. I think they'll find that given a challenge they face, they'll need all the authority they can muster."
Importantly, the marriage between conservatism and a robust federal government was not unique to the Bush presidency. The roots of Bush's comfort with government can be traced to the Republican Right in the 1950s, members of Congress who called for an aggressive response to domestic and international communism.
Presidents Dwight Eisenhower and Richard Nixon were two Republicans who pragmatically accepted that Americans had come to expect the federal government to protect against certain risks, and that trying to roll politics back to the pre-New Deal period would be politically suicidal.
"Should any political party," Eisenhower said, "attempt to abolish Social Security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history."
When Nixon and congressional Republicans battled with Democrats over Social Security between 1970 and 1972, the debate revolved around how much to expand the program. Congressional Democrats wanted to increase benefits through the legislative process, while Nixon wanted to index benefits so they automatically increased with inflation.
Nixon and Congress did both.
President Reagan backed off his most ambitious efforts to cut government, most dramatically when he abandoned his proposal to curtail Social Security after facing a fierce backlash, while the military budget boomed. President George H.W. Bush signed the Americans with Disabilities Act in 1990, which was one of the boldest regulatory expansions of government since the civil rights laws of the 1960s.
All of these presidents, particularly Nixon and Reagan, likewise promoted a muscular vision of presidential power that strengthened the authority of government and introduced concepts, such as the unitary executive, which would become the intellectual underpinning of the Bush administration.
"When the president does it, that means that it is not illegal," Nixon told David Frost in 1977. Like it or not, strengthening the presidency is one of the most important ways in which the role of government has grown since the nation's founding.
Fifty years of American history have shown that even the party that traditionally advocates small government on the campaign trail opts for big government when it gets into power. The rhetoric of small government has helped Republicans attract some support in the past, but it is hard to take such rhetoric seriously given the historical record -- and it is now a question whether this rhetoric is even appealing, since many Americans want government to help them cope with the current crisis.
Posted on: Wednesday, March 18, 2009 - 20:54
SOURCE: Slate (3-17-09)
In the 20th century, there were two main traditions of clean torture—the kind that doesn't leave marks, as modern torturers prefer. The first is French modern, a combination of water- and electro-torture. The second is Anglo-Saxon modern, a classic list of sleep deprivation, positional and restraint tortures, extremes of temperature, noise, and beatings.
All the techniques in the accounts of torture collected by the International Committee of the Red Cross from 14 detainees held in CIA custody, as reported Monday, fit a long historical pattern of Anglo-Saxon modern. The ICRC report apparently includes details of CIA practices unknown until now, details that point to practices with names, histories, and political influences. In torture, hell is always in the details.
The ice-water cure. "On a daily basis during the first two weeks I was made to lie on a plastic sheet placed on the floor which would then be lifted at the edges. Cold water was then poured onto my body with buckets. ... I would be kept wrapped inside the sheet with the cold water for several minutes. I would then be taken for interrogation," detainee Walid bin Attash told the Red Cross.
In the 1920s, the Chicago police used to extract confessions from prisoners by chilling them in freezing water baths. This was called the "ice-water cure." That's not its first use. During World War I, American military prisons subjected conscientious objectors to ice-water showers and baths until they fainted. The technique appeared in some British penal colonies as well; occasionally in Soviet interrogation in the 1930s; and more commonly in fascist Spain, Vichy France, and Gestapo-occupied Belgium. The Allies also used it against people they regarded as war criminals and terrorists. Between 1940 and 1948, British interrogators used "cold-water showers" as part of a brutal interrogation regimen in a clandestine London prison for German POWs accused of war crimes. French Paras also used cold showers occasionally in Algeria in the 1950s. In the 1970s, Greek, Chilean, Israeli, and Syrian interrogators made prisoners stand under cold showers or in cold pools for long periods. And American soldiers in Vietnam called it the "old cold-water-hot-water treatment" in the 1960s....
Posted on: Tuesday, March 17, 2009 - 18:19
SOURCE: Huffington Post (Blog) (3-16-09)
Democrats have been more willing to boast about their huge data base of small donors upon whom Obama relied. Using computer and telephone technology, as well as a motivated army of volunteers, Democrats did extremely well at raising small amounts of cash from donors who saw the possibility in his presidency.
In contrast, Democrats have not figured out how they feel about the wealthy individuals who have poured their money into the Obama campaign, as well as into the efforts in recent years to create a vibrant infrastructure of progressive media and Internet outlets. The Campaign Finance Institute reported that Obama's base of small donors was roughly equivalent to George W. Bush's base in 2004. But large donors were extremely important. A recent analysis by the Washington Post found that approximately one hundred families and couples donated $100,000 or more in 2007 and 2008 to Obama's campaigns. George Soros and Sheldon Adelson, the most talked about donors, have often been the focus of conservative attacks. Liberals have also come under fire from the right with questions about who provided the funds for the Center for American Progress, whose president, John Podesta, is in charge of Obama's transition and which is providing much of the intellectual firepower and personnel for the new administration.
Kim Phillips-Fein's new book, Invisible Hands, provides a fascinating account of how important wealthy donors were to the rise and success of conservatism. She traces a network of wealthy business leaders who, starting in the 1940s and continuing through the 1980s, poured their funds and energy into making conservatism a viable and powerful political force. While many historians have focused on the role of ideas, activists, and politicians, this work takes us to the people who put their money where their mouths were.
"It is a book," she writes, "about businessmen like Lemuel Boulware [of General Electric] who supported and helped to build the conservative movement that brought Reagan to power in 1980 . . . This book is about those determined few, those ordinary businessmen . . . from companies of different sizes and from various industries, who worked for more than forty years to undo the system of labor unions, federal social welfare programs, and government regulation of the economy that came into existence during and after the Great Depression of the 1930s."
There is a big cast of characters in this book, including many figures who are not well known but who were influential. For example, J. Howard Pew, president of Sun Oil, was determined to combat the influence of liberal ministers in the Protestant church. He helped to create an organization called Spiritual Mobilization, which paid to send conservative books to ministers and solicited money for the cause.
In another chapter of the book, Phillips-Fein recounts the history of The National Review, the journal famously founded by William F. Buckley Jr. that has helped spread and popularize the ideas of conservatism since the 1950s. Less often discussed are the people whose money allowed the magazine to happen. Buckley's fellow Yale graduate Roger Milliken, a wealthy industrialist in the textile industry who had been tough with unions in his South Carolina plant, purchased subscriptions for over a thousand of his friends and bought advertising space. According to Phillips-Fein, Milliken significantly increased the number of ads he bought when a revenue shortfall threatened the magazine's future early in its history. He also leaned on his friends to purchase advertisements. He was one of many such executives, like Lemuel Boulware of General Electric.
During the 1970s, businessmen funded the creation of numerous think tanks and advocacy organizations to fight unions and government regulation and to push for lower taxes, leaving behind a significant institutional infrastructure of organizations, such as the Heritage Foundation, that continue to influence public debates.
Progressive Democrats now have their own wealthy donors, though the list is not nearly as extensive as the one compiled by conservatives. In the coming years, they will have to decide what role wealthy donors should play in this burgeoning movement. There will be two ways to read this book. One is as a guide to what not to do, a model of how private money became extremely influential in the political world. The other is as a model of how money can be a useful and constructive force in the coming years.
More soon from the academy....
Posted on: Tuesday, March 17, 2009 - 14:48
SOURCE: Special to HNN (3-17-09)
They never said
"The schwartzes at the local high school are making it a much better school. They are wonderful students!"
"I love having schwartzes as our neighbors. They are so well mannered, and so polite."
"I am so excited, we're having the Jones family for dinner on Sunday afternoon. Whenever the schwartzes come over, I make my best pot roast."
"The schwartzes love Jewish deli almost as much as I love grits!"
"The Concord is my favorite hotel. At least half of the guests there are schwartzes, so you know everyone is going to have a good time."
But I heard plenty of the following:
"If the schwartzes keep coming into the neighborhood, I am moving to Queens."
"Even when the schwartzes are educated, they don't have the same moral standards we do."
"I am not letting my daughter go to Wingate. It's full of schwartzes!"
"He married a schwartze and his family disowned him. They are sitting shiva right now!"
Lest I be accused of fomenting anti-Semitism, let me make one thing perfectly clear: not all Jews of that generation were closet or open racists. At a left wing summer camp I attended, Camp Taconic, and at Erasmus Hall High School in Flatbush, where I transferred after getting in a fight at my local high school, I met many Jewish young people whose parents were militant anti-racists and participated in civil rights protests well before they became fashionable. Some of those people, whose houses I occasionally went to, spoke Yiddish as fluently as my parents and sent their children to left wing Yiddish "shules," but none of them EVER used the word "schwartze" in conversation. It was not a part of their family's vocabulary.
The refusal of left wing or anti-racist Jews to use the term casts doubt on Mason's claims that the word "schwartze" lacks pejorative connotations. While the word "schwartze" doesn't have the same awful history as the "N" word, or the same rage-filled connotations, it conveys a level of discomfort about Jewish encounters with Blacks that cannot be dismissed as "neutral." Given the history of Jews as an oppressed people, it is a discomfort tinged with ambivalence, but it is discomfort nonetheless. "Schwartze" was a term rarely used in anger, but often used in fear. It reflected a perception of Blacks as a dangerous "other," an alien people who might subject Jews to the same danger they had been in throughout most of their history.
Although I understand the experiences, and the emotions, that might lead some Jews to express their racial fears and animosities through a term like "schwartze," I would never use the word in conversation, and would not accept its usage from a casual acquaintance, much less from a friend.
Jackie Mason is on shaky ground in arguing that the word lacks negative connotations. "Schwartze" is a term loaded with racial meanings, and none of them are positive.
Posted on: Tuesday, March 17, 2009 - 13:33
SOURCE: Informed Comment (Blog run by Juan Cole) (3-17-09)
Dick Cheney: "I guess my general sense of where we are with respect to Iraq and at the end of now, what, nearly six years, is that we've accomplished nearly everything we set out to do...."
What has Dick Cheney really accomplished in Iraq?
Cheney avoids mentioning all the human suffering he has caused, on a cosmic scale, and focuses on procedural matters like elections (which he confuses with democracy -- given 2000 in this country, you can understand why). Or he lies, as when he says that Iran's influence in Iraq has been blocked. Another lie is that the US was fighting "al-Qaeda" in Iraq as opposed to just Iraqis. He and Bush even claim that they made Iraqi women's lives better.
The real question is whether anyone will have the gumption to put Cheney on trial for treason and crimes against humanity.
Posted on: Tuesday, March 17, 2009 - 13:02
When religious commentators or commentators on religion look at the theme, their observations tend to follow one of three lines. Those who are instinctive and ideological supporters of "unfettered" free markets report on and predict trends that will deprive churches of part of their mission, as government becomes increasingly involved in health, education, and welfare programs. Churches (and synagogues, et cetera) will be worse off than before the crunch and the crash. A second set urges religious organizations to do more and better than before with works of mercy (and justice), taking initiatives to help make the best use of enhanced governmental initiatives. In a sense, they say, thank God for government, secular organizations, and others who help feed and heal. The third set takes a long view, asking churches to reexplore their traditions (e.g., Catholic social teachings, which support the common good).
It is too soon to know how this recession-depression will work out for religious institutions and ideas. While watching and waiting, I decided to do what so many do: compare today to the Great Depression of the 1930s. I had explored religious roles and responses in my The Noise of Conflict: 1919-1941, the second volume in my Modern American Religion (University of Chicago, 1991). If we repeat in any way -- but who says we will, or must? -- what happened then, there is not a lot of cheer to be spread. I particularly relied on Samuel C. Kincheloe’s 1937 Research Memorandum on Religion in the Depression, an extensive, judicious, realistic survey done for the Social Science Research Council by a Chicago Theological Seminary (then Congregational) professor. Like so many other secular and mainstream Protestant analysts, he did not pay much attention to what was prospering when nothing else was: fundamentalism.
Some leaders hoped and prayed for religious revivals, none of which erupted. "There has been much emphasis on the belief that what society needs is religion," Kincheloe reported, and I observed, "but society evidently did not think so." Money problems limited church efforts to serve the poor, whose numbers grew exponentially. At the same time, deep believers within all congregations and denominations "did not fall away from faith merely because of economic trauma." The Christian Century editorialized, with a view on the past: "Did people not address this Depression religiously because for once they did not think it occurred under the providence of God?" The editorial conclusion: this may have been "the first time men have not blamed God for hard times." If that was true in 1935, it seems to be true today, too. There are accusers, accused, and commentators on all hands today, but one seldom hears that all the dealings, many of them now seen as greedy at best and criminal at worst, were anything but the results of individual and corporate folly and corruption. This time again, citizens can’t blame God for getting them into this, and are trying to find God-ly ways to get out of it...together.
Posted on: Monday, March 16, 2009 - 19:30
SOURCE: New Republic (4-1-09)
The word "liberal" was first used in its modern political sense in 1812, when Spaniards wrote a new constitution liberating themselves from monarchical rule. As it happens, the word "socialism" originated in roughly the same period; it came into existence to describe the utopian ideas of the British reformer Robert Owen. Such timing suggests two possibilities: Either the fates of liberalism and socialism are so interlinked that one is all but synonymous with the other--or the two are actually competitors developed to meet similar conditions, in which case victory for one marks the defeat of the other.
These days, one could be forgiven for believing that the former conclusion is correct. It was not so long ago that conservatives were equating liberalism with fascism; today, they have executed a 180-degree swing in order to argue that liberalism is actually synonymous with socialism. "Americans," proclaimed Republican Senator Jim DeMint at the recent meeting of the Conservative Political Action Conference, "have gotten a glimpse of the big-government plans of Obama and the Democrats and are ready to stand up, speak out, and, yes, even to take to the streets to stop America's slide into socialism." But it isn't just the right that has worked itself into a frenzy; on the question of whether we are approaching a new age of socialism, there seems to be remarkable political consensus. In recent weeks, the covers of National Review ("OUR SOCIALIST FUTURE"), The Nation ("REINVENTING CAPITALISM, REIMAGINING SOCIALISM"), and Newsweek ("WE ARE ALL SOCIALISTS NOW") have--respectively--lamented, heralded, and observed the coming rise of socialism.
But all these commentators--right, left, and middle--may want to take a deep breath. We aren't headed for an era of socialism at all, since socialism is not a natural outgrowth of liberalism. Liberalism is a political philosophy that seeks to extend personal autonomy to as many people as possible, if necessary through positive government action; socialism, by contrast, seeks as much equality as possible, even if doing so curtails individual liberty. These are differences of kind, not degree-- differences that have historically placed the two philosophies in direct competition. Today, socialism is on the decline, in large part because liberalism has lately been on the rise. And, if Barack Obama's version of liberalism succeeds, socialism will be even less popular than it already is.
Posted on: Monday, March 16, 2009 - 15:14
SOURCE: Informed Comment (Blog run by Juan Cole) (3-16-09)
Bob Dreyfuss has a postmortem on the scuttling by the Israel lobbies of the Chas Freeman appointment to the chairmanship of the National Intelligence Council. Dreyfuss shows that well-known figures in the Israel lobbies openly led the charge against Freeman, crowed at how they had succeeded, and then simultaneously denied their own existence. He argues that the lobbies were forced out into the open, which is bad for them, since lobbies thrive in obscurity. And he foresees their problems in shaping the Obama administration's policies toward Israel, as well as great difficulties in putting lipstick on the pig of the forthcoming Netanyahu-Lieberman government, which consists of soft fascists and outright racists.
Dreyfuss is a long-time observer of the Neoconservatives and of the less savory activities of the Israel lobbies, but he is likely wrong.
The Israel lobbies and the Neoconservative element among them in particular want the US to do to Iran what it did to Iraq, i.e. attack it, put it to the flames, and break its legs for decades to come.
The National Intelligence Estimates of the 16 US intelligence agencies on Iran of 2005 and 2007 were greeted with howls of outrage by the Israel lobbies. The first concluded that Iran was at least a decade from having the technical capacity to create a nuclear weapon, even if the international community let it freely import all the equipment it would need for that endeavor. The second concluded that there is no good evidence Iran even has a nuclear weapons program at all, or has done any weapons-related experiments since early 2003. (That is, the likelihood is that Tehran was afraid of Saddam's (non-existent) program, which the US had hyped, and gave up the experiments when they saw that he was about to be overthrown).
Freeman would have been in charge of editing future National Intelligence Estimates. As Andrew Sullivan rightly hinted, the Israel lobbies did not want someone there so unsympathetic to their conviction that Iran is an imminent and existential threat to Israel, and so unlikely to report out conclusions that would underpin a US war on Iran, or US permission to Israel to strike Iran. The NIC chairman's tenure can last for a decade, and the Israel lobbies' best hope for a war on Iran would come if the Republicans regained the presidency and at least the Senate in 2012. They would want cooked-up NIEs ready to go, as the deeply flawed 2002 Iraq NIE supported that war.
As for the idea that the Israel lobbies have been weakened by the affair, that does not follow. They have been operating in the open for decades. They torpedoed George Ball for secretary of state way back in the Carter administration, and everyone knew it then and knows it now. AIPAC set up the Washington Institute for Near East Policy explicitly to offset the then perceived even-handedness toward Israel of Brookings, and WINEP directors went on to hold highly influential and even decisive positions in the Clinton and W. Bush administrations. WINEP head Dennis Ross has a position in the Obama administration related to Iran policy, even though the WINEP web page hosted articles urging the bombing of Iran. These appointments are not made despite the WINEP connection, but because of it.
The strength of these lobbies comes from their passionate commitment to their cause, from excellent organizing skills, from their ability to unify around that cause across religious and ideological boundaries, and above all from their ability to leverage support serially on issues from likely allies. Thus, the leadership can arrange for millions of protest emails to be sent by evangelical Christians as well as by Jewish congregations. The New Republic takes the same side as Commentary. They succeed even though their most passionate projects are not supported, and are even opposed, by probably a majority of the American Jewish community. I doubt much hangs on money per se; real lobbying is often a relatively inexpensive affair. But behind-the-scenes concurrence of big players like the military-industrial complex on the desirability of a war is crucial if you are a lobby trying to get up a war for other reasons. They are very good at getting their way, they can think and plan carefully a good decade out, and they have virtually no effective opposition, which is the real secret of their strength. Did Obama get even 5,000 emails complaining that he did not stand by Freeman? Even that would be a tiny number compared to what Freeman's opponents can muster.
Since it is obvious that the Israel lobbies are working toward a war on Iran, and since I believe that a US or Israeli war on Iran would have extremely damaging effects on the US and on Israel, I am very worried about these lobbies' efficacy and proven track record in accomplishing their goals.
Gen. Jones should seek someone as level-headed on these issues as Freeman who has not expressed himself so publicly.
Posted on: Monday, March 16, 2009 - 14:38
SOURCE: Foreign Affairs (subscribers only) (3-1-09)
As the twentieth century drew to an end, it became clear that a major change was taking place in the countries of the Arab world. For almost 200 years, those lands had been ruled and dominated by European powers and before that by non-Arab Muslim regimes -- chiefly the Ottoman Empire. After the departure of the last imperial rulers, the Arab world became a political battleground between the United States and the Soviet Union during the Cold War. That, too, ended with the collapse of the Soviet Union in 1991. Arab governments and Arab dynasties (royal or presidential) began taking over. Arab governments and, to a limited but growing extent, the Arab peoples were at last able to confront their own problems and compelled to accept responsibility for dealing with them.
Europe, long the primary source of interference and domination, no longer plays any significant role in the affairs of the Arab world. Given the enormous oil wealth enjoyed by some Arab rulers and the large and growing Arab and Muslim population in Europe, the key question today is, what role will Arabs play in European affairs? With the breakup of the Soviet Union, Russia ceased to be a major factor in the Arab world. But because of its proximity, its resources, and its large Muslim population, Russia cannot afford to disregard the Middle East. Nor can the Middle East afford to disregard Russia.
The United States, unlike Europe, has continued to play a central role in the Arab world. During the Cold War, the United States' interest in the region lay chiefly in countering the growing Soviet influence, such as in Egypt and Syria. Since the end of the Cold War, U.S. troops have appeared occasionally in the region, either as part of joint peace missions (as in Lebanon in 1982-83) or to rescue or protect Arab governments from their neighboring enemies (as in Kuwait and Saudi Arabia in 1990-91). But many in the Arab world -- and in the broader Islamic world -- have seen these activities as blatant U.S. imperialism. According to this perception, the United States is simply the successor to the now-defunct French, British, and Soviet empires and their various Christian predecessors, carrying out yet another infidel effort to dominate the Islamic world....
Posted on: Monday, March 16, 2009 - 12:48
SOURCE: San Jose Mercury News (3-14-09)
There was a time when politicians thought they could learn from history. Today, history is used as a form of profanity.
The favorite historical curse word in Washington now is "socialism." Rather than buy into the rhetoric that "we are all socialists now," as a recent Newsweek article put it, the media need to call out politicians who rewrite the past. Distorting history is a dangerous game that poisons political debate.
Even before the current threat of bank and insurance company nationalizations, Republican leaders and right-wing commentators were hurling the epithet "socialist" at their Democratic opponents. On the campaign trail, John McCain repeatedly accused Barack Obama of harboring a socialist agenda. Things have not changed much since.
House Minority Leader John Boehner, R-Ohio, called Obama's policies an "American socialist experiment," the same day that Sen. Jim DeMint, R-S.C., described the president as "the world's best salesman of socialism." Republican presidential contender Mike Huckabee claimed that Obama was creating "the Union of American Socialist Republics," adding for good measure that "Lenin and Stalin would love this stuff."
Based on the latest Republican comments, one would think the painful history of the Soviet Union was caused by nothing more than the nationalization of a few banks. Of course, this was not the case. Socialism, as it was theorized in the 19th century and put into practice by the Soviets, was the nationalization of everything — all the "means of production": banks, yes, but also industries, farms, even private property. It was an economic philosophy that explicitly rejected the market economy and capitalism, seeking instead to have the state direct economic growth through "five-year plans" and other centralized measures.
Socialism breathed its final gasp in 1989, though it had been dying long before. Some commentators still call present European states (especially France) "socialist," but this claim is equally misleading. European socialist parties have long abandoned their old objective of nationalizing the economy.
Since François Mitterrand's aborted nationalization program in the 1980s, the French state sector has been steadily shrinking through privatization. The Scandinavian countries, sometimes celebrated as models of new socialism because of their cradle-to-grave welfare net, have very few nationalized industries. We are all capitalists now, at least in the West.
We can, and should, have a debate about the role of the government in health care, education, welfare and social security. But this is not a debate about socialism; it is a debate about what kind of capitalist society we want to live in....
Posted on: Sunday, March 15, 2009 - 19:57
SOURCE: New Republic (3-18-09)
How does one learn to construct and to lead a republic? Monarchies do not provoke this question, or at least not with the same urgency. When King George III took the throne in Britain in 1760, he had some thirty-three predecessors in England alone, if one goes back only to William the Conqueror, and fifty-odd predecessors if one goes back to Egbert, the first "King of All England," in the ninth century. George had all this history to consider, and also the histories of other monarchies and hierarchical structures the world over. When Alexander Hamilton, John Jay, James Madison, and their colonial friends framed and defended a Constitution for a newly united federation of American states, they, too, turned to history--but they had fewer examples on which to draw. They looked to the ancient Greek city-states, to the republics of northern Italy, to the Germanic League, and, of course, to the Roman Republic, from which they would take their greatest inspiration. They would have leapt for joy to have read Josiah Ober's new book.
As Hamilton, Jay, and Madison saw it, Greece presented little other than a cautionary tale to the eighteenth-century architects of a new and vast republic. Hamilton was characteristically high-flown:
It is impossible to read the history of the petty Republics of Greece and Italy, without feeling sensations of horror and disgust at the distractions with which they were continually agitated, and at the rapid succession of revolutions by which they were kept in a state of perpetual vibration between the extremes of tyranny and anarchy. If they exhibit occasional calms, these only serve as short-lived contrasts to the furious storms that are to succeed. If now and then intervals of felicity open themselves to view, we behold them with a mixture of regret, arising from the reflection that the pleasing scenes before us are soon to be overwhelmed by the tempestuous waves of sedition and party rage. If momentary rays of glory break forth from the gloom, while they dazzle us with a transient and fleeting brilliancy, they at the same time admonish us to lament that the vices of government should pervert the direction and tarnish the luster of those bright talents and exalted endowments, for which the favoured soils that produced them have been so justly celebrated.
The late eighteenth century took its view of ancient Athenian democracy primarily from its critics: Thucydides, Xenophon, Plato, Aristotle, Plutarch. Hamilton, Jay, and Madison were espousing the standard line, which was this: although there were differences between cities--as between democratic Athens and oligarchic Sparta--all the Greek city-states were given to instability and suffered ugly demographic crises induced by revolution and war. Moreover, the so-called democracies were not really "governments by the people" at all. As Madison put it, "In the most pure democracies of Greece, many of the executive functions were performed, not by the people themselves, but by officers elected by the people, and representing the people in their executive capacity."
Even considered as a republic governed primarily by elite representatives, Athens was to be judged a failure. Its leading statesman, "the celebrated Pericles," in Hamilton's words, routinely "sacrifice[d] the national tranquility to personal advantage, or personal gratification." Most egregiously, according to Hamilton, he started the Peloponnesian War to distract the Athenian populace from efforts to prosecute him for the misuse of state funds. So pure democracy necessarily devolves into either anarchy or rule by a corrupt managerial elite--this is the theoretical claim undergirding the commentary on Athens in The Federalist Papers and sustaining the authors' conclusion that Athens could not be a useful model for the citizens designing a new constitution in 1787 for a set of freshly united states.
In the early twentieth century, the sociologist Robert Michels formalized this theoretical claim as the "iron law of oligarchy." Any collectivity larger than a face-to-face society of a few hundred souls must develop formal organization if it is to succeed at pursuing its collective flourishing; and the imperatives of organization, Michels argued, will inevitably drive the evolution of political forms toward oligarchies ruled by a small elite corps of expert managers. On Michels's argument, a long-lived and competitively successful participatory democracy with direct rule by the people themselves--"pure democracy," in the words of Madison--is best understood as something like a unicorn: beloved for its purity, seen only in dreams.
Josiah Ober has made it his life's work to refute Michels's law, and to prove that "participatory democracy" can meet the demands of organization by developing institutional and cultural forms that effectively provide for a group's success over the long-term. By "participatory democracy," Ober means forms of political organization in which ordinary citizens, amateurs, really do make and implement critical policy decisions as well as sustain the systems of reward and sanction that keep the whole democratic machine functioning. A genuinely participatory democracy is not, contrary to our own regime, built around the principle of representation; but this does not mean that representative and participatory institutions are mutually exclusive forms of democratic organization, as I will suggest....
Posted on: Friday, March 13, 2009 - 14:39
SOURCE: NYT blog (3-12-09)
On Nov. 24, 1963, two days after John F. Kennedy’s assassination, President Lyndon B. Johnson met with his principal national security advisers to consider the most volatile issue he had inherited: Vietnam. A coup at the beginning of November — approved by the Kennedy administration — had toppled Ngo Dinh Diem’s government and taken his life. Concerns about the ability of his untested successors to withstand Vietcong insurgents backed by Ho Chi Minh’s North Vietnamese Communist regime gave Johnson a sense of urgency about an issue that could threaten United States interests abroad and undermine his standing at home.
Johnson’s first concern was to assure that he was acting in concert with Kennedy’s plans. But no one could provide authoritative advice on J.F.K.’s intentions. By increasing the number of military advisers in Vietnam from 685 to 16,700, Kennedy had indicated his determination to preserve Saigon’s autonomy. His agreement to a change of government in hopes of finding a leader who could command greater popular support than Ngo Dinh Diem seemed to confirm Kennedy’s commitment to preventing a Communist victory.
Lyndon Johnson tried to give his nation guns and butter. In the end, he provided neither.
At the same time, however, Kennedy had signaled his intentions to reduce America’s military role in Vietnam by directing that 1,000 of the advisers be brought home by the end of 1963. He had also rejected requests from his military chiefs for the use of American ground forces in the fighting. In addition, he had told several advisers that he intended to withdraw American military personnel from Vietnam after the 1964 election.
Since Kennedy had left no clear indication of what he would do in response to worsening conditions in Vietnam, Johnson was free to put his own stamp on American policy. And he did not hesitate to say what he planned. He chose to interpret Kennedy’s past actions as a commitment not to allow a Communist conquest. When his ambassador to Saigon, Henry Cabot Lodge, told Johnson that Vietnam “would go under any day if we don’t do something,” Johnson answered: “I am not going to lose Vietnam. I am not going to be the president who saw Southeast Asia go the way China went.” At the Nov. 24 meeting, he urged everyone “to devote every effort” to the war. “Don’t go to bed at night until you have asked yourself, ‘Have I done everything I could to further the American effort to assist South Vietnam,’ ” he urged.
Johnson also said that South Vietnam’s generals should be told that he would stand behind them, but he told his staff that he wanted something for his support. “I want ‘em to get . . . out in those jungles and whip hell out of some Communists,” he said. “And then I want them to leave me alone, because I got some bigger things to do right here at home.”
Johnson’s eagerness for quick success in Vietnam rested on a number of fears. First, he worried that if Vietnam went down it would provoke another round of McCarthyism: critics would attack him for weakness in fighting the Communists and blame him for losing Vietnam the way they had blamed Truman for losing China. That could cripple his ability to be an effective foreign policy leader. He also feared that a Communist victory in Vietnam would embolden the Russians and the Chinese to new acts of aggression in Europe and Asia and increase the risks of a nuclear war. Last, but certainly not least, he worried that his hope of becoming a great reform president, who changed the domestic life of the nation, would fall victim to a foreign policy debate over Vietnam.
As Johnson soon learned, despite his protests to the contrary, he could not have guns and butter. And though, as Lady Bird Johnson said, Vietnam “wasn’t the war he wanted. The one he wanted was on poverty and ignorance and disease … ”, once he committed himself to winning the war with a broad bombing campaign and 545,000 combat troops, he lost the freedom to build a Great Society. Protests against the loss of American and Vietnamese lives and the commitment of billions of dollars to fight the war drained away the country’s energy for large-scale domestic improvements.
Now that President Obama has inherited not one war but two, does he face a similar hurdle? With the country’s economy in such poor shape and his eagerness to enact bold health insurance, education and environmental reforms, he will need to recall that wars are the enemy of far-reaching change. World War I stopped Progressivism; in the 1940s, “Dr. Win the War replaced Dr. New Deal,” as Franklin D. Roosevelt said; the Korean War sidetracked Harry Truman’s Fair Deal; and Vietnam frustrated Johnson’s hopes of additional Great Society measures.
Mr. Obama’s commitment to maintain perhaps 50,000 troops in Iraq after the drawdown of combat forces over the next 19 months, combined with his decision to send an additional 17,000 troops (for starters) to Afghanistan, could be the beginning of an unwanted debate about commitments abroad. If the country begins to see mounting costs in lives and money from the administration’s war policies, it risks distractions from the more urgent designs the president described in his campaign and recent messages to the Congress and the country.
History is never a precise guide for current political actions. But the consistent negative impact of earlier foreign conflicts on grand projects at home is a cautionary tale that should command President Obama’s close attention. Guns and butter rarely mix.
Posted on: Friday, March 13, 2009 - 13:18
A recent article in The Atlantic and recently released Lutheran documents give good reasons to revisit the status of gays and lesbians across American society. Unfortunately, few commentators to date have addressed the most troubling development of the past few years: the growth of DOMA Laws, or "Defense of Marriage Acts." These laws are forms of religious violence.
The Federal Defense of Marriage Act, passed in 1996, stipulates that for the purpose of federal laws and operations, "the word 'marriage' means only a legal union between one man and one woman as husband and wife." According to domawatch.org – a website sponsored by supporters of these laws – thirty-seven states now have some form of DOMA Laws on the books. The rationales for such defensive laws are often couched in neutral, "secular", or "naturalist" language. But the move to establish such laws came from religious groups, notably conservative Protestants, Catholics, and Mormons. And the logic and appeal of these laws also originates in religion, and functions as a form of violence. Six theses can clarify the contours of the religious violence embedded in these laws.
1) DOMA Laws violate sacred texts. Many of the arguments against gay and lesbian civil unions or marriage appeal to biblical texts from Genesis, Leviticus, Romans, or I Corinthians. But such arguments impose upon the texts a twentieth century understanding of sexual identity alien to the Jewish or Hellenistic cultures in which these texts arose.
2) DOMA Laws elevate heterosexual marriage to idolatrous status. In some communities of faith, defending "marriage" has become all but an item of confessional status (it is absent from any historic Christian Confessions). This arrogates to a majority – heterosexuals – special privileges (economic, social, and spiritual) not available to sexual minorities.
3) DOMA Laws scapegoat gays and lesbians. As Rene Girard argues, scapegoating is a chief manifestation of religious violence. It is difficult to see what real threat is posed to heterosexual intimacy, much less to civil society, by the desire of homosexuals for similar rights. It is easy to see how DOMA laws organize consent over and against a relatively voiceless and powerless group.
4) DOMA Laws sacrifice homosexual rights, and damage civil society, in the interest of religious purity. One measure of the justice in any society is how well it cares for vulnerable members. Sexual difference marks individuals as both vulnerable and "dangerous." And as Mary Douglas showed, any "danger" against which a law must defend is invariably constructed around some purity interest. DOMA Laws require gays and lesbians to sacrifice rights others take for granted, and render them subject to legalized forms of exclusion and discrimination. They damage the deep trust that is the most important social practice in civil society.
5) DOMA Laws confuse legislation with religion, and violate the First Amendment, as Ann Pellegrini and Janet Jakobsen have argued. It is entirely permissible (although ethically subject to scrutiny) for private communities to shape the boundaries of association in whatever ways members agree upon. It is a violation of the First Amendment's protection of free association to inhibit by law some forms of association that pose no harm to the common good, and a violation of the freedom from an established religion when religiously-inspired exclusions are written into law.
6) DOMA Laws perpetuate an association of sex with power, and thereby do damage to any sacramental sensibility that might remain in association with even heterosexual marriage. As Hendrik Hartog and other historians have shown, marriages have shifted in the modern era from patriarchal patterns of coverture to social contracts in which couples seek mutual fulfillment. Such contracts might be compatible with a sacramental sensibility, since they entail pledges of sexual fidelity and commitments to share social resources and responsibilities, along with (one might argue) other gifts of God. DOMA Laws associate sexual fidelity with legislated forms of coercive power, and inhibit the deep trust and mutuality intrinsic to modern (and sacramental) marriage. They establish hierarchies of relationships, and associate heterosexual unions (and sexual practices) with dominance.
DOMA Laws have been passed with the support and lobbying of religious groups. Such laws point, unfortunately, to a deep tendency of religions to consolidate power through exclusion, as Miroslav Volf has so cogently shown; these laws have no rationale for their existence apart from that exclusion. People who wish to "defend" marriage against corrosive influences – and I count myself as one – might actually find allies among gays and lesbians who desire public recognition for their pledges of fidelity and their commitments to share resources and responsibilities with one another. A true defense of marriage would not involve mean-spirited exclusions, but would embrace practical policies that strengthen deep trust and support families facing economic challenges.
Paul Elie’s article in The Atlantic, "God, Grace, and Sex," is online as "The Velvet Reformation" at http://www.theatlantic.com/doc/200903/archbishop-canterbury/2.
The Social Statement "Human Sexuality: Gift and Trust" and the ELCA’s recommendations on ministry practices are online at http://www.elca.org/What-We-Believe/Social-Issues/Social-Statements-in-Process/JTF-Human-Sexuality.aspx.
Posted on: Thursday, March 12, 2009 - 20:37
SOURCE: Histori-blogography (10-17-05)
I searched news stories, and it looks like no one is reporting on the implications of this clause. But it looks to me like Iraq is about to ratify a constitution that forbids Iraqi authorities to hand suspected insurgents to the US military -- which seems to me to be a pretty significant development, yeah?
How do other people read this one? What do you think it adds up to?
See also Article 30, in which we learn that the state "guarantee[s] to the individual and family -- especially women and children -- social and health security and the basic requirements for leading a free and dignified life. The state also ensures the above a suitable income and appropriate housing." Read the rest of Article 30, too.
Then read Article 31: "Every citizen has the right to health care. The state takes care of public health and provide[s] the means of prevention and treatment by building different types of hospitals and medical institutions."
Looking forward to the RNC press release trumpeting the enshrinement, by American armed force, of a state-run economy with income guarantees and single-payer health care.
Straight through the looking glass, here.
ADDED STILL LATER:
Don't miss Articles 112, 113, and 117, which establish "the region of Kurdistan," and a general framework of Iraqi regionalism, under rules that (at least on my first reading) would make John Calhoun smile: "In case of a contradiction between regional and national legislation in respect to a matter outside the exclusive powers of the federal government, the regional authority shall have the right to amend the application of the national legislation within that region."
Farther down in Article 117, we learn that each region gets its own distinct "police, security forces, and guards of the region."
Did we just help to create a Kurdish army on Turkey's southern border?
Posted on: Thursday, March 12, 2009 - 15:41
SOURCE: In These Times (10-10-05)
We who were born at the end of the Weimar Republic and who witnessed the rise of National Socialism are left with that all-consuming, complex question: How could this horror have seized a nation and corrupted so much of Europe? We should remember that even in the darkest period there were individuals who showed active decency, who, defying intimidation and repression, opposed evil and tried to ease suffering. I wish these people would be given a proper European memorial—not to appease our conscience but to summon the courage of future generations.
Let’s consider not the banality of evil but its triumph in a deeply civilized country. After the Great War and Germany’s defeat, conditions were harsh and Germans were deeply divided between moderates and democrats on the one hand and fanatic extremists of the right and the left on the other. National Socialists portrayed Germany as a nation that had been betrayed or stabbed in the back by socialists and Jews; they portrayed Weimar Germany as a moral-political swamp; they seized on the Bolshevik-Marxist danger, painted it in lurid colors and stoked people’s fear in order to pose as saviors of the nation. In the late ’20s a group of intellectuals known as conservative revolutionaries demanded a new volkish authoritarianism, a Third Reich. Richly financed by corporate interests, they denounced liberalism as the greatest, most invidious threat, and attacked it for its tolerance, rationality and cosmopolitan culture. These conservative revolutionaries were proud of being prophets of the Third Reich—at least until some of them were exiled or murdered by the Nazis when the latter came to power. Throughout, the Nazis vilified liberalism as a semi-Marxist-Jewish conspiracy and, with Germany in the midst of unprecedented depression and immiseration, they promised a national rebirth.
Twenty years ago, I wrote about “National Socialism as Temptation,” about what it was that induced so many Germans to embrace the terrifying specter. There were many reasons, but at the top ranks Hitler himself, a brilliant populist manipulator who insisted and probably believed that Providence had chosen him as Germany’s savior, that he was the instrument of Providence, a leader who was charged with executing a divine mission.
God had been drafted into national politics before, but Hitler’s success in fusing racial dogma with a Germanic Christianity was an immensely powerful element in his electoral campaigns. Some people recognized the moral perils of mixing religion and politics, but many more were seduced by it. It was the pseudo-religious transfiguration of politics that largely ensured his success, notably in Protestant areas, where clergy shared Hitler’s hostility to the liberal-secular state and its defenders, and were filled with anti-Semitic doctrine.
German moderates and German elites underestimated Hitler, assuming that most people would not succumb to his Manichean unreason; they didn’t think that his hatred and mendacity could be taken seriously. They were proven wrong. People were enthralled by the Nazis’ cunning transposition of politics into carefully staged pageantry, into flag-waving martial mass. At solemn moments the National Socialists would shift from the pseudo-religious invocation of Providence to traditional Christian forms: In his first radio address to the German people, 24 hours after coming to power, Hitler declared, “The National Government will preserve and defend those basic principles on which our nation has been built up. They regard Christianity as the foundation of our national morality and the family as the basis of national life.”
To cite one example of the acknowledged appeal of unreason, Carl Friedrich von Weizsaecker, Nobel-laureate in physics and a philosopher, wrote to me in the mid-’80s saying that he had never believed in Nazi ideology but that he had been tempted by the movement, which seemed to him then like “the outpouring of the Holy Spirit.” On reflection, he thought that National Socialism had been part of a process that the National Socialists themselves hadn’t understood. He may well have been right: The Nazis didn’t realize that they were part of a historic process in which resentment against a disenchanted secular world found deliverance in the ecstatic escape of unreason. German elites proved susceptible to this mystical brew of pseudo-religion and disguised interest. The Christian churches most readily fell into line as well, though with some heroic exceptions....
Modern German history offers lessons in both disaster and recovery. The principal lesson speaks of the fragility of democracy, the fatality of civic passivity or indifference; German history teaches us that malice and simplicity have their own appeal, that force impresses and that nothing in the public realm is inevitable.
Another lesson is the possibility of reconstruction, for the history of the Federal Republic since World War II, a republic that is now 55 years old, exemplifies success despite its serious flaws and shortcomings. Postwar Germany made a democracy grow on what was initially uncongenial ground, when its people were still steeped in resentment and denial. American friendship supported that reconstruction, especially in its first decade.
I fear that an estrangement is now taking place. German acceptance of Western traditions has been the precondition for its gradual reconciliation with neighbors and former enemies. The German achievement is remarkable—but it too needs constant protection.
My hope is for a renewal on still firmer grounds of a trans-Atlantic community of liberal democracies. Every democracy needs a liberal fundament, a Bill of Rights enshrined in law and spirit, for this alone gives democracy the chance for self-correction and reform. Without it, the survival of democracy is at risk. Every genuine conservative knows this.
Posted on: Thursday, March 12, 2009 - 15:37
SOURCE: This American Life (NPR) (2-27-09)
... David Beim has a much more profound reason why banks shouldn't lend. He shows me something on his computer.
David Beim: Ok, so here is a picture, a graphic, and a chart that goes back to 1916 and up to...
Alex Blumberg: We’re in his office, and we’re looking at a graph, and it's, basically, a measure of how much debt we, the citizens of America, are in. How much we all owe--on our mortgages and credit cards and auto loans--compared to the economy as a whole, the GDP. And for most of history, the amount we owed was a lot smaller than the economy as a whole.
This ratio, household debt to GDP, bounces along between 30 and 50 percent for most of the '30s, '40s, '50s, '60s, and '70s, right into the '80s. Then it breaks through 50% in the '80s and starts heading up in the 1990s. And then ...
David Beim: From 2000 to 2008, it just goes, almost a hockey stick, it goes dramatically upward.
Alex Blumberg: Like a rocket.
David Beim: It hits 100% of GDP. That is to say, currently, consumers owe 13 trillion dollars when the GDP is $13 trillion. That’s $13 trillion owed by individuals. That is a ton.
Alex Blumberg: I'm going to ask a leading question, because I’m looking at a graph right here. Tell me professor, has there ever been a time where we owed that much before?
David Beim: I’m glad you asked me that. And guess what? The earlier peak, which is way over on the left part of the chart, where debt is 100% of GDP, was in 1929. This is a map of twin peaks. One in 1929 and one in 2007.
Alex Blumberg: Does that chart scare you?
David Beim: Yes. That chart is the most striking piece of evidence that I have that what is happening to us is something that goes way beyond toxic assets in banks, it’s something that had little to do with mortgage securitization, or ethics on Wall Street, or anything else. It says the problem is us. The problem is not the banks, greedy though they may be, overpaid though they may be. The problem is us. We have over-borrowed. We have been living very high on the hog. We are, our standard of living has been rising dramatically over the last 25 years, and we have been borrowing to make much of that prosperity happen....
Posted on: Thursday, March 12, 2009 - 14:59