Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: New Republic (8-14-12)
Jennifer Burns is Assistant Professor of History at Stanford University and the author of Goddess of the Market: Ayn Rand and the American Right.
In the heyday of her celebrity, it often seemed that the only appropriate public response to Ayn Rand was dismissal. In 1961, Newsweek magazine sent a reporter to investigate the growing circle of devotees clustered around the right-wing novelist. Visiting the New York City headquarters of Rand’s Objectivist movement, the reporter declared the Russian-born Rand an "apparition" with a "glare that would wilt a cactus." After a similar pilgrimage, a writer for Life magazine forthrightly concluded that Rand was the leader of a cult. A review of Rand’s essay collection Capitalism: The Unknown Ideal in The New Republic simply referred to Rand as "Top Bee in the communal bonnet, buzzing the loudest and zaniest throughout this all but incredible book."
And yet, some fifty years later, Rand is the avowed intellectual inspiration of presumptive GOP Vice Presidential nominee Paul Ryan. Ryan offers no apologies for his interest in Rand’s philosophy and makes little effort to hide his allegiances. Just how did Rand travel from the fringes of a 1960s subculture to the heart of American politics?
It clearly wasn’t via the traditional institutions of mainstream conservatism. The original mandarins of the conservative movement, from William F. Buckley, Jr. to Whittaker Chambers, all roundly rejected her atheistic philosophy of selfishness and her assertion that capitalism was a moral system.
Rather, Rand made her fortunes among the young foot soldiers of the right in the 1960s, who thrilled to her iconoclastic rejection of mainstream values. Rand’s Objectivism, as she called her comprehensive philosophical system, attacked all American pieties but one: The national creed of getting rich. This put her in perfect step with the anti-authoritarian mood of the times—while offering the additional benefit, unlike your average hippy guru, of not threatening her followers’ material fortunes.
But Rand’s philosophy would have gone nowhere if it were confined to its original adherents...
Posted on: Wednesday, August 15, 2012 - 14:23
SOURCE: Chronicle of Higher Ed (8-14-12)
Robert Zaretsky is a professor of history in the Honors College at the University of Houston.
The distant but growing sigh rising across the nation is the sound of humanities professors writing—or tweaking—their syllabi for fall classes. Like Labor Day, the writing of the syllabus has become an empty ritual of late summer—a thoughtless activity that has overtaken (or shunted aside) the practice it is meant to sustain.
Strictly speaking, a syllabus is a course outline that tells the student what books to read and when to read them, what papers to write and when to hand them in, and what subjects will be discussed and when students need to be ready to discuss them. In a word, it is a checklist. It is also the dark side of teaching—or, more accurately, of the telling of the past.
For most humanities disciplines, the syllabus is harmless, and often even helpful. It is both a checklist and a contract offered by the professor. It resembles the bullet-point pamphlet the plumber walks you through, patiently explaining all the things he will do in return for sticking you with a bill rivaling your child's college tuition. With the syllabus, a professor informs the student: Here is what I will do, here is what I expect you to do. Just as a homeowner may decide to turn the malfunctioning outdoor Jacuzzi into a compost bin, the student will decide if she truly needs a course on, say, medieval scholasticism....
Posted on: Wednesday, August 15, 2012 - 08:24
SOURCE: NYT (8-14-12)
Posted on: Wednesday, August 15, 2012 - 08:20
SOURCE: American Interest (8-12-12)
Walter Russell Mead is professor of foreign affairs and the humanities at Bard College and editor-at-large of The American Interest.
With Governor Romney’s selection of Wisconsin Congressman Paul Ryan as his running mate, the vague contours of the presidential race have suddenly become sharper. Up until now, partly because Romney’s image has been so fuzzy, we were looking at a referendum on President Obama rather than a clear-cut contest between political philosophies. Now, given Ryan’s prominence as a budget hawk and entitlement reformer, the public has a choice to make.
On the one hand, President Obama and Vice President Biden stand foursquare for the growth of what I’ve been calling the blue social model. In terms of government policy, they want to continue to grow the mix of interventions, guarantees, entitlements and programs that FDR launched in the New Deal, that Lyndon Johnson extended in the Great Society, and that various presidents (of both parties — think of Nixon and the EPA and W and the prescription drug benefit) have extended since.
This is a bolder stance than the Clinton approach. Bill "the era of big government is over" Clinton was a small ‘c’ conservative: he aimed to conserve the bulk of the entitlement state by trimming a few of its less popular features like welfare payments not linked to work. President Obama, who succeeded at passing health care where Clinton failed, has bigger ambitions, and intends to press ahead with the characteristic direction of American politics in the last two thirds of the twentieth century — towards a more powerful, more purposeful and more intrusive federal state.
Beyond that, Obama and Biden will be running on the blue social model as a way of life. The mass production, mass consumption society of Fordist America saw stable employment at good wages for most people in the US. For Obama and Biden, that kind of America is what Francis Fukuyama called the end of history: a relatively egalitarian income distribution, a stable employment picture, defined benefit pension programs for more and more workers, a gradually rising standard of living, more kids spending more years in school from generation to generation and a government of Keynesian macro-economists who keep the economy on an even keel.
For the Obamians, this is the ideal form of society. The apparent creaks and strains of the last thirty years — rising income inequality, stagnating real wages, economic volatility — are the result of policy errors rather than historical forces. Bad, selfish people have dismantled the regulations and controls that kept a healthy middle class economy in place and like Toad of Toad Hall in The Wind in the Willows, reckless rich nincompoops have driven the national economy — and the blue social model — into the ditch. President Obama’s goal is to bring back the good old days, and make them better yet. His methods are classic tools of the progressive movement of the twentieth century and he believes that there is much, much more that government can do to make our country richer and our society more just.
The Republican challengers will be attacking this vision head on...
Posted on: Tuesday, August 14, 2012 - 14:10
SOURCE: Joplin Globe (8-12-12)
Steve Harmon is a professor at Pittsburg State University in Pittsburg, Kan.
The fire at the mosque of the Joplin Islamic Society, located in Carl Junction, in the wee hours of Aug. 6 has been called “suspicious” by local investigating authorities. Destroying a mosque is not only a crime, it’s a hate crime.
We, as Americans or as Christians, have no quarrel with Muslims. There is no war between the West and Islam or between the United States and Muslims. The war is within Islam. It is a war between violent, extremist Muslims, who comprise a tiny minority of Islam, and peaceful, moderate Muslims, who comprise the vast majority. It is a battle for the soul of Islam and for the future of the Middle East.
The United States and its Western allies have been caught in the crossfire. What we, as Americans, need to do is to support moderate Muslims in their struggles against the extremists....
Posted on: Monday, August 13, 2012 - 13:34
SOURCE: NYT (8-11-12)
Posted on: Sunday, August 12, 2012 - 21:29
SOURCE: CS Monitor (8-10-12)
Jonathan Zimmerman is a professor of history and education at New York University. He is the author of “Small Wonder: The Little Red Schoolhouse in History and Memory” (Yale University Press).
Last Sunday, Wade Michael Page killed six people at a Sikh temple in Wisconsin before being wounded by a police officer and taking his own life. Two days later, Jared Lee Loughner pleaded guilty to killing exactly the same number of people at an Arizona shopping mall last year.
Who committed a worse crime? At first glance, it seems like a ridiculous question: One murderer’s victims are as dead as the other’s. But under US state and federal “hate crime” laws, the answer is probably Page. And that might be the most ridiculous thing of all.
An avowed white supremacist, Page most likely targeted his victims because they were of a different color, or perhaps because he mistook them for Muslims. All but three states now have laws providing for enhanced penalties when a crime is motivated by racial, ethnic, or religious prejudice. So does the federal government, which broadened its hate-crime law in 2009 to include attacks based on sexual orientation.
Page, then, would be judged more harshly than Mr. Loughner – whose goal was to murder former Congresswoman Gabrielle Giffords – and possibly than James Holmes, who allegedly killed 12 moviegoers at a Colorado theater last month....
Posted on: Friday, August 10, 2012 - 13:51
SOURCE: Huffington Post (8-9-12)
Bill O'Reilly invited me on his Fox News show Wednesday night because he was upset that I called him a "right wing buffoon" in my Huffington Post article about Pete Seeger's appearance on the Colbert Report. If I'd called him a "conservative pundit," I doubt he would have asked me on the O'Reilly Factor.
The reference to O'Reilly in my Huffington Post article was an after-thought. I was praising Stephen Colbert for inviting Seeger on his show and I suggested that Colbert lead a campaign to get Pete nominated for the Nobel Peace Prize. I wrote that: "Colbert's show -- including his faux campaign for president, his Super PAC, and his nightly send-up of Bill O'Reilly's right-wing buffoonery -- brilliantly satirizes the absurdities of America's corporate-dominated political culture. By heading a campaign to get Pete Seeger the Nobel Peace Prize, Colbert would actually demonstrate that the forces of social conscience can triumph, against the odds."
Soon after my Huffington Post article appeared online, O'Reilly's producer contacted me to ask if I'd like to appear on the show to discuss the article. Once I got the invitation, I called a number of my friends, all of them media savvy, two of whom had been on O'Reilly's show, to ask for their advice. Some of them advised me to reject Bill's offer. Bill controls the microphones, the camera, and the agenda, interrupts his guests (particularly his occasional liberal guests), and heaps abuse and scorn on them. Others encouraged me to go on the show but advised me not to get distracted by his bullying and to stay "on message," no matter what he said or how often he interrupted me.
I decided to accept Bill's invitation. I thought he might want to discuss my idea about a Nobel Peace Prize for Pete Seeger. I figured Bill would attack Pete for having been a Communist and for his history of left-wing activism, so I came prepared with a list of Pete's many political and cultural accomplishments, as well as the fact that Pete was a World War Two veteran (compared with O'Reilly's lack of military service), had been married to the same woman for 70 years (a true exemplar of "family values"), and was single-handedly responsible for cleaning up the Hudson River.
But it was clear from the get-go last night that Bill didn't want to talk about Pete Seeger. He wanted to talk about why I called him a "right-wing buffoon" and paint me as an example of the alleged left-wing bias rampant on America's college campuses today.
I figured that Bill would ask me to explain why I called him a "right-wing buffoon." I had three possible ways to respond.
I have to give Bill credit. Although he smugly patronized me by calling me "Doc," insulted my intellectual integrity, and disparaged the Huffington Post, he mentioned my book -- and showed a copy of its cover -- several times, and even called me an "honest guy." Several times, when I kept talking even though he was trying to interrupt me, he backed off and let me finish. He clearly had the upper hand in controlling the "debate," but he let me have my say.
More interesting than my back-and-forth with Bill was the immediate reaction to my appearance on his show.
Within minutes of going off the air on O'Reilly I started getting emails from Bill's fans, most of them saying exactly the same thing in exactly the same words. This suggests that this immediate flurry of emails -- most of them from people who didn't include their names -- was somehow orchestrated. It is hard to believe that it could have been random. Most of them accused me of being a typical left-wing college professor poisoning the minds of the next generation, and most of them spewed with vitriol and hatred. Here's an example:
You FUCKING CLUELESS ASSHOLE.......being a typical liberal you couldn't give O'Reilly an example of your propaganda bullshit. I get so sick of pieces of shit like you and your ilk. It is shame you weren't at the midnight showing of Batman in Aurora. It's stunning how your kind loves to suck Obamao's dick (does his semen taste good? The earthquake that's gonna cause california to fall into the ocean can't come soon enough. But in the meantime more and more of cali cities are gonna file bankruptcy. How are those socialist programs working out? I'm no republican(libertarian) but I get sick of them not using your boy's, Saul Alinsky, tactics. How's it feel? See you in hell asshole!
This is the kind of person who watches O'Reilly, listens to Limbaugh, and supports the Tea Party. I didn't go on the O'Reilly Factor thinking that I'd persuade most of Bill's loyal viewers, like the nut-case who wrote that email, about the importance of Jane Addams, Martin Luther King, Saul Alinsky, Ella Baker, Betty Friedan, and Pete Seeger.
But I was pleased to also get a handful of emails from O'Reilly watchers who said they'd buy the book. In fact, my publisher informed me that within minutes of my appearance on the show, sales of the book jumped dramatically, based on its ranking on Amazon.com. So, thanks, Bill!!
Also, some other more mainstream media outlets saw the show and asked to interview me about the book.
Was it worth spending five minutes on the O'Reilly Factor, taking Bill's abuse, and having to read the outpouring of hate email from the kind of deranged people who represent the small right-wing fringe of American politics?
I'm not sure. Except that my 15-year-old daughter told me that she was proud of me. That made it worth it.
Posted on: Friday, August 10, 2012 - 12:09
SOURCE: Dissent (7-30-12)
Kristen Loveland is a PhD candidate at Harvard University, studying the ethical debates surrounding new reproductive technologies in Germany.
It was easy to miss, but “the worst attack on Jewish life since the Holocaust” took place in Germany at the end of June, according to Pinchas Goldschmidt, one of Europe’s leading rabbis. Goldschmidt was referring to the judgment of a regional German court that the circumcision of boys is a criminal act. The practical effect of the ruling is still to be determined, and German lawmakers are currently debating the need for new legislation. It has nonetheless struck many as bizarre that Germany, after decades spent memorializing the Holocaust, would consider banning a critical Jewish rite.
We should hesitate, however, to read the decision through the lens of anti-Semitism. For one, the case concerned the circumcision of a young Muslim boy. If religious prejudice was involved, it was more likely anti-Islamic than anti-Semitic. More profoundly, the ruling actually emerged from Germany’s sincere efforts to learn the lessons of the Holocaust and vow “never again.” Strange as it may seem, the court based its decision on principles of individual freedom and physical integrity that make sense only as products of Germany’s particular reckoning with its past, from the Nuremberg trials to the present.
Just after the Second World War, Germany established a legacy important for future bioethical determinations. The 1947 Nuremberg Code, which was drafted while twenty-three Nazi doctors were being prosecuted for medical atrocities, mandated the full consent of individuals to be “absolutely essential” in medical practices. But this applied only to human research subjects in medical experiments; that bioethical limitations might extend to widely practiced communal or religious rites was not yet considered. Germany also enshrined “human dignity” in its Basic Law of 1949 as an inviolable first principle, reinforced by the right to the free development of one’s personality and physical integrity.
Thus when new biomedical technologies arrived in the 1980s and 1990s, German attitudes and policies toward them differed widely from those in the rest of the west. When the Council of Europe drew up a Convention on Human Rights and Biomedicine in 1997, which promised to protect the dignity and integrity of all individuals, Germany refused to sign on, finding its protections weak. Unlike many other western states at the millennium, Germany prohibited research on embryonic stem cells, making an exception in 2002 for lines imported from abroad. (That Israel was quick to send its embryos raised fewer questions than it might have.)
But the circumcision ruling became possible only because, in the past two decades, the importance of physical integrity to individual development became especially invested in the child and therefore limited what parents could do to their children’s bodies. Until last year, German parents were completely forbidden from diagnosing the genetic makeup of in vitro embryos. The ban wasn’t just about the fear of designer babies; selecting embryos so as to avoid diseases like Huntington’s was considered a violation of the future child’s right to free development. Today genetic diagnosis is permitted for the purpose of avoiding a “severe hereditary disease,” but the imperative to preserve children’s bodies remains. In the circumcision case, the court ruled that boys face similar threats to their free development when their parents want them circumcised.
In the court’s view, the child should decide whether to get circumcised, and whether to affiliate with Islam, after he reaches the age of consent. Germany’s postwar history by no means made such a ruling inevitable but does explain how it came to be. By calling the boy “unable to consent,” the court mechanically identified him with vulnerable populations in Germany’s history, including severely mentally disabled persons whom the Nazis would have euthanized. And it made circumcision irreconcilable with the principles of the Basic Law. According to the court, circumcision irreparably altered the boy’s body for medically unnecessary reasons, thus violating his right to control over his physical integrity. And it permanently marked him as a Muslim, violating his right to self-determination. At the heart of the ruling, then, is an idealized individualism, which imagines that children’s bodies should be preserved from any community intrusion—and implicitly assumes the adult will not feel alienated and hollow as a result.
Americans more readily accept the power of religion and community to shape the individual. But when it comes to economic issues, many of us imagine that the individual stands alone. Just as it is impossible to enter adulthood and choose one’s religion without cultural influence, so it is absurd to have a fully free choice about when one needs, say, modern health care. In the Obamacare ruling, a majority of the justices imagined that individuals could somehow separate their bodies from the health-care market until they freely chose to enter into it, even though emergency-room care is unforeseeable and society foots the bill. Both the German and American legal arguments suspend the individual body above its social world: the former from a long-practiced religious rite, the latter from the structure of medical care in America.
Germany’s circumcision ruling has now moved into politics. And politics is where this debate belongs. The German parliament recently passed a symbolic measure in favor of legalized circumcision and promised a binding resolution in the fall. Biomedical issues like this one, which are only increasing in importance and number, inspire a real clash of values about the relationship between body and society. It should be we as political communities—more than we as subjects of court rulings—who decide which values prevail....
Posted on: Friday, August 10, 2012 - 11:31
SOURCE: Special to HNN (8-9-12)
Michael Seeley is a senior at Augustana College in South Dakota and was co-recipient of a Peace Prize Forum scholarship in 2011.
Nearly two centuries ago, on October 23, 1812, France momentarily fell into the hands of a lunatic, General Claude François de Malet. Ironically, madman Malet’s plans for France were more attuned to the emerging economic realities of the day than were those of the emperor that he hoped to depose, Napoleon Bonaparte.
Having slipped from the unguarded window of his Parisian insane asylum in the dead of night, Malet hurried home, where his wife equipped him with his uniform and weapons. Once more wearing the gold braid of command and reunited with co-conspirators, he rushed to the nearest barracks of the National Guard and shook the local militia colonel awake. Thrusting forged papers into the colonel’s hand, Malet cried that Napoleon had been killed in Moscow and that the Senate had appointed him head of the new government. The colonel complied with his new leader’s request for soldiers and roused more than a thousand troops to assemble in the street. Malet immediately used his new army to arrest the men in charge of government during Napoleon’s absence in Russia; the madman planned to round up the remaining leaders, oust the Emperor, reinstitute republicanism, and save France from the costly and disastrous war against Russia.
Had the coup succeeded, France might have been thrust back onto a path of peace and perhaps even long-term political stability. Malet’s good intentions, however, fell victim to Malet’s madness. When the authenticity of his documents was questioned, Malet shot the Prefect of Paris in the face. The confidence of his soldiers in his leadership and sanity thus shaken, the coup quickly unraveled, and within days Malet had been arrested, tried, and shot by firing squad.
The coup against him thwarted before he even knew of the attempt, Napoleon pressed his ill-fated attack against Russia. Later analysts questioned Napoleon’s decision to expand his empire into the vast Asiatic steppes, but it is clear that the Emperor had little choice. Given Russia’s blatant disregard for France’s economic blockade of Britain, Napoleon needed to seal off Continental markets if he was to succeed in his attempt to strangle the British economy and war machine. Moreover, empires throughout history have grown by seizing resources from conquered peoples. Napoleon forged the French Empire from the same mold. French navies could never oust Britain from the high seas, but on the Continent, France’s skilled Corsican general seized numerous countries with relative ease and extracted sizeable reparation promises from them.
Excluding the seizure of moveable assets of indiscernible value, such as the vast number of art works taken from Italy to found the Louvre, French acquisitions from foreign nations between 1799 and 1814 have been estimated at 785 million francs. Between 1806 and 1809 alone, France forced Prussia and Austria to promise it 350 and 515 million francs, respectively — massive sums by any account. Napoleon, however, was able to collect less than half of the payments.
In debt from previous campaigns, cut off from overseas trade by the British blockade, and unable to raise additional taxes at home, Napoleonic France suffered from a sort of military “bubble”: it had to invade new territories in order to survive. In the long run, however, attempting to fund a government with military conquests did not pay; the conquer-and-collect strategy proved unsustainable. Modern economic growth was changing the old formula. By the early nineteenth century, wealth increasingly lay in human capital, in knowing how to add value in manufacturing and services, rather than in land, minerals, or precious metals and gems. The latter can be seized and exploited but the former dry up if abused. Unsurprisingly, economically liberal Britain and its allies prevailed.
Whether he fully understood the implications of his plan or not, Malet was on to something. A military bubble like that suffered by Napoleonic France is unlikely today because now conquerors pay the conquered rather than the reverse. The paradigm shifted decisively in the twentieth century. After World War I, the victorious Allies sought reparations from Germany but managed only to cripple the German economy and pave the way for fascism. Unwilling to repeat that mistake after World War II, the United States infused billions to restore the economies of Germany, Italy, and Japan lest they fall to communism.
The United States continues to pay its defeated enemies, including Iraq and Afghanistan. Instead of filling its war chest with the indemnities of the conquered foe, the United States has flipped the moneybox open to watch the contents spill out into oil rich Iraq. While figures about the total cost of the war vary, every account places it in the trillions of dollars. Despite huge government fiscal deficits and dire domestic economic problems, U.S. taxpayers have been improving Iraqi infrastructure, governance, health, and security systems. The new paradigm of paying conquered nations now appears as defunct as the old system of stealing from them, so the U.S. government’s strategies in Iraq seem as outdated as Napoleon’s in Russia. When and how an even newer paradigm will arise remains unclear.
Hopefully, it will not require a coup attempt by a madman.
Posted on: Thursday, August 9, 2012 - 14:27
SOURCE: Other Words (8-6-12)
Roger Peace, an adjunct professor of history at Tallahassee Community College, is the author of A Call to Conscience: The Anti-Contra War Campaign (University of Massachusetts Press, 2012). Distributed via OtherWords (OtherWords.org)
In mid-July, relatives of three U.S. citizens killed by drone strikes in Yemen filed a wrongful death lawsuit against top security officials. Their complaint charges that the lethal strikes "violated fundamental rights afforded to all U.S. citizens, including the right not to be deprived of life without due process of law."
This raises an important question: When one country polices the world, who polices the police?
In the aftermath of World War II, America's leaders forged an internationalist foreign policy in defiance of popular pressure to return to "isolationism," or conventional national defense. This internationalism had two tracks. One was rooted in U.S. expansionist and interventionist history and took the form of a "world policeman" role against communism. The other was based on the principle of collective security associated with the recently established United Nations.
The U.S. "world policeman" role — a euphemism for extended U.S. power and influence — became the policy of choice, hailed as "protecting the free world." The idea of working together with other nations to prevent war and solve common problems was relegated to areas where U.S. vital interests were not at stake.
U.S. interventions in Vietnam, Latin America, and elsewhere sparked public protests at home but didn't fundamentally shift U.S. policies abroad. Nor did the end of the Cold War. Indeed, only one month after the Berlin Wall fell, U.S. forces invaded Panama, killing thousands of civilians in an effort to apprehend leader Manuel Noriega, whom the United States accused of drug running. The U.N. General Assembly condemned the Panama invasion as "a flagrant violation of international law," but to no avail.
The 9/11 terrorist attacks engendered a new wave of regime-changing interventions. As part of an ill-defined "war on terrorism," the United States began incarcerating people without trial in Afghanistan and Iraq. Now the United States is employing armed drones to assassinate suspected "militants" in the tribal areas of Pakistan as well as in Yemen and Somalia.
Our "world policeman" role has entered a new and more dangerous phase. The Obama administration has asserted its right to secretly assassinate individuals anywhere, eviscerating national boundaries, individual rights, and democratic policymaking.
It's time to revive the idea of collective security. No country, including our own, must be allowed to operate as a law unto itself. Nor is it in the interest of U.S. citizens to have other nations follow our example when it comes to international drone strikes. We need to establish international prohibitions on armed drones as well as cyber warfare and extraordinary renditions.
We also need a permanent U.N. security force to protect civilians from unruly forces in various parts of the world. Gaining international agreement for this will undoubtedly be difficult, but the process can be pushed forward on a country-by-country basis.
Garnering domestic support for this collective security agenda may be even more difficult. Yet economic pressures are pushing the United States to downsize its global role, and collective security offers the best alternative.
Posted on: Thursday, August 9, 2012 - 13:38
SOURCE: TomDispatch (8-9-12)
Nick Turse is the associate editor of TomDispatch.com. An award-winning journalist, he has written for the Los Angeles Times, the Nation, and regularly for TomDispatch. He is the author/editor of several books, including the recently published Terminator Planet: The First History of Drone Warfare, 2001-2050 (with Tom Engelhardt).
In the 1980s, the U.S. government began funneling aid to mujahedeen rebels in Afghanistan as part of an American proxy war against the Soviet Union. It was, in the minds of America’s Cold War leaders, a rare chance to bloody the Soviets, to give them a taste of the sort of defeat the Vietnamese, with Soviet help, had inflicted on Washington the decade before. In 1989, after years of bloody combat, the Red Army did indeed limp out of Afghanistan in defeat. Since late 2001, the United States has been fighting its former Afghan proxies and their progeny. Now, after years of bloody combat, it’s the U.S. that’s looking to withdraw the bulk of its forces and once again employ proxies to secure its interests there.
From Asia and Africa to the Middle East and the Americas, the Obama administration is increasingly embracing a multifaceted, light-footprint brand of warfare. Gone, for the moment at least, are the days of full-scale invasions of the Eurasian mainland. Instead, Washington is now planning to rely ever more heavily on drones and special operations forces to fight scattered global enemies on the cheap. A centerpiece of this new American way of war is the outsourcing of fighting duties to local proxies around the world.
While the United States is currently engaged in just one outright proxy war, backing a multi-nation African force to battle Islamist militants in Somalia, it’s laying the groundwork for the extensive use of surrogate forces in the future, training “native” troops to carry out missions -- up to and including outright warfare. With this in mind and under the auspices of the Pentagon and the State Department, U.S. military personnel now take part in near-constant joint exercises and training missions around the world aimed at fostering alliances, building coalitions, and whipping surrogate forces into shape to support U.S. national security objectives.
While using slightly different methods in different regions, the basic strategy is a global one in which the U.S. will train, equip, and advise indigenous forces -- generally from poor, underdeveloped nations -- to do the fighting (and dying) it doesn’t want to do. In the process, as small an American force as possible, including special forces operatives and air support, will be brought to bear to aid those surrogates. Like drones, proxy warfare appears to offer an easy solution to complex problems. But as Washington’s 30-year debacle in Afghanistan indicates, the ultimate costs may prove both unimaginable and unimaginably high.
Start with Afghanistan itself. For more than a decade, the U.S. and its coalition partners have been training Afghan security forces in the hopes that they would take over the war there, defending U.S. and allied interests as the American-led international force draws down. Yet despite an expenditure of almost $50 billion on bringing it up to speed, the Afghan National Army and other security forces have drastically underperformed any and all expectations, year after year.
One track of the U.S. plan has been a little-talked-about proxy army run by the CIA. For years, the Agency has trained and employed six clandestine militias that operate near the cities of Kandahar, Kabul, and Jalalabad as well as in Khost, Kunar, and Paktika provinces. Working with U.S. Special Forces and controlled by Americans, these “Counterterror Pursuit Teams” evidently operate free of any Afghan governmental supervision and have reportedly carried out cross-border raids into Pakistan, offering their American patrons a classic benefit of proxy warfare: plausible deniability.
This clandestine effort has also been supplemented by the creation of a massive, conventional indigenous security force. While officially under Afghan government control, these military and police forces are almost entirely dependent on the financial support of the U.S. and allied governments for their continued existence.
Today, the Afghan National Security Forces officially number more than 343,000, but only 7% of army units and 9% of police units are rated at the highest level of effectiveness. Meanwhile, even after more than a decade of large-scale Western aid, 95% of recruits are still functionally illiterate.
Not surprisingly, this massive force, trained by high-priced private contractors, Western European militaries, and the United States, and backed by U.S. and coalition forces and their advanced weapons systems, has been unable to stamp out a lightly-armed, modest-sized, less-than-popular, rag-tag insurgency. One of the few tasks this proxy force seems skilled at is shooting American and allied forces, quite often their own trainers, in increasingly common "green-on-blue" attacks.
Adding insult to injury, this poor-performing, coalition-killing force is expensive. Bought and paid for by the United States and its coalition partners, it costs between $10 billion and $12 billion each year to sustain in a country whose gross domestic product is just $18 billion. Over the long term, such a situation is untenable.
Back to the Future
Utilizing foreign surrogates is nothing new. Since ancient times, empires and nation-states have employed foreign troops and indigenous forces to wage war or have backed them when it suited their policy aims. By the nineteenth and twentieth centuries, the tactic had become de rigueur for colonial powers like the French who employed Senegalese, Moroccans, and other African forces in Indochina and elsewhere, and the British who regularly used Nepalese Gurkhas to wage counterinsurgencies in places ranging from Iraq and Malaya to Borneo.
By the time the United States began backing the mujahedeen in Afghanistan, it already had significant experience with proxy warfare and its perils. After World War II, the U.S. eagerly embraced foreign surrogates, generally in poor and underdeveloped countries, in the name of the Cold War. These efforts included the attempt to overthrow Fidel Castro via a proxy Cuban force that crashed and burned at the Bay of Pigs; the building of a Hmong army in Laos which ultimately lost to Communist forces there; and the bankrolling of a French war in Vietnam that failed in 1954 and then the creation of a massive army in South Vietnam that crumbled in 1975, to name just a few unsuccessful efforts.
A more recent proxy failure occurred in Iraq. For years after the 2003 invasion, American policy-makers uttered a standard mantra: “As Iraqis stand up, we will stand down.” Last year, those Iraqis basically walked off.
Between 2003 and 2011, the United States pumped tens of billions of dollars into “reconstructing” the country with around $20 billion of it going to build the Iraqi security forces. This mega-force of hundreds of thousands of soldiers and police was created from scratch to prop up the successors to the government that the United States overthrew. It was trained by and fought with the Americans and their coalition partners, but that all came to an end in December 2011.
Despite Obama administration efforts to base thousands or tens of thousands of troops in Iraq for years to come, the Iraqi government spurned Washington’s overtures and sent the U.S. military packing. Today, the Iraqi government supports the Assad regime in Syria, and has a warm and increasingly close relationship with long-time U.S. enemy Iran. According to Iran's semiofficial Fars News Agency, the two countries have even discussed expanding their military ties.
African Shadow Wars
Despite a history of sinking billions into proxy armies that collapsed, walked away, or morphed into enemies, Washington is currently pursuing plans for proxy warfare across the globe, perhaps nowhere more aggressively than in Africa.
Under President Obama, operations in Africa have accelerated far beyond the more limited interventions of the Bush years. These include last year’s war in Libya; the expansion of a growing network of supply depots, small camps, and airfields; a regional drone campaign with missions run out of Djibouti, Ethiopia, and the Indian Ocean archipelago nation of Seychelles; a flotilla of 30 ships in that ocean supporting regional operations; a massive influx of cash for counterterrorism operations across East Africa; a possible old-fashioned air war, carried out on the sly in the region using manned aircraft; and a special ops expeditionary force (bolstered by State Department experts) dispatched to help capture or kill Lord’s Resistance Army (LRA) leader Joseph Kony and his senior commanders. (This mission against Kony is seen by some experts as a cover for a developing proxy war between the U.S. and the Islamist government of Sudan -- which is accused of helping to support the LRA -- and Islamists more generally.) And this only begins to scratch the surface of Washington’s fast-expanding plans and activities in the region.
In Somalia, Washington has already involved itself in a multi-pronged military and CIA campaign against Islamist al-Shabaab militants that includes intelligence operations, training for Somali agents, a secret prison, helicopter attacks, and commando raids. Now, it is also backing a classic proxy war using African surrogates. The United States has become, as the Los Angeles Times put it recently, “the driving force behind the fighting in Somalia,” as it trains and equips African foot soldiers to battle Shabaab militants, so U.S. forces won’t have to. In a country where more than 90 Americans were killed or wounded in a 1993 debacle now known by the shorthand “Black Hawk Down,” today’s fighting and dying has been outsourced to African soldiers.
Earlier this year, for example, elite Force Recon Marines from the Special Purpose Marine Air Ground Task Force 12 (or, as a mouthful of an acronym, SPMAGTF-12) trained soldiers from the Uganda People's Defense Force. It, in turn, supplies the majority of the troops to the African Union Mission in Somalia (AMISOM) currently protecting the U.S.-supported government in that country’s capital, Mogadishu.
This spring, Marines from SPMAGTF-12 also trained soldiers from the Burundi National Defense Force (BNDF), the second-largest contingent in Somalia. In April and May, members of Task Force Raptor, 3rd Squadron, 124th Cavalry Regiment of the Texas National Guard, took part in a separate training mission with the BNDF in Mudubugu, Burundi. SPMAGTF-12 has also sent its trainers to Djibouti, another nation involved in the Somali mission, to work with an elite army unit there.
At the same time, U.S. Army troops have taken part in training members of Sierra Leone’s military in preparation for their deployment to Somalia later this year. In June, U.S. Army Africa commander Major General David Hogg spoke encouragingly of the future of Sierra Leone’s forces in conjunction with another U.S. ally, Kenya, which invaded Somalia last fall (and just recently joined the African Union mission there). “You will join the Kenyan forces in southern Somalia to continue to push al Shabaab and other miscreants from Somalia so it can be free of tyranny and terrorism and all the evil that comes with it,” he said. “We know that you are ready and trained. You will be equipped and you will accomplish this mission with honor and dignity.”
Readying allied militaries for deployment to Somalia is, however, just a fraction of the story when it comes to training indigenous forces in Africa. This year, for example, Marines traveled to Liberia to focus on teaching riot-control techniques to that country’s military as part of what is otherwise a State Department-directed effort to rebuild its security forces.
In fact, Colonel Tom Davis of U.S. Africa Command (AFRICOM) recently told TomDispatch that his command has held or has planned 14 major joint training exercises for 2012 and a similar number are scheduled for 2013. This year’s efforts include operations in Morocco, Cameroon, Gabon, Botswana, South Africa, Lesotho, Senegal, and Nigeria, including, for example, Western Accord 2012, a multilateral exercise involving the armed forces of Senegal, Burkina Faso, Guinea, Gambia, and France.
Even this, however, doesn’t encompass the full breadth of U.S. training and advising missions in Africa. “We… conduct some type of military training or military-to-military engagement or activity with nearly every country on the African continent,” wrote Davis.
Our American Proxies
Africa may, at present, be the prime location for the development of proxy warfare, American-style, but it’s hardly the only locale where the United States is training indigenous forces to aid U.S. foreign policy aims. This year, the Pentagon has also ramped up operations in Central and South America as well as the Caribbean.
In Honduras, for example, small teams of U.S. troops are working with local forces to escalate the drug war there. Working out of Forward Operating Base Mocoron and other remote camps, the U.S. military is supporting Honduran operations by way of the methods it honed in Iraq and Afghanistan. U.S. forces have also taken part in joint operations with Honduran troops as part of a training mission dubbed Beyond the Horizon 2012, while Green Berets have been assisting Honduran Special Operations forces in anti-smuggling operations. Additionally, an increasingly militarized Drug Enforcement Administration sent a Foreign-deployed Advisory Support Team, originally created to disrupt the poppy trade in Afghanistan, to aid Honduras’s Tactical Response Team, that country’s elite counternarcotics unit.
The militarization and foreign deployment of U.S. law enforcement operatives was also evident in Tradewinds 2012, a training exercise held in Barbados in June. There, members of the U.S. military and civilian law enforcement agencies joined with counterparts from Antigua and Barbuda, Bahamas, Barbados, Belize, Canada, Dominica, the Dominican Republic, Grenada, Guyana, Haiti, Jamaica, St. Kitts and Nevis, St. Lucia, St. Vincent and the Grenadines, and Suriname, as well as Trinidad and Tobago, to improve cooperation for “complex multinational security operations.”
Far less visible have been training efforts by U.S. Special Operations Forces in Guyana, Uruguay, and Paraguay. In June, special ops troops also took part in Fuerzas Comando, an eight-day “competition” in which the elite forces from 21 countries, including the Bahamas, Belize, Brazil, Canada, Chile, Colombia, Costa Rica, the Dominican Republic, Ecuador, El Salvador, Guatemala, Guyana, Honduras, Jamaica, Mexico, Panama, Paraguay, Peru, Trinidad and Tobago, and Uruguay, faced off in tests of physical fitness, marksmanship, and tactical capabilities.
This year, the U.S. military has also conducted training exercises in Guatemala, sponsored “partnership-building” missions in the Dominican Republic, El Salvador, Peru, and Panama, and reached an agreement to carry out 19 “activities” with the Colombian army over the next year, including joint military exercises.
The Proxy Pivot
Coverage of the Obama administration’s much-publicized strategic “pivot” to Asia has focused on the creation of yet more bases and new naval deployments to the region. The military (which has dropped the word “pivot” in favor of “rebalancing”) is, however, also planning and carrying out numerous exercises and training missions with regional allies. In fact, the Navy and Marines alone already reportedly engage in more than 170 bilateral and multilateral exercises with Asia-Pacific nations each year.
One of the largest of these efforts took place in and around the Hawaiian Islands from late June through early August. Dubbed RIMPAC 2012, the exercise brought together more than 40 ships and submarines, more than 200 aircraft, and 25,000 personnel from 22 nations, including Australia, India, Indonesia, Japan, Malaysia, New Zealand, Philippines, Singapore, South Korea, Thailand, and Tonga.
Almost 7,000 American troops also joined around 3,400 Thai forces, as well as military personnel from Indonesia, Japan, Malaysia, Singapore, and South Korea as part of Cobra Gold 2012. In addition, U.S. Marines took part in Hamel 2012, a multinational training exercise involving members of the Australian and New Zealand militaries, while other American troops joined the Armed Forces of the Philippines for Exercise Balikatan.
The effects of the “pivot” are also evident in the fact that once neutralist India now holds more than 50 military exercises with the United States each year -- more than any other country in the world. “Our partnership with India is a key part of our rebalance to the Asia-Pacific and, we believe, to the broader security and prosperity of the 21st century,” said Deputy Secretary of Defense Ashton Carter on a recent trip to the subcontinent. Just how broad that partnership is can be seen in India’s participation in America’s proxy effort in Somalia. In recent years, the Indian Navy has emerged as an “important contributor” to the international counter-piracy effort off that African country’s coast, according to Andrew Shapiro of the State Department’s Bureau of Political-Military Affairs.
Peace by Proxy
India’s neighbor Bangladesh offers a further window into U.S. efforts to build proxy forces to serve American interests.
Earlier this year, U.S. and Bangladeshi forces took part in an exercise focused on logistics, planning, and tactical training, codenamed Shanti Doot-3. The mission was notable in that it was part of a State Department program, supported and executed by the Pentagon, known as the Global Peace Operations Initiative (GPOI).
First implemented under George W. Bush, GPOI provides cash-strapped nations funds, equipment, logistical assistance and training to enable their militaries to become “peacekeepers” around the world. Under Bush, from the time the program was established in 2004 through 2008, more than $374 million was spent to train and equip foreign troops. Under President Obama, Congress has funded the program to the tune of $393 million, according to figures provided to TomDispatch by the State Department.
In a speech earlier this year, the State Department’s Andrew Shapiro told a Washington, D.C., audience that “GPOI is particularly focusing a great deal of its efforts to support the training and equipping of peacekeepers deploying to... Somalia” and had provided “tens of millions of dollars worth of equipment for countries deploying [there].” In a blog post he went into more detail, lauding U.S. efforts to train Djiboutian troops to serve as peacekeepers in Somalia and noting that the U.S. had also provided impoverished Djibouti with radar equipment and patrol boats for offshore activities. “Djibouti is also central to our efforts to combat piracy,” he wrote, “as it is on the front line of maritime threats including piracy in the Gulf of Aden and surrounding waters.”
Djibouti and Bangladesh are hardly unique. Under the auspices of the Global Peace Operations Initiative, the U.S. has partnered with 62 nations around the globe, according to statistics provided by the State Department. These proxies-in-training are, not surprisingly, some of the poorest nations in their respective regions, if not the entire planet. They include Benin, Ethiopia, Malawi, and Togo in Africa, Nepal and Pakistan in Asia, and Guatemala and Nicaragua in the Americas.
The Changing Face of Empire
With ongoing military operations in Asia, Africa, the Middle East and Latin America, the Obama administration has embraced a six-point program for light-footprint warfare relying heavily on special operations forces, drones, spies, civilian partners, cyber warfare, and proxy fighters. Of all the facets of this new way of war, the training and employment of proxies has generally been the least noticed, even though reliance on foreign forces is considered one of its prime selling points. As the State Department’s Andrew Shapiro put it in a speech earlier this year: “[T]he importance of these missions to the security of the United States is often little appreciated… To put it clearly: When these peacekeepers deploy it means that U.S. forces are less likely to be called on to intervene.” In other words, to put it even more clearly, more dead locals, fewer dead Americans.
The evidence for this conventional wisdom, however, is lacking. And failures to learn from history in this regard have been ruinous. The training, advising, and outfitting of a proxy force in Vietnam drew the United States deeper and deeper into that doomed conflict, leading to tens of thousands of dead Americans and millions of dead Vietnamese. Support for Afghan proxies during their decade-long battle against the Soviet Union led directly to the current disastrous decade-plus American War in Afghanistan.
Right now, the U.S. is once again training, advising, and conducting joint exercises all over the world with proxy war on its mind and the concept of “unintended consequences” nowhere in sight in Washington. Whether today’s proxies end up working for or against Washington’s interests or even become tomorrow’s enemies remains to be seen. But with so much training going on in so many destabilized regions, and so many proxy forces being armed in so many places, the chances of blowback grow greater by the day.
Posted on: Thursday, August 9, 2012 - 13:13
SOURCE: Bloomberg News (8-8-12)
Colin Read is chairman of the finance department at the State University of New York, Plattsburgh. He is the author of the “Great Minds in Finance” series and other finance titles published by Palgrave Macmillan.
When machines replace seasoned traders and market makers, mistakes can occur at dizzying speed.
It happened with the notorious “flash crash” on May 6, 2010, and again on Aug. 1 this year, when software at Knight Capital Group Inc. (KCG) malfunctioned, triggering unintended trades and leading to a $440 million loss for the company.
Ironically, Knight Capital Group was originally known as a market maker, with trading specialists who oversaw trades on each side of a security to ensure the market functioned in an orderly and efficient manner. The company’s troubles once again show the extent to which Wall Street now relies on algorithmic programs to execute its trades, for better or worse.
Although many in the financial world have expressed concerns that a new era of automated finance is destabilizing the markets, algorithmic trading isn’t new -- it’s almost as old as computers themselves....
Posted on: Thursday, August 9, 2012 - 12:52
SOURCE: Huffington Post (8-8-12)
Harvey J. Kaye is the Ben and Joyce Rosenberg Professor of Democracy and Justice Studies and Director of the Center for History and Social Change at the University of Wisconsin-Green Bay.
Republicans are masters in the art of denying or suppressing the truth. And yet every so often one of them blurts it out -- often in a perverse fashion, but dangerously so, nonetheless. Last week, GOP Congressman Darrell Issa of California, apparently incapable of holding it in, did just that.
According to the Federal Times, Issa told a conference of the Association of Government Accountants in San Diego that:
"The Greatest Generation created many of what the private sector would call Ponzi schemes. They created Social Security, they created Medicare on their watch, [they] created Medicaid ... All of that without resources or funding." Indeed, he continued, "A generation that was doing many things right -- coming out of World War II -- also planted the seeds for all the problems we have today."
You can just imagine the reaction of his Republican colleagues. They surely can't defend him on it. Ever since Reagan spoke at Normandy, France, on the occasion of the 40th anniversary of the D-Day landings in June 1984, they have been trying their best to take advantage of Americans' phenomenal admiration and affection for the World War II veterans -- apparently hoping to wrap themselves in the "flags of our fathers," not to mention praying that Americans do not remember that their party opposed America's entry into the Second World War.
And yet now look at what their right wing comrade Issa has done!
"Didn't you get the memo, Darrell?" the GOPers must be asking, "You know, the one that said 'Go after President Franklin Roosevelt as much as you can: Accuse him of hijacking the Constitution, planting the seeds of socialism, and setting up the country for eventual failure. But whatever you do, don't say anything bad about the Greatest Generation -- and for that matter, definitely do not say anything that might serve to remind Americans that FDR and his fellow citizens not only succeeded in defending the United States against Germany and Japan in the 1940s, but also in rescuing it from the Great Depression in the 1930s.'"
In his defense, the congressman did refer to Social Security, and the generation's later Medicare and Medicaid initiatives, as "Ponzi schemes." Still, one can practically hear the Tea Party's admonitions: "Darrell, what are you doing? You are putting our whole historical project at risk. Indict FDR, not our parents and grandparents! Hey, don't forget, loose lips sink ships! Talk like that and the next thing you know Americans might start remembering what made the Greatest Generation and its greatest leader great..."
They might start remembering how a generation subjected big business to public account and regulation, empowered government to address the needs of working people, organized labor unions, fought for their rights, enlarged the 'We' in 'We the People,' established a social security system, expanded the nation's public infrastructure, improved the environment, and imbued themselves with fresh democratic convictions, hopes, and aspirations -- all of which gave them the courage, confidence, and wherewithal to fight and win a global war against fascism and imperialism...
They might start remembering how a president and people saved the nation from economic ruin and political oblivion and turned it into the strongest and most prosperous country on earth, making it freer, more equal, and more democratic than ever before in the process.
Hell, Darrell, before you know it Americans might start remembering how progressive their parents and grandparents were and want to start emulating them!
Posted on: Thursday, August 9, 2012 - 12:46
SOURCE: WSJ (8-7-12)
Mr. Roberts, a historian, is author most recently of The Storm of War: A New History of the Second World War (Harper, 2011).
What is it about the month of August? Why should we still persist in regarding it as a quiet time—with Congress in recess, business slowed down, and people on holiday—when so many world-historical events take place in this month? You can ignore the Ides of March, but history shows that it's in the dog days of August that great events take place.
Ever since the Roman Senate proclaimed in 8 B.C. that the eighth month of every year be named after the Emperor Augustus (63 B.C.-A.D. 14), great-nephew and adopted heir of Julius Caesar, the month has seen a disproportionate share of cataclysmic events.
In the last century alone, World War I broke out on Aug. 4, 1914, and Adolf Hitler ordered the invasion of Poland on the night of Aug. 31, 1939. World War II was only won with the dropping of two atomic bombs in August 1945. Other 20th-century conflicts sparked in August: the Vietnam War, with the Gulf of Tonkin incident of August 1964; the Soviet invasion of Czechoslovakia in 1968; and the Gulf War, when Saddam Hussein invaded Kuwait in August 1990...
Posted on: Wednesday, August 8, 2012 - 13:25
SOURCE: Newsweek (8-6-12)
Niall Ferguson is a professor of history at Harvard University. He is also a senior research fellow at Jesus College, Oxford University, and a senior fellow at the Hoover Institution, Stanford University. His latest book, Civilization: The West and the Rest, has just been published by Penguin Press.
The British—slightly less than a thousand of them—used to govern India. Without air-conditioning.
Conan O’Brien was not the only one who watched the London Olympic opening ceremonies with amazement. "Hard to believe my ancestors were conquered by theirs," he tweeted. Every Indian watching must have been thinking the very same.
Until their TVs went dark.
The recent power outage in India interested me more than the Olympics. (I had a very British reaction to the opening ceremonies: I found them excruciatingly embarrassing.) The Indian blackout was surely the biggest electricity failure in history, affecting a staggering 640 million people. If you have ever visited Delhi in the summer, you will have some idea what it must have felt like.
"Every door and window was shut," Rudyard Kipling recalled of summer in the scorched Indian plains, "for the outside air was that of an oven. The atmosphere within was only 104 degrees, as the thermometer bore witness, and heavy with the foul smell of badly-trimmed kerosene lamps; and this stench, combined with that of native tobacco, baked brick, and dried earth, sends the heart of many a strong man down to his boots, for it is the smell of the Great Indian Empire when she turns herself for six months into a house of torment."
There was a reason the British moved their capital to the cool Himalayan hill station of Simla every summer. Maybe today’s Indian government should consider following their example...
Posted on: Tuesday, August 7, 2012 - 13:51
SOURCE: Philadelphia Inquirer (8-7-12)
Jonathan Zimmerman teaches history at New York University and lives in Narberth. He is the author of "Small Wonder: The Little Red Schoolhouse in History and Memory" (Yale University Press).
Once upon a time, American elections were rife with corruption. Party bosses bought votes with strong drink and cold cash, or stuffed ballot boxes with bogus names. Then along came the good-government reformers who cleaned up our democracy with new election laws and regulations.
That's the story we all learned in high school. And it's true, up to a point. But it leaves out a crucial fact: Those measures also sought to bar certain people from the polls. The goal of election reform wasn't simply a clean vote; it was also to keep out the "wrong" kind of voter.
Pennsylvania's voter-ID law, which is being challenged in state court, follows this pattern. On its face, it seems neutral and unimpeachable: Who could object to safeguards against fraud? But in practice, as opponents told the court last week, it would make it harder for poor people and minorities to vote....
Posted on: Tuesday, August 7, 2012 - 11:10
SOURCE: The Root (7-31-12)
Linda Heywood, Ph.D., and John Thornton, Ph.D., are professors of history and African-American studies at Boston University.
(The Root) -- Monday's New York Times article on President Obama's roots in Southern slavery through his mother has reopened the contention that the first Africans brought to Virginia were indentured servants and not slaves. While some observers, such as writer Alondra Nelson, may contend that genealogy studies prove little beyond how closely all members of the human family are related, they are invaluable for understanding the greater past.
Yesterday's news was also about real historical events and the ability to bring the past alive. There was a real John Punch, real laws that defined his status in a racializing America and real descendants who made certain decisions in the evolving marketplace of American race relations. These past decisions have major implications about the way that contemporary Americans view themselves and fellow Americans.
That being said, the issue is a complicated one. As professors of history and African-American studies at Boston University, we have been unraveling the story of the first African arrivals in Virginia over the past decade, and despite suggestions to the contrary in the New York Times article, we can assert that Africans were not indentured servants as Europeans were....
Posted on: Monday, August 6, 2012 - 12:49
SOURCE: NYT (8-1-12)
Jared Diamond, a professor of geography at the University of California, Los Angeles, is the author of the forthcoming book “The World Until Yesterday: What Can We Learn From Traditional Societies?”
MITT ROMNEY’S latest controversial remark, about the role of culture in explaining why some countries are rich and powerful while others are poor and weak, has attracted much comment. I was especially interested in his remark because he misrepresented my views and, in contrasting them with another scholar’s arguments, oversimplified the issue.
It is not true that my book “Guns, Germs and Steel,” as Mr. Romney described it in a speech in Jerusalem, “basically says the physical characteristics of the land account for the differences in the success of the people that live there. There is iron ore on the land and so forth.”
That is so different from what my book actually says that I have to doubt whether Mr. Romney read it. My focus was mostly on biological features, like plant and animal species, and among physical characteristics, the ones I mentioned were continents’ sizes and shapes and relative isolation. I said nothing about iron ore, which is so widespread that its distribution has had little effect on the different successes of different peoples. (As I learned this week, Mr. Romney also mischaracterized my book in his memoir, “No Apology: Believe in America.”)
That’s not the worst part. Even scholars who emphasize social rather than geographic explanations — like the Harvard economist David S. Landes, whose book “The Wealth and Poverty of Nations” was mentioned favorably by Mr. Romney — would find Mr. Romney’s statement that “culture makes all the difference” dangerously out of date. In fact, Mr. Landes analyzed multiple factors (including climate) in explaining why the industrial revolution first occurred in Europe and not elsewhere....
Posted on: Friday, August 3, 2012 - 23:20
SOURCE: Open Democracy (8-2-12)
For over thirty years, the American media have repeatedly pronounced the death of the women’s movement and blamed feminism for women’s failure to “have it all.” But none of this is true. The movement has spread around the globe, and early radical feminists wanted to change the world, not just seek individual self-fulfillment.
The latest media-generated debate exploded when Anne-Marie Slaughter revealed in the July 2012 edition of the Atlantic why she had left her fast-track, high-pressure job working for Hillary Clinton at the State Department. Families, she admitted, could not withstand the strain. Even a superwoman like herself, blessed with a helpful husband and enough wealth to buy domestic help and child care, could not do it all. Although she described the insane work policies that made her neglect her family, she implicitly blamed feminism for promising a false dream. It was too hard, the hours too long, the persistent sense of guilt too pervasive.
What was missing from her article was the history of “having it all.” Too many editors care more about how reliably an article on the death of feminism will create a sensation and increase readership than about perpetuating an inaccurate media trope.
And her article went viral, as they say, setting off a round of attacks and rebuttals about the possibility of women enjoying, not just enduring, family and work. She returned to her former life as a high-powered professor at Princeton University, which, in my experience, hardly counts as opting out of trying to have it all.
To Slaughter, I want to say, you may know a great deal about foreign policy, but you certainly don’t know much about our history. By 1965, young American women activists in Students for a Democratic Society asked themselves what would happen to America’s children if women worked outside the home. The fact is, activists in the women’s movement knew women could never have it all, unless they were able to change the society in which they lived.
At the August 1970 Women’s Strike for Equality march, the three preconditions for emancipation were child care, legal abortion, and equal pay. “There are no individual solutions,” feminists chanted in the late sixties. If feminism were to succeed as a radical vision, the movement had to advance the interests of all women.
The belief that you could become a superwoman became a journalistic trope in the 1970s and has never vanished. By 1980, most women’s self-help magazines had turned the feminist into a Superwoman, hair flying as she rushed around, attaché case in one arm, a baby in the other. The Superwoman could have it all, but only if she did it all. And that was exactly what feminists had not wanted.
American social movements tend to move from a collectivist vision to one that emphasizes the success of the individual. That is precisely what happened between 1970 and 1980. Alongside the original women’s movement grew another kind of feminism, one shaped by the media, consumerism, and the therapeutic self-help movements that sprang up in that decade. Among the many books that began promising such fulfillment for women was the best seller “Having It All” (1982), in which Helen Gurley Brown tried to teach every woman how to achieve everything she wanted in life.
Self-help magazines and lifestyle sections of newspapers also began to teach women how to have it all. Both turned a collectivist vision of feminism into what I have elsewhere called Consumer Feminism and Therapeutic Feminism. Millions of women first heard of the movement when they read about the different clothes they needed to buy in order to look like a superwoman and the therapy they needed to become a confident and competent superwoman. Self-help books and magazines ignored the economic and social conditions women faced and instead emphasized the way in which each individual woman, if only she thought positively about herself, could achieve self-realization and emancipation.
By 1980, the idea of improving all women’s lives—sisterhood—had been transformed into creating individual superwomen. Early activists like myself bristled at the idea that feminism was about individual transformation. But no matter how many articles feminists wrote, they couldn’t compete with all the books and magazines that taught women how to become assertive, well-dressed, independent women—as long as they had the wealth to hire domestic help and child care to assist their ascent into a man’s world.
" The all-around Supermom rises, dresses in her vivid pants suit, oversees breakfast and then searches for the sneakers and then goes off to her glamorous high-paying job at an advertisement agency where she seeks Personal Fulfillment and kids’ college tuition. She has, of course, previously found a Mary Poppins figure to take care of the kids after school. Mary Poppins takes care of them as if they were her own, works for a mere pittance and is utterly reliable.
Supermom II comes home from work at 5:30, just as fresh as a daisy, and then spends a truly creative hour with her children. After all, it’s not the quantity of the time, but the quality. She catches up on their day, soothes their disputes and helps with their homework, while creating something imaginative in her Cuisinart (with her left hand tied behind her back). After dinner—during which she teaches them about the checks and balances of our system of government--she bathes and reads to them, and puts the clothes in the dryer. She then turns to her husband and eagerly suggests that they explore some vaguely kind of kinky sexual fantasy.”
The feminist, as remade by the media and popular culture, emerged as a superwoman who then turned into a scapegoat for the nation’s consumerism, the decline of families, and the country’s therapeutic culture. For this, the women’s movement was blamed, even though this selfish superwoman who neglected her family seemed bizarre, not to say repellent, to most of the early activists.
The backlash against feminism, directed as it was against the women’s movement, reflected a moral revulsion against the shallow self-absorption of America’s consumer and therapeutic culture. And when Americans took a good hard look at this narcissistic superwoman who embraced the values of the dominant culture, they grew anxious and frightened. They no longer saw loyal mothers and wives who would care for their communities, but a dangerous individual, unplugged from home and hearth: the female version of America’s ambitious but lonely organization man. Thus were born the culture wars between stay-at-home moms and career women.
Anne-Marie Slaughter’s article, like most complaints about how hard it is to “have it all,” focused on an elite group of female professionals who have the means to outsource parts of their jobs as mother, cook, cleaner, and caretaker of the home. What she and others have failed to understand is that the original women’s movement sought an economic and social revolution that would create equality at home and in the workplace. Nor have most critics of feminism understood that the so-called “Mommy Wars”—battles fought between women who worked outside the home and those who stayed home with their children—have also been fueled by the media.
Missing from the media’s coverage of these Mommy Wars are the millions of working mothers who will never have it all, but still must do it all. Millions of women cannot afford care for the children they have, work dead-end jobs, and cannot begin to imagine living the life of a superwoman. These are the women whom the radical women’s liberation movement addressed and for whom it sought decent jobs, sustainable wages, government training, social services, and child care. These are the women who are stuck on the sticky floor, not held back by a glass ceiling.
Posted on: Friday, August 3, 2012 - 13:42