Roundup: Historian's Take
This is where we place excerpts by historians writing about the news. On occasion this page also includes political scientists, economists, and law professors who write about history. We may from time to time even include English profs.
SOURCE: Informed Comment (Blog) (2-21-13)
Juan Cole is the Richard P. Mitchell Professor of History and the director of the Center for South Asian Studies at the University of Michigan. His latest book, "Engaging the Muslim World," is just out in a revised paperback edition from Palgrave Macmillan.
The resignation of Prime Minister Hamadi Jebali in Tunisia has created a political crisis that the elected government will have to deal with. Jebali is a politician of the Muslim religious right, from the Ennahda Party, and had led an Ennahda-dominated cabinet in coalition with two smaller secular parties, Moncef Marzouki’s social democratic Congress for the Republic and another small partner.
The Jebali government was shaken by the assassination of secular opposition figure Chokri Belaid, a severe critic of the religious right. Many secular Tunisians openly accused Ennahda of the act (though without proof), withdrawing from that party the public confidence Jebali needed to govern. He therefore sought a shake-up of his cabinet, installing non-party technocrats to produce a government of national unity. The Ennahda Party parliamentarians, however, rejected that step. They hold the largest bloc of members of parliament, around 40%, but not a majority.
When Jebali found his proposal blocked, he stepped down Tuesday night. In essence, he treated his party’s rejection of his plan as a vote of no confidence. In parliamentary systems, prime ministers have to step down all the time when they lose a vote of no confidence. I see Jebali’s move as positive. He or someone else, nominated for the task by elected president Marzouki, will now have to try to form a new government.
Actually Jebali is not the first post-revolution prime minister to step down, and while the political crisis is regrettable (and especially the assassination that caused it), the political process is not. Tunisia was ruled by strongmen for most of its post-independence history, but now has leaders who need the support of parliament and of the people. As we see in Belgium or Italy, getting such support is not always straightforward. But that’s politics, and politics of a parliamentary sort are good, and much better than corrupt, oppressive, inflexible strong men.
SOURCE: National Review (2-20-13)
Conrad Black is the author of Franklin Delano Roosevelt: Champion of Freedom, Richard M. Nixon: A Life in Full, and the recently published A Matter of Principle. He can be reached at email@example.com.
What a pleasure it is, in the wake of Presidents’ Day, to write something upbeat about President Obama. As disappointing as I found his second inaugural address, I am pleased to disagree with commentators from right to left who found fault with his State of the Union message last week. I thought it was pretty good. For the first time in my observations, he actually seemed to take the deficit seriously, though he could not avoid recourse to refuge in his old mouse-hole of the serried ranks of unnamed economists “who say we need $4 trillion in deficit reduction to stabilize our finances.” I don’t believe there are such economists; they are the flip side of the “economic royalists, monopolists, and war profiteers” that FDR used to rail against in the Thirties, to the delight of his followers. (There were no war profiteers or “malefactors of great wealth” in the U.S. in the Thirties. But it was a convenient cul-de-sac into which he could sweep public anger, where it could harmlessly dissipate itself — though it still rankles with the contemporary nincompoop Right, which holds that FDR didn’t alleviate the Depression but won four straight terms through electoral flimflam.)
To President Obama’s phantom gallery of economists is imputed the view that if the country cuts just $400 billion a year from what the deficit will be if no changes are made to taxing and spending, for ten years, all will be well. No, it won’t: The accumulated deficit, which was $10 trillion four years ago, and is $17 trillion today, will be $27 trillion in ten years on that scenario, and, in the words of Douglas MacArthur (referring to nuclear war), “Armageddon will be at our door.”...
SOURCE: National Review (2-21-13)
In his first term, President Obama was criticized for trash-talking the 1-percenters while enjoying the aristocracy of Martha’s Vineyard and the nation’s most exclusive golf courses.
Obama never quite squared his accusations that “millionaires and billionaires” had not paid their fair share with his own obvious enjoyment of the perks of “corporate jet owners,” “fat cat bankers,” and Las Vegas junketeers.
Now, that paradox has continued right off the bat in the second term. In the State of the Union, Obama once more went after “the few” and “the wealthiest and the most powerful,” whom he blasted as the “well-off and the well-connected” and the “billionaires with high-powered accountants.”...
SOURCE: The New Republic (2-6-13)
Michael Kazin’s most recent book is American Dreamers: How the Left Changed a Nation. He is co-editor of Dissent and teaches history at Georgetown University.
Contrary to what everyone who loved—or hated—his inaugural address seems to think, President Obama has yet to demonstrate that he is determined to launch a new liberal era.
The big speech did gesture in that direction. Obama declared, in the style of FDR, that “our country cannot succeed when a shrinking few do very well and a growing many barely make it.” The line about equality being “the star that guides us still; just as it guided our forebears through Seneca Falls, and Selma, and Stonewall” was a welcome salute to three of the most prominent civil rights movements in American history. And not since Lyndon Johnson has a president spoken about poverty with such apparent conviction and specificity: “We are true to our creed when a little girl born into the bleakest poverty knows that she has the same chance to succeed as anybody else, because she is an American; she is free, and she is equal, not just in the eyes of God but also in our own.”
But to believe that Obama has truly revived the great tradition of egalitarian reform is to neglect the distinction between two species of modern liberalism: that which promotes the equality of rights and that which works toward a greater equality of opportunity and wealth. The latter, the social variety, emerged from the class tumult of the Gilded Age and inspired such key New Deal measures as Social Security, the WPA, and the National Labor Relations Act. The former harks back to the abolitionists and early feminists; it demands that the promise of individual liberty be extended to every American, regardless of their skin color, national origin, gender, or whom they happen to love....
SOURCE: Jerusalem Post (2-19-13)
The presence of a reported 50,000 Iranian Revolutionary Guards in Syria to support President Bashar Assad is an indication of what Iran and Hezbollah have in mind – the preservation of Syria as a strategic asset in the regional power struggle. Losing Syria would leave Hezbollah isolated in Lebanon; a Syrian-Lebanese alliance, on the other hand, would allow Hezbollah to fortify Iranian interests in both countries.
A future Syrian government would then comprise a Sunni-Hezbollah alliance, similar to the political structure in Lebanon, without Assad and the Alawites, who would become a minority protected by Hezbollah. Following the Lebanese model, Hezbollah in Syria would be a state-within-a-state, with its own army and political structure, allied with a weak, fragmented Islamist Sunni-dominated state. For Hezbollah this is the perfect solution; it allows them to function covertly without the inconveniences of diplomatic restrictions.
Hezbollah can and will attack Israel, as it did in the Second Lebanon War (2006), protected by its parent state and entrenched within Syria, with vastly expanded capabilities....
SOURCE: Omaha World-Herald (2-18-13)
I taught a course last month in the United Arab Emirates, which isn’t a democracy. But according to an Emirati guy I met there, the United States isn’t much of a democracy, either.
“Anyone you don’t like, you just assassinate him with a drone,” he told me. “Shoot first, ask questions later.”
But now lots of people are asking questions about U.S. drone strikes, especially after the recent confirmation hearings for John O. Brennan. Nominated by President Barack Obama to direct the Central Intelligence Agency, Brennan defended the CIA’s targeted killings in Pakistan, Yemen and elsewhere as essential to national security....
SOURCE: Sightings from the Martin Marty Center (2-15-13)
R. Scott Hanson teaches history and religion at Temple University and the University of Pennsylvania and is the author of City of Gods: Religious Freedom, Immigration, and Pluralism in Flushing, Queens -- New York City, 1945-2001 (New York: NYU Press, under review).
It did not receive much attention on Monday, January 21 during President Obama’s second inauguration, but some were alarmed when the reporter at the private pre-inaugural worship service at St. John’s Episcopal Church noted that Rev. Andy Stanley, who gave the sermon, referred to the President as “Pastor in Chief.” In an interview with Christianity Today several days after the inauguration, Stanley said his remark had been taken out of context by some reporters, clarifying that it had come from being impressed by the President’s visit with families after the Sandy Hook Elementary School tragedy. Stanley had said, "Mr. President, I don't know the first thing about being President, but I know a bit about being a pastor. And during the Newtown vigil on December 16th after we heard what you did—I just want to say on behalf of all of us as clergy, thank you." He added, "I turned to Sandra [Stanley’s wife] that night and said, 'Tonight he's the Pastor in Chief.'" Other commentators also referred to President Obama as pastor in chief after being moved by his separate visits with each family who lost a child and by his speech to those gathered in mourning.
Later on Monday, President Obama’s inaugural address did even more to cast him as Pastor in Chief with his use of religious language and themes. He used the word “God” five times (and twice more with “His” and “He”), which is just short of Reagan’s record of eight in 1985. Obama also mentioned “our creed” five times, giving the second sentence of the Declaration of Independence significance as a kind of civil religion. Finally, he ended by saying, “Thank you, God Bless you, and may He forever bless these United States of America.”
It may strike some as alarming that the president would be referred to as a Pastor in Chief or that he would make frequent use of religious language in such a public ceremony, but this association is something President Obama shares with many of his recent predecessors. Religious language in presidential inaugural addresses has become increasingly more explicit in the twentieth century, particularly since World War II.
As I reported in an earlier Sightings column in 2001, the earliest American presidents came from Protestant backgrounds but were heavily influenced by Enlightenment philosophy as young men and were Deists by the time they entered politics. Since their generation had made the separation of church and state a fundamental American principle, they were quite hesitant, and very creative, when naming God. Washington referred to "that Almighty Being who rules over the universe," Adams to "the Protector in all ages," and Jefferson to "that Infinite Power." But with Presidents Monroe and Pierce, we see the beginning of a trend with the actual use of "God." Lincoln mentioned God six times in his second inaugural address (and eight more times with He, His, the Almighty, and the Lord), and he was daring in his use of Scripture to judge the Confederacy near the end of the Civil War. Such references to God then appear in subsequent addresses, steadily increase in the twentieth century, and reach a record high with Reagan in 1985.
It has also become almost obligatory since Reagan (1981) to end every inaugural address (and State of the Union address now, too) with some combination of "God bless you" and "God bless America"—a move from asking for, appealing to, or seeking divine guidance to asking God to bless the people and country. Eisenhower (1953) and George H. W. Bush (1989) even led the people in prayer. George W. Bush, who was well known for his “born-again” evangelical Christian background, also caused a stir when he alluded to plans in his 2001 inaugural address to begin an Office of Faith-Based and Community Initiatives (renamed the Office of Faith-Based and Neighborhood Partnerships by President Obama). At the same time, Bush and Obama were careful to be inclusive of other world religions in their addresses, and Obama also made room for “non-believers” in his first inaugural address. Obama's second address took place on MLK Day, and some were reminded of an oratorical style that reflects the prophetic black preaching tradition of someone like King.
Such changes have lent recent addresses an ever more sermon-like quality, with the president as a kind of pastor to the people. But why? Perhaps such language gradually became less taboo, as presidents have felt more and more free to employ it. Or it may also stem from the increasing intimacy of the event. Thanks to the media, inaugurations have moved from the confines of Congress (last with J.Q. Adams in 1825) to radio (Coolidge, 1925) and then finally to television (Truman, 1949). Founding fathers like Jefferson and Madison would no doubt be very pleased by President Obama’s powerful reinterpretation of the Declaration of Independence and Constitution for our times, but it is likely they would be uncomfortable with the label of pastor in chief and the common trend among recent presidents to employ more explicit religious language in their addresses.
SOURCE: Philadelphia Inquirer (2-11-13)
Jonathan Zimmerman is Professor of Education and History and Director of the History of Education Program, Steinhardt School of Culture, Education, and Human Development.
Now that women can assume combat roles in the military, I've got a question for you: Whose daughters will do the combatting?
Not mine. And not yours, either, if they live in a leafy, upper-middle-class suburb like the one where my two girls have grown up.
That's because the military draws overwhelmingly from the middle and lower-middle classes of our society. And that's what most of our news coverage has ignored, in the rush to congratulate the Pentagon for removing the ban on women in combat.
Let's be clear: the Pentagon should be congratulated. Thousands of female medics, drivers, and other servicewomen have already seen battle overseas. But they have often been blocked from key promotions because they lacked official "combat" designations....
SOURCE: East Asia Forum (2-13-13)
Dr John Blaxland is Senior Fellow at the Strategic and Defence Studies Centre, the Australian National University.
Rikki Kersten is Professor of modern Japanese political history in the School of International, Political and Strategic Studies at the College of Asia and the Pacific, the Australian National University.
The recent activation of Chinese weapons radars aimed at Japanese military platforms around the Senkaku/Diaoyu Islands is the latest in a series of incidents in which China has asserted its power and authority at the expense of its neighbours.
The radars cue supersonic missile systems and give those on the receiving end only a split second to respond. With Japanese law empowering local military commanders with increased discretion to respond (thanks to North Korea’s earlier provocations), such incidents could easily escalate. In an era of well-established UN-related adjudication bodies like the International Court of Justice (ICJ), how has it come to this? These incidents disconcertingly echo past events.
In the early years of the 20th century, most pundits considered a major war between the great powers a remote possibility. Several incidents prior to 1914 were handled locally or successfully defused by diplomats from countries with alliances that appeared to guarantee the peace. After all, never before had the world been so interconnected — thanks to advanced communications technology and burgeoning trade. But alliance ties and perceived national interests meant that once a major war was triggered there was little hope of avoiding the conflict. Germany’s dissatisfaction with the constraints under which it operated arguably was a principal cause of war in 1914. Similarly, Japan’s dissatisfaction helped trigger massive conflict a generation later....
SOURCE: National Review (2-7-13)
Amity Shlaes, who directs the George W. Bush Institute’s economic-growth program, is the author of the book Coolidge, forthcoming from HarperCollins.
Action is something Americans of both parties demand of their presidents these days. This is natural for Democrats, whose heritage is all action, starting with Franklin Roosevelt and his Hundred Days. But Republicans like energy and a big executive as well. Over the course of the campaign this past year, any number of political stars, including Governor Mitch Daniels of Indiana, argued that only an energetic candidate would be up to the job of managing the U.S. fiscal crisis. Mitt Romney worked hard to let voters know his party could beat the Democrats in the legislative arena. He swore up and down that, à la Roosevelt, he would get off to a running start, sending five bills to Congress and signing five executive orders on his first day in the Oval Office.
The Grand Old Party’s abiding affection for a “bigger and better” presidency isn’t entirely logical. After all, the Obama presidency commenced with an effort to reenact the Hundred Days. Yet President Obama’s first-term economic performance itself was not “big” but mediocre, tiny even. Perhaps Republicans should consider whether inaction on the part of the White House can be desirable. Perhaps, led by Republicans, the United States could benefit from trying out an unfashionable idea: the small presidency.
Evidence from a near-forgotten period, the early 1920s, instructs us. In those days the country was suffering economic turmoil similar to our own. Because of a crisis — World War I — the government had intruded in business and financial markets in unprecedented fashion, nationalizing the railroads, shutting down the stock market, and entering the debt market with war bonds....
SOURCE: The Nation (2-12-13)
Jon Wiener teaches US history at UC Irvine.
Thanks to Steven Spielberg and his film Lincoln, we’ve been hit by a new wave of management wisdom supposedly gleaned from the film’s central character. Business Week ran a piece titled “Career Lessons from Spielberg’s Lincoln”; the New York Times called theirs “Lincoln’s School of Management.” Doris Kearns Goodwin, whose book on Lincoln and his cabinet, Team of Rivals, famously provided the basis for some of the movie, has been back on the “leadership advice” circuit....
...[But] there are some key moments in Lincoln’s life that the management advice people have neglected. One came in his Second Inaugural, when he declared that, if the Civil War continued “until all the wealth piled by the bondsman's two hundred and fifty years of unrequited toil shall be sunk, and until every drop of blood drawn with the lash shall be paid by another drawn with the sword”—if that happened, he said, he would conclude that "the judgments of the Lord are true and righteous altogether."
As Eric Foner observed in his book The Fiery Trial: Abraham Lincoln and American Slavery, Lincoln was “reminding the country that the ‘terrible’ violence of the Civil War had been preceded by two and a half centuries of the terrible violence of slavery.” Here, Foner continues, Lincoln was asking the entire nation, “what were the requirements of justice in the face of those 250 years of unpaid labor?” On that topic, our management advice experts are strangely silent.
SOURCE: CNN.com (2-11-13)
Julian Zelizer is a professor of history and public affairs at Princeton University. He is the author of "Jimmy Carter" and of "Governing America."
(CNN) -- President Obama is set to deliver the first State of the Union Address of his new term. On Tuesday evening, he will step before a joint session of Congress and a nation in difficult times.
Unemployment rose in January to 7.9%. There are signs of economic progress, but millions of Americans are struggling to find a job while others are desperate to keep the one they have.
Other kinds of economic challenges face many people. The Pew Research Center recently released a study showing the growing number of adults who are struggling to support grown children and their parents, the "Sandwich Generation" as they are called....
It may be tempting to list a series of measures Obama wants Congress to pass, but the president should use this speech to do something more than provide a laundry list, and the historical record offers some guidance about how.
The speech can offer a vision. In 1941, President Franklin Roosevelt gave one of the most historic State of the Union addresses when he outlined the Four Freedoms. He delivered his speech on the brink of America becoming involved in World War II. With Europe and Asia in the middle of a major military crisis, FDR defined the four freedoms that he believed should be the foundation of the international system: the freedom of speech and expression, the freedom to worship God, the freedom from want and, finally, the freedom from fear....
SOURCE: LaborOnline (2-10-13)
Philip Rubio is an assistant professor of history at North Carolina A&T State University, and author of There's Always Work at the Post Office: African American Postal Workers and the Fight for Jobs, Justice, and Equality (2010, University of North Carolina Press).
It’s Saturday February 9th as I write this. Every postal worker knows by heart their first official day of work as their “anniversary date.” Not for sentimental reasons, but for purposes of seniority, retirement, and all the benefits thereof. It’s an important date.
I retired early from the post office in Durham, North Carolina after 20 years to go to graduate school (30 years is standard retirement at the US Postal Service). But even though it’s been almost 13 years since I last punched off the clock, I still remember my anniversary date: February 9, 1980. I made $8.10 an hour to start, up from $2.95 an hour a decade earlier, the result of over 200,000 postal workers staging an eight-day nationwide wildcat strike beginning March 18, 1970, the largest wildcat strike in U.S. history, which led to the 1971 transformation of the Post Office into the US Postal Service, a self-supporting independent government agency.
February 9, 1980 began a week of paid, on-the-job training for me in Denver, Colorado. Postal management impressed upon us the imperative of maintaining the “sanctity of the mail.” That same week union representatives urged us to join whatever union went with our craft. The vast majority of postal workers are unionized....
SOURCE: Boston Review (2-1-13)
Richard White, Margaret Byrne Professor of American History at Stanford University, is author, most recently, of Railroaded: The Transcontinentals and the Making of Modern America.
Speaking in New Haven in 1860, Abraham Lincoln told an audience, “I am not ashamed to confess that 25 years ago I was a hired laborer, mauling rails, at work on a flat-boat—just what might happen to any poor man’s son.” After his death, Lincoln’s personal trajectory from log cabin to White House emerged as the ideal American symbol. Anything was possible for those who strived.
But the goal of this striving was not great wealth. Perhaps the most revealing memorial to Lincoln and his world is found in one of the most mundane of American documents: the census. There he is in the Springfield, Illinois, listing of 1860: Abraham Lincoln, 51 years old, lawyer, owner of a home worth $5,000, with $12,000 in personal property. His neighbor Lotus Niles, a 40-year-old secretary—equivalent to a manager today—had accumulated $7,000 in real estate and $2,500 in personal property. Nearby was Edward Brigg, a 48-year-old teamster from England, with $4,000 in real estate and $300 in personal property. Down the block lived Richard Ives, a bricklayer with $4,000 in real estate and $4,500 in personal property. The highest net worth in the neighborhood belonged to a 50-year-old livery stable owner, Henry Corrigan, with $30,000 in real estate but only $300 in personal property. This was a town and a country where bricklayers, lawyers, stable owners, and managers lived in the same areas and were not much separated by wealth. Lincoln was one of the richer men in Springfield, but he was not very rich.
Not only was great wealth an aberration in Lincoln’s time, but even the idea that the accumulation of great riches was the point of a working life seemed foreign. Whereas today the most well-off frequently argue that riches are the reward of hard work, in the Civil War era, the reward was a “competency,” what the late historian Alan Dawley described as the ability to support a family and have enough in reserve to sustain it through hard times at an accustomed level of prosperity. When, through effort or luck, a person amassed not only a competency but enough to support himself and his family for his lifetime, he very often retired. Philip Scranton, an industrial historian, writes of one representative case: Charles Schofield, a successful textile manufacturer in Philadelphia who, in 1863, sold his interest in his firm for $40,000 and “retired with a competency.” Schofield, who was all of 29 years old, considered himself “opulent enough.” The idea of having enough frequently trumped the ambition for endless accumulation....
SOURCE: Balkinization (2-5-13)
Mary L. Dudziak is a legal historian at Emory University whose research focuses on the impact of war on American democracy and on the relationship between international affairs and American legal history.
The leak of a White Paper on targeted killings is getting the expected attention from law bloggers and others, with much commentary focused on whether the legal analysis is correct – for example the definition of “imminence.” The precise legal analysis is a distraction from more compelling issues, which are taken up by Jack Goldsmith in a Washington Post op-ed. I often disagree with Goldsmith, but this time I find myself in agreement. In part.
Goldsmith begins: “‘A decade of war is now ending,’ President Obama proclaimed in his second inaugural address. But war is not ending, it is changing — and has been for years. Obama has cut back on heavy-footprint, conventional-force war in two countries. At the same time, he has presided over the rise of a secret, nimbler war defined by covert action, Special Forces, drone surveillance and targeting, cyberattacks and other stealthy means deployed in many countries.”
The character of ongoing war, largely off the American political radar screen, has been the focus of scholarly attention across fields. Ongoing secret war is an extension of ongoing small wars – justified for many years by the Cold War-era national security policy that American safety at home could only be protected by the projection of American military force around the world (NSC 68).
Goldsmith is right, unfortunately, that the president is arguing that war is coming to an end, while at the same time he continues it. (A point also made here.) He continues:
This new form of warfare needs a firmer political and legal foundation....Because secret surveillance and targeted strikes, rather than U.S. military detention, are central to the new warfare, there are no viable plaintiffs to test the government’s authorities in court. In short, executive-branch decisions since 2001 have led the nation to a new type of war against new enemies on a new battlefield without focused national debate, deliberate congressional approval or real judicial review.
Although Goldsmith is right that the character of war has changed, his solution is disappointingly conventional: “What the government needs is a new framework statute — akin to the National Security Act of 1947, or the series of intelligence reforms made after Watergate, or even the 2001 authorization of force — to define the scope of the new war, the authorities and limitations on presidential power, and forms of review of the president’s actions.” This is where I disagree. Something more fundamental is in order.
The nation most needs robust political engagement with American military policy, something we have not had in a sustained way since the war in Vietnam. Americans debated the war in Vietnam, and ultimately countless numbers took to the streets. Congress fiercely debated war appropriations. Elections were affected, as candidates gained or lost in the polls based on their position on Vietnam.
Deep public engagement with Vietnam was tied in large part to the fact that the costs of war came home to American families because of the draft. In the years since, the all-volunteer armed forces have been one factor among others isolating many Americans from war. (The other most important factors are privatization/contracting, and changes in war technologies.) A 2011 Pew Research Center report found that "a smaller share of Americans currently serve in the U.S. Armed Forces than at any time since the peace-time era between World Wars I and II." The data reveal "a large generation gap," with "more than three-quarters (77%) of adults ages 50 and older...[having] an immediate family member – a spouse, parent, sibling or child – who had served in the military." In contrast, among those under 50, "57% of those ages 30-49 say they have an immediate family member who served. And among those ages 18-29, the share is only one-third."
In recent years, I’ve noted elsewhere,
In Iraq and Afghanistan, war...spread across borders as American drones fired on targets in Pakistan and elsewhere. Death and destruction were the province of soldiers and of peoples in faraway lands. The experience of wartime for most Americans largely devolved to encounters between travelers and airport screeners, as the Transportation Security Administration adopted intrusive new practices. At home, wartime had become a policy rather than a state of existence.
The only enduring limit on the use of force comes from an informed and engaged citizenry. The most troubling aspect of an era of secret warfare is that its very “secret, nimbler” character makes it easier to ignore, and thereby harder for democratic limits to function.
More essential than a new framework statute is a form of war politics. An essential but inadequate first step is transparency, so that Americans have the capacity to know what their nation is doing. More difficult but more essential, we must find a way to care about the nation’s most fearsome power, which is now exercised without our even noticing. Whether the American people can become engaged again without a draft or forces on the ground is something I can’t answer. But finding a path toward political engagement is all the more important now that, for Americans, the experience of war has become so easy, and so forgettable.
SOURCE: Sightings from the Martin Marty Center (2-7-13)
Ousmane Kane is Alwaleed Professor of Contemporary Islamic Religion and Society and Professor of Near Eastern Languages and Civilizations at Harvard University.
The destruction of the sixth-century monumental Buddha statues of Bamiyan in March 2001 by the Taliban shocked many people concerned with the preservation of the world's cultural legacy. Such examples of iconoclasm were not new in Islamic history. In the name of restoring the purity of the faith, groups of similar persuasion destroyed Sufi and Shiite shrines in various parts of the Arabian Peninsula during the nineteenth and twentieth centuries. But until very recently, few observers believed that such iconoclasm would ever reach the Sahel. Although the Sahelian countries have overwhelmingly Muslim populations, Islam in Sub-Saharan Africa was believed to be peaceful compared with Islam in the Arab world. For most of the twentieth century, no armed Islamic group was to be found anywhere in the Sahel. Very few Sub-Saharans trained in Afghanistan during the Soviet occupation or joined Al-Qaida, and suicide bombing was unheard of until a few years ago. This is not so much because intolerant Islamic groups were not to be found in the Sahel, but because they had neither the sophistication nor the logistical and financial resources to challenge state power.
In recent years, a variety of jihadi groups have appeared in the Sahel: Harakat al-Shabab al-Mujahidin in Somalia, Boko Haram in Nigeria, and the Movement for Oneness and Jihad in West Africa. Recently, these groups have linked up with Al-Qaida in the Islamic Maghreb (AQIM), which provided them with sophisticated military training and substantial financial and logistical resources. In the last few years, jihadi groups have stated a clear agenda of Islamizing the Sahel. Nowhere have these jihadi groups been a greater threat to state power than in Mali. Since it became independent from French colonial rule in 1960, this poor Sahelian nation, about the size of California and Texas combined, has been struggling to preserve its national integrity. Until 2012, the threat came essentially from secular Tuareg groups who resented their marginalization in postcolonial Mali.
In January 2012, an assortment of Salafi jihadi groups, allied with secular Tuareg groups, defeated the garrisons of the Malian national army stationed in the north of the country, conquered two-thirds of Malian territory, and proclaimed an Islamic state. Immediately afterward, they started to implement Islamic penal law, cutting off the hands and feet of thieves, stoning adulterers, forcing all women to wear headscarves, and dismantling centuries-old Sufi shrines designated by UNESCO as world cultural heritage sites. Incapable of restoring its national sovereignty alone, the Malian government had been seeking outside help since February 2012. Since then, the Malian crisis has been placed at the heart of the agendas of leading regional African and international bodies: the Economic Community of West African States, the African Union, the European Union, and the United Nations. For ten months, no serious initiative to restore Malian national integrity was undertaken. Emboldened by the passivity of the international community, the insurgent groups decided in January 2013 to extend their territorial control to the remaining third of the country, prompting the French intervention on the side of the Malian army on January 11, 2013.
On January 27, French and Malian troops re-conquered Timbuktu. On the same day, a Sky News journalist embedded with the French troops reported that 25,000 manuscripts had been burnt or had disappeared. Interviewed from Bamako, the capital city of Mali, located hundreds of miles away, the mayor of Timbuktu, Ousmane Halle, reported having heard that the largest library in Timbuktu (the Ahmad Baba library) had been torched by the fleeing insurgent groups. The news of the destruction of the manuscripts spread like wildfire. In reality, the staff of the Ahmad Baba Institute had moved the manuscripts to safety during the crisis. The rebels had burnt only a very small number of manuscripts that were being restored at the new building of the Ahmad Baba Institute.
As of February 1, French troops had re-conquered all the major northern cities of Mali (Timbuktu, Gao, and Kidal), forcing the rebels to withdraw to the mountains. Nobody knows what the outcome of the present crisis will be, but this much we do know: gone are the days when Sub-Saharan Islam could be stereotyped as different and more peaceful.