The Watergate at 50: What the Building Tells Us About the Nixon Administration's Initial Optimism and Eventual Demise

“Watergate is having a moment,” writes Washington Post columnist Philip Bump. Mentions on cable news and searches on the Internet are at the highest level in a decade. Slate’s Watergate podcast “Slow Burn” was a smash hit. Nixon’s former counsel John Dean has tripled his Twitter following. The Watergate boom dates from May 2017, Bump writes, when President Trump fired FBI director James B. Comey.

It’s hard to believe now that the Watergate was once generally known as a luxurious, private address in the nation’s capital. 

Designed by an Italian architect during the Kennedy administration, the Watergate is actually six buildings spread over ten acres, including three co-op apartment buildings, two office towers and a hotel. The Watergate shopping arcade included a number of businesses bearing the Watergate name, including the Watergate Bakery and the Watergate Beauty Salon. The Watergate had its own bank, a small post office, a Safeway supermarket, one dentist and three psychiatrists. “If only it had a tennis court and a movie theatre,” said one Watergate resident, “I don’t think I’d ever have occasion to leave the place.”

A sophisticated security system recorded the comings and goings of members of Congress, cabinet secretaries, White House aides, journalists, judges and diplomats. Owners of Watergate apartments, from massive penthouses with Potomac River views to modest one-bedrooms overlooking the Howard Johnson Motor Lodge, had something in common: a desire to be close to the center of power in the capital city of the most powerful nation on earth.

***

On January 20, 1969, Richard Nixon took the oath of office on the East Front of the U.S. Capitol. Across town, the Watergate West apartment building welcomed its first residents.

“To go forward at all,” Nixon said in his Inaugural Address, “is to go forward together.” So many members of the incoming Republican administration were making the Watergate their home, the Washington Post quipped it appeared as if they were taking Nixon’s instructions literally.

Rose Mary Woods, Nixon’s “confidential secretary,” rented a two-bedroom duplex on the seventh floor of Watergate East, with a “verbal” option to buy. She selected the apartment because it was an eight-minute drive from the White House. “I bring a lot of paperwork home on weekends,” she told the Post. “If friends stop by, I can leave all that work out upstairs in the den, and the downstairs won’t be disturbed.” Another Nixon secretary, Shelley Ann Scarney, ran into Elizabeth Hanford at a party in New York soon after the election. Hanford, who was working in the White House Office of Consumer Affairs and had been asked to stay on by the incoming administration, lived in Watergate East and recommended the building to Scarney, who called the leasing office and rented a one-bedroom apartment on the sixth floor. Her boyfriend, the president’s young speechwriter Pat Buchanan, rented an apartment nearby.

Maurice Stans, the incoming secretary of commerce, paid $130,000 (nearly $920,000 in today’s dollars) for a Watergate East apartment on the twelfth floor. His wife, Kathy, installed yellow-and-white wallpaper in the foyer and hung her husband’s collection of African ceremonial knives in the master bedroom – on his side of the bed. In the library, with its grass-cloth wall coverings, seven-foot-tall elephant tusks joined a coffee table with an elephant-leg base and a rug made from a Bengal tiger shot by Maurice on one of his many international hunting expeditions. A combination den and TV room was turned into a “patriotic suite,” with blue carpeting, three white walls and one wall covered in red felt. “Isn’t it mad?” Kathy Stans giggled.

Martha and John Mitchell, Nixon’s incoming Attorney General, purchased a duplex on the seventh floor of Watergate East, which came with four parking spaces. Martha said her new apartment lacked “the woman’s touch” and immediately began personalizing it. She replaced the parquet floor in the foyer with marble and brought in a “more traditional” stairway to replace the “contemporary one” that came with the apartment. Transportation Secretary John Volpe and his wife, Jennie, bought a three-bedroom penthouse in Watergate East with two working fireplaces. Martin Anderson, an MIT-trained economist and special assistant to the president, and his wife, Annelise, who was finishing her Ph.D. at Columbia University, rented a one-bedroom apartment in Watergate West, with a view of the Howard Johnson Motor Lodge. Other Nixon appointees who moved into the Watergate included Postmaster General Winton M. Blount; the incoming U.S. chief of protocol Emil “Bus” Mosbacher, Jr.; Frank Shakespeare, chief of the U.S. Information Agency; H. Dale Grubb, a Nixon liaison to Congress; James Keogh, managing editor of presidential messages; and Mary T. Brooks, director of the U.S. Mint.

LIFE magazine explained the Watergate’s appeal. “IT USED TO BE GEORGETOWN – NOW IT’S WATERGATE,” read the headline on a full-color, eight-page spread. “JUST EVERYBODY LIVES THERE.”

The Watergate served as a freshman dorm for the incoming class of political stars: part fishbowl, part pressure-cooker. Residents, LIFE continued, became members of an exclusive club, open only to people with social and financial power. “Any American who comes under the heading of ‘forgotten’ may as well not apply.”

A Watergate sales executive said all the residents in the complex were “delightful people,” but the Republicans were “the icing on the cake.”

***

 Sunday, June 18, 1972, began like any other Sunday at the Watergate, Maurice Stans later recalled. He slept in one additional hour, as he often did on weekends, and performed his morning exercise routine, which included about twenty minutes on his electric bicycle. He showered, put on a robe, and opened the front door of his apartment to pick up his copy of the Post. He went into the kitchen to fix breakfast. A headline on the front page caught his eye: FIVE HELD IN PLOT TO BUG DEMOCRATS' OFFICE HERE.

Who in the world could have been up to that? he asked himself.

As the scandal unfolded that summer and fall, Watergate residents turned on the evening news to see their home as the backdrop to a criminal investigation. Tourists gaped at the Watergate through glass-roofed buses and posed for pictures in front of the Watergate sign. Airline pilots approaching National Airport pointed out the Watergate to passengers, along with the Pentagon and the Lincoln Memorial. The New York Times published a map of “The Watergate Tour,” with stops at 2600 Virginia Avenue, home of the Democratic National Committee, and at Watergate Wine & Spirits for souvenir bottles of Watergate Scotch ($5.99 a bottle, or nearly $36 today).

At the Watergate Hotel, guests were taking anything that had the Watergate name on it and wasn’t nailed down. A maid entered a room just as its occupant, a senior executive with a major American corporation, was packing his bags. The room had been stripped bare. Even the bedspread was missing. Embroidered towels were disappearing at a rate of $4,000 a month, according to hotel staff – equivalent to $24,000 today.  A hotel manager ordered a switch: “We had to go with anonymous towels.”

***

Every four (or eight) years, a new crop of presidential appointees arrives in Washington with high hopes. Like the new residents of the Watergate 50 years ago, they come to the nation’s capital hoping to “go forward together.” Many Trump appointees reportedly cluster near the city’s southern waterfront, in the Navy Yard and the Wharf, “a bubble within the Washington bubble,” far away from “official Washington” and the “buzzing” neighborhoods that aren’t “Trump-friendly.” They hang out in their friends’ apartments, at nearby restaurants or around rooftop pools. “It’s not, like, as ritzy as Georgetown,” one young Trump aide remarked.

According to a 1971 report in the Washington Post, the Watergate was “a glittering Potomac Titanic,” as glamorous as the fabled ship, but “with no icebergs or steerage class.”

There was, of course, an iceberg ahead – the botched break-in at the Democratic National Committee in June 1972, and the cover-up that took down the Nixon presidency and forever redefined the Watergate in popular culture.

Perhaps the Watergate and the Titanic are about to merge -- in the popular culture -- once more. Presidential historian Jon Meacham recently compared the Trump White House to the doomed ocean liner: “[W]e’re like the Titanic steaming through the North Atlantic,” he wrote, “and we know the iceberg is there…and [the closest ship] won’t answer our distress calls.”

Alexandria Ocasio-Cortez and the History of Dance, Gender, and American Identity

When an anonymous right-wing Twitter account released a video of incoming congresswoman Alexandria Ocasio-Cortez dancing in a joyful homage to The Breakfast Club, commentators on both sides of the aisle were quick to react. Pundits debated the optics of combining dance and politics, questioning whether Ocasio-Cortez's gender has shaped our expectations of how she ought to behave. But while many in the media celebrated the sight of a youthful Ocasio-Cortez and her Boston University pals dancing together, missing from these conversations was an appreciation of the role that dancing bodies have long played in American political life.

 

Physical appearance and bodily movements have traditionally been tied to expressions of personal identity. Dressing in a certain way and following certain behavioral norms have historically marked people as belonging to specific social categories, categories determined by social constructions of race and gender. Because the act of dancing is fundamentally tied to behavior and appearance, debates about dance have inspired lengthy discussions regarding representations of race, gender, and even American national identity. From Victorian-era ideas about dance and demeanor, to folk dances meant to help assimilate newly arrived U.S. immigrants, to fears that public dance halls would lead to white slavery, and to the uses of concert dance as a cultural weapon in the Cold War, the act of dancing has helped to define American social and political issues for centuries.

 

In the latter half of the nineteenth century, dancing was an important part of a young person’s introduction to polite society. As one guidebook recounted, in the words of John Locke, children “should be taught to dance as soon as they are capable of learning it; for … it gives children manly thoughts and carriage more than anything.” Victorian sensibilities held that the body and its movements demonstrated cultural refinement and could even reveal someone’s moral fiber. However, although physical grace was important for both men and women, there were also key differences in the gendered implications of this attribute. 

According to the wisdom of the day, young men who studied fencing or boxing in order to gain strength were advised to learn dancing so that they might also acquire physical grace and personal control. “It is a matter of the first importance to the young aspirant,” one manual observed, “that he attend to the training and deportment of his body, as well as that of his mind.” A stately bearing, the booklet went on, could offer young men the “command of address” that was the marker of a well-bred gentleman. Such was the reasoning behind the decision of the Annapolis Naval Academy to hire a dancing master in 1898, a move meant to instill America’s future naval officers with the refinement they needed to conduct themselves appropriately both onboard ships and inside drawing-rooms. 

 

But while dancing and deportment showed off masculine accomplishments, a woman’s demeanor could purportedly reveal the truth about her inner self. As the Rev. Charles Kingsley explained, “if manners make the man, manners are the woman herself … they flow instinctively, whether good or bad, from the instincts of her inner nature.” Dancing, according to the writer Florence Hartley, was an essential womanly skill. “No woman,” she advised, “is fitted for society until she dances well; for home, unless she is perfect mistress of needlework; for her own enjoyment, unless she has at least one accomplishment to occupy thoughts and fingers in her hours of leisure.” A young lady, Hartley continued, ought to maintain a graceful demeanor whenever she found herself in the public eye. One should, for example, strive to avoid such unladylike performances as “sucking the head of your parasol” in the street.

 

Beginning in the early twentieth century, dance in the United States began to serve another purpose as well, one that was key in shaping ideas about the American national character. As a wave of immigrants arrived in the United States, social reformers worried that newcomers would not be properly assimilated into the fabric of American life. Dance, which psychologist G. Stanley Hall referred to as “the most educative of all because it places the control of the muscles under the will,” worked to impart middle-class American values to the children of these new residents.

 

Taking up the mantle of what was called the “Americanization” movement, orchestra conductor Meyer Davis invented an “Americanization dance” in which various styles combined to form “an American dance to be danced to American music in America by Americans.” But social workers also emphasized the values of the American “melting pot.” They argued that immigrants brought their own “gifts” which, if preserved, might contribute to the advancement of the nation as a whole. As reporter Helen Bullitt Lowry explained, “the Pole who remembers his mazurka is a better American citizen than the Pole who has traded his mazurka for the American ‘toddle.’” Schools, settlement houses, and women’s clubs across the nation held festivals in which young people performed their national folk dances and took part in patriotic pantomimes, celebrating American unity through cultural and ethnic diversity. 

 

But if proper forms of dancing could teach good citizenship, then improper forms could pose a serious threat to the racial hierarchies embedded in American life. Following the “White Slavery” Mann Act of 1910, a law that forbade the interstate transport of women “for any immoral purpose,” critics warned young women to beware of the dangers posed by mixed-race dance halls. In Chicago, investigative journalist Genevieve Forbes observed the activities in one such “black and tan.” Recalling a “little white girl” who was dancing in the arms of “a large colored man,” Forbes wrote that the young woman “nestles her blonde curls against his coat. Arms interlocked, bodies pressed close together, she gets some of the ‘loving’ she desired.” In the eyes of contemporary observers, such scenes jeopardized the forward progress of American civilization. Endangering white feminine purity and masculine authority, the crude “barbarisms” of jazz music threatened to ensnare American youth deep inside its web.

Later, after World War II, dance was once again called upon to help secure and grow the American way of life. As I explain in my dissertation “Hot Bodies, Cold War: Dancing America in Person and Performance,” the State Department enlisted American dance companies to help establish U.S. supremacy against the rising power of the Soviet Union. Performing in a Russian-dominated art form that was older than the independent United States itself, ballet companies worked to prove that the U.S. system could offer cultural accomplishments as well as capitalist dollars. The innovative movements of modern dance companies showed that the United States could produce new art forms as well as succeed in performances of the old. And folk dance troupes worked to represent cultural authenticity, showing the world the varied ethnicities that contributed to a thriving American way of life.

 

However, although U.S. commentators called dance an “international language” whose value and meaning transcended politics, these tours were part of a concentrated effort to counter perceptions of U.S. racism and portray the country as a land of progress and equal opportunity. In 1954, the same year that the program “Operation Wetback” relied on racial stereotypes to justify the mass deportation of Mexican workers, Mexican-American modern dancer José Limón emphasized the cultural bonds uniting the United States with South America. In 1960, while the U.S. government sought to eradicate Native American tribal authority at home, the famous “American Indian” ballerina Maria Tallchief performed a balletic version of integrative U.S. expansionism matched with an appealing, all-American exoticism. Two years later, while touring in the Soviet Union at the height of the Cuban Missile Crisis, African American dancer Arthur Mitchell performed an intimate pas de deux, or dance for two, with the white ballerina Allegra Kent. Although U.S. television producers refused to broadcast Mitchell dancing with a white partner at home, the State Department framed this performance as an authentic representation of American democracy abroad.

Maria Tallchief

In more recent days, President Trump’s awkward participation in a traditional all-male Saudi sword dance drew attention on social media, while CNN’s Jake Tapper criticized former President Barack Obama for his carefree dancing at a Beyoncé concert. In both cases, commentators drew on standard racial stereotypes of dancing men. The powerful white man’s uncompromising self-control translated to hip immobility, while the unrestrained black man’s free movements revealed his lack of personal restraint. Critics of Ocasio-Cortez have also relied on stereotypical gender norms, evoking beliefs that a woman’s bodily movements reveal her true character. The conservative fixation with exposing Ocasio-Cortez as a fraud has also extended to mocking her clothes, calling her a “little girl,” implying that a high school nickname proves she is not the person she claims to be, and, most recently, circulating a nude photo hoax. Because Ocasio-Cortez’s ideas represent a threat to the traditional standards of wealth, race, gender, and social power, her critics argue that her unrestrained body reveals the truth about her self—that she is not the capable young politician she appears to be but is instead an unintelligent and frivolous woman woefully ill-equipped for power.

Ibram X. Kendi: Remember Dr. King's Nightmare This MLK Day

On May 8, 1967, Dr. Martin Luther King, Jr. was interviewed by NBC correspondent Sander Vanocur. King told Vanocur that he had been “soul searching” since his famous 1963 “I Have a Dream” speech delivered at the March on Washington for Jobs and Freedom. King felt his “old optimism” for the civil rights movement was “a little superficial” and he had replaced it with a “solid realism.” King foresaw difficult days ahead and said the dream he had once imagined had “turned into a nightmare.”

 

This past Wednesday, historian Ibram X. Kendi spoke about Dr. King’s nightmare and legacy at the George Washington University. Dr. Kendi is the founding director of the Antiracist Research and Policy Center at American University. His book Stamped from the Beginning: The Definitive History of Racist Ideas in America won the 2016 National Book Award for Nonfiction. How To Be An Antiracist, his next book, will be released this August.

 

Most frequently, Kendi argues, Americans use this holiday to remember and celebrate Dr. King’s famous speech. We remember his dream that his four kids “would not be judged by the color of their skin but by the content of their character.” By focusing on this particular speech, Americans are celebrating what Kendi calls “the march of America’s progress.” We discuss how far we have come and assess how far we still have to go to reach King’s dream. The United States, many conclude, has been on a steady march towards racial progress.

 

In writing Stamped from the Beginning, Kendi pondered if the analogy of the “march towards justice” was historically accurate. Eventually, he concluded the narrative is largely ahistorical. In actuality, there has been a dual march—one of antiracist progress and one of racist progress. If King’s dream symbolizes the glories of racial progress, then King’s nightmare symbolizes the inglorious march of racist progress as people build and rebuild more sophisticated barriers to exclude and exploit people. While Americans hail King’s dream and our march towards justice, we have largely ignored and denied the simultaneous nightmare of racist advancement.

 

We as citizens and scholars must reclaim this nightmare, Kendi urges. We must recognize both King’s dream and his nightmare as part of his legacy. We must consider King’s nightmare as a symbol of racist progress that conservatives too often dismiss and liberals too often downplay. With this “solid realist” notion of American history, a clearer path towards antiracist activism and policies can emerge. 

 

Recent efforts to “Reclaim King” have encouraged historians and the public to recognize King’s radicalism in the last years of his life. Scholars and activists have countered popular narratives that use King’s legacy to encourage racial colorblindness and “American moderation.” Dr. Kendi encourages Americans to study King after the March on Washington and after Selma. King’s activism, he reminds us, included his criticism of the Vietnam War, his housing desegregation work in Chicago, the Poor People’s Campaign, and calls for a human rights revolution. To Kendi, the fact that King was very different in 1963 and 1968 is what is most inspiring about King. King was able to self-reflect, self-critique, and grow as an activist and thinker.

 

By 1968, Dr. King had transcended the category of liberal or conservative. Today, Kendi argues that our choices as a people are much more eternal and fundamental than liberal or conservative. There are lies and truth—and we must be truth. There is war and peace—and we must be peace. There is hierarchy and equality—and we must be equality. There is exploitation and love—and we must be love. There is racist and antiracist—and we must be antiracist. 

 

Kendi believes that over the last year, we’ve witnessed King’s nightmare in its totality. Nevertheless, we must learn from history and not necessarily despair. Even in the nightmare, King did not lose hope. Cynicism is the kryptonite of change, Kendi urges, and the nation is not lost as long as we continue to believe change is possible and keep the “long, bitter but beautiful” anti-racist struggle alive.

Dr. King's Legacy and Nuclear Disarmament

The year was 1958. The Soviet Union and the United States were developing and testing nukes at an alarming rate. In March, Dr. Martin Luther King, Jr. received a letter from Norman Cousins and Clarence Pickett of the National Committee for a Sane Nuclear Policy (SANE). King was asked to support a statement urging an end to nuclear testing.   

King joined the SANE movement right away. That April, Dr. King also signed an appeal by Protestant clergymen calling for a halt to nuclear tests.

Few people know Dr. King was also an activist for nuclear disarmament who urged peace during the Cold War arms race. He can inspire us today to finish the journey of eliminating all nuclear weapons.

The public outcry generated by groups like SANE against nuclear tests helped encourage President Dwight Eisenhower to open negotiations with the Soviets on a test ban treaty in 1958. In October, King joined a statement to the U.S. and Soviet negotiators in Geneva. It read: “an important beginning has to be made on one vital part of the problem of world peace, the permanent internationally inspected ending of nuclear weapons tests.” Eisenhower proposed a suspension of nuclear tests during the talks, and there were no nuclear tests by the U.S. or the Soviets from late 1958 into 1961. His successor, President John F. Kennedy, was able to produce a limited treaty with the Soviets in 1963 banning nuclear tests in the atmosphere, underwater and in outer space. Underground tests were permitted to continue.

Like many others, King believed ending nuclear testing was a critical step toward stopping the arms race. In 1957, in Ebony Magazine, King wrote: “I definitely feel that the development and use of nuclear weapons of war should be banned. It cannot be disputed that a full scale nuclear war would be utterly catastrophic. Hundreds and millions of people would be killed outright by the blast and heat, and by the ionizing radiation produced at the instant of the explosion.”

Dr. King also recognized that spending on nuclear armaments robbed from society: “A nation that continues year after year to spend more money on military defense than on programs of social uplift is approaching spiritual doom.”

 

The goal of eliminating nuclear weapons has been shared by successive leaders including presidents Ronald Reagan and Barack Obama. Today, however, we have lost any momentum in reducing the nuclear danger. There are still close to 15,000 nukes worldwide according to the Arms Control Association. President Trump has yet to take action on eliminating nuclear weapons. Instead Trump has sought to scrap the Iran nuclear deal and the INF Treaty with Russia. 

 

Dr. King’s words can inspire us to jumpstart nuclear disarmament. 

 

In his sermon “Loving Your Enemies” Dr. King said “It is an eternal reminder to a generation depending on nuclear and atomic energy, a generation depending on physical violence, that love is the only creative, redemptive, transforming power in the universe.”  King wanted all people, all nations, to come together to work out their differences. Through what Dr. King called “a great fellowship of love” the world can achieve peace and nuclear disarmament. 

 

The 1996 Comprehensive Nuclear Test Ban Treaty, banning all tests including underground, has yet to be ratified by the U.S. Senate. President Trump could ask the Senate to ratify this treaty and fulfill one of Dr. King’s goals of ending nuke tests forever.  

 

Instead of nations wasting dollars on nukes, we could feed the hungry, end disease, and save the environment. As we celebrate Martin Luther King Day, listen to his words and be inspired to take action for world peace.

What I’m Reading: An Interview With Historian Stacy D. Fahrenthold  

 

Stacy D. Fahrenthold is a historian of Middle Eastern migrations and currently Assistant Professor in the Department of History at the University of California, Davis. Her book, Between the Ottomans and the Entente: The First World War in the Syrian and Lebanese Diaspora, 1908-1925 (OUP, March 2019), examines the politics and activism of a half million Ottoman subjects in the Americas when their home empire fell. She studied at Georgia State University and Northeastern University, where she received her PhD in 2014. Dr. Fahrenthold tweets from @SDFahrenthold.

 

What books are you reading now?

 

I’ve been reading mostly about borderlands as I prepare a new comparative history course on the theme of “bans and border walls.” Laura Robson’s States of Separation: Transfer, Partition, and the Making of the Modern Middle East offers insight into how the partitioning of the Ottoman Empire after World War I abetted massive programs to remove, displace, and resettle refugees, with consequences to the present day. Camila Pastor’s The Mexican Mahjar: Transnational Maronites, Jews, and Arabs under the French Mandate examines the politics of Middle Eastern migration to Mexico, casting new light on these bordering practices beyond the region itself. And I recently revisited Ruben Andersson’s Illegality, Inc.: Clandestine Migration and the Business of Bordering Europe, which reveals the economies enabled by border policing. It touches on smuggling, military armament, and incarceration in the contemporary Mediterranean.

 

What is your favorite history book?

 

My answer to this question probably changes every season, but at the moment Melanie Tanielian’s The Charity of War: Famine, Humanitarian Aid, and World War I in the Middle East is certainly a contender! Tanielian tells an important story about the First World War from the position of the Ottoman home front. It’s a must-read for anyone interested in the impact of war on civilians, in the Middle East or beyond.

 

What qualities do you need to be a historian?

 

A willingness to fail frequently, and a commitment to failing a little bit better each time. 

 

Who was your favorite history teacher?

 

I was fortunate to have many fantastic teachers and mentors as I studied. Isa Blumi taught me to question the received wisdom of traditional historiography. Christine Skwiot challenged me to read widely and pay attention to footnotes. Ilham Khuri-Makdisi taught me to never give up the dogged pursuit of new sources, and to honor the stories I find as best I can. 

 

What is your most memorable or rewarding teaching experience?

 

A couple of years ago I taught a course on World War I in the Middle East, and students worked together to produce a digital newspaper capturing Ottoman perspectives of the war using primary sources they found in their research. The paper, called World War Alla Turca, followed the historical timeline as we progressed through the syllabus, and my students found an impressive array of digital resources in Middle East history.

 

What are your hopes for history as a discipline?

 

I sincerely hope that historians stop taking the bait in our national conversation about the humanities-as-failing. I worry about the impact of “meme-ification” of crisis rhetoric on our prospects as a discipline. Of course, there are many amazing scholars already combatting austerity, meeting the challenges of reduced enrollment, and confronting popular distrust in our institutions. They inspire me and I think that they represent where we’re headed.

 

How has the study of history changed in the course of your career?

 

Economic austerity in the U.S. and political instability in the Middle East have combined to make accessing archives much more difficult over time. But on the other hand, a new culture of collaboration among researchers and digitization of primary sources has made my work more enjoyable, too.

 

What are you doing next?

 

Since finishing the book, I’ve started a new project on Middle Eastern migrants who encountered border police during their journey. It builds from my ongoing data mining project that has tracked hundreds of state attempts to halt, detain, or surveil Syrian, Turkish, and Assyrian individuals across the Americas. It’s really exciting, in part because digitization projects undertaken by regional archives have made these police records more accessible. The stories are also intrinsically microhistorical and engaging to read.

The Cost of Corruption in the Balkans

The six Western Balkan countries—Serbia, Bosnia and Herzegovina, Kosovo, Macedonia, Montenegro, and Albania—are “state captured” by corrupt politicians linked with organized crime. This has been documented by various international reports, raising major concerns among EU officials about the enlargement process.

 

These countries are expected to meet social, political, and human rights standards as fundamental qualifications for joining the EU. The EU, however, is becoming increasingly doubtful that the Balkan countries can meet these standards, as their social and political life is deeply beset by corruption. Nevertheless, the efforts to eradicate corruption must not stop. 

 

The six constituent republics that once made up the Socialist Federal Republic of Yugoslavia (SFRY) were Bosnia and Herzegovina, Croatia, Macedonia, Montenegro, Serbia, and Slovenia, while Serbia contained two Socialist Autonomous Provinces, Vojvodina and Kosovo. The socialist state created after the German occupation in World War II brought together Serbs, Croats, Bosnian Muslims, Albanians and Slovenes. 

 

In the early 1990s, many of the republics began to declare independence from the Socialist Federal Republic of Yugoslavia. Despite European recognition of the new states following a 1992 independence referendum in Bosnia, war quickly erupted. Yugoslav army units, withdrawn from Croatia and renamed the Bosnian Serb Army, carved out a huge swath of Serb-dominated territory. Over a million Bosnian Muslims and Croats were driven from their homes through ethnic cleansing.

 

In August 1995, the Croatian army stormed areas in Croatia under Serb control, prompting thousands to flee. Soon Croatia and Bosnia were fully independent. Slovenia and Macedonia had already declared independence. Serbia and Montenegro also had their own governments under separate constitutions. In 1999, Kosovo's ethnic Albanians fought Serbs in another brutal war to gain independence. Serbia ended the conflict beaten, battered, and alone.

 

Today, these nations are troubled by corruption. Lavdim Hamidi, the Editor-in-Chief of Kosovo’s newspaper Zeri, who investigated corruption in the Balkans, says that “The Balkan states undoubtedly are at the top of the list of the most corrupt countries in the world.” 

 

The 2017 Corruption Perceptions Index highlights that the majority of the Balkan countries are making little or no progress in ending corruption. Journalists and activists in these countries are risking their lives every day in their effort to expose corrupt leaders. The index ranks 180 countries and territories by their perceived levels of political corruption, with 1 being the least and 180 the most corrupt. 

 

Of all the Balkan countries, Macedonia is the most corrupt, ranked 107th. Two months ago, Macedonian ex-Prime Minister Nikola Gruevski sought asylum in Hungary over a wire-tapping scandal for which a court had found him guilty. Xhemal Ahmeti, an expert on Balkan political affairs, says that Macedonia and the other Balkan countries are the same as Nigeria or anywhere else where corruption takes place behind the mask of tribal, family, clan, and ethnic ties.

 

“The ‘elites’ in these countries” says Ahmeti, “have always been at work to convince their publics that they are mistakenly accused of corruption by Westerners.”  Accordingly, the EU and international observers in Macedonia will not succeed in fighting corruption without direct and active monitoring on the ground. 

 

Kosovo is the second most corrupt country in the Balkans, ranked 85th. Since it declared its independence in 2008, Kosovo has provided many opportunities for its political leaders to become extremely rich. “No matter where they served, all seemed to be profiting considerably more than their wages shows. High level party officials became so rich they could afford to hire personal drivers and bodyguards without declaring the source of financing,” says Jeton Zulfaj, who spent the last two decades in Kosovo focusing on anti-corruption strategies. 

 

In Kosovo, where unemployment has reached an alarming 30%, politicians are the richest class in the country. Many big businesses have greatly expanded thanks to the support of politicians, who receive millions in return for “their efforts.”

 

The violent dissolution of the former Yugoslavia decades ago left a legacy of deep mistrust and animosity between majority and minority ethnicities in the newly created states that emerged out of it. In Kosovo, the roots of the interethnic conflict between Albanians and Serbs go back deep into history. For most of the 20th century, Albanians in Kosovo were subjected to discrimination, intimidation, and even mass expulsion by Yugoslav/Serb authorities.

 

In this environment, corruption has flourished. According to the corruption index, Albania fell from 83rd to 91st place. Progress was made in tackling petty corruption in the public sector, but much work must still be done, especially on corruption in the judiciary. Gjergj Erebaja, a journalist from Albania, says that “The justice system, including the prosecutors and the courts, are under extreme influence of the political elite. Politicians… use unlimited power of the state to blackmail voters… Large private businesses are, to some extent, an extension of the political system.”

 

Bosnia and Herzegovina made no progress in fighting corruption in the past decade, ranking the same as Albania. In this country, political corruption at all levels of government remains a serious concern. British Ambassador to Bosnia and Herzegovina, Matt Field, has recently written on corruption, stating:

The final cost of corruption is harder to total, but it includes millions in corrupt government spending, in stolen funds, and in missed foreign investments. And this price is always [falling on] the taxpayer, the citizen, who does not receive the quality public services for which they pay.

 

Transparency International official Cornelia Abel named Serbia as an example of a “captured political system,” citing the excessive influence of its President, Aleksandar Vucic. “Serbia … is becoming a prime example of one person in the position of power influencing everyone else,” she said. Serbia fell by five places on the Corruption Perception Index, from 72 in 2016 to 77 in 2017. 

 

The Business Anti-Corruption Portal, supported by the European Union, states that “Corruption is a problem in Serbia, and the prevalence of bribery exceeds the regional average. Foreign companies should be aware of conflicts of interest within Serbia’s state institutions. Government procurement, natural resource extraction, and the judiciary are especially vulnerable to fraud and embezzlement.” 

 

Montenegro also has made little to no progress in its fight against corruption, and it remains at 64th place. Transparency International experts said that the 2016 alleged coup attempt only “stopped anti-corruption efforts to some extent.” Montenegro is often criticized for not doing enough to tackle organized crime and corruption, with Brussels demanding concrete results in fighting corruption at the high political level as one of the main conditions for the country to join the EU.

 

The endemic political corruption of the Balkan states is certainly one of the main obstacles dramatically slowing their integration into the EU. Given, however, that the Balkan states are eager to join the EU, and since the EU is interested in luring them into its orbit and distancing them from Turkey and Russia, both sides need to take specific measures to address the problem of corruption.

 

The EU is in a strong position to use its leverage by offering investments, loans, and access to the European market, against which neither Russia nor Turkey can compete effectively—though both will stop at nothing to draw the Balkans into their spheres of influence. In return, the Balkans should be required to institute political, economic, and social reforms.

 

The EU should also insist on greater transparency and accountability, which would curtail pervasive corruption by elected officials. To that end, the EU should resume a law enforcement and justice presence not only in Kosovo (which recently ended after ten years), but in all the Balkan states who wish to become EU members.

 

Civil societies throughout the Balkans have a major role to play by protesting and holding massive rallies and demanding an end to the corruption that has infected all government strata, including the judiciary and law enforcement. Should their respective governments fail to take clear and decisive steps to deal with corruption, the public may have to resort to civil disobedience, which could include labor strikes, student walkouts, and a slowdown by government employees. 

 

Addressing the problem of corruption in the Balkans is central to the EU’s geostrategic interests as well as the Balkans’ future wellbeing within the EU community. The Balkans’ accession to the EU must be seen as a marriage of necessity that will dramatically enhance their collective security while substantially improving the quality of life and respect for human rights throughout the Balkans.     

Will Beto O'Rourke Follow the Path of Barack Obama, Abraham Lincoln, Jimmy Carter, and George H.W. Bush to the Presidency?

As multiple politicians have announced their 2020 presidential campaigns, many pundits continue to speculate if former Texas Congressman Robert Francis “Beto” O’Rourke of El Paso, Texas will enter the race. The paths to the presidency of four presidents—Jimmy Carter, Barack Obama, Abraham Lincoln, and George H.W. Bush—suggest Beto’s campaign could be successful. 

 

Many forget what a long shot former Georgia Governor Jimmy Carter seemed when he announced his Presidential candidacy in December 1974; he was met with near-total skepticism. While Carter had had a significant term as Georgia’s Chief Executive, no one could foresee how he would march to the nomination in 1976 over many better known and more experienced contenders. He was the ultimate “Dark Horse” who shocked everyone with his well-organized and carefully planned strategy to win the nomination and overcome President Gerald Ford in the election.

 

In 2007, freshman Illinois Senator Barack Obama began his 2008 Presidential campaign, building upon his much-praised keynote speech at the 2004 Democratic National Convention. Many found Obama inspirational, but wondered if he could overcome the challenge of former First Lady and New York Senator Hillary Clinton. Many were also skeptical that a candidate of mixed-race heritage could go all the way to the Presidency. Nevertheless, Obama did what seemed impossible, and took the oath of office in 2009.

 

Then we have the case of Abraham Lincoln. Lincoln served one term, from 1847 to 1849, as a Whig Congressman from Illinois during the Mexican-American War. He was known for his antiwar stance after he introduced the Spot Resolutions, which demanded that President James K. Polk point out the exact location of the hostilities at the border with Mexico that led to the war declaration. His controversial opposition to the war led him to choose not to seek reelection. But ten years later, in 1858, Lincoln ran for the US Senate. His seven debates across Illinois with opponent Stephen A. Douglas earned him national recognition. Although he lost the election, Lincoln went on to win the Republican nomination and the presidency in 1860.

 

Finally, George H. W. Bush lost two Texas Senate races in 1964 and 1970. The second loss came after he served two terms in the House. Despite these two losses, Bush went on to serve as the U.S. Ambassador to the United Nations, Envoy to the People’s Republic of China, Director of the Central Intelligence Agency, and Vice President under Ronald Reagan for two terms. In 1988, he was elected president.

 

So two Presidents (Carter and Obama) seemed highly unlikely to win their party’s nomination, and two others (Lincoln and H. W. Bush) lost Senate races after serving in the House of Representatives, yet all four went on to become President.

 

So, what does this tell us about Beto O’Rourke’s chances? O’Rourke, a three term Congressman from El Paso, Texas, ran a very aggressive Senate campaign against Ted Cruz in 2018. He lost in the heavily Republican state by only 2.6 percentage points— about 215,000 votes statewide. O’Rourke raised more money than any candidate for a Senate seat in American history—three times what Cruz was able to raise. O’Rourke gained national media attention; used social media brilliantly; inspired many young people; and was applauded for his charisma, optimism, and energy.  

 

Despite his loss to Cruz, O’Rourke is behind only Joe Biden and Bernie Sanders in early 2020 polls. The fact that his first and middle names are the same as those of the late Senator Robert F. Kennedy, and that he resembles Kennedy in appearance, only adds to the allure of a possible, serious Presidential candidacy by someone who would match, if successful, the path of four former Presidents of the United States. Time will tell if O’Rourke is destined to follow these four Presidents into the White House.

William Barr Needs a History Lesson

As the Senate Judiciary Committee holds its confirmation hearings for William Barr, the current nominee for Attorney General of the United States, it is clear Barr needs to brush up on his constitutional law, as well as U.S. history.

 

During yesterday’s hearing, Senator Mazie Hirono (D-HI) asked Barr whether or not he believed birthright citizenship was guaranteed by the 14th Amendment. The question is important as the idea of birthright citizenship has come under increasing attack from the right in recent years. From the Republican primaries onward, Donald Trump has repeatedly asserted that birthright citizenship is unconstitutional, should be eliminated, and can be ended by executive order. While some on the right have balked at the last claim, Trump has tapped into an ever-present disdain among conservatives for birthright citizenship. 

 

For his part, Barr seemingly tried to side step the politically divisive issue. However, his answer to Senator Hirono’s question was not only vague, it also suggested that the soon-to-be Attorney General doesn’t know basic constitutional law or history. 

 

“I haven’t looked at that issue legally. That’s the kind of issue I would ask OLC [Office of Legal Counsel] to advise me on, as to whether it’s something that’s appropriate for legislation. I don’t even know the answer to that,” Barr answered.

 

There are a couple of worrying signs in this response. First, birthright citizenship is a part of the 14th Amendment, meaning any action to change that would have to be a constitutional amendment, not legislation. This is a basic tenet of constitutional law. The fact that Barr, who previously served as Attorney General under George H.W. Bush, thinks any action can be taken against birthright citizenship through simple legislation shows one of two things: (1) he isn’t competent enough to understand basic constitutional processes in the United States or (2) he was rather insidiously actually answering Senator Hirono’s question.

 

The latter point warrants a bit of explanation. Barr quite visibly looked like he was attempting to simply move past the question and not answer Senator Hirono. However, if Barr does in fact think that birthright citizenship can be dealt with through congressional legislation, then the only logical explanation for this, barring the above first option, is that he doesn’t believe the 14th Amendment guarantees this status. Whereas the first possibility of incompetence warrants a refresher in constitutional law, this second one demands a lesson in history. 

 

History is quite clear on the intent of the 14th Amendment: it was meant to establish birthright citizenship in the wake of emancipation. The 14th Amendment was created to guarantee that freed slaves, free blacks, and their posterity would forever be considered American citizens. Before its adoption, citizenship was a murky, ill-defined status. The Constitution only mentions citizenship a few times, and does not provide a concrete definition of what a citizen is or who can be a citizen. To this day there is actually no legal definition of what citizenship actually is.

 

From the Constitution’s ratification to the adoption of the 14th Amendment, black Americans had repeatedly claimed they were citizens because of their birth on American soil. Scholars such as Elizabeth Stordeur Pryor and Martha S. Jones have shown the myriad ways in which black Americans made claims on this status, only to be rebuffed in many cases. Citizenship could provide black Americans with a recognized place in the nation’s political community. It represented hope for a formal claim to certain rights, such as suing in federal court.

 

This leads to the infamous 1857 Supreme Court decision Dred Scott v. Sandford, when Chief Justice Roger Taney crafted an opinion that quite consciously attacked the very possibility of black citizenship. Taney concluded that Dred Scott, an enslaved man, could not sue in federal court because he was not a citizen. He was not a citizen, in Taney’s words, because black people “are not included, and were not intended to be included, under the word ‘citizens’ in the Constitution… On the contrary, they were at that time considered as a subordinate and inferior class of beings who had been subjugated by the dominant race, and, whether emancipated or not, yet remained subject to their authority, and had no rights or privileges but such as those who held the power and the Government might choose to grant them.”

 

Taney went out of his way to create a Supreme Court decision that attempted to put the legal nail in the coffin of black citizenship. The 14th Amendment was, quite consciously, crafted to upend Dred Scott, which was still the law of the land after the Civil War.  Thus when conservatives rail against birthright citizenship and claim that it is not, in fact, a part of the Constitution, they are ignoring America’s long history of slavery, discrimination, and segregation. 

 

When the soon-to-be Attorney General William Barr states that he thinks legislation can be used to make changes to birthright citizenship, it is because he does not believe the 14th Amendment guarantees it. And when he and other conservatives espouse such an opinion, it is because they are once again willfully ignoring American slavery's legacy of racism. This is, admittedly, not surprising. Barr also expressed the opinion during his confirmation hearing that the justice system “overall” treats black and white Americans equally, despite mountains of information proving otherwise. 

 

While the attack on birthright citizenship from the right deserves attention and should be fought at every turn, the underlying historical erasure of slavery and discrimination also requires our attention. This willful amnesia is why the potential next Attorney General of the United States can, in one day, ignore so many aspects of America’s fraught history with race. And it is why we all must be on guard.

 

The Defect that Brexit and MAGA Share

 

In June 1945, Winston Churchill, who had just overseen the British contribution to victory in the Second World War, was voted out of office in one of the most unexpected election outcomes of the twentieth century. Remarkably, the very soldiers who had enabled victory on the battlefield were central to the routing of the Conservative Party at the ballot box; they, and their social networks, voted overwhelmingly for Labour.

Churchill was a man of grand vision, of big ideas, and there was no greater mission in his life than the defeat of Nazi Germany and Imperial Japan and the retention of Britain’s special place in the world. The necessity to defeat the Axis and maintain the Empire, however, blinded him to a key dynamic: he fundamentally undervalued and misunderstood the central aspiration of his citizen soldiers in a second world war; they desired immediate and profound social change. For the ordinary citizen soldier, his participation in the war was, at heart, about building a better post-war world at home – a world with better housing, health care provision and jobs.

Churchill’s inability to fully empathize with his citizen soldiers was to have profound implications for his great mission. Churchill, and others in charge of strategy, were convinced that ordinary English, Welsh, Scots, Irish, Africans, Australians, Canadians, Indians, New Zealanders and South Africans would fight with the required determination and intensity to guarantee victory and save the Empire in its hour of need. This was the key assumption, or understanding, that drove British strategy during much of the first half of the war. It was accepted that in a newly raised citizen army men would be inadequately trained and might not be provisioned with the theoretically ideal scale or quality of materiel. But, it was expected, in this great crisis, the “great crisis of Empire,” that in spite of these drawbacks they would rise to the challenge. 

As we know, they did not always meet these lofty ambitions. The defeats in France in 1940 and at Singapore and in the desert in 1942 put a nail in the Imperial coffin. In the end, Britain did not even achieve her initial reason for going to war: the restoration of a free, independent Poland. As the war dragged on, the British and Commonwealth Armies played an increasingly smaller role proportionally in fighting the Axis. On 1 September 1944, with the Normandy campaign completed, General Bernard Montgomery reverted to commanding an army group of roughly fifteen divisions. His erstwhile American subordinate, General Omar Bradley, on the other hand, rose to command close to fifty in what clearly signaled a changing of the guard. By the end of March 1945, of the 4 million uniformed men under the command of General Dwight D. Eisenhower in North-West Europe, over 2.5 million were American, with less than 900,000 British and about 180,000 Canadian. The British and Commonwealth Armies advanced across Europe into an imperial retreat. 

The result was that the post-war empire was a “pale shadow” of its former self. The cohesion of its constituent parts had been irretrievably damaged. Much of its wealth had been lost or redistributed. Nowhere was this more apparent than in the Far East. The loss of Singapore was not only a serious military defeat, but it was also a blow to British prestige. Barely two years after the war, and only five years after Churchill had uttered his famous words, that he had “not become the King’s first minister in order to preside over the liquidation of the British Empire,” Britain’s Imperial presence in India had ended. The loss of the subcontinent removed three-quarters of King George VI’s subjects overnight, reducing Britain to a second-rate power.

The history of Britain and the Commonwealth cannot, therefore, be understood outside of the context of the performance of British and Commonwealth soldiers in the Second World War. The Empire failed not only because of economic decline, or a greater desire for self-determination among its constituent peoples, but also because it failed to fully mobilize its subjects and citizens for a second great world war. Africans, Asians and even the citizens of Britain and the Dominions demonstrated, at times, an unwillingness to commit themselves fully to a cause or a polity that they believed did not adequately represent their ideals or best interests. This manifested in morale problems on the battlefield, which, in turn, influenced extremely high rates of sickness, battle exhaustion, desertion, absence without leave and surrender in key campaigns. When the human element failed, the Empire failed.

Today, the Anglo-Saxon world is enmeshed in another era of what some might term big ideas, although thankfully not in a world war. In the United Kingdom, instead of imperial unity we talk about Brexit, and in the United States, President Trump’s vision for America, to make it Great Again, has captured the imagination of a significant cohort of the population. Whether one agrees with these movements or not, they are certainly radical; however, in a similar vein to Churchill’s grand vision of the 1940s, they are vulnerable to ignoring the needs of the many in preference to the visions of the few. In Britain, there is hardly a week that goes by without the announcement of a new set of figures outlining the collapse of basic public services and amenities. Violent crime is up, hospital waiting times have risen, and child poverty is increasing, to mention just a few social metrics. It seems evident to many that leaving Europe will not address the fundamental issues faced by people who voted Brexit. Trump’s presidency, the evidence suggests, will harm the welfare of those who were most likely to vote for him; tax cuts for the wealthy and building a wall along the Mexican border will not bring back a lost prosperity to middle America.

Trump and the Brexiteers might heed lessons from the Second World War. The American president, Franklin D. Roosevelt, believed that the efforts demanded by the state to meet the global cataclysm of the Second World War required legitimacy, accorded by citizens who were invested materially and ideologically in that same state. In this sense, he linked intimately the questions of social change and social justice with the performance of American armies on the battlefield. By comparison, in his obsession with defeating the Axis, the British Prime Minister lost sight of the goals and ambitions of the ordinary man, the smallest cog in the “machinery of strategy,” but a vital one all the same. For the citizen soldier, the war was not an end in itself; it was a step on the road toward a greater aspiration: political and social reform. To succeed, big ideas had to take account of the little stories of ordinary people. If they do not, they are bound to fail.

Martin Luther King Day: What Historians Are Saying

A Tyrant's Temper Tantrum

King Charles I of England, frustrated at the limitations of his otherwise powerful position, decided to dissolve Parliament in March of 1629 and to clap several of the opposition’s leaders in irons. The monarch had come to an impasse over issues as lofty as religious conformity and as mundane as the regulations concerning tonnage, eventually finding it easier to simply dissolve the gathering than to negotiate with them. Historian Michael Braddick explains that the “King was not willing to assent to necessary measures” in governance, and that Charles was intent on “upholding his right to withhold consent” as he saw it, believing that “without that power he was no longer a king in a meaningful sense.” Charles was a committed partisan of the divine right of kings, truly believing himself to be ennobled to rule by fiat, and regarding legislators as at best an annoyance, and at worst as actively contravening the rule of God. 

Though its consent was legally required to levy taxes, at this point in history Parliament was still an occasional institution; indeed, this was the fourth Parliament that Charles had dissolved. Yet the separation of powers still made it impossible for the king to directly collect taxes of his own accord, and so he adopted byzantine means of shuffling numbers around to draw income into the treasury. Such was the “Period of Personal Rule,” and to critics the “Eleven Years’ Tyranny,” in which Charles’ power became ever more absolute. Royalists may have seen the dissolution as a political victory, yet the ultimate loss would be Charles’, to spectacular effect. Historian Diane Purkiss explains that the “events that were ultimately to lead to the Civil War were set in motion by a royal tantrum.” 

Royal tantrums are very much on all of our minds this new year, as we approach the fourth week of the longest government shutdown in U.S. history. As the state coffers of England were depleted after Parliament was dissolved and continued solvency required creative means of reorganization, redefinition, and shifting of funds, so too do we find government agencies forced by extremity to demand work of essential employees without pay. Garbage piles up in federal parks and at the National Mall, TSA agents and air-traffic controllers work for free, yet the president, under the influence of right-wing pundits, refuses to end the current shutdown. With shades of Charles’ tantrum, Speaker Nancy Pelosi explains Donald Trump’s current obstinacy as a “manhood thing for him.” 

Meanwhile, Trump claims that his proposed border wall with Mexico is a national security issue, and after two years of inaction on his unpopular signature campaign promise has decided, not coincidentally following the election of a Democratic House, that he’ll invoke sweeping emergency powers to construct said wall, which last month Jennifer De Pinto and Anthony Salvanto of CBS News reported 59% of Americans oppose. At the time of this writing it is unclear whether Trump will invoke those broad executive powers, in an audacious power-grab not dissimilar to Charles’ petulant dissolution of Parliament. 

Trump’s proposal calls to mind the pamphleteer and poet John Milton’s appraisal of Charles in his 1649 Eikonoklastes that the monarch did “rule us forcibly by Laws to which we ourselves did not consent.” Milton denounced the royalists whom he saw as an “inconstant, irrational, and Image-doting rabble,” this crowd who wished to make the kingdom great again and who are “like a credulous and haples herd, begott’n to servility, and inchanted with these popular institutes of Tyranny.”

Yet as much fun as it is to draw parallels between the events of the 17th century and our current predicament, we must avoid the overly extended metaphor. Trump is not Charles I; Pelosi is not the anti-Royalist John Pym; the Republicans are not Cavaliers and the Democrats are not Parliamentarians. Treating history as a mirror can obscure as much as illuminate, and yet I’d argue that the past does have something to say to the present, especially as we understand the ways in which American governance is indebted to understandings of those earlier disputes.

Political pollster and amateur historian Kevin Phillips argued that the English civil wars set a template for perennial political conflict in his 1998 book The Cousins’ Wars: Religion, Politics, & the Triumph of Anglo-America. With much controversy, Phillips argued that a series of conflicts between the 17th and 19th centuries should best be understood as connected to one another, analyzing how “three great internal wars seeded each other,” with the “English Civil War… [laying] the groundwork for the American Revolution” which “in turn, laid the groundwork for a new independent republic split by slavery” that would be torn asunder during the American Civil War. For Phillips, modern Anglo-American history should be interpreted as a Manichean battle between two broad ideologies, which manifested themselves differently in each conflict while preserving intellectual continuities with their forebears. Basing his analysis on geography and demography, Phillips sees in Charles’ claims of Stuart absolutism and religious conformity the arguments of King George III in the American Revolution, or the aristocratic defenses of inequity offered by the Southern planter class in the American Civil War. As a corollary, in the Parliamentarians he sees the language of “ancient liberties” as embraced by the American revolutionaries, or the rhetoric of New England abolitionists in the antebellum era. The first position historically emphasizes order, hierarchy, and tradition, while the second emphasizes individualism, justice, equality, and liberty. 

There’s much that is convincing in Phillips’ claims. The American revolutionaries certainly looked back to thinkers like Milton; the Puritanism of the Parliamentarians was crucial in both revolutionary and antebellum New England in terms of crafting a language of rebellion. The Southern aristocrats and apologists of slavery during the American Civil War consciously compared themselves to Charles’ Cavaliers, and rejected the creed as spelled out in the United States’ founding documents as evidence of heretical non-conformism. Thus applying Phillips’ analysis to the current divisions in the United States has a logic to it. If the American Revolution continued the same debates from the 17th century English civil wars (and it in part did), and the American Civil War was born from the contradictions of the Revolution (which is undeniably true), then it might follow that the current divisions in our country are a continuation of the American Civil War by other means. In this perspective, Trump is a kind of Copperhead President, a northern Confederate sympathizer, as argued convincingly by Rebecca Solnit in The Guardian.

While acknowledging that there is much that’s valuable in Phillips’ interpretation, I prefer to draw a different lesson entirely. Without comment as to the causal relationships between those conflicts, I note instead a particular structure by which each one of them unfolds, an ur-narrative that progressives would do well to recognize, as we may soon be facing a period of unrivaled opportunity for enacting profound change. 

Returning to the 17th century, parallels to today can be seen in the Parliamentarian view that Charles was both an incompetent monarch and an aspiring tyrant, an illegitimate ruler enraptured by foreign influence. Had it not been for his own petulant intransigence Charles might have been able to weather those political storms, but it was precisely his own sense of inviolate authority which made his downfall inevitable. Charles’ fall from power, in turn, heralded a period of incredible potential for radical change in English history. Historian David Horspool writes that this discourse was “of a kind never before witnessed in England: an open debate” on how the new Republic should be governed. Occasions like the Putney Debates, held by the New Model Army, put front and center issues of republican liberties that had been marginal before, as when the participant Thomas Rainsborough declared that “Every person in England hath as clear a right to elect his Representative as the greatest person in England.” Meanwhile, religious radicals like the Levellers and the Diggers, the former of whom had sizable support in both the army and Parliament, suggested communitarian economic arrangements, whereby the commons would be restituted as the property of all Englishmen, views that would still be radical today. 

Such is the primordial narrative: an ineffectual and reactionary leader makes attempts at ever greater power, which triggers a crisis that leads to his downfall while presenting the opportunity for unprecedented, radical political change from the opposition. Had Charles been less vainglorious, perhaps the civil wars could have been avoided, but he was, and as a result what ultimately presented itself was the possibility of something far more daring than mere incremental change. The same template is in evidence during the American Revolution. Had moderate voices like Prime Minister William Pitt been heeded, had George III been less intemperate regarding the imposition of the Intolerable Acts, then perhaps America would still simply be part of the British Empire. As it was, the hardening of George’s position allowed for the introduction into the world of the radical democratic philosophy which defined the American Revolution, and which flowered during the Articles of Confederation when many states adopted shockingly egalitarian constitutions. Similarly, on the eve of the American Civil War, most northerners were not abolitionists, yet increasing belligerence from the Southern slave-owning class, in the form of the Missouri Compromise and especially the Fugitive Slave Act, rapidly radicalized the northern population. In the years following the Civil War there was radical possibility in Reconstruction, when true democratic principles were installed in southern states for the first time. 

We’ve already seen the arrival of new radical possibilities in opposition to the reactionary leader. Does anyone credibly think that we’d have elected several Democratic Socialists were it not for Trump? Does anyone believe that we’d finally be able to consider policy proposals like Representative Alexandria Ocasio-Cortez’s Green New Deal, and the restitution of a proper marginal tax rate, had it not been for the rightful frustration and anger at the reactionary Republican agenda? Suddenly the Democrats are suggesting actual ideas and not just the furtherance of the collapsing neo-liberal consensus; suddenly it seems as if actual change might be possible. In this sense, Trump has ironically accomplished something that the Democrats themselves haven’t been able to do – he’s pushed them to the left. 

But I must present a warning as well, for there is another part to those narratives. Writing of the English civil wars, historian Frank McLynn explains that those years “undoubtedly constituted a revolutionary moment, a time when, in principle, momentous changes were possible.” Yet the English civil wars’ radical promise was never realized, betrayed by the reactionary Lord Protector Oliver Cromwell, while the radical participants in that revolution were done in by “the besetting sin of the Left through the ages – internal factionalism and squabbling instead of concentrating on the common enemy.” The result would be the demagoguery of the Interregnum and finally the Restoration of the monarchy. A similar preclusion of democratic possibility occurred in the 18th-century United States, when the radical politics of the Revolution would be tempered at the Constitutional Convention of 1787, with the drafting of a document that abolitionist William Lloyd Garrison famously described as “an agreement with Hell.” Post-Civil War Reconstruction, often cynically and incorrectly presented as a manifestation of draconian Yankee opportunism, was a hopeful interlude drawn to a close by Congress’ 1877 betrayal, the ramifications of which define our politics today. 

Consequently, there is a central question which the left must ask itself. It is no longer whether Trump will fall, but what opportunities progressives will seize once he does. Trump’s gross incompetence and unpopularity have done more to discredit right-wing ideas than decades of liberal punditry. Clearly, we cannot afford to retreat to bland centrist moderation when the tide of history seems to call for more radical proposals. But the historical template provides a warning, especially about how quickly hopeful moments can be squandered and reversed. A king’s greatest weakness is that he too often actually believes in his divine right. To be effective we can never be as stupidly arrogant. Now, what will we do with this moment? 

The Myth of the Liberal Professoriate

 

For years conservative broadcasters and the right-leaning print media have denounced liberal control of American higher education. This assertion is based on the large number of academic instructors who belong to the Democratic Party and on actions taken at some colleges to promote a sense of inclusiveness and toleration to the point that – some say – it discourages free speech. Both of these observations about academe are to a limited extent true, but they indicate that college faculty and administrators are conservative, not liberal, at least in the philosophical sense. And for intellectuals, liberal and conservative social and political values have traditionally rested on philosophical views of human nature. 

Liberal programs initially emerged from the belief that human beings are inherently good or are attracted to the good. Seventeenth century religious liberals like the Quakers used the term Inner Light, or the presence of the Creator in all humans, to explain this, while later liberal theorists like Henry David Thoreau used the term conscience. On the other hand, early classical conservatives rooted their policies in the idea that people are by nature either evil or selfish. Religious conservatives like the Puritans believed that Original Sin left all with a powerful inclination to evil, whereas secularly oriented conservatives like Adam Smith, the father of capitalism, asserted that innate “self-love” drives human action.

Although we often lose sight of the philosophical origins of liberal and conservative policy, today’s public agendas reflect those roots. Conservatives have traditionally supported powerful militaries believing that strong nations selfishly prey on weak ones, while liberals downplayed the need for military spending and substituted investment in education and social programs in order to help individuals maximize their latent moral and intellectual capabilities. Similarly, conservatives advocated criminal justice systems characterized by strict laws and harsh punishment to control people’s evil or selfish impulses, while liberals favored systems that focus on rehabilitation to revitalize latent moral sensibilities. Conservatives traditionally opposed welfare spending believing its beneficiaries will live off the labor of society’s productive members, while liberals believed such investments help those who, often through no fault of their own, find themselves lacking the skills and knowledge needed to succeed. Though the philosophical roots of these policies are frequently forgotten today, these agendas continue to be embraced by liberals and conservatives.

College professors are philosophical conservatives. This is a product of their daily experiences, and it shapes their professional behaviors. First, the realm of experience: senior members of the profession are intimately familiar with the old excuse for failure to complete an assignment, “my dog ate my essay,” and its modern replacement, “my computer ate my essay.” Years ago, missed exams were blamed on faulty alarm clocks; today that responsibility has been shifted to dying iPhone batteries. Term papers cut and pasted from Wikipedia are an innovation; plagiarism is not. A philosophically liberal view of humanity is difficult to sustain amidst such behaviors.

The clearest manifestation of philosophical conservatism in the teaching profession is seen in tests and grades. Testing is based in part on the assumption that individuals will not work unless confronted by the negative consequences of failure, an outlook that is steeped in philosophical conservatism. (Among historians, who spend inordinate amounts of time examining wars and economic depressions, which often resulted from greed and avarice, their academic discipline itself encourages a philosophically conservative outlook.)

How then can academicians be accused of being liberal? As noted, this is partly because the majority of faculty are registered Democrats. Counterintuitively, this reflects a philosophically conservative and not a liberal outlook, especially relating to all-important economic policy. During the debate over the tax bill last year, Republicans continued their traditional support for supply-side or trickle-down economics by proposing to lower taxes on high earners and corporations, whereas Democrats continued to advocate demand-side economics by proposing to shift the tax burden from the large middle to the small upper class and to provide tax credits for workers. Supply-side economics is based on the assumption that reducing the tax burden on the rich will lead them to invest in plant and equipment which will create jobs, the advantages of which will trickle down to workers in the form of wages and benefits.

This is a philosophically liberal notion. It assumes that people and corporations will invest in plant and equipment even when wages are stagnant leaving many people without the income needed to purchase the goods new factories will produce. Demand-side economics, on the other hand, is partially based on the philosophically conservative notion that no rich person or corporation will build a plant, if the masses lack the income needed to buy the product. In support of this position today, demand-siders note that many corporations are using most of the surplus capital from last year’s tax cuts to buy back stock instead of investing in capital assets because wealth and income are more concentrated in the hands of the few than in recent history, which minimizes the purchasing power of the many. In supporting tax schemes and other economic policies that put money in the pockets of the many, demand-siders embrace the conservative idea that such programs will stimulate selfishly based investment spending by corporations in an attempt to tap the rising wealth of the majority of consumers. Moreover, demand-side policies will unleash the selfishly oriented entrepreneurial inclinations of working people by giving them the wherewithal to open small businesses that spur economic growth.

College professors and administrators are also attracted to Democratic economic policy because they are aware of the successes of demand-side economics. There has not been a major depression since the New Deal, though there have certainly been recessions. This is because that movement largely achieved its goal of shifting the weight of the government from supporting a supply-side to a demand-side approach to economics by institutionalizing minimum wages, overtime pay for work beyond forty hours a week, unemployment insurance, Social Security, and strong labor unions. This was not part of some left-wing socialist agenda. The goal was to put money into the hands of the many and thus incentivize the capital class to invest in new productive capacity, and more importantly to maintain demand and spending when the economy slows.

Academicians generally, and historians especially, realize that prior to the New Deal, depressions (aka panics) occurred every ten to twenty years and were exacerbated by wage cuts which reduced demand and led to further layoffs and wage reductions. Minimum wages and union contracts which guarantee a wage for the life of the contract have slowed the downward economic spiral that turned recessions into depressions by limiting wage cuts, while Social Security and unemployment insurance also slowed economic downturns by helping sustain demand as the economy slackened. Though the contemporary right often argues that New Deal programs sought to create a liberal safety net for the poor, academics realize those programs were less attempts to help individuals directly and more attempts to jump-start a stalled economy and to keep it humming in part by incentivizing the capital class to continue to invest in productive capacity.

Conservatives also label academics as liberals because of their attempts to encourage inclusiveness and discourage what some term hateful speech on campuses. To the extent that this is true, and it often seems exaggerated, it is rooted in philosophical conservatism. Academics realize that language has great symbolic power, and symbols have a tendency to generate emotional as opposed to rational responses which colleges and universities rightly scorn. Academics also recognize that negative symbolism, including language, has served to dehumanize groups, and dehumanization has often led to discrimination and persecution. Only philosophical conservatives can have so little faith in human reason and goodness as to believe that emotionally laden language has the power to perpetuate injustice.

Ironically, the right, in supporting both supply-side economics and in tacitly accepting ethnically insensitive and sexist language, is embracing policies rooted in liberal not conservative thought, while the university – in favoring the opposite – adheres to a philosophically conservative outlook. Indeed, a traditional conservative would argue that the appeal of supply-side economics and insensitive speech lies in their ability to protect the wealth of the rich and to sustain the increasingly fragile sense of dignity of the middle class.

 

Roundup Top 10!

Teacher strikes can’t fix the core problems with our schools
by Diana D'Amico
The forces that once led to the growth of suburban schools have led to the decay of their urban counterparts.

Kruse and Zelizer: It's 'Network' nation: How our media became overrun by polarization, outrage and attitude
by Kevin Kruse and Julian Zelizer
How the news has become sensationalized.

What We Can Still Learn From American History's First Special Prosecutor
by Andrew Coan
The Mueller investigation grew out of a rich, complicated and not always edifying history that even most legal scholars and historians have largely forgotten.

Here’s How Democratic Presidential Contenders Should (Not) Talk About Russia
by David S. Fogleson
Candidates gearing up for 2020 may be blazing new trails on domestic issues, but when it comes to engagement with Russia, they haven’t moved beyond the counterproductive status quo.

Math And Science Can't Take Priority Over History And Civics
by Natalie Wexler
In our rush to prioritize STEM subjects, we’re overlooking other fields that are even more important.

Trump’s Trade Policy Threatens US Consumer as Much as China
by Paul Ropp
Trump’s China policy ignores the complete interdependence of the US and Chinese economies.

Angela Davis and the Jewish Civil War
by Marc H. Ellis
The Black-Jewish alliance, at least what’s left of it, faces a common challenge of how memorialization works and for whom.

The Radical Tradition of Student Protest
by Mike Jirik
The student protests against anti-black racism at UNC Chapel Hill are part of a long history of student protest against racism that includes individuals like John Brown Russwurm.

Why Study History?
by Elizabeth A. Lehfeldt
To answer that question, Elizabeth A. Lehfeldt tells a pedagogical story in two parts.

Why Americans trust technology but not science
by Joyce Chaplin
Benjamin Franklin understood that the two go hand-in-hand.

State of the Union: What would Jefferson do?
by Karen Tumulty
Pelosi's proposal was not as radical as it might sound.

Lots of People Won New Rights in the 1960s, but Not College Women Athletes

Chris von Saltza, Olympic champion (photo by Harry Pot, Dutch National Archives, CC BY-SA 3.0 nl)

 

Today the #MeToo movement puts the spotlight on young women in college who have been abused without much recourse. Most media attention focuses on flagrant violations by men, such as date rape on campus, and extends to gender harassment by executives in the workplace. Looking back to the 1960s, however, another pervasive abuse was the benign neglect of colleges and universities. Women as students were treated inequitably in campus activities, especially in intercollegiate sports. Graphic examples can help us remember and learn from past practices.

Between August 26th and September 11th in 1960 Chris von Saltza stood on the victory podium at the Olympic Games in Rome four times to receive swimming medals, a total of three gold and one silver. She then entered Stanford University and graduated in 1965 with a bachelor’s degree in Asian history, gaining prominence in her long career as a computer scientist. After the 1960 Olympics Chris never had an opportunity to swim competitively for a team again. Stanford, after all, did not offer varsity athletics teams for women. What was a young woman to do? There was no appeal. For better or worse, this was the way things were in American colleges back then. 

In contrast to Chris von Saltza’s experience, over a half-century later another high school senior, American swimmer Katie Ledecky, won five medals at the 2016 Olympics held in Rio de Janeiro. She, too, was about to graduate from high school and would enroll at Stanford as a freshman in the fall of 2016. The historic difference was that she had a full athletic grant-in-aid plus a year-round national and international schedule of training and competition along with prospects for substantial income from endorsements and a professional athletics career. 

In 2018 there are pinnacles of success that indicate changes since the 1960s. Katie Ledecky has excelled as a student and works as a research assistant for a Stanford psychology professor. Ms. Ledecky also led her Stanford women’s swimming team to two National Collegiate Athletic Championships and recently signed a $7 million professional swimming contract.

Connecting the dots to explain the comparisons and contrasts of these two Olympic champion women swimmers who were students at Stanford requires reconstructing the condition of college sports for students in the decade 1960 to 1969. The lack of collegiate opportunities for Stanford’s Chris von Saltza was not an isolated incident. Following World War II American women had triumphed in the Olympic Games every four years - but with little base provided by high school or college sports. 

At the 1964 Olympic Games in Tokyo, the women’s swimming star was Donna DeVarona, who won two gold medals. In 1964, she was featured on the covers of both Time and Life magazines and named the outstanding women athlete of the year. Despite her achievements, her competitive swimming career was over, as she and other women athletes had few if any options for formal training and participation in intercollegiate sports or elsewhere.

Young women from the U. S. won gold medals in numerous Olympic sports. A good example was Wilma Rudolph, who won three gold medals in track and field at the 1960 Olympics in Rome. Rudolph benefitted from one of the few college track and field programs for women in the U. S., coached by Ed Temple. Most of their competition was at Amateur Athletic Union (AAU) meets, with no conference or national college championship meets available. Furthermore, at historically black Tennessee State University, funding and facilities were lean. 

The limits on women’s sports are revealed in college yearbooks of the era. A typical coeducational university yearbook devoted about fifty pages to men’s sports, especially football and basketball. In contrast, women’s athletics typically received three pages of coverage. In team pictures, the uniforms often were those of gym class gear. The playing format was for one college to sponsor a “play day” in which five to ten colleges within driving distance gathered to hold tournaments in several sports at once. Softball, field hockey, basketball, and lacrosse were foremost.

Coaches, usually women, received minimal pay. Most held staff appointments in physical education, where they taught activity classes. The women’s gym had few, if any, bleachers for spectators. Coaches of the women’s teams usually lined the playing fields with chalk, mopped and swept up the gymnasium floors, and gathered soiled towels to send to the laundry. One indispensable piece of equipment for a woman coach was a station wagon, as players and coaches piled in with equipment to drive to nearby colleges for games and tournaments. The women’s athletic activities often had their own director – yet another example of “separate but unequal” in intercollegiate athletics and all student activities. There was a perverse equality of sorts: all women students were required to pay the same mandatory athletics fee as male students, even though the bulk of it went to subsidize varsity teams that excluded women.

Despite the lack of intercollegiate sports for women in the 1960s there were some signs of life. One was the creation of alliances that eventually led to chartering a national organization, the Association for Intercollegiate Athletics for Women (AIAW), in 1971, with over 280 colleges as members. The first action the Division for Girls and Women’s Sports (DGWS) took was to establish the Commission on Intercollegiate Athletics for Women (CIAW) to assume responsibility for women’s intercollegiate sports and championships. 

One heroic figure associated with women’s sports to emerge in the decade was Donna Lopiano, who graduated with a degree in physical education from Southern Connecticut State University in 1968. She excelled in sports as a girl and was the top player picked in the Stamford, Connecticut Little League local draft. However, she was forbidden to play baseball with the boys due to gender restrictions in the league’s by-laws. Lopiano started playing women’s softball at the age of sixteen. After college, she was an assistant athletics director at Brooklyn College, coached basketball, volleyball, and softball, and then took on leadership roles in national women’s sports associations. Eventually she was Director of Women’s Athletics at the University of Texas, along with appointments in sports policies and programs. She also was one of the most honored athletes of her era. Her experiences, including exclusion from teams, shaped her dynamic leadership over several decades. 

The bittersweet experiences of women athletes such as Donna Lopiano, Chris von Saltza, Wilma Rudolph, and Donna DeVarona show that although the 1960s has been celebrated as a period of concern for equity and social justice, colleges showed scant concern for women as student-athletes. One conventional analysis is that the passage of Title IX in 1972 ushered in a new era for women in scholastic and college sports. That was an unexpected development. In congressional deliberations around 1970, neither advocates nor opponents of Title IX mentioned college sports. All sides were surprised when the issue surfaced in 1972. The National Collegiate Athletic Association opposed inclusion of women’s sports -- until it made an unexpected reversal in 1978. Many colleges were slow to comply with the letter or spirit of Title IX. As late as 1997 the burden was on women as student-athletes to file lawsuits against their own colleges, pitting them against athletics directors, presidents, boards, and university legal counsel.

Title IX eventually demonstrated how federal legislation could prompt universities to provide programs and services accessible to women that they would not have provided of their own volition. It has required contentious oversight of resources for student-athlete financial aid, training facilities, coaching salaries and other parts of a competitive athletics team. It includes television coverage of women’s championships in numerous sports. Equity and opportunity across all college activities, ranging from sports to fields of study along with hiring and promotion, remain uneven. The caution is that the experience of a Katie Ledecky at Stanford, including her professional swimming contract, is exceptional. Sixty years after Chris von Saltza won her four Olympic medals and entered Stanford, the inclusion of women as full citizens on the American campus remains a work in progress.

Separating Children from Their Parents Is an Anglo-American Tradition

 

The separation of children from their illegally migrant parents in the USA is seen as an aberrant and inhumane deviation from American tenderness for the family. This orphaning, as a matter of policy, is not “who we are,” as many liberals and some conservatives despairingly say.

But in many ways it is, and indeed, has long been. For the state, in the United States and earlier in Britain, has been a formidable creator of orphans. Perhaps this helps to explain the ambiguity in the attitude to the orphan: great display is made of theoretical pity and piety, but the way such children have been actually treated has frequently been punitive and repressive. Whether in orphanages, asylums, schools or other receptacles for those guilty of losing their parents, the extent of abuse by those in whose power such unfortunates have fallen is only now becoming clear.

Whatever charitable sentiments are kindled by the plight of orphans, such compassion has rarely prevented countries from making yet more of them by waging war, or by failing to prevent it, in those places – Syria and Yemen – where the indiscriminate harvesting of human life yields its sorry crop of abandoned children.

But it has not required war for governments, charities and even private individuals to rob children of their parents. From the first time a ship sailed from London to Virginia in the early 17th century taking “a hundred children out of the multitude that swarm about the place” until the last forced child migrants from Britain to Australia in 1967, thousands of young people were orphaned, not only of parents but of all ties of kinship, country and culture. The orphans sent from Britain to Australia alone numbered some 180,000.

A long association of derelict and orphan boys with the sea was formalized in a statute of 1703, which ordered that “all lewd and disorderly Man Servants and every such Person and Persons that are deemed and adjudged Rogues, Vagabonds and Sturdy beggars…shall be and are hereby directed to be taken up, sent, conducted and conveyed to Her Majesty’s Service at Sea.” Magistrates and overseers of the poor were empowered to apprentice to marine service “any boy or boys who is, are or shall be, of the age of ten and upwards or whose parents are or shall be chargeable to the parish or who beg for alms.”

Transportation removed 50,000 felons – among them many juveniles – to the American colonies, and in the process robbed many more children of at least one parent. In the 1740s, recruiting agents in Aberdeen sowed fear by luring children to service in the plantations. Peter Williamson and his companions, shipped to Virginia in 1743, were sold for sixteen pounds each. In 1789 the Lady Juliana, the first convict ship made up entirely of transported women and girls, set sail for Australia.

The historical fate of a majority of orphans is unknown. Many were taken in by kinsfolk or neighbors, and while many must have been fostered out of duty or affection, others were certainly used as cheap labor, for whom their foster-parents were accountable to no one.

It was not until the industrial era that the policy of removing children from their parents in the interests of society became widespread. The Poor Law Amendment Act permitted parishes to raise money to send adults abroad. One of the Assistant Commissioners claimed that “workhouse children had few ties to their land, and such as there were could be broken only to their profit.” In 1848, Lord Salisbury also advocated emigration for slum children.

Annie McPherson, Quaker and reformer, was the first private individual to organize mercy migrations, the rescue of children from their “gin-soaked mothers and violent fathers.” She set up a program of emigration in 1869. Dr Barnardo used Annie McPherson’s scheme, before implementing his own in 1882. He referred to “philanthropic abduction” as the rationale behind this disposal of the offspring of misery. 

At the same time, “orphan trains” carried children from New York and Boston to the open plains of the West, under the auspices of the Children’s Aid Society, established in 1853 by Charles Loring Brace. Sometimes children were “ordered” in advance, others were chosen as they left the train, or paraded in the playhouses of the small towns where farmers could assess their strength and willingness to work. These “little laborers” responded to a shortage of workers on farms. Between 1854 and 1929 a quarter of a million children were dispatched in this way.

In Britain, what were referred to as “John Bull’s surplus children” were promised a future of open air, freedom and healthy work. Some were undoubtedly well cared for; but others were exposed to exploitation, life in outhouses and barns, freezing in winter, stifling in summer, isolation and deprivation of all affection. The proponents of such schemes argued that this would provide them with a fresh start in life; but the cost of a one-way journey to Canada was far less than their maintenance by payers of the poor-rate. 

Joanna Penglase has called babies and infants taken from their mothers’ care for “moral” reasons, or simply because it was regarded as socially impossible for a woman to raise a child on her own, “orphans of the living.”

In 2010, the then British Prime Minister Gordon Brown apologized for the removal of children from their parents under the Fairbridge scheme, which took them to Australia, a practice which continued into the late 1960s. In 2008 Kevin Rudd, then Prime Minister of Australia, apologized to indigenous families whose children had for generations been removed. In 2013 the Irish Taoiseach apologized for the abuse of orphans and illegitimate children by the Magdalene laundries from 1910 until 1970. 

It is in this context that former Attorney General Jeff Sessions declared zero tolerance of illegal immigration in April 2018. All such people would be prosecuted. Families were broken up, because detention centers were “unsuitable” for children. In June, after harrowing scenes of forcible separations, Trump signed an executive order that families should be kept together. All children under five were to be reunited with their families within 14 days, and those over five within 30 days.

It might have been thought that the creation of orphans by government had been consigned to history. Was it amnesia or dementia that made the administration, in its determination to be tough on illegal migration, separate parents from their children in its retreat to a tradition of punitive indifference to the most vulnerable? 

And then, what of the orphans of addiction, of mass incarceration, the abductions of the offspring of the marriage of technology with commerce, orphans of the gun-fetish and the multiple social estrangements created by social media and the engines of fantasy which lure children from their parents, protectors and guardians? The orphan-makers have never been busier in this era of wealth and progress.

A Choir Sings Out Loud and Strong

The Charles R. Drew Prep School has been graduating gifted music majors for fifty years. It opened when the Vietnam War was raging in Southeast Asia. Throughout all of those years, and under all of its headmasters, it has maintained its prestige mainly because of its well-known choir, made up of some of the most skilled singers in America. Year after year, the boys would graduate after four years of superior singing and studies to enroll at America’s very best colleges.

Until now.

This year all hell breaks loose behind the ivy-covered walls of Drew. Gay boys in the choir fight with each other and the straight singers, too. Jealousies and hatreds rise to the surface. One talented singer, Pharus, insists that he is better than everybody else and struts across the stage all night. The problems are so great that a teacher is brought in to teach ‘creative thinking’ in an effort to restore calm to the choir and he fails at this job. What to do?

The play, which opened last week at the Samuel J. Friedman Theater in New York, is the story of eight singers - seven African Americans and a white boy, David, who has a fascination with Biblical heritage - an overly strong fascination. They sing together, they argue together and they make amends together in this very impressive play by Tarell Alvin McCraney, with soaring music and singing.

The play is a roller coaster of emotions and says a lot about youth history and racial history in many new ways. As an example, there is a marvelous discussion between the boys over what slave-era African American songs meant to the slaves in the 1850s and what they mean to African Americans today. Did the lyrics cry out for an escape from bondage then and now, or were they just lyrics and nothing more?

The highlight of the play is a searing argument over the ‘n’ word. The old, white creative thinking professor flies into a rage when one African American boy uses the word in yelling at another African American. The white professor tells them that they don’t know their history and the great struggle that has been going on for racial equality for hundreds of years.

The richness of the play, which also has some sharp humor, is not any one scene or one actor, though. It is the choreography by Camille A. Brown and the choir’s joyous singing of music by Jason Michael Webb. It is one of the best choreographed plays I have ever seen and the choreography is really, really different. The conclusion of each song brings joyous roars from the audience.

Choir Boy could be a drama about any prep school, or any high school, in America. It is about teenage boys growing up between classes amid a myriad of racial and sexual tensions. In the end, too, these teenagers, who thought school and life were so easy, are confronted with the severe penalties they have to pay for their behavior.

Director Trip Cullman has done a wonderful job of telling a taut drama full of angst and hope and, at the same time, weaving in the song and dance numbers. The result is a very pleasing show. He gets fine work from his performers -- Chuck Cooper as the headmaster, John Clay III as Anthony, Nicholas Ashe as Junior, Caleb Eberhardt as David, the only white singer, and J. Quinton Johnson as Bobby. The extraordinarily gifted Jeremy Pope plays Pharus. He is a wonder as both a singer and actor, and as a young man struggling with his homosexuality.

Veteran actor/director Austin Pendleton is sensational as the old white professor.  

The play has some minor problems. The plot is a bit choppy and you must pay careful attention to the story as it unfolds. There does not seem to be a strong reason to bring in the white teacher. There are pieces of the story that are left out. You never learn, as an example, whether this is a prep school that has a choir or a music school whose choir is an important part of the program. You are told that the white kid has to keep his grades up to stay in school but that the others, for some reason, do not. It is stressed that they are “legacies,” or students whose parents attended the school, and that they are safe no matter what they do (that’s not really true). The sex in the play comes and goes and you are not sure of people’s relationships and how they developed until late in the play.

Even so, Choir Boy is a powerful drama about the coming of age of a group of superbly talented and at the same time supremely distraught young men.

It is a song to remember.

PRODUCTION: The play is produced by the Manhattan Theatre Club. Scenic and Costume Design: David Zinn, Lighting: Peter Kaczorowski, Fight Direction: Thomas Schall, Music Director: Jason Michael Webb, Sound: Fitz Patton. The play is choreographed by Camille A. Brown and directed by Trip Cullman. It runs through February 24.

Is There a Statesman in the House?  Either House?  

 

It seems likely that 2019 will be one of the most challenging and consequential years in recent American history, perhaps on par with 1974. When Robert Mueller’s investigation of President Donald Trump concludes and his report is likely sent to Congress, a time of reckoning will be upon us. 

The United States will need leaders in both parties to display a quality that has been in short supply in our country in recent years: statesmanship.

Statesmanship is a pattern of leadership, and an approach to public service, that is characterized by vision, courage, compassion, civility, fairness, and wisdom. Statesmanship can involve bipartisanship, but it is not the same as bipartisanship. History proves that there can be a strong bipartisan consensus to enact harmful policies or to evade difficult alternatives. 

When statesmen consider public policy issues, their first question is, “what is in the public interest?” Personal and partisan considerations can follow later, but hopefully much later. If the national good is not identical to, or even clashes with, personal and partisan considerations, the former must prevail. 

Genuine statesmanship requires leaders to dispassionately consider issues, carefully weigh evidence, and fairly render verdicts, even if they go against personal preferences or are contrary to the desires of their political base. Given our current political climate it is easy to forget that statesmanship, while unusual, has been a critical feature of American politics and history. 

Republican senator Arthur Vandenberg played a pivotal role in the late 1940s in securing congressional approval of key elements of President Harry Truman’s foreign policy including the Marshall Plan and NATO. Margaret Chase Smith, a first term GOP senator from Maine, broke from party ranks in 1950 and challenged Joseph McCarthy and his demagogic tactics. Senate Republican Leader Howard Baker damaged his chances for the 1980 GOP presidential nomination by supporting the Panama Canal treaties that were negotiated by President Jimmy Carter. Republican Richard Lugar, then the chairman of the Senate Foreign Relations Committee, defied President Ronald Reagan in the mid-1980s and pushed for economic sanctions on the apartheid regime in South Africa. 

Decisions of similar gravity are likely to face political leaders in Washington in 2019. Democrats should fairly evaluate Mueller’s report as it pertains to alleged Russian collusion and obstruction of justice by the president. They should not overreach because of their antipathy to Trump, nor should they avoid their responsibilities, if Mueller’s findings suggest impeachable crimes, out of fear that such an action would complicate their 2020 prospects. Republicans must end their reflexive and unworthy tendency to overlook the president’s frequently egregious and possibly criminal behavior because Trump remains hugely popular with the GOP base. 

Senator Paul Simon, a consequential and successful public official in Illinois for more than four decades, worried during his final years that statesmanship appeared to be at low ebb. “We have spawned ‘leadership’ that does not lead, that panders to our whims rather than telling us the truth, that follows the crowd rather than challenges us, that weakens us rather than strengthening us,” he wrote. “It is easy to go downhill, and we are now following that easy path. Pandering is not illegal, but it is immoral. It is doing the convenient when the right course demands inconvenience and courage.”

Decades earlier, Senator John F. Kennedy wrote eloquently on the subject. In Profiles in Courage, he argued that politicians sometimes face “a difficult and soul-searching decision” in which “we must on occasion lead, inform, correct and sometimes even ignore constituent opinion.” Kennedy added that being courageous “requires no exceptional qualifications, no magic formula, no special combination of time, place and circumstance. It is an opportunity that sooner or later is presented to all of us. Politics merely furnishes one arena which imposes special tests of courage.” 

Those tests are coming in 2019.

Maestro Is Out of Tune

Arturo Toscanini was one of the greatest symphonic conductors in the history of the world. He first raised his baton at the age of 19 and pretty much kept conducting until his death at the age of 82. The Italian genius led orchestras in numerous nations and even the famous NBC Symphony Orchestra in New York. His orchestras played some of the greatest classical music ever written and had some of the planet’s great musicians in them. Among the singers he worked with were superstars Ezio Pinza and Enrico Caruso. A little-known chapter to most Americans was his running battle with Benito Mussolini, the fascist Italian dictator, Adolf Hitler and their goons from the early 1930s to the end of World War II.

Now that intriguing story is finally being told in a new play, Maestro, that opened last night at the Duke Theater, 229 W. 42d Street, in New York. It stars John Noble, includes a small orchestra that plays compositions by some of the great composers, and features a vivid historical video news clip show that tracks Mussolini’s takeover of Italy.

Maestro does not work, not at all. The legendary Maestro drops his baton in this drama, and that is sad because the story is so fascinating and inspiring. The play, though, is really out of tune.

In real life, Toscanini was publicly critical of both German Chancellor Adolf Hitler and Italian dictator Benito Mussolini, sided with the Jews all over Europe against the fascists and was condemned by the Italian government. The conductor eventually fled the country. Generally speaking, that story is told in the play, but most of the interesting aspects of his war with the fascists are left out by playwright Eve Wolf.

Toscanini refused to conduct in Italy because of his hatred for Mussolini and so he led orchestras in other countries nearby and his audiences there were enormous. That’s not covered sufficiently in the play.

The program notes tell the vivid story of how Toscanini helped Italian Jews escape transport to camps and how he even got them out of Italy and helped find them jobs in the U.S. That is not in the play.  

In 1931, six years into Mussolini’s reign, Toscanini was physically assaulted by fascists, but that story is not in the play. Musicians are quoted in the program notes recalling how Toscanini not only made them better musicians, but better people. That is not in the play.

He was one of the star conductors of Germany’s Bayreuth Music Festival, but he quit in 1931 as the Nazis marched towards power. In Italy, he refused to play the new fascist national anthem that Mussolini insisted upon, infuriating the Italian strongman. Again, not in the play.

Toscanini’s defiance of Mussolini never ended. As an example, he refused to ever show Mussolini’s photograph at concerts, as just about everyone else did. An angry Mussolini had Toscanini’s phone tapped and had him followed. Not in the play.

There are several problems with Maestro besides its thin history. First, it is a one-man show and Noble has to carry the whole drama on his shoulders. The tale cries out for other characters, such as his wife, Carla, the NBC brass who created their symphony just for Toscanini in 1937, his many musicians, music lovers and, of course, Il Duce himself. This is a great play waiting to happen. It does not happen at the Duke. 

For some bizarre reason, playwright Wolf slips in the five-piece orchestra to play numerous classical music pieces by Verdi, Wagner, Gershwin, Tedesco, and others. The music goes on and on and on. Actor Noble could have lunch between each piece. The musical interludes seem longer than Mussolini’s nearly 20-year reign in Italy. The orchestra has absolutely nothing to do with the story (the musicians in the orchestra, Mari Lee, Henry Wang, Matthew Cohen, Ari Evan, Maximilian Morel and Zhenni Li, are quite good, though).

The book is choppy. It starts with Toscanini yelling at “musicians” who are audience members and then he is off discussing the last years of his career, skipping over all the years, the decades, that made him so famous. 

Questions are never answered. Toscanini tells the audience that he and his wife had not slept with each other for over twenty years, that he did not see her from 1938 to 1945 and did not get a single letter from her for seven years. When he returns to Italy at the end of the war, he tells the audience he is stunned that she was not there to greet him. Huh? The play sort of leaves out the fact that during his marriage he had affairs with half the women in Italy and one third of the women in America. And the wife did not want to see him?

We are told that Mussolini has taken his passport, which means he can’t leave the country, but then he pops up in New York City with the NBC Symphony. How did he do that? At one point in the story he shows up in Palestine to lead an orchestra made up of Jewish refugees. How did he get there? Did someone invite him? Did he get off the train at the wrong stop? What?

He returns to conduct at La Scala, in Milan, but says he won’t walk the streets of Milan. Why?

Noble does a decent job portraying Toscanini in this play directed by Donald T. Sanders, but, unfortunately, he has a very weak book to work with and is unable to tell much about the fabled conductor whom Mussolini hated so much. You might have expected to learn much about history here, but you do not.

This play is like a concert in which much of the music is missing. 

PRODUCTION: The play was produced by the Ensemble for the Romantic Century. Scenic and Costume Design: Vanessa James, Lighting Design: Beverly Emmons & Sabastian Adamo, Sound: Bill Toles, Projection Design: David Bengali. The play is directed by Donald T. Sanders. It runs through February 9.

What I’m Reading: An Interview With Historian Carla Pestana

 

Carla Gardina Pestana is Professor of History at the University of California, Los Angeles and the Joyce Appleby Endowed Chair of America in the World. 

What books are you reading now?

I have a number of different books going at the moment. Disaffection and Everyday Life in Interregnum England by Caroline Boswell, a book I agreed to review, came to me because I listed myself as a military historian (among other categories) on the website Women Also Know History. The site compiles information about women historians and their areas of expertise in order to promote their work. Since it was the first request I had received that mentioned having found me there, I felt compelled to agree. Otherwise I seldom review books these days.

For recreational reading, I just finished Tigerbelle: The Wyomia Tyus Story, an autobiography of an Olympian. I don’t generally read either auto/biographies or modern history, but Ty is a family friend, so I made an exception. It’s an interesting account, especially for what it shows about her world, which became dramatically wider (she was raised in the rural South but traveled extensively as a result of her athletic success) as well as for the gender dynamics prevailing in the era when she was coming up as an Olympian.

More work-related but equally enjoyable has been reading Elena Schneider’s The British Occupation of Havana. I’ve been awaiting this study of the 1762 occupation of the supposedly impregnable Cuban port city. Through Schneider’s treatment we can see clearly that imperial boundaries were frequently crossed (in wartime as well as during periods of peace), and regional residents routinely failed to cooperate in efforts to close off one empire from another. That and her treatment of the role of slaves and free people of color in the defense and subsequent occupation of the city are profoundly illuminating. Hers is one of a spate of excellent new books on the early Caribbean.

All this is not to mention the doctoral dissertation on Haitian independence and land that I just finished or the many books and articles I am rereading in order to decide if I should assign them for my winter quarter class. There’s always too much reading to do. 

What is your favorite history book? 

This seems like an impossible question, because there are so many amazing books. I have recommended certain books to many people, so I guess that is one measure. Mr. Bligh’s Bad Language is a great book, and Greg Dening was a scholar I always admired as well as a lovely person. I used to teach Natalie Zemon Davis’s The Return of Martin Guerre, along with the related debates. I liked that book for how it shows the way the historian does her work. When I was a graduate student I read (in the same week in my first term) Christopher Hill’s The World Turned Upside Down and Perry Miller’s The New England Mind: From Colony to Province. This conjunction set me to thinking: how could both these realities have coexisted? My M.A. thesis (which was published in the New England Quarterly in 1983) represented my first attempt to answer that question; and my dissertation—on religious radicalism in early New England—followed and extended my effort to understand how England produced Quakerism and other forms of radicalism even as newly-founded New England embraced orthodoxy and policed its borders with violent results.

Recently, I have read a number of wonderful books about Caribbean history: Elena Schneider’s book on Havana, but also David Wheat’s Atlantic Africa and the Spanish Caribbean, 1570-1640 and Molly Warsh’s American Baroque: Pearls and the Nature of Empire, 1492-1700. I’m preparing to teach a new course on Atlantic history, so I have an enormous stack of such books to go through.

Why did you choose history as your career?

The simple answer: my undergraduate teachers suggested I try graduate school. Growing up I knew no professors, indeed nobody with a Ph.D. But I loved reading and history, so I was amenable to the suggestion. I have never looked back: I went directly into graduate school from undergrad, and carried right on through M.A., Ph.D., and—somewhat miraculously—to a first position that was a tenure-track job at a good school. It all worked out amazingly well, although it seemed a crazy path, an unimaginable future, at the time when I took it up. If I had known I would become so passionate about being a historian that I would go live in another state for decades, away from family and my beloved Los Angeles, I wonder what I would have done. But, after a long haul, I managed to come home, and take a position at my graduate institution. I now work 15 miles from where I was born, and not too many academics can say that. 

The more complex answer: I find entering an alternate world an interesting way to use my intellectual abilities. Immersing oneself in a particular time and place in order to come to know it well is a fascinating process. At times in my career I have become so thoroughly immersed in my research that I have gotten mixed up about the number properly assigned to the current month; during the era I study, the first of the year was in March, which made December (quite sensibly) the tenth month. On the rare occasion that I can stay in the seventeenth century for an extended time, I have to remind myself that September is not in fact (any longer) the seventh month. 

I once read an essay by Edmund Morgan who suggested that we focus our research questions on what doesn’t make sense. That insight and instruction strikes me as apt in that it’s the disjunctions, the perplexities, which draw the eye and demand to be explored. When we complete that exploration, we so often find something unexpected and revealing. I would take his observation one step further, to say that as we come to know a time and place well, we become more sensitive to the unexpected. Some projects, of course, dig into unknown topics and archives but most re-consider (through deeper research or new questions) already studied topics. I find my work always shifts back and forth between verities (often contained in the historiography) that need to be challenged, and archival sources that open up the possibility of answers. That tension keeps the intellectual life of the historian interesting. 

What qualities do you need to be a historian?

Well, I don’t know about all historians, but I am tenacious, organized, and detail-oriented. I have trouble taking no for an answer, so I just keep digging and trying to figure out what I want to know. While I have never minded (indeed I cherish) time spent alone at my desk or in an archive, struggling with writing and with research, I also thoroughly enjoy the opportunities that my work gives me for thinking with others. I enjoy talking to people—students, colleagues, or the public—about ideas and about the past. I am equally pleased by the solitary and the communal aspects of this work. Ideally you can do both, work alone and with others. I feel that being able to support oneself through work as a historian is a great privilege. 

Who was your favorite history teacher? 

Another difficult question, since I have benefited from the teaching and guidance of so many great history teachers. Besides my father, who was not a history teacher but was prodigiously intelligent and would answer all my childhood questions about history, a high school teacher leaps to mind. Milton (Mickey) Sirkus was a great teacher. When I think back now about his pedagogical approach, I have to laugh. In an honors history class I took with him in high school, he used teasing each of his students about her or his heritage as a way to engage us in U.S. history. It is hard to imagine a teacher today who would use such a hook, and even at the time it struck me as rather edgy. But we did defend our immigrant ancestors and relatives, and their contributions to U.S. history, more energetically than we might have otherwise, since he engaged us on a personal level. Until that class, I had never thought much about what my ancestors and older relatives had faced as the children of immigrants. He used sarcasm and teasing in a way I would not feel comfortable doing. As one example, when I wrote to him many years later to explain that I had gone on from his history class through college and graduate school to become a historian, he wrote back to welcome me to the ranks of the unemployed. It was in fact a terrible time for finding a history position in a university, but luckily he was eventually proved wrong on that score.

After my high school history class, I had excellent teachers at my undergraduate alma mater (people who pointed me toward graduate school) and in my graduate program too. I was fortunate to go to UCLA to pursue a graduate degree in early American history in the 1980s. I started working with Gary Nash, who was an amazing lecturer, galvanizing the undergraduates in his big classes, and an excellent editor, giving the best readings of my written work that I have encountered anywhere. My second year at UCLA, Joyce Appleby joined the faculty, and she was stunningly accomplished in all aspects of the work we do. She served as a role model for so many of us—so smart and no nonsense. I am so pleased to have a chair at UCLA now named in her honor.

What is your most memorable or rewarding teaching experience?

I used to employ a first-day exercise in smaller classes that both the students and I enjoyed. I’d ask them to write down and pass up their earliest historical memory, and then I would write all their answers on the board. Then we’d discuss the list from numerous angles, starting with what criteria they had used to decide an event was historical. We discussed what guides us in making that sort of a call, and what examples of formal historical writing might align with their choices. The exercise always resulted in a great first-day discussion, one that often ranged widely. I have to say, doing it also brought home to me the ages of my students, as I watched their earliest events move forward in time. I haven’t done that in a while, but I do remember it fondly.

These days I am enjoying the work I do at UCLA with transfer students. Some of the best undergrads I have taught here have been from the local community colleges, transferring in as juniors. They undergo a bit of culture shock, but at the same time they are eager, smart and enthusiastic. I have thoroughly enjoyed overseeing undergraduate honors theses with a handful of them.  

What are your hopes for history as a discipline? 

I work with so many smart, engaged young people that I am able to remain hopeful. It is easy to bemoan these anti-intellectual times and to worry about what will happen to the American university system and to our ability as a society and a culture to engage intellectually. Yet many people care deeply about learning, including learning about the past, and they work at the thinking and writing that we—whether as producers or consumers—need to keep history going as a discipline and as a form of knowledge. So in spite of the gloomy prognostications, I remain hopeful. History is a foundational component of a humanist education, and it is something that many people beyond the academy know to be valuable. I’m toying with writing a book for a popular audience in part to try to make some of the work we do in the academy more accessible and interesting to those outside it.  

Do you own any rare history or collectible books? Do you collect artifacts related to history?  

I don’t own any particularly rare or collectible books, although I do still have my beloved print copy of the OED—The Oxford English Dictionary—in two volumes, with its magnifying glass in the little drawer that allows me to read the many pages printed on each sheet.  

As for artifacts, I have received some fun items as gifts from former students. One gave me an old nautical sextant—appropriate to my work on maritime history and privateers—while another gave me a framed sheet out of an early edition of John Foxe’s The Actes and Monuments (better known as Foxe’s Book of Martyrs)—a gift relevant to my work on religion. I also have a counted cross stitch sampler that replicates one from the late seventeenth century. The original maker was a New England girl who grew up to join the Quaker meeting in Lynn, Massachusetts, a meeting and a community that I wrote about in my first book. My mother stitched it for me as a gift while I was writing a dissertation that included this girl, Hannah Breed, and other people from her community. 

What have you found most rewarding and most frustrating about your career?

When you ask about my career, I assume you mean my own personal triumphs and trials. If that’s the intention, I have to admit that I have been extremely privileged and lucky, so both the high points and the low occurred in that context. 

One of the most rewarding aspects of my privileged position has been being able to take all the time I wanted and needed to write a second book. I got tenure based on the first book, so despite the pressure to publish again quickly (and the harsh strictures from one department chair in particular about “frozen associate professors” who didn’t finish a second book promptly), I produced a second book (The English Atlantic in an Age of Revolution, 1640-1661) that differed drastically from my first. It took me forever to learn all that I needed in order to be confident about that book and to send it out into the world, but it was a better book for it. I am glad I did not bend to the pressure (whether self-inflicted, institutional, or otherwise) to be fast, and the tenure system allowed me that opportunity. That book might still be my personal favorite of those I have so far written, because of how far I had to stretch to write it. It didn’t help matters that I had two children over the course of researching and writing it, either. 

That is the perfect lead-in to the frustrations. Like many women in my cohort, I did experience the challenges of having babies at an institution with no pregnancy leave policy. My female colleagues thought I should go ask what arrangements would be possible, but the chair of the department looked at me blankly. It was aggravating, but because I didn’t have tenure the first time around, I just thanked him and left. I didn’t become better at advocating for myself the second time, either, even though by then I did have tenure. My children are in their early to mid-20s, so this was not all that long ago. At the time, most women academics of my acquaintance who were older than me did not have children, and if they did they often had them before they joined a department. If you found yourself in my situation, you were supposed to hope your baby arrived in the summer, best of all in early summer, so you could spend a little time at home; if the baby was born at a different time of the year, you might be allowed to teach an overload, bank some courses, and get a little time off that way. Some colleagues seemed to think that one should not try to be an academic and a mother. I managed, as did others, but the lack of support or even awareness was a source of frustration.

How has the study of history changed in the course of your career? 

I have been at this a while, so it has changed in various ways. In my own original field of early American history, when I was in graduate school my fellow students were doing the “New Social History,” studying various groups in society often using quantitative methods. The cultural turn had already overtaken literature departments but was just coming into historians’ awareness. Soon that became the dominant approach, but at the same time areas such as Native American history were blossoming too. Today it seems that some of those early seeds of the social history scholarship—especially its engagement with race, class and (eventually) gender—have paid big dividends, reshaping the ways we think about so many topics.

In my own historical scholarship, I have been most conscious of the shift in geographical frames. Today Atlantic history seems a bit passé, but the shift out from British North America felt startlingly true and profound at the time. When I was a graduate student, colonial America meant the thirteen colonies that became the United States, and the only external links that mattered were back to Britain. Most projects were framed within a single colony, and the bent toward social history meant detailed archival work within a relatively narrow geographical framework. Looking up from that narrow landscape to perceive the connectedness of various places not in North America and indeed not within the English imperial boundaries felt like a revelation.

What is your favorite history-related saying? Have you come up with your own?

I do not like the usual history sayings, because they often assume some simple connection between the past and the present that I perceive to be wrong. For instance, I don’t agree that history repeats itself, or, in George Santayana’s formulation, that “those who cannot remember the past are condemned to repeat it.” Even Karl Marx’s version, “History repeats itself, first as tragedy and then as farce,” doesn’t strike me as entirely accurate. The factors that shape our present are so complex and multifaceted that attempts to achieve or avoid a particular outcome usually set into motion numerous unintended consequences—that (more than the repeat nature of history or our ability to remember it and thereby keep it from repeating) is what strikes me most often as I study the intentions of historical actors.

I am rather more enamored of the L.P. Hartley observation, which points in the opposite direction: that “the past is a foreign country; they do things differently there.” The opening line of his novel could actually be read as a caution to those who look for repetition or for simple lessons, since doing so often involves ignoring the differences.

For the sheer pleasure of following its twisted history in our popular culture, I do rather like Laurel Thatcher Ulrich’s “Well-behaved women rarely make history.” I read it in its original context, before it developed a life of its own, in an article about what was considered proper behavior for women. Laurel meant it as a straightforward description of the cultural ideal: women were not to draw attention to themselves but to remain quietly in their prescribed roles. She was not issuing a call to revolution or advocating that women should misbehave and make history. But the quotation got picked up and flipped from its original meaning to its opposite. That reversal is fascinating, and I particularly love how people attribute it to various women (such as Eleanor Roosevelt) who purportedly said it to advocate that women make trouble and call attention to the need for change. Laurel has written a book on the whole phenomenon, in part to get all those who know her to stop sending her pictures of it misattributed on t-shirts, coffee mugs, and protest signs.

The strange history of that history quotation makes it fun. It remains ubiquitous, and I bet people still email Laurel about its odder appearances. I’ve long since quit doing so, although I continue to see it around. 

What are you doing next?

Well, I am chair of my department, so I am doing a great deal of university service. I care deeply about my department and my university, so I don’t mind giving some of my time over to this work. But that obligation does mean that I will produce less scholarship in the short term. I do have a book manuscript on Plymouth Plantation that I am trying to finish for the 400th anniversary of the Mayflower landing. It differs from anything I have done before, in that it is aimed at a popular audience. I was inspired to write it by an extended visit to the living history museum that reenacts Plymouth, having been brought in along with others to help the staff there to update their historical coverage. That experience got me thinking about Plymouth and how we Americans envision it. My impulse to create this work owes something to the fact that I wrote for a few years for the Huffington Post. Writing for a popular audience about the intersections between the past that I study and current events proved a challenging discipline; 800 words are very few (at least for the historian who writes 200-page books), and I found the need to respond quickly and in a focused fashion invigorating. I am trying to bring what I learned doing that to this new project.

As usual—as has been the case since the start of my career—I also have a little Quaker piece I am mulling over. My very first research project as a graduate student was on the Quakers, and I keep coming back to them with various questions and ideas. And finally, I mean to get back into the Jamaican archives, to follow up some of what I was doing with my previous book. So, lots to do, but not enough time to do it all. Isn’t that always the case?

Trump’s Bullshit-Savant Moment on Afghanistan

Once again, the President put his factually challenged relationship with the past on public display. In a January 3rd Cabinet meeting, Trump offered a tour de force of fanciful alternative Afghan history. According to him, the Soviets invaded in late 1979 because of cross-border terror attacks. The subsequent decade-long war, the President insisted, bankrupted the USSR and led to the collapse of the Soviet Union. Trump clearly had no idea what he was talking about. If the past is a foreign country, then in Trump’s parlance, he is an illegal immigrant trespassing upon it.

 

To briefly correct the President’s (mis)understanding of Afghan history: The Soviet Union invaded the country on December 26, 1979, ostensibly to support a friendly communist government under threat from a domestic insurgency provoked by unpopular reforms and the violent suppression of political dissent. Fearing the collapse of an allied regime on its southern border, the Soviets replaced the Afghan communist leadership with a more moderate and pliable cadre. Though initially planning for a swift withdrawal, Soviet forces soon found themselves sucked into a quagmire which proved impossible to escape. Over the next decade, they deployed roughly 100,000 troops, losing 15,000 of them, in a bloody counter-insurgency against the so-called mujahideen – American-supported ‘freedom fighters’. The war forced over 7 million to flee as refugees, created an unknown number of internally displaced persons, and killed, maimed and wounded an untold number of Afghans. The Soviet war ended with the Geneva Accords in 1988, allowing the USSR to feign ‘peace with honor,’ which covered an ignominious retreat.

 

The United States and its allies immediately denounced the Soviet invasion, which made Afghanistan a battleground in the increasingly hot Cold War. American policy-makers saw the potential of turning Afghanistan into the Soviet Vietnam. Beginning under the Carter administration, and ramping up significantly under Reagan, the US secretly funneled $3 billion to the Afghan mujahideen. By bleeding the soft underbelly of the beast, American Cold Warriors hoped to strike a mortal wound to the evil empire. Following the end of the Cold War, some conservative commentators characterized the Soviet defeat as a consequence of Reagan’s tough stance, which forced the Soviets to spend incessantly, and unsustainably, on defense. These analysts contend that the Afghan war, along with the cost of Soviet military aid to Central America and the US deployment of Pershing missiles in Europe, bankrupted the Soviet Union and led to its collapse.

 

It was this interpretation of history to which Trump’s stream-of-consciousness soliloquy rather clumsily tipped its hat. Nevertheless, the President’s alternative history almost immediately earned him a scathing rebuke from no less august a stalwart of the right than the editorial page of the Wall Street Journal. The Journal’s willingness to take him to task for a position loosely held by many on the American right over the years is notable. Doubly so for a publication which has repeatedly proven reluctant to fact-check the man.

 

Yet the Journal’s response is a non sequitur. What has been lost in the consternation provoked by the President’s remarks is the fundamental question which remains unanswered – namely, what the hell is the US doing in Afghanistan? Though he got his facts wrong – Trump does not seem to care about them anyway, and is thus the bullshitter-in-chief in the Harry Frankfurt sense – the essence of his question is correct. The US has lacked a clear policy on and purpose in Afghanistan since the early 2000s, making the President’s rambling, historically uninformed remarks something of a bullshit-savant moment.

 

Now entering its eighteenth year, the war is one of the costliest in American history, and the President has reportedly grown frustrated with a continuing conflict which he seemingly does not understand. While his ignorance provides fodder for detractors and evokes the concern of the national security establishment, it also allows him to ask basic questions regarding the purpose of that war which have long been considered settled within Washington circles of power. The President’s ignorance of Afghanistan, though extreme, is far from unique amongst the American policy establishment. Such ignorance is the consequence of a larger failing of American policy in the country – the lack of a clear publicly pronounced purpose and end-goal for the continued American presence in Afghanistan.

 

Despite nearly two decades of war in the country, American policy is largely driven by a noxious combination of inertia and sunk costs. A large part of the problem is that America’s civilian political leadership long ago abdicated its war-fighting responsibilities regarding Afghanistan. It is the role of civilian elected officials to formulate, articulate and communicate the fundamental purpose of an armed conflict and to direct the military and security apparatus of the government to execute that vision. But this has not been the case with Afghanistan. After the quick victory over the Taliban in 2001, America’s political attention quickly wandered elsewhere, most importantly to Iraq. The Afghan war has thus largely been farmed out to the generals, who have been left to fight without instruction in the war’s aims and purpose. The military has continued to do what the military knows best – fight a war. It is no wonder, then, that this conflict goes on with no end in sight.

 

What the hell is the US doing in Afghanistan? The President clearly does not know. But this is the central question, the one the President himself, along with the other elected officials of the US Government, needs to answer. It is neither the responsibility nor the place of the US military leadership to do so. In his bullshit-savant moment, Trump has set himself a challenge. Sadly, it is one he has demonstrated little interest in or ability to rise to.

 

From Birth of a Nation to Silent Sam: What History and Popular Culture Can Teach Us About the Southern "Lost Cause" and Confederate Monuments Today

Steve Hochstadt is a writer and a professor of history at Illinois College.

The thousands of monuments to the Confederacy and its leaders scattered across the South have become a national political controversy that shows no signs of abating. The decision of the City Council of Charlottesville, Virginia, to remove the statue of Robert E. Lee, mounted on his horse on a 20-foot-high pedestal in the center of town, prompted three public rallies of white supremacists in 2017. At the Unite the Right rally that August, James Alex Fields Jr. drove his car into a crowd of counterprotesters, killing one woman and injuring dozens of people. He has just been convicted of first-degree murder. The statue still stands.

 

Of the approximately 1,700 public memorials to the Confederacy, fewer than 100 have been removed in the past few years. These visible symbols represent the persistence of a cherished historical myth of American conservatives, the honor of the “Lost Cause” of the Civil War. Developed immediately after the defeat of the South in 1865, the Lost Cause relies on two claims: the War was caused by a conflict over states’ rights, not slavery, and slavery itself was an honorable institution, in which whites and blacks formed contented “families”.

Thus the political and military leaders of the Confederacy were engaged in a righteous struggle and deserve to be honored as American heroes.

 

This interpretation of the Civil War was a political tool used by Southern whites to fight against Reconstruction and to disenfranchise and discriminate against African Americans. Northern whites generally accepted this mythology as a means to reunite the nation, since that was more comfortable for them than confronting their own racial codes.

 

During most of the more than 150 years since the end of the Civil War, the Lost Cause reigned as the official American understanding of our history. The glorification of the Ku Klux Klan in the film “Birth of a Nation” (originally titled “The Clansman”) in 1915 was a landmark in the nationalization of this ideology. The newly formed NAACP protested that the film should be banned, but President Woodrow Wilson brought it into the White House, and the KKK sprang to life again that year in both North and South.

 

Not as overtly supportive of white supremacy as “Birth of a Nation”, “Gone With The Wind” in 1939 reinforced the Lost Cause stereotypes of honorable plantation owners, contented slaves unable to fend for themselves, and devious Northerners. It broke attendance records everywhere, set a record by winning 8 Academy Awards, and is still considered “one of the most beloved movies of all time”.

 

Generations of professional historians, overwhelmingly white, transformed the Lost Cause into official historical truth, especially in the South. Textbooks, like the 1908 History of Virginia by Mary Tucker Magill, white-washed slavery: “Generally speaking, the negroes proved a harmless and affectionate race, easily governed, and happy in their condition.” This idea prevailed half a century later in the textbook Virginia: History, Government, Geography, used in seventh-grade classrooms into the 1970s: “Life among the Negroes of Virginia in slavery times was generally happy. The Negroes went about in a cheerful manner making a living for themselves and for those for whom they worked.” A high school text went into more fanciful detail about the slave: “He enjoyed long holidays, especially at Christmas. He did not work as hard as the average free laborer, since he did not have to worry about losing his job. In fact, the slave enjoyed what we might call collective security. Generally speaking, his food was plentiful, his clothing adequate, his cabin warm, his health protected, his leisure carefree. He did not have to worry about hard times, unemployment, or old age.” The texts were produced in cooperation with the Virginia state government.

 

The Civil Rights struggles of the 1960s not only overturned legal segregation, but they also prompted revision of this discriminatory history. Historians have since thoroughly rejected the tenets of the Lost Cause. All the leaders in the South openly proclaimed that they were fighting to preserve slavery, based on their belief in the inherent inferiority of the black race. Both official and eyewitness sources clearly describe the physical, psychological and social horrors of slavery.

 

But the defenders of the Lost Cause have fought back against good history with tenacious persistence. In the international context of the Cold War, local journalists, academic historians, and forthright eyewitnesses who investigated and reported on the real race relations in American society became potential traitors. These “terrorists” of the 1950s cast doubt on the fiction of a morally superior America as it battled immoral Communism. The dominance of white Americans in every possible field of American life was also threatened by a factual accounting of slavery before, during, and after the Civil War.

 

Bad history persists because those in power can enforce it by harassing its critics. It was easy for the FBI and conservative organizations to pinpoint the academics, journalists, and film directors who dissented from the Lost Cause ideology. They could then be attacked for their associations with organizations that could be linked to other organizations that could be linked to Communists. Such guilt by association was easier to concoct because of the leading role played by American leftists in the fight against racism during the long 20th century of Jim Crow.

 

Thus did Norman Cazden, an assistant professor of music at the University of Illinois, lose his job in 1953. The FBI had typed an anonymous letter containing what Cazden called “unverified allegations as to my past associations,” and sent it to the University President. Cazden was among 400 high school and university teachers anonymously accused by the FBI between 1951 and 1953.

 

The defenders of the Lost Cause switched parties in my lifetime. Shocked by the white supremacist violence of the Civil Rights years, popular movements and popular sentiment forced both parties to end Jim Crow, using historical and political facts to attack all facets of white supremacist ideology, including the Lost Cause.

 

The shift of Dixiecrat Democrats to loyal Republicans is personified by Strom Thurmond, the South Carolina Senator and most prominent voice in favor of segregation, who switched from Democrat to Republican in 1964.

 

It still seemed appropriate in 2002 for the Senate’s Republican leader, Trent Lott, to toast Thurmond on his 100th birthday by saying Mississippi had been proud to vote for Thurmond for President in 1948, and that “if the rest of the country had followed our lead, we wouldn’t have had all these problems over the years, either.” None of the major news outlets, the “liberal media,” reported the remark, dwelling instead on the pathos of the famous old rich racist. Only a groundswell of criticism forced the mainstream media to recognize Lott’s words as a hymn to white supremacy.

 

By then, generations of Americans, both in the South and in the North, had absorbed the bad historical lessons that remain the basis for racist beliefs today. 

 

The Lost Cause lives on in the South, supported by federal and state tax dollars. An investigative report published in Smithsonian magazine revealed that the official sites and memorials of the history of the Confederacy still “pay homage to a slave-owning society and serve as blunt assertions of dominance over African Americans.” During the past decade, over $40 million in government funds have been spent to preserve these sites, originally created by Jim Crow governments to justify segregation. Schoolchildren continue to be taught Lost Cause legends.

 

Politics keeps bad history alive because its false narratives remain politically expedient. American white supremacists have been created and encouraged by this version of American history.

 

So the struggle over history goes on. Most recently, several dozen graduate teaching assistants at the University of North Carolina announced a “grade strike” to protest the University’s plan to spend $5 million constructing a new building to house a Confederate monument that protesters had pulled down in August. They are refusing to turn in students’ grades.

 

The Lost Cause story itself deserves an “F”, but it will persist as long as political leaders find its fictions convenient.

What Historians Are Saying About the Wall, the Shutdown, and Trump's Primetime Speech

Round Up Top 10!

A wall can’t solve America’s addiction to undocumented immigration

by Julia G. Young

For more than 70 years, undocumented immigrants have shaped the U.S. economy.

 

Nixon in Fiction: A Bizarre, Complex History

by Alan Glynn

And What We Should Expect for Our Current President

 

 

Brazilian Politics and the Rise of the Far-Right

by Daniela Gomes

What history can teach us about the election of President Jair Bolsonaro in Brazil.

 

 

Alexandria Ocasio-Cortez isn’t the first self-described socialist elected to the House.

by David Greenberg

Here's some historical perspective on socialists in Congress.

 

 

The hole in Donald Trump’s wall

by Tore Olsson

As long as Americans continue to flood into Mexico, the wall will do little to deter crossings.

 

 

Kevin Kruse and Julian Zelizer: Trump's Demise Will Be "Worse Than Watergate"

by Kevin Kruse and Julian Zelizer

If the multiple charges against Trump prove out, he’ll easily displace Nixon at the top of the Crooked Modern Presidents list.

 

 

The history of science shows how to change the minds of science deniers

by Ephrat Livni

By understanding the history of science, we can “keep the world from falling apart.”

 

 

Lincoln's Legacy and the Government Shutdown: The Suicide of a Great Democracy

by George Packer

A shutdown looks like the beginning of the end that Lincoln always knew was possible.

 

 

The Crisis of Imperialism And Why It Will Only Get Worse

by Tom Engelhardt

A tale of imperial power gone awry that could hardly have been uglier. Yet, it’s hard to imagine how things won’t, in fact, get uglier still.

 

 

Why Is Trump Spouting Russian Propaganda?

by David Frum

The president’s endorsement of the U.S.S.R.’s invasion of Afghanistan echoes a narrative promoted by Vladimir Putin.

 

 

No, Trump Cannot Declare an ‘Emergency’ to Build His Wall

by Bruce Ackerman

If he did, and used soldiers to build it, they would all be committing a federal crime.

 

Congresswoman Virginia Foxx Thinks We Should Abandon The Term "Vocational Training." I Disagree.

Last week, Congresswoman Virginia Foxx condemned some unnamed folks as “classist” for placing the “stigma” of inferiority on those who opt for a vocational or technical education. Indeed, even to place the adjective “vocational” in front of the term “education” implies such inferiority, Ms. Foxx asserts. But the Congresswoman misses the real import of education and misses, therefore, the distinction between education and training.

 

Unfortunately, Ms. Foxx is not alone in missing this distinction. In its true meaning, education must be about introducing young people to a knowledge of themselves and of the relation they have to society, the world, and the universe in which they live. We are all historical creations. We inherit our attitudes, beliefs, and values from the world around us. It may well be, for example, that the United States is the greatest country in the world, or that Christianity is the one true and only religion, or that the limits capitalism places on democratic control are divinely or naturally ordained. But young people born and raised in this country will believe this not because they have chosen such beliefs, but simply because the whole of the world around them tells them this is so. Only by deeply studying the real history of this country, only by understanding evolution and the history and the immensity of the universe, only by delving into literature and psychology, can young people begin to comprehend who they are as human beings in this country today. And only by sampling the wealth of human knowledge in all its varied fields can young people freely decide to pursue one path or another in making their life and making their living.

 

Of course, every state legislature in this country, in underfunding public education and pushing the agenda Ms. Foxx pushes here – calling training for a career the same thing as an education – effectively seeks to deny young people knowledge of themselves and the world in which they live. In so doing, they disempower our youth and simply make “education” into a means of churning out compliant men and women who will work for fifty or sixty years, and then die.

 

No, this country desperately needs educated young people. Those of us who insist that our students get a real education are not the “classists” condemned by Ms. Foxx. No, the real “classists” are those who would condemn our youth to the perpetual role of servants in a world ruled by wealth.

Yesterday was the 100th Anniversary of Theodore Roosevelt's Death. Here's How His Legacy Still Shapes the United States Today.

The beginning of 2019 marks the centennial of the death of the 26th President of the United States, Theodore Roosevelt, who passed away at age 60 on January 6, 1919. Roosevelt’s impact on America was massive, and it continues a century later. Here are five ways that Teddy Roosevelt’s legacy still shapes the United States today.

 

The first and most significant contribution of Theodore Roosevelt to his country was his commitment to and advocacy of conservation of the environment, including the promotion of national parks and national monuments, the protection of our natural resources for the long term, and an insistence that government and the people show respect and awe for the great natural wonders of the North American continent. Roosevelt is regarded as the premier figure who inspired the environmental movement, which fortunately was encouraged and accelerated by many of his White House successors, including Woodrow Wilson, Franklin D. Roosevelt, John F. Kennedy, Lyndon B. Johnson, Richard Nixon, Jimmy Carter, Bill Clinton and Barack Obama.

 

Second, Roosevelt emphasized the need for social justice and encouraged “progressivism” from the White House. He was committed to the cause of workers and consumers both in and out of office.  The need for responsible government regulation of corporations was a driving force in his life.  He sincerely believed that many problems in American society could not be resolved just on the state and local level, but needed a national voice for all of the American people—not just the wealthy and privileged.

 

Roosevelt also shaped the modern presidency, reviving the office after its decline in power and influence following Abraham Lincoln’s assassination. In doing so, he became the model for many future presidents, including Wilson, FDR, Harry Truman, Kennedy, Johnson, Nixon, Carter, Clinton and Obama. Presidential scholars in history and political science regularly rate Theodore Roosevelt as a “Near Great” President, ranked only behind Lincoln, George Washington, and FDR. It is quite a feat to hold such scholarly admiration and public renown for an entire century.

 

Fourthly, Roosevelt saw the absolute need to build the defenses of the United States against any future foreign threat. In particular, he loved the US Navy and was fascinated by it. He believed war was at times necessary to protect the great experiment in democracy and the constitutional framework set up by the Founding Fathers. As part of his perception of world affairs, Roosevelt saw the need for the building of the Panama Canal, and for the assertion of American authority over the Western Hemisphere, going past the wording of the Monroe Doctrine of 1823 with his Roosevelt Corollary of 1904 and his Big Stick policy toward Latin America. Unfortunately, this created a long-term image of the United States as an imperialist power, not well regarded or appreciated by the independent nations of the hemisphere.

 

Finally, Roosevelt, while promoting military and naval buildup for the protection of the nation, was also a great diplomat. His expansion of American diplomacy and relations with foreign nations helped expand American power in the early 20th century. He became very close to nations that would later become our allies—particularly Great Britain and France—and set a new standard for presidential engagement by negotiating the Treaty of Portsmouth, which ended the Russo-Japanese War of 1904-1905 and won him the Nobel Peace Prize in 1906. He also took a moral stand against any sign of aggression in the world, as he came to warn of the danger of German aggression at the time of the Morocco Crisis of 1905-1906. He spoke out against the pogroms going on in Czarist Russia during his Presidency, and worked to promote a peaceful co-existence between Japan and the United States in the Far East, due to his concerns over our territories of Hawaii, Guam, and the Philippine Islands.

 

These five positive contributions of Theodore Roosevelt have lasted and will continue to have an impact on the American Presidency and the future of the American nation.

 
