The History of Presidential Misconduct Beyond Watergate and Iran-Contra

 

In 1974, James Banner was one of 14 historians tasked with creating a report on presidential misconduct and how presidents and their families responded to the charges. The resulting report was a chronicle of presidential misconduct from George Washington to Lyndon B. Johnson, cataloguing corruption episode by episode with no connective tissue between administrations.

 

The historians delivered the report in eight weeks and it was prepared for distribution to the House Judiciary Committee. But then Richard Nixon resigned and the hearings it was designed for never happened. The report was published as a book but received very little attention; few American historians had even heard of it. “And thus it lay,” Banner stated.

 

Then, in August 2018, historian Jill Lepore called Banner and asked him about the book. Afterwards, Lepore wrote about the 1974 report in the New Yorker, and the coverage, along with surrounding political events, reignited interest in it. To bring the chronicle of presidential misconduct up to date, Banner identified and recruited seven historians to write new chapters so that a new version of the book would end with Barack Obama. Presidential Misconduct: From George Washington to Today was published by the New Press in July 2019.

 

At the American Historical Association’s annual meeting in early January, Banner moderated a panel with three of the book’s new authors: Kathryn Olmsted, Kevin M. Kruse, and Jeremi Suri. Examining the presidencies of Richard Nixon, Jimmy Carter, and Ronald Reagan, the historians discussed how the recent past shapes the current discussion of presidential misconduct.

 

Kathryn Olmsted examined presidential misconduct beyond Watergate in the Richard Nixon administration and argued that abuse of power and law-breaking were central to Nixon’s presidency.

 

Dr. Olmsted urged historians to remember just how unusual Nixon was. Popular accounts of Watergate often minimize the crimes and focus on the subsequent cover-up, but Nixon’s crimes were substantial and began before he was elected.

 

During the 1968 election, President Lyndon B. Johnson announced that if the North Vietnamese made certain concessions, LBJ would halt bombing campaigns and begin negotiations with the North Vietnamese. Nixon publicly agreed with LBJ’s stance, but privately he took action to sabotage the plan. Nixon enlisted Anna Chennault, a prominent Republican fund-raiser, as a go-between to encourage South Vietnam not to accept the negotiations. When North Vietnam signaled it would make the necessary concessions, many thought the war would end, and Hubert Humphrey, the Democratic candidate for president, started to do better in the polls. Then, the South Vietnamese indicated that the concessions would not be sufficient for them to negotiate.

 

LBJ knew Nixon had influenced the potential negotiations because he had instructed the FBI to listen to Nixon’s phone calls. LBJ believed Nixon’s actions amounted to treason, but because he did not have definitive proof that Nixon knew about the entire operation, he did not reveal the information publicly.

 

It is likely the Chennault Affair contributed to the paranoia that eventually led to the Watergate break-in. F.B.I. Director J. Edgar Hoover informed Nixon that LBJ knew of Nixon’s role in sabotaging negotiations with North Vietnam and Nixon became obsessed with the idea that the Democrats had information that could hurt him. 

 

Once Nixon was in office, his illegal behavior snowballed. He authorized secret bombings of Cambodia and warrantless wiretaps on news reporters, and he created the infamous “Plumbers.” The Committee to Reelect the President raised 20 million dollars--much of it acquired through bribery and extortion--and then used the funds for massive harassment and surveillance of Democrats during the 1972 election.

 

Thus, illegal behavior was central to Nixon’s conception of the presidency. Nixon himself explained this to David Frost in 1977. Nixon insisted he should have destroyed the tapes and maintained that “when a president does it that means it’s not illegal.”

 

Princeton historian Kevin Kruse discussed the presidency of Jimmy Carter. While Carter had a pronounced commitment to ethical government, a closer look at his presidency shows that even those who try to meet anticorruption standards can be brought low by their own efforts.

 

Even before Watergate, Carter presented himself as a political outsider. After Watergate, Carter capitalized on American concerns about a lack of morals in politics. Carter wanted to seem as trustworthy as possible on the campaign trail, and after he was elected he worked to maintain the public’s trust. Carter famously put his peanut farm into a blind trust, and the White House implemented stricter rules regarding conflicts of interest and financial disclosure.

 

Because of these promises, Carter’s family came under intense scrutiny and the constant hunt for dirt hurt the administration.

 

Carter asked Bert Lance, the incoming director of the Office of Management and Budget, to put his stocks in a blind trust. Lance agreed, but when the stocks began to plummet, Lance delayed doing so. In response, a Senate committee and the Comptroller General opened investigations into Lance. The investigations revealed sloppiness but concluded there was no wrongdoing. Nonetheless, Carter’s administration was consumed by the Lance investigation, and Carter stuck by his friend despite his staff’s recommendation to fire Lance. Finally, in the fall of 1977, Carter pressed Lance to resign; in retrospect, Carter realized he should have done so sooner.

 

Next, a scandal emerged centered on Carter’s peanut warehouse, which had been put in a blind trust. As the business had fallen on hard times, it was revealed that Bert Lance had once extended a loan to the warehouse. Many were concerned that the funds had been diverted to Carter’s campaign. A team of investigators reviewed 80,000 documents and Carter even gave a sworn deposition (the first time a president was interviewed under oath in a criminal investigation). The investigation concluded there was no evidence that funds had been diverted.

 

The third scandal of Carter’s administration centered on Billy Carter. Billy ran a gas station, and when the business started to struggle, he tried to capitalize on his brother’s fame. In September 1978, Billy took a trip to Libya seeking to make a deal with Muammar Gaddafi. While there, Billy made anti-Semitic comments and urinated on the airport tarmac. This all caused a great deal of embarrassment for Jimmy Carter. Worse, it was soon revealed that Billy had received hundreds of thousands of dollars in loans from Libya. The scandal was dubbed Billygate. After a Senate investigation, a bipartisan report concluded Billy had not done anything illegal.

 

These sloppy practices each invited close investigation, but in each case officials concluded the acts were not criminal. Nonetheless, these scandals overshadowed much of Carter’s presidency and demonstrate that Carter’s actions never lived up to his high-minded intentions.

 

Jeremi Suri, a historian at the University of Texas at Austin, discussed presidential scandal during the Ronald Reagan administration. Suri noted that he was surprised at how little attention historians have paid to presidential misconduct, likely because historians like to stay away from scandal and research more “serious” events. Suri, however, thinks that misconduct was central to policy for Reagan.

 

Scandal under Reagan is a paradox because Reagan personally was not corrupt—he did not personally profit from misconduct—and was averse to discussions he thought were unseemly. Nonetheless, because of his personal qualities, Reagan’s policies were built on a pyramid of misconduct, a “cocktail of corruption” centered on the intersection of deregulation, ideological and at times religious zealotry, lavish resources, and personal isolation from the daily uses of those resources.

 

In other words, the institutional structure of the executive branch created incentives for corrupt behavior. Strikingly, over 100 members of the Reagan administration were prosecuted and 130 billion dollars were diverted from taxpayer uses. 

 

Dr. Suri focused on a few particular scandals, starting with the Environmental Protection Agency. Reagan appointed Anne Gorsuch, the mother of Supreme Court Justice Neil Gorsuch, as the administrator of the EPA. Gorsuch directly negotiated contracts with land developers, and when she was investigated, she refused to turn over documents or testify and was held in contempt of Congress. Her deputy served two years in prison.

 

Secretary of the Interior James Watt was forced to resign after he made explicitly racist comments. After resigning, Watt used his connections to work with the Department of Housing and Urban Development and lobby for his friends to get contracts to build affordable housing that wasn’t actually affordable. Watt was later indicted on 25 felony counts.

 

As Reagan approached his second term, many advisors resigned and became lobbyists, flagrantly exceeding the legal limits on lobbying. Those who continued to work in the White House continued the corruption that had plagued the administration in its first term.

 

Attorney General Edwin Meese combined petty corruption with the gargantuan. Meese would try to get reimbursed twice for the same expenses. He took out personal loans from people who were bidding for government contracts, failed to disclose the loans, and then lobbied for the lenders to get the contracts. Meese was never convicted but resigned.

 

Assistant Secretary of the Navy Melvin Paisley stole 622 million dollars from the government. The F.B.I. concluded this was a consequence of large-scale appropriations with insufficient oversight. 

 

Suri argued that the Savings and Loan Crisis and the Iran-Contra scandal emerged from an administrative culture that was unregulated, permissive in the misuse of resources, and lavish in spending. He concluded that this structural corruption has not gone away.

 

To conclude the panel, James Banner offered a thoughtful comment connecting the history discussed to the present impeachment of President Donald Trump. Nixon was a pioneer in orchestrating misconduct from the Oval Office. Reagan pioneered allowing a shadow administration to implement policies that were not approved by Congress. To Banner, it seems that the Trump administration is doing both of these things at the same time.

The History Behind the Border and Immigrant Detention Centers

How can history help us understand the detention of undocumented immigrants along the Southern border? At the American Historical Association’s annual meeting earlier this month, four historians examined this issue from different angles on a panel entitled “Late Breaking: The Border Crisis in Historical Perspective.”

 

First, W. Fitzhugh Brundage, a professor at the University of North Carolina at Chapel Hill, examined how the history of human rights and torture illuminates the detention of migrants.

 

All refugees are guaranteed broad protections under international law. These protections include a promise that asylum seekers will not be returned to the countries they are fleeing, as well as protections for minor children and families.

 

In reality, these protections proved less consequential than hoped. International law provided no monitoring methods or formal enforcement mechanisms; instead, it relied on the good will of signatory nations. The United States adopted the narrowest definitions of asylum and persecution, so they applied to only a small portion of migrants.

 

In 2005, under George W. Bush’s administration, the United States began deporting apprehended migrants using “expedited removal.” Operation Streamline established zero tolerance towards unauthorized border crossings, prosecuting them as misdemeanors. Deportations subsequently skyrocketed. Barack Obama expanded the policy, and the number of people prosecuted for unauthorized border crossings quadrupled.

 

Now, President Donald J. Trump has waged an unprecedented campaign to delegitimize and dehumanize asylum seekers. Trump casts the southern border as a national security dilemma. Historically, national security has always been the largest justification for violating civil liberties and human rights.

 

For much of American history, immigration was not considered a law enforcement issue. At the end of the 19th century, the government created formal oversight of immigration and placed it in the Department of the Treasury. In 1903, the duty moved to the newly created Department of Commerce and Labor. Then, in 1940, immigration oversight moved to the Department of Justice. In 2003, it moved to the Department of Homeland Security, solidifying immigration as a matter of national security. This latest move is particularly troubling because the Department of Homeland Security has very little oversight, making it hard for lawmakers to regulate its actions.

 

Recent news coverage has illuminated the systematic and pervasive human rights violations of agencies like U.S. Immigration and Customs Enforcement (ICE). Most egregiously, families are separated and people are indefinitely detained. Inhumane conditions, especially overcrowding, have amplified the problem.

 

The intent of such policies is to make conditions in immigration detention so bad that asylum seekers will think twice about seeking asylum in the United States.

 

This indefinite detention is a glaring violation of the UN’s ban on torture. Discussions of torture always involve debates over what torture actually is, and some defend such practices by dismissing any suggestion that they amount to torture. The controversy over applying the term “concentration camps” to the immigrant detention facilities shows the tension of the moment.

 

The bottom line, however, is that the DHS and DOJ are committing huge violations of human rights. Trump will say these policies are necessary for national security, but the evidence shows that asylum and immigration enforcement are meant to intimidate, humiliate, and terrorize. 

 

Lauren Pearlman from the University of Florida discussed the economics of immigration detention with a specific focus on the role the private prison industry plays in immigration policy. 

 

Ellis Island, opened in 1892, was the first detention center created for immigrants. It wasn’t until the 1980s, however, that large numbers of people began to be detained.

 

In 1988, Congress passed the Anti-Drug Abuse Act, which mandated the detention of noncitizens convicted of felonies. The aftermath of September 11, 2001, cemented the securitization of immigration detention policy. The newly created Department of Homeland Security took over the duties previously administered by the Immigration and Naturalization Service (INS) under the Department of Justice.

 

The INS was one of the first federal agencies to use private prison operators. By 1988, 800 of the 2,400 people in INS custody were housed in private facilities.

 

Andrea Miller, an anthropologist at the University of California, Davis, examined drones, air policing, and immigration. Dr. Miller asserted it was important to understand the longstanding connection between war power and police power. Atmospheric policing technologies (like helicopters, airplanes, and balloons) have long been used to control populations and are connected to tear gas, sonic weaponry, and similar tools.

 

Recently, U.S. Customs and Border Protection (CBP) announced it would test a small drone system weighing just 14 pounds. Incorporating this technology could “enhance situation awareness,” a military term that denotes the ability to extend police perception beyond human capability.

 

The use of drones also broadens the extensive system of immigration surveillance. Today, undocumented immigrants are most likely to be apprehended in a traffic stop. ICE has access to data from license plate scans and readers. This data is stored and can be requested to create snapshots of where people live, travel, and so on. This mirrors the pattern-of-life analysis utilized in the War on Terror: data points are placed in relation to each other to reveal patterns of mobility and to assess whether someone is a threat.

 

Dr. Miller is also interested in the connections between military tactics and tools and local police forces. For example, the Los Angeles Police Department first acquired drones in 2014. By 2017, the LAPD had posted suggested guidelines for its unmanned aerial vehicle (UAV) pilot program, which has since been formally adopted. To Miller, it is telling that police and immigration agencies are adopting similar technologies adapted from the military.

 

Finally, Stuart Schrader of Johns Hopkins University discussed El Salvador and the history of border patrol aid. 

 

Dr. Schrader examined the legacy of aid and immigration in the context of El Salvador. As Trump threatened to cut off aid to Central America, news circulated in September 2019 that El Salvador had received enough American assistance to create a new border control agency. The agency would consist of 400 officers, 300 of whom would be dedicated to immigration, with the other 100 serving as part of a national police force.

 

The United States has offered technical assistance for border control for a long time. In the 1950s and ’60s, the State Department worked with El Salvador to improve surveillance at its borders.

 

U.S. assistance has always been designed to meet American policy goals. Previously, the U.S. Border Patrol was central to state building and to fighting the Cold War in the Third World. Today, the U.S. is providing assistance to places like El Salvador to keep people inside their country and prevent them from immigrating to the U.S. The irony is that the U.S. is responsible for creating many of the problems that Salvadorans are trying to escape, and attempting to prevent people from leaving creates more pressure on El Salvador.

 

Each of these perspectives offers a historical frame to help us understand the deep historical trends that shape daily headlines. 

Roundup Top 10!  

Why No GOP Senator Will Stand Up to Trump

by Garrett M. Graff

Barry Goldwater had the power to tell Nixon it was all over. But don’t expect a repeat this time.

 

Martin Luther King Jr. on Making America Great Again

by Justin Rose

“I’d like somebody to mention that day that Martin Luther King, Jr., tried to give his life serving others…I want you to say that I tried to love and serve humanity.”

 

 

The Injustice of This Moment Is Not an ‘Aberration’

by Michelle Alexander

From mass incarceration to mass deportation, our nation remains in deep denial.

 

 

The National Archives' dangerous corruption of history

by David Perry

While the National Archives issued an apology and vowed to undergo "a thorough review" of its policies after the Washington Post first reported on the alteration, having discovered it by chance, as a historian I worry about how many other altered documents the Trump administration has buried in our records. Will we ever know?

 

 

The Road to Auschwitz Wasn't Paved With Indifference

by Rivka Weinberg

We don’t have to be ‘upstanders’ to avoid genocides. We just have to make sure not to help them along.

 

 

What Antiabortion Advocates Get Wrong About The Women Who Secured The Right to Vote

by Reva Siegel and Stacie Taranto

The most famous suffragists largely weren’t antiabortion and wanted women to have more control over their bodies.

 

 

Universities must open their archives and share their oppressive pasts

by Evadne Kelly and Carla Rice

The archives of academic institutions can tell previously untold stories of eugenics. Universities can begin to undo oppressive legacies by opening them to artists and communities.

 

 

The Neighborhoods We Will Not Share

by Richard Rothstein

Persistent housing segregation lies at the root of many of our society’s problems. Trump wants to make it worse.

 

 

A Matter of Facts

by Sean Wilentz

The New York Times’ 1619 Project launched with the best of intentions, but has been undermined by some of its claims.

 

 

Pence's outrageous op-ed holds deeper meaning

by Jeremi Suri

Vice President Mike Pence published a powerful, but deceptive article in Friday's Wall Street Journal.

 

 

 

Charlotta Bass for Vice-President: America’s Two-Parties and the Black Vote

by Denise Lynn

Bass, an influential political activist, ran for Vice-President on the Progressive Party ticket in 1952. Her campaign demonstrated some of the shifting loyalties of Black voters and the failure of both political parties to address the needs of the Black community.

Reflecting on Martin Luther King Jr.'s Dream and Legacy

I went to the annual city-sponsored celebration of Martin Luther King, Jr., on Monday. Jacksonville’s Mayor, Andy Ezard, inaugurated these yearly breakfasts a decade ago. Every year a speaker helps us think about what MLK said, what he wanted to happen, and how he lived. There is often music, and we sing “Lift Every Voice and Sing,” a hopeful song: from “the dark past,” “the gloomy past” “that with tears has been watered,” to the present, “the place for which our fathers sighed,” to the future, “Let us march on till victory is won.”

 

Some years, that song and the celebration around it do lift hope, because the present is evolving to a brighter future that is joyous to imagine. But not today.

 

Today hope means believing that we will soon stop going backwards, that this moment is just a hesitation on the journey toward the unity we seek. Sometimes hope makes way for despair about how things might get worse.

 

What American governments since the 1960s have created in order to undo centuries of prejudice, discrimination, and persecution, the Republican Party is dismantling. I don’t say that Trump is doing this, even though his name is on every new policy of his administration, because he is not alone. Republican politicians across the country are doing this work themselves, defending the work of their colleagues, and pledging allegiance to the man who is leading the charge backwards.

 

Two authors of distinguished books on the history of race in America just wrote articles for the New York Times for Martin Luther King Jr.’s birthday, which tell us where we are and what is being done in our names. Michelle Alexander published “The New Jim Crow: Mass Incarceration in the Age of Colorblindness” 10 years ago, when Trump was just a glittery real estate con man. She showed how the explosion of the number of Americans in jail in the wake of the “war on drugs” was at its heart “another caste system — a system of mass incarceration — that locked millions of poor people and people of color in literal and virtual cages.”

 

The numbers must be printed to make their proper impression. These are careful estimates only, because fuller data does not exist, but they are the best we have. Between 1980 and 2010, the proportion of Americans in prison tripled to 1%, and the proportion of African Americans in prison almost tripled as well, from 1.3% to 3.1%. The racial disparity remained about the same, even as American governments imprisoned so many more Americans. In 2010, one out of three black men had a felony conviction in his past, and one of every ten was in prison or on parole. That was true for only one in fifty of the rest of the population.

 

This happened in Boston, where African Americans were subject to police observations, interrogations, and searches at seven times the rate of the rest of the population. It happened in Charlottesville, Virginia, in 2017, when African Americans were nine times as likely to be subject to police investigative detentions. And so on.

 

Alexander shows that both Democratic and Republican political leaders gave the US the dubious distinction of having more than one fifth of the world’s prisoners and the highest incarceration rate in the world: 756 per 100,000, while most countries imprison fewer than 150 per 100,000. Both Boston and Charlottesville were dominated by Democrats.

 

Now she delivers a shorter message: our nation must move back to the path toward racial justice from the detour we are taking. Obama and the national Democratic Party did not do enough to reverse those trends, but that is a long way from what has happened in the past three years. She is clear that the transition from Obama to Trump moved us from a hopeful discussion of racial reform to an era of white supremacy, clothed as a return to greatness.

 

Richard Rothstein published “The Color of Law: A Forgotten History of How Our Government Segregated America” three years ago. He also covered the long history of discrimination that Alexander described, but from the point of view of housing segregation. In fine detail, Rothstein explained how the federal government throughout the 20th century, under Democrats and Republicans, used its vast financial powers to promote further residential segregation, notably in the new postwar suburbs. Where I grew up, in the giant Levitt developments on Long Island, the federal government insured Levitt’s loans on the condition that African Americans would be excluded from buying his houses.

 

This has been the American way for centuries, putting an unfair economic burden on African Americans. The year 1968 appeared to put an end to federal complicity in the segregation of American housing. Through the Fair Housing Act, included in the Civil Rights Act of 1968, groups who are discriminated against in anything to do with housing can use the legal system to demand redress. That act was passed in the wake of MLK’s assassination.

 

But discrimination continues, taking less obvious forms. An example of how this occurs out of our sight comes from Syracuse. Since 1996, property has not been reassessed in the city, which seems like merely local government incompetence. But since then, the values of homes in white neighborhoods have risen much faster than homes in black neighborhoods. Reassessment would shift some of the weight of property taxes toward those much more valuable white homes. Not doing anything means that black homeowners have been paying an increasingly disproportionate share of property taxes. The city government in Syracuse is dominated by Democrats.

 

Rothstein’s article in the NY Times shows how Secretary of Housing and Urban Development Ben Carson is directing his vast bureaucracy and billions of dollars away from the process of desegregation. Since he was a candidate for President in 2016, Carson has argued that efforts to fix racial segregation are bad “social engineering.” Now HUD is trying to make it impossible for residents in a community like Syracuse, where government or business policies discriminate against racial minorities, to prove that in court. It is one of the far-reaching policies of the Trump administration that make fighting discrimination more difficult.

 

These are pieces in today’s national puzzle of race. Martin Luther King has missed more than 50 years of change in race relations, in party politics, in the American landscape. But his yet unrealized dreams can still inspire hope.

 

Hardly anything is more worth fighting for.

What Will the Museums of the Future Be Like?

Help the Smithsonian’s National Museum of American History by taking this brief survey to share what you’d like to see from *your* National Museum. 

To access the survey, click this link: https://s.si.edu/YourOpinionMatters.

To access the survey in Spanish, click this link: https://s.si.edu/TuOpiniónImporta.

Four Speeches by Dr. King That Can Still Guide Us Today

 

(Author's note: On January 9, 2020 I delivered the Martin Luther King, Jr. tribute lecture at the Uniondale Public Library in Uniondale, New York.  The presentation focused on four speeches by Dr. King that illustrate his concerns and suggest what his views on the world today might have been.)

 

Thank you for inviting me to speak today at the Uniondale (NY) Public Library Martin Luther King, Jr. commemoration. It is a great honor. I speak as a Hofstra University educator, as a historian, as an activist, but also as a white man committed to social justice. I want to share my ideas about Dr. King’s legacy, but I also came tonight because I want to hear yours. In my talk I will use Negro rather than Black or African American because that is how Dr. King referred to himself.

 

In the tradition of Dr. Martin Luther King, Jr., I am here today to recruit you. At a time when democracy itself is under threat from a renegade President and his rightwing, often racist supporters, the United States desperately needs a renewed King-like movement for social justice. The world also desperately needs an expanded King-like movement for climate action. Where we are sitting now may be under water by the end of the 21st century, or even sooner. These are the movements I am recruiting you to join. 

 

I am not recruiting you to a particular ideology, political program, or point of view. The world is constantly changing – we are all constantly changing. But I am recruiting you to become compassionate human beings with respect for diversity in the tradition of Dr. King. We need to be concerned with the needs of others who share this planet with us, to recognize their humanity, and to understand that they want the same things that we do for themselves and their families -- adequate food and housing, decent education and medical care, safety, and hope for the future. We need to understand that just because someone does not live your way, practice your religion, or make the same choices that you make, does not mean they are wrong or less than you. These are some of the lessons I have learned from Dr. King.

 

The African American Civil Rights Movement in the United States was a major world historic event that motivated people to fight for social justice in this country and others. Its activism, ideology, and achievements contributed to the women’s rights movement, the gay and lesbian rights movement, the struggle for immigrant rights, and the anti-war movement in this country. It inspired anti-Apartheid activists in South Africa and national liberation movements in Third World countries. 

 

The traditional myth about the Civil Rights Movement, the one that is taught in schools and promoted by politicians and the national media, is that Rosa Parks sat down, Martin Luther King stood up, and somehow the whole world changed. But the real story is that the Civil Rights Movement was a mass democratic movement to expand human equality and guarantee citizenship rights for Black Americans. It was definitely not a smooth climb to progress. Between roughly 1955 and 1968 it had peaks that energized people and valleys that were demoralizing. Part of the genius of Dr. King was his ability to help people “keep on keeping on” when hope for the future seemed its bleakest.

 

While some individual activists clearly stood out during the Civil Rights Movement, it involved hundreds of thousands of people, including many White people, who could not abide the U.S. history of racial oppression dating back to slavery days. It is worth noting that a disproportionate number of whites involved in the Civil Rights movement were Jews, many with ties to Long Island. In the 1960s, the Great Neck Committee for Human Rights sponsored an anti-discrimination pledge signed by over 1,000 people who promised not to discriminate against any racial or ethnic groups if they rented or sold their homes. They also picketed local landlords accused of racial bias. The Human Rights Committee and Great Neck synagogues hosted Dr. King as a speaker and raised funds for his campaigns on multiple occasions.

 

King and Parks played crucial and symbolic roles in the Civil Rights Movement, but so did Thurgood Marshall, Myles Horton, Fannie Lou Hamer, Ella Baker, A. Philip Randolph, Walter Reuther, Medgar Evers, John Lewis, Bayard Rustin, Pete Seeger, and Presidents Eisenhower and Johnson, as well as activists who were critics of racial integration and non-violent civil disobedience such as Stokely Carmichael, Malcolm X, and the Black Panthers.

 

The stories of Rosa Parks and Martin Luther King have been sanitized to rob them of their radicalism and power. Rosa Parks was not a little old lady who sat down in the whites-only section of a bus because she was tired. She was only 42 when she refused to change her seat and made history. In addition, Parks was a trained organizer, a graduate of the Highlander School where she studied civil disobedience and social movements, and a leader of the Montgomery, Alabama NAACP. Rosa Parks made a conscious choice to break an unjust law in order to provoke a response and promote a movement for social change.

 

Martin Luther King challenged the war in Vietnam, U.S. imperialism, and laws that victimized working people and the poor, not just racial discrimination. When he was assassinated in Memphis, Tennessee, he was helping organize a sanitation workers union. If Dr. King had not been assassinated, but had lived to become an old radical activist who constantly questioned American policy, I suspect he would never have become so venerated. It is better for a country to have heroes who are dead, because they cannot make embarrassing statements opposing continuing injustice and unnecessary wars.

 

The African American Civil Rights Movement probably ended with the assassination of Dr. King in April 1968 and the abandonment of Great Society social programs by the Democratic Party, but social inequality continues. What kind of country is it when young Black men are more likely to be involved with the criminal justice system than enrolled in college, when inner-city youth unemployment at the best of times hovers in the high double digits, and when children who already have internet access at home are the ones most likely to have it in school? What kind of country is it when families seeking refuge from war, crime, and climate disruption are barred entry to the United States or put in holding pens at the border? These are among the reasons I am recruiting everyone to a movement for social justice. These are the things that would have infuriated Martin Luther King.

 

I promised I would share excerpts from four of Dr. King’s speeches. Everyone has the phrases and speeches that they remember best. Most Americans are familiar with the 1963 “I have a Dream” speech at the Lincoln Memorial in Washington DC and the 1968 “I’ve been to the Mountaintop” speech in Memphis just before he died. These are four other speeches that still resonate with me the most today.

 

The first speech I reference is one for local Uniondale, Long Island, and Hofstra pride. In 1965, Dr. King was honored and spoke at the Hofstra University graduation. It was less than one year after he received the Nobel Peace Prize and three years before his assassination. In the speech Dr. King argued “mankind’s survival is dependent on man’s ability to solve the problems of racial injustice, poverty and war” and that the “solution of these problems is . . . dependent upon man squaring his moral progress with his scientific progress, and learning the practical art of living in harmony.” I have no doubt that if Dr. King were alive today, he would be at the forefront of the Black Lives Matter movement, demands for gun control, climate activism, and calls for the impeachment of Donald Trump. 

 

In his Hofstra speech, Dr. King told graduates, families, and faculty, “we have built machines that think, and instruments that peer into the unfathomable ranges of interstellar space. We have built gigantic bridges to span the seas, and gargantuan buildings to kiss the skies . . . We have been able to dwarf distance and place time in chains . . . Yet in spite of these spectacular strides in science and technology, something basic is missing. That is a sort of poverty of the spirit, which stands in glaring contrast to our scientific and technological abundance. The richer we have become materially, the poorer we have become morally and spiritually. We have learned to fly the air like birds and swim the sea like fish. But we have not learned the simple art of living together as brothers.”

 

Always a man of hope, as well as of peace, Dr. King concluded that he had faith in the future and that people would “go all out to solve these ancient evils of mankind,” but he also acknowledged that the struggle would be difficult and that “there is still a great deal of suffering ahead.” He then challenged the graduates to become “an involved participant in getting rid of war, and getting rid of poverty, and getting rid of racial injustice. Let us not be detached spectators or silent onlookers, but involved participants.” Dr. King would have been here to recruit you too.

 

Letter from Birmingham Jail was not originally given as a speech, but Dr. King recorded it later, so I am including an excerpt here where he explained the legitimacy and urgency of direct political action, the kind that brought down apartheid in South Africa and that we see from young people in Hong Kong today.

 

"Why direct action? Why sit ins, marches and so forth? . . . Nonviolent direct action seeks to create such a crisis and foster such a tension that a community which has constantly refused to negotiate is forced to confront the issue. It seeks so to dramatize the issue that it can no longer be ignored . . . The purpose of our direct action program is to create a situation so crisis packed that it will inevitably open the door to negotiation . . . We know through painful experience that freedom is never voluntarily given by the oppressor; it must be demanded by the oppressed. Frankly, I have yet to engage in a direct action campaign that was "well timed" in the view of those who have not suffered unduly from the disease of segregation. For years now I have heard the word "Wait!" It rings in the ear of every Negro with piercing familiarity. This "Wait" has almost always meant "Never." We must come to see, with one of our distinguished jurists, that "justice too long delayed is justice denied."

 

I was an anti-war activist in the 1960s and this speech by Dr. King had particular importance for me at the time. On April 4, 1967, at a meeting of Clergy and Laity Concerned about Vietnam at Riverside Church in New York City, he denounced U.S. involvement in the war on Vietnam and imperialism in general.

 

“As I have called for radical departures from the destruction of Vietnam, many persons have questioned me about the wisdom of my path . . . ‘Peace and civil rights don’t mix,’ they say . . . There is at the outset a very obvious and almost facile connection between the war in Vietnam and the struggle I and others have been waging in America. A few years ago there was a shining moment in that struggle. It seemed as if there was a real promise of hope for the poor, both black and white, through the poverty program. There were experiments, hopes, new beginnings. Then came the buildup in Vietnam, and I watched this program broken and eviscerated as if it were some idle political plaything of a society gone mad on war. And I knew that America would never invest the necessary funds or energies in rehabilitation of its poor so long as adventures like Vietnam continued to draw men and skills and money like some demonic, destructive suction tube . . . We were taking the black young men who had been crippled by our society and sending them eight thousand miles away to guarantee liberties in Southeast Asia which they had not found in southwest Georgia and East Harlem . . . I could not be silent in the face of such cruel manipulation of the poor . . . For the sake of those boys, for the sake of this government, for the sake of the hundreds of thousands trembling under our violence, I cannot be silent.”

 

In the same speech Dr. King argued: “I am convinced that if we are to get on the right side of the world revolution, we as a nation must undergo a radical revolution of values . . .We must rapidly begin the shift from a thing-oriented society to a person-oriented society. When machines and computers, profit motives and property rights, are considered more important than people, the giant triplets of racism, extreme materialism, and militarism are incapable of being conquered. A true revolution of values will soon cause us to question the fairness and justice of many of our past and present policies . . . True compassion is more than flinging a coin to a beggar. It comes to see that an edifice which produces beggars needs restructuring. A true revolution of values will soon look uneasily on the glaring contrast of poverty and wealth. With righteous indignation, it will look across the seas and see individual capitalists of the West investing huge sums of money in Asia, Africa, and South America, only to take the profits out with no concern for the social betterment of the countries, and say, ‘This is not just.’” 

 

On August 16, 1967, in Atlanta, Georgia, at the annual meeting of the Southern Christian Leadership Conference, Dr. King asked, “Where do we go from here?” It is a question people concerned with social justice are still asking today.

 “The movement must address itself to the question of restructuring the whole of American society. There are forty million poor people here. And one day we must ask the question, “Why are there forty million poor people in America?” And when you begin to ask that question, you are raising questions about the economic system, about a broader distribution of wealth. When you ask that question, you begin to question the capitalistic economy. And I’m simply saying that more and more, we’ve got to begin to ask questions about the whole society. We are called upon to help the discouraged beggars in life’s marketplace. But one day we must come to see that an edifice which produces beggars needs restructuring. It means that questions must be raised. You see, my friends, when you deal with this, you begin to ask the question, “Who owns the oil?” You begin to ask the question, “Who owns the iron ore?” You begin to ask the question, “Why is it that people have to pay water bills in a world that is two-thirds water?” These are questions that must be asked . . . [W]hen I say question the whole society, it means ultimately coming to see that the problem of racism, the problem of exploitation, and the problem of war are all tied together. These are the triple evils that are interrelated.”

 

So where do we go from here? 

 

Will our country be on the right side of history?

 

Will each of us be part of defining a direction and participate in direct action to achieve social change?

 

Will you join, not me, but in Dr. King’s memory, the struggle to achieve racial, social, and economic justice in the United States and the world?

 

I know I said I would only refer to four King speeches, but I need to reference one more, this time by his nine-year-old granddaughter Yolanda Renee King at the 2018 March for Our Lives Rally in Washington DC.

 

“My grandfather had a dream that his four little children will not be judged by the color of the skin, but the content of their character. I have a dream that enough is enough. And that this should be a gun-free world, period. Will you please repeat these words after me? Spread the word, have you heard? All across the nation we are going to be a great generation. Now, I’d like you to say it like you really, really mean it. Spread the word, have you heard? All across the nation we are going to be a great generation.”

 

Ross Douthat's Prescription for Academia Won't Solve the Real Problem

Fourteen years ago--four before I would enter a PhD program in English--New York Review Books Classics rereleased a slim novel entitled Stoner, first published in 1965 by an almost entirely forgotten writer named John Williams. Like many books in the (excellent) NYRB Classics series, it took a while for readers to come back to Stoner, but by 2012, when I was studying for my comprehensive examinations in Renaissance poetry, this book by a man who died in 1994 had become an unlikely international hit, lauded by some critics as one of the most perfect novels ever written. What defines Stoner, as Williams writes, is the “love of literature, of language, of the mystery of the mind and heart showing themselves in the minute, strange, and unexpected combinations of letters and words, in the blackest and coldest print.”

 

It’s an unassuming narrative, the story of William Stoner, born to poor Missouri farmers at the end of the nineteenth century, who through patience and simply working day in and day out ends up becoming a relatively unaccomplished professor of Renaissance literature at the University of Missouri. He dies unheralded and mostly forgotten, still having lived a life dedicated to teaching and to poetry. Stoner’s career isn’t particularly happy, but he bears personal and professional hardship with a stolid midwestern dignity, and though Williams makes clear that the professor isn’t the most promising of his cohort, there is a rightful valorization of the work that he does and the professionalism with which he conducts himself. Stoner is undeniably, in addition to being about the abstract love of literature, a celebration of work itself.

 

With some foreshadowing, it was the New Critical close reading of Shakespeare’s Sonnet 73 that convinced the undergraduate Stoner that he belonged in a classroom and not on a farm. “That time of year thou mayst in me behold/When yellow leaves, or none, or few do hang/Upon those boughs which shake against the cold, /Bare ruined choirs, where late the sweet birds sang.” The poem evokes Stoner’s eventual solitary death, but in a manner far more prescient than Williams could have known. Shakespeare’s words are also a fair summation of what’s happened to the professor’s entire discipline over the last two generations as departments have shrunk, tenure track jobs have disappeared, and the academy has increasingly come to rely upon an exploited underclass of contingent, part-time faculty. 

 

I started graduate education a year before the financial collapse from which the vast majority of this country has yet to recover. Prospects for employment at a college or university were already bad enough in 2007; thirteen years later and they’re practically non-existent. That English departments – and by proxy the rest of the humanities including history, modern language, religious studies, philosophy, and so on – won’t exist in any recognizable form by let’s say 2035, should be obvious to anybody who surveys the figures and who has worked within the academy itself. 

 

The argument over who exactly is responsible for this state of affairs rages on, but New York Times columnist Ross Douthat insists he knows the answer. Part of the Times squad of Bret Stephens, David Brooks, and Bari Weiss--conservatives who are only read by liberals to prove how politically ecumenical said liberals are--Douthat doesn’t have a graduate degree himself. Nor has he (to my knowledge) produced peer-reviewed scholarship, or taught a university class (as anything other than a guest lecturer or as a function of his job as a columnist, I should say). But despite that, he has the hubris that can only be conferred by an undergraduate degree with the word “Harvard” printed on it, and so Douthat recently authored a column diagnosing the “thousand different forces [that] are killing student interest in the humanities and cultural interest in high culture.”

 

Like any column written by Douthat, Brooks, Stephens, or Weiss, it’s insidious because it’s often 25% correct – sometimes even 33% accurate! Douthat blames the disciplines themselves for their current predicament – and of course he’s correct. No doubt he’s critical of the lack of class solidarity between faculty and adjuncts, the ways in which professional organizations refuse to advocate for us, and the manner in which the ethic of the business school has infected the entire university.

 

But of course that’s not what he finds responsible. As predictably as if he were Allan Bloom writing The Closing of the American Mind in 1987, Douthat writes that the recovery of the humanities relies on a resuscitation of Victorian critic Matthew Arnold’s championing of “the best that has been thought and said.” A resurrection of the humanities must depend “at least on that belief, at least on the ideas that certain books and arts and forms are superior, transcendent, at least on the belief that students should learn to value these texts and forms before attempting their critical dissection.” Which we would all try to do, of course, if there were only any jobs in which to do it.

                  

I know that it’s been fashionable in conservative circles since the 1980s to bemoan the “post-modernist” professor who refuses to acknowledge Shakespeare’s brilliance while teaching comic books, but the schtick has gotten so old that I wish a young fogy like Douthat would learn a new tune. The critical-studies professor railing against “dead, white males” is as much a bogeyman of the conservative conscience as the Cadillac-driving welfare queen, but the former remains a mainstay of their bromides, even while Douthat feigns an unearned reasonableness.

 

So to his prescription, I must answer that of course those who study and teach literature cherish it as a source of value and transcendence, that of course they acknowledge that there are writings that endure because of individual, brilliant qualities, that of course we want our students to be enthusiastic about these things we’ve loved. Nobody spends the better part of a decade getting a PhD in English, or history, or philosophy because they hate English, history, or philosophy (though certainly some of them understandably come to). 

 

Nobody invests the better part of their youth in such research and teaching just so that Douthat can tell them that their professional failures are a result of just not loving their discipline enough. That the critical, theoretical, and pedagogical consensus of the past few generations has rightly concluded that the job of the humanities isn’t mere enthusiasm, but also critical engagement with those texts – for both good and bad – speaks to the anemia of Douthat’s own education. Douthat, who has apparently never heard of the scientific method, writes that “no other discipline promises to teach only a style of thinking and not some essential substance,” as if learning how to rationally and critically engage with the world should be some afterthought to remembering that Shakespeare knew how to turn a phrase (I’ll cop to finding his phrase “essential substance” unclear--I suppose he means facts). Critical analysis of text has been a mainstay of humanistic inquiry since biblical exegetes first analyzed scripture; it runs through the humanists of the Renaissance, the philologists of the nineteenth century, and the New Critics and formalists of the Modernist era. It’s hardly something made up at a Berkeley faculty meeting.

 

Douthat’s pining for a purely aestheticized type of non-inquiry owes much to a certain segment of Victorian criticism, but it’s hardly defined the discipline for its whole history. Nor is the idea particularly radical that one can critically engage texts, historical events, or philosophical ideas independent of whether one personally derives aesthetic pleasure from them. It’s been a mainstay of the civitas humanitas since before the Enlightenment, whereby the study of literature, history, and philosophy wasn’t just an instruction in connoisseurship, but rather training in how to be a citizen. I’d propose that instilling civic engagement is precisely what the “politicized” teaching and research that Douthat bemoans is trying to do.

 

Unlike cultural warriors of the past, Douthat makes a shallow gesture toward the academic jobs crisis; he alludes to the fact that he’s aware of the economic austerity that’s gutted humanities departments. But like all masters of the culture-war form, Douthat’s conservative politics make it impossible for him to properly name the actual culprits of what happened to the humanities. Jacques Derrida didn’t kill the English department – Milton Friedman did.

 

With some accuracy, albeit of the straw-man variety, Douthat argues that today it should be “easier than ever to assemble a diverse inheritor” of the old canon. I’m assuming that when he was at Harvard, he must have encountered those rightful diverse inheritors of the canon, because what we’ve been doing in the Renaissance literature classroom for thirty years is precisely that – teaching Shakespeare alongside Amelia Lanyer, John Milton with Aphra Behn. Like a teenager who asks why nobody has told him about the Beatles before, Douthat acts as if it’s some great insight that “This should, by rights, be a moment of exciting curricular debates, over which global and rediscovered and post-colonial works belong on the syllabus with Shakespeare” – but that’s precisely what we’ve been doing all this time. 

 

He writes that “humanists have often trapped themselves in a false choice between ‘dead white males’ and ‘we don’t transmit value,’” but this is only the situation within his own reactionary fever dream. Douthat’s prescription is as helpful as asking a person with an illness to just stop being sick: he tells us that the “path to recovery begins… with a renewed faith not only in humanism’s methods and approaches, but in the very thing itself.” May I suggest a humbler solution? The path to recovery of the humanities begins with actually funding the humanities, with hiring and paying people to teach and write about it, with making sure that its interests are not completely overlooked by universities more concerned with the NCAA, administrative pay, and making sure that students have the full “college experience.” That proposal might make the trustees of universities squeamish though, and they, after all, vote for the same political party of which Douthat is a member. Better to just say that professors don’t love literature enough.

 

Because the humanities do matter, the false choice that Douthat gives between aesthetic appreciation and critical analysis is to the detriment of both. Stoner wouldn’t have been as popular as it was if its sentiments didn’t move so many of us, graduate students who talked about the novel as if it were samizdat. What’s beautiful about Stoner is the character’s love of literature and of teaching. What’s inexplicable to us about it is that he’s actually able to have a job doing those things. Douthat may pretend that this is a spiritual problem, but the bare ruined choir of the academy wasn’t emptied because of insufficient faith, but rather because the money changers have long been in charge of the temple. It’s a spiritual problem only insofar as all economic problems are at their core spiritual. Pretending that the disappearance of the English department has always been an issue about liberal professors attacking Western civilization is, with apologies to Borges, a bit like watching two bald men fight over a comb.

 

The fact is that there is no excess in teaching critical analysis – in an era of increasing political propaganda and weakening democratic bonds, it is eminently necessary. We teach how to critically read culture – including movies, comics, and television – not because we don’t acknowledge the technical greatness of a Shakespeare, but in addition to it. Contrary to Douthat’s stereotypes, there’s not an English professor alive who doesn’t understand Shakespeare’s technical achievements when compared to lesser texts, but we understand that anything made by people is worthy of being studied because it tells us something about people. That is the creed of Terence, who wrote, “I am human and I let nothing which is human be alien to me” – no doubt Douthat knows the line. Did I mention that he went to Harvard?

 

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174069 https://historynewsnetwork.org/article/174069 0
George Orwell, 70 Years Later

 

Seventy years ago this Tuesday, George Orwell, the author of Nineteen Eighty-Four, died alone in a London hospital.  He had struggled for years with pulmonary tuberculosis, and his weak lungs hemorrhaged for the final time.  He was just 46, and on the verge of fame, having published his great novel only seven months earlier, in June 1949.

 

Would the visionary author of Nineteen Eighty-Four—who always insisted on the full eighteen letters as his novel’s proper title, not the four screaming digits “1984”—have ever imagined that George Orwell might become the most important writer since Shakespeare and the most influential writer who ever lived?  That is my contention, based on his cultural and social impact: the omnipresence of his coinages in the contemporary political lexicon and of his dystopian vision in the political imagination.

 

Crucial to his compelling language and vision was his superlative rhetorical ability to coin catchwords, such as those in Animal Farm (1945), his beast fable allegorizing the history of the Soviet Union, and his dystopian novel, Nineteen Eighty-Four. His talent for composing arresting, memorable lines in both his essays and his fiction, especially openers (“It was a bright cold day in April, and the clocks were striking thirteen”) and closers (“…He loved Big Brother”), is equally unforgettable.  Nineteen Eighty-Four rose to #1 on the bestseller lists (for the amazing fourth time in its history) during January 2017, after Donald Trump’s inauguration.  I fully expect it to do so again sometime during the presidential campaign this year.

 

Orwell’s worldwide fame—sales of his last two books approach 60 million in more than five dozen languages—certainly rests on his fable and dystopian novel.  Yet I would also maintain that Orwell’s importance is due not only to his “impact” as a polemicist or rhetorician, but also to his literary style. He is arguably the most important literary figure of recent generations. His direct literary influence in Britain and America on the generations directly following his own—the Movement writers and the Angry Young Men of the 1950s, the New Journalists such as Tom Wolfe and Gay Talese of the 1960s—rivals that of virtually any other twentieth-century writer. His influence on literary-political intellectuals since his death in 1950 is unrivalled—no successor has even come close to filling his outsized shoes.

 

Even more notable than all this, however, is the authority his “clear, plain” prose style has indirectly exerted, as countless writers have attested.  Orwell’s style has played a role in shaping nonfiction writing since midcentury. Along with Hemingway, Orwell is the literary stylist whose work has contributed most significantly to shifting the reigning prose style in English from the eighteenth-century ideal of the orotund, Ciceronian periodic sentence of Dr. Johnson, Gibbon, and the Augustan Age toward the limpid, fast-moving, direct, and hard-hitting sentences of present-day journalism. It is in these respects that Orwell’s literary influence is sizable indeed and bolsters his claim to the title “England’s Prose Laureate.”

 

Until recently, this was not at all the consensus, especially among British and American professors of English, who through the late 1980s typically relegated Orwell to the ranks of middlebrow authors.

 

When I began teaching at UT in the 1980s, literary academe was still dismissing Orwell as a rather simple “juvenile” or “high school author.”  Distinguished “difficult” authors such as Vladimir Nabokov (of Lolita fame) derided him as a “popular purveyor.”  In other words, Orwell’s works were, at best, in the much-quoted phrase that Orwell himself used about the writings of some of his own favorite authors, “good bad books.”

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174077 https://historynewsnetwork.org/article/174077 0
1917: The War Movie at Its Very Best

The World War I movie 1917 starts out quickly. A British general tells two enlisted men, Private Schofield and Lance Corporal Blake, that two British battalions are marching into a trap set by the Germans several miles away. The two men must reach the 1,600 men in the battalions – the brother of one of the messengers among them – and warn the soldiers to turn back. To get there, the duo must march in and out of trenches, survive No Man’s Land, endure machine-gun fire, avoid bombs, race through blazing buildings and continually test their own courage and fortitude.

 

It is a film on fire that emulates a world on fire in the Europe of 1917. It is loud. It is tense. It is dramatic. It is terrific.

 

1917, which opened nationwide last week, produced by DreamWorks and directed by Sam Mendes, is one of those great war movies that come out only once every generation or so (think Saving Private Ryan). It is also one of the few films about World War I, a conflict that always seems to run third in public interest behind World War II and the American Civil War.

 

There are numerous elements that make 1917 a classic war film, and a classic film, period. First, the action is focused on just two men, at the start, and they have to win or lose in the effort to save the apparently doomed battalions. Second, their route to the troops takes them through hell on earth, with numerous Biblical symbols (the air all around on fire, climbing over dozens of dead bodies to save their own lives). Third, the special effects are impressive, with airplanes, explosions, and long, long lines of men in trenches. Fourth, the film has numerous closeups of exhausted, wiped-out soldiers, most of whom are panting from the fury of the battle.

 

The movie is a story within a story – the two men within the greater war. Director Mendes has fashioned the film so that you constantly cheer the two men on, praying that they make it, yet at every moment of the film you think they might perish – and, shortly afterwards, the 1,600 men they were sent to save.

 

There is no great cavalry charge up the hill here, as in so many westerns, no sterling oratorical speeches by Henry V at Agincourt, no general riding a white horse and waving a sword in the air. It is a war of the grunts, just trying to get home. World War II was a war of victory and considerable glory; World War I was a fight for survival. There are numerous references to survival and to the lack of any real purpose to the conflict – to any conflict. One general tells a corporal that it doesn’t matter what today’s orders are – next week the high command will issue orders that are just the opposite. Men don’t think of victory and welcome-home parades, just getting home in one piece.

 

The first half of the film is slow but has some just plain astonishing scenes. In one, a troop transport truck gets bogged down in the mud and a dozen soldiers, pushing and grimacing, try to get it out and back on the road. All of the pain of war is told in their faces and their aching arms and legs. In another, a German plane is shot down by two Allied planes, hits the ground, and slides directly at the two messengers, and you are certain it is going to kill them.

 

There are dazzling cinematic scenes, such as long moments focused on soldiers in the trenches, vast wastelands of empty meadows broken only by a few lone bombed-out farmhouses, mud puddle after mud puddle. There are vast plains with a single tree still standing in the middle of them. There is a poignant scene in which Schofield meets a young woman and her baby hiding out from everybody in a building. He is attracted to her but has to leave to evade the Germans, who are constantly looking for him.

 

Much has been made of the single-camera effect in the film. One camera picks up the two soldiers when they leave on their journey and follows them most of the way. You see everything through their eyes or with the camera in front of them, in their faces. The final scene of the movie, shot this way, is striking.

 

The one problem in the film, and it is one shared by just about all war movies, is that it starts too slowly. Our two heroes march and march and march, and little happens.

 

Then, all of a sudden, the whole world explodes around them, and around the audience.

 

We are off….

 

Director Mendes does a brilliant job on this film about a ghastly conflict that tore apart the world. He gets numerous fine performances from a strong ensemble of actors. The two stars of the film, Dean-Charles Chapman as Lance Corporal Blake and George MacKay as Private Schofield, are superb and win you over from the first shot of the story.

 

The movie won the Golden Globe for Best Motion Picture – Drama and was nominated for the Best Picture Oscar. It deserves the accolades.

 

Right after World War I ended, they all said that it was “the war to end all wars.” It sure did, didn’t it? 

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174091 https://historynewsnetwork.org/article/174091 0
Edmund G. Ross Was a Profile in Impeachment Corruption, not Courage

 

The current impeachment proceedings have revived the historical error of proclaiming Kansas Senator Edmund G. Ross a hero for providing the vote that saved President Andrew Johnson’s job at the 1868 impeachment trial.   

For many years after that one-vote verdict, Ross was proclaimed the savior of the presidency from the rabid forces of impeachment.  More recent studies have noted that the racist Johnson was mostly a blight on the presidency, creating harsh divisions after the Civil War rather than binding up the wounds from that bloody conflict.

But another lingering fiction is that Ross cast his pro-Johnson vote altruistically.  Ross proclaimed his own heroism in his memoir.  When he cast the impeachment vote, he wrote, it was like looking into his open grave, but he courageously leapt in despite the consequences.

Future President John F. Kennedy (or his ghostwriter) revived this myth in his often-inaccurate book Profiles in Courage, pronouncing Ross’s vote “the most heroic act in American history.”  

That vote was a profile in corruption, not courage.

Ross, a printer previously accused of bid-rigging on Kansas state contracts, secured his Senate seat because his sponsor – a crook named Perry Fuller – paid Kansas state legislators $42,000 in bribes to send Ross and Samuel Pomeroy to Washington as the state’s senators.  

In the 1860s, frontier Kansas had a well-earned reputation for corruption.  Ross replaced a senator who had killed himself after the revelation that he took a $20,000 bribe from (yup) Perry Fuller.  

In his first months in Washington, Ross did nothing to attract the attention of even the most dedicated Senate-watcher.  When the impeachment crisis erupted in early 1868 and landed in the Senate for Johnson’s trial, Ross flirted with both sides of the contest, then began to offer his vote for the best deal he could get.  

Less than two weeks before the trial ended, Ross sent his sponsor, Perry Fuller, to Johnson’s Interior Secretary.  Fuller had already joined with a Treasury official to offer a bribe that would have secured Ross’s vote for Johnson.  Fuller told the Interior Secretary that Ross would vote for acquittal if only Johnson would speed the return to the union of three Southern states. That was outside of Johnson’s power, so no deal was struck.

Days later, Ross promised pro-impeachment senators he would vote their way, a pledge he repeated to a reporter three days before the final vote.  But then, something happened.  On the morning of the vote, Ross breakfasted with his good friend, the ubiquitous Perry Fuller.  Then Ross reversed himself to cast the vote that kept the president in office.

The inference that Ross was bribed to vote for Johnson is powerful, although bribes in 1868 were paid in cash that could not (and cannot) be traced.  But records show that Ross immediately moved to cash in on his pro-Johnson vote with patronage appointments.

At the top of Ross’s shopping list was a top job for – you guessed it – Perry Fuller. The job?  Commissioner of Internal Revenue, a position from which Fuller’s corruption could spread through the nation like a virus.  Johnson promptly nominated Fuller for the pivotal position, but the Senate Finance Committee would not swallow a flat-out crook in that office.  It sent Fuller’s name back to the president.

So Ross set his sights lower.  In August, he secured Fuller’s appointment as Collector of Revenue in the port of New Orleans.  In seven months in that office, Fuller more than doubled its number of employees, then was arrested for stealing $3 million.  

When Fuller had to post bond for his pretrial release, Senator Edmund Ross of Kansas was happy to guarantee it.

But Fuller was not the only name on Ross’s patronage shopping list.  The Kansan also requested the appointment of a friend as superintendent of Indian lands in what is now Oklahoma, stressing to President Johnson the “large amount of patronage connected with that office.”  Johnson made the appointment.  Then Ross asked for ratification of a treaty with the Osage Tribe, which Johnson swiftly granted.  

Even the widespread belief that Ross’s impeachment vote ruined his life is a fable. He did lose his Senate seat to a man who paid $60,000 in bribes to Kansas legislators, a moment of poetic justice.  Ross went on to publish two newspapers in Kansas before landing in 1885 as territorial governor of New Mexico.

Edmund G. Ross a profile in courage?  No. Not ever.

 

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/173849 https://historynewsnetwork.org/article/173849 0
Jimmy Carter and The Myth That Gave the Iowa Caucuses Their Political Power

 

Every four years, the country witnesses what should be an inspiring ritual: Iowans like me brave the cold winter night, gather in school gyms, and talk about politics with our neighbors. But there is another ritual that also merits our attention: the condemnation of the Iowa precinct caucuses coming from around the country. Such criticism comes from the political right and the left.

 

There are many good reasons to oppose the caucuses. After all, it is impossible to justify the same state going first each time, since it undermines the spirit of equality that is supposed to be crucial to American government. 

 

Further, Iowa is unrepresentative--disproportionately whiter, older, and more rural than the country as a whole. Finally, people with work or childcare responsibilities at night are out of luck. In this system, even the Democrats are undemocratic. 

 

Unlike primaries, the caucuses are a Rube Goldberg machine; that is, they are a complicated system designed to do something simple.  If parties were starting from scratch, they would do better to create something more like a primary. Historians can tell you, though, that politicians never start from scratch. Our caucuses are the product of historical accident, rather than conscious design.  

 

The accident began at the Democratic convention in Chicago in 1968. With cameras rolling, police officers attacked young activists who opposed the Vietnam War.  Meanwhile, the party leadership, still loyal to wartime President Lyndon B. Johnson, nominated Vice-President Hubert Humphrey, whose position on the war was ambiguous at best.  To protesters and much of the press, it appeared that Humphrey had been hand-picked by unelected pro-war Democratic Party leaders in a secretive process.  Outside the convention hall, police violence against dissenters and even reporters escalated into chaos.  The spectacle looked terrible to millions of voters watching on television, and the mess contributed to the Democrats’ defeat that fall.  

 

Something had to give.  While caucuses and primaries happened in a few states by 1968, most state parties used a smorgasbord of methods to insulate the nomination process from popular control.  Often conventions featured multiple ballots in which delegates, not voters, made the final choice of a nominee.  Now, more state parties began choosing their convention delegates through primary votes meant to empower average voters.  Others turned to party caucuses.  In these, at least in theory, party enthusiasts and volunteers led the proceedings. 

 

In 1972 and again in 1976, Iowa’s precinct caucuses happened first by historical accident. Put simply, the complexity of the state’s system for selecting national delegates, which is unknown to most of the public, generates a lot of meetings. The now-familiar precinct caucuses are only the start of Iowa’s system for selecting delegates, which continues long after journalists have left the state. Jimmy Carter (or rather his young adviser, Hamilton Jordan) thought a win at the caucuses could generate momentum by bringing media attention early in the process.  Jordan believed that early notice in the press was essential for a candidate to gain traction in a crowded field of presidential hopefuls.  The story is well-known: Carter focused on the state and did well in the caucuses, which set him, an unknown, on the path to the nomination. Presidential hopefuls have flocked to the state ever since. 

 

However, Carter’s story is not quite that simple. First, Carter didn’t exactly win. He received about 27% support, putting him in second place behind an always formidable opponent:  “undecided.” The forgettable Birch Bayh of Indiana finished third, ahead of a large but thoroughly mediocre group of challengers.

 

Moreover, the precinct caucuses did not provide Carter with any delegates to the national convention.  It doesn’t work that way. (This may be a little boring, but it is important to the story, so stay with me.) Those chosen as delegates by their precincts advance to the county convention, which in turn chooses delegates for the district conventions, which then designate participants in the state convention.  Finally, the state convention picks delegates to the national convention.  This complex system takes a lot of time, and Iowa Democrats start early because they have to in order to have delegates before the summer.    

 

Those county conventions are all-day affairs which happen long after the media has moved on. County delegates often get the job not because they want it, but because of pressure from their neighbors.  They may have to change their vote because their candidate has dropped out, and that disheartening fact can create no-shows. Yes, there are alternate delegates, but things can go wrong.  

 

To recap: county conventions choose delegates for the district conventions, which in turn pick delegates for the state convention, which then chooses a few genuine delegates for the national convention. There are not many of these because Iowa is a small state.  The delegate stakes of the precinct caucuses are incredibly low. 

 

In other words, Carter won a glorified straw poll.  Nonetheless, his staff touted the caucuses as the will of the people, and Iowans had no reason to disagree. The caucuses did help Carter get a big media bounce.  Yet the burst of attention he received happened because he also won the New Hampshire primary, which was then the traditional indicator of early strength.  Without Iowa, New Hampshire alone would have put him on the map.

 

Carter’s success actually happened because he was the perfect post-Watergate candidate. He was an evangelical who promised never to lie to the public. He was a naval veteran, a former nuclear engineer, and a peanut farmer.  As a former Governor of Georgia, he seemed to be a Washington “outsider” at a time when public disgust with politics was at its zenith.

 

Importantly, Jimmy Carter’s success also happened because he was a moderate, even conservative, by the standards of his party. Democrats, still smarting from the landslide loss of South Dakota liberal George McGovern in 1972, believed, correctly, that Carter could win by moving the party to the right. Ironically, in 1972 McGovern had himself performed well in Iowa, finishing second to Edmund Muskie, but received little acclaim for it in the media.  In 1976, the Carter campaign touted the event, and reaped the benefits. 

 

Iowa may be crucial to the nomination system today, but it still mirrors the caucuses of 1976 in one important respect: it is a media event. In 1976 the caucuses mattered not because they produced delegates, but because they gave Carter media attention.  The caucuses were real politics in the same way that Celebrity Apprentice was reality television.  The Chicago protesters in 1968, while being pummeled by the police, chanted “the whole world is watching.”  And it was. Party reformers changed the system after 1968, but with an unintended consequence.  Today, the whole world is still watching, but they are watching Iowa.  

 

If you are a blue-state Democrat and you don’t like our system, you can try to make it go away.  Iowa’s status is always in danger, and yet the caucuses continually survive, maybe because Iowans care more.  We don’t have an NFL franchise, and we deserve to watch something.  

 

Or, candidates can ignore the caucuses. In 2008, John McCain skipped the caucuses, which he viewed as a lost cause given his opposition to tax credits for ethanol. He won the Republican nomination. Michael Bloomberg, a far richer man than Donald Trump, has launched the extraordinary experiment of skipping the first four states and saturating the big media markets. 

 

Most pundits would tell you Bloomberg is doomed to fail, but you have to admire the impulse. He is ignoring us, which is probably the best way to minimize our importance. His strategy seems smarter than that of former Maryland U.S. Representative John Delaney, who has spent nearly two years wandering the state to little avail. Parents will tell you to ignore a toddler’s tantrums if you want them to stop. Twitter users know that they should refuse to feed the trolls; to do otherwise would be to reward bad behavior.  If you really don’t want to spend the evening of February 3 hearing media chatter about which candidate exceeded expectations, it’s on you.  Go to a bar, go bowling, read a novel.  I love my adopted state, but I have faith that the country is ready to choose a nominee without our help.

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174076 https://historynewsnetwork.org/article/174076 0
The World is Losing another Historic Generation

Temple of Confucius of Jiangyin, Wuxi, Jiangsu. Photo by Zhangzhugang, CC BY-SA 3.0.

 

Every day, little by little, one of the world’s largest and most important historic generations is passing away—the last generation to have grown up in China before the communist takeover in 1949.  These individuals were born between 1919 and 1937, right before the outbreak of the Sino-Japanese War, and were raised in what we might loosely call traditional China. Yes, China was surely modernizing during the years 1919-1949, but traditional Chinese culture retreated only slowly and was still pervasive in 1949, even in foreign-influenced coastal cities like Shanghai.  Dense bastions of traditional practices remained in education, the judiciary, and government bureaucracies, as well as in family structure and social relations in general. Students and young people shaped by this environment were in touch with a very ancient Chinese system of values, with all its cultural beauty and social flaws.  

 

After 1949 the Chinese Communist Party (CCP) launched an active campaign to eradicate traditional values they regarded as “feudal thinking,” extending their long-standing hostility against the Confucian worldview to national educational policy. Things got much worse during the Cultural Revolution of 1966-1976 when fanatical anti-traditional hysteria generated a decade of state-tolerated anarchic violence and contempt by Red Guard youth groups directed against “feudal” and “bourgeois” values. The widespread physical destruction of artwork and cultural sites during those years was the most visible manifestation of a policy designed to root out traditional thinking tied to the pre-revolutionary past.   

 

More recently the CCP has paid lip-service to the nation’s past culture by appropriating Confucius’s name for the vast network of international “Confucius Institutes” that project a positive image of communist China to the outside world.   These institutes have little connection to China’s past; they are primarily centers of communist propaganda and recruitment of naïve, sympathetic foreign students.  A Chinese ruling elite that tolerates corrupt communists and crony capitalists has no genuine interest in a state grounded in Confucian ethics. Instead they pick out the Confucian elements that reinforce communist rule, such as emphases on political loyalty and communalism.

 

Students coming of age in China today are acquainted with only those parts of classical Chinese art, literature, and philosophy that the CCP regards as useful for justifying its continued one-party dictatorship. Conversations with Chinese students reveal alarming gaps in their knowledge of China’s political and economic development before 1949.  Disastrous events under communist rule, like the failure of the Great Leap Forward and the resulting famine of 1959-1961 or the Tiananmen massacre of 1989, are “forbidden topics” beyond discussion or even mention in Chinese schools and universities, but that is a separate topic.   This is not simply a complaint about the habits and interests of today’s youth in China; these knowledge gaps are a deliberate product of the CCP’s Orwellian educational policy, reaching back over many decades, with the intent of allowing vast swathes of pre-communist history and culture to wither away. 

These trends mean that as the last generation of individuals who grew up in, were educated in, or were at least exposed to classical Chinese education and culture leaves us, it will not be replaced. That human loss will be a bitter blow, perhaps the death blow, for a civilization with a 5,000-year pedigree. The surviving cultural remnants will be only those crumbs from the pre-communist past selected by the CCP for public presentations carefully managed by the party for its own purposes.

 

In a dangerous irony, this great reduction in the human cultural capital of pre-communist China comes at the same time the growth of Chinese economic, military, and political influence around the world makes international audiences more interested than ever before in understanding some of the deeper elements of Chinese culture. 

 

If the outside world wants to understand China’s long history, traditional values, and cultural contributions in an intellectually honest way, we will have to get beyond the carefully controlled version of history being offered by the current communist government. We can help by doing two things. First, by maintaining open, honest, and critical scholarship of China’s long history at universities and research institutes around the world. That includes refusing to knuckle under to China’s frequent demands that we cooperate with their efforts to censor critical research produced by foreign-based scholars. Second, we can support the remaining outposts of Chinese culture that exist beyond the reach of the CCP, for example in Taiwan, where non-communist versions of China’s past, present, and future are on display every day.  

 

We cannot prevent the passing away of China’s last generation from the pre-communist era, but we can do our part in helping to preserve the ancient culture they knew, so that it does not vanish completely with them. 

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174068 https://historynewsnetwork.org/article/174068 0
Poles Apart: Putin, Poland and the Nazi-Soviet Pact

 

As the 75th anniversary of the end of World War II approaches, two of that war’s main victims – Poland and Russia – are once again embroiled in a highly emotional dispute about its origins. At the heart of the matter is the perennial controversy about the Nazi-Soviet pact of 23 August 1939. 

 

The polemics were kick-started by President Vladimir Putin when he was asked about the European Parliament’s resolution on the 80th anniversary of the outbreak of World War II at a press conference in Moscow on 19 December. Putin deemed the resolution unacceptable because it equated the Soviet Union and Nazi Germany, and he accused its authors of being cynical and ignorant of history. He highlighted instead the Munich agreement of September 1938 and Poland’s participation in the dismemberment of Czechoslovakia. The Soviet-German non-aggression treaty was not the only such agreement made by Hitler with other states. Yes, said Putin, there were secret protocols dividing Poland between Germany and the USSR, but Soviet troops only entered Poland after its government had collapsed. 

 

This is not the first time Putin has made such arguments. He made many similar points in 2009 on the 70th anniversary of the outbreak of war. But his tone then was conciliatory rather than combative. At the commemoration event in Gdansk, Putin stressed the common struggles of Poles and Russians and called for the outbreak of the war to be examined in all its complexity and diversity. Every country had been at fault, not just the Soviet Union: “it has to be admitted that all attempts made between 1934 and 1939 to appease the Nazis with various agreements and pacts were morally unacceptable and practically meaningless as well as harmful and dangerous.”

 

Responding to Putin, the then Polish Prime Minister, Donald Tusk, stressed that on 1st September 1939 his country was attacked by Germany and then two weeks later, invaded by the Soviet Union. But Tusk also emphasised that while “truth may be painful, it should not humiliate anyone.”

 

The day after his news conference in Moscow, Putin addressed leaders of the Commonwealth of Independent States at a meeting in St Petersburg convened to discuss preparations for the 75th anniversary. Putin used the occasion to deliver a long analysis of what led to the outbreak of war in September 1939, including detailed citations from many diplomatic documents.

 

One document that caught Putin’s eye was a September 1938 dispatch from Jozef Lipski, the Polish ambassador in Berlin, reporting on a talk with Hitler. During the conversation Hitler said that he was thinking of settling the Jewish issue by getting Jews to emigrate to a colony. Lipski responded that if Hitler found a solution to the Jewish question the Poles would build a beautiful monument to him in Warsaw. “What kind of people are those who hold such conversations with Hitler?” asked Putin. The same kind, he averred, who now desecrate the graves and monuments of the Soviet soldiers who had liberated Europe from the Nazis.

 

The main point of Putin’s trawl through the British, French, German, Polish and Soviet archives was to show that all states had done business with the Nazis in the 1930s, not least Poland, which sought rapprochement with Hitler as part of an anti-Soviet alliance. Putin linked this history to present-day politics: “Russia is used to scare people. Be it Tsarist, Soviet or today’s – nothing has changed. It does not matter what kind of country Russia is – the rationale remains.”

 

Putin vigorously defended Soviet foreign policy in the 1930s. According to the Russian President, Moscow sought a collective security alliance against Hitler but its efforts were rebuffed, most importantly during the Czechoslovakian crisis of 1938, when the Soviets were prepared to go to war in defence of the country, provided France did the same. But the French linked their actions to those of the Poles, and Warsaw was busily scheming to grab some Czechoslovak territory. In Putin’s view the Second World War could have been averted if states had stood up to Hitler in 1938.

 

In relation to the Nazi-Soviet pact, while Putin accepted there was a secret protocol, he suggested that hidden in the archives of western states there might be confidential agreements that they had made with Hitler. He also reiterated that the Soviet Union had not really invaded Poland, adding that the Red Army’s action had saved many Jews from extermination by the Nazis.

 

Putin returned to the subject of the war’s origins at a meeting of Russia’s Defence Ministry Board on 24 December: “Yes, the Molotov-Ribbentrop Pact was signed and there was also a secret protocol which defined spheres of influence. But what had European countries been doing before that? The same. They had all done the same things”. But what hit him hardest, Putin told his colleagues, was the Lipski report: “That bastard! That anti-Semitic pig – I have no other words”.

 

To be fair to Putin there is more to his view of history than pointing the finger at Poland and the west. He also identified more profound causes of the Second World War, including the punitive Versailles peace treaty that encouraged “a radical and revanchist mood” in Germany, and the creation of new states that gave rise to many conflicts, notably in Czechoslovakia, which contained a 3.5 million-strong German minority.

 

Poland’s first response to Putin’s furious philippics was a statement by its foreign ministry on 21 December, expressing disbelief at the Russian President’s statements. Poland, the foreign ministry said, had a balanced policy towards Germany and the Soviet Union in the 1930s, signing non-aggression pacts with both countries. “Despite the peaceful policy pursued by the Republic of Poland, the Soviet Union took direct steps to trigger war and at the same time committed mass-scale crimes”.

 

According to the Polish foreign ministry, the crucial chronology of events was that in January 1939 the Germans made their claims against Poland; in mid-April the Soviet ambassador offered Berlin political co-operation and at the end of April Hitler repudiated the German-Polish non-aggression pact; in August the Nazi-Soviet pact was signed; in September Germany and the USSR invaded Poland and then signed a Boundary and Friendship Treaty that formalised Poland’s partition.

 

Among Soviet crimes against Poland was the mass repression of Poles in the territories occupied by the Red Army, including 107,000 arrests, 380,000 deportations and, in spring 1940, 22,000 executions of Polish POWs and officials at Katyn and other murder sites.

 

On 29 December 2019 Polish Prime Minister, Mateusz Morawiecki, issued a statement, noting that Poland was the war’s first victim, “the first to experience the armed aggression of both Nazi Germany and Soviet Russia, and the first that fought in defense of a free Europe.” The Molotov-Ribbentrop pact was not a non-aggression agreement but a military and political alliance of two dictators and their totalitarian regimes. “Without Stalin’s complicity in the partitioning of Poland, and without the natural resources that Stalin supplied to Hitler, the Nazi German crime machine would not have taken control of Europe. Thanks to Stalin, Hitler could conquer new countries with impunity, imprison Jews from all over the continent in ghettos and prepare the Holocaust”.

 

Morawiecki pulled no punches in relation to Putin: “President Putin has lied about Poland on numerous occasions, and he has always done so deliberately.” According to Morawiecki, Putin’s “slander” was designed to distract attention from political setbacks suffered by the Russian President, such as US sanctions against the Nord Stream 2 gas pipeline project and the World Anti-Doping Agency’s banning of Russia from international sporting events for four years.

 

All states like to present themselves as victims rather than perpetrators, and this is not the first time Poland and Russia have clashed over the Nazi-Soviet pact. The piquancy of the polemics is obviously related to the dire state of Russian-Western relations and to the presence in Warsaw of a radical nationalist government.

 

But how should we evaluate the historical content of these exchanges? My first book, published in 1989 on the 50th anniversary of the Nazi-Soviet pact, was The Unholy Alliance: Stalin’s Pact with Hitler. Since then I have written many more books and articles about the Nazi-Soviet pact. My research has led me to conclude that Putin is broadly right in relation to the history of Soviet foreign policy in the 1930s but deficient in his analysis of the Nazi-Soviet pact.

 

After Hitler came to power in 1933 the Soviets did strive for collective security alliances to contain Nazi aggression and expansionism. Moscow did stand by Czechoslovakia in 1938 and was prepared to go to war with Germany.

 

After Munich the Soviets retreated into isolation but Hitler’s occupation of Prague in March 1939 presented an opportunity to relaunch their collective security campaign. In April Moscow proposed an Anglo-Soviet-French triple alliance that would guarantee the security of all European states under threat from Hitler, including Poland.

 

Some historians have questioned the sincerity of Moscow’s triple alliance proposal, but extensive evidence from the Soviet archives shows that it was Stalin’s preferred option until quite late in the day. The problem was that Britain and France dragged their feet during the negotiations, and as war grew closer, so did Stalin’s doubts about the utility of a Soviet-Western alliance. Fearful the Soviet Union would be left to fight Hitler alone while Britain and France stood on the sidelines, Stalin decided to do a deal with Hitler that kept the USSR out of the coming war and provided some guarantees for Soviet security.

 

The Soviets were not as proactive as they might have been in trying to persuade the British and French to accept their proposals. Some scholars argue this was because the Soviets were busy wooing the Germans. However, until August 1939 all the approaches came from the German side, which was desperate to disrupt the triple alliance negotiations. The political overture of April 1939 mentioned in the Polish foreign ministry statement is a case in point: the initiative came from the Germans not the Soviets.

 

One state that Moscow did actively pursue in 1939 was Poland. The bad blood in Soviet-Polish relations notwithstanding, after Munich the two states attempted to improve relations. When Hitler turned against Poland in spring 1939 Moscow made many approaches to Warsaw, trying to persuade the Poles to sign up to its triple alliance project. But Warsaw did not want or think it needed an alliance with the USSR given that it had the backing of Britain and France.

 

The failure of this incipient Polish-Soviet détente sealed the fate of the triple alliance negotiations, which broke down when the British and French were unable to guarantee Warsaw’s consent to the entry of the Red Army into Poland in the event of war with Germany.

 

After the signature of the Nazi-Soviet pact there was extensive political, economic and military co-operation between the Soviet Union and Germany. Most people see this as a tactical manoeuvre by Stalin to gain time to prepare for a German attack. However, I have argued that in 1939-1940 Stalin contemplated the possibility of long-term co-existence with Nazi Germany.

 

Putin makes the point that Stalin did not sully himself by meeting Hitler, unlike British, French and Polish leaders. True, but Stalin received Nazi Foreign Minister Ribbentrop twice, in August and September 1939, and in November 1940 he sent his foreign minister, Molotov, to Berlin to negotiate a new Nazi-Soviet pact with Hitler. It was the failure of those negotiations that set Soviet-German relations on the path to war.

 

The first clause of the secret protocol attached to the Soviet-German non-aggression treaty concerned the Baltic states. Throughout the triple alliance negotiations Moscow’s major security concern was a German military advance across the Baltic coastal lands to Leningrad. With the signature of the Nazi-Soviet pact that Baltic door to German expansion was locked by a spheres of influence agreement that allocated Latvia, Estonia and Finland to the Soviet sphere. Lithuania remained in Germany’s sphere but was transferred to the Soviets in September 1939.

 

It was the second clause of the protocol that divided Poland into Soviet and German spheres, but this should not be seen as a definite decision to partition Poland, though that possibility was certainly present. The protocol limited German expansion into Poland but did not specify that the two states would annex their spheres of influence. The actions of both states in that respect would be determined by the course of the German-Polish war. In the event, Poland was rapidly crushed by the Germans, while the British and French did little to aid their ally except declare war on Germany. It was in those circumstances that Berlin pressed the Soviets to occupy Eastern Poland. Stalin was not ready, politically or militarily, to take that step, but he knew that if the Red Army did not occupy the territory then the Wehrmacht would.

 

Putin glosses over the fact that the Red Army’s entry into Poland was a massive military operation involving half a million troops. Large-scale clashes with Polish forces were averted only because Poland’s commander-in-chief ordered his troops not to fire on the Red Army. Even so, the Red Army suffered 3,000 casualties, including a thousand dead.

 

Often accused of parroting the Soviet line, Putin did not invoke the most potent argument that Moscow used to rationalise its attack on Poland, which was that the Red Army was entering the country to liberate Western Belorussia and Western Ukraine. 

 

Poland’s eastern territories had been secured as a result of the Russo-Polish war of 1919-1920. These territories lay east of the Curzon Line – the ethnographical frontier between Russia and Poland demarcated at Versailles. The majority of the population were Jews, Belorussians and Ukrainians and many welcomed the Red Army as liberators from Polish rule. Such enthusiasm did not outlast the violent process of sovietisation through which the occupied territories were incorporated into the USSR as part of a unified Belorussia and a unified Ukraine.

 

During the Second World War Stalin insisted that the Curzon Line would be the border between Poland and the USSR – a position that was eventually accepted by Britain and the United States. As compensation for its territorial losses Poland was given East Prussia and other parts of Germany. The result of this transfer was the brutal displacement of millions of Germans from their ancestral lands.

 

History is rarely as simple as polemicizing politicians would like it to be. Both sides of the Russo-Polish dispute have some valid arguments; neither has a monopoly on what is a bitter truth. The Nazi-Soviet pact is a fact but so is Polish collaboration with Hitler in the 1930s. The Soviet Union did cooperate with Nazi Germany but it also played the main role in the defeat of Hitler. Stalin was responsible for vast mass repressions but he was not a racist or genocidal dictator and nor was he a warmonger. The Red Army’s invasion of Eastern Poland was reprehensible but it also unified Belorussia and Ukraine. During the Second World War the Red Army was responsible for many atrocities but it did not commit mass murder and it did, together with its allies, liberate Europe from the Nazis.

 

Politicians will always use the past for political purposes. But in 2009 Putin came quite close to a balanced view about the Nazi-Soviet pact, as did Tusk in his measured rejoinder. Let’s hope that Poland and Russia can find their way back to such middle ground. 

 

The victory over Nazi Germany required enormous sacrifices by both countries. Surely it is possible to celebrate this common victory with dignity and with respect for differences about its complicated history.

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174070 https://historynewsnetwork.org/article/174070 0
Strange Obsession: President Trump's Obama Complex  

 

Journalists were astonished when President Donald Trump took verbal shots at President Obama (without naming him) in a speech intended to deescalate a conflict with Iran on January 8, 2020. In that kind of international crisis, U.S. presidents ordinarily encourage a united American front. Yet Trump’s remarks had a disuniting effect. He presented a sharply negative judgment about Obama’s leadership. Trump criticized Obama’s “very defective” and “foolish Iran nuclear deal.” He claimed missiles fired by Iran at bases housing U.S. troops were financed “with funds made available by the last administration.” The statement implied that blood would be on Obama’s hands if Americans died in the bombings. Journalists said it was quite unusual for a president to lash out at his predecessor when delivering an important foreign policy message that needed broad public support. 

 

The journalists should not have been surprised. Trump has publicly berated Barack Obama on numerous occasions. While Trump’s dislike of Obama is complex and multifaceted, his behavior at the 2011 White House Correspondents’ Dinner may reveal an important source of the hostility.

 

At the time of that event Donald Trump maintained that Barack Obama had not been born in the United States and therefore was ineligible to be president of the United States. When President Obama stepped up to deliver a humorous monologue at the April 30 dinner, he saw an opportunity to poke fun at the champion of this false claim. Referring to rumors that Donald Trump might run for president someday, Obama pointed to Trump’s limited leadership experience as TV host of Celebrity Apprentice. Then Obama referred to recently published documentation confirming his U.S. birth. Now that the birther claim was put to rest, Obama teased, Trump could focus on “issues that matter -- like, did we fake the moon landing?” Comedian Seth Meyers piled on. “Donald Trump has been saying he will run for president as a Republican,” noted Meyers, “which is surprising, since I just assumed he was running as a joke.” 

 

Commentators in the national media interpreted the situation as a public humiliation. They observed that Trump appeared angry and did not smile. When asked about the event later, Trump scolded Meyers for being “too nasty, out of order” but said he enjoyed the attention. From that time on, though, Trump’s references to Obama became more contemptuous. Trump made several statements in 2011 claiming President Obama might attack Iran in order to boost his chances in the next presidential election.

 

A suggestion that President Trump’s contempt for Barack Obama played a role in America’s recent troubles with Iran is, of course, a matter of speculation. We cannot be sure that contempt for Barack Obama affected President Trump’s decision-making on key policy matters. But there is context for considering the idea. Throughout his presidential campaign and years in the White House, Donald Trump has delivered numerous verbal beatings to supposed villains. 

 

Just about anyone who criticizes Trump publicly becomes a target. The president has ridiculed Hillary Clinton, Adam Schiff, and Nancy Pelosi. Heroic and much-admired individuals received the president’s wrath, as well, including Senator John McCain, aviator and POW in the Vietnam War, and Khizr Khan, whose son, a U.S. soldier in Iraq, died protecting his men from a suicide bomber. Even the 16-year old climate activist Greta Thunberg received insults. Thunberg’s offense?  Staring down President Trump at a UN meeting on climate. Donald Trump has expressed scorn towards numerous people, but no public figure has been as consistent a mark for contempt as former president Barack Obama.  

 

Trump’s long record of criticizing Barack Obama seems to reveal deep-seated enmity. Ordinarily, U.S. presidents do not speak much about their predecessors, but when they do, the references tend to be positive. Trump refers to Obama often and in a disparaging way. CNN analyst Daniel Dale calculated that Trump mentioned Obama’s name 537 times in the first 10 months of 2019. “For whatever reason,” observed Fernando Cruz, who served both Obama and Trump at the National Security Council, “President Trump has fixated on President Obama, and I think that he views President Obama as a metric he has to beat.” Peter Nicholas, who covers the White House for The Atlantic, said “a guiding principle of Trump’s White House has been, simply: ‘If Obama did it, undo it.’”

 

Trump hammered President Obama’s domestic initiatives. He tried to terminate Obama’s signature achievement, the Affordable Care Act (its popular name, “Obamacare,” provided an attractive target). President Trump reversed Obama’s efforts to move energy consumption away from coal, and he opened national parks to commercial and mining activity, rejecting the protections Obama favored. Trump also undermined the Obama administration’s environmental initiatives. He mocked Obama’s promotion of wind power and rolled back regulations for oil and gas production, including standards for methane gas emission. 

 

In foreign affairs President Trump abandoned his predecessor’s efforts to bring nations together to fight climate change, and he rejected Obama’s plans for a trans-Pacific trade deal. Trump also scratched Obama’s programs for improved relations with Cuba. 

 

The most notable attack on Obama’s legacy in international affairs came in May 2018 when President Trump began pulling the U.S. out of the nuclear accord with Iran. Trump ignored advice from members of his national security team who supported the agreement. The accord had been working. Iranians complied with its terms, placing nuclear programs on hold in return for a promise of reduced sanctions. President Trump blasted the accord as “the worst deal ever.” His actions led Iran to reinstate nuclear development. In a brief time, Trump managed to smash an effective security arrangement that also had backing from the UK, Russia, France, China and Germany.

 

The reasons for Donald Trump’s major decisions appear shrouded in mystery. Why did President Trump try to obliterate Obamacare but offer no well-conceived substitute? Why did he abandon the Iran nuclear deal but offer no alternative that foreign policy experts considered effective? Perhaps Trump’s rejection of these and other important measures did not reflect disagreement about policy details. Maybe Trump objected to them because they symbolized goals and accomplishments of Barack Obama.

 

There can be no certainty about the emotional impact of Donald Trump’s unpleasant experience at the White House Correspondents’ Dinner on April 30, 2011. Suggestions about a connection must remain speculative. Not even well-trained psychologists or psychiatrists can provide definitive judgments about its significance. Nevertheless, Donald Trump’s lengthy and extensive record of negative comments about his predecessor is so unusual that connections to the event of 2011 deserve study.

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174073 https://historynewsnetwork.org/article/174073 0
A Courageous Catholic Voice Against Antisemitism

Boston Irish activist Frances Sweeney was one of the few Catholic voices to challenge the silence in response to antisemitic attacks. 

 

“Eight young men came careening out of a side street. One snatched a yeshiva boy’s glasses and spun them into the street…another dumped the [Jewish] newsboy’s [papers] into the gutter; as yet another yanked, as he had seen in the newsreels, an old, spidery Jew by his beard.”

 

This scene, which sounds as if it could have taken place this week in Crown Heights or Williamsburg, actually appears in the autobiography of the late award-winning journalist Nat Hentoff, recalling the wave of violent assaults on Jews in Boston in 1938.

 

Hentoff, then a student at Northeastern University, was an eyewitness to what the newspaper PM described as an “organized campaign of terrorism” against Jewish residents of Boston’s Roxbury, Mattapan, and Dorchester neighborhoods in the late 1930s and early 1940s.

 

The perpetrators were Irish Catholic youths, who were inspired by the rabble-rousing “Christian Front” organization and Father Charles Coughlin, the antisemitic priest whose hate-filled radio show drew millions of listeners each week.

 

As the harassment and beatings of Jews in the streets of Boston reached epidemic levels in 1943, one hundred Jewish boys and girls, ages 12 to 16, sent a poignant petition to the mayor. 

 

The violence “makes us sometimes doubt that this is a democratic land,” the children wrote. “We cannot walk on the streets, whether at night or in the daytime, without fear of being beaten by a group of non-Jewish boys.” They pointed out that the environment had become so dangerous for Jews that Jewish Girl Scout troops and other social clubs had been forced to stop meeting.

 

Instead of taking action against the violence, Mayor Maurice Tobin dismissed the attacks as “strictly a juvenile problem,” while Governor Leverett Saltonstall accused the New York newspaper PM of being “utterly unfair” in criticizing the political leadership’s response to the crisis.

 

Given the fact that both the youth gangs and the Christian Front agitators were overwhelmingly Irish Catholic, the failure of the local Catholic leadership to speak out was especially troubling. 

 

Boston Irish activist Frances Sweeney was one of the few Catholic voices to challenge the silence. “The attacks on Jews…are the complete responsibility of Governor Saltonstall, Mayor Tobin, the [Catholic] church, and the clergy—all of whom [have] ignored this tragedy,” Ms. Sweeney charged.

 

Sweeney was the editor of a small crusading newspaper, the Boston City Reporter, which focused on exposing the antisemitic outbreaks and other instances of racism in the city. She was aided by a dozen volunteer researchers, including young Hentoff, who tracked the assaults and interviewed the victims. 

 

A fearless muckraker in the best sense of the word, Sweeney likely took her life into her hands when she infiltrated an event at South Boston High School in 1942 featuring the antisemitic priest, Rev. Edward Lodge Curran (he was known as the “Father Coughlin of the East”). Sweeney was spotted, roughed up, and physically thrown out of the building.

 

In her newspaper, Sweeney repeatedly called on the head of the Catholic Church in Boston, William Cardinal O’Connell, to “tell the faithful, without equivocation, to stop persecuting the Jews.” O’Connell summoned Sweeney to his office and threatened to have her excommunicated if she continued her “recklessly irresponsible attacks on the Church.” Not surprisingly, Sweeney refused. “The facts are the facts,” she replied. “Silence is a fact, especially when it comes from on high.”

 

Strong words of condemnation by Catholic leaders in the 1940s could have helped change the anti-Jewish atmosphere in Boston. Strong words of condemnation today by African-American leaders might influence those who have been assaulting Jews in Brooklyn in recent weeks. 

 

In the absence of such leadership, it is left to courageous individuals to speak out. That generation was fortunate to have Frances Sweeney. Will comparable voices emerge to counter the antisemitic violence of our own time?

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174071 https://historynewsnetwork.org/article/174071 0
Remembering Dennis Showalter, Grandmaster of Military History

 

On the evening of December 30, 2019, Dennis Showalter, a noted scholar of German, American, and military history, fought his last battle and rode off on a pale horse. He leaves behind a career spanning six decades of teaching countless undergraduate students, shepherding thousands of graduate students, authoring many seminal works, and demonstrating the enduring importance of military history in the minds of policy makers, service members, and the American public. 

 

As word circulates of his movement from practitioner of the art to subject of history, many individuals fortunate enough to have shared a room with him will undoubtedly give testimony to his excellence as a lecturer, even as he fought the cancer that eventually stilled his body. When he spoke on matters of history, culture, politics, or even baseball at gatherings, he did not simply lecture in the staid and muted voices too common in today’s academic halls. His deep baritone voice filled rooms with an oratorical performance like few others, weaving complex thoughts into a symphonic confluence of research, historiography, and common culture made easily accessible to elders of the discipline and the uninformed alike. Reading a book a day on average, he left few able to keep pace with him on a wide array of topics, including the uses of transportation, armor, and firepower; the ancient laws of piracy applied against Al-Qaida and ISIS; Buffy the Vampire Slayer as a cultural metaphor for PTSD in RAF combat pilots; and how a Yankees pitcher managed to flush a perfect game. 

 

Like most serious scholars, Dennis Showalter drew audiences with his depth and breadth of understanding on a wide array of subjects, but what kept students overfilling his classes and people coming back for more was the “Showalter Experience.” Over the years, my fellow surrogate academic sons and daughters of Dennis would often sit at the backs of rooms and watch new grad students enter conference panel sessions carrying notepads, pens, and almost dour expressions of academic seriousness. As he began to perform, we watched these sullen figures drop first their jaws and then their pens, becoming entranced by the man’s intellectual repertoire, sprinkled with cultural touchstones from science fiction and music and punctuated by the phrase “Now in this discussion, mind you, there are three points that need to be considered….” No one was able to compete with Dennis and his sharp wit. 

 

Few knew the mold from which Dennis Showalter was cast. Born in Minnesota in 1942, he had parents who were neither well-educated, monied, nor politically aligned. They were practical, stern, and fiscally conservative, and they pushed their son with an urgency that defined the children of the Great Depression, hoping their progeny need never suffer as they had. His mother was a homemaker who took care of the family while his salesman father was away, traveling from town to town in rural America and selling items door-to-door in places still recovering from the Depression and largely bypassed by the industrial transformation brought about by the Second World War. Reflecting in the last years of his life, Dennis frequently mentioned how he treasured the trips he made with his father once he was old enough: looking for customers, talking baseball, learning the value of money, and learning that everyone deserved to be treated with dignity, inherent value, courtesy, and the closeness of a friend one had yet to meet. On a subconscious level, those trips also taught him the mechanics of oration, the audience-performer dynamic, and the art of the show in the sale. When it came time to make a career, Dennis admitted he hated sales. As he was also “not good with his hands,” he knew the only path open to him was making “this education thing work.” So he took his mother’s steadfastness and his father’s ability to sell into the academic world, graduating with a BA from St. John’s University in 1963 and later earning a PhD from the University of Minnesota in 1969. At the latter, he became close friends with the noted Germanist Harold Deutsch, the WWII OSS chief of the European Axis Section of the Board of Economic Warfare, merging a thorough attention to detail with his now legendary showmanship. 

         

Entering the job market as the Vietnam War boiled over, and perceiving himself a fish out of water, Dennis Showalter set aside his political beliefs and pushed himself to measure up to the more “well-heeled crowd” of scholars with whom he found himself sharing office and classroom space. In 1969, he began teaching at Colorado College and, in spare hours, threw himself into writing. During this period, he published many of the still-authoritative works in his field, such as Railroads and Rifles: Soldiers, Technology, and the Unification of Germany; German Military History Since 1648: A Critical Bibliography; and Tannenberg: Clash of Empires, 1914. These were more than the requisite ticket-punching of a new scholar. They were (and still are) recognized as a cut well above the rest, garnering him Distinguished Visiting Professorships at the Air Force Academy, the United States Military Academy, and Marine Corps University, a position on the Iraq War Study Group, and a regular place on the national and international lecture circuit. Dennis was, to borrow the praise he later bestowed on others, “a first rate mind,” and he tried to use it to make the world a better place in keeping with American democratic principles. 

 

By the late 1970s, Professor Showalter was firmly entrenched in academic institutional circles, which gave him a vantage point from which to lift the profile of military history, a field much maligned by radicals in the aftermath of the Vietnam War. At first, he served as a Trustee of the then-named American Military Institute and as an Editorial Advisory Board member for Military Affairs. Following a major incident in which a small group of radicalized American Historical Association (AHA) participants disrupted a panel of American military historians at the 1984 AHA conference, Dennis was also one of the members who stepped in to heal the breach between the small, self-funded military history group and the congressionally chartered organization. There he charmed crowds with his special brand of humor and personal charm, inviting the social-history-centric clique to become involved in dialogues with the military history community. When Robin Higham retired after a lengthy term as editor of Military Affairs, Dennis Showalter, as president of the Society for Military History, helped steer the flagship publication, renamed The Journal of Military History, through a rough transition into the now longstanding editorial hands of Bruce Vandervort at the Marshall Foundation, opening the door to a broader field of scholarly subject matter. Around the same time, Dennis became a regular pinch-hitter for various academic presses as a series editor, using these outlets to aid the fledgling careers of young scholars by showcasing their work; these included Kathryn Barbier, Patrick Speelman, Michael Neiberg, and David Ulbrich, to name just a few. If someone had a good idea, he would find a matching outlet for their efforts, whether or not his politics aligned with the author’s views. Long before “diversity” became a cultural buzzword for change, Dennis Showalter was already ahead of the crowd, championing the (still) most underappreciated quotient of American culture: “the diversity of ideas.” 

 

“New ideas are always needed,” he once told me. “If they can stand up to inspection, then no one should be left out in the cold.” For these efforts and more, he was awarded the SMH’s Samuel Eliot Morison Prize in 2005 and the Pritzker Literature Award for lifetime achievement in 2018. At the time of his death, he was already working on his twenty-eighth book.

 

Still, Dennis Showalter never quite learned his father’s most important lesson, or so he occasionally told me. You see, like the best of educators, Dennis sold his audiences on his subject matter with enthusiasm, but, instead of bargaining or raising the price as others do, he gave away his most valuable possessions for free: his time and his example of how to be a good person in a solipsistic world. Dennis went the extra mile for anyone looking for advice or in need of assistance…even his few detractors. In nearly twenty years, I long ago lost count of the number of people in need to whom he lent money, how many checks he picked up for starving grad students, how many dinners or manuscript edits he dropped to rush to campus to aid a student in crisis. When duty called, he stepped up before others even recognized the need. For example, when Harold Deutsch died, Dennis finished his last book (originally entitled What If? and since repackaged as If the Allies Had Fallen) without question, losing time, money, and a few hairs off his head in the process. “If I make a promise, I stick to it,” my Doktorvater told me in the middle of a situation that would have broken others. “If I make a friend, I side with them to the bitter end.” As those close to him have long been intimately aware, this modus vivendi also extended to felines. Dennis Showalter never let a cat go hungry and never left the Colorado Springs pound without a grateful feline in arm…or two…or, in one case, three. He said he couldn’t “bear the thought of leaving them there to await a needle in a cold cell when there was room in my home.” 

 

It was an odd quirk of fate that put me in the same room with Dennis Showalter. One day, Bill Forstchen of Montreat College set out to widen Dennis’s audience by bringing him into mainstream publishing and to “get him paid what he was worth.” When the meeting was called, I had put my doctoral pursuits on hold to take care of my dying father. I knew of Dennis’s scholarly reputation and had read his seminal Railroads and Rifles and his biography of Frederick the Great (still the definitive books in the field several decades later), but I knew little else; still, I was to be the one to translate academese into business-speak and keep him on target for manuscript delivery, all without seeming more than an interested fan. In the weeks following 9/11, as the ashes of nearly 3,000 victims of America’s post-Cold War “Peace Dividend” rained down outside a Manhattan office, a group of us managed to craft for Dennis the dream project proposal he never thought an academic press would touch: a World War II dual biography akin to Stephen Ambrose’s Crazy Horse and Custer. Patton and Rommel was born that day, as was his follow-up, Hitler’s Panzers. My first eye-opener to the “Showalter Experience” came when he refused to allow me to call him “Doctor” or “Professor.” Those were “titles that got in the way,” he said. “Just call me Dennis.” The second came when the conversation turned to money. Upon hearing the suggested sum to go with the proposal, he looked around the office, fixed on a copy of a Britney Spears memoir, and asked with a broad grin if we “could possibly get Britney to appear on the cover? Maybe on top of a tank? Hey, if I’m getting my dream, why not get as many readers as possible?” 

 

When Dennis was first diagnosed with esophageal cancer, he asked me one night if I thought he had measured up to what his parents had wanted. I’ll tell you what I told him: “Some people are measured by their books, or their perceived professional reputation, or the number of bodies they leave behind them or, in our current age, their vigorous support for a given political ideology. You, sir, have an embarrassment of riches of which your parents would have been proud.” Dennis didn’t care about things for himself. He cared about what he could do with them to advance the cause of fellowship, free discussion, and an understanding of what it means to be human through the most horrific aspect of our animal dimension: war. There was a time when he was not here, and that time, with profound sadness I must admit, has come again. Many have spoken and will speak of him as a valued grandmaster of the profession, but, most importantly, as he now rides into the sunset of history, we should remember Dennis Showalter as a kind soul, a selfless friend, and a good man. 

https://historynewsnetwork.org/article/174072
A Play About Historical Reenactors Grapples With American Identity

You have all seen the historical re-enactors: the men, women, and children who dress up in period costumes, grab a musket, and re-fight battles of the Civil War, the American Revolution, and other conflicts. They have been in numerous movies (Lincoln, Glory, Gettysburg), appeared on hundreds of television programs, and been the subjects of countless magazine and newspaper articles. They jump back into history and bring it alive for us.

 

Talene Monahon’s new play, How to Load a Musket, takes a deep, hard look at the re-enactors of two wars, the American Revolution and the Civil War. There is a lot of humor connected to the American Revolution, but when she turns her sights on the Civil War she fires away at the lives of the re-enactors, and their views of history and politics, with a blazing musket of her own. She hits most of her targets, too. The play, which opened Thursday at the E. 59th Street Theater in New York, is a scorcher, and the big parades and quaint campfires we have come to know and love fade off into the distance as the playwright fires away about what America was really like, is like, and might be like in the future. It is a bare-knuckled, no-holds-barred historical brawl over the race issue in 1861 and today, too. She charges that the race argument is about today, not yesterday.

 

The play starts off in the office of the head of the Lexington, Massachusetts, re-enactment group, among its lovable members. They are cute and charming. One George Washington re-enactor says that he is actually jealous of another George Washington re-enactor. The Americans who play British soldiers poke fun at themselves, and a high-spirited middle-aged woman with a thick Boston accent giggles about the men she meets on the battlefield, as the men chuckle about the women. They all talk about how hard it is to meet people, but quite easy in the middle of a re-enactor battle. They discuss at length what a warm world they have created within the confines of the re-enactor universe.

 

When the playwright moves to the Civil War, though, the three-cheers-for-the-red-white-and-blue atmosphere changes and the terrain sizzles with debates over the role of re-enactors and which America they represent. There are loud and pronounced verbal fisticuffs over the controversial tearing down and removal of Confederate monuments and what many African Americans might really feel about race back then, today, and tomorrow morning.

 

This is an electrifying play that pulls no punches, a play that grabs your throat. It asks again and again: whose America was it in the past, and whose America is it today?

    

The playwright focuses much of the second half of the play on the 2017 white supremacist rally in Charlottesville, Virginia, held to support far-right political causes and to prevent the removal of statues of Confederate war heroes. There were KKK men and women in their white robes, far-right sympathizers, and dozens of Confederate flags flying in the breeze. The far-right marchers were opposed by hundreds of shouting counter-protesters. Things got out of hand. One woman was killed and several people were injured. The confrontation, recalled again in the play, drew international attention. In the play, the re-enactors fear they’ll be attacked, too.

 

That incident erupted into a national debate over racism and over President Trump’s famous line that there were good people on both sides. He should have said there were bad people in the crowds. The line is repeated in the play.

 

The great-grandson of a Confederate soldier says that what is happening in America, with monument removals and name changes, is “historical genocide,” and that liberals today are trying to seriously rewrite history and cut the stories of brave Confederate heroes out of it. This, he insinuates, denies a part of American heritage. It is, of course, a debate that has been raging for several years.

 

The Confederate soldier’s great-grandson notes that after the Civil War his family helped a newly freed slave family learn how to farm and take care of their home. America is not, he claims, just heroes and villains.

 

The play is more of a moving conversation and heated debate than it is either a comedy or drama. Ms. Monahon deftly turns it into a play, though, carrying you along in the trenches as the re-enactors debate their lives and their wars.

 

The playwright does step over the line a few times. She suggests that tomorrow morning the U.S. might plunge into a civil war over race. That is highly doubtful. She has an African American character say that Abraham Lincoln was a white supremacist. Oh, come on!

   

Jaki Bradley has done a fine job of directing this play. She has carefully woven dialogue and story to turn a serious debate into an engaging and rewarding play. All of the performers are superb: Bradley gets fine work from Carolyn Braver, Ryan Spahn, Adam Chanler-Berat, Andy Taylor, David J. Cork, Lucy Taylor, Richard Topol and Nicole Villamil.

 

This play is a bumpy night at the theater. If you go, regardless of your political persuasion – lock and load!

 

PRODUCTION: The play at the E. 59th Street Theaters is produced by the Less than Rent Theatre. Sets: Lawrence Moten; Lighting: Stacey Derosier; Sound: Jim Petty; Props: Caitlyn Murphy; Costumes: Heather McDevitt Barton. The drama is directed by Jaki Bradley. It runs through January 26. 

https://historynewsnetwork.org/article/174075
Digesting History: A Conversation with the Museum of Food and Drink’s Curatorial Director Catherine Piccoli

Catherine Piccoli is a food historian and writer whose work focuses on the intersection of food, culture, memory, and place. She brings this multidisciplinary approach to the Museum of Food and Drink. As curatorial director, she oversees the creation of MOFAD’s exhibitions and educational programming and guides the operations team. Catherine was instrumental in the research, writing, and development of past major exhibitions, Flavor: Making It and Faking It and Chow: Making the Chinese American Restaurant, as well as gallery shows Feasts and Festivals, Knights of the Raj, and Highlights from the Collection. She also established the museum’s robust public programming.

 

The following interview was originally conducted on November 25th, 2019.

 

What do you think separates MOFAD from other museums or similar institutions?

 

Because we are the Museum of Food and Drink, we start at a place of similarity with everyone. Everybody eats. Whether or not you like to eat, you have to do so multiple times a day. It’s something that we engage in out of necessity and something that we engage with through culture, so I think having that starting point in common means that it’s much easier to reach people, because they already have that interest in food. I think also because we believe that “Food is Culture,” and we do a lot of programming around that idea, we’re meeting people with an idea that they’re already comfortable and familiar with. Most people can think about what their families fed them, what they ate growing up, what is nostalgic to them, what is a part of their personal history and what that means to them. Then we can really take it from there and go in so many different directions and hopefully teach people something that they didn’t know before. A big thing for us internally is thinking about the invisible everyday, and you can do that so easily with food. When you open your refrigerator and look at, say, a Chinese takeout box: What is the history of that food? What is the history of a Chinese takeout box? What is the history of a refrigerator? Why do we have one? Why is it an electric refrigerator? With all of these sorts of things, we can really blow people’s minds wide open about food and use food as a lens to talk about larger ideas.

 

You mentioned that MOFAD’s slogan is “Food is Culture”. What does that mean to you in a historical context?

 

For me, that starts on a personal level: your family’s culture and history. Take me, for example. I am Italian-American and Polish- and Slovak-American, but I also grew up in the Midwest. So, growing up in Chicago, what are the things I grew up eating that my Italian family made? Or that my Slovak grandmother made? What does it mean to have grown up in a city with a really large Polish population? How did that impact the foods that I ate every day? And then you can go out even further than that, to Poland. Cuisine in Poland, culture in Poland: how does that travel? What does transnational cuisine look like? How do cuisine and culture change when people move? For me, thinking about “Food is Culture” is all-encompassing, from the micro to the macro.

 

I have noticed that MOFAD offers an abundance of public programming, and that programming is more interactive than I have seen at other museums. What do you think the advantages are of inviting the public to become active participants in history?

 

Not only in our public programming but also in our exhibitions, we feel it’s really important to engage people through all their senses. That’s easy to do, because food does that. When you come to see our exhibitions you will eat; you will literally, I like to say, “digest” the information that you have just digested. It’s really important when you’re talking about food to be able to experience it as well, and we do that in our exhibitions as well as our public programming. We just had Marcus Samuelsson come last week and talk about the release of Our Harlem as an audiobook. He had some of the people that he interviewed there, they had a panel discussion, and then there were foods from Red Rooster that people got to eat afterwards. Not everyone may be able to go to Red Rooster, but maybe you can come to MOFAD and taste some of those foods. Or not everyone may like to cook, or feel they’re good enough to try one of those recipes, so they, too, can come to MOFAD and try that. Through the years we’ve done programming and exhibitions around the flavor industry, which included programming around your sense of smell as well as things that you’re eating and tasting. We’ve done honey tastings in the past, wine tastings, beer and cheese pairings, all sorts of different things to help people continue to engage with the topic but also think more deeply about food and drink.

 

What do you think are the advantages and disadvantages of featuring only one exhibit at a time? For example, you are currently displaying "Chow: Making the Chinese American Restaurant".

 

For us, I guess you could say we are a fledgling institution, and our current space is called MOFAD Lab. We call it that instead of the Museum of Food and Drink because we really saw it as our experimental space, our exhibition design studio, or even our “test kitchen,” if you want another pun: a place where we can test out how to be a museum. While MOFAD has existed as an idea since 2010, it wasn’t until 2015 that we had our first physical space. The Lab is not big enough to have multiple full-size exhibitions, but that was okay for us because we’re still a small team, and doing one exhibition at a time really helps us focus and make the best exhibition possible. It has worked very well for us, I think, but it can be difficult at times. You know, Chow’s been open for a few years now, so some people think we’re the Chinese Restaurant Museum, or even a Chinese-American restaurant sometimes, which is a little bit silly. But we find that when people come in confused, once they get to MOFAD Lab and we can talk to them, they really understand what we’re doing and want to come back and see more. It is our goal, ultimately, to grow into an institution on the scale of the Smithsonian or the Met. Obviously, this is our first step towards that, and hopefully the next phase will be several galleries instead of one, so we can have multiple exhibitions at a time.  

What do you hope that a visitor who comes in with no prior knowledge gleans from your current exhibition? What do you want them to walk out of MOFAD thinking about?  

For us, a lot of it has to do with connection. With Chow, we’re using food as a lens to talk about racist immigration policy. We’re talking about the Chinese Exclusion Act and how, despite the fact that during that 60-year period Chinese people were functionally excluded from entering this country, the Chinese-American restaurant really blossomed and its cuisine became a part of the culinary zeitgeist. We want people to leave understanding why that’s a remarkable story and how that happened, but we also want visitors to go home and think about their local Chinese takeout place differently. Here in New York, a lot of the Chinese takeout restaurants are still family-run, and probably across the country as well. Hopefully, people are going into those restaurants and engaging with the folks that are running them. Who are those people? How did they get to the U.S.? What are their plans? What are their dreams? What are they cooking? We really want people to look at those spaces in a new way and engage with the folks who are cooking their food.

 

David Chang, a chef who I personally admire, has been talking a lot recently about trying to get people to rethink MSG. Do you bring that at all into your current exhibit or into your conversation about Chinese-American food?

 

It's funny you ask that. Our first exhibition at Lab was called "Flavor: Making It and Faking It," and it was on the history and the technology of the flavor industry. We told three main stories, one of which was the quote-unquote "discovery" of umami as a taste: how Dr. Kikunae Ikeda, a Japanese chemist, "discovers" it and names it and then begins manufacturing MSG in Japan. So we talked a lot about MSG in our first exhibition, and we decided not to have any panels about it in this exhibition. But we often get that question at our culinary studio, and we often refer people to Harold McGee's piece on MSG, which was in the first issue of Lucky Peach. But our stance as a museum on MSG, if that's what you’re asking, is that the studies have not borne out whether or not MSG is definitely bad for people. Now, obviously, everybody's body is different, so if somebody feels that they react to MSG we're not going to argue with them about that, because we don't know what's happening in each other's bodies. But, you know, MSG is used in so many foods in this country's industrial food system and has been since the '30s, in things like Campbell's soup here in the U.S. So, for us, it's not a scary thing. In the Flavor exhibition we did talk about the racist underpinnings of the fear of MSG and "Chinese restaurant syndrome," but again, that's not an active piece of this exhibition.

 

While conducting research for the CHOW exhibit, did any one dish or food strike you as having a particularly interesting history?

 

I'm going to sort of answer your question. I became really intrigued by chop suey. We ate a lot of it, because our initial thought was that our tasting at our culinary studio for Chow would be chop suey, historic tastings of chop suey. So we found a lot of historic recipes for chop suey dating back to the late 1800s. We made a lot of chop suey and we ate a lot of chop suey. It's one of those funny things: it's one of the first Chinese-American dishes to really blow up, if you will, but it's not something that's really on many menus anymore, and even now it's different. Those early recipes usually show a soy- or Worcestershire-based brown sauce, and today chop suey is made with white sauce. It's interesting to me to think about how dishes change over time and why. I didn't look into it very much, but I'm still fascinated by the idea that this one dish is a reason Chinese-American restaurants really became en vogue in the late 1800s and early 1900s, yet it's something that we don't really eat anymore.

 

What made you choose to focus on food history?

 

I majored in history as an undergrad; my full major was social and cultural history, which I got at Carnegie Mellon University in Pittsburgh. While I was at CMU I really thought I was going to be a music journalist. I also minored in clarinet performance, so I was taking a lot of music history classes as well. That was where my passions lay. I did take a class where we read Sidney Mintz's book Sweetness and Power, about sugar, and I was not touched by it. It's a seminal food studies book, but I was like "What is this? Why do I care?", which is funny to me thinking about it now. But I finished college and I didn't know how to write for Rolling Stone and didn't really know what was next. I was just working around Pittsburgh and started thinking about food in a different way in my 20s. I was having people over for dinner parties, getting into wine, those sorts of things. I had always been a good eater; both of my parents worked, so we spent a lot of time together around the dinner table, or in the kitchen on the weekends menu planning or cooking things for the week, or baking as a family around the holidays. Food was always the center of things that we did when I was growing up. So it makes sense that I rediscovered that in my early 20s, when I was working and forming a household of my own. I started thinking about food differently, I started interacting with food differently, and I started reading food writers such as Ruth Reichl and Michael Pollan. I started thinking about how I could have a career in food that wasn't necessarily working in a restaurant, because that wasn't something that I was interested in; I didn't want to become a chef. I saw an ad in the newspaper for a food studies program at Chatham [University], which is where I did my master's degree. It really all sort of clicked; it fell into place for me. And of course, while I was there I realized that I could study food history. Before then, I think I saw food history as the history of a recipe, or of a dish, or of a chef, which, again, is not personally where my passion for food history lies. I like thinking about people and place and culture and the bearing that has on your food and what you eat and why you eat that, and I really was allowed to do that there. When I moved to New York with a master's degree, and again wasn't really sure what I was going to do, I found MOFAD. It was sort of perfect, because I had volunteered and interned and worked at history museums during and after college, and here was this museum that had food as its central focus; it made a lot of sense for me. I lucked out, I think, finding MOFAD and realizing that food history made sense for me, and being able to hold onto it and keep working.

 

As an undergrad studying history, I feel compelled to ask this question. How did your academic career influence your working career? (The answer to this question was submitted after the interview via email.)

 

There are the obvious skills around research (using databases to locate materials, analyzing primary and secondary sources, crafting and conducting oral history interviews) and writing (synthesizing and analyzing research, crafting tight and compelling narratives). A few other skills also come to mind that I've jotted down below:

 

1. Learning the formal way to address and communicate with professors: One of the first things my freshman seminar professor taught our class was the proper way to interact with professors – how to address them in person and over email, how to keep our requests short and respectful. It feels so simple now, but I'm so glad I learned this skill early on in my academic career. At MOFAD, I often have to reach out to professionals, academics and others, with no introduction. Sending that first professional email can set the tone for a productive working relationship.

 

2. Comfort using non-traditional primary sources: Perhaps "non-traditional" is not quite the right term. Still, I became quite comfortable during my undergraduate coursework for my music degree in using performances, songs, lyrics/poems as primary sources. This has served me well as a food historian where cookbooks, agricultural manuals, and recipes can serve as primary sources.

 

3. Communicating why you should care: I think this is something I began to learn as an undergrad, but really honed during my graduate work. Whenever I write, I keep the question "But why should I care?" in the back of my mind and try to answer it (sometimes again and again). I think with any topic, but especially with a topic rich in materiality like food, it's so important to convey to your reader why this thing matters, why they should care. What can a historical event teach us about current events that are affecting our daily lives?

 

Where do you see MOFAD headed in the future?

 

My dream for MOFAD is that we can continue growing and can continue putting together meaningful and thoughtful exhibitions. We're in a bit of a transition right now. Our next exhibition, which I'm really excited about, is called "African/American: Making the Nation's Table," and it's about the many contributions of African Americans to the creation of American cuisine. As part of that, we won the rights to the Ebony Test Kitchen from the Johnson Publishing Co. building. That was the test kitchen where all of the recipe testing was done for Ebony Magazine, Jet Magazine, and some other Johnson Publishing Co. magazines. It's really exciting for us to have this historic and crazy, super-psychedelic, 1970s, orange, purple, green, swirly kitchen from my hometown of Chicago on display as part of that. That exhibition will be on show at a different space, not MOFAD Lab but the Africa Center, a museum in Harlem. It'll be on display there for six months next year, and then it will travel. So that's amazing for us, a travelling exhibition. And then from there, we're figuring out what's next for us, where we'll go. I think that for me and for our staff, we're not trying to be, and we don't want to be, a place like the Museum of Ice Cream or one of those sorts of Instagram “experiential” museums. We are really hoping that people come and have an "a-ha" moment, and that they learn something about food that they didn't know, or are inspired to think more deeply about the things they're putting into their bodies, or about how foods and drinks get to their plates and cups. I think that we've been able to do that with our exhibitions so far, and I just hope that they keep getting bigger and better.

 

 

Catherine Piccoli can be found on Twitter @gigaEats and Instagram @giga.eats. The Museum of Food and Drink can be found at mofad.org, as well as on Twitter and Instagram @mofad.

https://historynewsnetwork.org/article/173798
The Supreme Court Historical Society: An Interview with the President

The Supreme Court Historical Society is a private, non-profit organization dedicated to the collection and preservation of the history of the Supreme Court of the United States. It was founded in 1974 by Chief Justice Warren E. Burger and is still in operation today. 

 

I recently interviewed the President of the Supreme Court Historical Society, Chilton Varner. Ms. Varner is from Atlanta, Georgia, where she is a litigator at the law firm King & Spalding. She graciously spoke about the work of the Society and its relation to history. 

 

Ms. Varner is passionate about the Society. She listed a number of things the Society does for the public, including the creation of various lesson plans, scholarly publications, an annual reenactment of landmark decisions, and a lecture series open to the public.

 

Ms. Varner’s favorite activity the Society sponsors is the Supreme Court Summer Institute for Secondary Teachers. This program allows thirty secondary teachers to come to the Court to enhance “the level of their instruction about the court” for their own students. The teachers come to Washington D.C., where they are given a tutorial about the Court, the Constitution, and the Judiciary Branch. They are able to interact with one another and the Chief Justice of the Supreme Court. Ms. Varner described how this program leaves the teachers “excited, energized, and armed with new information” which they bring back to the students they teach. 

 

Ms. Varner also noted that civics courses have been disappearing from schools around the country. She values these courses and wishes to see more young people educated about the government and its history. She would also like to see more young people sign up for the Historical Society, because it “keeps them current, teaches them about legal history, and is important for the future.” It is exciting for her when she sees young people from her own law firm join. Ms. Varner expressed the importance of young people learning history. The Supreme Court Historical Society strives not only to keep the history of the Court alive, but to pass it on to the next generation. 

 

The Supreme Court Justices have been critical in the success of the Historical Society. Ms. Varner was greatly appreciative of all the Justices and what they have done for the Society. From introducing guest speakers to providing various forms of support, they have been key to the function of the Society. Ms. Varner noted the amount of time the Justices give to the Society. She believes they “recognize the importance of their own Court’s history” and the importance of the Society.

 

Ms. Varner herself is a part of history. In 1983, she became the first woman litigation partner at her firm and was the only woman trial lawyer there for a number of years. Since the start of her time as a practicing attorney, she has seen numerous changes for women in law. Most notably, when she argued in front of the Eleventh Circuit Court of Appeals, it was before a panel of all women. While Ms. Varner says there is “still a way to go and more progress to be made,” today is a very different environment, where “nobody blinks an eye now when a woman trial lawyer stands up to strike a jury.” Ms. Varner sees trial lawyers like herself as “historians” in both their lives and what they are arguing. 

Throughout my interview with Ms. Varner, she was clear about the importance of history in general and that of the Supreme Court. Ms. Varner highlighted just how important the Supreme Court Historical Society is in preserving this history and educating the public. You can check out the Historical Society's website at https://www.supremecourthistory.org/.

https://historynewsnetwork.org/article/173476
Do Morals Matter in Foreign Policy?

 

Do morals matter in American foreign policy, or is American moralism just hypocrisy, as realists teach us? Conventional wisdom is skeptical, and surprisingly few books and articles focus on whether presidents’ moral views affected their foreign policies and how that should affect our historical judgment of them. I set out to answer these questions in my new book Do Morals Matter? Presidents and Foreign Policy from FDR to Trump. 

 

Examining 14 presidencies since 1945 shows that a radically skeptical view of morality is bad history. Morals did matter. For example, a purely realist account of the founding of the postwar order in terms of the bipolar structure of power or an imperial imposition of hegemony does not explain FDR’s Wilsonian design or Harry Truman’s delay in adapting it after 1945, or the liberal nature of the order that was created after 1947. George Kennan suggested a realist policy of containment, but Truman defined and implemented it in broader liberal terms. 

 

Similarly, an accurate account of American intervention in Korea in June 1950—in spite of the fact that Secretary of State Dean Acheson had declared earlier that year that Korea was outside our defense perimeter—would have to include Truman’s axiomatic moral decision to respond to what he saw as immoral aggression. Similarly, to explain the major elevation in the priority of human rights in American foreign policy after the Vietnam era, we must include the moral outlook of Jimmy Carter. Ronald Reagan’s decision to ignore his advisors and his previous harsh rhetoric about the “evil empire” must be understood in the light of his personal moral commitment to ending the nuclear threat.

 

Looking back over the past seven decades of American primacy, we can see certain patterns in the role of ethics and foreign policy. All presidents expressed formal goals and values that were attractive to Americans. After all, that is how they got elected. All proclaimed a goal of preserving American primacy. While that goal was attractive to the American public, its morality depended on how it was implemented. Imperial swagger and hubris did not pass the test, but provision of global public goods by the largest state had important moral consequences.

 

The moral problems in the presidents’ stated intentions arose more from their personal motives than from their stated formal goals. Lyndon Johnson and Richard Nixon may have admirably sought the formal goal of protecting the South Vietnamese from Communist totalitarianism, but they also expanded and prolonged the war because they did not want to be “the man who lost Vietnam.” In contrast, Truman allowed his presidency to be politically weakened by the stalemate in Korea rather than follow General Douglas MacArthur’s advice to use nuclear weapons. Morality mattered greatly in both these cases.

 

If we judge morality along the three dimensions of intentions, means, and consequences, the founding presidents of the post-1945 world order—FDR, Truman, and Eisenhower—all had moral intentions, both in values and personal motives, and largely moral consequences. Where they sometimes fell short was the dimension of means, especially the use of force. In contrast, the Vietnam-era presidents, particularly Johnson and Nixon, rated poorly on motives, means, and consequences. The two post-Vietnam presidents, Gerald Ford and Jimmy Carter, had notably moral foreign policies on all three dimensions, but their tenures were brief, and they illustrate that a moral foreign policy is not necessarily the same as an effective one. The two presidents who presided over the end of the Cold War, Reagan and George H.W. Bush, also scored quite well on all three dimensions of morality. The years of unipolarity and then the diffusion of power in the twenty-first century produced mixed results, with Bill Clinton and Barack Obama above the average and George W. Bush and Donald Trump falling well below it. Among the fourteen presidents since 1945, in my view the four best at combining morality and effectiveness in foreign policy were FDR, Truman, Eisenhower, and Bush 41. Reagan, Kennedy, Ford, Carter, Clinton, and Obama make up the middle. The four worst were Johnson, Nixon, Bush 43, and (tentatively, given his term is incomplete) Trump. Of course, such judgments can be contested, and my own views have changed over time. Historical revision is inevitable as new facts are uncovered and as each generation re-examines the past in terms of new circumstances and its changing priorities.

 

Obviously, such judgments reflect the circumstances these presidents faced, and a moral foreign policy means making the best choices that the circumstances permit. War involves special circumstances. Because wars impose enormous costs on Americans and others, they raise enormous moral issues. Presiding over a major war such as World War II is different from presiding over debatable wars of intervention such as Vietnam and Iraq. 

 

The importance of prudence as a moral virtue in foreign policy becomes clear when one compares Dwight Eisenhower’s refusal to send troops to Vietnam with John Kennedy and Lyndon Johnson’s decisions. After losing 241 Marines in a terrorist attack during Lebanon’s civil war in 1983, Reagan withdrew the troops rather than double down. Similarly, Obama and Trump’s reluctance to send more than a small number of forces to Syria may look different with time. Bush 41 was criticized for restricting his objectives, terminating the Gulf War after four days’ fighting, and not sending American armies to Baghdad in 1991, but his decision seems better when contrasted with the lack of prudence that his son showed in 2003 when members of his administration expected to be greeted as liberators after the invasion of Iraq and failed to prepare adequately for the occupation. In foreign policy as in law, some levels of negligence are culpable.

 

Realists sometimes dismiss prudence as an instrumental rather than a moral value, but given the complexity and the high prospect of unintended consequences in foreign policy decisions, the distinction between instrumental and intuited values breaks down, and prudence becomes a crucial virtue. Moral decisions in foreign policy involve both intuition and reason. Willful ignorance or careless assessment produces immoral consequences. Conversely, not all decisions based on conviction are prudential, as some of my cases indicated. Truman’s response to North Korea’s crossing of the 38th parallel, for example, was imprudent, though he saw it as a moral imperative. These reasoned and intuited virtues can conflict with each other; principle and prudence do not always coincide. The problem presidents often face is not a question of right versus wrong, but right versus right. But whatever the choices, they cannot be dismissed on the basis of simplistic realist models of “national interest.” What matters is how interest was defined, and to answer that question, good history shows that morals mattered. 

https://historynewsnetwork.org/article/174018
Our GOP Problem

New York Times, April 2, 1950

 

Stone Age Brain is the blog of Rick Shenkman, the founding editor of the History News Network. His newest book is Political Animals: How Our Stone-Age Brain Gets in the Way of Smart Politics (Basic Books, 2016). You can follow him on Twitter @rickshenkman.

 

The Republican Party has taken the position that President Trump should be defended. This is unsurprising, because it is what parties in power do. If we want to explain what has happened to the Republican Party, which all must try to do in this hour of crisis when democracy itself is on the line owing to Republican perfidy, it is essential to view events not from the perspective of the rational actor but from that of the party politician. Only then do the alarming events through which we are living become understandable. 

 

We must begin with the basics. Three overriding causes may be said to account for the behavior of politicians holding national office. The first is money, with which we need not concern ourselves too much; the role money plays in our politics is obvious. Members of Congress must always be thinking in the back of their minds how any vote may affect their chances of financing their next election. Every GOP member of Congress has to worry that a vote against Trump would cut them off from the various campaign funds available to Republicans in good standing with the party and with the party’s major donors, such as Sheldon Adelson and Charles Koch. 

 

More interesting, though also obvious, is the second factor, pure partisanship. The social sciences tell us that partisanship is hard-wired in the human brain.  It is the reason we cheer for our side in a ball game and hope for the opposition’s defeat.  Once we identify with a group we look for evidence that confirms the group’s status and dismiss evidence that detracts from it.  Because partisanship is stronger among Republicans generally than it is among Democrats, perhaps owing to a default loyalty bias among people who identify as conservative, it is pretty easy to comprehend the ordinary Republican’s behavior in ordinary times.  

 

Of course, these are not ordinary times.  Presidents are rarely impeached.  So Judiciary Committee Chairman Jerry Nadler, ahead of the committee’s vote on impeachment, issued a rare plea that his Republican colleagues consult their consciences before voting.  As many have noted Republicans during Watergate did just this, voting against Nixon when they similarly faced an impeachment vote.  Why is no Republican doing that this time?

 

The explanation may be found in the third factor accounting for the behavior of politicians, and it is this one that is perhaps most telling in the current situation. Politicians prefer winning over losing, and recent history suggests that the way to win, notwithstanding the losses the party suffered in 2018, is to stand with Donald Trump. By nature politicians are cautious. The only way to know what will succeed in winning votes is to follow the path of proven winners like Trump. As long as he appears to retain the support of the GOP base, it is prudent to assume that he has figured out the magic sauce in the recipe of political victory and to follow the recipe closely. Only a few dare to tamper with the ingredients.

 

Change is unlikely in the Republican Party short of a massive defeat. Only in defeat do politicians, facing years in the wilderness, risk experimenting with new approaches. Thus far there’s little sign that the party base is having second thoughts about Trump. He remains nearly as popular among Republicans today as he was when he was elected. Polls show his support among Republicans in states like California and Texas is north of 85 percent. Nixon’s support, by contrast, began to collapse by the time he faced impeachment. At the beginning of 1973, before Watergate shook the country, Nixon had the support of 91 percent of GOP voters. By the end of the year — a year in which John Dean testified about payoffs to the Watergate burglars and Special Prosecutor Archibald Cox was fired in the Saturday Night Massacre — Nixon’s support in the GOP had fallen to 54 percent.  

 

So the real question isn’t why members of Congress are remaining staunch Trump supporters, but why the GOP base is.  Many reasons have been offered for this strange phenomenon (strange because Trump is so unlikely an avatar of Republican virtue). They include Fox News, Rush Limbaugh, and the other leading cogs in the propaganda machine that props up the Republican Party.

 

Whatever the cause of Trump's hold over the GOP base, it's a fact, and we as a country need to do something about it. We have to hope that the GOP evolves into a better version of itself because, as Arthur Schlesinger Jr. observed in an article in the New York Times in 1950, this country needs two intelligent parties. Right now we've got just one.  Only the Democrats are grappling with the real problems the United States faces, among them climate change and inequality.  This is untenable over the long term.

 

Through much of our history we have had a responsible conservative party, as Schlesinger noted in his piece in the Times. In the early republic the party of Jefferson was cross-checked by the party of Hamilton and Adams. In the next generation Jacksonians faced off against Whigs, and while the Whigs eventually disappeared, for decades they offered Americans like Lincoln an intelligent alternative. In the postbellum period the GOP espoused (for a time) a bold vision of racial equality and entrepreneurial zeal. Later it was captured by the plutocrats, but by the turn of the twentieth century reform elements led by Teddy Roosevelt succeeded in refashioning the party as an engine of reform. In the 1920s the party once again became beholden to the rich, until the Great Depression put an end to its control of the federal government. For a couple of decades it nearly ceased to exist at the national level. Then, as if in response to Schlesinger’s call, the party finally made peace with the New Deal under the leadership of Dwight Eisenhower. “Should any political party attempt to abolish social security, unemployment insurance and eliminate labor laws and farm programs,” Ike wrote, “you would not hear of that party again in our political history.”

Under both Richard Nixon and Ronald Reagan the GOP continued to deal with real-world problems, particularly in foreign affairs. But in the years following the end of the Cold War, Republicans gave themselves over increasingly to fake nostrums. They did this because they found they couldn’t win by running on their real agenda: tax cuts for the wealthy, which constituted nearly the whole of their domestic program once welfare had been reformed in the 1990s.  

 

Trump in 2016 correctly identified several key issues that demand public attention, especially the decline and demoralization of much of rural America. But rather than offer a rational program to address this and other issues, he won election by dividing the country along racial and religious lines. Instead of appealing to the better angels of our nature, he played his voters for fools. He began his time in the national political spotlight by hinting that Barack Obama was born in a foreign country and might be a secret Muslim. Later he signed up as a card-carrying member of the anti-science brigade of climate change deniers. Throughout his presidency he’s spread rumors of conspiracies. And his biggest "accomplishment"? It was giving the wealthy huge tax breaks.

 

What if the GOP doesn’t reinvent itself as a responsible party? Schlesinger worried seven decades ago that the GOP could collapse into pieces, leaving “its members prey for fascist-minded demagogues.” There was, it turns out, another possibility Schlesinger didn’t anticipate. It’s that the party would hold itself together by appealing to a trinity of fascist evils: xenophobia, racism, and authoritarianism. This should worry all of us.

 

 

 

https://historynewsnetwork.org/blog/154299
Chief Justice John Roberts' Predecessors: The Supreme Court Chief Justices Who Presided Over Previous Impeachment Trials

 

As the Senate impeachment trial of Donald Trump looms, many aspects of the trial remain undetermined. Will the parties call witnesses? How long will it last? How seriously will Senate Majority Leader Mitch McConnell take it? 

 

One aspect that is determined but often misunderstood is who presides over the trial. As Chief Justice John Roberts, appointed by President George W. Bush in 2005, readies himself for his historic role as the presiding judge over the trial, it is instructive to look back at the experiences of the two prior Chief Justices who presided over the trials of President Andrew Johnson in 1868 and of President Bill Clinton in 1999.

 

Salmon P. Chase, Chief Justice from 1864-1873, and William Rehnquist, Chief Justice from 1986-2005, both faced great pressures as presiding judges over highly partisan impeachment trials. Neither man’s career was free of controversy, but both had the responsibility to uphold the Constitution at a time of great turmoil, and both did so, in Chase’s case after an early period of contention.

 

Salmon P. Chase’s career reflected the realignment of political parties in the mid nineteenth century. He was a member of the Whig Party in the 1830s, the Liberty Party of the 1840s, the Free Soil Party from 1848-1854, the Republican Party from its founding in 1854 to 1868, and finally, the Democratic Party in the last five years of his life, while still serving as Chief Justice by appointment of Abraham Lincoln.

 

Chase helped recruit former Democratic President Martin Van Buren to run as the Free Soil Presidential candidate in 1848; helped found the Republican Party on the same principles of antislavery activism; sought the Republican nomination for President in 1860 before Lincoln was selected by the Republican National Convention; and he sought the Presidency on the Democratic Party line in 1868 and the Liberal Republican line in 1872 while still serving as Chief Justice.  He had a varied career as Ohio Senator (1849-1855), Governor (1856-1860), and Secretary of the Treasury under Lincoln (1861-1864).

 

Chase attempted to establish the concept of unilateral rulings on procedural matters during the early days of the trial of Andrew Johnson, but he was overruled by the Senate majority, controlled by Radical Republicans, and quickly gave up trying to control the trial. After the early turmoil, he moved toward neutrality and simply presided as the trial moved forward.

 

William H. Rehnquist could not have been more different from Salmon P. Chase in his political leanings. As far “left” as Chase was in his times, Rehnquist was far “right,” starting his political career as a legal advisor to Republican Senator Barry Goldwater of Arizona in his failed 1964 campaign for President. Rehnquist was appointed Assistant Attorney General of the Office of Legal Counsel in 1969 by President Richard Nixon. 

 

Nixon nominated him for the Supreme Court in late 1971, and he was confirmed and sworn in during the first week of 1972. Rehnquist served nearly 34 years on the Court and was elevated to Chief Justice in 1986 by President Ronald Reagan. He was regarded as the most conservative member of the Warren Burger Court and was one of the most consistently conservative Justices of modern times. Rehnquist recused himself from US v. Nixon in 1974, the case in which the President was ordered to hand over the Watergate tapes to Special Prosecutor Leon Jaworski, leading to Nixon’s resignation on August 9, 1974.

 

Presiding over the Bill Clinton impeachment trial in early 1999, Rehnquist chose to resist any attempt to influence the trial promoted by the staunchly conservative Republican leadership in the House of Representatives, led by Speaker of the House Newt Gingrich and House Judiciary Committee Chairman Henry Hyde. Despite his strong conservative credentials, Rehnquist managed always to get along well with his Supreme Court colleagues, and there were no controversies about his handling of the Clinton impeachment trial.

 

He was, despite his right-wing credentials and voting record on the Court, seen as fair-minded, approachable, and a far more unifying leader of the Court before and after the Clinton impeachment trial than Chase was before and after the Andrew Johnson impeachment trial.

 

Now, Chief Justice John Roberts, who clerked for Rehnquist in 1980-1981, is faced with the same challenge of presiding over a highly charged impeachment trial.

 

Roberts worked in the Ronald Reagan and George H. W. Bush administrations in the Justice Department and the Office of White House Counsel, then as Principal Deputy Solicitor General, followed by private law practice before his appointment to the DC Circuit Court of Appeals by George W. Bush in 2003. In 2005, he was nominated to replace the retiring Associate Justice Sandra Day O’Connor, but before hearings could begin on the nomination, Chief Justice Rehnquist died. Roberts was then nominated to replace Rehnquist.

 

Roberts has been very clear about his desire to run a Court that has the respect and regard of the American people. While he has displayed a strong conservative judicial philosophy in his 14-plus years on the Court, he has also shown a willingness to work with the Supreme Court’s liberal bloc, and he has been seen as the “swing” vote on the Court since Associate Justice Anthony Kennedy retired in 2018.

 

He has surprised many liberal commentators with some of his votes, including the preservation of “ObamaCare.” He is seen as comparatively moderate and conciliatory, and he has been somewhat critical of statements by President Donald Trump alleging bias among judges appointed by Presidents Bill Clinton, George W. Bush, and Barack Obama.

 

It is clear that Roberts, only the 17th person to head the Supreme Court, wants a good historical reputation. While he will work to avoid controversy in the upcoming Trump impeachment trial, he will also wish to preserve respect for the Constitution, democracy, and the rule of law, and he will be the center of attention in the coming weeks and months.

Roundup Top 10!  

The Job of the Academic Market

by Rebecca S. Wingo

Over three years, I dedicated 106.5 workdays to getting a job—while working another job. 

 

Prohibition Was a Failed Experiment in Moral Governance

by Annika Neklason

A repealed amendment and generations of Supreme Court rulings have left the constitutional regulation of private behavior in the past. Will it stay there?

 

 

History and the Opioid Crisis

by Jeremy Milloy

In the 1970s, just as now, people living with and recovering from substance use disorders faced prejudice and mistreatment at the hiring stage and in the workplace itself.

 

 

1619?

by Sasha Turner

What to the historian is 1619?

 

 

Boris Johnson Might Break Up the U.K. That’s a Good Thing.

by David Edgerton

It’s time to let the fantasy of the “British nation” die.

 

 

The problem with a year of celebrating the 19th Amendment

by Andrew Joseph Pegoda

Our entire understanding of the history of feminism is skewed.

 

 

Assassination as Cure: Disease Metaphors and Foreign Policy

by Sarah Swedberg

Kinzinger’s words fit within a long historical tradition of badly used disease metaphors that often accompany bad outcomes.

 

 

Another Disability Disaster in the Making

by Jonathan M. Stein

The Trump administration’s Social Security proposal would repeat one of Ronald Reagan’s most damaging mistakes.

 

 

How the President Became a Drone Operator

by Allegra Harpootlian

From Obama to Trump, the Escalation of Drone Warfare

 

 

 

What Australia’s Fires Should Teach the USA: Be Alarmist!

by Walter G. Moss

Most importantly in this 2020 election year, the Australian tragedy tells us we should vote out all the human-caused climate-change deniers and minimizers.

Stepping Back From the Brink of War

Trump’s order to kill General Soleimani is one of the most reckless acts taken by a president, one who once again has put his personal political interest above the nation’s security. Certainly, Soleimani deserved to meet his bitter fate. He was behind the killing of hundreds of American soldiers in Iraq while threatening and acting against American allies. However, killing him without considering the potentially dire regional repercussions and without a strategy, under the guise of national security concerns, is hard to fathom. Republican members of Congress who praised the assassination of General Soleimani seem to be utterly blinded by their desire to see him eliminated. What will happen next, they seem to have no clue.

Trump, who is fighting for his political life, appeared to care little about the horrifying consequences as long as he could distract public attention from his political woes. He made the decision to assassinate Soleimani seven months ago, but he gave the order now to serve his own self-interest, especially in this election year, when he desperately needs a victory while awaiting an impeachment trial in the Senate.

During the Senate briefing on Iran led by Secretary of State Pompeo, Secretary of Defense Esper, and CIA Director Haspel, the officials produced no evidence that there was an imminent danger of an attack on four American embassies orchestrated by Soleimani, as Trump has claimed. In fact, Esper said openly in a January 12 interview that he saw no evidence. Republican Senator Mike Lee labeled it as “probably the worst briefing I have seen, at least on a military issue…What I found so distressing about the briefing is one of the messages we received from the briefers was, ‘Do not debate, do not discuss the issue of the appropriateness of further military intervention against Iran,’ and that if you do ‘You will be emboldening Iran.’”

Now, having failed to produce evidence of imminent danger, the Trump administration claims that the killing of Soleimani was part of a long-term deterrence strategy. The assassination has certainly emboldened Iran’s resolve to continue its nefarious activities throughout the region, and the measure Trump took to presumably make the US more secure has in fact done the complete opposite. It has created new mounting problems and multiple crises. Trump dangerously escalated the conflict with Iran; severely compromised the US’ geostrategic interest in the Middle East; intensified the Iranian threat against our allies, especially Israel; led Iran to double down in its support of terrorist and Jihadist groups; badly wounded the US’ relations with its European allies; made the US appear untrustworthy to friends and foes alike; and pushed Iran to annul much of the nuclear deal, all while impressively advancing its ballistic missile technology.

And contrary to Trump’s claim that he made the right decision for the sake of American security, 55 percent of voters in a USA Today survey released on January 9 said he made the US less safe. And now we are still at the brink of war. Although Iran has admitted to being behind the attack on the Asad air base in Iraq, it initiated the attack to save face in the eyes of its public and demonstrate its possession of precision missiles and willingness to stand up to the US.
This retaliation was expected, but since Iran wants to avoid an all-out war, it was strategic and carefully calculated to inflict the fewest American casualties, if any, in order to prevent a vicious cycle of retaliatory attacks which could get out of control and lead to war. This, however, does not suggest that Iran will stop its clandestine proxy operations, employing its well-trained militias in Iraq, Yemen, and Syria to execute new attacks on American and allied targets in the region while maintaining deniability. Similarly, the clergy can also pressure hawks in and outside the government to avoid any provocative acts against the US. Iran is patient and will carefully weigh its gains and losses before it takes the next step.

Following Iran’s attack on the Asad base, Trump has also shown restraint, because he too wants to prevent an all-out war, knowing that even though the US could win it handily, it would be the costliest of victories in blood and treasure, and certainly in political capital.

The whole mess began when Trump withdrew from the Iran deal. What did Trump think he could accomplish? Withdrawing from the deal without any substitute, without consultation with the European signatories, and while re-imposing sanctions, especially when Iran was in full compliance with all the deal’s provisions, was dangerously reckless, undermining our national security interests and jeopardizing the security of our allies in the region. The Iran deal was not perfect, but the idea was to build on it, gradually normalize relations with Iran, and prevent it from acquiring nuclear weapons altogether as it worked to become a constructive member of the community of nations.

To resolve the crisis with Iran, the US must demonstrate a clear understanding of the Iranian mindset. Iran is a proud nation with a long and rich history; it has huge natural and human resources, is the leader of the Shiite world, occupies one of the most geostrategic locations in the world, and wants to be respected. The Iranians are not impulsive; they think strategically and are patient, consistent, and determined. The revocation of the Iran deal simply reaffirmed Iran’s distrust of the US, from the time the CIA toppled the Mosaddeq government in 1953 to the continuing sanctions, adversarial attitude, and open calls for regime change.

Both Khamenei and Trump have their own domestic pressures to contend with and want to avoid war. The Iranian public is becoming increasingly restive; people are back in the streets demanding immediate economic relief. Conversely, Trump has calculated that further escalation of violent conflict with Iran would erode rather than enhance his political prospects and could make defeat in November all but certain. West European countries are extremely sensitive to any major escalation of violence, as it would lead to mounting casualties and destruction on all sides. Iran can resort to a wide range of hostile measures, including disrupting oil supplies from Saudi Arabia and other Gulf states by mining the Strait of Hormuz, through which 21 million barrels per day (21% of global oil consumption) pass, resulting in massive economic dislocation in the Middle East and Europe in particular.

The pause in hostilities offers a golden opportunity to begin a new process of mitigation. Germany, France, and Britain have already engaged the Iranians in an effort to ease the tension between Iran and the US and create conditions conducive to direct US-Iran negotiations.
By now, Trump must realize that Iran cannot be bullied and that the only way to prevent it from pursuing nuclear weapons is through dialogue. Regardless of how flawed Trump considers the Iran deal, it still provides the foundation for a new agreement, as many of its original provisions remain valid and can be built upon. Other conflicting issues between the two sides, especially Iran’s subversive activities, should be negotiated on a separate track. In some ways, both Iran and the US need to lick their wounds and begin a new chapter, however long and arduous it may be, because war is not and will never be an option.

Trump Checks all of the Impeachment Boxes: Will it Matter?

 

In my last article, I wrote that there is something wholly different about Donald Trump’s actions from those of other presidents who have exceeded their power. Why? Unlike other presidents, Trump’s actions meet each of the requirements that the Framers laid out to impeach a president. Ironically, just as impeachment is needed most, the partisan tenor of the times may make it impossible to accomplish.

 

The Framers of the Constitution included the power of impeachment for instances of treason, bribery, or other high crimes and misdemeanors committed by executive branch officials and the president. They rejected policy disputes as a basis for impeachment. When George Mason proposed adding maladministration as an impeachable offense, Madison responded that “so vague a term will be equivalent to tenure during pleasure of the Senate.” It was at this point that “high crimes and misdemeanors” was added. While the first two categories are clear, the third sounds vague to us today. Yet the Framers had a clear idea of what it meant. As I wrote previously for the History News Network, the Framers “thought that the power of impeachment should be reserved for abuses of power, especially those that involved elections, the role of foreign interference, and actions that place personal interest above the public good,” which fall within the definition of high crimes and misdemeanors. Professor Noah Feldman of Harvard Law School, in his testimony to the House Judiciary Committee, said that for the Framers “the essential definition of high crimes and misdemeanors is the abuse of office” by a president, of using “the power of his office to gain personal advantage.” Trump’s actions with Ukraine check each of these boxes.

 

Presidents and Congress have often found themselves in conflict, dating back to the earliest days of our Republic. Part of this is inevitable, built into a constitutional system of separation of powers with overlapping functions between the two branches. Only Congress can pass laws, but presidents can veto them. Presidents can negotiate treaties but must obtain the advice and consent of the Senate. As Arthur Schlesinger Jr. observed, checks and balances also make our political system subject to inertia. The system only really works “in response to vigorous presidential leadership,” Schlesinger wrote in The Imperial Presidency. But sometimes presidents grasp for powers that fall outside of the normal types of Constitutional disputes. This is where impeachment enters the picture.

 

In the early Republic, disputes between presidents and Congress revolved around the veto power. Andrew Jackson was censured by the Senate over his veto of bank legislation and his subsequent removal of federal deposits from the Second Bank of the United States. Jackson’s actions showed a willingness to grab power and to ignore the law and the system of checks and balances when it suited his purposes. While the censure motion passed, it did not have the force of law. It was also unlikely that impeachment would have been successful, since the dispute was over policy. While the president had violated the law, not all illegal acts are impeachable. As Laurence Tribe and Joshua Matz have noted, “nearly every president has used power in illegal ways.” Impeachment is meant to be limited to those actions, like treason and bribery, “that risk grave injury to the nation.”

 

Congress considered impeaching John Tyler in 1842 over policy disputes in which he too used the veto power. Tyler is sometimes referred to as the accidental president since he assumed the presidency when William Henry Harrison died in office one month after being sworn in. Tyler had previously been a Democrat and states’-rights champion who had joined the Whig Party over disagreements with Jackson’s use of presidential power. Yet he proceeded to use the powers of the presidency to advance his own policy views, and not those of his newly adopted Whig Party, vetoing major bills favored by the Whigs, which led to an impeachment inquiry. A House committee led by John Quincy Adams issued a report that found the President had engaged in “offenses of the gravest nature” but did not recommend that Tyler be impeached.

 

Even Andrew Johnson’s impeachment largely revolved around policy disputes, albeit extremely important ones. Johnson was a pro-Union southerner whom Lincoln selected to aid in his reelection effort in 1864. Johnson was clearly a racist, a member of the lowest rung of southern white society, those who felt their social position was threatened by the advancement of blacks, a view that shaped Johnson’s policies on Reconstruction. While the Civil War ended slavery, it did not end the discussion of race and who can be an American. Johnson, much like Trump, represented those who believed that America was a typical nation, made up of one racial group. “I am for a white man’s government in America,” he said during the war. On the other hand, the Radical Republicans believed that America was essentially a nation dedicated to liberty and equality, and they set out to fulfill the promise of the American Creed for all American men, black as well as white. This was the underlying tension during the period of Reconstruction. “Johnson preferred what he called Restoration to Reconstruction, welcoming the white citizenry in the South back into the Union at the expense of the freed blacks,” Joseph Ellis writes.

 

But Johnson was ultimately impeached on false pretenses, not for his policy disputes with Congress, but for his violation of the Tenure of Office Act, which some have characterized as an impeachment trap. The act denied the president the power to fire executive branch officials until a Senate-confirmed replacement had been appointed. While the House impeached Johnson, he escaped removal by one vote in the Senate. Johnson’s sin was egregious; he had violated one of the core tenets of the American Creed, that all are created equal. Yet this did not rise to the level of an impeachable offense that warranted the removal of a president. It was a policy dispute, one that went to the core of who we are as Americans, but it was not a high crime or misdemeanor.

 

It would be over one hundred years before impeachment would be considered again, this time in the case of Richard Nixon. Like Trump, Nixon abused the power of his office to advance his own reelection. Both men shared a sense that the president has unlimited power. Nixon famously told David Frost that “when the president does it, that means it’s not illegal,” while Trump has claimed that under Article II “I have the right to do whatever I want.” But Nixon, unlike Trump, did not solicit foreign interference in his reelection effort. The two men also shared certain personality traits that led to the problems they experienced in office. As the political scientist James David Barber wrote in his book The Presidential Character, Richard Nixon was an active-negative president, one who had “a persistent problem in managing his aggressive feelings” and who attempted to “achieve and hold power” at any cost. Trump too fits this pattern, sharing with Nixon a predilection toward “singlehanded decision making,” a leader who thrives on conflict.

 

Indeed, Nixon would likely have gotten away with Watergate except for the tapes that documented in detail his role in covering up the “third-rate burglary” of the Democratic Party headquarters that occurred on June 17, 1972. Paranoid about possibly losing another election, Nixon had directed his staff to use “a dirty tricks campaign linked to his reelection bid in 1972,” presidential historian Timothy Naftali has written. When the break-in was discovered, Nixon engaged in a systematic cover-up, going so far as to tell the CIA to get the FBI to back off the investigation of the break-in on bogus national security grounds.

 

Much like Trump, Nixon stonewalled the various investigations into his actions, what Naftali calls “deceptive cooperation.” He had the Watergate Special Prosecutor, Archibald Cox, fired in October 1973 in order to conceal the tapes, knowing his presidency was over once they were revealed. In the aftermath of Cox’s firing during the so-called Saturday Night Massacre, Nixon refused to release the tapes to the House’s impeachment inquiry. Instead, he provided a transcript he had personally edited that was highly misleading. The final brick in Nixon’s wall of obstruction was removed when the Supreme Court unanimously ruled in July 1974 that Nixon had to release the tapes, a ruling he complied with. One wonders if Trump would do the same.

One difference with the Trump case is that there was a degree of bipartisanship during the Nixon impeachment process. By the early summer of 1974, cracks had begun to appear in the Republicans’ support for Nixon. Unlike today, there were still moderate Republicans who were appalled by Nixon’s actions and had become convinced that the president had engineered a cover-up. Peter Rodino, the Democratic chairman of the House Judiciary Committee, had bent over backwards to appeal to the moderate Republicans and to Southern Democrats, among whose constituents Nixon was popular. Despite strong pressure from the GOP leadership in the House, it was this group that ultimately drew up the articles of impeachment.

 

Still, a large number of Republicans in the House continued to stick with the president until the tapes were finally released. It was at this point that even die-hard Nixon supporters deserted him, when it became apparent that Nixon had been lying all along and had committed a crime. Nixon’s case shows both the importance of bipartisanship in the impeachment process and how difficult it is for members of the president’s party to turn on him. In August 1974, Nixon resigned when confronted by a group of Senators and House members led by conservative Senator Barry Goldwater.

 

The impeachment of Bill Clinton is the anomaly, since it was not about policy (as in Johnson’s case) or the abuse of power (as in Nixon’s case). Rather, it emerged in part from a character flaw: Clinton could not restrain himself when it came to women.

 

The facts of the case are well known. While president, Clinton had an illicit sexual encounter with Monica Lewinsky in the Oval Office. He then proceeded to lie about it, both to the country and during a deposition in the Paula Jones case, and attempted to cover up the affair. Kenneth Starr, who had been appointed as Independent Counsel to investigate the Whitewater matter, a failed land deal in which the Clintons lost money but did nothing wrong, then turned his investigation to the president’s actions with Lewinsky and recommended that the House of Representatives consider impeachment proceedings for perjury and obstruction of justice.

 

By this point, Clinton had admitted he had lied to the country and apologized for his actions. The House had the opportunity to censure Clinton, but Tom DeLay, one of the Republican leaders, buried that attempt, even in the aftermath of midterm elections in which Democrats gained seats, clearly pointing to public opposition to impeachment of a president whose approval ratings were going up. While the House voted for impeachment largely along partisan lines, the Senate easily acquitted Clinton on a bipartisan basis. Clinton’s actions, while “indefensible, outrageous, unforgiveable, shameless,” as his own attorney described them, did not rise to the level the Framers had established for impeachment.

 

Clinton’s impeachment in the House was largely a product of partisan politics that were out of control. As Laurence Tribe and Joshua Matz have written, “starting in the mid-1990’s and continuing through the present, we’ve seen the creeping emergence of a permanent impeachment campaign.” Both George W. Bush and Barack Obama faced impeachment movements during their terms in office over issues that in the past no one would have considered impeachable. During the 2016 election, both Hillary Clinton and Donald Trump lobbed accusations that the other would face impeachment if elected. The current toxic political environment raises the question of whether a bipartisan impeachment effort has any chance at all, especially when the two sides cannot even agree on basic facts. Nancy Pelosi was right to hold off the movement to impeach Trump prior to the Ukraine matter, but now that we have such a clear abuse of power, what is the alternative? At this moment, when the tool of impeachment is most needed for a president who meets all of the criteria laid out by the Framers, the process itself has become captive to extreme partisanship by the Republicans.

 

The ball is now in the Senate’s court, where the trial will soon occur. While conviction and removal from office are highly unlikely short of additional corroborating evidence (which Mitch McConnell has been attempting to squelch), perhaps the Senate can find the will to issue a bipartisan censure motion that condemns the president and warns that another similar abuse of power will result in removal. Ultimately, the voters will likely decide Trump’s fate come November. We can only hope they choose correctly.

Annual Jewish Film Festival, Following New Wave of Anti-Semitism, Offers Hope and Inspiration Early last month, four people were killed at a Jewish food store next to a synagogue in Jersey City, N.J. (two blocks from the building in which I work). A few days later, a man wielding a machete stabbed and badly injured five Jews praying in the home of a rabbi in Monsey, New York, about 40 miles from New York City. Since then, several swastikas have been painted on buildings in various cities. These incidents are all part of a growing wave of anti-Semitism in America. Anti-Semitic crimes in Chicago, New York, and Los Angeles hit an 18-year high in 2019. Hate crimes in Los Angeles, a category expanded by police to include swastikas on any religious property, doubled in 2019 over the previous year. New York City counted 229 anti-Semitic crimes in the past year, a new record and a significant increase over the year before. The Anti-Defamation League said 2019 showed the third-highest anti-Semitic crime total in the entire history of the organization.

 

On Wednesday, the 29th annual New York Jewish Film Festival, a two-week (January 15-28) cinematic celebration of Jewish life, kicks off at Lincoln Center’s Walter Reade Theater in New York, serving as hope and inspiration not just to Jews, but to everybody.

 

Given the attacks on Jews all over the country, the Jewish Film Festival, one of the oldest in the United States, could not have come at a better time.

 

Aviva Weintraub, the executive director of the festival, which is sponsored by the Jewish Museum in New York and the Film Society of Lincoln Center, said what has been happening to Jews in the nation over the last two months is “horrifying.” She said the goal of the festival each year is to “bring Jews together with each other and others” and said she is hopeful that will happen again this year.

    

Discrimination and persecution, of course, are no strangers to Jews, and the selection of films for the festival reflects that.

 

The film festival starts with the upbeat Aulcie, a sports film about how basketball star Aulcie Perry was spotted playing in a New York City playground tournament by a scout for Israel’s Maccabi Tel Aviv basketball team in 1976. He was signed and, despite personal problems, helped the Maccabi team win two European championships. He later converted to Judaism and became an Israeli citizen.

 

The centerpiece of the film festival is the screening of the award-winning 1970 film The Garden of the Finzi-Continis, director Vittorio De Sica’s movie about the struggles of the Jews in World War II-era Italy, now celebrating its 50th anniversary. That Holocaust-era film is joined by a new documentary, Four Winters: A Story of Jewish Partisan Resistance and Bravery in WW II, which tells the story of Jewish resistance to the Nazis throughout World War II in different countries.

 

“We chose The Garden of the Finzi-Continis because of its anniversary, but also because it is such a great film about the struggle of the Jews against the Nazis and because it is a beautiful and moving story,” said Ms. Weintraub. The film won 26 international awards in 1970, plus the Oscar for Best Foreign Language Film.

 

Executive director Weintraub is equally proud of Four Winters. “The strength of the film is not just its story, but the inspiring story of each of the men and women, seniors now, who survived the Holocaust and, in the movie, describe what happened. It is stirring,” said Ms. Weintraub. “You cannot see that documentary and not be moved by it.”

 

She said Four Winters is a factual and inspirational story of years of resistance against the Nazi regime. “It’s amazing to realize that the Jews and others resisted for that long,” she said.

 

The festival has always been popular. Weintraub chuckles when she thinks back on past years of the festival. “We have hordes of people who literally camp out at Lincoln Center to catch as many films in the festival as they can,” she said. “It’s not surprising for someone to see several films. Many people come to Lincoln Center on their way home from work, or between shopping trips on weekends.”

 

She and two others spend about a year winnowing the films down to 31 or 32 for each festival. “We look for films that represent history, politics and Jewish life. Each year the mix of movies is different,” she added.

 

Some movies in this year’s festival deal with the Holocaust. Birch Tree Meadow, a 2003 film, tells the story of a concentration camp survivor who returns to the camp years later to confront memory and the descendant of a Nazi guard.

 

An Irrepressible Woman is the story of French Prime Minister Léon Blum, imprisoned at Buchenwald in the 1940s, and his love, Jeanne Reichenbach, who fell in love with him as a teenager and risked her life to find him again.

 

There are cultural tales, too. The 1919 silent film Broken Barriers was the first film to tell some of the old Sholom Aleichem stories that much later became the world-famous play and movie Fiddler on the Roof.

 

Incitement is the complicated story of the lead-up to the highly publicized assassination of Israeli Prime Minister Yitzhak Rabin in 1995. It tracks not only the murder, but also the politics in the nation at the time.

 

God of the Piano is the story of a woman forced by her father to meet high expectations as a pianist. When she grows up, she places those same high expectations on her son, but he is deaf. The story is about the larger family conflict.

 

The festival closes on a high note with the unification film Crescendo, the true story of how music conductor Eduard Sporck took over a joint Israeli-Palestinian youth orchestra. At first, he saw his job as getting all of the musicians to produce beautiful music, but he soon realized the harder, and more rewarding, job was to get the children from the two opposing political sides to put aside personal differences and work together as a smoothly running musical group.

Painter of Disquiet: Félix Vallotton at the Metropolitan Museum of Art

La Chambre rouge (1898)

 

The Metropolitan Museum of Art is currently presenting the work of Félix Vallotton, an artist who has been largely neglected relative to his contemporaries, such as Pierre Bonnard and Édouard Vuillard. This makes the present exhibition all the more welcome, and fascinating. Vallotton’s work unquestionably merits the renewed attention: his paintings possess a mysterious quality, narrative appeal, and attention to detail, and they invoke a delicious sense of irony and wit.

Born in the Swiss town of Lausanne on the shores of Lake Geneva in 1865, Vallotton displayed early an ability to draw from life, and at sixteen he arrived in Paris to study painting at the Académie Julian. A self-portrait at the age of twenty reveals a virtuosic, confident brush and much more: Vallotton is not interested in merely demonstrating his facility as a realist painter. He has a penetrating eye, a psychological depth, and a naturalism that owes much to northern Renaissance masters, especially Albrecht Dürer and Hans Holbein.

It is not entirely clear when Vallotton’s friendship with Les Nabis began. Named from the Hebrew word for prophet, the Nabis were an avant-garde group of Parisian artists which included Bonnard, Vuillard, Charles Cottet, and Ker-Xavier Roussel, among others. Vallotton’s nickname within the group may be revealing – he was the ‘foreigner Nabi’. Perhaps this name reflected his Swiss origin, though the Nabis were an international group anyhow. It could also reflect in some measure that Vallotton was something of a loner; but it may allude to the fact that while for a time he adopted the Nabis’ dismissal of naturalism in favor of flat forms and the expressive use of color, Vallotton was and fundamentally remained a realist painter.

Recognition arrived early in Vallotton’s career for reviving the art of the woodcut print, which had largely been forgotten since the Renaissance. Indeed, by the age of 25, Vallotton had single-handedly brought about a kind of revolution in xylography. Inspired by the Japanese woodcuts that were popular in Paris at the time, Vallotton produced images with sharp contrasts of jet black and pristine white. His woodcuts are remarkable for their commentary on the French bourgeoisie, their stinging rebuke of societal decadence in fin-de-siècle Paris, and their critical orientation toward the police. Vallotton has an eye for violence, be it in the form of murder, the sudden carriage accident, the political execution or the suicide. He combines a dark realism with sophisticated satire, wry humor, and a keen acerbic wit. He has an eye for the ambiguous and the enigmatic, a deft and subtle touch that defies finalization.

The Demonstration from 1893 renders the political chaos of the day with humor, from the old man who has lost hold of his hat to the sole figure in white, a woman racing along the left-hand side. Vallotton would return in both woodcuts and paintings to the scene of the bourgeoisie shopping for the latest luxury goods at the Bon Marché department store, the first store of its kind in the world. The 1898 triptych is not without a certain irony, given that the format Vallotton chose was traditionally associated with altarpieces such as would be found in a church.
Which is just to underscore that Vallotton was a close observer of modernity, fascinated by the new world he saw emerging around him, with its rampant consumerism and its technological novelties (such as a moving conveyor belt along a footbridge, which he includes in his woodcut series devoted to the World’s Fair of 1900). A series of ten woodcuts entitled Intimités is a sustained and biting critique, and among Vallotton’s greatest achievements as a graphic artist. In this unsettling and disquieting series, which lays bare the hypocrisies of bourgeois society, Vallotton deftly exposes a decadent class through scenes of adultery, deceit, romantic quarrels and indecent proposals.

In 1899, Vallotton married Gabrielle Rodrigues-Henriques, and the union was to have a significant effect on the remainder of his career. His wife was a wealthy widow and the daughter of a Paris art merchant, which meant that he now enjoyed a certain financial security and could turn exclusively to painting. While Vallotton’s work generally lost its satirical wit and subversive edge, there is also a certain psychological insight and a marked turn toward the inwardness of his subjects that constitutes much of the power of this later period.

The acquisition of a Kodak camera in 1899 led to changes in the way the artist worked. Now he would typically take snapshots of imagery that appealed to him and then use those photographs to craft his paintings in the studio. It appears that often he would retain the sharply contrasting patterns of light and shadow revealed in the small photograph. The painter, however, was by no means subservient to the photographic image, as The Red Room, Etretat (1899) demonstrates. It is a remarkable painting for its unity of composition and psychological structure. All lines in the painting essentially point to (and up to) the seated figure of Gabrielle Vallotton; even the viewer is made to feel they’re looking up to this woman, who meanwhile looks down at the small child in the foreground. This child, so crucial to the psychological depth of the painting, is entirely absent from the photograph.

Vallotton was also a master of ambiguity. There is always something more to the story he is telling that must ever remain just beyond our reach. Consider, for example, one of this exhibition’s finest offerings, The White and the Black (1913), a provocative work depicting a young white woman, nude and reclining, and a black woman, seated and coolly smoking a cigarette. The painting may be a response to Édouard Manet’s Olympia (1863), in which a white model, likely a prostitute, lies on a bed attended by a black servant bringing her flowers (probably from a client). But Vallotton may also be in dialogue with Jean-Auguste-Dominique Ingres’ Grande Odalisque (1814) and Odalisque with a Slave (1839). All of his friends attest to Vallotton’s love and admiration for Ingres, by whom he was “conquered without offering any resistance,” as one contemporary put it. Vallotton was clearly a close observer of Manet as well, and many of his paintings, for example Misia at Her Dressing Table (1898), emphasize the harsh light, large color surfaces and shallow depth that were characteristic of Manet’s work, including Olympia. But at the same time Vallotton subverts the traditional roles of mistress and servant by making the relationship between these two women utterly ambiguous.
The Provincial (1909) is notable for its exploration of one of Vallotton’s recurring themes, namely the complex, uneven relationship between men and women. In this painting, and in others such as Chaste Suzanne (1922), a powerful female figure holds sway over her subservient male counterpart. The white lace of her blouse protrudes in a witty but subtle reminder of her breasts, which only underscores her sexual dominance over the docile man who sits beside her with eyes deferentially lowered.

Moonlight (1895) is a standout work that reveals the influence of the Symbolists in Vallotton’s attention to emotional force over actual topographical representation; water, earth and sky have become interfused in what is almost an abstraction. The picture also anticipates the latter part of his career, when Vallotton increasingly turned to landscape painting, often beginning with a photographic image or an on-site sketch which was then imaginatively reconstructed on the canvas. The painter referred to his later landscapes as paysages composés (composed landscapes) and remarked in 1906, “I dream of a painting free from any literal respect for nature.” Vallotton said he wanted to “be able to recreate landscapes only with the help of the emotion they have provoked in me.” His method allowed him to simplify the compositional structure, to intensify the mood and to emphasize the emotional impact of color, as, for example, in Last Rays (1911), where we find the silhouettes of umbrella pines as they receive the last light of the evening sun.

Félix Vallotton more than deserves the attention that this exhibition brings. His work, from the groundbreaking forays into xylography, to the portraits and scenes of bourgeois society, to the hauntingly mesmerizing landscapes, defies identification with any artistic school. Like all truly great artists, Vallotton is inimitable; while he experiments with various artistic programs, he ultimately remains aloof from them, determined to paint pictures purged of all sentimentality, beholden only to the emotional, psychological or social reality with which he is concerned. Such a body of work remains ever fresh and vital, and rewards close attention with a glimpse of truth.

]]>
Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174040 https://historynewsnetwork.org/article/174040 0
Depression Era Tenor Hits All the High Notes

 

It is 1934, at the height of the Depression, and an opera company in Cleveland is trying to make enough money to stay in business. The answer, its officials believe, is to bring in fabled Italian tenor Tito Merelli for a one-night concert. Merelli’s appearance is highly publicized and the theater is sold out. The opera company has a stage set. It has an orchestra. It has an audience. But it has no Merelli. He is missing.

 

In late afternoon, just a few hours before the performance, the distraught opera company manager takes a bold step. He will have Max, his assistant and an amateur tenor, dress as the clown Pagliacci and do the show as Merelli. With all the makeup, a man in disguise with a tenor’s voice and roughly the correct height and weight, who would know? What could possibly go wrong?

 

That’s where everything starts to collapse in Lend Me a Tenor, the play set in 1934 that opened last weekend at the Westchester Broadway Theater in Elmsford, N.Y. It is a wacky, crazy, upside-down play by Ken Ludwig that has the audience roaring with laughter. This tenor can hit all the high notes and makes everybody laugh, too. He is his own opera company – if he can be found.

 

The fraudulent singer plan is in place, Max is ready for his fifteen minutes of fame, and the audience is waiting, unaware of the deception. Then, unannounced, Tito Merelli arrives, as flamboyant as flamboyant can be, with his overly emotional, arm-waving wife, Maria, who is always angry about something. He is ready to go on. Max is ready to go on. Who will go on?

 

And then… well, see the play.

 

Lend Me a Tenor is not only a hilarious show, well directed by Harry Bouvy, but a neat look back at how opera singers and other highly regarded performers toured the country in that era. Merelli lived in a time when stars left their homes in the U.S. or in Europe and went on tours of America, appearing in different cities at different times of the year. The play captures the physical details of the traveling star’s Depression-era life well.

 

Star tours were very common in the Depression, despite the sagging national economy, which hurt theaters and opera houses as badly as it hurt everything else. Bringing in a star for a one-night performance usually worked not only to put money in the opera house’s bank account, but to garner enormous attention for the opera company, which translated into ticket sales for the company’s regular operas. Marian Anderson, the great singer, toured the U.S. almost every year in the late 1930s, sometimes taking tours that lasted several months and included up to eighty concerts. The big bands of the era - Duke Ellington, Glenn Miller, the Dorseys - did the same thing, hitting the road early in the year and living out of suitcases for most of it. The casts of Broadway shows also traveled in that manner. There were always logistical problems - train breakdowns, snowstorms, ticket mix-ups - but, in general, the tours worked well.

 

Except for Tito Merelli.

 

He and his wife have enormous problems to combat when they finally do arrive in Cleveland (Tito’s main problem is his firecracker wife). How do you un-vanish? How does Max, bursting for a chance in the musical spotlight, cope with the fact that Tito is, in fact, in Cleveland? Or is he? Is anybody really in Cleveland?

 

In Lend Me a Tenor, the Cleveland Opera Company has rented a palatial two-room suite for Merelli and his wife, an adoring bellhop brings up their bags, and the opera company’s manager treats him like visiting royalty. All of this is historically accurate. Performers arrived mostly by train or plane, although some went by car. They ate well in the hotel’s restaurant, were given tours of the area, and were introduced to dozens of artistic and municipal officials. Newspaper ads for Merelli would have been taken out for a week or more prior to the show. Hundreds of large cardboard broadsides would have been put in store windows or nailed to trees and telephone poles. There might even have been a Tito Merelli Day (that is, if they could find Tito).

 

Everything that show business people did to arrange an overnight stay and performance for a star like Merelli in 1934 was done year after year and became the backbone of the American tour. It works the same way today, although social media, television, email and iPhones have modernized the tour. Tito Merelli would love the contemporary tour, if he could find it.

 

Director Bouvy, who pays such careful attention to the history of the play, has done a superb job with a talented cast of actors. He lets them all shine as the eccentric, egomaniacal show-biz people we all know and love. The two stars of the show are Tito, played wonderfully and with great style by Joey Sorge, and Max, played equally well by J.D. Daw. In addition to them, Bouvy gets fine work from Molly McCaskill as Maggie, Max’s girlfriend, Philip Hoffman as the opera manager, Kathy Voytko as Tito’s wife, Tregoney Sheperd as the doughty opera company board chair, Hannah Jane McMurray as a soprano, Sam Seferian as the bellhop and John Savatore as an assistant stage manager.

 

PRODUCTION: The play is produced by Westchester Broadway Theater. Sets: Steve Loftus. Costumes: Keith Nielsen. Lighting: Andrew Gmoser. Sound: Mark Zuckerman. The play is directed by Harry Bouvy. It runs through January 26.

Can America Recapture Its Signature Exuberance?

 

As the news cycle churns on with impeachment coverage, pundits and politicians are quick to remind us that the Constitution is all that stands between us and the whims and dangers of authoritarian rule. It’s a good point, but incomplete. America is a country of law and legend, and our founding document, essential as it is, won’t save us if we don’t also buy into a binding story about the point and purpose of our democracy.

 

That’s a tall order in today’s world. To author Lee Siegel, hacking America’s dour realities is like scaling a rock face. In his recent essay “Why Is America So Depressed?” he suggests we reach for the metaphorical pitons—after “the iron spikes mountain climbers drive into rock to ascend, sometimes hand over hand”—to get a humanistic grip amid our “bitter social antagonisms” and the problems of gun violence, climate crisis and social inequities that have contributed to an alarming rise in rates of anxiety, depression and suicide. 

 

“Work is a piton,” says Siegel. “The enjoyment of art is a piton. Showing kindness to another person is a piton.”

 

Thanks to Siegel, I now think of the 200-year anniversary of poet Walt Whitman’s birth in the year just ended as a piton for the uplift it gave me. As Ed Simon, a staff writer for The Millions and contributing editor at HNN, declares in another fine essay published New Year’s week, Whitman is “our greatest poet and prophet of democracy.” We’d do well to enlist the bard’s ideas in 2020, he argues, to steer ourselves through “warring ideologies” and to offset “the compelling, if nihilistic, story that authoritarians have told about nationality.”

 

I would simply add this: Whitman is also the godfather of our national exuberance, a champion of the country’s heaving potential and can-do spirit—traits that, until recently at least, could win grudging regard from some of America’s fiercest critics. Despite his intermittent bouts of depression, Whitman tapped into the energy and brashness of our democratic experiment like no other. His verse hovers drone-like above the landscape, sweeping the manifold particulars of American life into a rapturous, cosmic frame.

 

Whitman did some of his most inspiring work in the worst of times. He published the first edition of his collection “Leaves of Grass” in 1855 as the country drifted toward civil war. His Homeric imprint, inclusive for its time, invited individuals from varied walks of life (“Workmen and Workwomen!…. I will be even with you and you shall be even with me.”) to join the democratic push to explore the contours of the American collective.

 

My first brush with Whitman’s high spirits came when I was an undergraduate in the late 1960s. Like campuses across the country, the University of Washington was riven by clashing dogmas and muscular protests against the Vietnam War. Like not a few working-class kids brown-bagging it from their family homes, I rode the fence politically, not knowing how to jump. As a citizen, I was offended by club-wielding riot police entering campus in force; as a student of Asian history, I was repulsed watching radical activists trash an Asian studies library, flinging rare old texts out windows onto the grass. Cultural revolution was a tough business.

 

Lucky for me, a generous English professor, Richard Baldwin, supplied some useful ballast. Sensing my fascination with Emerson, Thoreau and Dickinson, he offered to help me sort out the Transcendentalists during office hours. But it was Whitman who bulled his way forward. His booming energy, delivered in his self-proclaimed “barbaric yawp,” turned me into a forever fanboy.

 

I loved how Whitman challenged the thou-shalt-nots of established order in “Song of Myself,” joining the role and responsibilities of the individual to humanity’s common lot:

 

Unscrew the locks from the doors!

Unscrew the doors themselves from their jambs!

 

Whoever degrades another degrades me,

And whatever is done or said returns at last to me.

 

I loved his fervent embrace of democracy in “Democratic Vistas,” his 1871 dispatch to a divided nation:

 

Did you, too, O friend, suppose democracy was only for elections, for politics, and for a party name? I say democracy is only of use there that it may pass on and come to its flower and fruits in manners, in the highest forms of interaction between men, and their beliefs - in religion, literature, colleges, and schools- democracy in all public and private life … .

 

I loved his zest for the many-parted American experience, for linking the natural environment to the soul of the country, as he proclaimed his outsized poetic ambition in “Starting from Paumanok”:

 

… After roaming many lands, lover of populous pavements,

Dweller in Mannahatta my city, or on southern savannas,

Or a soldier camp'd or carrying my knapsack and gun, or a miner 

in California,

Or rude in my home in Dakota's woods, my diet meat, my drink

from the spring,

Or withdrawn to muse and meditate in some deep recess,

Far from the clank of crowds intervals passing rapt and happy

Aware of the fresh free giver the flowing Missouri, aware of mighty

Niagara,

Aware of the buffalo herds grazing the plains, the hirsute and

strong-breasted bull,

Of earth, rocks, Fifth-month flowers experienced, stars, rain, snow,

my amaze….

Solitary, singing in the West, I strike up for a New World.

 

Whitman had his flaws, to be sure. Despite cheerleading for a big-tent America, he also shared the prejudices of his time, blotting his copybook, for example, with ugly remarks about African Americans. By 1969, when I hungered for a healing narrative, the bard’s prescriptions were undergoing reappraisal in light of righteous and long-overlooked storylines offered up by the civil rights and antiwar movements.

 

Still, Whitman has kept his place in the chain of title to America’s resilience. He conveys a panoramic confidence that we are greater as a people than our contemporary realities; that if we lose the thread, we’ll find it again and reweave our story into a new and improved version. As Whitman says in the preface to the 1855 edition of “Leaves of Grass,” people look to the poet “to indicate the path between reality and their souls.”

 

Thomas Jefferson saw utility in the cohesive national story. In her 2018 book “The Death of Truth,” Michiko Kakutani writes that Jefferson “spoke in his inaugural address of the young country uniting ‘in common efforts for the common good.’ A common purpose and a shared sense of reality mattered because they bound the disparate states and regions together, and they remain essential for conducting a national conversation.”

 

Today, Kakutani argues, we’re mired in “a kind of homegrown nihilism … partly a by-product of disillusion with a grossly dysfunctional political system that runs on partisan warfare; partly a sense of dislocation in a world reeling from technological change, globalization, and data overload; and partly a reflection of dwindling hopes among the middle class that the basic promises of the American Dream … were achievable … .” 

 

Whitman understood that transcendence of the national mood is an uphill climb. Periods of division and strife are baked into our democracy, part of the yin and yang, the theory goes, that sorts new realities into a renovated sense of purpose. Yet periods of upheaval must necessarily lead to a refitting, not an obliteration, of our common story, or democracy is toast.

 

Do Americans still have the chops to reimagine the song of ourselves? 

 

Lord knows, we’ve had the chops. In his 1995 book “The End of Education,” media critic Neil Postman writes: “Our genius lies in our capacity to make meaning through the creation of narratives that give point to our labors, exalt our history, elucidate the present, and give direction to our future.” But we do have to mobilize the genius.

 

That’s easier said than done in a time brimming with distraction, distress and division. It’s hard to say when or if we’ll devise a story with the oomph necessary to yank us up and over the walls of our gated communities of the mind—toward efforts that will make society more inclusive, the economy more equitable and life on the planet more sustainable. 

 

In the meantime, three cheers for the ebullient currents of American life Walt Whitman chronicled. On our best days, they can lift us above our brattish, self-defeating ways.

Cinema Paradiso: The Academy Museum of Motion Pictures Will Be The Home of Movies, Past and Present


 

Acclaimed film director Martin Scorsese was recently in the news for asserting his opinion that comic book movies are not “cinema.” In light of his upcoming historical film depicting the life of notorious mobster Jimmy Hoffa, Scorsese expanded on his claim that the massive push for superhero movies has affected young people’s understanding of history, saying “they perceive even the concept of what history is supposed to be [differently]." 

 

Whether it promotes a better understanding of history or not, film itself has remained a major cultural influence around the world for over a century now. Its larger historical impact is hard to measure. Soon, however, the first large-scale museum in the country solely dedicated to the history of motion pictures is set to open and attempt to do just that. 

 

The Academy Museum of Motion Pictures will be located at the corner of Wilshire Boulevard and Fairfax Avenue in Los Angeles. According to Jessica Niebel, the Exhibitions Curator for the museum, “the Academy Museum of Motion Pictures will be the world’s premier institution devoted to the art and science of movies and moviemaking.” The museum is on track to open sometime in 2020 after several delays in construction.

 

The purpose of the museum is to encapsulate how film has changed over time. Moving pictures can be traced all the way back to the 19th century; in 1895 the 50-second black-and-white film “Arrival of a Train,” one of the first motion pictures ever publicly screened, was released. The film caused an uproar among audiences, who had never even conceived of what a moving picture could be. Since then film has grown as an art form, which is something the Academy Museum aims to capture. Niebel hopes that the museum’s programs and exhibitions will allow visitors to experience “how movies evolved and are made, and highlight artists, craftspeople, designers and technicians who make the movies possible.”

 

The 20th century was integral to the growth of film as an art form. Niebel explains that film, “being the most democratic artform of the 20th century as it was available, affordable and attractive to the masses, had a very strong connect to cultural histories everywhere.” Hollywood’s growth as an industry was soon followed by the rapid growth of film industries in India with “Bollywood,” and later in Nigeria with “Nollywood.” The Academy Museum plans to showcase international film in its first major temporary exhibition, “an unprecedented retrospective on Hayao Miyazaki, which will be the first major exhibition in the United States of the work of the legendary Japanese filmmaker, organized in collaboration with Studio Ghibli.”

 

Visitors to the museum can expect to experience a wide variety of programs, including film programs, exhibitions, public programs, publications, and education programs. The six-story museum itself will be a resource for film education, featuring “more than 50,000 square feet of exhibition galleries, a state-of-the-art education studio, two film and performance theaters, a 34-foot high, double-height gallery for cutting-edge temporary installations, a restaurant and café, and public and special event spaces.” Some programming may involve hosting industry professionals and guest speakers to give insight into their experience with film and an insider look into how much of a collaborative process filmmaking really is. A recent New Yorker piece detailed how the American movie industry took off in the 20th century with the help of many groups that don’t get on-screen credit, especially women. Museum programming hopes to address that.

 

An official grand opening date has not yet been announced, even though the museum has been a long time coming. Plans for its construction were announced in 2012; there have been significant delays in the seven years since. The Academy has chalked that up to the sheer feat involved in building it: renovating a 1939 LA landmark (the May Company building), erecting a new spherical structure that includes a 1,500-panel glass dome, and joining the two together. In a statement, the Academy said that “we have always chosen the path that would enhance the structure, even if that meant construction would take more time to complete,” and “we are weighing the overall schedule for major industry events in 2020, and on this basis will choose the optimal moment for our official opening.”

 

Once it finally opens, the museum will be the first of its kind in the US. As such, it has been very important for planners like Niebel to create an experience that is altogether unique, with an eye to the future. That involves screening films in the formats in which they were intended to be seen, while also providing exhibitions that complement the screenings. Education will be an emphasis, as Niebel explained: “Film exhibitions cannot recreate the cinematic experience but translate it into another medium, that of the exhibition.” The Academy Museum has differentiated itself from other museums in that regard. History and art museums typically focus on either education or visual display. Niebel maintains that the Academy Museum will address both, because “film exhibitions are a ‘genre’ of their own in that they combine artifacts, film clips, design and immersive experiences to achieve not only educational aspects or aesthetic impressions, but wholistic experiences.”

 

The museum will have the advantage of access to the Academy of Motion Pictures Arts and Sciences Archive. The archive has been acquiring film since 1929 and currently holds more than 190,000 items, including many of the films nominated for Oscars in all categories and every Best Picture winner. Niebel confirmed that the museum will “draw on the unique intellectual and material resources of the Academy of Motion Picture Arts and Sciences.” It should be an interesting landmark for historians and movie buffs alike. Film has always been a kind of public history, a reflection of society and culture. The Academy Museum of Motion Pictures, once it opens, may capture that. Score one for Martin Scorsese.

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/173843
Why The West Is Losing The Fight For Democracy

 

Adapted from The Light That Failed by Ivan Krastev and Stephen Holmes, published by Pegasus Books. Reprinted with permission. All other rights reserved.

 

The future was better yesterday. We used to believe that the year 1989 divided ‘the past from the future almost as clearly as the Berlin wall divided the East from the West.’ We had ‘trouble imagining a world that is radically better than our own, or a future that is not essentially democratic and capitalist.’ That is not the way we think today. Most of us now have trouble imagining a future, even in the West, that remains securely democratic and liberal.

 

When the Cold War ended, hopes for liberal capitalist democracy spreading globally were high. The geopolitical stage seemed set for a performance not unlike George Bernard Shaw’s Pygmalion, an optimistic and didactic play in which a professor of phonetics, over a short period of time, succeeds in teaching a poor flower girl to speak like the Queen and feel at home in polite company.

 

Having prematurely celebrated the integration of the East into the West, interested observers eventually realized that the spectacle before them was not playing out as expected. It was as if, instead of watching a performance of Pygmalion, the world ended up with a theatrical adaptation of Mary Shelley’s Frankenstein, a pessimistic and didactic novel about a man who decided to play God by assembling replicas of human body parts into a humanoid creature. The defective monster felt doomed to loneliness, invisibility and rejection. And envying the unattainable happiness of its creator, it turned violently against the latter’s friends and family, laying their world to waste, leaving only remorse and heartbreak as legacies of a misguided experiment in human ​self-​duplication.

 

So, how did liberalism end up the victim of its heralded success in the Cold War? Superficially, the fault lay with a series of profoundly destabilizing political events: the 9/11 attack on the World Trade Center in New York, the second Iraq War, the 2008 financial crisis, Russia’s annexation of Crimea and intervention in Eastern Ukraine, the impotence of the West as Syria descended into a humanitarian nightmare, the 2015 migration crisis in Europe, the Brexit referendum, and the election of Donald Trump. Liberal democracy’s ​post-​Cold War afterglow has also been dimmed by the Chinese economic miracle, orchestrated by a political leadership that is unapologetically neither liberal nor democratic. Attempts to salvage the good name of liberal democracy by contrasting it favourably with ​non-​Western autocracy have been undercut by the feckless violation of liberal norms, as in the torture of prisoners, and the evident malfunctioning of democratic institutions inside the West itself. Tellingly, how democracies atrophy and perish has become the question that most preoccupies liberal scholars today.

 

The very ideal of ‘an open society,’ too, has lost its ​once​-fêted lustre. For many disillusioned citizens, openness to the world now suggests more grounds for anxiety than for hope. When the Berlin Wall was toppled, there were only sixteen border fences in the world. Now there are ​sixty​-five fortified perimeters either completed or under construction. According to Quebec University expert Elisabeth Vallet, almost a third of the world’s countries are rearing barriers along their borders. The three decades following 1989 turned out to be an ‘inter-​mural period’, a brief ​barricade-​free interval between the dramatic breaching of the Berlin Wall, exciting utopian fantasies of a borderless world, and a global craze of ​wall​-building, with cement and ​barbed-​wire barriers embodying existential (if sometimes imaginary) fears.

 

Most Europeans and Americans today also believe that the lives of their children will be less prosperous and fulfilling than their own. Public faith in democracy is plummeting and ​long-​established political parties are disintegrating or being crowded out by amorphous political movements and populist strongmen, putting into question the willingness of organized political forces to fight for democracy’s survival in times of crisis. Spooked by the phantom of ​large​-scale migration, electorates in parts of Europe and America are increasingly drawn to xenophobic rhetoric, authoritarian leaders and militarized borders. Rather than believing that the future will be uplifted by the liberal ideas radiating out of the West, they fear that ​21st-​century history will be afflicted by the millions of people streaming into it. Once extolled as a bulwark against tyranny, human rights are now routinely accused of limiting the ability of democracies to fight terrorism effectively. Fears for liberalism’s survival are so acute that references to William Butler Yeats’s ‘The Second Coming’, written in 1919 in the wake of one of the deadliest conflicts in human history, became an almost obligatory refrain for political commentators in 2016. A century after Yeats wrote them, these words are now the mantra of apprehensive defenders of liberal democracy worldwide: ‘Things fall apart; the centre cannot hold; / Mere anarchy is loosed upon the world.’

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174017
The History of Fire: An interview with Stephen Pyne

 

Stephen Pyne is an emeritus professor at Arizona State University. He has published 35 books, most of them dealing with fire, but others on Antarctica, the Grand Canyon, and the Voyager mission. With his oldest daughter, he wrote an inquiry into the Pleistocene. His fire histories include surveys of America, Australia, Canada, Europe (including Russia), and the Earth. To learn more about his work, visit his website.

 

What made you want to be a historian?

I drifted into history. I enjoyed reading it, especially world history, as a kid, then the spark took when I realized that I understood things best through their history.  It was by reading Carl Boyer’s History of the Calculus, for example, that I finally appreciated what the numbers were all about.  The same has proven true for topics like fire, Antarctica, Grand Canyon, and the rest.  Then I realized that history could also be literature. That clinched it.

I saw that you used to be a wildland firefighter. Is this what made you want to study the history of fire?

A few days after graduating from high school, I was hired as a laborer at Grand Canyon National Park.  While I was signing my papers, an opening appeared on the North Rim fire crew, and I was asked if I wanted to join.  I’d never been to the North Rim, never worked around a flame bigger than a campfire, didn’t even know the names of the basic tools, and of course said, Sure. It was a moment of biographical wind shear.  I returned to the Longshots for 15 seasons, 12 as crew boss, then spent three summers writing fire plans for Rocky Mountain and Yellowstone National Parks.  Then I went to Antarctica for a season. The North Rim fire crew and campus were separate lives: neither had much to do with the other.  It took 10 years as a Longshot and a Ph.D. before I finally brought the two worlds together.  I decided to apply the scholarship I had been trained in to the subject that most animated me. Fire in America was the result. It’s not a hard weld; the two lives have never fully fused.  I have one line of books that continues the topics I studied in grad school, and another that deals with fire, particularly big-screen fire histories for America, Australia, Canada, Europe (including Russia), and the Earth.  In some respects, I’ve been a demographic of one. But it’s also like being on the American River in 1848 California. There are riverbeds of nuggets just waiting to be picked up.

 

How did personal experience influence your scholarship?

Obviously, as a topic.  I would never have thought of fire as a subject without those hopping seasons on the Rim.  They gave me woods credibility. More subtly, those years shaped how I think and speak about fire.  On a fire crew, you quickly appreciate how fires shape a season, and how fire seasons can shape a life.  It’s not a big step to wonder if the same might be true for humanity. After all, we are a uniquely fire creature on a uniquely fire planet.  Fire is what we do that no other species does. It makes a pretty good index of our environmental agency. You pick up a language, a familiarity, a sensibility toward fire – it’s a relationship, in some way.  Without anthropomorphizing fire, you learn to animate it – give it a presence. That’s what I can bring that someone else might not.  At the same time, I need to abstract the vernacular into more general concepts, which is what scholarship does. I’m constantly fluctuating between the two poles, a kind of alternating current. There are always trade-offs.  I begin with fire and construct a history.  I don’t begin with historiography – questions of interest to the history community – and use fire to illustrate them.  That makes my fire stuff different, and it means it can be hard to massage it into a more general history or a classroom.  I sometimes wonder if I invented a subject only to kill it.  

 

As I’m sure you are aware, wildfires recently ravaged the state of California, and bushfires continue to burn Australia. How does your understanding of the history of fire shape how you think about these wildfires? 

Ah, as long as California keeps burning it seems I’ll never be lonely.  I do a lot of interviews. Last year I finally wrote a fire primer for journalists and posted it on my website. California is built to burn and to burn explosively.  Against that, we have a society that is determined to live and work where and how it wants.  For a century California has buffered between hammer and anvil with a world-class firefighting apparatus.  But four fire busts in three years have – or should have – broken that strategy. It’s costly, it’s ineffective against the extreme events (which are the ones that matter), and it’s unfair to put crews at risk in this way.  Doing the same things at ever-higher intensities only worsens the conditions. Australia is California at a continental scale, only drier, and with winds that can turn the southeast into a veritable fire flume.  It has a long history of eruptive fires, but the 2009 Black Saturday fires and the Forever fires burning now feel different.  They are more savage, more frequent, and more disruptive.  Australia is also a firepower because it has a cultural connection to fire at a range and depth, from art to politics, that I haven’t found elsewhere.  Australia’s foresters were the first to adopt controlled burning as a strategy for protection – their experience makes a fascinating contrast to what happened in the U.S. Both places show the importance of framing the problem.  Fire is a creation of the living world, but we have defined it as a phenomenon for physics and chemistry, which leads us to seek physical solutions like dropping retardants and shoving hydrocarbons around.  We haven’t really thought about nuanced ecological engineering. We’ve mostly ignored the ideas and institutions that shape the social half of the equation. We’ve neglected how these scenes are historically constructed, how they carry a long evolution that doesn’t derive from first principles. For that matter, the intellectual history of fire is relevant because fire as an integral subject was a casualty of the Enlightenment.  We have no fire department at a university except the one that sends emergency vehicles when an alarm sounds. There is a lot here for historians to chew on.  But it isn’t enough to problematize. We have to show how our analysis can lead to problem-solving.

 

Is climate change alone enough to explain wildfires or do we need to understand more about history to understand why they are such a problem? 

There are many ways to get big fires.  Presently, climate change is serving as a performance enhancer.  In the 19th and early 20th centuries, megafires an order of magnitude greater than those of today were powered by logging and land clearing slash.  Climate integrates many factors, so does fire, and when you stir those two sloppy variables together, it’s tricky to attribute particular causes to the stew of effects. If you make fire an informing principle, a narrative axis, you find that the shift to burning fossil fuels, what I think of as the pyric transition, unhinged Earth’s fire regimes even without climate change.  The conversion has rolled over habitat after habitat. Land use and humanity’s fire practices, for example, are hugely important in interacting with climate to shape fires. But most of those changes also trace back to fossil fuels. Basically, we’re burning our combustion candle at both ends. How lithic landscapes and living landscapes interact has not been something fire ecology or physics has considered.  They dread dealing with humans because people muck up the models. You have to come at those notions sideways, you have to view the scene from outside the disciplinary prisms that we’re trained in.  Paradoxically perhaps, the humanities may be better positioned to make conceptual contributions than the natural sciences.

 

How do you think the field of environmental history will change as the climate crisis becomes a more and more pressing issue?

The crisis is spooky.  But I reject the notion that we are heading into a no-narrative, no-analog future.  With fire, I can offer a pretty substantial narrative – it’s one of the oldest humanity has.  In truth, I now regard climate history as a sub-narrative of fire history. And I’ve come to imagine our evolving fire age as something comparable to the ice ages of the Pleistocene.  That’s a crisp analog. Changing sea levels, mass extinction, wholesale upheavals of biotas, regions reconstructed with the fire-equivalent of ice sheets and pluvial lakes – it’s all there.  For me, the Anthropocene extends across the whole of the Holocene, and from a fire perspective, the Anthropocene could be usefully renamed the Pyrocene. There are other helpful analogs out there.  The problem of powerline-kindled wildfires is very similar to that of railroad fires in the past.  The problem of fringe communities burning (the fatuously named wildland-urban interface) replays the chronicle of settlement fires in the 19th century.  Then it was agricultural colonization; now, an urban reclamation of rural lands. We have pretty good examples of how we might cope. The WUI problem got defined by the wildland fire community which saw houses complicating their management of land, but it makes more sense to pick up the other end of the stick and define these places as urban settings with peculiar landscaping.  The aptest analogy is not to wildland fire but to urban fire. Do that, and it’s obvious what we need to do to reduce the havoc. History also holds lessons beyond data and techniques.  How do we live in a contingent world about which we have incomplete knowledge?  That’s a topic for stories and characters, not algorithms. Mostly, though, the sciences and fire folk don’t credit history with much analytical power.  Historians deal with anecdotes. Historians are good for yearbooks and court poetry. The critics are wrong, but it can be a tough slog, like mopping up in mixed conifer duff. Still, when I began, fire was a fringe topic.  Now it’s a global concern.  People want context, and that’s what history provides, which has created a context for what I do.  It’s been an interesting ride.     

 

I also saw that you specialized in the history of exploration. Have you been able to intertwine that interest with your knowledge of fire?

I went to grad school to study geology, western history, exploration – stuff relevant to my life on the Rim.  All my applications were rejected. Then, serendipitously, it was suggested I apply to William Goetzmann in the American Civ program at UT-Austin.  He accepted me and mostly left me alone. At the time he was playing with the idea of a second great age of discovery. I quickly added a third and have used it as an organizing principle – a kind of conceptual rebar – for a series of books, including The Ice, How the Canyon Became Grand, and Voyager.  I’ve finally systematized the grand schema into The Great Ages of Discovery, now headed for publication. I see exploration as a cultural movement and, for the West, a kind of quest narrative. My exploration books do better critically and commercially than my fire books.  Exploration has a literary tradition; fire, apart from disaster and battlefield stories, doesn’t.  It’s been helpful to have two themes – it puts me back into the two-cycle rhythms I knew in my rim-campus days and keeps me from getting too stale in either one.  If I were starting over, I’d write the two series under different names.

 

In 2012, you wrote a book with your daughter, The Last Lost World. What was it like working with her, and what kind of expertise did she bring to the book?

My oldest daughter, Lydia, was attracted to archaeology and paleoanthropology and went to graduate school to study them.  Then she decided that the history of the field was more appealing. For years we joked about writing a book together sometime.  She graduated in 2009, a horrible moment for a new Ph.D., so I thought that the time had come. I had a sabbatical semester, and we took her topic and wrote The Last Lost World.  I knew about a few of the subjects, and Lydia the others, and she directed our research.  I credit her with keeping us (me) on message. I have good memories of the project, particularly the weeks we spent at our mountain cabin revising. (She’s gone on to a successful career as a writer; her latest, Genuine Fakes: How Phony Things Teach Us About Real Stuff, is just out.)  And, yes, we’re still speaking to each other.

 

Is there any other information you want people to know about you or the work that you do?

Because I ended up in a school of life sciences, I didn’t do much with graduate students.  Historians didn’t want someone from biology on their committee, and biologists didn’t want a historian.  Eventually, I decided to offer a course on nonfiction writing, and then wrote a couple of books about writing books.  Call it my postmodern phase.

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/173629
A New History of American Liberalism

 

Historians interested in the history of political philosophies would do well to read James Traub's new book What Was Liberalism? The Past, Present, and Promise of a Noble Idea. Traub's ambitious book documents how liberalism has evolved over time. For John Stuart Mill and Thomas Jefferson, it was mostly a check on government power and a safeguard of individual liberty. For Theodore Roosevelt and Franklin Roosevelt, liberalism meant using government power to regulate business and promote social welfare. More recently, it has been about government policies promoting equality and inclusion.

 

But Traub frets that liberalism has foundered. People's faith in government has eroded. Working and middle class people have become disenchanted. Traub recounts that George Wallace, running for president in 1964, rebuked what he called liberal elites. "They have looked down their noses at the average man in the street too long," Wallace cried. That alienation grew and culminated in Donald Trump's 2016 campaign. Trump, as president, keeps hammering at liberal values. In addition, in Traub's view, Trump poses a threat to political civility, free speech, and the rule of law.

 

Traub's book begins with a chapter on "Why Liberalism Matters."  He concludes it with a chapter on "A Liberal Nationalism" that laments liberalism's eclipse but looks for signs of its potential resurgence. He finds faint hope in liberalism's history of adaptation and resurgence. Liberals need to identify with opportunity, national confidence, inclusion, civility. They need to be seen as champions of "the public good." 

 

In  the book's last sentence, he says "Liberalism will renew itself only if enough people believe that its principles are worth fighting for."

 

James Traub's sense of concern contrasts sharply with the buoyancy, optimism, self-confidence and determination that helped make liberalism a success in the past.

 

In his odyssey through liberalism's history, Traub brings in Hubert H. Humphrey (1911-1978), longtime Democratic senator from Minnesota, Vice President (1965-1969) and the party's unsuccessful presidential nominee in 1968. Traub draws extensively on Humphrey's autobiography, The Education of a Public Man: My Life in Politics, published in 1976. A better book, though, for insight into the liberal mind at the movement's high tide in the 1960's -- and a contrast to liberal disarray and doldrums today -- is Humphrey's 1964 book The Cause is Mankind: A Liberal Program for Modern America.

 

Humphrey exudes optimism and confidence: liberals know what the nation needs and are determined to secure it.  “The enduring strength of American liberalism," Humphrey wrote in the book, "is that it recognizes and welcomes change as an essential part of life, and moves to seize rather than evade the challenges and opportunities that change presents. It is, basically, an attitude toward life rather than a dogma—characterized by a warm heart, an open mind, and willing hands.”

 

To be sure, there were lots of challenges. "We are living in an age when America seems to be bursting with issues and problems. We must secure civil rights for all our citizens. We must end poverty. Our economy must grow in all parts of the country. Automation and technology must create new jobs, not more jobless...We must rebuild our cities, revitalize our rural areas... We must conserve our natural resources."

 

But in Humphrey's sunny view of things, challenges were little more than opportunities for reform.  Nothing was too difficult for Americans who needed to endorse bold government initiatives engineered by the liberal spirit.  Humphrey's book included a chapter on planning. He had proposals to streamline the work of Congress. He had proposals for reconciling big business and labor and preserving competition. "The chief economic role of government," he wrote, "must be the smoothing of the way for new men and new ideas."

 

He endorsed the "welfare state" and government's responsibility to ensure human dignity and a decent standard of living for everyone. He proposed a massive "war on poverty" that was bolder than what President Lyndon Johnson was sponsoring. Better agricultural policies would preserve family farms and at the same time provide food in abundance. Federal education aid would boost that sector.

 

Civil rights, something Humphrey had championed for years, would keep progressing.

 

Sometimes, it would take experimentation and improvisation. No matter, said Humphrey. Americans were good at that.

 

Humphrey did not foresee the war in Vietnam and other events that would soon undercut the Johnson/Humphrey liberal domestic agenda. In his book, it was all about forward momentum into a beckoning, bracing future. American liberalism "sees free competitive enterprise as the mainspring of economic life and is dedicated to the maintenance of the traditional freedoms of speech, of the press, of assembly and the like." But it also stands behind "the use of power of the state to achieve both freedom and a reasonable measure of equality."

 

Liberals looking for an enlightening but sobering account of their movement's history should read James Traub. 

 

But liberals should also read Hubert Humphrey for some much-needed inspiration.

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174019
Capitalism Versus Socialism: Did Capitalism Really Win?

 

In a recent op-ed, “I Was Once a Socialist. Then I Saw How It Worked,” conservative columnist David Brooks wrote, “We ran that social experiment [between capitalism and socialism] for 100 years and capitalism won.” But did capitalism really win? As I indicated in a chapter (“Capitalism, Socialism, and Communism”) in my An Age of Progress? Clashing Twentieth-Century Global Forces, history tells a more complicated story.

 

As sociologist Max Weber and conservative economist Milton Friedman have told us, the primary purpose of capitalism is earning a profit. Friedman even believed it was business’s main “social responsibility.” Sociologist Daniel Bell wrote that capitalism has “no moral or transcendental ethic.” 

 

 

Nineteenth-century capitalism provided no adequate answers for problems such as unsafe working conditions, unfair business practices, pollution, public health, slum housing, or the abuse of child labor. Nor did it address the distribution of income, leaving unanswered whether great poverty should exist alongside great wealth.

 

 

At that time socialism, especially Marxist socialism, arose as the major challenger of capitalism. By 1917, socialism had split into two principal types: the communist variety led by Vladimir Lenin--later formalized in the name Union of Soviet Socialist Republics (USSR)--and a revisionist type, more common in western Europe, that advocated democratic, reformist means of obtaining power. The goal of such socialism was primarily greater economic equality through such steps as government ownership of at least the chief means of production. In general, democratic socialists wished to control and regulate the economy on behalf of the entire population. In the German parliamentary election of 1912, the German Social Democratic Party received more votes than any other party.

 

 

Marx’s ideas also influenced trade-unionism and other isms that challenged nineteenth-century capitalists. One such was Progressivism, a cross-fertilizing trans-Atlantic movement that was powerful enough in the USA to produce the Progressive Era from 1890 to 1914.  

 

 

But Progressivism did not attempt to overthrow or replace capitalism, but to constrain and supplement it in order to insure that it served the public good. As Daniel Rodgers indicates in his Atlantic Crossings: Social Politics in a Progressive Age (2000), it was a diverse movement “to limit the socially destructive effects of morally unhindered capitalism, to extract from those [capitalist] markets the tasks they had demonstrably bungled, to counterbalance the markets’ atomizing social effects with a countercalculus of the public weal [well-being].” 

 

 

After World War I and a dozen years of Republican rule, progressivism returned with the election of Franklin Roosevelt (FDR) in 1932 and his subsequent implementation of the New Deal. 

 

From FDR’s time to our own, the U.S. economy, as our State Department indicated in 2001, has been “mixed”: “The United States is often described as a ‘capitalist’ economy,” but it “is perhaps better described as a ‘mixed’ economy, with government playing an important role along with private enterprise.”

 

Following World War II, many western European economies also became mixed as they introduced more welfare-state provisions--social programs assisting the poor, old, sick, and unemployed. Generally, compared to the USA, western Europe gave more assistance to the needy and the government controlled more of the economy. 

 

 

But in both the USA and western Europe the degree of government regulation has fluctuated considerably. Prominent European democratic socialists Chancellor Willy Brandt of West Germany and President Francois Mitterrand of France led their countries from 1969 to 1974 and 1981 to 1995, respectively. Conversely, in the 1980s Ronald Reagan in the USA and Margaret Thatcher in Britain cut back the government regulation of capitalism. While progressivism, the New Deal, and welfare-state policies inched liberals closer to democratic socialism, the Reagan-Thatcher opposition to “creeping socialism” and “big government” encouraged conservatives to place more trust in capitalism or “the free market.”

 

Today, not only are there very different types of capitalism in different countries--Brooks, for example, distinguishes “between a version of democratic capitalism, found in the U.S., Canada and Denmark, and forms of authoritarian capitalism, found in China and Russia”--but U.S. capitalism today is not the same as it was before the Progressive Era. Nor is democratic socialism in Europe the same as it was in 1912, when the German Social Democratic Party demonstrated its popularity.

 

A fair and unbiased review, “Democratic Socialist Countries 2019,” concludes that most prominent European countries contain a mix of capitalist and democratic-socialist elements. For example, it states, “Norway, like other Scandinavian countries, is . . . [not] fully socialist nor fully capitalist.”

 

Thus, Brooks is too simplistic when he writes “capitalism won.” If one equates socialism with communism, as Brooks comes close to doing, then saying capitalism won over socialism is closer to the truth. For Soviet-style communism did collapse by 1991 with the disintegration of the USSR. Still, that’s not completely accurate because western economic-political systems are more mixed than purely democratic capitalist.

 

Brooks’ narrow view of socialism and triumphant take on capitalism require a historical corrective because of three significant  problems. 

 

The first is the political use of simplistic capitalism-vs.-socialism propaganda. Socialism and socialist have long been scare words conservatives hurl at opponents. Social Security? Socialist! Medicare? Socialist! In the 1960s when Congress debated Medicare, the American Medical Association (AMA) hired Ronald Reagan to speak out against it on a record entitled “Ronald Reagan Speaks Out Against Socialized Medicine.”

 

More recently, Donald Trump stated that socialism “promises unity, but it delivers hatred and it delivers division. Socialism promises a better future, but it always returns to the darkest chapters of the past. That never fails. It always happens. Socialism is a sad and discredited ideology rooted in the total ignorance of history and human nature.” His campaign has claimed that “Bernie Sanders has already won the debate in the Democrat primary, because every candidate is embracing his brand of socialism.”

 

The second problem is that our political-economic development has made us more of a mixed system than a capitalistic one, thanks to such developments as Progressivism, the New Deal, Lyndon Johnson’s “Great Society,” and Medicare.

 

The third problem is that capitalism’s main aim of profit-seeking still needs (in Bell’s words) a higher “moral or transcendental ethic.” Seeking the common good is one such ethic that has often been suggested, from the Progressive era to, more recently, Pope Francis and Nobel Prize–winning economist Joseph Stiglitz.

 

In previous articles such as “BP, Corporations, Capitalism, Progressivism, and Government” and “The Opioid Crisis and the Need for Progressivism,” I dealt with the profit-at-all-costs operations of corporations such as British Petroleum (BP) and Purdue Pharma. I also addressed why governments need to regulate companies responsible for oil spills and opioid deaths to ensure they serve the common good.

 

 

In the last two years, two more notable cases of insufficient corporate concern for anything but profits emerged, again demonstrating the need for government oversight. 

 

 

The first involves Pacific Gas & Electric (PG&E). In early December 2019, it agreed to pay $13.5 billion for causing Northern California wildfires that destroyed thousands of homes and businesses and killed more than 100 people in just two of the fires. The LA Times noted that critics believe the company put “short-term profits before necessary safety measures.” A Marxist publication’s headline read “Profits before people: Why PG&E turned off the lights in California.”

 

The second case involves aircraft manufacturer and defense contractor Boeing. In late 2018 and early 2019, two Boeing 737 Max 8 jets crashed, killing 346 people. Again, profits were placed before safety. One Daily Beast article declared that “top managers in the company’s Chicago headquarters, more alert to Wall Street than airline safety, gave priority to stock buybacks and shareholder returns.”

 

Again, government regulation of Boeing was insufficient. Captain Chesley “Sully” Sullenberger, the pilot depicted by Tom Hanks in the movie “Sully: Miracle on the Hudson,” criticized the USA for not providing sufficient funds to the Federal Aviation Administration (FAA), leaving it ill-equipped to oversee aircraft safety. Instead, Sully charged, the FAA allowed Boeing itself to certify that it was meeting safety standards. As he put it: “To make matters worse, there is too cozy a relationship between the industry and the regulators. And in too many cases, FAA employees who rightly called for stricter compliance with safety standards and more rigorous design choices have been overruled by FAA management, often under corporate or political pressure.”

 

Thus, history demonstrates that the assertion that capitalism “won” is inaccurate. The irresponsibility of corporations like BP, Purdue Pharma, PG&E, and Boeing indicates the ongoing cost of unregulated capitalism. Today, President Trump continues to assail the type of enlightened regulation produced by the Progressive Era and FDR. The main task of voters in 2020 and thereafter will not be to choose whether we want capitalism or democratic socialism. Rather, it will be to decide the best pragmatic mix of the two (and the best presidential candidate) to further the common good.

 

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/173969
Have Americans Usually Supported Their Wars?

Editor's note January 7, 2020: After recent events in Iraq, many people across the globe are worried that the United States and Iran might go to war. Many attended anti-war protests over the weekend. Here is helpful historical context. 

Related Links

●  Lawrence F. Kaplan: What the Public Will Stomach

● Stanley Karnow: Did Public Opinion Really Shift After Tet? 

It seemed to come as a bit of a shock to the Bush administration that Americans turned against the war in Iraq. Just as administration officials weren't prepared for the fierce resistance that developed to the American occupation of Iraq, neither were they prepared for a change in public opinion at home as the war dragged on. Cindy Sheehan's galvanizing protests outside the president's vacation home caught officials by surprise.

But throughout American history there has always been significant opposition to war. New England states threatened to secede from the union during the War of 1812, which severely hampered the trade carried in New England ships. The Mexican-American War was opposed by leading Whigs including Congressman Abraham Lincoln. During the Civil War President Lincoln faced the opposition of the Copperheads, many of whom he had thrown in jail. The Spanish-American War triggered a robust anti-imperialist movement led by William Jennings Bryan.

In the twentieth century, as Hazel Erskine demonstrated in her widely cited 1970 article, "Was War a Mistake?" (Public Opinion Quarterly), "the American public has never been sold on the validity of any war but World War II." She noted that as of 1969--a year after the Tet Offensive and the brief invasion of the American embassy in Saigon--"in spite of the current anti-war fervor, dissent against Vietnam has not yet reached the peaks of dissatisfaction attained by either World War I or the Korean War."

Reviewing the wars of the 20th century, she noted: "In answer to quite comparable questions, in 1937 64 per cent called World War I a mistake, in 1951 62 per cent considered Korean involvement wrong, whereas in 1969 burgeoning anti-Vietnam dissent nationwide never topped 58 per cent."

Her charts demonstrated the extent of American opposition to war:

(Erskine's charts are not reproduced here.)

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/14857
Winter Isn’t Coming. Prepare for the Pyrocene.

A massive fire burns the Amazon in Brazil

 

 

Millions of acres are burning in the Arctic, thousands of fires blaze in the Amazon, and with seemingly endless flareups in between, from California to Gran Canaria – fire seems everywhere, and everywhere dangerous and destabilizing. With a worsening climate, the fires dappling Earth from the tropics to the tundra appear as the pilot flames of an advancing apocalypse.  To some commentators, so dire, so unprecedented are the forecast changes that they argue we have no language or narrative to express them.  

 

Actually, the fire scene is worse than the headlines and breathless commentaries suggest because it is not just about bad burns that crash into towns and trash countrysides.  It’s equally about the good fires that have vanished because they are suppressed or no longer lit.  More of the world suffers from a famine of good fires than from a surfeit of bad ones; the bad ones are filling a void; they are not so much wild as feral.  

 

Underwriting both is that immense inflection in which humans turned from burning living landscapes to burning lithic ones in the form of fossil fuels.  That is the Big Burn of today, acting as a performance enhancer on all aspects of fire’s global presence.  So vast is the magnitude of these changes that we might rightly speak of a coming Fire Age equivalent in stature to the Ice Ages of the Pleistocene.  Call it the Pyrocene.

 

So there does exist a narrative, one of the oldest known to humanity, and one that has defined our distinctive ecological agency. It’s the story of fire.  Earth is a uniquely fire planet – it has been since life clambered onto the continents.  Equally, humans are a uniquely fire creature, not only the keystone species for fire but a species monopolist over its manipulation.  The fires in the Arctic testify to the planetary antiquity of fire.  Nearly all are kindled by lightning and burn biotas nicely adapted to fire; many could be suppressed, but extinguishing them will only put off, not put out, the flames. By contrast, the fires in the Amazon bear witness to a Faustian pact that hominins made with fire so long ago it is coded into our genome.  They are set by people in circumstances that people made, well outside ecological barriers and historical buffers.

 

This is a narrative so ancient it is prelapsarian. Our alliance with fire has become a veritable symbiosis.  We got small guts and big heads because we learned to cook food.  We went to the top of the food chain because we learned to cook landscapes.  Now we have become a geological force because we have begun to cook the planet.  We have taken fire to places and times it could never have reached on its own, and it has taken us everywhere, even off world. We have leveraged fire; fire has leveraged us.

 

How this happened is a largely hidden history – hidden in plain sight.  Fire disappeared as an integral subject about the time we hid fire into Franklin stoves and steam engines.  (The only fire department at a university is the one that sends emergency vehicles when an alarm sounds.)  It lost standing as a topic in its own right.  As with the fires of today, its use in history has been to illustrate other themes, not to track a narrative of its own.  

 

Yet how the present scene came to be is clear enough in its general contours.  How, outfitted with firesticks, early humans could take over select biotas.  How, with axes and plows and livestock as fire fulcrums, societies could recode the patches and pulses of vast swathes of land for agriculture.  How, hungering for ever more firepower, we turned from burning living landscapes to burning lithic ones – once-living biomass converted over eons into oil, gas, lignite, and coal.  Our firepower became unbounded.

 

 

That is literally true.  The old quest for sources has morphed into one for sinks.  The search for more stuff to burn has become a problem of where to put all the effluent.  Industrial combustion can burn without any of the old ecological checks-and-balances: it can burn day and night, winter and summer, through drought and deluge.  We are taking stuff out of the geologic past and unleashing it into the geologic future.  

 

It’s not only about changing climate, or acidifying oceans. It’s about how we live on the land. Land use is the other half of the modern dialectic of fire on Earth, and when a people shift to fossil-fuels, they alter the way they inhabit landscapes.  They rely on industrial pyrotechnologies to organize agriculture, transportation, urban patterns, even nature reserves, all of which tend to aggravate the hazards from bad fire and complicate the reintroduction of good fire. The many conflagrations sparked by powerlines nicely capture the pyric collision between living and lithic landscapes. Still, even if fossil-fuel combustion were tamed, we would yet have to work through our deranged relationship to fires on living landscapes.  

 

Because fire is a reaction, not a substance, the scale of our fire-induced transformations can be difficult to see.  But we are fashioning the fire-informed equivalents of ice sheets, mountain glaciers, pluvial lakes, outwash plains, and of course changing sea levels, not to mention sparking wholesale extinctions.  Too much bad fire, too little good, too much combustion overall – it’s an ice age for fire.  The Pyrocene is moving from metaphor to descriptor.

 

It’s all there: narrative, analogue, explication.  A couple of centuries ago we began hiding our fires in machines and off site, which can make it difficult for modern urbanites to appreciate how profoundly anthropogenic fire practices inform Earth today.  We use the rampaging flames to animate other agendas, not to understand what fire is telling us.  But fire, the great shape-shifter, is fast morphing beyond our grasp.  

 

What does a full-blown fire age look like?  We’re about to find out.

 

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/172842
Roundup Top 10!  

 

Why Did the U.S. Kill Suleimani?

by Elizabeth Cobbs and Kimberly C. Field

The attack illustrates America’s lack of a clear grand strategy — and why we need one immediately.

 

War with Iran is not inevitable — but the U.S. must change course

by Kelly J. Shannon

The relationship between the countries, once friends and allies, has soured — because of U.S. aggression.

 

 

The Global War of Error

by Tom Engelhardt

Failure is the new success and that applies as well to the “industrial” part of the military-industrial complex.

 

 

Putin’s Big Historical Lie

by Anne Applebaum

In a series of comments in late December, the Russian president appeared to blame Poland for the outbreak of the Second World War.

 

 

The Colonization of Puerto Rico and the Limits of Impeachment

by DJ Polite

It is misleading to call impeachment "justice" when it reflects the priorities of empire.

 

 

It’s 1856 All Over Again

by Steve Inskeep

Immigration. Race. Demographic change. Political demagogy. That year’s presidential race had it all. What can it tell us about 2020?

 

 

Before the ‘Final Solution’ There Was a ‘Test Killing’

by Kenny Fries

Too few know the history of the Nazi methodical mass murder of disabled people. That is why I write.

 

 

Yes, Bernie Sanders Could Be the Nominee—and It Would Be an Epic Nightmare for Democrats

by Ronald Radosh

Sanders is where he is today in part because no one has really attacked him. But just wait until Republicans spend a billion dollars painting him as an extremist.

 

 

Why policymakers must act to preserve information freedom at home and abroad

by Diana Lemberg

A Cold War dichotomy pitting capitalism and democracy versus state control misreads history.

 

 

 

 

Evangelicals using religion for political gain is nothing new. It is a US tradition

by Reverend William Barber

No one who has read US history can be surprised by the hypocrisy of Evangelicals for Trump but it also tells us how their undoing will inevitably come.

 

Fri, 24 Jan 2020 17:40:06 +0000 https://historynewsnetwork.org/article/174014
A History of Climate Change Science and Denialism

The girl got up to speak before a crowd of global leaders. “Coming here today, I have no hidden agenda. I am fighting for my future. Losing my future is not like losing an election or a few points on the stock market. I am here to speak for all generations to come.” She continued: “I have dreamt of seeing the great herds of wild animals, jungles and rainforests full of birds and butterflies, but now I wonder if they will even exist for my children to see. Did you have to worry about these little things when you were my age? All this is happening before our eyes.” She challenged the adults in the room: “Parents should be able to comfort their children by saying ‘everything’s going to be alright’, ‘we’re doing the best we can’ and ‘it’s not the end of the world’. But I don’t think you can say that to us anymore.”

 

No, these were not Greta Thunberg’s words earlier this year. This appeal came from Severn Suzuki at the Rio Earth Summit back in 1992. In the 27 years since, we have produced more than half of all the greenhouse gas emissions in history.

 

Reading recent media reports, you could be forgiven for thinking that climate change is a sudden crisis. From the New York Times: “Climate Change Is Accelerating, Bringing World ‘Dangerously Close’ to Irreversible Change.” From the Financial Times: “Climate Change is Reaching a Tipping Point.” If the contents of these articles have surprised Americans, that reveals far more about the national discourse than any new climate science. Scientists have understood the greenhouse effect since the 19th century. They have understood the potential for human-caused (anthropogenic) global warming for decades. Only the fog of denialism has obscured the long-held scientific consensus from the general public.

 

Who knew what when?

Joseph Fourier was Napoleon’s science adviser. In the early 19th century, he studied the nature of heat transfer and concluded that given the Earth’s distance from the sun, our planet should be far colder than it was. In an 1824 work, Fourier explained that the atmosphere must retain some of Earth’s heat. He speculated that human activities might also impact Earth’s temperature. Just over a decade later, Claude Pouillet theorized that water vapor and carbon dioxide (CO2) in the atmosphere trap infrared heat and warm the Earth. In 1859, the Irish physicist John Tyndall demonstrated empirically that certain molecules such as CO2 and methane absorb infrared radiation. More of these molecules meant more warming. Building on Tyndall’s work, Sweden’s Svante Arrhenius investigated the connection between atmospheric CO2 and the Earth’s climate. Arrhenius devised mathematical rules for the relationship. In doing so, he produced the first climate model. He also recognized that humans had the potential to change Earth’s climate, writing “the enormous combustion of coal by our industrial establishments suffices to increase the percentage of carbon dioxide in the air to a perceptible degree."  

Later scientific work supported Arrhenius’ main conclusions and led to major advancements in climate science and forecasting. While Arrhenius’ findings were discussed and debated in the first half of the 20th century, global emissions rose. After WWII, emission growth accelerated and began to raise concerns in the scientific community. During the 1950s, American scientists made a series of troubling discoveries. Oceanographer Roger Revelle showed that the oceans had a limited capacity to absorb CO2. Furthermore, CO2 lingered in the atmosphere for far longer than expected, allowing it to accumulate over time. At the Mauna Loa Observatory, Charles David Keeling conclusively showed that atmospheric CO2 concentrations were rising. Before John F. Kennedy took office, many scientists were already warning that current emissions trends had the potential to drastically alter the climate within decades. Revelle described the global emissions trajectory as an uncontrolled and unprecedented “large-scale geophysical experiment.”

 

In 1965, President Johnson received a report from his science advisory committee on climate change. The report’s introduction explained that “pollutants have altered on a global scale the carbon dioxide content of the air.” The scientists explained that they “can conclude with fair assurance that at the present time, fossil fuels are the only source of CO2 being added to the ocean-atmosphere-biosphere system.” The report then discussed the hazards posed by climate change including melting ice caps, rising sea levels, and ocean acidity. The conclusion from the available data was that by the year 2000, atmospheric CO2 would be 25% higher than pre-industrial levels, at 350 parts per million. 

 

The report was accurate except for one detail. Humanity increased its emissions faster than expected and by 2000, CO2 concentrations were measured at 370 parts per million, nearly 33% above pre-industrial levels.
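
Arrhenius’s rule, in the form climate science still uses, says that warming scales with the logarithm of the CO2 concentration: each doubling adds roughly the same temperature increment. Here is a minimal sketch of that arithmetic; the function and parameter names are illustrative rather than drawn from any climate library, and the 3°C-per-doubling sensitivity is the figure later assessments converged on.

```python
import math

def warming_from_co2(co2_ppm, baseline_ppm=280.0, degrees_per_doubling=3.0):
    """Arrhenius-style estimate of eventual (equilibrium) warming, in deg C.

    Warming is proportional to the number of CO2 doublings relative to a
    pre-industrial baseline of roughly 280 parts per million.
    """
    doublings = math.log2(co2_ppm / baseline_ppm)
    return degrees_per_doubling * doublings

# 350 ppm: the 1965 report's projection for 2000.
# 370 ppm: the level actually measured in 2000.
# 560 ppm: a full doubling of the pre-industrial level.
for ppm in (350, 370, 560):
    print(f"{ppm} ppm -> {warming_from_co2(ppm):+.2f} deg C above pre-industrial")
```

By this crude yardstick, the 370 ppm measured in 2000 implies roughly 1.2°C of eventual warming above pre-industrial levels; the warming observed at any given moment lags behind, largely because the oceans take up heat slowly.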

 

Policymakers in the Nixon Administration also took notice of the mounting scientific evidence. Adviser Daniel Patrick Moynihan wrote to Nixon that it was “pretty clearly agreed” that CO2 levels would rise by 25% by 2000. The long-term implications of this could be dire, with rising temperatures and rising sea levels, “goodbye New York. Goodbye Washington, for that matter,” Moynihan wrote. Nixon himself pushed NATO to study the impacts of climate change. In 1969, NATO established the Committee on the Challenges of Modern Society (CCMS) partly to explore environmental threats.

 

The Clinching Evidence

By the 1970s, the scientific community had long understood the greenhouse effect. With increasing accuracy, they could model the relationship between atmospheric greenhouse gas concentrations and Earth’s temperature. They knew that CO2 concentrations were rising, and human activities were the likely cause. The only thing they lacked was conclusive empirical evidence that global temperature was rising. Some researchers had begun to notice an upward trend in temperature records, but global temperature is affected by many factors. The scientific method is an inherently conservative process. Scientists do not “confirm” their hypothesis, but instead rule out alternative and “null” hypotheses. Despite the strong evidence and logic for anthropogenic global warming, researchers needed to see the signal (warming) emerge clearly from the noise (natural variability). Given short-term temperature variability, that signal would take time to fully emerge. Meanwhile, as research continued, other alarming findings were published.
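
A toy version of that detection problem shows why patience was required. Purely as an illustration (this is a sketch of the statistical logic, not any agency’s actual method), one can fit a linear trend to a noisy temperature record and ask how often trend-free noise alone would produce a slope at least that steep; the null hypothesis falls only when that probability becomes negligible.

```python
import random
import statistics

def ols_slope(series):
    """Least-squares slope of a series against time (units per step)."""
    n = len(series)
    t_mean = (n - 1) / 2
    y_mean = statistics.fmean(series)
    cov = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(series))
    var = sum((t - t_mean) ** 2 for t in range(n))
    return cov / var

def null_probability(series, trials=10_000, seed=0):
    """Monte Carlo test of the null hypothesis: how often does pure,
    trend-free noise produce a slope as steep as the observed one?
    Using the record's own spread as the noise level makes the null
    deliberately generous."""
    rng = random.Random(seed)
    observed = ols_slope(series)
    noise_sd = statistics.stdev(series)
    hits = sum(
        ols_slope([rng.gauss(0.0, noise_sd) for _ in series]) >= observed
        for _ in range(trials)
    )
    return hits / trials

# Toy record: a 0.01 deg C/yr warming trend buried in 0.1 deg C of
# year-to-year noise, roughly the signal-to-noise problem of the era.
rng = random.Random(42)
record = [0.01 * year + rng.gauss(0.0, 0.1) for year in range(50)]
print(f"slope: {ols_slope(record):.4f} deg C/yr")
print(f"p(null): {null_probability(record):.4f}")
```

With only a decade or so of such data the null hypothesis typically survives; with five decades it fails decisively. Real climate noise is also autocorrelated rather than independent from year to year, which is one reason actual detection took even longer than this toy suggests.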

 

Scientists knew that CO2 was not the only greenhouse gas humans had put into the atmosphere. During the 1970s, research by James Lovelock revealed that levels of human-produced chlorofluorocarbons (CFCs) were rapidly rising. Used as refrigerants and propellants, CFCs were 10,000 times as effective as CO2 at trapping heat. Later, scientists discovered CFCs also destroy the ozone layer.

 

In 1979, at the behest of America’s National Academy of Sciences, MIT meteorologist Jule Charney convened a dozen leading climate scientists to study CO2 and climate. Using increasingly sophisticated climate models, the scientists refined estimates for the scale and speed of global warming. The Charney Report’s foreword stated, “we now have incontrovertible evidence that the atmosphere is indeed changing and that we ourselves contribute to that change.” The report “estimate[d] the most probable global warming for a doubling of CO2 to be near 3°C.” Forty years later, newer observations and more powerful models have supported that original estimate. The researchers also forecast that CO2 levels would double by the mid-21st century. The report’s expected rate of warming agreed with numbers posited by John Sawyer of the UK’s Meteorological Office in a 1972 article in Nature. Sawyer projected warming of 0.6°C by 2000, which also proved remarkably accurate.

 

Shortly after the release of the Charney Report, many American politicians began to oppose environmental action. The Reagan Administration worked to roll back environmental regulations. Driven by a radical free-market ideology, it gutted the Environmental Protection Agency and ignored scientific concerns about acid rain, ozone depletion, and climate change.

 

However, the Clean Air and Clean Water Acts had already meaningfully improved air and water quality, and other nations had followed suit with similar anti-pollution policies. Interestingly, the success of these regulations made it easier for researchers to observe global warming trends. Many aerosol pollutants had the unintended effect of blocking incoming solar radiation, and as a result they had masked some of the emissions-driven greenhouse effect. As concentrations of these pollutants fell, a clear warming trend emerged. Scientists also corroborated ground temperature observations with satellite measurements, and historical ice cores provided independent evidence of the CO2-temperature relationship.

 

Sounding the Alarm

Despite his Midwestern reserve, James Hansen brought a stark message to Washington on a sweltering June day in 1988: “The evidence is pretty strong that the greenhouse effect is here.” Hansen led NASA’s Goddard Institute for Space Studies (GISS) and was one of the world’s foremost climate modelers. In his Congressional testimony, he explained that NASA was 99% certain that the observed temperature changes were not natural variation. The next day, the New York Times ran the headline “Global Warming Has Begun, Expert Tells Senate.” Hansen’s powerful testimony made it clear to politicians and the public where the scientists stood on climate change.

 

Also in 1988, the United Nations Environment Programme (UNEP) and the World Meteorological Organization (WMO) created the Intergovernmental Panel on Climate Change (IPCC). The IPCC was created to study both the physical science of climate change and its numerous effects; to do that, it evaluates global research on climate change, adaptation, mitigation, and impacts. Thousands of leading scientists contribute to IPCC assessment reports as authors and reviewers. The reports represent one of the largest scientific endeavors in human history and showcase the scientific process at its very best. The work is rigorous, interdisciplinary, and cutting edge.

 

While the IPCC has contributed massively to our understanding of our changing world, its core message has remained largely unchanged for three decades. The First Assessment Report (FAR) in 1990 stated “emissions resulting from human activities are substantially increasing the atmospheric concentrations of the greenhouse gases.” Since then, the dangers have only grown closer and clearer with each report. New reports not only forecast hazards but describe the present chaos too. As the 2018 Special Report (SR15) explained: “we are already seeing the consequences of 1°C of global warming through more extreme weather, rising sea levels and diminishing Arctic sea ice, among other changes.”

 

Wasted Time

As this story has shown, climate science is not a new discipline, and the scientific consensus on climate change is far older than many people think. Ironically, the history of climate denialism is far shorter. Indeed, a 1968 Stanford University study reporting that “significant temperature changes are almost certain to occur by the year 2000 and these could bring about climatic changes” was funded by the American Petroleum Institute. During the 1970s, fossil fuel companies conducted research demonstrating that CO2 emissions would likely increase global temperature. Only with political changes in the 1980s did climate denialism take off.

 

Not only is climate denialism relatively new, but it is uniquely American. No other Western nation has anywhere near America’s level of climate change skepticism. The epidemic of denialism has many causes. It is partly the result of a concerted effort by fossil fuel interests to confuse the American public about the science of climate change. It is partly due to free-market ideologues who refuse to accept a role for regulation. It is partly because of the media’s misguided notion of fairness and equal time for all views. It is partly due to the popular erosion of trust in experts. It is partly because the consequences of climate change are enormous and terrifying. Yet you can no more reject anthropogenic climate change than you can reject gravity or magnetism. The laws of physics operate independently of human belief.

 

However, many who bear blame for our current predicament do not deny the science. For decades, global leaders have greeted dire forecasts with rounds of empty promises. James Hansen has been frustrated by the lack of progress since his 1988 testimony: “All we’ve done is agree there’s a problem…we haven’t acknowledged what is required to solve it.” The costs of dealing with climate change are only increasing, and economic harms may run into the trillions. According to the IPCC’s SR15, to avoid some of climate change’s most devastating effects, global temperature rise should be kept below 1.5°C above pre-industrial levels. That would likely require a reduction in emissions to half of 2010 levels by 2030, and to net-zero emissions by 2050. Had the world embarked on that path after Hansen spoke on Capitol Hill, it would have required annual emissions reductions of less than 2%. Now, according to the latest IPCC report, the same goal requires annual reductions of nearly 8%. The 1.5°C target appears to be slipping out of reach.
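To see why delay raises the required pace so sharply, note that a constant annual cut of r halves emissions over N years when (1 − r)^N = 0.5. Below is a minimal sketch of that arithmetic; the start years and the constant-rate simplification are assumptions for illustration only, and the article’s ~8% figure also reflects how much emissions grew in the intervening decades.

```python
# Minimal sketch: the constant annual cut r that halves emissions
# in `years` years, solving (1 - r) ** years == 0.5 for r.
def annual_cut_to_halve(years: int) -> float:
    return 1 - 0.5 ** (1 / years)

# Illustrative start dates (assumptions of this sketch, not IPCC inputs):
print(f"Start in 1988, halve by 2030: {annual_cut_to_halve(2030 - 1988):.1%} per year")
print(f"Start in 2020, halve by 2030: {annual_cut_to_halve(2030 - 2020):.1%} per year")
# Output: roughly 1.6% versus 6.7% per year; delay compounds the burden.
```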

 

We have known about the causes of climate change for a long time. We have known about the impacts of climate change for a long time. And we have known about the solution to climate change for a long time. An academic review earlier this year demonstrated the impressive accuracy of climate models from the 1970s. This is no longer a scientific issue. While science can continue to forecast with greater geographic and temporal precision, the biggest unknown remains human action. What we choose today will shape the future.

New PBS Documentary "McCarthy" Highlights a Tumultuous Time in Our History

 

The new American Experience documentary “McCarthy," premiering January 6 on PBS, details the meteoric rise and fall of Wisconsin Senator Joseph McCarthy. Through interviews with historical scholars and first-hand witnesses, this impeccably sourced film tells the harrowing story of McCarthyism, the House Un-American Activities Committee, and the fear of a sinister communist threat—real or imagined—that gripped the American people during the Second Red Scare. By revisiting this tumultuous period in American history, viewers can reflect on our nation’s controversial past and draw clearly defined parallels between McCarthy’s time and our current political climate.

 

Ahead of the film’s release, I had the exciting opportunity to speak with its director, Sharon Grimberg, about the value of public television as a historical platform, the importance of reaching younger viewers, and what we can learn from the McCarthy era in modern-day America.

 

Q: What value, in your opinion, does public television have as a platform for teaching history?

 

A: Well, I worked for American Experience on staff for fifteen years. From my experience, I felt like we took on a lot of topics that are not taken on by other broadcasters, so what we really try to do is spend a lot of time really digging in and doing the research and telling stories in a very kind of in depth and thoughtful way and giving producers the time to think about the best way to tell a story. It’s harder for people to do in other broadcast networks—I don’t want to denigrate anybody else, but PBS is very committed to doing programming that’s thoughtful and also reaching people both in the home and in the classroom, reaching kids in all sorts of communities. In the past they’ve even taken pieces of the story and created a teaching module around them. So many people have said to me, “Sharon, my son just watched one of your films at school.” There’s definitely a commitment to catering to a broad range of viewers including kids in classrooms, and creating content that works for different communities. 

 

 

Q: In what ways do you think public broadcast documentaries like “McCarthy” positively impact the field of history?

 

A: I think the way that there’s a push and pull is that for a TV documentary to work, it has to have a narrative story, and not all historical scholarship is written that way, right? So I think there is a way in which there’s something to learn from telling stories narratively, because I think it is more accessible to people. But the thing about a documentary is that you can watch a two-hour documentary about McCarthy and you’re going to walk away knowing something, whereas there’s a lot of really great scholarship about this era and the Red Scare—for example David Oshinsky wrote a very, very fine book, really well-constructed and thoughtful—but it’s a much bigger commitment for the average person. On the other hand, maybe the film will encourage people to do more reading—pick up a book on the topic and make that bigger time commitment.

 

On this topic, there’s one story that always stands out to me. Years ago I was in Georgia, and my father got a speeding ticket when he was driving me there. We had to go to the police station, and when we got there, there was a police officer who was reading the collected works of Ibsen, and I asked her why. It was because she had just seen the play dramatized on PBS. She had gone to the library and gotten his collected works, because she had seen his play A Doll’s House on television. It just felt to me that there is this kind of synergy of what goes on television: someone who might not necessarily have picked up that particular book had gone to the library and gotten herself a copy, and there she was in this rural, one-room police house reading Ibsen. I do think, if you’re taken somewhere you haven’t been taken before, it opens new doors for you, and that’s what TV can do.

 

Q: When making historical content for public television, how do you appeal to younger audiences?

 

A: That’s such an interesting question. I have to say that when I was making this McCarthy film, younger people had no idea who Joseph McCarthy was—absolutely none. So I was very conscious when I was making this film about what people do and don’t know, and wanted to fill in the gaps around what people hadn’t already learned. It’s a very complicated story, and I wanted people to understand why there was a communist community in America during the 1950s: what appealed to people about communism, why they might have agreed with it at that particular moment, and why in the 1950s that might have seemed so scary. Why a woman who was a schoolteacher would be fired from her job, no questions asked, because she was a member of the communist party or even just a communist-dominated labor union. I wanted people to understand the context of those things—hoping that people who don’t come in with a lot of knowledge would understand the complexity of the time. You would need a certain level of interest to turn on the program, but I would hope that kids who see part of the film in the classroom would be intrigued.

 

Q: The documentary mentions how Joseph McCarthy actually started as a Democrat. Were his Republican views purely opportunistic, and if so, is this sort of flip-flopping a common behavioral pattern among demagogues?

 

A: It’s very hard to know exactly what he was thinking, but that’s certainly what David Oshinsky would say. He was looking at this whole landscape and he was very ambitious, and he was working as a judge for a while, but that wasn’t going to work out for him. His campaign aide even said to him once, “Why are you so glum? Even if you lose this campaign, you’re still going to be a judge.” He replied, “I don’t want to be a judge all my life—what are you thinking?” He saw that he had the chance to win the seat, and so that’s why he switched sides, I think. He was very ambitious, and I think he saw an opportunity.

 

Q: There are striking parallels between the McCarthy era and modern political discourse in terms of polarization and enmity between the two parties. Do you think this documentary will raise awareness of this repeated historical pattern?

 

A: I hope that it makes people think about the way in which democracy works, and the way in which we all have a part in that. Everybody comes with preconceptions, and it’s very easy to drown out people who oppose you instead of taking the time to figure out “why does this person see things so differently from the way I do?” That to me seems like a big take-home.

 

There was one story, one that didn’t make the final cut, about this guy called Harold Michaels who was a young Republican in Wisconsin. He had worked as a volunteer helping McCarthy get elected and worked on his reelection in 1952, but by 1954 he was completely disgusted with McCarthy. He had been a lifelong Republican, but he was one of the few to look around at what was happening and say, “This isn’t right.” So he ended up campaigning for McCarthy’s opponent in the next election. They didn’t get quite enough votes, but they made a good effort. So here’s somebody who was in the middle and shifted his perspective.

 

Q: The documentary mentions how the rest of the Republican party was wary of McCarthy, but continued to support him because “they had no other alternative.” Margaret Chase Smith was, in fact, the only Republican to stand up to him. In a political climate that is, once again, strikingly similar, what can we learn from Republicans’ reaction to McCarthy?

 

A: The thing that always sticks in my mind is the Edward Murrow broadcast, in which he says something like “this is no time for people to be quiet.” We cannot abdicate our responsibilities as citizens of a democracy; we have to protect freedom at home, and we are the beacon of freedom around the world. But we can’t defend freedom around the world if we’re not defending it here. What he’s saying is, democracy is fragile, and we are responsible for protecting other people’s freedoms and ensuring that America remains true to its founding ideals.

 

McCarthy did overstep bounds, and he became so reckless and thoughtless that people who hadn’t done anything were hurt. I think, if anything, we don’t do enough to explain what happened to some of the people called before his committee—one man even killed himself.  There were real, horrible consequences for people who, sometimes, had done nothing but be part of a communist-dominated union. They weren’t traitors to their country; they were just left-leaning. But he so overstepped his bounds that people began to have a distaste for that sort of ideology. Some of my historians in the film would say that he made anti-communism so distasteful that the government actually backed too far away from it, because of what had happened to ordinary people. It was counter-productive to be that reckless with ordinary people’s lives.

 

One story that really stuck out to me was when a graduate student named Leon was charged with contempt of Congress for refusing to answer McCarthy’s questions. He was brought to trial, but the case was thrown out on a technicality: when McCarthy walked into the room, a group of Irish Catholic spectators began to cheer. The judge threw out the case because the jury had been tainted by the cheering, and then held a bench hearing where he found that McCarthy didn’t have jurisdiction over the case. Even though the case was thrown out, Leon couldn’t find a job afterward, so he moved to Canada for over a decade. Leon’s experience really illustrates that most of those targeted by McCarthy were uninfluential people—they didn't have access to state secrets, they were just ordinary, left-leaning citizens.

 

I think that, overall, what the film says to people is that democracy is fragile, and we all have a responsibility to protect it. The ordinary person does that by voting and protesting and writing letters, and elected representatives by acting correctly.

 

Q: What did you find most interesting or surprising when you were creating this film?

 

A: What I found interesting is that America was very, very divided back then, and you could see that in the way that Republicans and Democrats treated each other in the Senate. It was a very divided time, venomous and acrimonious. You could see that in the archives, even—representatives got horrible letters from constituents, which is different from today because back then that sort of thing wasn’t broadcast on social media.

 

Another thing is that even though McCarthy was chastised in the Army-McCarthy hearings and then censured by the Senate, if you look at the polling all through the fall of 1954, there was still a very solid 30% of the country that supported him. It didn’t waver.

 

There is a whole lot of baggage that comes with how we understand things: you can point to anyone on the political spectrum and see that people come to things with their preconceived notions and worldview intact. You trust your sources because they fit into your worldview, and that doesn’t allow you to shift your position much because that’s what you bring to the table.

 

I was very struck with that. Despite everything, there were a lot of people who still felt he was fighting a good fight. And that’s why I wanted the film to end the way it does—with Splits saying that he was a patriot—because a lot of people still felt that way.

 

Q: Finally, what do you hope your audience will take away from “McCarthy?”

 

A: I think that I would say what I said before—that democracy is fragile and we’re all responsible. That, to me, seems an enduring truth. We can’t count on always living in a democracy, but we need to work at it to make sure that the vulnerable aren’t exploited, that the powerful don’t usurp too much power, and that the justice system is working fairly. Those things don’t happen unless we keep our eyes open and we’re active. We as citizens have a responsibility to keep our eyes open, listen, and speak up when we see things that aren’t right.

 

A Historian Reflects on the Return of Fascism

Members of the Maquis, French Resistance Fighters in World War II

 

Back in 1941, the year of my birth, fascism stood on the brink of conquering the world. During the preceding decades, movements of the Radical Right―mobilized by demagogues into a cult of virulent nationalism, racial and religious hatred, and militarism―had made great strides in nations around the globe.  By the end of 1941, fascist Germany, Italy, and Japan, having launched massive military invasions of other lands, where they were assisted by local rightwing collaborators, had conquered much of Europe, Asia, and the Middle East.

 

It was a grim time.

 

Fortunately, though, an enormous movement arose to resist the fascist juggernaut.  Led by liberals and assorted leftists around the world and eventually bolstered by the alliance of Britain, the Soviet Union, and the United States, this resistance movement ultimately prevailed.

 

The antifascist struggle of World War II established the groundwork for a new and better international order.  In January 1941, U.S. President Franklin D. Roosevelt, in a major public address, outlined what became known as The Four Freedoms.  The people of all nations, he proclaimed, should enjoy freedom of speech and expression, freedom of worship, freedom from fear, and freedom from want.  That August, Roosevelt and British Prime Minister Winston Churchill unveiled the Atlantic Charter, declaring that people should have the right to choose their own form of government, that force should be abandoned in world affairs, and that international action should promote improved living and working conditions for all people.  

 

These public declarations―coupled with the widespread discrediting of rightwing parties, movements, and ideas―led directly to the establishment, in 1945, of the United Nations.  According to the UN Charter, the purpose of the new world organization was to “to save succeeding generations from the scourge of war,” “to reaffirm faith in fundamental human rights,” and “to employ international machinery for the promotion of the economic and social advancement of all peoples.”

 

And, in fact, in the decades following World War II, there were significant strides forward along these lines.  Led by Eleanor Roosevelt, the United Nations issued a Universal Declaration of Human Rights, setting forth fundamental rights to be protected.  Furthermore, much of Europe, the cockpit of two terrible world wars, cast aside nationalism to establish a federal union.  Moreover, a wave of decolonization freed much of the world from foreign rule, UN forces engaged in numerous peacekeeping operations, and the United Nations and many national governments established economic aid programs for the world’s poorest countries.

 

Admittedly, national policies sometimes fell short of the new internationalist, antimilitarist, and egalitarian ideals and programs.  Governments―and particularly governments of the major powers―all too often ignored the United Nations and, instead, squandered their resources on military buildups and terrible wars.  Many governments also had a spotty record when it came to respecting human rights, promoting social and economic progress, and curbing the rising power of multinational corporations.

 

Even so, for decades, humane domestic policies―from banning racial discrimination to scrapping unfair immigration laws, from improving public health to promoting antipoverty efforts and workers’ rights―remained the norm in many nations, as did at least a token genuflection to peace and international law.  Political parties with a democratic socialist or liberal orientation, elected to public office, implemented programs emphasizing social justice and international cooperation.  On occasion, though far less consistently, centrist and communist governments fostered such programs, as well.  Only parties of the Radical Right attacked these policies across the board; but, swimming against the tide, they remained marginal.

 

Nevertheless, in the last decade or so, enormous headway has been made by movements and parties following the old fascist playbook, with rightwing demagogues trumpeting its key elements of virulent nationalism, racial and religious intolerance, and militarism.  Seizing particularly on mass migration, and funded by avaricious economic elites, the Radical Right has made startling progress―undermining the European Union, contesting for power in Britain, France, Germany, the Netherlands, and Greece, and taking control of such countries as Russia, India, Italy, Hungary, Poland, Turkey, Brazil, the Philippines, Israel, Egypt, and, of course, the United States.

 

Long before the advent of Donald Trump, the Republican Party had been shifting rightward, pulled in that direction by its incorporation of Southern racists and Christian evangelicals.  This political reorientation sped up after the election of Barack Obama sent white supremacists into a frenzy of rage and self-pity.

 

Trump’s 2015-16 campaign for the presidency accelerated the GOP’s radicalization.  Drawing upon unusually hate-filled rhetoric, he viciously denounced his Republican and Democratic rivals.  Along the way, he engaged in his characteristic lying and mocked or incited violence against his critics, the disabled, immigrants, racial minorities, Muslims, women, and the press.  His racism, xenophobia, and militarism, combined with his thuggish style and manifest lack of qualifications for public office, should have doomed his campaign. But, instead, he emerged victorious―a clear sign that a substantial number of Americans found his approach appealing.

 

As president, Trump has not only displayed a remarkable contempt for truth, law, civil liberties, the poor, civil rights, and women’s rights, but catered to the wealthy, the corporations, white supremacists, and religious fanatics.  He has also proved adept at inciting hatreds among his rightwing followers through racist, xenophobic diatribes delivered at mass rallies and through propaganda messages. Meanwhile, he has forged close alliances with his authoritarian counterparts abroad.  Either out of fear or love, Republican officeholders cling ever more tenaciously to him as the nation’s Supreme Leader.  If the GOP is not yet a fascist party, it is well on its way to becoming one.

 

Having grown up at a time when ranting maniacs dispatched their fanatical followers to stamp out freedom and human decency, I am, unfortunately, quite familiar with the pattern.

 

Even so, the struggle to shape the future is far from over.  During my lifetime, I have seen powerful movements wage successful fights for racial justice, women’s rights, and economic equality.  I have seen massive campaigns successfully challenge wars and nuclear insanity.  I have seen the emergence of inspiring political leaders who have toppled dictatorships against incredible odds.  Perhaps most important, I have seen millions of people, in the United States and around the globe, turn the tide against fascism when, some eight decades ago, it threatened to engulf the world.  

 

Let’s hope they can do it again.   

The Tea Party Revisited

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

Ten years ago, the Tea Party was big news. The Tea Party announced itself just as I began writing political op-eds in 2009. I found them deeply disturbing. They proclaimed their allegiance to freedom as loudly as they threatened mine. I didn’t agree with their economic claims that the deficit was America’s biggest problem, and I suspected their pose as the best protectors of the Constitution was a front for less reasonable beliefs about race, gender, and religion.

 

Founded in 2009 as a reaction to the election of Barack Obama as President, the federal bailouts of banks and other institutions in the wake of the great recession of 2008, and, later, the passage of the Affordable Care Act in 2010, the Tea Party entered conservative politics with a splash in the 2010 elections. NBC identified 130 candidates for the House and 10 for the Senate, all Republicans, as having strong Tea Party support. Among them, 5 Senate candidates and 40 House candidates won election. Those numbers are very high, because many Tea Party candidates defeated established politicians. Pat Toomey in Pennsylvania, Rand Paul in Kentucky, Marco Rubio in Florida, Ron Johnson in Wisconsin, and Mike Lee in Utah defeated more established politicians, including some incumbents, in both parties. They are all still Senators. Among the 5 Senate candidates who lost, Christine O’Donnell in Delaware, Sharron Angle in Nevada, and John Raese in West Virginia took extreme and sometimes laughable positions; Ken Buck in Colorado and Joe Miller in Alaska lost by tiny margins.

 

The Tea Party claimed to follow an ambitious agenda. One list on teaparty.org of “Non-negotiable Core Beliefs” included many economic items: “national budget must be balanced”; “deficit spending will end”; “reduce personal income taxes a must”; “reduce business taxes is mandatory”. A slightly different list called the “Contract from America” was also heavy with economic priorities: a constitutional amendment requiring a balanced budget; a single-rate tax system; “end runaway government spending”; “stop the pork”. The Contract included no social issues at all. The Core Beliefs began with “Illegal Aliens Are Here Illegally”, and included “Gun Ownership is Sacred”, “Traditional Family Values Are Encouraged”, and “English As Core Language Is Required”. Tea Partiers claimed complete allegiance to the Constitution as originally written.

 

Recently many commentators have asserted that the Tea Party was a failure and is dead. A New York Times article said “the ideas that animated the Tea Party movement have been largely abandoned by Republicans under President Trump,” because deficit spending has ballooned since he took office. Senator Rand Paul said, “The Tea Party is no more.” A New Yorker article noted “the movement’s failure,” because the Tea Party did not achieve a repeal of Obamacare. Jeff Jacoby, the conservative columnist for the Boston Globe, mourned its demise in February 2018 under the title “The Tea Party is dead and buried, and the GOP just danced on its grave.” He focused on the Tea Party’s inability to get Republicans to rein in spending.

 

Most of the successful Tea Party candidates from 2010 are no longer in Washington. Aside from the 5 successful Senators, only 16 of the 40 Tea Party House members are left. Justin Amash recently left the Republican Party after indicating support for impeachment. But those figures are not a surprise: the average tenure of a member of the House is just under 10 years, so about half should have left by now. Two moved up in the political world. Mick Mulvaney is now head of the Office of Management and Budget, and Tim Scott won election as a Senator.

 

The whole narrative of Tea Party failure is wrong, in my opinion. While Tea Party organizations proclaimed high-minded principles of fiscal restraint, I don’t think that complex budgetary issues or particular readings of the Constitution motivate masses of voters. Today’s Republican Party is entirely in the hands of Trump, who completely ignores adherence to the Constitution and a balanced budget, and Tea Partiers are delirious with joy. The enthusiasts who scream at Trump rallies are the same people who signed on to the Contract from America in 2010. Trump embodies their real core beliefs: white supremacy; opposition to abortion rights, gay marriage, transgender people, and anything that appears to deviate from their mythology of the “traditional family”; opposition to government regulation of private business, but support for government intrusion into private life; opposition to gender equality.

 

The social scientist Theda Skocpol, who studied the Tea Party grassroots at the beginning, dismissed their economic policies as window dressing. She argued in 2011 that these white, older, conservative Americans “concentrated on resentment of perceived federal government ‘handouts’ to ‘undeserving’ groups, the definition of which seems heavily influenced by racial and ethnic stereotypes.” She noted that “the opposition between working and nonworking people is fundamental to Tea Party ideology,” and that “nonworking” was assumed to refer to non-white. In a recent interview, Skocpol identifies Tea Party advocates as Christian conservatives, not libertarians. Today the Christian right shouts its joy about Donald Trump from every pulpit.

 

I was right and wrong about the Tea Party in 2010. I recognized that “The Tea Partiers are wrong. The people they support will increase government intrusion into our private lives, under the guise of protecting us from enemies all around, and will help big business exploit our private resources.”

 

I also wrote, “They won’t change American politics. Despite putting pretty faces like Glenn Beck and Sarah Palin on their posters, they’re way too unattractive. Like the guy who strolls into Starbucks with his gun, they might get a lot of attention, but they’ll make no friends.” How wrong that was. Their disdain for the views of other Americans, their distorted understanding of the Constitution, their blindness to facts which do not support their ideology, their racism and sexism, are now in control of the White House. The Republicans they called RINOs are gone.

 

They only supported limited government when a black man was President. Now they shout for the arrest of anyone they don’t like. The Tea Party no longer needs to attack the Republican Party from the right. They are the Republican Party, and their desire to recreate our country in their image is non-negotiable.

Trump is no Hitler: His Enablers are the Greater Problem

 

Comparisons of Donald Trump and Adolf Hitler are becoming more relevant as the president responds to further revelations of his priorities and his impeachment. Despite the protestations of some analysts who claim the contrast has no value, it is worth considering any historical antecedents that might give us insight into the current distressing political climate.

 

The two leaders’ tactics and personalities have been reported and analyzed over the last few years with fascinating similarities. For example, Trump’s first wife, Ivana, revealed he kept a collection of Hitler’s speeches at his bedside, and a respected Hitler biographer and scholar, Ron Rosenbaum, claims the president continues to use Mein Kampf as a playbook. Trump’s inflammatory rhetoric and scapegoating certainly emulate those of the infamous German leader.

 

However, most of the Trump-Hitler comparisons have been made without consideration of how the facilitators of dangerous leaders are central to understanding any parallels.

 

As Washington’s climate becomes dramatically polarized, the behavior of Trump’s allies reflects an abhorrent historical pattern. Impending disaster looms as facts are dismissed and personal attacks are the only response of the president and his defenders. We relentlessly hear the details in daily news reports; outright lies and fabrications have become so prevalent that there is a growing tolerance of this climate of dysfunction.

 

If we consider the larger picture, a blind and immoral political frenzy has seized the United States, similar in some ways to pre-war Nazi Germany. The rise of one of the most destructive dictators in world history was promoted and tolerated in a similar atmosphere. 

 

There are, of course, some clear differences. Though within his base he has a small army of domestic neo-Nazis and some supporters in the military, the president is incapable of exercising the kind of brutal authority that Hitler used to gain full control. He has succeeded in putting refugees in camps, but Trump has not been able to round up his perceived enemies and have them silenced. His hostility towards the press has yielded some violence, though much to his frustration, media continues to report on his worst offenses. Although he intimidates witnesses and detractors, Trump has not succeeded in creating enough animosity towards any enemy to deter investigators or distract a majority of the public from his ridiculous behavior. He imagines being all-powerful, but the president will not have the Capitol burned as Hitler torched the Reichstag. Trump has so far failed in his ability to control the country with the efficiency of the German führer. 

 

The degree of Trump's success has been achieved only because of the leverage he has on political cronies. Trump is no Hitler, and his acolytes are at a loss, lacking a truly powerful leader. In their empty defense of the president, whether repeating Trump’s angry excuses or remaining silent, many members of the party of Lincoln feign their innocence in allowing the unfolding constitutional crisis. How they manage to do so in the face of articles of impeachment and very specific constitutional violations will be telling. But the overt posturing of those compromised by self-serving loyalty and blatant hypocrisy is a symptom of political cowardice, thus making any change in position unlikely.

 

The collaborators who claim they are behind the president are watching the bizarre show, waiting to see which direction the wind blows with the public to best protect their own interests. They are not fools, thus like the president, they willingly put personal interests before the country’s. What degree of insanity and illegality will be needed before they admit the president deserves removal from office? 

 

Consider: if both the Senate and the House were now controlled by Republicans, would there be any remnant of a balance of power? And how much control would be granted to Mr. Trump? Perhaps Congress would tolerate a declaration of a national emergency that transferred all of its powers to the president’s cabinet. Could it happen here?

 

Hitler’s corporate and political supporters assisted in the collapse of Germany’s constitutional democracy in 1933, although a similar coup is very unlikely in the United States. Trump may have a narcissistic personality disorder and erupt with vitriolic diatribes like Hitler, but his minions have given their loyalty to a self-absorbed, self-incriminating president who is driving the country towards chaos. They have fallen for the tactics of a charlatan who commands the most powerful and menacing military force on the planet.

 

The president's continuing egomaniacal diplomatic decisions have initiated distress among some in his party. However, Republican defenders have not dared to admit the obvious: Trump’s betrayal of US international interests and his illegal seizure of foreign affairs for personal gain are part of the same outrageous, subversive behavior, establishing without doubt that he is impeachable. 

 

This bizarre and divisive epoch has exposed moral failings in government of the highest order. Investigations and judicial rulings must further unfold before there is any consensus on Trump’s legacy. Yet the empowerment granted by his protectors has already initiated a collapse of constitutional order and degradation of government.

 

We witness a repeating pattern that echoes through centuries: the enablers of ruinous tyrants light fires that eventually consume them. Some will recognize their catastrophic failures only in retrospect.

Reexamining the Mayflower 400 Years Later

 

Adapted from The Journey to the Mayflower by Stephen Tomkins, published by Pegasus Books. Reprinted with permission. All other rights reserved.

 

There were a number of considerations that made the Separatists look to America in 1617, but the most commonly cited motive for the sailing of the Mayflower – to escape persecution and worship freely – was not one of them. Persecution had driven them out of England to the Netherlands, but they had not suffered persecution in the Netherlands, and the Leiden church had especially good relations with the Dutch. Persecution prevented them from going home, but not from staying where they were.

 

The key consideration, according to the Separatists themselves, was that life in Dutch cities seemed just too grim for their church to have any future. They were losing the older generation, Bradford said, who seemed to be dying prematurely after years of unskilled urban labor, or, having used up their savings, returned to England to lodge with family. And they were losing the younger generation, who were fed up with godly poverty and surrounded by worldliness they would never have encountered in an English village. ‘Getting the reins off their necks’, Bradford said, some young people joined the army, some went off to sea and others took ‘worse courses’. Those young people who remained in the faith, he said, joined in their parents’ labors and were incapacitated by them. The leaders feared that within years their church would collapse. America, they imagined, would allow them to create an English village 3,000 miles from the nearest bishop, returning to the agricultural life they yearned for, and escaping the fleshpots of Leiden and Amsterdam.

 

A sense of failure in their mission added to the Brownists’ discontent, according to Winslow, who lamented ‘how little good we did, or were like to do’. He had hoped that the purity of their church would inspire the Dutch to a new reformation, but they ignored them. The English paid more attention, and many were impressed by the Brownists’ example, but too few were willing to join them in the Netherlands, and fewer stayed. Admittedly, America would offer still fewer opportunities to convert the English, but perhaps, Bradford said, they would be God’s instrument for converting the Americans.

 

On top of all the difficulties of life in the Netherlands, the English feared that it was about to get worse. The Spanish truce expired in 1621, and it seemed likely that the war would resume. This was a serious consideration, though war with Spain had not stopped the church coming to the Netherlands in the first place. 

 

Another motive Winslow mentioned for going to America was, paradoxically, to stay English. The younger generation had gradually integrated with Dutch society, and it was clear, as Winslow put it, ‘how like we were to lose our language and our name of English’. This undermined their mission to model reform to the English churches. If the Brownists did survive in the Netherlands, becoming just another Dutch denomination would defeat their object in being there.

 

There is, however, something intriguingly inconsistent and incoherent about the reasons the colonists gave for their decision. Although Bradford’s main explanation was that they were driven to America by the grimness and poverty of life in Leiden, which lay behind all their other problems too, yet he says elsewhere: ‘At length they came to raise a competent and comfortable living’; and though it was not easy at first, he adds, ‘(after many difficulties) they continued many years, in a comfortable condition’. He describes their departure from ‘that goodly and pleasant city’. Above all, he says they looked to America as a ‘place of better advantage and less danger’ than Holland, but also says that they were aware of its ‘inconceivable perils’. ‘The miseries of the land ... would be too hard to be borne.’ It would be likely ‘to consume and utterly to ruinate them’. They feared famine and nakedness, ‘sore sicknesses, and grievous diseases’. They had heard terrifying accounts of the indigenous Americans torturing people to death. They were aware of how many settlements had failed and how many lives had been lost. Add to all this the perils of the journey itself, and the frailty of the travelers, and one has to wonder in what sense exactly Bradford was using the phrase ‘less danger’.

 

When members voiced their serious doubts about the expedition Robinson and Brewster proposed, the reply, as Bradford has it, is fascinating: ‘All great and honorable actions are accompanied with great difficulties; and must be both enterprised, and overcome with answerable courages.’ The reason they needed to leave Leiden, supposedly, was to escape its hardships; and when critics of the plan worried that America would have great hardships, they were told that this great enterprise was worth great hardships. So what was the enterprise, if not to escape hardships?

 

The colonists seemed to have difficulty articulating exactly what compelled them to leave the Netherlands for America. They had a sense of an important enterprise, which, though they might rationalize it in terms of escaping the difficulties of life in Holland, did not meet that objective terribly well. Their calculation of the risks and imperatives does not add up unless we remember how ingrained it was for the Separatists to see their story in biblical terms, as following biblical patterns. They saw their reflection in countless scriptural parallels, but above all in the exodus – God leading the new children of Israel out of the bloody and antichristian land of their birth, to a place he had prepared for them. From Browne’s first writing, to Helwys’s last, the Separatist equation was repeated (critically in Helwys’s case): ‘England was as Egypt’. They had been delivered and there could be no going back. The Netherlands, however, had not felt at all like the Promised Land, which meant that instead they were in the wilderness, the stretch of wandering and learning that occupied Israel before reaching Canaan. They were God’s ‘little church, fled into the wilderness’, in Ainsworth’s words, or, as Johnson put it, they had followed Moses’ path, rejecting the treasures of Egypt ‘to suffer adversity with the people of God’; they were ‘but strangers and pilgrims’.

 

The Separatists were prime examples of the fundamentally forward-looking nature of Protestantism – ‘a religion of progress’, as Alec Ryrie puts it, ‘of restless, relentless advance toward holiness, not of stagnation’. Each stream saw itself as the culmination in a century of progress towards truth and obedience, while still challenging its own orthodoxies, their church covenants explicitly committing them to embrace God’s future revelations, to be ready to take the next step. Individual members had once walked miles to acceptable churches, then moved to London maybe, then sailed for Amsterdam, then moved again to Leiden or Emden. If their dreams had failed to come true and no simple way forward presented itself, then their whole religious experience and outlook told them God would reveal a new path ahead. Satan called them back, the Lord called them on. And if the way forward involved a terrifying leap of faith, ‘yet might they have comfort in the same’. Whatever America might entail, it was not retreat; it was not going back to Egypt.

A Personal History of Vietnam War Refugee Policies

 

When the war ended in Vietnam in 1975, America was completely unprepared for the sudden influx of refugees who came to the United States from South Vietnam. There was no policy for letting them in. There was no policy to keep them out. The Vietnamese who came did so because they felt they had no other option. 

 

That first group to arrive was a diverse lot. Some worked for the American press. Others were in the employ of big American companies with construction and communication projects in South Vietnam. Some were professionals -- lawyers, teachers, doctors, and nurses. Some worked for South Vietnamese non-profits. There were cooks and dishwashers, street cleaners and soldiers. Many were family members who feared reprisals from the North Vietnamese. Their backgrounds and occupations may have been different, but they all shared one overriding goal: they did not want to live under communism and the domination of Hanoi and the Viet Cong, who were equally hated and feared.

 

Throughout its history, America has seldom been kind or overly hospitable to refugees. Despite the millions who came in the late 19th and early 20th centuries when our doors were open without prejudice, there has always been an unfounded fear that newcomers would change the American way of life. After World War I, nationalists, always a powerful factor in American politics, decided enough was enough. They wanted to stop the flow of Italians, Jews, Irish, Germans, and Chinese into the U.S. To mollify the nativists, Congress enacted laws that created strict quotas and tough strictures on immigrants. Deportations of immigrants, mainly from Mexico and Central America, are also nothing new. For example, between 1931 and 1940 America deported more than 500,000 Mexican-Americans back to Mexico.

 

In the case of Vietnam, however, these strictures were relaxed, perhaps because of guilt over the way the war had ended. 

         

*****

Josephine Tu Ngoc Suong was born in Saigon in 1942 during the Japanese occupation. She had worked for many years for NBC News before I arrived in Saigon in 1966. We married in December 1968 in a simple ceremony at the Hong Kong city hall. Josephine's father, Tu Hong Phat, had lived through the long French occupation, the short Japanese takeover during World War II, and then the return of French domination until the fall of Dien Bien Phu and the new dominance of America. Through hard work he carved out a comfortable life for himself and his family, who lived on a typical, unpaved street with a market and a Catholic church in central Saigon.

 

Before the war was over, Josephine and I were living in Rockville Centre, New York. She and her father wrote many letters to each other. I recently found a few surviving letters her father wrote about what he observed in Saigon as the war was ending. The letters give us a rare insight into what an average Vietnamese man thought about the war.

 

There were major battles on the days when he wrote. Fighting and chaos were everywhere. Most of Mr. Tu's letters were realistic, well informed and filled with information about the war, how Josephine's three brothers were faring and how the family was dealing with food shortages. 

 

He thanked Josephine for packages of food that she sent weekly. He told her how he sent his wife, her mother, Nguyen Thi Ba, to purchase 500 kilos of rice, dry shrimp, and dry Vietnamese sausage that could last as long as five months if the military situation grew worse and there was a shortage of food. 

 

He wrote "The military situation in the south of Vietnam is in an extremely serious stage. Numerous events have occurred recently, never seen in the history of this war." It was surprising to see how well informed Mr. Tu was about the war. I never trusted the local press in South Vietnam, but Mr. Tu obviously knew how to parse the information, real and rumored, from a myriad of sources. 

 

Mr. Tu wrote about the provinces in northern South Vietnam that had recently fallen to Hanoi: "The communists have taken this opportunity to change their guerrilla fighting over to tactical and strategic fighting using main force troops in strength. Our South Vietnamese forces often withdrew its troops before clashing with the enemy." He said, "according to the latest sources, the northern provinces are in a shambles with over 10,000 North Vietnamese troops fighting near Binh Dinh." He told Josephine "refugees keep flowing deeper south even as far as the southern resort of Vung Tao." 

 

At the time of this letter, the ARVN, South Vietnam’s army, was on its own. American troops had left the country almost two years prior. South Vietnam's ground forces were not doing well without American support, hardly a surprise to most observers.

 

Toward the end of one long letter Mr. Tu became political, voicing some of his thoughts about life in Vietnam as the war was ending. "In brief, the people who are living under freedom could have a happy life if they worked hard. Living in a free society, the people have all their rights, while the communists have none. People under communism must listen to the Communist party if they want to live. If the communists appear anywhere, the place is soon ruined and damaged. No one can forget the VC general offensive during the New Year of 1968. Thousands of innocent people have been buried alive at Hue, the former Imperial capital. That is why people are fleeing the communists. But the present regime in South Vietnam is not much better since there is much injustice and corruption in the military and administrative machinery. Evil."

 

As the war was ending, NBC decided it had a moral obligation to bring to America any Vietnamese employees and their families who wanted to come here. Josephine's family never hesitated. With the help of NBC News they were part of the first wave of 125,000 immigrants who came to the U.S. in 1975. The whole family, except for Josephine’s three brothers who were in the South Vietnamese military, left.

 

The Vietnamese who successfully fled Vietnam were housed in refugee camps in the United States, on Guam, or in the Philippines. Josephine’s family members were sent to California and Arkansas.

 

Meanwhile, as Saigon was falling in 1975, the brothers were at the Bien Hoa Air Base outside Saigon. Unsure of their next move, they decided to make their way to the family home.

 

Their friend Xuan was still serving in the navy as the situation in Saigon deteriorated. Xuan said that "patrols on the water were normal until the morning of April 28 when the VC opened a heavy offensive against the Thanh Giang Bridge," the biggest span on the highway into Saigon. A battle ensued with no definitive result. 

 

After that battle, Xuan drove with his commander to a meeting in the Rung Sat jungle riverine force zone. His commander had an order to evacuate all his navy ships and he asked Xuan if he would like to go with him. Xuan agreed to flee, but he had more to do. He took a chance and went to the Tu home, where he found Khiet, Khai, and Quan waiting for something to happen. It did. The brothers agreed to come with Xuan on the navy boat. Unfortunately, Xuan’s parents didn’t want to leave Saigon. "My parents still would not leave. My father would not budge," Xuan recalled.

 

Saigon was about to fall and the Viet Cong were everywhere so the men could no longer delay their departure. It was time to get moving. After securing extra food for his family and saying goodbye to his mother and father, Xuan and the three brothers made their final scooter ride to Nha Be where the patrol boats were docked at the rendezvous point ready to go. 

 

As they fled, Xuan recalled, "Outside the main streets of central Saigon, the city seemed as a dead city. No houses or stores were open. People stayed inside unsure of what would come next." It was April 29, 1975, one day before Saigon fell and the war officially ended.

         

After briefly staying in a small house with about fifty other people, they boarded the ships. Worried they might have to fight the enemy to escape, the men mounted weapons on the ship’s deck. Luckily, the small fleet with its lights turned off, escaped undetected. Xuan remembered a moving, tearful ceremony as the crew lowered the Republic of Vietnam flag as the passengers said goodbye to their homeland. After seven days at sea they arrived in the Philippines. From there they transferred to a large commercial ship that took them to Guam. They spent two weeks there before being flown to Fort Indiantown Gap, Pennsylvania. 

 

While the family was still settling into our house on Long Island, I had to take up a new assignment for the Today Show in Washington, D.C. Josephine and I flew to D.C. to look for a place to live. While house hunting, we received a 4 A.M. phone call from Josephine’s 23-year-old brother Quan. He was calling from the National Guard camp at Fort Indiantown Gap. Quan, his brothers Khiet and Khai, and their friend Xuan told us they had ended up there after a long journey at sea.

 

Knowing Josephine’s brothers were safe gave added impetus to our house hunting. The day after the 4 A.M. phone call, Josephine and I rented a big home in suburban Maryland that would take care of the family at least temporarily. That Sunday, Josephine and I drove to Pennsylvania to find her brothers and their friend.

 

When we arrived at Indiantown Gap, I told the guard at the gate what we wanted and he unhesitatingly directed us to the administrative building. I entered the office, explained what I wanted to the desk sergeant, and handed him a piece of paper with the names of the men we wanted to take home with us. He grunted in assent, sent a corporal to find the men and soon they were in the office, hugging Josephine, laughing and crying as I signed them out into my custody. 

 

We walked back to the car and I drove to the first Burger King I saw. Everyone ordered burgers, fries and Cokes. Their first real meal in America and, of course, it had to be the fastest of fast food, the most iconic welcome they could have had. They ate, and kept eating. 

         

When we returned to Rockville Centre (we wouldn’t move to Maryland for a few more weeks), there was no negative reaction to the sudden influx of refugees in my Long Island town. Donations of clothing from the two Jewish temples and the Catholic diocese started arriving on our front porch. People cooked meals for us and volunteered to drive family members to stores and medical appointments. Others volunteered to help Josephine in any way she needed. Kindness prevailed. The community outdid itself welcoming the newcomers. 

 

Josephine's three brothers, their newly adopted cousin and her father were still wearing Saigon-style sandals, leather thongs sewn into soles made from discarded rubber tires. They needed something better for their feet, better suited to the colder weather soon to come. I piled everyone into the station wagon and drove to a Shoetown store in Baldwin, one town over, and told the startled clerk that I wanted sneakers and sweat socks for everyone. The men sat down, tried on various sneaker styles, decided what they would wear, and walked out of the store on newly clad feet, happy with their selections and their deeper inclusion into American society.  

 

Josephine’s brothers, their friend and her father started helping around the house: washing the bathrooms, cleaning the floors, washing the windows, cutting the grass and upgrading the flowerbeds. Inside and out, the house and yard never looked better. 

 

Then the whole family was off to Maryland so I could start my new job. As we pulled up to our new home, one of our neighbors watched from behind his curtains. Later, he told me he thought he was witnessing a benign invasion by a delegation from an Asian country. The fact that Josephine’s female family members were wearing their traditional dress, the Vietnamese ao dai, only heightened his impression. 

 

Once settled, I started my new job, driving to work every morning at 5:30 A.M. Meanwhile Josephine helped her family adjust to their new home, finding work for everyone so they could be independent. 

 

The men had no definable skills that translated easily into work, which made it hard to find jobs for them. Through a personal contact of mine, we were able to get Josephine's three brothers, their friend, and her father jobs cutting grass, cleaning flowerbeds, and watering lawns. The pay was low, but it was in cash and in those warm weather months they could spend their time outdoors. Soon they started to bring home money and feel more secure in their new life. 

 

It took more doing to find work for the women, who also had no transferable skills. But they were not afraid of menial labor. Josephine made a contact with a local Ramada Inn. They got jobs cleaning rooms, making beds, and washing bathrooms. These were not cash jobs but the accumulated paychecks gave the women a sense of security and worth. 

 

I must emphasize that the strangers who helped my family did it with a smile, unlike today, when many Americans regard immigrants with disdain and anger. Neither I nor the family recalls their having suffered because they were refugees. 

 

Once Josephine had secured jobs for everyone, her next step was housing. Her family had money coming in and collectively they could afford to move to a place of their own. However, that proved to be difficult. Buying was impossible. Renting was also difficult because none of them had a bank account. None of them had credit cards or Social Security numbers. They had never had a telephone, even in Saigon. Neither had they ever paid an electric bill. In other words, not one of the family had a record of having spent any money in their new country. That would eventually come, but not fast enough to get them their own home. However, Josephine persevered and eventually found a town house the family could rent. She and I vouched for each family member individually and we provided the money for the security deposit. 

 

In a matter of weeks, the family moved into their new home in a lower middle-class development in Gaithersburg, a few miles from where Josephine and I lived. The first night in my Potomac home without them was strange, eerily quiet, as if our basement had never been the home for those who had fled the communist takeover of South Vietnam leaving their old lives behind. 

 

Josephine’s family continued to work and save money. They got driver’s licenses, bought cars and relieved Josephine of her duties as a chauffeur. As soon as they could, the family moved into a bigger house. Here their story takes a new turn. Josephine’s mother, Nguyen Thi Ba, had been a well-known cook in Saigon who ran a small breakfast and lunch restaurant in the front courtyard of their home. 

 

Now she opened a clandestine, illegal restaurant in the front parlor of her house where she served home-cooked Vietnamese food to the growing community of refugees starting to populate Maryland, Virginia, and Washington. Some neighbors complained, causing the local board of health to occasionally shut the restaurant down, but the closures never lasted long. Even the health inspectors grabbed a quick meal of pho, the hearty signature soup of Vietnam, or a plate of noodles and chicken in plum sauce. The food was too good to pass up. 

 

Josephine’s family had always wanted to own a restaurant. They soon realized their dream. With the money they had been saving the family opened the first of its two restaurants, Taste of Saigon, in Maryland and then Virginia. Their aim, with Josephine as their mentor, was to stand on their own and not be a burden to anyone. In their life as restaurateurs, they were starting to reach their goal.  

A Civil War Heirloom, Ancestry.com, and the Importance of Tracing Our Family's Historical Roots

 

My grandfather, Leo “Butch” Armbruster, was a railroad engineer and a man of exceptional frugality.  Famously in our family circle, when feeding an infant grandchild, he would eat whatever was left in the Gerber jar when we were full.  The shanty behind his house bulged with corroded pipes and faucets, old eyeglasses and tobacco tins brimming with bits of hardware, and the sleds and ice skates of his now-grown children.

 

The real treasures were in his attic.  As the grandson who lived closest, who spent the most time with him, and who he knew loved history, I was favorably positioned to get occasional glimpses of, and sometimes to handle, those treasures.  And on one very special day in the mid 1960s, when I was a high school senior, I got to call his best treasure my own.

 

The treasure is a Spencer repeating rifle.  Known as the “Seven-Shot Wonder,” this invention of one Christopher Spencer was the world’s first military brass-cartridge repeating rifle.  The Spencer Repeating Rifle Company, and the Burnside Rifle Company under license, together manufactured more than 200,000 of these “Wonders” between 1860 and 1869.  Mine is engraved, “Spencer Repeating Rifle Co., Boston, Mass., Pat’d March 6, 1860.”

 

Boasting a 30-inch barrel, the Spencer was accurate to 500 yards.  Seven 13mm (approximately half-inch diameter) cartridges were loaded into a spring-loaded tube that slid into the buttstock. In the hands of a well-trained Union soldier, the Wonder could spray the enemy with up to 20 rounds per minute.  Each shot involved levering a cartridge into the chamber, simultaneously dropping the spent brass to the ground, then pulling back the hammer, aiming and firing. That was not much effort per shot compared to the gymnastics needed to pour powder and ram a ball down the muzzle of a musket, seat a percussion cap, and get off a (less accurate) shot.

 

In an excellent example of bureaucratic bone-headedness, the U.S. Department of War’s Ordnance Department delayed putting the Spencer into the field for fear that soldiers would waste too much ammunition.  Moving all that ammunition would require more mules and wagons. And you could buy several trusty Springfield muskets for the price of just one Spencer. Consequently, not until the Battle of Gettysburg did the 13th Pennsylvania Reserves demonstrate the withering worth of the Wonder. (1)

 

The Spencer’s performance at Gettysburg caught the attention of President Lincoln, who gave Spencer a chance to demonstrate his invention against the best musketeers.(2)  After that, the rest, as we say, was history. The Confederates were not only outnumbered; they were pathetically outgunned.

 

Time passed

In the fall of 1965, I went off to college.  To my lasting shame, I pretty much lost touch with my grandparents.  My Spencer gathered dust in my old room at home, while I majored in fraternity at Franklin & Marshall College.  My “Pappy” passed away at 86 during my college years.  The remaining treasures in his attic got divided up among his kids.  In 1969, joining the Coast Guard after graduation to steer clear of Vietnam, I moved away from home for good.

 

The Wonder went with me, first to Cleveland, Ohio, and later to Austin, Texas, where I taught business law after getting my doctorate and law degree. Sometimes it got displayed, sometimes shoved into a closet.  All I knew of its original owner from Pappy was that his name was Aaron Henry and he hailed from the Scotch-Irish side of my grandfather’s family.  Pap’s mother’s maiden name was Morrison.  Pappy claimed Aaron was his uncle, which made him my great-great-uncle.  The old veteran had retired to the Armbruster family farm.  Having been wounded in the knee, he remained in Pappy’s memory as a geezer who limped around the property until he finally passed away.

 

When my wife and I --- the family prodigals --- finally moved back to Pennsylvania in the 1980s, I made a few desultory attempts to track old Aaron down.   Pennsylvania’s Civil War military records weren’t known for their ease of access. Caught up in raising a family and pursuing a law practice, I wasn’t known for my persistence in such matters as genealogy.  

 

Enter Ancestry.com

 

As one ages, ancestry, like an old friend long taken for granted, often gains in importance.  My retirement from Rider University in June 2019 was coincident with my son’s Christmas gift --- a six-month subscription to Ancestry.com --- going “live.”  He was wise enough to know that with more time on my hands, this gift would be appreciated.

 

Thanks to the vast archival resources Ancestry has pulled into its tent, I was able to find seven “Aaron Henry” entries in the annals of Pennsylvania regimental records.  From there, it wasn’t hard to narrow the quest down to my great-great-uncle. Born in Mauch Chunk (now Jim Thorpe, my hometown) on November 12, 1837, he was baptized a couple of weeks later in the town’s First Presbyterian Church.  By 1850, 13-year-old Aaron was living with his mother, her second husband Alexander Craig, and a trio of half-sisters.  

 

Aaron, it seems, never left Mauch Chunk until, at 23, he volunteered for a three-month enlistment with the Pennsylvania Sixth Regiment on April 22, 1861.  Note that the bombardment of Fort Sumter, the first battle of the Civil War, occurred on April 12th and 13th.  Young Aaron wasted no time in joining up.

 

In fact, Dr. Gregory J.W. Urwin, an esteemed military historian and Civil War reenactor on the faculty of Temple University, tells me, “You can consider Henry one of the most ardent of Pennsylvania’s patriots by his rushing off to fight as soon as war broke out.”(3)

The Battle of Bristoe Station

According to Ancestry.com, on October 14th of the war’s third year Uncle Aaron took a musket ball in the knee that he carried for the rest of his life.  Presumably, it was either more efficient or less dangerous to leave the ball in situ than to risk removing it and losing the lower leg in the process.

 

Urwin noted, “If he was wounded at Bristoe Station on October 14, 1863, that means he then joined a regiment raised to serve for three years.  The fact that he wanted to see the war through to its conclusion also says something about his character.”

 

The prelude to the Battle of Bristoe Station was the nadir of the Union army’s fortunes, a trough of incompetence from which the Yankees only began their painful climb at Gettysburg.

 

The depth of Uncle Aaron’s patriotism, as Dr. Urwin surmised, is underscored by the fact that, during the winter and spring of 1863, the Army of the Potomac was enduring some 200 desertions a day.  On New Year’s Day, encamped on the northern bank of the frozen Rappahannock River, the troops hadn’t been paid for six months. Living conditions were deplorable, facilitating the spread of vermin and disease.(4) No wonder men lit out for home!

 

Worse was to come in the first half of 1863.  Following a desultory foray in the enemy’s general direction that ended in a spring-mud debacle, General Ambrose Burnside --- the previous year’s loser at Fredericksburg --- was replaced by Joseph Hooker.  Apparently a better quartermaster than tactician, Hooker cleaned up the camps, saw his army fed and paid, then marched them off to a crushing defeat at Chancellorsville.

 

While Hooker’s incompetence and timidity cost the Army of the Potomac some 17,000 men, Lee’s brilliant win cost the Confederacy more than 13,000.(5)  In a war of attrition, the Union could absorb such losses far more easily than the Confederacy, even allowing for the desertions and the Copperhead agitation that was stirring northern, and especially mid-western, anti-war sentiment. 

 

Unable to sustain a war of attrition, Lee led his army north into Pennsylvania.  In the first days of July, he met George Gordon Meade, Hooker’s successor, at Gettysburg.  The three-day engagement is often called the “high water mark of the Confederacy.” Lee would never have the initiative again.  Still, nearly two more years of war lay ahead. 

 

In the grand sweep of all the war that still remained --- an ordeal extending from the aftermath of Gettysburg in July 1863 to Appomattox in April 1865 --- the Battle of Bristoe Station is a mere burp of a battle.  The October 14, 1863 encounter produced only 380 Yankee casualties and 1,360 Rebel dead and wounded.(6)  The facts are straightforward.  After Gettysburg, Meade and Lee played a chess game in northern Virginia.  In early October, Lee stole a march on the less agile Meade, forcing the Army of the Potomac to fall back from its southernmost advance to protect its flanks.

 

Major General Gouverneur K. Warren’s II Corps had been stung by Confederate General J.E.B. Stuart’s cavalry at Auburn (VA) on the 13th.  Warren was faced with pushing Stuart aside while maintaining an orderly retreat from the Rebel corps commanded by Richard Ewell. He chose Bristoe Station, a stop on the Orange and Alexandria Railroad line, to face the music.

 

Lt. General A.P. Hill’s Confederate III Corps reached Bristoe on the 14th. Warren deployed his troops behind a railroad embankment.  From that hidden vantage, he ambushed Hill’s troops as they rushed to catch up with the Yankee rear guard.  The encounter was a Union win.  But with Ewell’s Confederate troops advancing on his left, Warren had to rejoin the general retreat.(7)

 

Aftermath

 

Warren got a promotion out of the encounter.  He is remembered to the present day, at least by Civil War buffs, and is the subject of a relatively recent biography.  His official photograph survives.

 

First Sergeant Aaron Henry of the 81st Pennsylvania Regiment went home to Mauch Chunk. According to Ancestry.com, he married Sarah Johnson on January 10, 1867.  She died less than a year later, apparently in childbirth.  Their daughter, Jennie Henry, survived and lived until 1937. 

 

In 1870, Uncle Aaron took up blacksmithing in his hometown.  He remained single until 1884, when at 47 he married the 34-year-old Amelia Hahn, with whom he had a son, Garfield.  They lived for a while in Franklin Township, a Carbon County hamlet.  Exactly when he moved to the Armbruster farm is unclear.  In 1901, aged 64, Uncle Aaron decamped to a veterans’ home in Hampton, Virginia.  Nine or maybe ten years later, he returned once again to Carbon County, where the disabled vet died on January 10, 1912.

 

Unlike General Warren, Sergeant Henry left behind no portrait.  I don’t even know which knee bore the musket ball.  He was one of three million Americans who fought in the War Between the States, 600,000 of whom died.(8)  Thanks to intrepid photographers with their bulky and primitive equipment, a tiny fraction of these soldiers were captured on glass plates.  For all I know, Uncle Aaron is somewhere among those unidentified fighters.

 

That he brought home his rifle was probably not so unusual.  My father’s generation returned from World War II with all sorts of memorabilia.  As kids, my brother and I played with a disarmed Japanese mortar shell, among sundry other authentic souvenirs.  My Uncle Albert had his .45 pistol.  Pappy told me the Spencer had been used for deer hunting as recently as the 1930s and it still seems to be in good working order.

 

So what?

 

So… now, at last --- more than a half century after my grandfather gave me his best treasure --- I know the broad details of the life and labors of the man who carried it into battle.  What does that matter?

 

The reason I think it matters… the reason I have written about it… is the disruptive moment in which we and our nation find ourselves. Not since Uncle Aaron answered the call in April of 1861 has the nation been so starkly divided. Not since then have we heard each side shout across the divide that the “others” aren’t worthy of life itself. If we are going to raise our eyes from the abyss, gaze across it and acknowledge our fellow Americans, I believe we must first look to ourselves.  What are the personal stories that teach us that American democracy is greater than our transient differences?  For me it’s knowing that nearly 160 years ago, my great-great-uncle rallied to the Union cause.  The “Wonder” on my shelf reminds me daily of my own roots.  

 

I am convinced that Churchill could not have faced the existential threat of the Third Reich and rallied his nation had he lacked his deep appreciation of British history and of his family tree’s place in that history.

 

“If this long island story of ours is to end at last, let it end only when each one of us lies choking in his own blood upon the ground.”

 

That sentence illustrates for me how he masterfully combined the sweep of a collective historical experience with the immediate challenge facing each individual Briton, requiring each man and woman to look inside themselves.

 

I believe each of us is well advised to tear ourselves away from the ephemeral distractions of social media, presidential tweets and TV’s talking heads, and take a little time to recall those ancestors who contributed to making each of us an American.  It matters not whether, as in my case, one can look back a century or two, or whether one must look to the history of another nation that drove a decision to emigrate to the U.S. 

 

As with my successful search for Uncle Aaron, retracing the taproot of what made each of us an American can be a profound reminder of why we must put our democratic republic ahead of transient sectarian differences and deal with tomorrow’s existential challenges as, collectively, the American constitutional democracy.

 

(1) Philip Leigh, Lee’s Lost Dispatch and Other Civil War Controversies (Yardley, PA: Westholme Publishing, 2015) at 25-36.

(2) John Walter, The Rifle Story (London: Greenhill Books, 2006) at 69.

(3) Gregory J.W. Urwin, “Re: Union Pennsylvania Volunteers,” message to James Ottavio Castagnera, August 8, 2019, via gmail.com.

(4) Geoffrey C. Ward, The Civil War (New York: Alfred A. Knopf, 1990) at 184.

(5) Ibid. at 210.

(6) David M. Jordan, Happiness Is Not My Companion: The Life of General G.K. Warren (Bloomington: Indiana University Press, 2001) at 108.

(7) Ibid. at 110.

(8) Ward, op. cit., at xix.

Little Women Are Bigger Than Ever

 

The Civil War was a difficult time for all American families, North and South. It was especially tough for the March family of Concord, Massachusetts. A mom and four daughters ran the household and worked at various full- and part-time jobs while their father was in the Union army. Their story was recorded as fiction in the autobiographical novel Little Women, by Louisa May Alcott (it was her family). It is the story of Jo March, the second-eldest sister and a writer, her professional and emotional ups and downs, and the struggles of her family. It is one of the country’s most beloved novels.

 

I believe I am the only living American who has never read the novel or seen any of the many movie adaptations of it. There was an elderly man who lived in the foothills of the Rockies who never read the book either, but I think he has passed  away.

 

So when I walked into my packed neighborhood theater to see the film (passing the 334 million people on line to see the latest Star Wars), I did not know what to expect. It seemed from the pre-show chatter that everyone in the theater had read the book at least four times.

 

The movie unfolded, very slowly. The first third of the film, which opened nationwide last weekend, is very disjointed. I was not sure which woman was in love with which boy, where the dad was, who was getting sick, which neighbors were friends, and what Meryl Streep was doing playing some aunt with a great deal of money. I was confused and just did not know why the book and previous movies had become, well, immortal.

 

Then, by the middle of the movie, the plot started to develop and the characters, the sisters, bloomed. From that moment on, they had me. I was enthralled with the story and the sisters. I felt like yelling at the screen to tell the sisters to do something about a situation because I so desperately wanted to help them. The new Little Women, marvelously directed by Greta Gerwig, is a tremendous movie, an all-engulfing emotional roller coaster that displays not only the March family, with its triumphs and tragedies, but the lives of women in the middle of the nineteenth century and how they all fit into the story of the Civil War, which at the time of the story was ripping the nation apart.

 

Little Women, a Sony Pictures film, succeeds on several levels.  First and foremost, it is a loving look at several very admirable young women and the problems they had to grapple with in that era. They fight with each other, fret with each other and, most of all, love each other. They march down the street together, sit in the theater together, shiver in the New England cold together and, in the end, in several ways, triumph together.

 

Director Gerwig, who also wrote the screenplay, has made this a very modern look at life in the 1860s. There are several well-crafted dialogues from the girls about how unfair a woman’s lot was in that era – unable to do just about anything because of gender discrimination. Why does their goal in life have to be finding a husband and not finding a career? Why do men look at them as cooks and servants and not as emotional partners?  Why was the best thing a woman could do to bear several children rather than hold several jobs? That dialogue is clear and reflects the plight of women today, too. Gerwig’s screenplay is as much 2019 in tone concerning women’s roles in the world as Alcott’s story was about them in the middle of the 19th century. The screening I attended was filled mostly with women; men should see this movie, too, to learn something.

 

My complaint about the story and the movie is that while it is set during the Civil War and follows a soldier’s family back home, there is not much about the war. There is a scene in which Concord residents provide blankets and food for returning soldiers, several descriptions of losing loved ones, some worry over the dad’s safety and, later, his sickness. Jo even cuts her hair for money to help get her dad back home. The film should have offered more about the war. It should have noted, too, that life was just as hard on southern families as on northern ones. For every March family in the Union, there was surely one in the Confederacy.

 

The roles of the women are clearly delineated.  Jo is the sister who writes plays and novels but doesn’t get anywhere. She guides all the others, who find husbands despite declarations that they don’t want to be just wives and moms. Mom has nothing but problems with the girls, always dispensing advice about life and love that is just as valuable today as it was then. She is the rock the girls stand upon while dad is in the army.

 

In the end, life turns out reasonably well for the girls and for the Union, too. You learn a tremendous amount of history in the film – church, economics, writing, farming, universities, modes of travel, weather, fashions and Christmas breakfasts and dinners for rich and poor. Oh, the piano, too. It is an enchanting look at life in the mid-19th century, and it is full of worry and travail, too, just as life is today.

 

Director Gerwig gets superb performances from her village of actors. The star of the story, writer Jo, is played with wonder and awe by Saoirse Ronan, who keeps everyone on the edge of their seats as she barrels through life. Other fine performances come from Emma Watson as Meg, Eliza Scanlen as Beth and Florence Pugh as Amy.  Laura Dern is America’s mom as their mother.  Bob Odenkirk, musket over the shoulder, is the dad, and Meryl Streep is the very rich and very eccentric aunt. 

 

The film starts after the war, with Jo trying to sell her work, and then flashes back to the war’s start, when dad marches off and leaves the girls behind. The story is both Jo’s hard work as a writer and the life of the girls together, with a few tragedies tossed in. Its strength is the love of the girls for each other, even though, in anger, they do awful things to each other.

 

These are big sisters for the little women of the story.

Somewhere Over Their Rainbows - Deanna Durbin and Judy Garland

 

Do you remember actress Deanna Durbin? If so, you are one of the few.

 

How about Judy Garland? Well, of course you do - Dorothy from Oz.

 

In the late 1930s and 1940s, Deanna and Judy, just teenagers, were two of the biggest stars in Hollywood. Deanna was not only a superb actress, but as a singer had the voice of an angel. Judy had, well, Judy had it all.

 

Judy stayed in Hollywood, led a tragic life and died of a drug overdose at 47. Deanna fled the bright lights and cameras at age 29, stunning the world; she moved to a farmhouse in France, became a recluse and never appeared in another film. Over the next 62 years, she gave only a single media interview. Two careers, two lives and two distinctly different stories.

 

Actress/singer Melanie Gall has merged the two stories into one, Ingenue: Deanna Durbin and Judy Garland, and the Golden Age of Hollywood. The one-woman play just opened at the Soho Playhouse, on Van Dam Street in New York. Gall, who also wrote the drama, has done a fine job. It is an eye-opener of a tale and an absolute treasure chest of show business history. Gall plays Deanna and brings in Judy’s story through an interview with an invisible New York Times reporter. It is Deanna’s story, not Judy’s, and she sings Deanna’s music and not Judy’s (except for Somewhere Over the Rainbow).

 

I knew a little bit about Durbin, the Canadian-born singer who rocketed to fame by the age of 15, but not much. Nobody knows very much about her. When Durbin fled Hollywood, she not only never appeared as an entertainer again but pulled the plug on most of her American movies, and you can hardly see them anymore (ironically, a Durbin movie and a Garland film were playing at the same time on television last week).

 

The legend was that the two, who starred in a movie together in 1936, were lifelong bitter rivals, but really, they were not. There may have been some jealousy between them, but I doubt they were enemies. Gall, in her story, suggests the latter version, and points out that Garland thought Durbin was shortsighted in leaving the movies and wished she had remained.

 

Gall tells a fascinating and colorful story. Durbin came to Hollywood as a kid, like so many others, but had a great voice and won a $100-a-week contract with MGM. There, she met Garland and the two became close friends. Movie mogul L.B. Mayer did not think he needed two child stars, so he fired Durbin (the play suggests that might have been accidental and he may have wanted to boot Garland). Durbin, at her new studio, Universal, became famous right away, and her first few pictures were so successful that they saved the studio (Deanna starred in 21 movies in her storied career). Judy caught fire with The Wizard of Oz and became immortal. The rumor was that MGM wanted Deanna for the role of Dorothy in Oz and that she auditioned for it, but refused it because Judy wanted it.

 

Gall tells the audience that Durbin was probably the better singer, but Judy had more hits. However, film historians seem to agree that in that era Durbin was one of the most beloved actresses in the world. In 1947, she was not only the highest-paid actress in Hollywood but the highest-paid woman in America.  That year her fan club was the biggest on earth. American GIs in World War II even named a bomber after her. The Metropolitan Opera courted her to join its company. Deanna was also Winston Churchill’s favorite actress.

 

So why did Durbin become a recluse? Gall says she was tired of Hollywood, found fame tedious and wanted to live a normal life. That can’t be all of it, though. Others say she hated the studio system’s dictatorial control of a performer’s life, believed her career was over at 29, as it was for many actresses, and resented never being cast in very serious roles (the directors always had her singing something somewhere in the script).

 

Gall is quite good playing Durbin, and she is a superb singer. The problem is that it is a play about Durbin and Garland without Garland. It would be much better as a two-character play, and it should be a bit longer (it’s just a little over an hour). Gall carries the play well, but you really need a richer story and more nuance about Judy’s life.

 

Also, nowhere in the play is there any reference to Ray Bradbury’s The Anthem Sprinters, a delightful short story about a Deanna Durbin movie screened in Ireland, followed by a race of moviegoers to a pub before the cinema starts to play the Irish national anthem at the conclusion of the film.

 

The story needs a far better explanation of why Durbin fled Hollywood. What did her friends say? Show biz buddies? Neighbors? Family? How did her rather wild personal life (three marriages and two out-of-wedlock pregnancies) affect her?

 

She is far better known today in the United Kingdom and Europe than in the U.S. because the actress never cut off her films there. In fact, there is still a ‘Deanna Devotees’ fan club in England.

 

The best part of the play, for me, was the question-and-answer session at the end. Gall is an authority on both women and researched their lives thoroughly. She really illuminated their lives just by answering audience questions. It was there, in that Q and A session, that she dropped her bombshell. It seems that back in the early 1950s, when Durbin had been retired for a few years, the writers of the still-untested My Fair Lady went to see her to convince her to play Eliza Doolittle. She flat out refused. That role, of course, would have made her famous all over again, an international superstar, a brilliant comet racing across the show business sky.

 

The strength of the play is its show biz history. You get a wonderful education in how the old movie studio system worked, how child actors were educated at special studio schools, and how stars had homes built for them right on the film sets. Impressive money could be made, too. The play should be subtitled Show Biz History 101.

 

If you ever notice that one of Deanna Durbin’s movies is on television, a rarity, catch it. This girl could sing!

   

PRODUCTION: The play, originally staged at the Fringe Festival, is produced by the Soho Playhouse. Piano and musical arrangements: Bennett Paster. Graphic design: Christache Ross. The play had a three-week run and will be staged again at a yet-to-be-named U.S. theater.

Profiles in Courage and Cowardice

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive and writes about Jewish refugees in Shanghai.

 

The votes in the House, and soon in the Senate, about impeaching Trump are mostly seen as foregone conclusions. It was clear that he would be impeached on both articles in the House, and it is clear that two-thirds of Senators will not vote to convict. Hidden in these outcomes are many individual dramas for the few members of Congress who position themselves near the middle and who represent districts where elections are in doubt. Their votes represent more than partisan loyalty – they display courage or its absence.

 

Democrat Elissa Slotkin represents a House district in Michigan that had long been in Republican hands and was won by Trump in 2016 by 7 points. She beat the incumbent in 2018, winning just 50.6% of the vote. Like all the Democratic House members who won in districts that had gone for Trump, she worried about how her vote would affect her chances of re-election. She read our founding documents at the National Archives and spent a weekend at her family farm reading the hearing transcripts. When she appeared at a town hall meeting in her district last week, she was jeered by Trump supporters before she said a word. When she announced that she would vote for impeachment, she got both a standing ovation and shouted insults.

 

She explained herself in an op-ed in the Detroit Free Press:  “I have done what I was trained to do as a CIA officer who worked for both Republicans and Democrats: I took a step back, looked at the full body of available information, and tried to make an objective decision on my vote.” She also faced the consequences squarely: “I’ve been told more times than I can count that the vote I’ll be casting this week will mark the end of my short political career. That may be. . . . There are some decisions in life that have to be made based on what you know in your bones is right. And this is one of those times.”

 

Of the 31 House Democrats who won in districts that Trump carried in 2016, 29 voted to impeach him. Collin Peterson of Minnesota, from a district that Trump won by 31 points, voted against. And then there’s Jeff Van Drew, first-term House member from New Jersey, who won in a district that has flipped back and forth between parties, and was won by Trump by 5 points. His 2018 victory was aided by considerable financing from the Democratic Congressional Campaign Committee. In November, he said in a teleconference that he was against impeachment, but vowed to remain a Democrat, which he had been his whole life: “I am absolutely not changing.” Then he saw a poll of Democratic primary voters, in which 70% said they would be less likely to vote for him if he opposed impeachment. Meanwhile, he was meeting with White House Republicans, who promised him Trump’s support. So he voted against impeachment and became a Republican. The next day, Trump asked his supporters to donate to Van Drew’s campaign. Van Drew’s Congressional staff resigned en masse. He told Trump in a televised Oval Office meeting, “You have my undying support.”

 

Somewhere in the middle between political courage and its absence lies the case of Jared Golden of Maine, whose successful 2018 campaign to unseat a Republican incumbent I supported. His district went for Trump by 10 points in 2016, and Golden won only because the new Maine system of ranked-choice voting gave him enough second-place votes to overcome his rival’s lead. His re-election certainly qualifies as endangered.

 

Golden took a unique approach to impeachment, voting for the first article on abuse of power, but against the second on obstruction of Congress. He said that Trump’s obstruction of Congress “has not yet, in my view, reached the threshold of 'high crime or misdemeanor' that the Constitution demands.” Golden wrote a long statement explaining his actions, arguing that House Democrats had not yet tried hard enough to get the courts to force Trump’s aides to testify.

 

I cannot judge Golden’s motives. He said, “I voted my heart without fear about politics at all.” Perhaps his heart feared the end of his political career.

 

But it is worth considering how Trump has defied Congress since he was elected. When Congress refused to appropriate as much money as he wanted to build his Wall, Trump decided to spend it anyway by declaring a “national emergency”. According to the Constitution, only Congress has the authority to decide how to spend taxpayer funds. Federal courts then blocked Trump’s use of other funds. Trump’s lawyers argued that no entity has the authority to challenge in court Trump’s extension of his powers. In July, the Supreme Court sided with Trump and allowed spending for the Wall to proceed.

 

Trump’s defiance of Congressional oversight began long before the impeachment crisis. In February, the administration refused to send Congress a legally required report about the murder of Jamal Khashoggi by Saudi operatives. A Trump official said, “The President maintains his discretion to decline to act on congressional committee requests when appropriate.” In April, he told a former personnel security official not to appear before the House Oversight Committee, which was investigating White House security clearance practices. That month, the Justice Department defied a bipartisan subpoena from the Oversight Committee investigating the addition of a citizenship question to the 2020 Census.

 

Robert Mueller found many instances of Trump’s obstruction of justice in the Russia investigation. Mueller declined to conclude that Trump had committed a crime, only because of a Justice Department memo that claims temporary immunity of a sitting president from prosecution. He clearly pointed toward impeachment as a remedy, and the House impeachment committees considered putting those actions into an article of impeachment. They decided not to, in order to simplify the process.

 

There are many other examples. Jared Golden’s idea that the House should wait and pursue its requests through the courts ignores the unprecedented nature of Trump’s refusal to do anything that the Democratic House requests or demands. Treating each instance of obstruction as a separate judicial case makes no sense; it would make it impossible for Congress to do its job. Jeffrey Toobin of the New Yorker wrote, “Trump will create a new constitutional norm—in which the executive can defy the legislature without consequence.”

 

When John Kennedy wrote (or just put his name on?) Profiles in Courage, he quoted a column from Walter Lippmann, who had despaired of any courage among elected politicians: “They advance politically only as they placate, appease, bribe, seduce, bamboozle, or otherwise manage to manipulate the demanding and threatening elements in their constituencies. The decisive consideration is not whether the proposition is good but whether it is popular.” Yet historian Jon Meacham and political science PhD candidate Michael E. Shepherd write that many Congresspeople who took unpopular votes survived.

 

The great majority of young House Democrats who face difficult re-election campaigns in Trump districts acted courageously. Elissa Slotkin explained what courage looks like: “Look, I want to get reelected. The greatest honor of my life is to represent this district. But if I’m not, at least I can look at myself in the mirror when I leave office.”
