History News Network - Front Page. Fri, 30 Sep 2022 19:36:22 +0000

Around the World, Censorship of Historians is Tied to Attacks on Democracy

Russia's Vladimir Putin and Brazil's Jair Bolsonaro are two world leaders who have linked their own rule to revisionist histories of their nations' past dictatorships.



On 17 August 2022, the Network of Concerned Historians (NCH) published its twenty-eighth annual report. Set up in 1995, NCH documents news on the intersection of history and human rights, in particular the censorship of history and the persecution of its producers. Its focus ranges from issues related to the freedom of historical research and teaching to the right to remember. This year’s report covered 100 countries and documented, among other cases, the political murder of five history producers.


Of these history producers who were killed for political reasons, three were murdered during the unfolding of a military coup. In Myanmar, after the military deposed democratically elected State Counsellor Aung San Suu Kyi, large-scale protests erupted across the country. At one such protest, in February 2021, high school history and math teacher Tin Nwe Yee was hit by a tear gas canister fired by security officials; she suffocated and died of a heart attack. Two months later, amidst violent resistance against the junta in Chin State, first-year history student Felix Than Muan Lian was shot dead by soldiers of the Tatmadaw (armed forces) on his way to work.


In Afghanistan, in August 2021, eleven days before the Taliban took control of Kabul, historian and poet Abdullah Atefi was taken out of his house in the Chora district, Uruzgan, allegedly by Taliban forces, tortured and shot on the street. Atefi was known for his writings on the history of Pashto literature and culture, and was a member of the Afghan branch of PEN International.


The other history producers’ deaths were caused by negligence of the regime. In Iran, in January 2022, film director and member of the Iranian Writers’ Association (IWA) Baktash Abtin died from complications relating to COVID-19, which could have been avoided had he received medical care earlier. He had contracted the disease in Evin Prison in Tehran, where he had been sent in 2019 on a five-year prison sentence relating to a book he had co-authored on the history of the IWA.


In China, also in January, the Uyghur writer and former editor at Kashgar Uyghur Press, Haji Mirzahid Kerim, reportedly died in the hospital after he had “jumped and fell.” Despite a serious health condition, he had been sentenced to eleven years in prison for writing about Uyghur history and historians.


The political murder of history producers is only one, albeit the worst, type of history-related censorship. Throughout 2021 and 2022, examples of censorship and persecution ranged from political or politically-supported interference in the production of history textbooks and curricula—notably in India, Hungary, Hong Kong and currently unfolding in the Philippines—to restrictions on archival access, for example in France and Romania. The multi-headed monster that is history-related censorship reared its head in debates about racial justice initiatives in the United States; in politically motivated appointments in Brazil; in the liquidation of a historical organization in Russia; and in the persecution of individual history producers in India. Let us look a little more closely at these four cases.


The number of so-called anti-Critical Race Theory (CRT) laws restricting the teaching of race and racism in United States history grew rapidly throughout 2021 and 2022. Research by PEN America found that 183 educational gag orders had been introduced between January 2021 and April 2022, nineteen of which had become law in fifteen states. It further observed that the state-level bills, almost exclusively brought forward by Republican politicians, were vaguely drafted, leaving room for arbitrary interpretation, while often enabling direct punishment of teachers’ speech, including through a “private right of action” allowing parents and citizens to pursue punishment themselves. Taken together, these features make the laws likely to induce self-censorship and thus effectively ban a wide swath of historical materials.


While in the US politicians attempted to censor historical teaching, and by extension future historical research, Brazilian President Jair Bolsonaro intervened personally. Known for his admiration of the military dictatorship that ruled the country between 1964 and 1985, he reportedly asked his Minister of Education Milton Ribeiro to interfere with the November 2021 Exame Nacional do Ensino Médio (a standardized examination given to more than 8 million secondary students), asking that the term “1964 military coup” be changed to “Revolution.” On November 15, six days before the exam, he announced that it would now start “looking more like the government,” so that nobody would have to worry about “those absurd issues from the past.” Four days later, he nominated Ricardo Braga, the owner of a private security company, as Director of the National Archive, despite his lack of relevant expertise. The nomination provoked fears among many, including the National Forum of Associations of Archivists of Brazil, of a rewriting of the history of the dictatorship, and possibly the destruction of archival documents.


In Russia, matters were even worse. In November 2021, the Prosecutor General’s Office filed a lawsuit with the Supreme Court seeking to liquidate Memorial and all of its regional and structural units. Memorial had worked as an international human rights organization in Russia since its founding in 1988, uncovering crimes against humanity, especially those committed during Stalin’s reign (1928–1953). A thorn in the side of President Vladimir Putin’s history policy of valorizing the Soviet Union and Stalin’s rule, Memorial had been forced to register as a “foreign agent” in 2014 for receiving foreign funding, and had since been fined at least 21 times for a total of more than 4.2 million rubles. On December 28, Memorial was liquidated by the Supreme Court. Its archives, including a database of three million victims and 42,000 collaborators of the Soviet secret police between 1935 and 1939, a library and a museum, were all under threat after the ruling.


In India, political persecution was mostly aimed at individual history producers. In May 2021, Gilbert Sebastian, an assistant professor in international relations at the Central University of Kerala, was suspended for describing the militant political organization Rashtriya Swayamsevak Sangh (RSS), connected to the ruling Bharatiya Janata Party (BJP), as a “proto-Fascist organization” during a course on “Fascism and Nazism.” One year later, in May 2022, Waqas Farooq Kuttay, an assistant professor at Sharda University, was similarly suspended after students complained about an “objectionable” question on a mid-term undergraduate paper, which asked students to discuss whether there were “any similarities between Fascism/Nazism and… Hindutva.” Following the suspension, the university said the question had “distorted the great national identity.” The links between the BJP, the RSS and an aggressive policy of Hinduization are clear, and so is the link with the 2002 mob violence targeting Muslims in Gujarat. Human rights activist Teesta Setalvad has held Prime Minister Narendra Modi, then Chief Minister of Gujarat, responsible for the violence. Whereas the National Human Rights Commission and the Supreme Court had strongly condemned the Gujarat government for its failure to deliver justice in the 2002 mob violence case, the Gujarat authorities repeatedly prosecuted Setalvad on false charges. In June 2022, for example, she was arrested in Mumbai on charges of criminal conspiracy and forgery. She was released on interim bail this September.


What these four cases, in all their diversity, show is an attempt by those in power to limit historical research and teaching to the subjects and perspectives that in some capacity serve as a foundation of their legitimacy: be it Putin presenting himself as the harbinger of stability after the dissolution of the Soviet Union while promising Russia’s return to power in a new world order, or the Bharatiya Janata Party pushing a view of India inspired by an aggressively exclusionary Hindutva ideology.


Similar processes were at work in how the past was allowed to be remembered. Elsewhere, I have argued that the governments of Rwanda, Russia, China and Sri Lanka channeled their legitimizing interpretations of the past into national commemorative practices. Comparable examples occurred in Vietnam, where human rights activists were put under house arrest on important commemoration days, such as April 30 (the end of the Vietnam War). In Bosnia and Herzegovina, the Bosnian Serb authorities of Prijedor banned the annual White Armband Day commemoration (for the victims of ethnic cleansing during the 1992–1995 war) on the grounds of a “serious danger of violence” posed by two far-right Bosnian Serb nationalist groups, thereby strangely reversing their responsibilities. And in Israel, the Knesset in June 2022 approved a provisional bill banning the display of “enemy flags,” including the Palestinian one, at state-funded institutions, in direct opposition to a September 2021 ruling by the Jerusalem Magistrate's Court. The bill came after Palestinian flags had been waved on May 15, known to Palestinians as Nakba Day, which commemorates the displacement of Palestinians that accompanied Israel’s 1948 Declaration of Independence.


When analyzing the censorship of history, one is drawn to the backsliding of democracy around the world, which democracy watchers such as Freedom House and the Economist Intelligence Unit have documented time and again since 2005. The pursuit of historical truth, and the refutation of historical revisionism, depend on freedoms that are foundational for democracy: freedom of expression, freedom of information and the right to truth. When democracy is under threat, historical research is among the first casualties. Both need our fullest support.

https://historynewsnetwork.org/article/184003
The History of DDT Shows Government Agencies Have Responsibility for Today's Skepticism about Science



BERKELEY – This year marks the 60th anniversary of Rachel Carson’s Silent Spring, which highlighted the negative effects of DDT and other pesticides on wildlife and helped spur the rise of the environmental movement. Unfortunately, the commemoration comes at a time when Americans are becoming more skeptical of scientific institutions. Looking at DDT’s story, it is clear that some of this doubt was sown by corporate actors such as Philip Morris in the 1990s. Still, our public health institutions must also repair the damage they have done to their reputations and rebuild trust with the American public. It is their responsibility to free themselves from the “economic poisons” that threaten to discredit their work.


In my book, How to Sell a Poison: The Rise, Fall and Toxic Return of DDT, I delve into the story of the toxic chemical. Back in the 1940s, when DDT was first developed, “economic poisons” were chemicals designed to kill pests that otherwise caused high-cost damage. In the broader sense, however, the term “economic poisons” can describe how the Invisible Hand often steers scientists, medical professionals and governments away from reasoning and toward profit. In America, this phenomenon intensified as a rising world superpower clamored for post-war profit at any cost. Today, our government institutions, and especially our scientific ones, have a duty to rebuild the public trust that this process eroded over the last half century.


In the late 1940s, Lester Petrie came face-to-face with these economic poisons. Petrie was head of the Georgia Division of Industrial Hygiene, responsible for investigating the problems posed by chemical encounters in workplaces. Yet ever since the war, his division’s work had become more complicated. Before, the job largely involved making sure that people working in factories didn’t come in contact with chemicals or equipment in ways that harmed them. Now he was hearing more and more about chemical poisonings in people’s homes and farms. Most of them were caused by what those in his line of work called “economic poisons,” newly developed synthetic chemicals that killed insects ranging from the annoying to the deadly. DDT was one of them.


Before long, Petrie had compiled a harrowing collection of poisoning cases. Most were crop-dusting pilots choked to death by their own cargo. One was a woman who died after eating blackberries grown near a sprayed cotton field. A group of small farmers wrote him to say DDT sprayed on neighboring farms was killing their bees and chicks and making them sick. People used to kill insects on crops, when necessary, using compounds that contained the well-known poisons arsenic and lead. It troubled Petrie that the economic poisons seemed as deadly as the older poisons, and that people clearly didn’t know.


Petrie himself, after all, had been sure DDT in particular was safe, because of reports and assurances from the U.S. government during the Second World War. Now he wondered whether he ought to create a warning for it, as he had just done for another economic poison, tetraethyl pyrophosphate or TEPP. He reached out to FDA pharmacology chief Arnold Lehman, who had so helpfully reviewed the TEPP warning he had drafted for farmers in his state. “A few days ago,” he wrote, “I noticed a news item in the paper cautioning against the spraying of barns with DDT. I would appreciate very much your sending me all of the new information you may have available concerning the toxicity of DDT.” This time, Lehman didn’t reply.


Instead, Petrie received a form letter from the FDA’s Division of State Cooperation. “I regret that we are unable to send you literature dealing in a detailed way with the toxicity of DDT,” it read. Enclosed was a list of articles on DDT poisoning published in medical journals, and an official statement issued by the FDA, PHS, USDA, Army, Navy, and Pan American Sanitary Bureau. Recent news had “misled and alarmed the public concerning the hazards of DDT,” the statement read. “DDT is a very valuable insecticide which has contributed materially to the general welfare of the world.” It was a “poison” like any other insecticide, but any harm it seemed to cause was a result of “errors” in its use.


Then Petrie got word of a young boy who died minutes after taking a swig from a bottle he found resting in the crotch of a tree. Petrie felt certain the bottle must have contained TEPP. He also knew that no one, not even the most advanced forensics lab in the state, had the means to identify it. He forwarded details of the case to the Toxicology Branch at the CDC. Then he wrote to the list of manufacturers his staff had compiled, asking them to add tracers to their products so state health officers like him could figure out which products were causing illnesses and deaths. He received one polite reply, from the medical director at Monsanto, which made parathion and DDT. The company would not be able to do this, he said, because it would cost too much.


Petrie had tried to raise the alarm about the new chemicals that were being produced at an unprecedented scale after the war, and that were now seeping into every American household. Instead, he was met with skepticism from government officials and indifference from corporations that knew the chemicals were dangerous but feared that warning the public would hurt their bottom line. His job, it became clear, was to protect production over public health.


The responsibility to disclose such dangers doesn’t just fall on scientists and government officials. Journalists who have knowledge that could save the public from deadly viruses or dangerous chemicals have a duty to shine light on those issues, too. The Los Angeles Times accomplished this with its groundbreaking story on dozens of barrels of DDT waste dumped into the Southern California Bight.


The response to that reporting was swift. California Senator Dianne Feinstein demanded that the EPA take action. Other lawmakers called hearings in the House of Representatives in Washington and the State Assembly in Sacramento. Foundations reached out with grant funds. Scientists convened meetings and conference panels. A team from the Scripps Institution of Oceanography and NOAA pulled together an expedition in record time to go down to the seafloor and map the dump site in the spring of 2021. They didn’t see dozens of barrels; they saw more than 27,000. Recently released memos from the EPA’s investigation indicate that the actual amount of ocean-dumped DDT may be far, far higher.


It’s not too late for these institutions to repair the damage they have done. But it will take years of effort beginning at the grassroots level. It will mean scientists who consistently push back on industry, even when that is the unpopular or, in the short term, more costly decision. It will take brave policymakers and politicians willing to back up scientists, even when it is not politically convenient. It will mean journalists taking the time to build trust in Black and Hispanic communities, whose stories and calls for environmental justice they have long ignored. These past pollutants and pandemics are but a dress rehearsal for the growing climate crisis, a global emergency that will once again pit big business against sound science.


Scientific proof of environmental harm has only ever shifted policy when government and business interests have aligned. We must remember that there’s no greater shared interest than our very survival.

https://historynewsnetwork.org/article/184004
Anne Frank's Next Diary Entries

Women survivors of the Bergen-Belsen concentration camp, April 17-18, 1945



When I was a teenager, I imagined that Anne Frank was at my mother’s 15th birthday party. After all, they were the same age and they were both in the Bergen-Belsen concentration camp.


How did I arrive at such a fantastic conflation?


For one, who hadn’t heard about the gifted diarist? Expressing normal adolescent interests and concerns, Anne’s letters to imaginary friends helped her cope with the oppressive situation in “the Secret Annex.” I could picture the hiding place she described. I could appreciate her humor, wisdom, and powers of observation. Though Anne knew that terrible things were happening, “The Diary of a Young Girl” does not touch the horrors of the Holocaust. Her book is relatable. Hopeful. 


When I was a young teenager (about 13, the age my mother was during the war), my mother was restrained in what she shared with me. She worried that her experiences in Auschwitz and Bergen-Belsen would give me nightmares. She recalled and elaborated on her birthday party in a labor camp. The episode was relatable. Hopeful.


Eventually, after multiple conversations and into my college years, I learned the truth. Here is why Anne and my mother were never at the same place at the same time:


On August 4, 1944, SS-Oberscharführer (Nazi squad leader) Karl Josef Silberbauer and Dutch police officers raided Prinsengracht 263, where Otto Frank ran his business. Searching behind the office’s bookcase, they found Anne and the seven others (Otto, Edith and Margot Frank; Hermann, Auguste and Peter van Pels; and Fritz Pfeffer) who had been hiding there for more than two years. They hauled the group to a detention center, a stop before Westerbork, the Dutch transit camp.


By this time, my mother had endured the harrowing journey from Sighet to Auschwitz. And the trauma of being separated from her parents and siblings on the ramp at Birkenau. For more than two months, she had struggled to survive in the shadows of the gas chambers and crematoria. Finally, providentially, she and her sister Elisabeth were sent to Christianstadt, a labor camp in Lower Silesia. In its neighboring munitions factory, she gouged holes in grenades, preparing them for pins. (She deliberately made the holes shallow, rendering the weapons useless.)


By October 8, the date of my mother’s birthday, Anne — having been deported with her family on the last train to leave Westerbork — had known the terror and depredations of Auschwitz-Birkenau for more than a month. A celebration of any sort would have been unthinkable. Meanwhile, in Christianstadt, my mother received gifts: slippers and a hat made of straw pulled from mattresses; flowers made of somehow-procured crepe paper. Had Anne been at my mother’s party, she would have had a slice of the cake decorated with a scene from Hansel and Gretel, made by women who worked in the SS officers’ kitchen, created at the risk of punishment for stealing scarce ingredients.


On November 1, 1944, Anne, her sister Margot, and Auguste van Pels were transferred from Auschwitz to Bergen-Belsen. It would be another four and a half months before my mother encountered that hell.


With the advance of the Red Army in the winter of 1945, the Nazis hurried to evacuate concentration camps in the east. Bergen-Belsen, deep in northwest Germany, received the largest number of war-ravaged survivors. Without adequate shoes and clothing, with hardly anything to eat, they had suffered terribly on death marches and in overcrowded rail cars. Now they landed in a place without sanitation, clean water or habitable housing. A February transport brought typhus to the camp. Epidemics raged. Death was practically inescapable.


After witnessing the demise of Margot, Anne, emaciated and ill, also succumbed. No one knows the exact date, but it was likely within one or two weeks of my mother’s arrival in mid-March. More than 17,000 prisoners died that month. The dead lay in huge piles, or among the living in the huts.


By the time units of the British Second Army liberated Bergen-Belsen on April 15, half of the camp’s 60,000 inmates were recent arrivals. If not too far gone, they were able to withstand the final five days under Nazi rule, when no food or water was distributed. Those who were there longer had less of a chance. More than 13,000 died after the liberation. It took four weeks for the liberators to stem the tide of death.


My mother barely survived. On April 23 or 24 she was washed, disinfected, and placed in a makeshift hospital room with 12 beds. Every day for three weeks, 11 dead were removed and 11 nearly dead filled the empty beds.


Even when I had the crazy idea that my mother and Anne Frank knew each other, I was aware that they were very different girls from very different backgrounds. Despite her family’s move from Germany to the Netherlands in the wake of the Nazis’ rise to power, Anne enjoyed the trappings of a privileged childhood: family trips, expensive activities, material comforts. My mother came from a poor family in an isolated Carpathian Mountain town. The second of six children, she’d had big responsibilities — taking care of her younger siblings, tending to household chores and, by age 12, delivering orders for her grandmother’s poultry business. While Anne attended a Montessori School, my mother attended a provincial Romanian school. While Anne kept a diary, my mother kept a log of what customers owed. Beyond being Jewish (Anne’s family was secular; my mother’s, religious) the only thing they had in common was the developmental stage they were at when caught in Hitler’s murderous web.


Had Anne survived, she and my mother might have crossed paths. This is plausible: in hospitals, in TB sanatoriums and, at one point, at an International School for teenage survivors, my mother was with girls who came from various socioeconomic groups. What they had in common was more significant than the differences of their diverse prewar lives. Almost all were orphaned, sick and homeless. All had endured against all odds.


Had Anne survived, her mighty contribution to Holocaust literature would surely have taken a different tack. Readers would have had to be ready to learn the whole truth.


This essay was originally published on August 4 by Kveller, and is republished by permission. It may not be reproduced without permission. To receive a free subscription to Kveller, visit https://www.kveller.com/signup/.


https://historynewsnetwork.org/article/184006
Thinking and Teaching the Implications of Federalist #10 for Democracy


When I picked up my copy of Federalist #10 to begin writing this article, I was stunned by the subtitle: "The Union as a Safeguard Against Domestic Faction and Insurrection." Despite my 30 years of teaching this document, the emotions that welled up in me upon reading "Insurrection" were a shock. These are hard times. That the present shapes our understanding of the history we study was brought home to me with new force.

Knowing that Shays' Rebellion was a cause of both the calling of, and the high attendance at, the Constitutional Convention, and that the phrase "to insure Domestic Tranquility" figures prominently in the preamble, helps explain what the framers thought was at stake in 1787. I always spent 10 or 15 minutes parsing the meanings of the preamble, but even though I taught the Constitution more than 150 times over the years, I never felt the depth of those words as I do now. The insurrection by the followers of Donald Trump puts us in a situation James Madison would recognize. Donald Trump and his minions have been frightening us every day for years. It is time to analyze the most famous of Madison's essays: Federalist #10. This essay is addressed to teachers.

This article is in two major parts: the first section analyzes Madison's Federalist #10 on his own terms, in the form of a pared-down student-led lesson; the second builds on the first to critique #10. Historians and political scientists usually point to the electoral college as the major anti-democratic feature of the Constitution, but in Federalist #10 Madison, as you will see, had fundamentally no respect for the will of the people. He baked this idea into his theory of the republic.

That final section takes on the chimerical idea of the (single) public good and Madison's outright rejection of “the people themselves” in order to protect the government from dangerous majorities. In 2022 the white supremacist Republican party has ditched democracy and gerrymandered Madison's constitutional structure. We are on the brink of a fascist takeover. These contradictions could not be compromised away in 1787 and cannot be smoothed over in 2022. “The Miracle in Philadelphia” nearly failed as a system on January 6, 2021. Democracy cannot be defended by depending on a group of men of “wisdom” to lead us in controlling “the mischiefs of faction.” Instead we need majority rule.

When I assigned Federalist #10, I asked the students to download and read the document. (1) They were required to choose two sentences from the beginning, three from the middle and two from the end of the document. As I have explained in detail in "The Tarzan Theory of Reading" on my Substack site, the students were to single out sentences with which they agreed or disagreed strongly, or sentences they thought were important, and explain why. The students lead the discussion with their questions, their comments, and the sentences they have chosen, which they read aloud to the class. In addition, I asked them to identify the sentence at the logical center of the argument. Federalist #10 has an elegant architecture.

When I began the class, I asked for questions or comments. Students often commented on the definitions of faction or insurrection, the latter now a term many students will encounter in the news. The definition of faction is "a majority or minority... opposed to the permanent and aggregate interests of the community." The students will come up with the common term "special interest," but how can a special interest be a majority? This is a key problem with Federalist #10, since Madison's understanding of the term faction is not intuitive. The students may object that the Constitution describes a democracy: does not the majority rule? You should put that idea in a separate list on the board and leave it until the end of the discussion (that separate list of ideas is taken up in depth in the Part II critique). An insurrection is an attempt at the violent overthrow of a government. The students know that Shays' Rebellion (1786–87) was an insurrection.

Majority faction is itself a contradiction that can be addressed by working through Madison's series of subtopics: the climate of disorder in the country, his diagnosis of factions, the proposals to eliminate them or to control them, and a critique of his solution. Although the discussion will jump around the document as the students volunteer their sentences, those subtopics will organize the notes as we go along.

Disorder in the Country

Shays' Rebellion was a major factor in Madison's concerns. The students will know that indebted farmers in western Massachusetts denounced unaffordable taxes and complained that they were losing their farms to foreclosure. Daniel Shays was a Revolutionary War captain who led his followers in attempts to close the courts to prevent the foreclosures. In addition, they demanded per capita representation equal to that enjoyed in the east, close to Boston. After the rebellion was quashed, Shaysites were elected to the Massachusetts legislature. Another problem was that the rebellion was a protest against unfair taxation reminiscent of the protests of the 1760s and 70s. It reminded many leaders in Massachusetts of the lead-up to 1776 (similarly, some of the insurrectionists in 2021 used 1776 as a threatening slogan). This armed insurrection was a major cause of the convening of the Constitutional Convention in Philadelphia, because the Articles Congress had no power to raise an army directly: the state had to defend itself along with any allies it could muster.

Madison describes how, in his view, the public good was being ignored. "The friend of popular governments" opposes the "violence of faction," which causes "instability, injustice and confusion." There are "overbearing" majorities that cause "governments" to be "too unstable" because they do not respect the "rights of the minority," and governments controlled by "specious" (unsupportable) arguments, causing "mortal diseases under which popular governments have everywhere perished." He blames the "factious spirit that has tainted our public administrations."

Madison's Definition of Faction

"By a faction, I understand a number of citizens, whether amounting to a majority or a minority of the whole, who are united and actuated by some common impulse of passion, or of interest, adversed (sic) to the rights of other citizens, or to the permanent and aggregate interests of the community." If a student chooses this sentence, you have to be careful to explain each part of the definition. How do you explain this definition, I ask. Eventually the students come to realize that Madison expected the people as a whole to support particular conclusions (how else could he call it a majority faction?). How could a leader find "the permanent and aggregate interests of the community," I ask. This should also go in the Critique section for discussion. The rest of Federalist #10 discusses how to eliminate factions or how to control them.

Eliminating Factions

This is the first of the methods to secure the government against the "mischief of faction." There are two methods to eliminate factions: destroying liberty or giving everyone the same opinions. The students will come to the conclusion that restricting liberty is not possible in a democratic government because we depend on freedom of thought and action to maintain democracy.

The second method, giving "everyone the same opinions," is also an impossible solution because "as long as the reason of man continues fallible, and he is at liberty to exercise it, different opinions will be formed." How do you understand that, I ask. Here, students might note Madison's identification of opinions based on "self-love," the diagnosis that "reason is connected to passion" and the observation that the "diversity in the faculties of man" were factors in the differences of political opinions.

The rights of property, the ownership of different kinds of property, and the faculties to obtain those kinds of property all cause divisions. A faculty seems to be an ability, the students will conclude. So, Madison describes it thus: "(t)he latent (underlying) causes of faction are thus sown in the nature of man." It soon becomes clear that Madison was not making an argument for changing the distribution or control of production, property, or goods in the US—Madison was not a Marxist! Instead, the students will conclude that Madison was attempting to find ways to manage the political effects of that inequality or those differences. But in whose interest did he want to manage those inequalities: those of the enslaved, those of the ordinary people, or those of his own class, the southern gentry?

Controlling the Effects of Faction

"The inference to which we are brought is, that the CAUSES of faction cannot be removed and that relief is only to be sought in controlling its EFFECTS." In the ensuing discussion students will come to the conclusion that this sentence begins the second half of the argument. It is the sentence at the logical center of the argument. Here Madison turns to the idea of controlling the effects of factions instead of eliminating them and eventually introduces the republic as a solution.

"If the faction consists of less than a majority" voting, the "republican principle" is the remedy. There might be disagreements, but majority rule does offer a solution. Therefore, what to do about a majority faction is the most intractable problem. Someone is likely to pick the sentence: "To secure the public good and private rights against the danger of such a (majority) faction, and at the same time to preserve the spirit and the form of popular government is then the great object to which our inquiries are directed." The ensuing discussion can conclude that it is a thesis sentence pointing to the chief point of the whole article.

The "existence of the same passion in the factional majority" must be prevented or "the majority must be rendered unable to concert." When people "concert" they work together. Madison is actually opposing the rule of the majority here. A pure (direct) democracy in which the citizens are the legislature "can admit of no cure" for "the mischiefs of faction" because "the common passion or interest will in almost every case be felt by a majority of the whole and there is nothing to check... an obnoxious individual" or group from influencing everyone.

In a republic as envisioned by Madison, however, "the representatives refine and enlarge the public views by passing through a medium of a chosen body of citizens whose wisdom may best discern the true interest of their country (my italics)." He added, "the public voice pronounced by the representatives of the people might be more consonant to the public good than if pronounced by the people themselves." Here Madison added the idea of making the republic cover larger areas. He suggests that by "(e)xtend(ing) the sphere -- you take in a greater variety of parties and interests (and) you make it less possible they will concert..." The conclusion of this part of the argument can lead to the choice of more famous and experienced statesmen who possess the "wisdom" referred to above: because a larger number of voters would be participating in a larger district, the chances of electing a more famous or experienced person (i.e., one of wisdom) would be greater.

Finally, "The influence of factious leaders may kindle a flame within their particular states, but will be unable to spread a general conflagration through the other states." He uses religious sects, a rage for paper money, and the abolition of debt as examples that are more likely to "taint a particular county or district than an entire State." These are some of Madison's most famous statements. The students will see that the purpose of representation and of extending the area of the republic was to elect men of wisdom. The factions may cancel each other out, or the men of wisdom will convince the other legislators to follow the "true ideas" of the public good; because the ordinary people cannot end the controversy, Madison and his fellow leaders will decide for them.

Madison's essay seems as clear as the ringing of two groups of bells: there are two opposing sets of solutions:

Eliminating factions or controlling their effects, each with two methods of solution. He moves through the ideas with alacrity, going from one solution to another. The logic is stunning and elegant, like a mathematical proof.


Part II: A Critique of Madison's Argument

Now we have to confront the sentences we have put aside or left without exploring thoroughly, in particular the idea of the majority faction:

"By faction I understand a number of citizens whether amounting to a majority or a minority of the whole who are united and actuated by some common impulse or passion or of interest adversed (sic) to the rights of other citizens or the permanent and aggregate interests of the community." Eventually, the students will conclude that a majority vote is not what Madison is seeking as a solution to the problem of the majority faction. Somehow the government must override the majority.

Another example of Madison's majority problem: The "public voice pronounced by the representatives of the people" might be more consonant "to the public good than if pronounced by the people themselves." The students will determine that Madison is counterposing the representatives to "the people themselves." Representatives certainly do not have to vote by taking instructions from their constituents, but it is clear that Madison is trying to circumvent the majority. Why would a legitimate republic be so designed? When we discuss this idea the students reach the conclusion that he does not trust the people to make the right decisions. It is obvious from the sentences that are there for the choosing.

Another of Madison’s sentences expresses the same contradictory view: "To secure the public good and private rights against the danger of such a (majority) faction, and at the same time to preserve the spirit and the form of popular government is then the great object to which our inquiries are directed." What, students may ask, is the “public good” other than the will of a majority? If you have not yet discussed "public good," it is an opportunity to discuss the major contradiction. When the students analyze this the discussion is not done until the students see that even though Madison seems to be discussing the solutions as benefitting all the people, instead he is claiming the right to decide for the majority which citizens will benefit.

Eventually the students reach the conclusion that everyone does not have the same interests in society, or that the public good may change. So it is not clear how to reach the public good, or that the public good can be expressed as a singular rather than a series of public goods. Madison believed, however, that the public good was not only attainable, but a key factor in overcoming the mischiefs of the majority faction. Do we really think that the Constitution has been a success for all the people, as Madison designed it and as the conventional wisdom in the US has always assumed?

Now we have entered a realm of ambiguity and contradiction. Madison's elegant proof, which seemed so clear, becomes murky, and most importantly, unreachable by the majority of ordinary men—or women!! How do you understand this "public good" now, I ask. The students will determine that not all people under the Constitution have the same interests as propertied white men: There are women and Black people and poor and rich. In 1787 these citizens were not formally part of the political community. The First Peoples "not taxed" were thus excluded by the clause on taxing and representation in Article I after the 3/5 clause. The Black underclass in the US has been living without the protection of the law for the vast majority of American History; much of the US seemed to only discover the true level of relentless and widespread violence against Black people on May 25, 2020—the day of George Floyd's murder. Madison had been fine with slavery and its terrible consequences. These actions were not a new development.

The interracial uprising that resulted was unique: these were the largest multiracial demonstrations ever in the US. The violence against Blacks has been a dark undercurrent in the US since the ratification of the Constitution. What is the public good? Do you think now that Madison was protecting the whole people, as he implied in paragraph after paragraph by calling his goal the public good?

Now we come to the final sentence in the statements we have put aside for critique. When Madison brought up the danger of Shays' Rebellion (see Disorder in the Country above), he blamed the eastern leaders of Massachusetts for the unequal taxation which caused the rebellion. The western farmers rebelled against the unfair taxation as they had in the 1760s and 70s. Madison commented: "Enlightened statesmen will not always be at the helm," i.e. elected to office.

These men on whom Madison depended must convince the other representatives and the senators that they know the public good better than the people themselves. Are these people philosopher kings who see the reality in Plato's cave? Or are they advocating legislation based on the general will in the theory of Rousseau? The general will is discerned outside of debate, and expresses the "true will" of the people. This ability is a "faculty" of enlightened statesmen. It depends not on majority vote but on "the permanent and aggregate interests of the community" or the "public good," determined by the men "whose wisdom may best discern the true interest of their country" in Madison's phrase.(2) These men of the "better sort" must convince other legislators to follow their lead. What in Madison's argument places these statesmen in power, I ask. The students eventually identify the layering that takes the decisions out of the hands of the direct voters, who have elected men of deeper perception or men who represent more conservative interests that protect the government from the "vexed," the poor or the enslaved—in other words, the factions born of ambition, race, and class. These men can find the public good for the benefit of the permanent aggregate interests of their countrymen. But as I stated at the outset, such a belief is a chimera.

How can we call the history of the US a long story of a developing public good for all the people when the 3/5 Compromise ruled the House of Representatives until 1818, when the large white population of the North overwhelmed the slaveholders' advantages, and when, up until the Civil War, the small-population states controlled the Senate with the help of the "dough-faced" northerners who voted with the South in the Senate and the House? These forces acted together to repress democratic solutions to slavery and to keep women, the poor, and the First Peoples in literal and virtual shackles and chains.

When the slave power was overthrown and the Reconstruction Amendments were passed after the Civil War, there was a brief period from 1866 to 1877 when a fragile interracial democracy existed in the South, which for a time kept the Republican reformers in power. But then violent mobs attacked and killed Black Republican voters, overturned that hard won peace between the races, and Blacks lost suffrage in nearly the whole South. White supremacy ruled again until the Civil Rights Revolution capped by the Voting Rights Act of 1965 produced a second period of Black and minority participation.

Now we are in a different era in which our political life has also been commandeered by white supremacy in the form of Republican redistricting in the states, so that, despite the large populations in the Democratic-controlled states, the Democrats have only bare majorities in the House and only the tie-breaking vote of the vice president in a 50-50 Senate. Democratic senators represent 41.5 million more Americans than the Republicans. (3) These are problems quite different from Madison's majority factions. It is minority rule, which the majority cannot use the "republican principle" to "cure." It is a deadlock caused by the filibuster and the small-population states, which have controlled the Senate since they were born in the Great Compromise. Madison's "Machine that Would Go of Itself" has been rejiggered. (4) There is a fascist threat to democracy led by the followers of the former President. Madison's governmental structure has been under threat by these insurrectionists, and democratic traditions have been undermined to the breaking point. It is unclear whether democracy shall survive the next election, let alone the ones after.

The call in Federalist #10 for the protection of the public good and for the permanent and aggregate interests of the community was based on the will and experience of a minority Madison called the "enlightened statesmen," who protected slavery for the white majority. The white majority in the country is now disappearing, and the movements to defend the "historical white republic" are threatening the lives of workers, women and all minorities. This is our problem now, and it is rooted in the ideal of the public good which Madison believed he and other enlightened statesmen could conjure up to protect the true interests of the "whole" community. He fought to maintain the rule of people like him. There was no working compromise between the interests of slavery and freedom, just as today there is none between the evangelical radicals opposed to abortion and the advocates of women's rights, or between those who refuse the poor the right to health care and the advocates of Medicare for All, or, finally, between the interests threatening the rights to clean air, water, food and jobs and the movement for a Green New Deal. The Electoral College and the unrepresentative Senate must not control our politics. We are at a crossroads.

The myth of the "divinely inspired" Constitution has sustained Madison's reputation of infallibility, but the flaws in his reasoning, as we have pointed out, have come to haunt us and brought us to the brink of losing our democracy. What, after all, is the public good if it does not represent a clear majority of the US population? As the students realized in their analysis, there is no single or public good. We are a country of classes, races and genders. We should not be controlled by the rich white men or their MAGA insurrectionists. We are still being ruled by the magical thinking of former centuries, from ancient Greece to the early modern concepts of the virtue of the white landed aristocracy. All this is embodied in the persons of senators from states with populations smaller than assembly districts in New York or the city of Washington DC. These modern-day conservatives talk about the Constitution as a document describing a republic, not a democracy. They believe that the proper leaders of this republic are the whites: the real Americans. This idea brings us back to the earlier argument concerning the dangers of reaching for the single public good or the "permanent aggregate interests of the community." The chimera of the public good turns out to be a smokescreen for white supremacy—as it always was. No amount of leisure or learning can motivate the white supremacists to discern the true interests of our country. They are in it for themselves.


(1) You can find Federalist #10 at:


(2) For example see passim




(3) https://www.minnpost.com/eric-black-ink/2021/02/u-s-senate-representation-is-deeply-undemocratic-and-cannot-be-changed/

(4) See Michael Kammen, The Machine That Would Go of Itself (1986), a book about the Constitution which shows that Americans' understanding of the Constitution is weak, yet they believe it is inspired by God and unchangeable.

Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/184002 https://historynewsnetwork.org/article/184002 0
Ukraine Shows the Need to Break the Cycle of National Insecurity



There is no doubt that the Russian invasion of Ukraine constitutes a criminal act of aggression.  What lay behind this, however, is a complicated set of competing geopolitical ambitions and threat perceptions, and beyond these, a fundamental weakness in the United Nations mediation and collective security systems. 


Russian leaders from Boris Yeltsin to Vladimir Putin have viewed NATO’s gradual expansion to the east as a grave national security threat.  In 2008, President George W. Bush opened the door to Ukraine and Georgia for future NATO membership, thus bringing this Western military alliance to the doorstep of Russia.  While there has been much debate over whether NATO expansion constitutes a broken promise to the Soviet Union (and Russia), there is no doubt that Russian leaders have regarded it as an existential threat to their nation.      


The Russian response has been to support separatist movements in Georgia and Ukraine, annex the Crimean Peninsula, and, most recently, invade Ukraine.  U.S. leaders have deemed these actions gross overreactions, or perhaps indications of a desire on Putin’s part to remake Russia into an empire along the lines of the old Soviet Union.  Whatever the case, the invasion of Ukraine has been counterproductive for Russia, as NATO has been strengthened and expanded further. 


U.S. leaders have taken the position that NATO is no threat to any nation, and thus they have been unwilling to compromise on Ukraine’s eventual membership.  Then, too, U.S. leaders generally regard U.S. global power as protective and benevolent, notwithstanding a long record of military interventions and covert operations in other nations.  No doubt, they would respond with alarm if Russia or China invited Mexico to join in a military alliance. 


Might the war in Ukraine have been avoided if the U.S. had relied on the UN rather than NATO to ensure security in the region? 


The UN Charter requires that parties in any dispute “shall, first of all, seek a solution by negotiation, enquiry, mediation, conciliation, arbitration, judicial settlement, resort to regional agencies or arrangements, or other peaceful means of their own choice” (Article 33).  Should the latter fail, the charter provides for collective security measures in which member nations can collectively deter or repel would-be aggressors through joint diplomatic, economic, and military actions “for the purpose of maintaining international peace and security” (Article 43).


The UN has often failed to live up to its mandate to “end the scourge of war,” which has led many to dismiss the institution as irrelevant.  Yet, like all great changes and paradigm shifts, progress in establishing a global security system is incremental and may be centuries in the making.  Think of the establishment of democratic governments, human rights policies, and religious tolerance. 


In the late 1980s, Soviet leader Mikhail Gorbachev attempted to revive this inclusive concept of security set forth in the UN Charter.  Speaking before the UN General Assembly on December 8, 1988, he declared, “The world community must learn to shape and direct the process in such a way as to preserve civilization, to make it safe for all and more pleasant for normal life.  It is a question of cooperation that could be more accurately called ‘co-creation’ and ‘co-development.’  The formula of development ‘at another’s expense’ is becoming outdated.” 


Turning to the U.S., Gorbachev proposed that the U.S. and Soviet Union begin a “joint effort to put an end to an era of wars, confrontation and regional conflicts, to aggression against nature, to the terror of hunger and poverty as well as to political terrorism.  This is our common goal and we can only reach it together.”  Giving substance to these aspirations, he announced Soviet decisions to withdraw significant numbers of troops and tanks from Eastern European countries and to seek a UN-brokered ceasefire in Afghanistan.


It was a remarkable speech, especially as Americans had been conditioned to view the Soviet Union as the graveyard of idealism.  The editors of the New York Times had difficulty describing it:  “Breathtaking.  Risky.  Bold.  Naive.  Diversionary.  Heroic.  All fit.  So sweeping is his agenda that it will require weeks to sort out.  But whatever Mr. Gorbachev’s motives, his ideas merit – indeed, compel – the most serious response from President-elect Bush and other leaders.”


The incoming George H. W. Bush administration was hesitant to endorse Gorbachev’s idealistic reforms.  Indeed, the U.S. foreign policy establishment viewed Moscow’s retreat from great power domination as an opportunity to advance U.S. interests and establish the U.S. as the sole superpower in the world. 


Gorbachev remained undaunted.  Having taken up the challenge to reinvent the Soviet socialist system, he was equally determined to instigate humanistic reforms in the international arena.  To defuse the Cold War in Europe, he proposed an end to both NATO and the Warsaw Pact.  Speaking in France in July 1989, he called for a cooperative “commonwealth of sovereign democratic states with a high level of equitable interdependence and easily accessible borders open to the exchange of products, technologies and ideas, and wide-ranging contacts among people.” 


Eastern European nations, in other words, would join Western European nations in creating a common European identity and culture, buttressed by open trade and travel.  Western European nations had already modeled this in forming the European Economic Community (1957) followed by the European Union (1993), enabling age-old national animosities to dissolve.      


In essence, Gorbachev was proposing a way out of great power competition, in line with the UN Charter, presuming that a friendly international neighborhood is the best security.  This is the vision that is needed today – a reimagining of the world order.  Beyond immediate crises, we need to work toward an inclusive and sustainable global security system.     


Moreover, continuation of the current system of big power competition and rival blocs bodes ill for the future.  The Bulletin of the Atomic Scientists has set its “doomsday clock” at 100 seconds to midnight, closer than it has ever been, based on nuclear and global warming threats, an indication of how close humanity is to “destroying our world with dangerous technologies of our own making.”  Moving toward mutual security and cooperation will set the clock back and allow humanity to move forward. 




Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183961 https://historynewsnetwork.org/article/183961 0
The Roundup Top Ten for September 23, 2022

Once More, Railroad Workers are Taking the Lead for American Labor

by Nelson Lichtenstein

Railroad companies' profits hinge on inhumane scheduling practices—cutting the workforce to the bone and squeezing everything possible out of those who remain—that will soon be part of every industry if workers aren't able to fight back. 


Governors DeSantis and Abbott Borrow from the Jim Crow Playbook

by Greta de Jong

"Immigration scholars have noted how U.S. foreign policies contributed to the poverty and violence in Central and South America that migrants are fleeing. Yet rather than acknowledge this – along with assuming the moral responsibilities it entails – some GOP leaders denigrate and dehumanize refugees to win support from voters."



What Does it Mean for History that Gen Z Can't Read Cursive?

by Drew Gilpin Faust

Today's college students are the vanguard of a cursive-free world. It may not be so great for the study of history. 



Republicans Were Trumpy Long Before Trump

by Nicole Hemmer

Although he ran as an independent, Ross Perot's 1992 presidential campaign raised questions about how the Republican Party would position itself in the post cold-war world. That same year, Pat Buchanan started to provide answers. 



Biden's Taiwan Rhetoric Risks Antagonizing China For No Gain

by Stephen Wertheim

The United States' "One China" policy is ambivalent, awkward and dissatisfying. But it's served to prevent a destructive war for decades. Biden's recent comments threaten to destabilize the arrangement. 



"I'm Not Racist, I'm Just Mad Amazon is Destroying Tolkien's Middle Earth with Black Hobbits"

by Mary Rambaran-Olm

Viewer complaints that Amazon Prime has defiled the author's fantasy vision with "wokeness" ignore the historical diversity of the medieval society on which Tolkien based his works. 



A Short History of Fake History, and Why We Fight for the Truth

by Robert S. McElvaine

One of the most important parts of the civil rights struggle was an interracial effort to fight against a narrative of fake history that had been institutionalized in and out of the Jim Crow South—the white supremacist mythology of the "lost cause." That legacy should guide schools today. 



"The Woman King" Softens Truths of the Slave Trade

by Ana Lucia Araujo

The film has a delicate task: showing the involvement of the Kingdom of Dahomey in selling other Africans to European slave traders without feeding narratives that blame Africans for the slave trade. It largely sidesteps this history instead. 



We're Living in Ken Starr's America

by David Greenberg

With two proceedings against Donald Trump and Republicans promising one against Joe Biden if they retake the House, we live in an age where impeachment is a part of national politics. Ken Starr can take as much credit or blame as anyone. 



When the News of a Royal Death Arrived Slowly, it Changed American History

by Helena Yoo Roth

The void of power in the American colonies created by rumor of the death of King George II was critical to loosening the monarchy's claims to rule in North America. 


Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/184015 https://historynewsnetwork.org/article/184015 0
What Casey Jones Tells Us about the Past and Present of America's Railroad Workers

Postcard image: Sltaylor1954, CC BY-SA 4.0




With a potential railroad strike in the news, Americans are learning quite a bit about the poor working conditions on the freight railroads that keep this country running. Railroad workers threatening to strike have complained about poor pay, dangerous working conditions, and punitive attendance policies. If Americans think about the stereotypical railroad engineer, perhaps Casey Jones comes to mind. Casey Jones, who crashes to his doom in a famous song from the Grateful Dead, a folk ballad, a vaudeville hit, and countless parodies, has become the almost universal stand-in for a railroad worker in American culture. Yet despite a haze of mythology, there was a real Casey Jones, and his work life tells us much about railroad work in the past and present.


As Casey Jones songs spread around the nation, engineers and their friends from across the country claimed to be the “real” Casey Jones, a fact that tells us just how universal his experience was. But most folklorists find John Luther Jones, an Illinois Central engineer who died in a 1900 train wreck near Vaughan, Mississippi, to be the most credible of these claims. While we do not know all that much about his life, we do know what it was like to be an engineer for the Illinois Central, and the story of the real Casey Jones reminds us that there is nothing new about the grievances of modern rail workers.


There has generally been a perception among labor historians that engineers were part of the aristocracy of labor, and to be sure, they did enjoy a level of privilege far above most Gilded Age workers. Engineers were better paid than most railroad workers and held one of the best paid and most valued positions (outside of management) in railroad companies. Casey Jones was a member of the Brotherhood of Locomotive Engineers, but this was one of America’s more conservative labor unions. Instead of pushing for large-scale strike action or systemic change, the BLE organized skilled workers and mostly fought for incremental gains for engineers. This perception of engineers as aloof from labor struggles likely is what inspired IWW songsmith Joe Hill’s parody version, “Casey Jones (the Union Scab),” which imagines Casey Jones as a company toady.


Despite the advantages engineers had, within Casey Jones’s lifetime, from the 1860s to 1900, engineers on the Illinois Central experienced a stunning downfall in terms of their independence and control of the workplace. In the earlier days of railroading, engineers were assigned to a specific engine, but by the turn of the century most railroads were continuously running engines and switching up crews at points along the line. So Casey’s run from Memphis to Canton was just one segment of a longer journey from Chicago to New Orleans and companies essentially used engineers like interchangeable parts to plug into different runs as needed.


Casey and his fellow engineers were bound by a dizzying array of rules in a thick rulebook and discipline for violating these rules was arbitrary and often brutal – in 1897 one master mechanic with the IC estimated that a quarter of the engineers who worked under him in the previous three decades had been fired for disciplinary reasons. Engineers were once paid a stable monthly wage but the Illinois Central shifted this to a per mile rate in the 1870s. Perhaps this is why Casey, on his final day, agreed to take on a second run in a twenty-four hour period. As Sim Webb, his fireman, later related, they got into Memphis in the morning and barely had enough time for a full sleep before leaving again.


The late-nineteenth century also witnessed a reckless speed-up on the rails that mirrors today’s railroad companies’ push for efficiency and austerity. Companies pushed workers to run trains at faster and faster speeds, and with much fanfare they rolled out fast mail trains. A traffic manager or executive could create an aggressive new timetable for a route but it would be up to engineers to make this speed-up happen on the ground. Trying to make up time on the IC’s fast mail schedule was ultimately what killed Casey Jones, as he wrecked while pushing his train to higher and higher speeds and did not see a backed-up freight on his track until it was too late.


By the 1890s, train wrecks were surging in the United States, driven by new traffic demands, higher speeds, and the financial woes of railroads following the Panic of 1893. The carnage got so bad that an entire genre of American folk song emerged – the train wreck ballad. So we remember many of these fast trains not for their speed records, but for the grisly wrecks they caused, such as Casey’s and the “Wreck of the Old 97” that left engineer Steve Broady “scalded to death by the steam.” Yet even these ballads whitewash the horrors and turn what are systemic issues into personalized, man vs. machine struggles, and in some cases even make a mockery of the dead engineers. The final line of the 1909 vaudeville version of Casey Jones’s ballad promises his grieving children that "they have another papa on the Salt Lake Line."


Just as COVID-19 has exposed deep inequalities in America and stirred new labor unrest in 2022, disease framed the work life of Casey Jones. Outbreaks of yellow fever, a terrifying mosquito-borne ailment, raced up southern rail lines in the late nineteenth century, rendering engineers nineteenth-century “essential workers.” Engineers had to run trains into infected areas and confront both the threat of disease and the threat of violence from townspeople enforcing local quarantines at the barrel of a gun. In an 1878 outbreak, hundreds of railroad workers in the Mississippi Valley were sickened or killed while working through a devastating epidemic. During outbreaks, which periodically flared up in the summer months, many engineers refused to go out on runs and countless others left the company altogether. Casey Jones got his job with the Illinois Central due to fever-induced vacancies, and he ended up in quarantine in Yazoo City, Mississippi a year before he died.


Finally, Casey Jones reminds us that then and now, railroad labor can be a canary in the coal mine for the ills of the broader American working class. If anything, the railroad strikers of 2022 have echoed a building critique of corporate power in America that started in the aftermath of the Great Recession and Occupy movements and that has accelerated with the COVID-19 pandemic. Why, today’s strikers ask, can companies rake in massive profits for shareholders while preaching austerity for workers? This is a question that has broad relevance for the American working class, which is being squeezed by rising costs of living, low wages, and the continued struggles of laboring during a deadly pandemic. Only time will tell if 2022’s railroad labor unrest is a preview of broader labor insurgency, but already, the signs from a broad spectrum of the economy point to yes.

Director Lynn Novick on the New Holocaust Documentary



The opposite of love is not hate. It’s indifference.

--Elie Wiesel, Human Rights Activist, Author and Holocaust Survivor


The word holocaust derives from the ancient Greek for “burnt offering.” The term “the Holocaust” refers to Nazi Germany’s systematic, deliberate, state-sponsored campaign of dehumanization, terrorism, persecution and mass murder that resulted in the deaths of at least six million European Jews during the Nazi era (1933-1945). This horrific, genocidal Nazi initiative is also called “the Shoah,” Hebrew for “catastrophe.”  

The roots of the Holocaust ran deep in the centuries-long history of antisemitism in Europe. Persecution of Jews escalated ferociously under Nazi dictator Adolf Hitler during the Third Reich, beginning with harassment and the deprivation of Jews’ rights, then proceeding to physical attacks and destruction of Jewish property, to isolated atrocities, to segregated ghettos, and finally to mass murder on an industrial scale with the “Final Solution,” the Nazi plan to exterminate all European Jewry. By 1941, death camps for mass killing had sprouted in Eastern Europe. By the end of the war in May 1945, the Nazis had exterminated two thirds of the Jews in Europe. 

In the United States, as one of the greatest humanitarian crises in history unfolded, Americans were confronted by the fate of the Jewish people in Europe. Jewish pleas for sanctuary in America to escape Nazi persecution and likely death tested American ideals. Debates raged throughout the prewar and war years about America’s responsibility to assist imperiled refugees as leaders balanced domestic needs during the Depression with military considerations, as well as popular sentiment influenced by pervasive antisemitism, racism, xenophobia, and racial “purification” based on the pseudoscience of eugenics. 

The popular understanding of the US response to the Holocaust combines a sense that Americans were unaware of the Nazi atrocities against Jews in Europe with the idea that Americans were simply unable to help, while ignoring the anti-immigrant and isolationist sentiments of the time. 

The story is much more nuanced and complicated as reflected in a groundbreaking and moving new PBS six-hour documentary series The United States and the Holocaust, directed by the iconic filmmaking team of Ken Burns, Lynn Novick, and Sarah Botstein. The series is set to debut on PBS on September 18. 

By carefully examining the period from the early twentieth century through the Second World War in the US and Europe, this series dispels competing myths about the American response to the Holocaust and explores the reality of that period.

As with previous Burns-Novick film collaborations, the series draws on extensive and groundbreaking research to help viewers understand how American perceptions of the persecution of European Jews were shaped by circumstances in the United States, including a severe economic crisis, fear of immigrants, deep-rooted antisemitism and racism, and popular isolationist tendencies. Among other materials, the series presents the fruits of an intensive exploration of the past and includes the commentary of expert historians and Holocaust survivors, as well as rare photographs and films, home movies and family photos and personal memorabilia, official records, newspaper and magazine articles, and popular cultural materials.

Lynn Novick, co-director of the series and a distinguished filmmaker in her own right, graciously responded to a series of questions on the creation of this revelatory and engaging documentary by Zoom. 

Ms. Novick is one of the most renowned documentary filmmakers and visual storytellers working in the US today. She has been honored with Emmy, Peabody, and Alfred I. DuPont-Columbia Awards for her extraordinary work. She has served as co-director with Ken Burns for more than 25 years, and together they have created some of the most critically acclaimed documentary films to air on PBS, including Hemingway (2021); The Vietnam War (2017); Prohibition (2011); The Tenth Inning (2010); The War (2007); Jazz (2001); Frank Lloyd Wright (1998); and Baseball (1994). Ms. Novick came to Florentine Films in 1989 to work on Burns’s landmark 1990 series, The Civil War, as associate producer for post-production.  Her 2019 series College Behind Bars, on a unique education program in prisons, was her debut project as solo director, and the series was nominated for two Emmys. She previously served as researcher and associate producer for Bill Moyers on two PBS series: Joseph Campbell and the Power of Myth and A World of Ideas with Bill Moyers.


Robin Lindley: It’s a pleasure to talk with you again Lynn. Thanks for your previous thoughtful conversations with me on your Vietnam War and Hemingway documentaries.  And now, congratulations to you and Ken Burns and Sarah Botstein and your crew on your new documentary masterpiece The United States and the Holocaust. For me, the series was extremely moving and illuminating, and I think it will surprise viewers as it addresses many preconceptions and flawed history about this period in the United States and in Germany. What was the inspiration for this documentary? 

Lynn Novick:  Ken and Sarah Botstein and Geoffrey Ward (author, historian and screenwriter) and I have all been interested in the Holocaust in different ways and for different reasons for years. We've touched on it in glancing ways in some other projects we've done, but we weren't especially focused on tackling it as its own distinct topic until 2015. That’s when we were approached by the US Holocaust Memorial Museum in Washington, DC, when they were planning an exhibition called “Americans and the Holocaust” for 2018. They asked if we thought it would be interesting to do a documentary that might come out around the same time as the exhibition. We said immediately, that is a terrific idea. But we added that we probably couldn’t get it done so quickly because of other projects we were all working on. 

We put the project in our pipeline and began to think about it and do the research and assemble the advisors and look for survivors to interview and all of that. But we didn't really dive into it full time for a few years. We didn't get it done in time for 2018, but the exhibition is still up. The museum did a beautiful job and we were able to benefit from their research and scholarship. We also did our own research into the topic and took it in some different directions. 

Robin Lindley: How did the project evolve from your original conception to the final series? I appreciate your deep dive into American history in particular. 

Lynn Novick: One of the big questions with a project like this is always where do you start?  Do we begin with Hitler coming to power or do we begin with the Kristallnacht or other events once the Nazis had taken over Germany? 

But, since we were focusing on the American response, it became clear pretty quickly that we would have to lay out for our audience what our policies were towards immigrants and refugees before this crisis happened. To get our arms around the context for the story, we decided we had to go back to the 19th century—to the ideals behind the Statue of Liberty and our values as a society. And then we looked at how we lived up to or did not live up to those values when push came to shove in the 1930s and 1940s. 

It was really a process of rewinding or pulling back the threads to get where we wanted to start, and then lining up the history of the Second World War; the history of America's involvement in the Second World War; and the history of the Holocaust itself. When did it happen? How did it happen? Where did it happen? Who was victimized? Who were the perpetrators? All of these questions.

I'm not sure I could have fully explained these concerns before we began to work on this project. We then had to simultaneously lay out what information Americans had about all of this, both at the highest levels of government and among ordinary people reading the newspaper, going to the newsreels, or, for those who had family members in Europe, hearing through informal networks about what was happening.

That was the ambitious scope of what we were trying to do. To do that, we heard from scholars and historians who have studied this period and know a lot more about it than we will ever hope to know. And we also had to make it real for ourselves and for our audience so we wanted to find people who actually could still bear witness to it, who had lived through it themselves. And these were people who would now be in their eighties, nineties, and were children at the time. 

All of those things were happening simultaneously for us. 

Robin Lindley: You cover an amazing range of history in six hours. I appreciate, as always, your meticulous and extensive research from film and photographic resources to archival material, to actual location visits, to the presentation of experts and witnesses. And you follow many story threads. How do you see your research process? 

Lynn Novick: First, I can't say enough about the beautiful script that Geoffrey Ward wrote for us. We must give him an enormous amount of credit for helping us establish the overall structure, figure out the chronologies and how these stories would intertwine, and write about such difficult material. He did so beautifully, and in an understated way. We're so grateful to Geoff for everything he has done for this film. 

The research is done throughout the project by many people. Geoff does his own research in terms of often reading published material, including secondary sources. He looks online for material as well. We have a team of producers that do the archival research, which is looking for the photographs, the footage, the documentary evidence, including letters, telegrams, and many newspapers in this case, to help tell the story. 

And then there's the research to find the people whose stories we're going to tell. We collect their first-person accounts and archival materials that they have so that their stories can also become real. 

So there are many different dimensions to the research, and it goes on throughout the project. And we had several aces in the hole for this project. Some of the scholars we worked with most closely had also worked on the [Holocaust Museum] exhibition, and some are connected to the museum. They know this story better than many, and we could call them with questions. They could make suggestions and tell us about materials we didn't know of and share them with us. I'll just give one example. It’s a small one, but it represents what you could multiply for every story in the film. 

As we put the series together, we sometimes recognize, for example, that we’re not finding ways to connect the dots between stories. Here, we had two stories with what's happening in Europe and what's happening in America. How do we bring them together besides just alternating between them? And one of the ways we wanted to do that was to find a family that came to the United States before 1924 and had made a life for themselves and had gone back to visit Poland or Ukraine or parts of the former Soviet Union with their movie cameras and had taken pictures or footage of the places where they were from the way you would when you are on vacation. And there's a small collection of this type of material and, we realized about halfway through editing, we really wanted to put in a scene about a story like that, but we didn't have one. 

We reached out to the Holocaust Museum and they had been collecting all manner of oral histories and archival records from different people. They showed us the story of the Bland family, and we were able to look at their home movies that the teenaged son filmed when they went back to the villages where the parents were from in Poland. And we had the oral history of the younger child who was there with his memories of that experience. And we built a scene from that. And that's just one example out of six hours, but you could multiply that across many moments in the film and it speaks to how open our process is. Sometimes we are looking for something and sometimes material just comes up, and it's a little bit of both. 

Robin Lindley: That’s fascinating background on the rigorous work that went into this series. Viewers will be amazed by much of this history. 

Lynn Novick: We were amazed by it.  

Robin Lindley: I appreciate the powerful storytelling and the many threads the series follows. You frame the series with the especially poignant story of Anne Frank's family. She’s perhaps the most well-known victim of the Holocaust for American viewers.

Lynn Novick: Thank you for asking about that story. When we were developing the project and set out to try to tell this story and to find ways for our audiences to understand how interconnected the stories were, some materials came to light, including some letters that Otto Frank, Anne Frank's father, had written to a friend in the US, a well-connected New Yorker he had known. Otto Frank begged him for help to try to get his family out of the Netherlands. They had already fled Germany for Amsterdam, and they were trying to get out of Amsterdam. We immediately seized on that as a powerful framing device because, as you say, Anne Frank is the most well-known representative of the Holocaust for most Americans, or certainly for many.

I know for myself, when I read her diary in school and even subsequently reread it before we were working on the project, I read it as a document of something that happened far away that had nothing to do with me except that, when I read it, I was a girl and she was a girl. I was very interested and moved by it and devastated by it, but I didn't think about it as anything to do with America at all. 

And so, we wanted to show from the beginning that America was part of the story, or that this story is part of us in ways that we don't fully understand, and that story showed one family of many people who tried to get here and were not able to. And why was that? If we find out that one of those people was Anne Frank and then realize she might still be here today if our policies had been different, we can then explain all the reasons why they weren't different. We’re not saying America is responsible for what happened to Anne Frank in any way, shape, or form, but we're trying to help our audience and ourselves see that these narratives are connected and that in America we have to look at ourselves in this story. 

Robin Lindley: The series provides context for her story that most people probably don't know much about. 

Lynn Novick: The film is six hours long and it could have been ten hours, but a lot of material that we had originally included ended up on the cutting room floor, including one of my favorite scenes. 

We couldn't fit it into the film, but we found out also that Anne Frank’s Montessori school teacher arranged for her and her sister to have pen pals in America. So, she and her sister exchanged letters with sisters in Burlington, Iowa—another connection with the US. We have a beautiful letter from Anne to her pen pal saying my name is Anne and I'm in this grade and I don't really speak English that well, and tell me about yourself. And I found Burlington on the map after I read this normal pen pal letter. It's very moving and it speaks to how much of this history we really don't know and how difficult it is to excavate, especially from the Holocaust, an act of erasure or an attempted erasure. And Anne’s diary is so powerful because she refuses to be erased. 

Robin Lindley: Thanks for sharing that moving moment in your research. You also worked with a panel of experts, mostly renowned American historians. 

Lynn Novick: We were grateful to have such a distinguished and brilliant and well-informed and thoughtful group of advisors who know the story of the Holocaust and also American history and how they connect. 

Some of our most treasured experiences on a project like this are the times when we sit down in person and share the film with our advisors before it's done. That gives us a chance to make adjustments after the screenings. We had only one meeting because of COVID. This project was unfortunately put together during the worst of the pandemic. 

We had our only in-person screening in the summer of 2021, when things opened up for a little moment. We were able to bring several of our advisors to New Hampshire to screen the film with us, and they gave us their full attention and depth of knowledge, helping us with the nuances of language and image and the other choices that make the series so powerful. 

Rebecca Erbelding, for example, a young scholar at the Holocaust Museum, has an interesting story of how she got interested in this topic and involved in the scholarship. She wrote her dissertation on the War Refugee Board, which I had never heard of before their exhibition went up. She worked with us, and she is a scrupulous historian who understands American history and the Holocaust with all the pitfalls and tropes and oversimplifications. And she was just unstinting in demanding that we get it right. She has been a huge help. 

We also have Peter Hayes, a preeminent scholar of the Holocaust, who has a beautiful way of explaining what happened very matter-of-factly and also pushing back on the idea that it was unthinkable. It was impossible. It was unimaginable. All those words, and he holds us to account to say this did happen. It could happen. It's not unthinkable. He was reframing the way that we think about this catastrophe. He's brilliant. 

We also appreciated having Nell Irvin Painter who is a retired history professor, and now an artist. She made time for us and gave us a great interview and helped us situate the entire conversation and the framing of the film within the context of the powerful undercurrent in America of white supremacy, racism, antisemitism, and xenophobia. How does this story fit into that history? And we're enormously grateful for her perspective. 

Robin Lindley: I was impressed that Professor Nell Painter was part of the series. She’s a legendary professor of American history specializing in race relations. I interviewed her a couple of years ago on her history teaching and her “encore career” as an artist. Her inclusion in the series is evidence of the deep dive the film takes into our ugly history in the early 20th century with the discriminatory laws, racism, xenophobia, and eugenics—with US laws and policies that served as models for Hitler’s Third Reich. 

Lynn Novick: It's very ugly and deeply disturbing to work your way through that material, but it was part of our story. To tell this story without exploring all of that would be strange. And we were also fascinated and repelled by the interplay. 

We could have done more, but I think we gave our audience an understanding that eugenics was popular on both sides of the Atlantic. The Germans embraced it and we embraced it. And Nell has written about this as has Isabel Wilkerson and James Whitman. There was a cross-pollination of racist ideas between Germany and the United States and England and France and other parts of Western Europe. How mainstream those ideas became based on a pseudoscience is appalling. There was nothing scientific about eugenics whatsoever.

Robin Lindley: Yes. A professor I know asked, do you know the abbreviation for eugenics? It's BS, he said. 

Lynn Novick: Exactly. That's perfect. BS is exactly right. And yet we had all these very eminent Americans on the bandwagon hyping it. 

Robin Lindley: Yes. You note in the documentary that Rockefeller, Carnegie, and a lot of renowned academics embraced eugenics. The research in the United States and the United Kingdom, I believe, brought eugenics to the attention of Hitler and the Nazis, and they used it in their master race formulation. 

Lynn Novick: I believe so. And then we also have to account for the fact that, as other scholars have explained much better than I can, when the Germans were trying to figure out how to structure their society so that Jews would be stripped of their citizenship and their rights, and do it little by little and do it legally, they looked to us for how to do that. And we had set quite an example over many years. 

Robin Lindley: Exactly. And Hitler, in his hateful 1925 screed Mein Kampf, applauded America’s restrictive and racist immigration laws as well as the Jim Crow segregation laws and other policies that made Black Americans second-class citizens, including bans on miscegenation. These American ideas were an inspiration for the Nazi race laws, which deprived Jewish people of all rights and embraced Aryan supremacy with a whole categorical scheme of Jewish ancestry and blood to determine who was truly German and who was not. And, according to several historians, the American laws were actually harsher in some ways than the Nazi edicts. 

Lynn Novick: Exactly. I knew that from visiting the Holocaust Museum in Washington. They have artifacts that speak to those laws. And it wasn't news to me that there were Nazis looking to our laws as a model, and the degree to which they emulated us in that regard handicapped our ability to criticize them. They could throw it back in our face and say, you can't criticize us for oppressing a minority. You do the same thing. I'm talking about before the mass killings and the extermination of people, but right up to that moment, there wasn't that much we could say in response to their laws. 

Robin Lindley: The widespread American embrace of the racist principles of eugenics may surprise many viewers. I recall reading that families who were adjudged the most “Nordic”—the apex of the eugenics race hierarchy—would win ribbons at county fairs like animals or produce. And your series notes that 43 of 48 states had legalized the sterilization of “defective” or “feebleminded” people as well as some criminals. 

Lynn Novick: Yes. That’s really horrifying. And some of those laws were not really rescinded until very recently. 

I'm actually going to tackle that topic again because I'm working on another project on the history of crime and punishment in America. We're going to show that people who were under the control of the state in mental hospitals and prisons and other places of confinement, were subject to the idea that they should be prevented from reproducing and that idea was a very durable one. It was quite horrifying. They called these ugly ideas racial hygiene and racial purification and social engineering according to some construct of the pseudoscience of eugenics. This horrific engineering of humanity is basically racism, and it was not unique to Nazi Germany. 

Robin Lindley: And speaking of racism, how do you see American immigration laws in the early 20th century? You recount a backlash against immigration and refugees after World War I and the extremely restrictive Johnson-Reed Immigration Act in 1924. 

Lynn Novick: That's a critical piece of our story. To understand why it was so difficult for refugees under Hitler to enter the United States, we have to know that immigration laws changed dramatically in 1924. 

Before 1924, there had been other restrictions, specifically the Chinese Exclusion Act and some specific rules targeting people from Asia. But for the most part, until the early twenties, anyone could come here from anywhere without any real process. My ancestors all just got on a boat and came here. And if they hadn't, I wouldn't be here today because they were from the places where a lot of the killing took place when the Nazis overran the Soviet Union. 

So why did we decide as a society to change our policy and make it very difficult for people to come here from certain parts of the world? There was a backlash that I didn't understand, and it had been building for quite a while. You had wave upon wave of millions of people coming to the United States from 1880 or so to 1920. I think 25 million people came here from Eastern Europe and Southern Europe. And the people who ran this country really felt that America was being destroyed—that something innately, inherently good about America was being eroded. And I would argue that the exact opposite was true, but that's how they saw it and they were determined to stop the wrong people from coming here. And it does tie into eugenics and a sense of racial hierarchy that they wanted to preserve. And they used this umbrella of science, of eugenics, but it wasn't scientific at all. It was just pure bigotry.

And sadly, the 1924 Johnson-Reed Act passed handily without a lot of resistance to it. The law set quotas for every country that were carefully apportioned according to the ideals that the people who wrote the law wanted America to represent. The Act permitted immigration of a lot of people from Northern Europe: Germany, England, and Scandinavia. That's the people that they thought should come here. Everybody else was restricted to little quotas. If you were from Poland, or if you were from Italy, or if you were from Eastern Europe, your chances of immigrating were very small.  

Robin Lindley: That’s another sad aspect of this story. And after the First World War, as your film shows, the white supremacist and anti-immigrant Ku Klux Klan held surprising political sway in many states in both the South and the North.  

Lynn Novick: Indeed. They had reinvented themselves as an anti-immigrant, anti-Jewish, anti-Catholic organization and were much more “mainstream” than they had been as a violent terrorist organization after the Civil War. In the 1920s, they called themselves “The Invisible Empire,” but there was nothing invisible about them. They marched down Pennsylvania Avenue in Washington DC. And they strongly favored immigration restrictions, and they were hugely popular. Politicians had to reckon with that. 

Robin Lindley: And, by the 1930s, the US Department of State rigidly enforced the immigration quotas and restricted visas and thus severely limited immigration, especially from unfavored nations. And some upper-level officials in the State Department were openly antisemitic. What did you learn about the State Department?  

Lynn Novick: I knew a little bit about this particular character named Breckinridge Long [Assistant Secretary of State—responsible for refugee visas] who is certainly one of the villains of the story. He seemed to have no compassion for Jewish refugees fleeing Nazism, and he saw them all as a threat to the United States. To some degree, we could say that perhaps there was a fear that refugees coming here would become spies or other security risks. That was the argument, but Long seems to have been just unapologetically antisemitic, and he felt he was doing his patriotic duty to keep the wrong people out of his country and to use the power of his position to do so. And the people under him mostly went along with that. 

On the other hand, there was no great pressure being put upon Long from the White House or from Cordell Hull who was the Secretary of State. The Department went out of its way to enforce the immigration rules to the letter of the law and to delay and to not make things easy for refugees. 

And one of the worst offenses beyond those—which were pretty bad—came when the State Department basically suppressed reports of Hitler’s mass extermination of people in 1941 after the invasion of the Soviet Union. These credible reports never made their way beyond the Department. State Department officials in Washington wrote back to their offices in Switzerland, where someone in the Department had sent a detailed report [on mass murder of Jews], and ordered, “please don't send us any more of these reports.” They had no interest. They didn't want this explosive information that could have been used perhaps to raise awareness of the crisis. And instead, they just concealed it. There's no good way to see that story. 

Robin Lindley: And the president, Franklin D. Roosevelt, knew of atrocities against Jews in Germany and beyond, but was reluctant to speak out on behalf of Jewish refugees, even though Eleanor Roosevelt, the First Lady, and Frances Perkins, Secretary of Labor, were arguing essentially that providing a haven for refugees was the purpose of America. 

Lynn Novick: FDR didn't leave behind a diary or audio tapes or much of a record of exactly how he felt about what he was doing or why he made the decisions he did or why he embraced or not the policies he embraced. We have to go with what he did and didn't do and try to infer what we can, and that's a risky business.  

We know that FDR was aware of the crisis. We know that people in the Jewish community in particular and Eleanor and others were pleading with him to do something. And we also know that he was a very astute politician who understood the American political scene better than most, and he was the leader of a deeply antisemitic country that was also xenophobic, as we have said. 

In the 1930s, we were coming out of the Depression and he was not a king. He didn't get to make all the rules. And, to get reelected, he had to bring the public along with him. He was trying to do a lot of different things to get the country out of the Depression, for one, and then move us toward a war footing, and to try to help England and France in particular defend themselves as he began to see the Nazi threat emerging, and then ultimately prepare the nation to fight a war.

And I think it reflects not on him, but on our whole nation. If FDR had said in a Fireside Chat, “I have it on good authority that the Jews of Europe are facing the threat of annihilation, and we know this is happening and Americans have to rally to do everything we can to save them,” I think that would have gone over like a lead balloon. He knew the country very well. He was walking a tightrope trying to move the country forward, but he also knew that if he got too far ahead of public opinion, he’d get nowhere. I'd like to think he could have done more or I wish he had done more, but I also think it's not fair to pin this failure on FDR alone by any means. 

Robin Lindley: There were also some heroes in American officialdom. The series mentions John Pehle who stood up to the State Department and became director of the War Refugee Board late in the war. Who was he?

Lynn Novick: Yes. John Pehle is one American who was not indifferent to the plight of Jews and refugees. He grew up in Omaha and his father was a German immigrant and his mother was the child of Swedish immigrants. He worked for the U.S. Treasury Department as the director of the Foreign Funds Control and, when Gerhart Riegner’s report about the mass murder of Jews came across his desk, he was prepared to do anything in his power to help.

However, as I previously mentioned, the State Department and Assistant Secretary Breckinridge Long, in particular, were deliberately obstructionist and not willing to send any aid abroad. Rather than stand idly by, Pehle worked with Secretary Henry Morgenthau, Jr. and others to draft an executive order for President Roosevelt. The result of their efforts was the establishment of the War Refugee Board, which was signed into effect in January of 1944. Pehle acted as the first director of the board, and it is estimated that the WRB saved tens of thousands of lives by providing materials and sending money. Pehle is an American hero who used his position of power to save individual lives and make a difference.

Robin Lindley: You mention the deep antisemitism in the United States at this time. What explains this attitude? Why did many Americans find Jewish people a threat? 

Lynn Novick: I'm not an expert on that topic. I think we have to see it in the context of white supremacist ideology. There was lots of prejudice and bigotry to go around, and it was not only targeted at Jews. There was racial hatred toward African Americans, Mexican Americans, Native Americans, Italian Americans. There was dehumanizing language and an othering of many groups.

I don't know that Jews were singled out as more of a target of this hatred than other groups, except to say that there was a long history of antisemitism from the Catholic Church and, as we see today, from part of the Christian heritage. That's part of the story we have to deal with. And that was true here.

But what was interesting to me was that antisemitism in America became a much more powerful force only after Jewish immigration increased. When you see large numbers of Eastern European Jews coming here in the 1890s and up to 1920, that antisemitism becomes much more powerful. And that's when you start to see quotas and restrictions, and the overt, explicit antisemitism became much more pervasive when there were more Jews here. 

And Henry Ford, an American icon with an enormous amount of social capital, not to mention actual capital, was a big part of this. We cannot let him off the hook. I had heard of The Protocols of the Elders of Zion and I knew it was a horrendous antisemitic hoax. I didn't appreciate the degree to which Henry Ford was involved in spreading this hoax. He printed this antisemitic filth in his newspapers and published it as a book and promoted it for years. And he had a lot of credibility. And we're seeing that today. You repeat a lie enough times and people start to believe it.

So, what you have in our story on American antisemitism, on the one hand, was more of a Jewish presence, and then you also had a credible, powerful person spreading hateful ideas. Maybe that shouldn't be surprising.

Robin Lindley: And we are still faced with Big Lies, as effectively used by Goebbels and Hitler in Nazi Germany, and now employed here. I was surprised by the story from the series about Catholic gangs in the US who assaulted Jewish people and destroyed their property even as the persecution of Jews was happening in Germany.

Lynn Novick: Yes. Father Charles Coughlin is another powerful antisemitic person to pull into this because he incited Nazi-aligned and proto-fascist organizations of both Catholics and Protestants. And we have a tradition of vigilantism here that's often been directed at other people, specifically African Americans, but other groups as well. So, there's a context for that, that we can't ignore. It’s not happening in a vacuum, but it's certainly true that there were self-appointed vigilantes that went out to try to exact violence on Jewish people in America. It’s devastating. 

Robin Lindley: How aware were most Americans of the oppression of Jews in Germany in the 1930s before the war such as atrocities like the destruction of Jewish property and murders during Kristallnacht, the “Night of Broken Glass,” in 1938?

Lynn Novick: One of the many misunderstandings I had of this history before we worked on this film was that Americans didn't know much about what was happening in Europe, that the Holocaust was supposedly carried out in secret.

I incorrectly thought that the world discovered the atrocities only after the war, after the spring of 1945. But the Holocaust Museum exhibition belied that notion. And then we were able to benefit from that, as well as from a book by historian Deborah Lipstadt—one of our advisors—called Beyond Belief, which showed that the coverage of Nazi oppression and persecution of Jews and other groups was quite significant. There was a lot of coverage in US newspapers, magazines, and on the radio. Reporters were there and they were telling the story of what they saw. It got harder and harder, but Kristallnacht made front-page headlines all around the country and the world.

The presence of killing centers like Auschwitz was a different story. That news didn't come out simultaneously with the killing as it was happening, but it did eventually come out before the war ended.

So, the American public was well aware of what Hitler was saying and of everything the Nazis were carrying out in public, which was horrendous. Before the mass killings, there was a lot of coverage of the condition of Jews, and Americans were understandably very upset about it, but it didn't change American attitudes toward immigration. 

Robin Lindley: I think the fateful 1939 voyage of the Jewish refugees aboard the ship St. Louis encapsulates American attitudes then, just a few months before the war started. What did you learn about this tragic voyage? 

Lynn Novick: The story of the St. Louis was very well known at the time, and maybe less well known now, but certain generations of Americans who were alive during the war or soon after have heard of it. And there was a Hollywood movie called Voyage of the Damned. So, it got some attention.

There are a lot of misconceptions about the voyage. The story didn't come down to us entirely accurately, so we were really grateful to be able to tell it with the benefit of all of the historical records that we could put together.

There was a steady stream of ships across the Atlantic bringing mostly German Jewish and other Jewish refugees to the Americas. You had to get a visa for US entry, and there was a whole legal process to get permission to come here or to Cuba or to some other place. But there were ships going across until the war began, and the St. Louis was one of them. There were 900 something people aboard and they had not been able to get visas to America because that was so difficult, but they had been able to buy visas to go to Cuba, which was a very close ally of the United States at that time—long before Fidel Castro took over. 

For those of us who maybe aren't so clear on Cuban history, Cuba then was almost a satellite country of the United States in some ways, and a lot of refugees thought that if they got to Cuba, they could wait there, and then eventually their visas would come through for the United States and they could come to America. I know several people whose families did do that. In fact, the COO of PBS, Jonathan Barzilay, said that's what his mother did. So this is a common thread. I know some other families that went the Cuba route, and there were already thousands of Jewish refugees in Cuba at this time.

But unfortunately for the people aboard the St. Louis, between the time they bought their visas and the time they got to Havana, Cuban policy changed and the government decided it didn't want to let them in. There were many complicated reasons for that, which I will not get into, but it came down to corruption and internal rivalry between different factions there, and the Cuban leader Batista, who later became dictator, was part of this. Regardless, the refugees were now stranded and the Cubans would not let them off the boat. So, you had this ship full of 900 people who didn't want to go back to Germany and had nowhere else to go.

The media was in Havana harbor for several days trying to figure out the situation. They were telegraphing to people in the US and around the world, and it was quite an international crisis. And a lot of people who witnessed it then and who think about it now hold the United States to account. Why couldn't we just let them in? And of course, potentially we could have, but there was a process and there were other people who were waiting on lists, and these refugees would have jumped the line, and that would have created some problems in the process for admitting immigrants. I'm not saying it couldn't have been done, but they unsuccessfully appealed to President Roosevelt and the State Department. 

For the people on the ship, the situation was just excruciating. In the end, with a Jewish aid organization and some political connections, they were able to get other countries to agree to take them. They raised some money, about half a million dollars. But they had to go back across the Atlantic. They didn't go back to Germany, but ended up in Belgium, England, France, and other countries, and a good portion of them survived the war, but a third of them were killed by the Nazis, and that's a tragic story. 

Robin Lindley: Thanks for that account, Lynn. That’s another grim chapter in this horrific history. I haven't got to the mass murder of Jews and others in the Holocaust yet. I learned from my reading, if I may add, and from your series, that Hitler openly called for extermination of all Jews in Europe by 1941. The mass murder of Jews and others grew out of the Nazi T4 program in 1939 to euthanize “defective people,” the disabled and the criminal, or “life unworthy of life,” as the Nazis put it. That program was the precursor of the mass extermination at death camps during the Holocaust. 

Lynn Novick: Right. 

Robin Lindley: The mass exterminations of the Holocaust began in earnest after the German invasion of the USSR in 1941. SS Einsatzgruppen troops killed thousands of Jewish people in mass shootings or in mobile gas chambers in Eastern Europe and the Soviet Union. And the death camps were operating by 1942 after the Wannsee Conference where Nazi leaders planned the “Final Solution.” And you recount that Rabbi Wise reported to FDR that the Nazis began using lethal Zyklon-B gas at places like Auschwitz to kill thousands of Jews and other prisoners.

And it may surprise many viewers that 4.5 million of the six million Jews killed in the Holocaust were already dead by the autumn of 1943. 

Lynn Novick: I agree with everything you said. That was a very good summary. 

What I would say when we think about America's response is that we didn't have boots on the ground in Europe then. Once the mass killing started, it happened very quickly. The industrial scale of it was chaotic but horribly effective, because it was not that difficult to kill a lot of people quickly if that's what you wanted to do and you were determined to do it.

As this was happening, there wasn't much we could have done to prevent it or to stop it given our military situation on the ground. This killing was happening in parts of Europe that were quite far from anywhere American soldiers would ever get to. We never got deep into Germany really. 

The Germans did some killing in Germany, and the horrible camps that were liberated there at the end of the war were really the remnants of the process. The massive killing happened in what we would now call killing centers, deep in Poland, in places that Americans never got to. For us, it was important to line up the timelines of what happened, and when and where, in thinking about the American response.

The challenge here is that nobody knew what was going to happen. The war started in 1939 and America didn't get involved until 1941. Before 1939 would have been the time to encourage people to get out of the parts of Europe where Hitler was going to go. But that's all that can be said with the benefit of hindsight. 

Robin Lindley: Your series does not shrink from portraying the ghastly and horrific reality of the Holocaust. You present images of the mass killing centers and heartbreaking films of dead and sick and dying prisoners during the liberation of the death camps. 

Lynn Novick: That was important to us in terms of the visual representation of the story, and we chose not to show our viewers much that Americans themselves had not yet seen at that point in the story. Most Americans didn't see any images of what we think of as the Holocaust until the spring of 1945. Auschwitz was liberated in the winter of 1945, and there were some brief mentions of that, but Auschwitz was liberated by Soviet troops.

The true horror was not clear to Americans until Buchenwald and Mauthausen and Bergen-Belsen and other camps were liberated and Western cameras were there. We held off on showing that material until the end of the film. When we're talking about the killing earlier in the film, we show some of the sites where it happened and the memorials there, which we filmed ourselves. 

I think seeing is believing, and we are trying to get the point across that, not just for Americans but probably for people around the world, the mass killing was covered in the print media. Little by little, the realities of the scale of the killing and the scale of the persecution began to accrue, but it wasn't until the images came out that people could truly understand it.

Robin Lindley: And it’s stunning that, even after the war, there were still severe restrictions on refugees to the United States when there were millions of Jews and others who were displaced in the ruins of Europe. 

Lynn Novick: For us, that's one of the saddest parts of the story in a way, because then the US couldn’t say that we didn't know. We couldn’t say that, if we had known, we would have been more welcoming, more generous, because at that point we did know what the people who had survived had been through, what they had lost, why they couldn't go home. They had lost their families. They had lost their possessions. They had lost their livelihoods. All they had left was just the fact that they were alive in many cases. And they had nowhere to go. 

And there weren't that many people left, frankly, who had survived and still Americans were not disposed to make any exceptions to the policy to let them in. Little by little we did let some in, but it was a battle that shouldn't have been, in my opinion. 

Robin Lindley: You share a much more nuanced story about the United States and the Holocaust than most of us have learned. I think viewers will be struck by the extensive research and the riveting story the series presents about how the United States responded to an international crisis brought on by a brutal dictatorship in Europe, and how the response was influenced by our own history. How do you see the resonance for the story today as we continue to struggle with racism and white supremacy, restrictive immigration laws, domestic terrorism, foreign policy challenges, and serious threats to democracy?

Lynn Novick: When we started working on the film or thinking about it, it was 2015, and Barack Obama was still president. Now, that feels like a lifetime ago, frankly, with everything that has happened since.

I wish the film were not so relevant to today. I really am deeply disturbed, and I know that Ken and Sarah and everybody who worked on the series are disturbed by what we see happening all around us, not just in the United States, but around the world.

But if we speak about America, there have been surges of white supremacy, racism, antisemitism, hate speech, and bigotry. It has become mainstreamed and moved from some fringe corner of the far right to “mainstream media” and to the White House under the previous occupant. He and his allies continue to use the rhetoric of hate and dehumanization of immigrants, with racist tropes and fear, to attain the goals they have for the society. The breakdown of social norms, the breakdown of democratic norms, the breakdown of civil society, and the rise of propaganda and lies to serve them is truly frightening.

This film is relevant in so many ways and we wish it weren't, but we're eager to share it with the public for all those reasons. We stopped editing the film last winter when we had to finish it to get it ready for broadcast, but events that could still be relevant have happened since, and there will be more things that happen next week. In other words, it's hard to pin down where an ending to our film should fall, and we stand by where we ended it. But since we made the film, I feel there's still more.

The story continues in ways that are very worrisome. We're grateful to share it and hope that it can contribute in some way to a deeper understanding of the fragility of our democracy, the vulnerability of the institutions that many of us take for granted, and the hard work it takes for every generation to preserve them.

Robin Lindley: I appreciate those heartfelt comments, and the timeliness of the film when our democracy is imperiled. And your expert, Professor Timothy Snyder, has written brilliantly about the fate of democracies and reality of fascism in history. 

Lynn Novick: Indeed. I was listening to another historian earlier who studies the rise of fascism, and he pointed out that, in many cases, or maybe all cases, fascism has emerged in democracies. Countries that have fair elections and an open society and a free press also have stresses, dislocations, insecurities and tensions. That's fertile ground for fascism to rise, and once it rises, all bets are off.

Robin Lindley: Your documentary represents an opportunity for viewers to reflect on the fate of our democracy and our checkered history as these threats again emerge.

Lynn Novick: It's been quite a journey for everyone to work on while this has been happening. It’s been very, very sobering.

Robin Lindley: I really appreciate you bearing with me and sharing the remarkable back story of your powerful new series, Lynn. Congratulations. 

Lynn Novick:  I was just going to say I always enjoy our conversations. I think you are so thoughtful and it's really great to be able to go in depth and explore some of the things that we have tried to explain in the film. Thank you for taking the time to watch it and to have this conversation. 

Robin Lindley: That’s very kind Lynn. It’s a gift for me to talk with you and other bright people who add so much to our understanding of the past and where we are now. Thank you for your brilliant contributions, Lynn, and especially for this revealing and powerful new documentary. I wish every American could view this illuminating and timely film on our history.


Robin Lindley is a Seattle-based attorney, writer, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer's Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service including as a staff attorney in federal agencies and with the US House of Representatives Select Committee on Assassinations. His writing often focuses on the history of human rights, conflict, medicine, art, social justice, and culture. His email: robinlindley@gmail.com.


El Caudillo: Colonialist Violence and the Rise of Francisco Franco




Francisco Franco Bahamonde was a naval officer’s son, who was unable to follow family tradition and go to sea. Instead, he joined the army as a fourteen-year-old military cadet. His high-pitched voice and skinny, short frame saw colleagues give him the nickname of Cerillita, or Little Matchstick (though when he plumped out later in life, they called him Paca la Culona, or Big-Arsed Fanny). Like almost everybody, they underestimated the iron will hidden by his unimpressive aspect. What he lacked in brains and imagination, Franco made up for in ambition and fearlessness. It was enough to make him the most important person in Spain’s bloody and repressive twentieth century.

As a young officer in Morocco, he displayed considerable physical courage time and time again. The consequences were war wounds, a lucky escape from death, rapid promotion and meteoric fame. He arrived in Spain’s narrow sliver of North Africa as a nineteen-year-old second lieutenant in 1912 and left as a thirty-three-year-old general in 1926. ‘This is where the idea of rescuing Spain was born,’ Europe’s youngest general admitted later. ‘Without Africa I can barely explain who I am to myself.’

In 1920, when Spain created its own version of the French Foreign Legion, the Legión Española, with a battle-cry of ‘Long live death!’, he was an obvious choice to be one of its commanders. The Legion mostly recruited Spaniards and, indeed, came to be seen as the supreme expression of a chest-beating Spanish masculinity based on fearlessness, violence and disdain for weakness or ‘introversion’ of any kind. Its members were known as ‘The Bridegrooms of Death’.

The new Spain was born out of extreme notions of colonial pride, violence and martial vigour. It also reflected the small-minded Roman Catholicism of Franco himself, whose idea of religion was based on obedience rather than love. That did not stop him from stamping on Spanish coins that he was dictator ‘por la gracia de Dios’, ‘by God’s will’. In short, Franco was the perfect reactionary. He wanted a return to social cohesion based on fear and the obedience owed to church, landowners, police and those who sacrificed themselves for the fatherland, meaning the army. He was still relatively young – aged forty-three – as well as notoriously cold and difficult to read. ‘Only his eyes show life and cleverness,’ Alfonso XIII’s son and heir Juan de Borbón (whom Franco never allowed to rule), said of him. ‘One is the master of what one does not say, and the slave of what one does,’ Franco himself observed, to explain why he spoke so little. He was a skilled exponent of retranca, a deliberate ambivalence that trips up other people and is reputed to be a mark of people from Galicia.

As a man accustomed to military violence, without empathy and incapable of understanding fear, he felt no compunction about using terror to impose a new, highly personal regime. This was, in some ways, an imitation of the absolutist monarchies of the past, but with a massive increase in violent repression and little of the Enlightenment thought that those regimes encouraged to bring innovation or progress. A sluggish conservatism and a heavy-handed love of ‘order’ stymied early attempts by even those close to him to bring Spain properly into the twentieth century. It did not help that he refused to see himself as a politician, since this was a profession that he despised. Nor was he keen on intellectuals who questioned the supposed virility of his regime.

Miguel de Unamuno had been appalled by anarchist atrocities early in the war and was prepared to give Franco a chance. Within months he discovered his mistake, as friends were detained or shot. In a famous public row with Franco’s former Foreign Legion commander General Millán Astray at Salamanca University late in October 1936, Unamuno denounced the ‘uncivil war’ and warned Franco that military victory alone did not confer moral authority, and that he must convencer, convince, rather than vencer, win. José Millán Astray allegedly shouted: ‘Death to intelligence!’ and Unamuno, once more, was sacked as rector and remained shut up in his house until his death ten weeks later. Another philosopher, José Ortega y Gasset, warned that with Unamuno’s voice now permanently muted, ‘I fear that this country is entering a period of terrible silence.’ He was right.


Excerpted from España: A Brief History of Spain. Used with the permission of the publisher, Bloomsbury. Copyright © 2022 by Giles Tremlett.


Boris Johnson's Legacy? It's Complicated

Boris Johnson signs the Brexit Withdrawal Agreement, January 24, 2020



As Boris Johnson makes his disheveled way out of No. 10 Downing St., it is reasonable to wonder what legacy this historically-minded prime minister leaves posterity.


It is easy to point to the mishandling of the COVID-19 pandemic, the cost-of-living crisis threatened by rising energy prices and the looming prospect of inflation or, more positively, British efforts to support the Ukrainian struggle against Russian aggression. But does this list constitute a legacy, or does it simply enumerate the in-tray Johnson has bequeathed his successor, Liz Truss?


It is the incorporeal aspects of Johnson’s Downing St. tenure that offer a more durable legacy. One in particular stands out: the three-year-long stress test the Johnson administration performed on Britain’s uncodified constitution.


The British constitution is a complicated beast. It has no ur-text or guiding document. Instead, it draws on precedent, whether defined by law or by convention. As a result, Britain is reliant on what historian Peter Hennessy terms the “good chap” theory of government. This premise rests on the assumption that every incoming government will adhere to the conventions, norms and unspoken agreements that guided previous administrations.


But what happens when a new prime minister is, for lack of a better word, a cad?


Johnson began his premiership by suspending parliament for five weeks. This action, known as prorogation, normally bridges the few days between the end of one parliamentary session and the beginning of another. However, the length of the 2019 prorogation alarmed many. The Johnson administration claimed that it needed time to bring forward a new legislative agenda. MPs were skeptical and worried that Johnson was trying to limit scrutiny of Brexit-related legislation. The Supreme Court agreed, ruling the prorogation unlawful.


Hannah White, Acting Director of the Institute for Government, an influential non-partisan thinktank, argues that this attempt to delegitimate parliamentary scrutiny was the most consequential aspect of the Johnson administration. Since 2019, ministers have rushed through legislation, refused amendments and failed to give House of Commons committees sufficient time to evaluate the impact and constitutionality of bills moving through parliament. The result, as White explains, has been the continued strengthening of the executive and the concurrent weakening of parliamentary checks and balances provided by the constitution.


Traditional norms and behaviors failed to trouble the Johnson administration. Johnson’s view of his premiership as a personal mandate that expressed the “will of the people” does not accord with the reality that prime ministers are appointed, not elected. The former prime minister misled parliament on at least 27 occasions and remains under investigation for lying about parties held in Downing St. during lockdown. Johnson resigned following another episode of misdirection, this time regarding his knowledge of sexual assault allegations against a former Deputy Chief Whip.


So what’s the problem? Politicians lie. The British system held. Johnson did not try to avoid his fate by dissolving parliament and calling a general election. His own Conservative MPs forced him from power. Nor, once he lost power, did Johnson incite insurrection. Hordes of Johnson supporters restrained themselves from rampaging through the halls of parliament.


Nonetheless, the precedent set by Johnson is dangerous. His premiership did not just erode the “good chap” theory of government. It also stretched the boundaries of acceptable political behavior, meaning actions once considered beyond the pale are now potentially available to future prime ministers.


Brexit, the political project that brought Johnson to power in the first place, also contains a set of constitutional implications.


Whether one likes or dislikes Brexit—personally, I deplore it—withdrawal from the European Union did return to Britain a sense of constitutional symmetry. As Helen Thompson, a political scientist and former host of the Talking Politics podcast points out, the principle of parliamentary sovereignty anchors the British political system. Unfortunately, this ideal is incompatible with European Union membership, which prioritizes EU over national law. In effect, Brexit healed a constitutional rupture created when Britain joined the European Economic Community in 1973.


But as one breach heals, another emerges. In his book How Britain Ends (2021) Gavin Esler, a former BBC political editor, argues that Brexit has hastened a fracturing of national unity and purpose. Many living in Scotland and Northern Ireland, which voted to remain in the EU by considerable margins, resent being forced out of Europe by an expression of English nationalism. In response, the Scottish government has demanded a second independence referendum. In Northern Ireland, tensions over Brexit have reawakened fears of terrorism. Even in England, Remain and Leave, political identities that emerged during the 2016 referendum, continue to have purchase.


It may be premature to talk about Johnson’s legacy though. In his final speech as prime minister, Johnson expressed his desire to follow the example of Cincinnatus, a Roman statesperson who was appointed dictator by the Senate in 458 BCE. Cincinnatus successfully fought off an invasion by the Aequi, a neighboring Italian people, before immediately retiring to his farm. Twenty years later, Cincinnatus performed a similar action, crushing a plebeian revolt before retiring to his farm once again. What exactly is the former prime minister trying to tell us?



Another Documentarian's Perspective on "The U.S. and the Holocaust"

Jewish refugees from Germany aboard the St. Louis. Nearly all of the 907 passengers were denied entry first to Cuba and then to the United States and Canada in 1939. Returned to Europe, 254 would become victims of the Holocaust.




As the producer and director of a PBS film on America’s response to the Holocaust some years ago, I was at first delighted to learn that Ken Burns has now likewise made a film for broadcast on PBS about how our country responded to the Nazi genocide. But some advance publicity for the broadcast raises questions as to whether his film will accurately portray key issues such as U.S. refugee policy and the failure to bomb Auschwitz.

My film, America and the Holocaust: Deceit and Indifference, was first broadcast in 1994 as part of the PBS history series The American Experience. I have been most gratified that it has become a staple for American history and Holocaust education in many secondary schools around the country. Ensuring that young people learn about these difficult periods in our country’s history is essential to our future as a morally responsible nation.

When I set out to tell the complex and troubling story of our nation’s response to the Holocaust, I believed it would be most effective to chronicle those events through the experience of a single person.

I was fortunate to discover the moving story of Kurt Klein, a German Jew who immigrated to America in 1937 at age 17, and then spent several years struggling against a wall of Roosevelt administration obstacles that stood in the way of rescuing his parents from Nazi Germany. My film examined the profound social, political and economic factors that led the American government, along with much of American society, to turn its back on the plight of the Jews.

America and the Holocaust explored the decisions that President Roosevelt and his State Department made to block news about the growing genocide, as well as to keep Jewish immigration drastically below the legal limits that the existing quota system allowed.

That policy’s result: nearly 200,000 Jews eligible for entry to America, such as Kurt Klein’s parents, were prevented from immigrating and were murdered in the Holocaust.

Not surprisingly, there were some viewers at the time whose fond memories of FDR—a fondness I have always shared—made it difficult for them to accept the president’s disturbing choices.

For filmmakers, one of the most important elements in the process of making a historical documentary is to have as our advisors historians who are experts in the subject material.

I was fortunate to have the late David S. Wyman as my main historical advisor. As the author of the definitive work in this field, The Abandonment of the Jews, Prof. Wyman was able to bring to our collaboration a comprehensive and nuanced appreciation of the historical issues and materials.

For The U.S. and the Holocaust, Ken has worked with writer Geoffrey Ward, his longtime collaborator. I hope they have examined the historical research published in the years since my film came out, and that they have made room in their expansive documentary for some of the uncomfortable truths about FDR, such as his remarks about Jews behind closed doors. That information may help us better understand Roosevelt's decisions concerning Jewish refugees.

Inevitably, portions of The U.S. and the Holocaust will echo the social, political, and economic story we told in 1994 about what America was like during the Roosevelt years and how that shaped the U.S. government’s response to events overseas. The racism, antisemitism, and isolationism of those years—found in both political camps—is by now a well-known story.

But what will merit special scrutiny in the new Ken Burns film is how he presents the key controversies:

Does he attempt to blame “American society”—as if the president was a helpless captive of public opinion?

Does he attempt to blame everything on the State Department—as if that branch made its own foreign policy?

Does he make it seem as if the immigration quotas in themselves were the problem, instead of acknowledging how FDR’s policies kept the quotas vastly unfilled?

Does he convey the impression that bombing the railways leading to Auschwitz was too difficult to accomplish, when we know that U.S. planes bombed railroad lines throughout Europe—with multiple bombing raids on German oil factories in the vicinity of Auschwitz, some less than five miles from the gas chambers?

Like many other Americans, I will be watching closely to see if The U.S. and the Holocaust honestly portrays these issues or fails to confront the difficult truths that need to be faced.


Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183959 https://historynewsnetwork.org/article/183959 0
Gil Coronado: The Padrino of National Hispanic Month

Gil Coronado, photographed during his service as the director of Selective Service during the Clinton administration. 



The role of staff, advisors, and bureaucrats in making laws in Washington, D.C. is often buried in the file cabinets of elected members of U.S. Congress and the Senate. These elected officials, uniquely privileged to introduce legislation, often benefit politically from their policy-making activities. Their political standing is linked to their ability tointroduce and pass legislation. When Colonel Gil Coronado was invited to attend a White House ceremony hosted by President Ronald Reagan in 1988 after the passage of Public Law 100-408, he entered a rare political space that few non-elected Latinos had ever occupied. This is an account of how a significant law came about through a persistent champion of Hispanic heritage.

Every year millions of Americans honor National Hispanic Month, September 15th–October 15th. Among other events, they celebrate the arrival of 86 Latino sailors in three small ships that landed more than five hundred years ago in the Americas. These sailors from Spain and Portugal braved unknown waters and maneuvered the ships guided by a Genoan navigator, Christopher Columbus, across the Atlantic and landed at one of the natural ports of the Caribbean Islands. Their landing marked a new era of exploration and colonization in the history of the world. For years Americans have celebrated Columbus Day in honor of the intrepid Italian navigator Columbus. Latinos who sailed with Columbus were also major contributors to the discovery of what has been described as a “New World.”

Latinos have long celebrated historical moments–such as the establishment of America’s earliest mainland community, Saint Augustine in Florida in 1565, more than fifty years before the landing of the Mayflower. Texas school children read about the amazing eight years of travels [1528-1536] of the Spaniard, Alvaro Nunez Cabeza de Vaca, who crisscrossed Texas with Estevan, a Spanish-African Moor. The history of Latinos in America is deep and of great significance.

National Hispanic Month is especially meaningful to showcase the rich diversity of more than 60 million Latinos, the largest ethnic minority in the United States. The celebrations reveal a range of different cultural traditions. There are distinctions among Latinos in their histories, immigration status, Spanish dialects, food, music, and religion.

A celebrant of National Hispanic Month in San Antonio. Photo by author.


The inspiration for the National Hispanic Month Celebration can be credited to a San Antonio Westside native, Colonel Gil Coronado [Ret.]. In 1968 President Lyndon B. Johnson first issued an annual proclamation designating the week including September 15 and 16 as National Heritage Week. Coronado thought differently– a week was insufficient to pay special tribute to the rich and enduring plethora of Hispanic traditions. Five American presidents from Johnson to Reagan participated in the early Hispanic Week celebrations before it became an extended month-long event. The story of Coronado’s “Padrino” or “Godfather” role in lobbying for the month-long celebration is noteworthy.

Coronado’s quest to add more weeks to National Hispanic Week began in 1985 when the U.S. Air Force assigned him to the Inter-American Defense Board [IADB] in Washington, D.C. The IADB provides the Organization of American States [OAS] with technical and educational advisory military services on issues related to military and defense matters in the Northern-Southern Hemispheres. In Latin America, the OAS is equivalent to NATO. While working with the OAS, Coronado participated in meetings with the Congressional Hispanic Caucus where he became friends with Los Angeles Congressman Estevan Torres, the Chair of the Caucus, and Elvira Castillo, Executive Director of the Caucus.

President Ronald Reagan credited his victory in 1980 to the increased voter participation of Latinos who joined the Republican Party in the late 1970s. Working with marketing guru Lionel Sosa, another San Antonio native, Republicans joined national vendors of food, beer, and household products to declare the 1980s “The Decade of the Hispanics.” The standard explanation describes this decade as a time when the growing Latino population came to national prominence.


The formation of Latina mariachi groups in the southwest is a recent update to this tradition. Photo by author. 


Toward the end of the 1980s, Coronado and members of the Hispanic Caucus debated whether the 80s signified a special Hispanic designation. The improvements over the decade in education, income, and home ownership, for example, were modest at best. Coronado offered that the seven-day celebration of National Hispanic Week was far from sufficient. Many Latinos thought that Hispanics’ annual celebration should model Black History Month, which was officially designated in 1970. Thus, the movement to designate National Hispanic Month began with Coronado and the Congressional Hispanic Caucus.

In 1988, Coronado, assisted by Elvira Castillo of the Hispanic Caucus, prepared the bill to amend National Hispanic Week to National Hispanic Month. California Congressman Estevan Torres asked Coronado to join him on the House floor to witness the co-sponsorship of 218 members of the U.S. Congress, a number sufficient to pass the bill unanimously. With the lead sponsorship of Utah Senator Orlin Hatch, the bill also passed the U.S. Senate with little fanfare. President Reagan signed the new law and invited Coronado to the Rose Garden for a special White House event. At the September 13, 1988 ceremony, Reagan recognized Colonel Coronado “as a stout defender of Hispanic Heritage and the United States of America.”

Coronado retired from the Selective Service in 2001 and returned to his hometown of San Antonio where he currently resides. He is active in the Rotary Club and proud of his alumni status of Lanier High School where he attended until dropping out in the 10th grade to join the Air Force. A traditional vocational high school, Lanier offered him a poor and limited choice between auto mechanics and body and fender repair classes. He wanted something better and decided to leave school during the first semester of the 10th grade.

After altering his birth certificate to make him older and eligible for military service, he showed up at the Air Force recruiting office–he was 15 years old. The recruiters asked him to return in six months when they tested him and gave him a physical. He passed both and they enrolled him in the United States Air Force. He had just turned 16. 

After Basic Training at Lackland Air Force Base, Coronado shipped off to a Louisiana base where he excelled in typing classes. The typing classes led to work in signal and code communication. He also earned a spot on the base basketball team. In the mid-1960s, the Air Force deployed him to Southeast Asia where the United States military forces were engaged in the Vietnam War. When he returned to the states, he held a command position at Lackland Air Force Base. During his thirty-seven years of service in the U.S. Air Force, he was awarded a Legion of Merit, a Bronze Star Medal, the Air Force Commendation Medal, a Meritorious Service Medal with three oak leaf clusters, the Joint Service Commendation Medal, and a Distinguished Presidential Unit Citation.

Coronado’s most memorable assignments included Base Commander of the U.S. Air Force Base in Torrejon, Spain as well as serving as the director of Selective Service under President William Clinton. Coronado is proud of his military service and his opportunity to serve his country. He remarked to me that in the military he was “judged and evaluated due to ability, effectiveness, and proven results and not by a Zip Code." Schools were notorious for tracking working-class Latinos into vocational courses versus academic courses. This October Coronado’s San Antonio School District will honor him for his contribution to Hispanic heritage, a well-deserved honor for a high school dropout.

Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183958 https://historynewsnetwork.org/article/183958 0
How the Constitution Can Accommodate Divergent Values



The Declaration of Independence charged that King George III and parliament had “…subject[ed] us to [British laws] foreign to our constitution…” Yet in 1776 (when the Declaration was written) there was no written American constitution. Why did Jefferson write “our constitution”?

Jefferson was asserting that across the 13 British-American colonies there was a shared belief about how America should be governed.

Eleven years after the Declaration, the Constitutional Convention devised a system of federal and national government, which it set out in the Constitution. That system built on the shared constitutional beliefs and experience of Americans but also allowed for the diversity of other American beliefs and values.

The moral issue which most divided revolutionary America was not abortion: it was the institution of slavery. By the time of the Constitutional Convention in 1787, Massachusetts had abolished slavery through its new Bill of Rights, and other New England States were moving towards abolition. Georgia and South Carolina were adamant that they would not join the Union if slavery or the slave trade were suppressed. Some Virginian slave owners at the Convention spoke against slavery, on moral grounds.

The original Constitution addressed slavery in three places: once when dealing with the allocation of votes and taxes; once when prohibiting a ban on the importation of slaves before 1808; and once when requiring all States to return slaves who had escaped from other States.

By 1776, the British-American political system included: a legislature in each colony, with a chamber elected by most white men who owned property; jury trials, and the application of the common law; low taxes; the limited exercise of British power in the colonies; and a belief that the colonies should be able to grow westward.

The Declaration was electrifying in its assertion of “certain unalienable rights; …among these are life, liberty, and the pursuit of happiness”. How did the Constitution give effect to these rights? 

Not by setting out personal rights. The original Constitution was focused on establishing the machinery of a new system of republican federal and national government. It was this system, rather than the express enumeration of individual rights, which would protect Americans’ liberty and protect them from tyranny.

Both at the Constitutional Convention and during the ratification debates many Americans wanted to follow the practice, which began with the English Bill of Rights of 1689 and was picked up in many of the State Constitutions, of setting out certain individual rights as paramount rights which American governments and law would have to respect. The American Bill of Rights was added to the Constitution almost immediately after ratification.

When founding the nation, American political leaders had to both leverage Americans’ shared political culture and beliefs, and allow for Americans’ differing moral beliefs and values.

In some ways, differing beliefs about the issue of abortion are even more politically difficult than about slavery. Defenders of slavery claimed that it was economically, not morally, necessary. Anti-slavery advocates believed that there was a moral imperative to abolish slavery.

With abortion, pro-lifers believe that abortion is immoral in many circumstances, while pro-choicers believe that it is immoral to prohibit abortion in many circumstances. The difference in perception of the moral issues intensifies the political division.

The possibility of the Supreme Court being activist in constitutional issues would have surprised the founding generation, who knew that differences in values across the 13 original States were a political reality, with slavery being the most obvious example.

Since the Second World War, internationally human rights treaties and constitutional protections have expanded immensely in scope. Some Americans see a role for the Supreme Court in responding to the changed human rights environment by finding rights which were not explicit in the Constitution or the Bill of Rights. For originalists, the express enumeration of rights is a constraint on the Court’s ability to do this.

The question now is the extent to which differences should be resolved through State democratic processes or constrained by the Supreme Court finding individual rights as paramount rights.

The challenge of enabling a single federal/national government in a nation with differing (as well as many shared) convictions about moral issues has been with the nation since its founding.

Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183962 https://historynewsnetwork.org/article/183962 0
The Roundup Top Ten for September 16, 2022

Black Historians Know There Has Never Been Objectivity in Writing the Past

by Keisha N. Blain

"Black historians have long recognized the role of the present in shaping our narratives of the past. We have never had the luxury of writing about the past as though it were divorced from present concerns."


Barbara Ehrenreich Challenged Readers to Examine Themselves

by Gabriel Winant

The journalist and social theorist wrote to force her readers to examine their own positions in society's hierarchies, not to encourage cynicism of futility, but to encourage them to see change as a long haul. 



Today's Book Bans Might be Most Dangerous Yet

by Jonna Perrillo

Today's book banners have broadened their attention from communist themes in textbooks and are attacking young adult literature titles that students are choosing to read, a much more significant intrusion on freedom of thought. 



Historicizing the Legitimacy of LGBTQ History

by Marc Stein

The AHA's newsletters reveal a protracted and frequently bitter debate about the boundaries of the discipline as scholars in the early 1970s worked to establish gay and lesbian people and communities as subjects of study. 



On Climate, the British Monarchy Mortgaged the Planet's Future

by Priya Satia

The monarchy, as a cultural core of the British empire, papered over the separation and alienation among humans resulting from the conversion of the Earth to a set of exploitable commodities. 



The Historical Roots of "Florida Man"

by Julio Capó, Jr. and Tyler Gillespie

The internet meme "Florida Man" signals a caricature of the presumed recklessness and ignorance of the state's population. But these stories have a long history of justifying colonialism and profiteering in the Sunshine State, and stand in the way of progress today. 



Economism as a Red Scare Legacy

by Landon Storrs

An economic historian traces the rise of neoliberal political economy to the post-WWII Red Scare, when Keynesians were driven out of government service under suspicion of disloyalty. 



Mourn the Queen, Not the Empire

by Maya Jasanoff

As the head of the postwar British Commonwealth, the Queen symbolized the effort to put the brakes on the global wave of decolonization, including deadly and secret campaigns of state violence in Northern Ireland, Kenya, and elsewhere.



Black Mississippians Have Been Fighting a Water Crisis for Decades

by Thomas J. Ward Jr.

Black residents of the Mississippi Delta began organizing in 1970 for access to water and sewage services; their struggle continues today. 



Queen Not Innocent of Empire's Sins

by Howard W. French

"I bear no ill will toward her following her death. Her empire—and empires more generally—though is another matter."


Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183955 https://historynewsnetwork.org/article/183955 0
Russians' Disapproval of Gorbachev Shouldn't Dominate How He is Remembered




Gorbachev’s recent death stirs thoughts of his legacy. Although he only ruled over the Soviet Union for six years (1985-1991), there is no doubt that his tenure as head of its Communist Party and president of the USSR had a tremendous impact, and not just in his own country but also in the world in general.


But whether that impact is seen as good or bad varies greatly depending on where you live. In Western countries, including the USA, you’re likely to view him in a more positive way--he played a key role (I would argue the key role) in ending the Cold War and making democracy possible for various countries in Eastern Europe. German ex-chancellor Angela Merkel, for example, “praised him for changing the world for the better and allowing a reunified Germany to join NATO.” If you live in Russia, however, you’re likely to see him in a more negative light. One Russian politician said in 2009, “Gorbachev is the most hated man in Russia.”

To understand why ask yourself how you would feel about a U.S. president who presided over the dismantling of a USA which lost one-quarter of its territory and half its population. And in the decade that followed you experienced a great deal of misery. That’s what happened to Russians, which helps to explain why Gorbachev and his Russian successor in the 1990s, Boris Yeltsin, have been much less popular than previous Russian dictators like Stalin and Brezhnev, or Russia’s only de facto twenty-first century leader, Vladimir Putin. In 2005 the latter said the collapse of the “Soviet Union was the greatest geopolitical catastrophe of the century.”


Illustrating the differing evaluations of Gorbachev depending upon where you live are the reflections of Mansur Mirovalev, who was born in the USSR in the 1970s, but in 2003 came to California to spend two years studying journalism. As a boy, he “firmly believed in the forthcoming Communist utopia,” and after Gorbachev “reshaped everything” and allowed revelations of past Soviet atrocities he “hated him.”

The new Soviet leader’s reforms could be summed up in three Russian words, glasnost (openness), perestroika (restructuring), and demokratizatsiia (democratization). The first meant more freedom of expression and less censorship and government secrecy. But Mirovalev thought it “defamed our heroic, Communist-era past.” He was also critical of the results of perestroika, which primarily meant economic restructuring. “Gorby’s reforms made capitalism possible—an ugly, mutant version of capitalism that relied on corruption. Very few got rich quickly, but inflation was galloping, and food disappeared…. My parents’ savings turned into nothing, and their salaries weren’t enough to buy food, let alone basic clothes.” Nor did Mirovalev think that demokratizatsiia was a positive development. He joined many other Soviet citizens in thinking that Gorbachev’s “transition . . . made lives and careers useless” and increased “joblessness, organized crime, loss of contact with relatives in other ex-Soviet republics, and the emergence of kleptocratic, corrupt governments.”

After leaving California and eventually ending up in Russia, Mirovalev interviewed Gorbachev on several occasions between 2007 and 2013. As a journalist, he realized he had to try to be more objective about the former leader whom he had once hated. He came to realize that Gorbachev “was the only true nemesis to Soviet dictator Joseph Stalin, and that his very attempt to reform the USSR was doomed from the start. But Gorbachev’s failures were more beneficial for the almost 300 million Soviet citizens than any great achievement of his predecessors, because they forced a complete break from one of history’s darkest eras.” Mirovalev ends his 2021 article writing “I love Mikhail Sergeevich Gorbachev.”

Since Gorbachev’s death there have been many evaluations of his legacy. One U. S. source (Professor Erik Loomis) opined, “Gorbachev is probably best described as the greatest failure of a leader in Russian history…. When we evaluate his legacy, we need to do so outside an American context, even if we’re American and we’re glad the Cold War ended.” And the author adds, “We have to see his legacy in the light of how he is perceived in Russia, not just that he helped the US win the Cold War. The bitterness toward him in Russia is very real and I’m not sure the world is better off for his failures.”


In The Atlantic staff writer Anne Applebaum, also author of several books dealing with the Soviet Union, was also critical of Gorbachev, writing about him, “Almost nobody in history has ever had such a profound impact on his era, while at the same time understanding so little about it…. He never understood the depth of the rot inside Soviet bureaucracies or the amorality of the bureaucrats. In the end, he wound up racing to catch up with history, rather than making it himself.”


Another author and staff writer at The Atlantic, Tom Nichols, writes a more mixed evaluation of the former Soviet leader. “As you listen to the tributes, remember always that Gorbachev was trying to rescue, rather than destroy, the U.S.S.R. and Soviet Communism….  He was too decent for a job that required a fundamental lack of decency. In the end, he showed the courage and humanity not to use force to try to turn back the clock—a lesson lost on his latest successor, Vladimir Putin.”


In Foreign Affairs still another author, but also a former State Department expert on Russia and Ukraine, Michael Kimmage, also comments on Gorbachev’s decency, saying it “was exceptional.” The Soviet leader had the “means of mass violence at his… disposal, but for the most part, he chose not to use them against dissident or breakaway movements in the Baltic republics, Czechoslovakia, East Germany, Hungary, and Poland. He did not want to let those Soviet republics and satellite states go,” but he did so rather than crush them. Kimmage also admires the manner in which Gorbachev gave up his own power rather than trying “to summon the military, call for violence in the streets, or seek out loyalists within the KGB to keep him in power by force.”

But, while admiring Gorbachev’s decency, Kimmage thinks “he was a catastrophically bad statesman.” And although he “cannot be blamed for Putin’s terrible wars in Ukraine,” Kimmage does think he bears some responsibility “for a set of circumstances that served as the precondition for Putin’s wars and for the disastrous state of Russia’s relations with the West.”  

Also seeing both the positive and negative in Gorbachev’s six years of rule, was Ronald Suny, one of the U. S.’s most distinguished historians of the Soviet period. He concludes, “Gorbachev tried to do too much, too fast without the resources to achieve his goals. By 1990 his own weakness and indecision had derailed the revolution from above. A great emancipator, Gorbachev left a mixed legacy. He expanded freedom for millions but at the same time unleashed roiling waves of nationalism and left the upturned soil for renewed authoritarianism.”

Perhaps most favorable toward Gorbachev is Katrina vanden Heuvel, editor and publisher of The Nation. She and her now deceased husband, Stephen Cohen, who was a leading U.S. historian of Russia, were personal friends of Gorbachev, and she writes: “Many Russians still revile him; they blame him (along with Yeltsin) for destroying the Soviet Union and for the economic and social misery that followed. Other Russians, however, view him, as I do, as a leader of vision and courage. If democracy eventually returns to Russia, Gorbachev will (and should) be remembered as the greatest reformer in that nation’s tormented history.”

She fondly remembers, partly from personal experience in Russia, how “step by step, from 1985 to 1991, the mechanisms and taboos of censorship were dismantled, Here, too, the result was astonishing: virtual freedom of the press, both print and broadcast, at least in the national media.” In those years I was also in the Soviet Union, though only for a few weeks almost every summer. It was exciting to see the greater freedoms evidenced there every year under Gorbachev’s leadership, more freedom for the press, for demonstrators, for academicians, and for nationalities like the Ukrainians, Armenians, and Estonians. At the same time, Gorbachev was helping end the Cold War.

My own evaluation of Gorbachev’s rule has not changed much since the first edition of my A History of Russia (2 vols.) was published twenty-five years ago. In it (and in the second edition), I end my chapter on Gorbachev’s domestic and foreign policies with a mixed review.

He was unable to hold his country together…. partly because of his own failings. After a year or two in power, he became increasingly impatient to institute changes in both domestic and foreign policy, and he sometimes failed to think through the consequences of policies before he enacted them.


… Despite his charismatic appeal in the West, he increasingly was unable to inspire the Soviet peoples with his vision of the future, partly because it was somewhat murky, always changing and evolving….


Yet, given the difficulties he faced (including reactionary resistance and ambitious politicians such as Yeltsin), decades of Communist rule and misrule, and the natural aspirations of many non-Russians for national independence, it is hardly surprising that he failed to hold the Soviet Union together. In many other ways, however, he did not fail. In addition to being mainly responsible for ending the Cold War, he set in motion changes that helped bring greater freedom to many peoples in the world, especially in east-central Europe, Russia, and many other parts of the former USSR.


Although all the death and destruction in Ukraine, a war fought between the two most significant parts of the old USSR, may affect our judgments about the breakup of the Soviet Union, the above cited analysis still seems valid.

By the way, in 2015 Gorbachev said "I am, after all, half Ukrainian. My mother was Ukrainian, and my [deceased] wife, Raisa, was too. I spoke my very first words in Ukrainian, and the first songs I heard were Ukrainian."


As a politician Gorbachev certainly had his limitations, and perhaps many in the West fail to stress these enough. In judging him we should certainly take the opinions of Russians into consideration, but they should not regulate our final evaluation of him or his years in power. In the mid-1930s many Germans approved of Hitler. Should the Russian’s judgment--ranking Gorbachev lower than Stalin, Brezhnev, and Putin--be ours? Rating him so low reflects too much emphasis on state and national power and too little appreciation for values such as freedom, democracy, and peace.

Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183901 https://historynewsnetwork.org/article/183901 0
Songs for Sale: Tin Pan Alley (Excerpt)

"Tin Pan Alley" Publishing Houses, W. 28th St., Manhattan, 1910



Ragtime’s rise to national, then international, prominence took a full decade. The first decade of the American century would see the creation and rapid rise of the music industry, which, once in place, both made ragtime a phenomenon and broke the spirits of its innovators. Recorded music needed to be marketed and sold, and songs needed to be written in order to be recorded. Lower Manhattan didn’t lack for a “can do” spirit.


In the 1900s dozens of would-be writers saw Charles K. Harris rolling around in “After the Ball” dollars and, hoping lightning would strike Lower Manhattan again, bought themselves a little office space on West 28th Street, between Broadway and Sixth Avenue. It became a warren of songwriters’ offices that was soon nicknamed “Tin Pan Alley” on account of the noise issuing from multiple bashed pianos, not to mention the wastepaper bins – filling up with abandoned songs – that were being kicked in frustration.[i]


Art and commerce were interchangeable on Tin Pan Alley. “Meet Me in St Louis” was written as an advert for the Louisiana Purchase Exposition, otherwise known as the St Louis World’s Fair, in 1904; it would become a hit all over again in 1944 as the theme to one of Judy Garland’s best-loved films. The very soul of cockney London, Florrie Forde’s “Down at the Old Bull and Bush” (1903) was actually American in origin: “Here’s the little German band, just let me hold your hand” was a lyrical clue. It had been written by Harry Von Tilzer, whose real name was Harry Gumm; his mother’s maiden name was Tilzer, and he’d added “Von” for a bit of Tin Pan Alley class. The song was an ad for Budweiser, brewed by Anheuser-Busch – you can imagine the original jingle. Von Tilzer also gave us the boisterous cheeriness of “Wait ’Til the Sun Shines, Nellie,” first recorded in 1905 by minstrel singer Byron G. Harlan, and fifty years later by Buddy Holly.


The era’s biggest American hits, emanating from the Alley – as yet untouched by Missouri’s ragtime – were largely lachrymose stuff. The sentimental “In the Shade of the Old Apple Tree,” written by one Egbert Van Alstyne, was recorded straight by the Peerless Quartet and Henry Burr in 1904, but was so sappy that it was almost immediately parodied, nearly as sappily, by Billy Murray (“I climbed up the old apple tree, ’cos a pie was the real thing to me”). More blubby yet was 1906’s “My Gal Sal,” a last flourish from Paul Dresser, the writer of “On the Banks of the Wabash,” who cried every time he sang one of his own songs.


The portly Dresser, in his cups, was legendarily generous – to his author brother Theodore Dreiser, to the homeless of New York – and gave all of his songwriting royalties away. He died in 1906, aged forty-eight, and so never lived to see himself played on screen by Victor Mature – in the 1942 biopic My Gal Sal – nor to delight in the fact that his on-screen persona would cavort with Rita Hayworth (the film remains one of Hollywood’s most complete rewrites of history – half of the songs in the film were written by Leo Robin rather than Dresser).


The biggest home-grown name, the most celebrated American composer of the decade, wasn’t really American at all. Victor Herbert had been born in Dublin in 1859 and moved to the States in the early 1890s; by 1898 he had his first operetta on Broadway, The Fortune Teller, featuring “Gypsy Jan,” “Romany Life” and “Slumber On, My Little Gypsy Sweetheart” – telltale titles that gave away his Viennese inspiration. New York remained largely immune to the charms of American music. What it needed was some pride, some self-mythologizing, and the person to do that was a smug-looking man in a straw boater called George M. Cohan.


Cohan became the first undisputed king of Broadway with a batch of songs he wrote in his mid-twenties, between 1904 and 1906, and a two-pronged attack that set his work apart from Herbert’s light operas and home-grown ballads like the Haydn Quartet’s 1905 recording “In the Shade of the Old Apple Tree.” First, he was heavily patriotic – he was all about the New World. Secondly, he mythologized Broadway as a place of glamour (“Give My Regards to Broadway”). No hokum about apple trees; it was all city slicker sentiments and love for the new century. Cohan had been born on July 4, 1878, which entitled him to a certain amount of loud-mouth chauvinism; in 1904 he wrote the most patriotic pop song of the lot, “The Yankee Doodle Boy.” He reacted to critical reviews of his work with a sharp “So long as they mention my name.” Along what lines did he write his plays, one critic asked. “Mainly on the New York, New Haven and Hartford line.” He’s recognizably modern and even has a statue in Times Square, so why isn’t Cohan’s name better remembered? Well, he damaged his legacy by singing his own songs, which wasn’t a great idea, given his inability to stay in tune. Still, it makes for an entertaining listen today: 1911’s “Life’s a Very Funny Proposition” suggests the odd rising and falling cadence of Bob Dylan, only sung in a half-Scottish, half-French accent.




As American songwriters like George M. Cohan began to create American theatre music free of any debt to Vienna or Gilbert and Sullivan, and Will Marion Cook introduced ragtime rhythms to Broadway with his 1903 show In Dahomey, so the gramophone was reinvented for the burgeoning American age in the shape of the Victrola. Thomas Edison himself had thought that any use of the gramophone beyond dictation was in the realms of novelty, and he had a point: it recorded the human voice much better than it did the violin; for any other use it was a squeaky mechanical toy. Talking Machine World was under no illusions and wrote that “the high-brow element professed to find nothing of merit in the talking machine.” The piano, on the other hand, continued to be a source of spiritual succour beyond the Victorian age. It took the business savvy of Eldridge Johnson of the Victor Talking Machine Company to make the gramophone an equally acceptable and desirable piece of household furniture in the Edwardian age.


Johnson invented several things which any record collector or twenty-first-century vinyl obsessive would be familiar with today: a straight tone arm, a recess in the middle of the disc on which you could place a paper label, and a box under the turntable in which all of the mechanical parts were neatly contained. This was his new record player, and in 1906 it went on the market as the Victrola. It came in a four-foot-tall mahogany cabinet; Edison’s machines looked like industrial lathes by comparison. Soon, President Taft had a Victrola in the White House, and Johnson milked this news for all it was worth, using photos of Taft in his sales literature.



[i] The first music publisher to move to the block was the successful M. Witmark and Sons – Isidore, Julius and Jay – who moved uptown from 14th Street to 49–51 West 28th Street in 1893. Others soon moved into close proximity: Paul Dresser and Harry von Tilzer from Indiana; and Charles K. Harris from Milwaukee, who had written the schmaltzy but wildly successful “After the Ball” in 1893. By 1900 West 28th Street had the largest concentration of popular-music publishers in the US. A chance hit and a couple of hundred dollars could secure you an office. Tin Pan Alley quickly became so effective at the publication and distribution of sheet music that publishers in other American cities were marginalised.


Excerpted from Chapter 3 of Let's Do It: The Birth of Pop Music: A History, with permission of Pegasus Books. 

https://historynewsnetwork.org/article/183903
The Authoritarian Personality and the Rising Far Right




If nothing else, the Republican Party has steadily removed any doubt that it has embraced an extremism that threatens the future of American democracy. We can clearly perceive the imminent danger that the Party poses in several crucial ways, perhaps the most salient being the refusal of candidates to concede when they lose an election (as we might have expected, this has metastasized beyond Trump: Kari Lake, the Republican nominee for governor of Arizona, boasted in a debate that she would not concede if she lost). This is part of a broader disregard for the rule of law itself, which also manifests in the readiness to use, or at a minimum threaten, political violence. We should underscore the disowning of incontrovertible facts as well, and a distinct susceptibility to propaganda and outlandish, bizarre, even mystical explanations congenial to their agenda.  


The January 6th House Select Committee laid out in stunning detail the culpability of Donald Trump and his henchmen in fostering the conditions that have made it possible for so many millions of Americans to openly avow a right-wing extremism that contradicts the principles of a democratic republic. What we are witnessing in this country is the rise of fascism, the potential for which most Americans, until now, would have denied. If American democracy is to survive, we cannot afford to deny it any longer.


Which means that it would behoove us to examine not only what occurred to get us to this point, and what can be done legislatively and otherwise to buttress our democracy and shield it from the assault of authoritarian extremism, but to also understand what the qualities and characteristics are of a potentially fascistic individual. In short, the question we consider here is, what defines the authoritarian character? It is a question of political psychology, and it is one that must be answered and probed if we want to identify who is potentially fascistic, what makes them so, how prevalent such individuals are within the American body politic, and how to respond effectively to extremist propaganda, which seeks to dismantle the psychological impediments to openly avowing a fascist ideology.   


In 1950, a team of social scientists, including the philosopher Theodor Adorno, conducted an empirical study, later published as The Authoritarian Personality, which addressed this set of problems: “If a potentially fascistic individual exists, what, precisely, is he like? What goes to make up antidemocratic thought? What are the organizing forces within the person?... What have been the determinants and what is the course of his development?” As Adorno stated, these were the questions upon which the study was “designed to throw some light” – and their findings could hardly be more timely or lay a greater claim on our attention.


There were two parts to the study: the first consisted of detailed questionnaires that were distributed to all the subjects to be completed anonymously; the second involved extensive interviews with the individual subjects. The questionnaires were designed to allow the researchers to determine the subjects’ political or ideological convictions and to rank them along various scales, including one referred to as the Fascism or F-scale, perhaps “the most innovative and significant feature” of the study. Subjects were given statements and asked to rank their answers on a scale ranging from strong agreement to strong disagreement.


One group of statements was designed to measure subjects’ authoritarian submissiveness; they were formulated in such a way that, while avoiding direct references to dictatorship, “agreement with them would indicate not merely a realistic, balanced respect for valid authority but an exaggerated, all out emotional need to submit.” For example: “What this country needs is fewer laws and agencies, and more courageous, tireless, devoted leaders whom the people can put their faith in.” Submissiveness was only one component of the authoritarian personality, however – the masochistic, as it were. No less important was authoritarian aggressiveness, i.e., the sadistic component. This aspect is especially important if we are to understand how January 6 was possible, including how it is that a mob could erect a gallows and storm the US Capitol with chants of “Hang Mike Pence” ringing in the air. Consider the observation, which the authors might have written in response to the attack:


Once the individual has convinced himself that there are people who ought to be punished, he is provided with a channel through which his deepest aggressive impulses may be expressed, even while he thinks of himself as thoroughly moral. If his external authorities, or the crowd, lend their approval to this form of aggression, then it may take the most violent forms, and it may persist after the conventional values, in the name of which it was undertaken, have been lost from sight. (Italics mine)


The F-scale also measured what the researchers refer to as superstition, or a belief in mystical forces, including “fantastical external determinants…” indicating “a tendency to shift responsibility from within the individual onto outside forces beyond one’s control… making the individual’s fate dependent on… fantastic factors.” In this regard, the questionnaire offered statements such as: “It is more than a remarkable coincidence that Japan had an earthquake on Pearl Harbor Day, December 7, 1944”; and “Every person should have a deep faith in some supernatural force higher than himself to which he gives total allegiance and whose decisions he does not question” – this latter statement being cross-indicative of submissiveness as well. This feature of the authoritarian personality is one that has become all too evident today with the millions of QAnon supporters of Trump, who openly and consistently endorses their outlandish claims, apocalypticism, and conspiracy theories. The New York Times reports that Trump recently shared a post that “included ‘the storm,’ which followers of QAnon… use to describe the day when their enemies will be violently punished.” Take note of how their “Storm” prophecy invokes authoritarian submission, aggression, and superstition.


Yet another crucial variable was power and “toughness,” or a penchant to identify with strong figures of authority who dominate the weak, “a disposition to view all relations among people in terms of such categories as strong-weak, dominant-submissive, leader-follower…” The statements that subjects were asked to evaluate included: “Too many people today are living in an unnatural, soft way; we should return to the fundamentals, to a more red-blooded, active way of life”; as well as “There are some activities so flagrantly un-American that, when responsible officials won’t take the proper steps, the wide-awake citizen should take the law into his own hands.” This latter statement could have been repeated word for word by any number of the hundreds of Americans who joined the January 6 insurrection. The researchers pointedly observe: “The individual whom we expected to score high on this cluster readily identifies with the ‘little people,’ or ‘the average,’ but he does so, it seems, with little or no humility, and seems actually to think of himself as strong or to believe that he can somehow become so.”


It is unsurprising that we find this same “power complex” in Trump, who has crafted a public image of toughness and repeatedly confirmed his admiration for strongmen, including Hitler, Putin, and Kim Jong Un. According to a new book on the Trump administration – The Divider: Trump in the White House, 2017-2021, by Peter Baker and Susan Glasser – the former president asked his chief of staff, John Kelly: “Why can’t you be like the German generals?” When Kelly informed the president that Germany’s generals had “tried to kill Hitler three times and almost pulled it off,” Trump was immediately dismissive: “No, no, no, they were totally loyal to him.” Baker and Glasser observe that “In [Trump’s] version of history, the generals of the Third Reich had been completely subservient to Hitler; this was the model he wanted for his military.” Hence, with respect to power/toughness, the researchers point out: “We should expect that both leaders and followers will score high on this variable, for the reason that the actual role of the individual seems to be less important than his concern that leader-follower relations shall obtain.” 


The antidemocratic individual harbors intense underlying aggressive impulses: that is, “undifferentiated aggressiveness” that could “easily, by means of propaganda, be directed against minority groups, or against any group the persecution of which was politically profitable.” We could expect, then, that high scorers on the F-scale would agree with statements such as: “America is getting so far from the true American way of life that force may be necessary to restore it” – precisely the kinds of statements we commonly hear among MAGA Republicans. The individual generally needs to find an outlet for this objectless, non-moralized aggression, and typically does so “through displacement onto outgroups leading to moral indignation and authoritarian aggression.” It is worth reminding ourselves of the antisemitic chant of the fascists during the “Unite the Right” rally in Charlottesville, Virginia, on August 12, 2017 – “The Jews will not replace us” – and Trump’s readiness to throw his followers a little red meat with his statement three days later that there were “some very fine people on both sides,” implicitly condoning antisemitic aggression.


The F-scale also included statements that were concerned with the mechanism of projection, in keeping with the theory that the “suppressed impulses of the authoritarian character tend to be projected onto other people who are then blamed out of hand.” This projectivity is often expressed through a moralizing over-concern with others’ sexual activity. One statement that subjects were presented with read: “The sexual orgies of the old Greeks and Romans are nursery school stuff compared to some of the goings-on in this country today, even in circles where people might least expect it.” This inclination to punish violators of sexual norms still looms large among the ever-more mainstream QAnon, which “picked up where Pizzagate left off.”  Indeed, QAnon’s followers tenaciously cling to fantastical ideas about Satan-worshiping Democratic pedophiles that Trump alone will defeat. One can trace the origin of these ideas to a much older conspiracy theory – namely, that of the “blood libel,” a medieval, antisemitic canard that Jewish people stole and murdered Christian children, using their blood to make matzah. There is a close connection among these variables, including that of antisemitism and prejudice, which was one of the key areas of focus for Adorno and his colleagues.


The rise of right-wing extremism is sounding the death knell of what used to be mainstream American conservatism. That current Republican leaders see fit to assault the FBI, historically the country’s most conservative law enforcement agency, simply to protect Trump’s criminal behavior, is just one example of how they have abandoned conservatism in any meaningful sense of the word. Genuine conservatism can at least be credited with supporting the rule of law, and according to Adorno, the “unqualified rejection of antiminority prejudices.” The point here is that “all fascist movements” utilize what Adorno refers to as pseudoconservatism – that is, they “officially employ traditional ideas and values but actually give them an entirely different, anti-humanistic meaning.” Step one in guarding against the destruction of American democracy is knowing its enemy. We must identify the pseudoconservative for what he or she is – one “who, in the name of upholding traditional American values and institutions and defending them against more or less fictitious dangers, consciously or unconsciously aims at their abolition.”


Trump’s words to the rioters who stormed the Capitol were, “We love you. You’re very special” – simply one of countless examples demonstrating that he is precisely that kind of leader “under whose tinseled aegis license becomes law, secret and primitive desires become virtuous ambitions readily attained, and compulsive behavior formerly deemed punishable becomes the order of the day.” Liz Cheney’s observation that Trump “weaponized the patriotism” of his followers makes it sound as if they were simply deceived by an unscrupulous leader. Unfortunately, this masks a more unpleasant reality that must be squarely faced – namely, that Trumpism deliberately taps into the suppressed impulses of the authoritarian character, undermining the psychological barriers that impede the personal and social acceptance of authoritarian and even fascist dogma.


Sam Ben-Meir is an assistant adjunct professor of philosophy at City University of New York, College of Technology.

https://historynewsnetwork.org/article/183904
Arena Rockin' The Vote?



Tune in to any Classic Rock radio station, or click on a Spotify playlist of rock 'n' roll anthems, and whose songs will you hear?  A lot of straight, white, and often angry males:  AC/DC.  Lynyrd Skynyrd.  Bruce Springsteen.  Creedence Clearwater Revival.  Black Sabbath.  ZZ Top.  Guns n' Roses.  Bob Seger.  Grand Funk Railroad.  George Thorogood.  Metallica.  Aerosmith.  Who are they mostly singing to, both on their original records and as latter-day legends?  A lot of straight, white, and often angry males, most of us once party-hearty teenagers who've since become middle-aged working stiffs.  Why should this particular set of artists and audiences matter today?  Because this is the demographic now held to be a key voting bloc in many places, credited with or blamed for Donald Trump, Brexit, trucker blockades, and anti-vaccine rallies.  From Sweet Home Alabama to the Highway to Hell, popular music and populism have significantly overlapped.

The connection seldom gets noticed.  While the country sound of Nashville has long been associated with red-state politics in the US, the more varied - and commercially far more lucrative - genre of rock has had its own influence on a working- or middle-class base in many countries.  It isn't so much that performers have actively backed populist causes, although Bernie Sanders used Springsteen's "We Take Care of Our Own" as a campaign theme and Skynyrd played private gigs for Republican conventioneers in 2016.  Rather, it's that several generations of listeners have conflated the loud 'n' proud rebellion of electric guitar riffs with their private ideals of integrity, resilience, and rugged individualism.  Call it arena rock, call it Rust Belt rock, call it blue-collar boogie, call it Golden Oldies for the Boomer and Gen X cohorts, but don't underestimate its impact.  While we usually think of rock 'n' roll as the soundtrack of juvenile delinquency in the 1950s, student protest in the 1960s, and punkish irreverence in later decades, some of the medium's most vital expressions may have really been the Silent Majority's way of making some noise.  

Consider acts like CCR, the meat-and-potatoes quartet whose visual and sonic style, and class-conscious hits like "Fortunate Son" and "Proud Mary," were defiant rejoinders to the self-indulgent utopianism of their psychedelic competitors;  Lynyrd Skynyrd, the unrepentant American Southerners who regularly played their immortal hymns to rural independence, "Freebird" and "Simple Man," before a huge Confederate flag; Australia's AC/DC, blaring brazenly toxic masculinity avant la lettre, in "Dirty Deeds Done Dirt Cheap," "Have a Drink On Me," "You Shook Me All Night Long," and a barroom jukebox of other outrages; heavy metal heroes Judas Priest and Iron Maiden, offering cathartic fantasies of power and freedom to underemployed youth around the world, with "Breaking the Law" and "Run to the Hills"; and superstars John Mellencamp, Bob Seger, and Bruce Springsteen, articulating the hopes and fears of a transforming global economy's newly vulnerable farmers and factory workers, via the multimillion-selling albums Scarecrow, Against the Wind, and Born In the USA.  It was all mass-marketed entertainment, of course, but the continued fame of the musicians, and the enduring appeal of the music, tells us something about what really mattered - and still matters - to their constituencies.  People who first encountered the songs in 1969 or 1975 or 1986 have grown up and grown old absorbing messages of group solidarity and personal honor, socially tolerant but sternly ethical, apolitical but anti-elitist, delivered by icons of nonconformity, charisma, and cool.  The philosophy of rock 'n' roll may have drifted rightward over its seventy years, but the right has just as surely been rocked.

As celebrities, rock stars also anticipated the distrust of authority and expertise which we hear in right-wing reaction internationally.  Often disparaged by music journalists in their day, some artists defended themselves and their followers in language that seems oddly prescient now.  "The rock press was always attracted to the Talking Heads, Television, the Ramones, the New York Dolls, the Sex Pistols – bands who couldn’t sell out a stadium or even an arena," griped Gene Simmons of the greasepainted foursome Kiss. "There is a side to that media completely devoid of connection to the people who make up most of the rock audience.”   Likewise, AC/DC guitarist Angus Young recalled the band's first tour of the US in 1977, "What was real strange was that although the media was pushing this really soft music, you'd get amazing numbers of people turning out to hear the harder stuff."  Judas Priest singer Rob Halford had his own complaint:  "You get narrow-minded critics reviewing the shows, and all they think about heavy metal is that it is just total ear-splitting, blood-curdling noise without any definition or point.  This is a very, very professional style of music.  It means a great deal to many millions of people.  We treat heavy metal with respect." 

In 1970 rock promoter Frank Barsalona noted that critical approval wasn't important to the crowds flocking to see Grand Funk Railroad.  "They have been put down repeatedly by the underground press but the younger kids are not letting the press tell them what to like," he said.  And Ted Nugent, the Motor City Madman - nowadays a conservative provocateur best known for his leering 1970s successes like "Cat Scratch Fever" and the libertarian "Stormtroopin'" - summed up his views on the Detroit-based rock magazine Creem, declaring,  "Most of its so-called writers provided me with constant fortification to my concrete understanding of how transparent, criminal, and chimplike the hippie lifestyle truly is."  This disconnect between sanctioned opinion and popular sentiment, which didn't seem to matter much when it was confined to the record shops, has lately migrated to the ballot box, where it's made an historic difference.

Granted, it's difficult to trace a connection from any subcategory of pop culture directly to political insurrection, from "We're An American Band" to "Make America Great Again."  But the vast populations who've matured from alienated adolescents to disaffected citizens, all the time raising fists and banging heads to heavily amplified songs about macho bluster, grassroots defiance, and stubborn allegiance to community and class, comprise much of today's populist movement, and they first gained their sense of themselves as freedom-loving challengers of an arrogant and out-of-touch establishment through a catalogue of hard-rockin' Greatest Hits. 

Newer formats of hip-hop or K-pop could themselves be heralds of nascent social uprisings somewhere else on the spectrum; the confirmed sales and committed partisans of denim-clad, blues-based, guitar-playing ensembles and soloists, however, make their import obvious.  Even at its economic peak, the music industry was never a monolith, yet in terms of tickets sold and units shipped, Back In Black, Born to Run, and the massed ranks of the Kiss Army enjoy a collective resonance that drowns out comparable claims for Lizzo, Harry Styles, and BTS.  Whatever you think of the music, that so many have identified so deeply with it for so long bears reflection.  If the Battle of Waterloo was won on the playing fields of Eton, the electoral surges of the last ten years - and perhaps of the next ten as well - may have been generated in the concert halls of Cleveland. 

https://historynewsnetwork.org/article/183906
Inflation Opened the Door to American Neoliberalism



In America, it was inflation that opened the door to Milton Friedman’s neoliberalism.

Inflation is usually caused by one of two things: international devaluation or internal dilution of a country’s currency, or widespread shortages of essential commodities that drive up prices enough to echo through the entire economy.

The early 1970s got both, one deliberately and the other as the result of war.

Between 1971 and 1973, President Nixon pulled the United States out of the Bretton Woods economic framework that had been put together after World War II to stabilize the world’s currencies and balance trade. The dollar had been pegged to gold at $35 an ounce, and the world’s other currencies were effectively pegged to the dollar.

But the United States couldn’t buy enough gold to support the number of dollars we needed as our economy grew, so on August 15, 1971, Nixon announced to the nation and the world that he was taking the dollar off the gold standard and putting a 10 percent tariff on most imports of finished goods into the US to deal with the changes in the dollar’s value relative to other currencies.

The immediate result was that the value of the dollar rose as the world breathed a sigh of relief that the “gold crisis” was coming to an end and the dollar would become more portable. But an increased value in the dollar relative to other currencies meant that products manufactured in the US became more expensive overseas, hurting our exports.

At that time, there were 60,000 more factories in the US than today, and Walmart was advertising that everything in their stores was “Made in the USA”: exports were an important part of our economy, and imports were mostly raw materials or “exotic” goods not produced here, like sandalwood from Thailand or French wines.

To deal with the “strong dollar” problem, Nixon announced in December 1971 that the US was devaluing our currency relative to the Japanese yen, German mark, and British pound (among others) by 11 percent. It was the first-ever negotiated realignment of the world’s major currencies, and Nixon crowed that it was “the greatest monetary agreement in the history of the world.”

But we were still importing more and more goods from overseas, particularly cars from Japan, increasing our trade deficit and hurting American jobs that manufactured goods like cars that competed with the Japanese and the Germans. So in the second week of February 1973, Nixon did it again, negotiating a further devaluation of the dollar by 10 percent.

While devaluing the dollar against other currencies didn’t have much immediate impact on products grown or made in the United States from US raw materials, it did mean that the prices of imports (including oil, which was the primary energy supply for pretty much everything in America) went up.

Over the next decade, the impact of that devaluation would work its way through the American economy in the form of a mild inflation, which Nixon thought could be easily controlled by Fed monetary policy.

What he hadn’t figured on, though, was the 1973 Arab-Israeli War. Because America took Israel’s side in the war, the Arab states cut off their supply of oil to the US in October 1973. As the State Department’s history of the time notes, “The price of oil per barrel first doubled, then quadrupled, imposing skyrocketing costs on consumers and structural challenges to the stability of whole national economies.”

Everything in America depended on oil, from manufacturing fertilizer to powering tractors, from lighting up cities to moving cars and trucks down the highway, from heating homes to powering factories. As a result, the price of everything went up: it was a classic supply-shock-driven inflation.

The war ended on January 19, 1974, and the Arab nations lifted their embargo on US oil in March of that year. Between two devaluations and the explosion in oil prices, inflation in the US was running red-hot by the mid-1970s, and it would take about a decade for it to be wrung out of our economy through Fed actions and normal readjustments in the international and domestic marketplace.

But Americans were furious. The price of pretty much everything was up by 10 percent or more, and wages weren’t keeping pace. Strikes started to roil the economy as Nixon was implicated in covering up the break-in at the Democratic National Committee’s headquarters in the Watergate complex. Nixon left office and Gerald Ford became our president, launching his campaign to stabilize the dollar with a nationally televised speech on October 8, 1974.

Ford’s program included a temporary 5 percent increase in top-end income taxes, cuts to federal spending, and “the creation of a voluntary inflation-fighting organization, named ‘Whip Inflation Now’ (WIN).” The inflation rate in 1974 peaked at 12.3 percent, and home mortgage rates were going through the roof.

WIN became a joke, inflation persisted and got worse as we became locked into a wage-price spiral (particularly after Nixon’s wage-price controls ended), and President Ford was replaced by President Jimmy Carter in the election of 1976.

But inflation persisted as the realignment of the US dollar and the price of oil was forcing a market response to the value of the dollar. (An x percent annual inflation rate means, practically speaking, that the dollar has lost x percent of its value that year.)

The inflation rates for 1977, 1978, 1979, and 1980 were, respectively, 6.7 percent, 9.0 percent, 13.3 percent, and 12.5 percent.
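Those annual rates compound. As a quick illustrative sketch (not part of the original text), multiplying the four price-level increases together shows what happened to a dollar held from the start of 1977 to the end of 1980:

```python
# Compound the four annual inflation rates quoted above (1977-1980).
rates = {1977: 0.067, 1978: 0.090, 1979: 0.133, 1980: 0.125}

price_level = 1.0  # price index, start of 1977 = 1.0
for year in sorted(rates):
    price_level *= 1 + rates[year]

overall_rise = price_level - 1      # cumulative price increase over the four years
purchasing_power = 1 / price_level  # what a start-of-1977 dollar buys at the end of 1980

print(f"Prices up {overall_rise:.1%} over four years")         # about 48%
print(f"The dollar kept {purchasing_power:.1%} of its value")  # about 67%
```

This also shows why the shorthand about the dollar “losing” x percent of its value is only approximate: a 13.3 percent inflation year strictly costs the dollar 13.3/113.3, or about 11.7 percent, of its purchasing power, though for everyday purposes the shorthand is close enough.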

In 1979, Margaret Thatcher came to power in the United Kingdom and, advised by neoliberals at the Institute of Economic Affairs (IEA), a UK-based private think tank, began a massive program of crushing that country’s labor unions while privatizing as much of the country’s infrastructure as she could, up to and including British Airways and British Rail.

She appointed Geoffrey Howe, a member of the Mont Pelerin Society and friend of Milton Friedman’s, as her chancellor of the exchequer (like the American secretary of the Treasury) to run the British economy. Friedman, crowing about his own influence on Howe and the IEA’s founder, Sir Antony Fisher, wrote, “The U-turn in British policy executed by Margaret Thatcher owes more to him (i.e., Fisher) than any other individual.”

The ideas of neoliberalism had, by this time, spread across the world, and Thatcher’s UK was getting international applause for being the world’s first major economy to put them into place. Pressure built on President Carter to do the same, and, hoping it might help whip inflation, he deregulated the US trucking and airline industries, among others, in the last two years of his presidency.

Ronald Reagan was elected in 1980, and when he came into office, he jumped into neoliberal policy with both feet, starting by crushing the air traffic controllers’ union, PATCO, in a single week. Union busting, welfare cutting, free trade, and deregulation were the themes of Reagan’s eight years, then carried on another four years by President George H. W. Bush, whose administration negotiated the North American Free Trade Agreement (NAFTA).

America was now officially on the neoliberal path, and Friedman and his Mont Pelerin buddies were cheering it on.

By 1982, inflation was down from 1981’s 8.9 percent to a respectable and tolerable 3.8 percent; it averaged around that for the rest of the decade. Instead of pointing out that supply-shock and currency-devaluation inflations normally take a decade or two to work themselves out, the American media gave Reagan and neoliberalism all the credit. Milton Friedman, after all, had made his reputation as the great scholar of inflation and was a relentless self-promoter, appearing in newspapers and newsmagazines almost every week in one way or another.

Claiming that neoliberal policies had crushed over a decade of inflation in a single year, and ignoring the fact that it was just the normal wringing-out of inflation from the economy, Reagan openly embraced neoliberalism with a passion at every level of his administration. He embarked on a series of massive tax cuts for the morbidly rich, dropping the top tax bracket from 70 percent when he came into office down to 28 percent by the time he left. He borrowed the money to pay for it, roughly tripling the national debt from about $900 billion in 1980 to $2.9 trillion by the end of the decade, and the effect of that $2 trillion he put on the nation’s credit card was a sharp economic stimulus for which Reagan took the credit.

He deregulated financial markets and savings and loan (S&L) banks, letting Wall Street raiders walk away with billions while gutting S&Ls so badly that the federal government had to bail out the industry by replacing about $100 billion that the bankers had stolen.

“Greed is good!” was the new slogan, junk bonds became a thing, and mergers and acquisitions experts, or “M&A Artists” who called themselves “Masters of the Universe,” became the nation’s heroes, lionized in movies like the 1987 Wall Street, starring Michael Douglas.

Reagan signed Executive Order 12291, which required all federal agencies to use a cost-benefit estimate when putting together federal rules and regulations. Instead of considering costs of externalities (things like the damage that pollution does to people or how bank rip-offs hurt the middle class), however, the only costs his administration worried about were expenses to industry.

He cut the regulatory power of the Environmental Protection Agency (EPA), and his head of that organization, Anne Gorsuch (mother of Supreme Court Justice Neil Gorsuch), was, as Newsweek reported, “involved in a nasty scandal involving political manipulation, fund mismanagement, perjury and destruction of subpoenaed documents,” leaving office in disgrace.

Meanwhile, Reagan’s secretary of the interior, James Watt, went on a binge selling off federal lands to drilling and mining interests for pennies on the dollar. When asked if he was concerned about the environmental destruction of sensitive lands, he replied, “[M]y responsibility is to follow the Scriptures which call upon us to occupy the land until Jesus returns.” According to Watt’s fundamentalist dogma, any damage to the environment would be reversed when Jesus came back to Earth and would “[make] all things new.”

Reagan cut education funding, putting Bill Bennett in as secretary of education. Bennett was a big advocate of the so-called school choice movement that emerged in the wake of the 1954 Supreme Court Brown v. Board of Education decision, which ordered school desegregation. All-white private, religious, and charter schools started getting federal dollars; public schools had their funds cut, and Bennett later rationalized it all by saying, “If it were your sole purpose to reduce crime, you could abort every black baby in this country, and your crime rate would go down.”

The Labor Department had been created back in 1913 under President William H. Taft, a progressive Republican. Reagan installed former construction executive Ray Donovan as its head, the first anti-labor partisan ever to run the department; Donovan had to leave the position when he was indicted for fraud and grand larceny (the charges didn’t stick) in a case involving alleged Mafia associates. As the Washington Post observed when Donovan died, “Carrying out Reagan’s conservative agenda, Mr. Donovan eased regulations for business, including Occupational Safety and Health Administration rules disliked by industry. He withdrew a rule requiring the labeling of hazardous chemicals in the workplace and postponed federal employment and training programs, equal opportunity employment measures, and a minimum-wage increase for service workers. His tenure also saw drastic cuts in the department’s budget and staff.”

That sort of thing happened in every federal agency throughout the Reagan and Bush presidencies; much of their neoliberal damage has yet to be undone.

By 1992, Americans were starting to wise up to Reagan’s scam.

Thousands of factories had closed, their production shipped overseas; working-class wages had stagnated since his first year in office, while CEO salaries exploded from 29 times the average worker’s salary in 1978 to 129 times average worker wages in 1995 (they’re over 300 times average worker wages today); and union membership had dropped from a third of workers to around 15 percent (it’s around 6 percent of the private workforce today).

The Reagan and Bush administrations negotiated the neoliberal centerpiece, the NAFTA treaty (although they called it a “trade agreement” rather than a treaty because it couldn’t get past the constitutional requirement for a two-thirds vote in the Senate to approve all treaties), and wanted it signed the following year, in 1993.


Reprinted from The Hidden History of Neoliberalism with the permission of Berrett-Koehler Publishers. Copyright © 2022 by Thom Hartmann.

Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183902
Pessimistic Economic Forecasts Ignore a History of Dynamism

Swift meatpacking plant and refrigerated rail cars, Sioux City, Iowa, 1917



“I refuse to recognize any impossibilities.  I cannot discover that anyone knows enough about anything on this earth definitely to say what is and what is not possible.” – Henry Ford, 1922



Pessimism is in the air.  A rising number of economists seem to have resigned themselves to slower American economic growth.  They cite both the slow recoveries from recent recessions and the failure of recent technologies to jumpstart productivity.  Like Robert Gordon, they look back wistfully to the century from 1870 to 1970, when the “second industrial revolution” of mechanical and chemical advances drove real per capita growth at an annual rate of 2%.  Even the more optimistic economists now expect growth of only about 1%.


This slowdown, combined with rising income inequality, has called capitalism into question.  Critics seek stepped-up government intervention in the economy, with aggressive antitrust, redistribution of wealth, and heavy regulation.  But this approach neglects a crucial factor in the glorious century of rapid growth.


Despite calls for nationalizing industries, weakening the currency, or radically expanding government power, 19th century Americans maintained a decentralized political economy balancing property rights with the right to compete.  The result was an entrepreneurial dynamism that drove the powerhouse innovations of those decades.  To recapture that growth, we need to trust in our entrepreneurial dynamism to address our main problems, with government as a supporter rather than leader. 


Bringing Forth the Glorious Century


Most histories of the 1870-1970 period tell a rather deflating story.  Yes, the American economy succeeded in connecting isolated homes to water, electricity, transportation, and telecommunications, while greatly reducing pollution and infant mortality.  People began to live vastly better lives.  But this was all low-hanging fruit, the inevitable consequence of easy innovation fueled by government spending, cheap immigrant labor, and strong tariff protection.


A close look at those decades, however, tells a different story.  Most of the great innovations were hardly obvious at the time, and required remarkable ingenuity and persistence.  Even the building of national markets required overcoming local monopolies and connecting disparate transportation networks – a political as well as economic challenge.

Actual innovators had to work even harder.  Take the advent of refrigerated train cars, which greatly reduced the cost of meat for urban households.  Until the 1880s, people who wanted meat had to rely on livestock transported on foot or rail, which raised costs and lowered quality.  Gustavus Swift, a Chicago meatpacker, introduced refrigeration, which made possible centralized and efficient slaughterhouses close to grazing areas.  But the existing railroads, which preferred to own all the freight equipment, refused to transport Swift’s specialized cars.


At that point Swift could have given up and let the big roads eventually get around to copying his invention.  But he was ambitious and relentless, so he cleverly partnered with Canadian and smaller railroads to get his cars to the northeast.  The invention proved so popular that the big roads eventually capitulated.  His entrepreneurial drive gave the country inexpensive meat years, if not decades, sooner than it would have otherwise.


An even bigger leap came from Henry Ford, who (as in the epigraph) refused to accept the conventional wisdom that automobiles were just horseless carriages for the rich.  He was crazy enough to believe he could make durable, reliable cars that his father and other farmers could afford.  Like entrepreneurs throughout this period, he learned the basics from an established business, but was too restless to make a comfortable living going with the flow.  He quit to set up his own company, one that revolutionized the industry and improved everyone’s standard of living in the process.


Keeping the Entrepreneurial Spark Alive


Why then did America’s glorious century end?  Part of the problem was the rise of big business that gradually tamed entrepreneurship.  The aggressive moves of Swift and Ford, like those of McCormick, Rockefeller, and Carnegie, succeeded so well as to yield large organizations that increasingly prized steadiness over innovation.  Meanwhile rising affluence led to calls for government intervention to stabilize and regulate the economy.  We still generated remarkable inventions, from antibiotics to semiconductors, but now we had fewer entrepreneurs to drive these through the economy.


Many economists and historians took the slowdown in stride, as evidenced by books such as America at Middle Age.  But fortunately, a wave of new entrepreneurs emerged after 1970 to take up the challenge.  Rather than complacently settling into comfortable corporate positions, they began pushing crazy ideas that benefitted us all.  Political leaders supported the dynamism with deregulation and stable fiscal and monetary policies.  Unlike in many other countries, entrepreneurs flourished here with our balance of property rights and the right to compete.  From Microsoft and Apple to Uber and Tesla, they built world-class companies that advanced our lives.


We’re continuing to see talented people quit their jobs to try out a venture.  Some, like Marc Lore, become serial disruptors: build a profitable upstart (Quidsi), sell out to a slower-moving incumbent, use the proceeds to build another challenger (Jet), and repeat (Wonder).  Money isn’t enough to satisfy them – studies suggest they would make more money per hour with a corporate job.  They prefer the adventure of starting up a business outside the corporate grind.


Those efforts gained momentum in the 1990s, with the internet and the great expansion in venture capital.  The great innovations of this period, which continue to emerge, have yet to jumpstart growth.  That’s partly because these fundamental changes, like electricity in the 1880s, can take decades to propagate.  We probably need another generation of persistent tinkerers before realizing the full productivity boost of computers and robots.  But there’s plenty of room for optimism.


In contrast to the gloom of most commentators, the world of entrepreneurship has never been so exciting.  We’re seeing capital flow into ventures that promise major improvements, from lower delivery costs to ameliorating climate change.  All we need to do is learn the lesson of the late 19th century and hold back from discouraging this dynamism.

Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183905
The Roundup Top Ten for September 9, 2022

History as Love and The Presentist Trap: Responses to James Sweet

by Malcolm Foley and Priya Satia

Two historians respond to the AHA president's essay by reflecting on the politics of historical research and of speaking publicly about the past. 


Why Medical Exceptions to Abortion Bans Won't Protect Women

by Evan Hart

Under new restrictive state laws, judges and lawyers, not doctors or patients, will decide who can get a medically necessary abortion. 



All History is Revisionist

by James M. Banner Jr.

"The collective noun for a group of historians is an “argumentation,” and for good reason. At the very dawn of historical inquiry in the West, historians were already wrestling over the past, attacking each other."



I was Fired for Asking My Students to Wear Masks

by Michael Phillips

Sometimes academic freedom is about the ability of professors to advocate on behalf of the campus community's health against administrators who prefer silence as a matter of political expediency. 



The Supreme Court Ignored the 19th Century "Voluntary Motherhood" Movement

by Lauren Thompson

Abortion is, indeed, a deeply rooted right in American history, demonstrated by the extent to which women have worked to control reproduction. 



New Colombian Leadership Means it's Time for the US to End the Disastrous Drug War

by Christy Thornton

The US has taken steps to pull back from the domestic war on drugs. But the violent, repressive and expensive campaign to fund militarized drug interdiction in Latin America has carried on uninterrupted, fueling political violence abroad and fentanyl overdoses at home. 



Why the Right Hates History Now

by Jonathan M. Katz

Conservative intellectuals like L. Brent Bozell used to claim the authority of history because they saw it as a set of texts that affirmed the right of men like them to rule. Now that the field has changed, pundits like Bret Stephens have little use for it.



It's Not Trump's GOP, it's Pat Buchanan's

by Nicole Hemmer

Republicans have come around to Pat Buchanan's vision of a hard-right, pessimistic and grievance-driven party. The question now is whether they will soften that vision in pursuit of a majority of voters, or try to keep power as a minority party. 



Colleges Should Quit Trying to Appease the Right

by Silke-Maria Weineck

"When J.D. Vance says that 'professors are the enemy,' he is correct. He is our enemy, and we must be his. I welcome his hatred."



Ken Burns Got "Prohibition" Wrong

by Mark Lawrence Schrad

Burns largely accepts an individualistic and libertarian narrative of prohibition as a misbegotten campaign of moral scolds, missing the reformist, egalitarian, and humane demands of the movement and the exploitative nature of the "liquor traffic" it sought to disrupt. 


Fri, 30 Sep 2022 19:36:22 +0000 https://historynewsnetwork.org/article/183899