Martha Hodes Talks "My Hijacking" with HNN

Martha Hodes (l) and her older sister Catherine on the single passport they shared. Photo courtesy of Martha Hodes.

On September 6, 1970, Martha Hodes, then aged 12, and her older sister Catherine boarded a TWA around-the-world flight in Tel Aviv to return to New York after spending the summer in Israel. After a stop in Frankfurt, militants from the Popular Front for the Liberation of Palestine hijacked the flight and rerouted it to a makeshift airfield in the Jordanian desert, part of a coordinated campaign of four hijackings. The Hodes sisters became part of a six-day drama that held the world’s attention before the ultimate release of all of the hostages.

Yet for years her own memories of the event were vague and unclear, and she occasionally felt uncertain that she had, in fact, been inside that plane at all. Her new book My Hijacking: A Personal History of Forgetting and Remembering, published to acclaim by HarperCollins, describes her effort to apply her craft as a historian to her own memory.

Professor Hodes agreed to an email discussion of the book and her unique experience as a historian examining her own hostage crisis.

HNN: Readers might initially be shocked to learn about your longstanding ways of relating to your experience, which included anxiety and avoidance around air travel, but also a sense of unreality and detachment from the events. How did you begin to approach your own experience as a historian?

Martha Hodes: One of the oddest parts of thinking back on the hijacking was the sense that it had never happened to me. As a historian, I wanted to dispel this illogical perception by researching, documenting, and reconstructing the event—from the moment of the hijacking up in the air, to our week inside the plane in the desert, to our release. Yet even when I came upon raw news footage in which my sister and I appeared on screen, I felt more like a historian coming upon evidence of a distant historical subject. 

Bringing a historian’s skills to memoir-writing, I studied the work of other scholars writing about their own lives: Hazel Carby’s Imperial Intimacies, Clifton Crais’s History Lessons, Annette Gordon-Reed’s On Juneteenth, Saidiya Hartman’s Lose Your Mother, Jonathan Scott Holloway’s Jim Crow Wisdom, Daniel Mendelsohn’s The Lost, Edward Said’s Out of Place, Richard White’s Remembering Ahanagran, to name a few. Working as a memoirist writing a personal story and working at the same time as a historian writing about the past, I found it valuable to think of my twelve-year-old self as an historical actor in the past. In the end, that helped me come around to the fact that I had really been there, in the desert.

Part of this journey was reading your own diary from this period. I imagine most people would find their half-century-old diary a challenging reading assignment even with much lower stakes. What did this document tell you?

My 1970 diary turned out to be a key document, though not in the way I’d expected. I’d packed my diary in my carry-on bag and wrote every day during the hijacking, so I thought it would be the scaffolding on which to build the book—after all, historians place considerable trust in documents from the time and place about which we are writing. Soon, though, I discovered that I’d omitted a great deal from those pages, in particular descriptions of frightening circumstances and events, as well as my own feelings of fear. When my students work with primary sources, I always teach them to ask, “Why did this person tell this story this way?” In that light, I came to understand my own inability, as a twelve-year-old caught in a world historical event, to absorb everything that was happening around me. Or maybe it was that I didn’t want to record what I knew I wouldn’t want to remember.

At many other times, documents speak to you in ways that disrupt your understanding; can you describe some of these moments?

I’ll mention a crucial one. Along with my diary, the other key document I discovered was a tape recording of an interview that my sister and I had given less than a week after we returned home—again, valuable as a document created very close in time to the event under investigation. From that astounding document I learned a great deal about how my family handled the immediate aftermath of the hijacking and how I would approach the hijacking for decades afterward. For me, writing the book broke my own pattern of denial and dismissiveness.

Your writing is particularly effective, I think, in mirroring the way your developing understanding of the unfolding hostage crisis broke emotional barriers you had long maintained. For me, this was most dramatic when you begin to confront a question that preoccupied people from your 12-year-old self to Henry Kissinger for that week in September: would the PFLP follow through on its threat to kill hostages? What can you conclude, as a historian, about the danger you and your fellow hostages faced?

The dynamics of hostage-taking required our captors to keep their hostages off-balance. Sometimes they conveyed that no one would be harmed and occasionally they threatened that if their demands were not soon met, we would all die. And of course they told the world that the planes, along with the hostages, would be destroyed if their demands were not honored. In the course of my research, though, I learned that the Popular Front’s internal policy was not to harm anyone (and all the hostages did in fact return unharmed). But I also learned, during my research, of other ways that harm could have come to us—say, in an attack from the outside or by the accidental firing of one of the many weapons all around us. As a historian of the American Civil War, I teach my students about contingency on the battlefield; looking back, there was also considerable contingency out there in the desert.

Your parents’ story also presents something remarkable from today’s perspective: two people of modest origins building long, gainful careers in the arts (dance, in their case). Can you discuss a bit the story of your family, and how it placed you on that TWA airliner?

My parents were both modern dancers in the Martha Graham Dance Company. They were divorced, and my sister and I had spent the summer in Israel with our mother, who had moved there to help start the Batsheva Dance Company, Israel’s first modern dance troupe. I learned during my research—something I hadn’t quite understood at the time—that my sister and I were different from most of the other American Jews among the hostages, who were keen on visiting Israel after the 1967 war. Both my parents were raised as secular Jews, and my mother had moved to Israel as a dancer, with no interest in Zionism, or even any particular interest in Israel. My childhood attachment to Israel stemmed from the fact that Tel Aviv was the place I spent carefree summers.

Stepping back a bit, I want to talk about historiography. In your career, you’ve been no stranger to writing about other people’s grief and trauma as it’s preserved and remembered in the archives. In My Hijacking you explore the way that forgetting can be a necessary, even purposeful, part of people’s response to loss and harm. How can readers and writers understand how this work shapes the historical record?

I’m a historian of the nineteenth-century United States, and in different ways, each of my previous books has addressed the problem of forgetting in the historical record. In White Women, Black Men: Illicit Sex in the Nineteenth-Century South, I found that overpowering ideas of white anxiety about sex between white women and black men erased the historical record of white southerners’ toleration for such liaisons under the institution of slavery. In The Sea Captain’s Wife: A True Story of Love, Race, and War in the Nineteenth Century, the act of forgetting was more personal: the protagonist, a white New England woman who married a Caribbean man of color, had been partially erased from family history. In Mourning Lincoln, I found that widespread gleeful responses to Lincoln’s assassination—in the North as well as the South—came to be forgotten, overtaken by narratives of universal grief.

Returning to the dilemma of my diary in My Hijacking, I saw quite starkly how first-person documents can be crafted in particular ways, and how erasure can then foster forgetting. As for My Hijacking shaping the historical record, it’s deeply researched, but it’s also my story alone. The experience of each hostage depended on factors ranging from where you were sitting on the plane to your convictions about the history of Israel/Palestine. As I write in the book, “I could strive only to tell my own story of the hijacking in the truest possible way.”

At HNN, we seek to connect the past and the present, and to draw insight into current events from historical thought. It seems to me that Americans’ collective responses to upheavals in recent history, from 9/11 to the COVID pandemic, are deeply structured and enabled by forgetting. Would you agree? And how can understanding the work of forgetting help us think about the recent past?

Forgetting can be a way to survive, and during my research I found that my family was not alone in not talking much about the hijacking after my sister and I returned home. But it’s also the work of historians to remember, and while researching My Hijacking I learned about the process of forgetting in another way. Like many American Jews in 1970, I had no idea that Palestinians had once lived on the same land. In the book, I recount a visit my sister and I took with our mother to the village of Ein Hod. We didn’t know that in 1948 Palestinians had been exiled and the village resettled by Israeli artists. My sister wrote a letter home to our father, saying that the artists lived “in quaint little Arab style houses” surrounded by “beautiful mountain views and flowers,” thereby illuminating a kind of collective forgetting. At twelve years old, in the desert, I began to learn about the irreconcilable narratives told by different sides of the conflict. On the plane, my sister and I felt sorry for everyone—for our fellow captives, especially the Holocaust survivors among us, and for our captors and their families, some of whom had lost their homes in 1948 and 1967. We puzzled out the conflict, but at twelve and thirteen years old we couldn’t think of a solution.

SCOTUS Declares Race-Aware Admissions at Harvard, UNC Unconstitutional

Blaine Harden on the Persistence of Marcus Whitman's Myth in the West

Blaine Harden (Photo by Jessica Kowal)

"The Whitman lie is a timeless reminder that in America a good story has an insidious way of trumping a true one, especially if that story confirms our virtue, congratulates our pluck, and enshrines our status as God’s chosen people."—Blaine Harden, Murder at the Mission

As the result of a good story, the Reverend Marcus Whitman and his wife Narcissa became perhaps the most revered pioneer couple in the history of America’s westward expansion.

Six decades ago, as a student in Spokane, Washington, I learned of the Whitmans in a course on our state history, a requirement in Washington schools.

In our textbooks and lectures, the Whitman couple was virtually deified as benevolent Christian pioneers who offered Indians salvation as they brought civilization to their backward flock. At the same time, they encouraged others from the East to join them where land was plentiful and open for the taking. And Reverend Whitman was celebrated as an American patriot who saved the territory that became the states of Washington, Oregon, and Idaho for America, thwarting a plot hatched by the British with Catholic and Native co-conspirators.

We also learned of the Whitman massacre: the shocking and gruesome 1847 murders of the gracious Whitmans and eleven other white people by renegade Cayuse Indians in an unprovoked attack at their mission near present-day Walla Walla. The massacre became a flashpoint in the history of the West.

It turns out that the Whitman story we were taught decades ago was rife with lies, as acclaimed journalist and author Blaine Harden reveals in his lively recent book, a masterwork of historical detection, Murder at the Mission: A Frontier Killing, Its Legacy of Lies, and the Taking of the West (Viking).

Mr. Harden learned the same version of the Whitman tale that I did in the early sixties as a fifth-grade student in Moses Lake, Washington. In recent years, he decided to meticulously investigate the Whitman legend and set forth an accurate historical account of the amazing story we learned in school.

Murder at the Mission presents a nuanced and complicated history of settler colonialism, racism, greed, righteousness, and mythmaking. Based on exhaustive research, including recent interviews with members of the Cayuse tribe, Mr. Harden traces the actual journey of the Whitmans and events leading to their deaths. He also reveals the origins of the Whitman hoax and how the lies about the Whitmans were spread after the massacre, notably by an embittered fellow missionary, the Reverend Henry Spalding.

With the help of religious, business and political leaders invested in a story to justify the evils of Manifest Destiny and westward expansion, Spalding’s exalted tale of Marcus and Narcissa Whitman was endorsed by major publications and by Congress and was shared in textbooks. The lie was so effectively spread that, by the end of the nineteenth century, Marcus Whitman was seen as one of the most significant men in our nation’s history.

Mr. Harden dissects and analyzes every aspect of the fabulous Whitman tale as he debunks the lie with primary sources and other evidence. The book also brings to life the predicament of the Cayuse people and other Native Americans where the Whitmans settled, as epidemics ravaged the tribes and a flood of white settlers pushed them from their traditional lands. And Mr. Harden places the Whitman lore in historical context by examining parallel accounts of the dispossession and extermination of other Native Americans in the conquest of the West.

Murder at the Mission exposes the lies at the center of a foundational American myth and examines how a false story enthralled a public weaned on Manifest Destiny and eager for validation of its rapacious conquest and intolerance while ignoring evidence-based history and countervailing arguments. As we continue to deal with disinformation, potent political lies, genocidal conflicts, and systemic racism, Mr. Harden’s reassessment of this history resonates powerfully now.

Blaine Harden is an award-winning journalist who served as The Washington Post’s bureau chief in East Asia and Africa, as a local and national correspondent for The New York Times, and as a writer for the Times Magazine. He was also Post bureau chief in Warsaw during the collapse of Communism and the breakup of Yugoslavia (1989-1993), and in Nairobi, where he covered sub-Saharan Africa (1985-1989). His other books include The Great Leader and the Fighter Pilot; King of Spies: The Dark Reign of America’s Spymaster in Korea; Africa: Dispatches from a Fragile Continent; A River Lost: The Life and Death of the Columbia; and Escape from Camp 14. Africa won a PEN American Center citation for first book of non-fiction. Escape from Camp 14 won the 2012 Grand Prix de la Biographie Politique, a French literary award, spent several weeks on various New York Times bestseller lists, and was an international bestseller published in 28 languages.

Mr. Harden’s journalism awards include the Ernie Pyle Award for coverage of the siege of Sarajevo during the Bosnian War; the American Society of Newspaper Editors Award for Nondeadline Writing (stories about Africa); and the Livingston Award for International Reporting (stories about Africa).  He has contributed to The Economist, PBS Frontline, Foreign Policy, and more. He lives in Seattle with his family.

Mr. Harden sat down at a Seattle café and generously discussed his work and his recent book Murder at the Mission.

Robin Lindley: Congratulations Mr. Harden on your revelatory work on the myth of Marcus Whitman in your new book, Murder at the Mission. Before we get to the book, could you talk about your background as a globetrotting foreign correspondent and author of several previous books?

Blaine Harden: I grew up in Moses Lake, a little town in eastern Washington, and in Aberdeen, which is on the coast. My father was a construction welder and we moved when he found jobs or when he lost jobs. And so, we moved from Moses Lake to Aberdeen and back to Moses Lake. My father’s work was dependent on the construction of dams on the Columbia River and on other federal spending in the Columbia Basin. His wages as a union welder catapulted him and his family from working class poor to middle class. So, when I graduated from high school, there was enough money for me to go to a private college. I went to Gonzaga University in Spokane and then to graduate school at Syracuse.

When I was at Syracuse, I had the great good fortune of having a visiting professor who was the managing editor of the Washington Post, Howard Simons. I made his course my full-time job and ignored every other class. He offered me a job. I worked for one year at the Trenton (N.J.) Times, a farm-club paper then owned by the Washington Post.

When I was 25, I was at the Washington Post. I worked there locally for about five years. Then I worked from abroad, which is something I always wanted to do when I was at college. I was a philosophy major, and I read Hume, who wrote that you could only know what you see in front of you, what you experience with your senses. I had this narcissistic view that the world didn't exist unless I saw it.

I went to Africa for the Post and was there for five years. Then I was in Eastern Europe and the Balkans for three years during the collapse of Communism and the collapse of Yugoslavia. When that was over, I'd been abroad for eight years, and I was a little tired of it. I wanted to write another book. I'd already written one about Africa.

I came back to the Northwest and wrote a book about the rivers and the dams and the natural resource wars that were going on in the Pacific Northwest over salmon and the proper use of the Columbia River.

I did that book, and then went back to Washington, D.C., and then to Japan in 2007. I came back to the Northwest in 2010 when I left the newspaper business and wrote three books about North Korea. I'd been in Eastern Europe and then the Far East. These books sort of fell together, one after another.

Robin Lindley: And then you focused on the Whitman story?

Blaine Harden: Yes. Then I was looking for another project. When I was in elementary school in Moses Lake, there was a school play about Marcus Whitman and I played a role. For some reason I remembered that story, and I just started investigating Marcus Whitman. Did he really save the Pacific Northwest from the British?

I went to the University of Washington and spent a couple weeks reading the incredibly voluminous literature about Whitman. It didn't take long to understand that everything I had been taught in school was nonsense.

The history we learned was a lie, and it was a deliberate lie, one that had been debunked in 1900 by scholars at Yale and in Chicago. But the people of the Pacific Northwest, despite all evidence to the contrary, had clung to a story that was baloney. That's what hooked me. I thought this could be a really good book about the power of a lie in America. It was the kind of lie that makes Americans feel good about themselves.

The Whitman lie was perfect for a nation that thinks of itself as extra special. It was action-packed, hero-driven, sanctified by God. It also made Americans feel good about taking land away from Indians.

Robin Lindley: With the Whitman story, many people might first ask: wasn't there really a massacre in 1847? And didn't Indians attack the Whitmans and kill them and other white settlers?

Blaine Harden: Right. The story of the Whitman massacre is true.

Marcus and Narcissa Whitman came out to the Pacific Northwest in 1836. They were part of the first group of missionaries to settle in the Columbia River Basin. They were there for eleven years, during which time they failed as missionaries. Marcus and Narcissa converted only two people in eleven years. They also infuriated their Cayuse hosts, who frequently asked them to move on, and they refused. The Cayuse asked them to pay rent, and they refused. The Cayuse also noticed that when epidemics occurred, measles in particular, white people would get sick, but they wouldn't die. Members of the Cayuse and the Nez Perce tribes, however, would die in terrifying numbers. They had no immunity to diseases imported by whites.

Years of bitterness between the missionaries and the tribes culminated with an especially severe measles epidemic, which for many complicated reasons the Cayuse blamed on Marcus Whitman and his wife. So, they murdered the Whitmans along with eleven other white people. The murders were grisly. The Whitmans were cut up, decapitated, stomped into the ground, and buried in a shallow grave.

When word of this atrocity reached the much larger white settlement in the Willamette Valley in what is now western Oregon, people got very upset. They mobilized a militia to punish the Cayuse and sent a delegation to Washington, D.C. They hoped to persuade President Polk and Congress to abrogate a decades-old treaty under which the Oregon Country was jointly owned by Britain and the United States. And they succeeded.

The Whitman massacre turned out to be the precipitating event for an official government declaration that Oregon was a territory of the United States. Within a few years, it became the states of Oregon, Washington, and then later Idaho.

Whitman is justifiably famous for getting himself killed in a macabre and sensational way. His murder was indeed the pivot point for the creation of a continental nation that included the Pacific Northwest. But that is where truth ends and lies begin. It would take another two decades after Whitman’s murder for a big whopper to emerge—the claim that Whitman saved Oregon from a British, Catholic and Indian scheme to steal the territory.

Robin Lindley: What was the myth that emerged about Whitman and what was the reality?

Blaine Harden: As I said, the reality was that Whitman and his wife were failed missionaries who antagonized the Cayuse and refused to move. Marcus Whitman was a medical doctor as well as a missionary, and the Cayuse had a long tradition of killing failed medicine men. Marcus Whitman was aware of that tradition and had written about it. He knew he was taking a great risk by working among the Cayuse and he was repeatedly warned that he should leave. He and his wife ignored all the warnings.

When a measles epidemic swept into Cayuse country in 1847, Whitman gave medical treatment to whites and Indians. Most whites survived; most Indians died. In the Cayuse tradition, this meant that Whitman needed killing.

Robin Lindley: And Narcissa Whitman was known for her racist remarks and dislike of the Indigenous people.

Blaine Harden: Narcissa wrote some rather affecting letters about being a missionary and traveling across the country. They were widely read in the East and much appreciated. She became something of a darling as a frontierswoman, a missionary icon.

But she never learned to speak the Nez Perce language, which was the lingua franca of the Cayuse. She described them in letters as dirty and flea-infested. By the time of her death, she had almost nothing to do with the Cayuse, whom she had come to save. She was teaching white children who arrived on the Oregon Trail.

So that's the true Whitman story. As for the creation of the Whitman lie, there is another figure in my book who is very important, the Reverend Henry Spalding. He came west with the Whitmans and, strangely enough, he had proposed to Narcissa years before she married Marcus Whitman. He had been turned down and he never forgave her.

Spalding was constantly irritating and speaking ill of the Whitmans during their lifetime. But after their deaths, he decided to cast them as heroes. He claimed that in 1842 Whitman rode a horse by himself back to Washington, D.C., and burst into the White House and persuaded President Tyler to send settlers to the Pacific Northwest—and was thus successful in blocking a British and Indian plot to steal the Northwest away from America.

Robin Lindley: And didn’t Spalding also claim that Catholics were allies of the British in this so-called plot?

Blaine Harden: Yes. Spalding said Catholics were in the plot—and politicians believed him. Spalding was a big, bearded, authoritative-looking figure when he traveled back to Washington in 1870. By then, he was pushing 70 and was probably the longest-tenured missionary in the Northwest, perhaps in the entire West.

Spalding went to the U.S. Senate with his manifesto, which was a grab bag of lies and insinuations. And the Senate and House bought it hook, line and sinker.

The manifesto was reprinted as an official U.S. government document. It became a primary source for almost every history book that was printed between 1878 and 1900. Every school kid, every college kid, every church kid in America learned this false story about Marcus Whitman, and it catapulted Whitman from a nobody into a hero of the status of Meriwether Lewis or Sam Houston. That’s according to a survey of eminent thinkers in 1900.

Spalding was spectacularly successful in marketing his lie. What's important for readers to think about is that this lie appealed to Americans in the same way that lies now appeal to Americans. As I said, it was simple. It was hero-driven, action-packed, ordained by God, and it sanctified whatever Americans had done. And when they came to the West, white Americans stole the land of the Indians. They knew what they were doing, but to have the taking of the West sanctified by this heroic story made it much more palatable. You could feel much better about yourself if you did it in response to the killing of a heroic man of God who saved the West from a nefarious plot.

Spalding was a smart demagogue who intuited what Americans wanted to hear, and he sold it to them. That may sound like some politicians we know now in our current political discourse, but Spalding got away with it. He died a happy man and was later lauded, even in the 1930s by Franklin Roosevelt, as a very effective, heroic missionary. That's the lie.

Robin Lindley: And you discovered early evidence of those who were skeptics of Spalding’s tale and who debunked his version of the Whitman story.

Blaine Harden: Around 1898, there was a student at the University of Washington who read about Spalding’s Whitman story and didn't believe it. Then he went to Yale for a graduate degree. While there, he told his professor of history, Edward Gaylord Bourne, about his suspicions of the Whitman story. Bourne was an eminent scholar, a founder of the modern school of history based on primary sources. He didn't rely on what people were saying happened, but would look at original letters, documents, and other contemporaneous material.

Bourne started to investigate the Whitman story and he soon found irrefutable primary sources showing that Whitman did not save the Pacific Northwest from a plot. Instead, Whitman went to Washington, DC, briefly in 1842 and then went to Boston to save his mission because it was in danger of losing its funding. That was the actual story.

Professor Bourne debunked the Spalding story at a meeting of the American Historical Association in Ann Arbor, Michigan in 1900. Most of America’s major newspapers and its academic establishment accepted Bourne's evidence that Whitman was not a hero and Spalding was a world-class liar. But that didn't happen in the Pacific Northwest.

Much of the Pacific Northwest continued to believe in the lie. Politicians continued to promote Whitman as the most important individual who ever lived in Washington state. A huge bronze statue of Whitman was chosen to represent Washington State in the U.S. Capitol beginning in 1953, more than half a century after his legend had been debunked. The state legislature in Washington still chose him as the state’s most important historical personage. My book explains why.

Robin Lindley: That Whitman statue to represent Washington State in the Capitol was replaced in the last couple of years. Did your book have anything to do with that? 

Blaine Harden: The state legislature made a decision to replace the statue in the month that my book came out, but I can't claim credit for persuading them to do so. There was just a reassessment going on and my book was part of it [for more on Whitman and his commemoration, see this 2020 essay by Cassandra Tate—ed.].

It's important to understand why the lie had such legs in the Northwest after it had been debunked nationally. A primary reason was support for the lie from Whitman College. Whitman College was created by a missionary (the Reverend Cushing Eells) who was a peer of Marcus Whitman. It was founded in 1859 and struggled mightily for decades.

Robin Lindley: Didn’t Whitman College begin in Walla Walla as a seminary?

Blaine Harden: Yes. It was a seminary, but by 1882 had become a four-year college for men. By the 1890s, it was in dire financial straits. It couldn't pay its mortgage. It was losing students. It couldn't pay its faculty. Presidents of the college kept getting fired. Finally, they hired a young graduate of Williams College who had come west to work as a Congregational pastor in Dayton, Washington, a small town not far from Walla Walla. This young pastor, Stephen B. L. Penrose, joined the board of Whitman College, and very soon thereafter, the college president was fired. Penrose, who was not yet 30, then became the youngest college president in America.

Penrose inherited a mess. The college was bleeding students and was on the verge of bankruptcy. Searching for a way to save it, he went into the library at Whitman College and discovered a book (Oregon: The Struggle for Possession, Houghton Mifflin, 1883). It told the amazing but of course false story of Marcus Whitman saving the Pacific Northwest and being killed by Indians for his trouble. Penrose was thunderstruck. He believed he’d found a public-relations bonanza for his college. He boiled the Whitman myth down to a seven-page pamphlet that equated Marcus Whitman with Jesus Christ. The pamphlet said that Whitman, not unlike Christ on the cross, had shed his blood for a noble cause—saving Oregon. The least that Americans could do, Penrose argued in his elegantly written pamphlet, was donate money to rescue the worthy but struggling western college named after the martyred missionary.

Penrose took this spiel on the road. It was a spectacular success in a Christian nation that believed in Manifest Destiny and admired missionaries. Penrose went to Chicago, sold the lie to a very rich man and a powerful newspaper editor. He then shared the story on the East Coast as he traveled between Philadelphia and Boston among some of the richest Protestants in the country.

Penrose raised the equivalent of millions of dollars. That money saved Whitman College. Penrose went on to serve as president of the college for 40 years and, from the mid-1890s until his death, he kept repeating the Whitman lie despite overwhelming evidence that it was not true.

Penrose, though, was much more than merely a factually challenged fundraiser. He was also a scholar obsessed with building a first-class, Williams-like liberal arts college in the Pacific Northwest. And, by the 1920s, he had succeeded. Whitman became one of the best private colleges in the Pacific Northwest, one of the best in the West. And it still is. It ranks among the top liberal arts colleges in America. Penrose used an absurd lie to create a fine school. It has educated many, many thousands of the people who now are leaders in Washington, Oregon, and Idaho.

Robin Lindley: Was Penrose aware that the Whitman story he shared through the years was a lie?

Blaine Harden: He never acknowledged that, but the people who knew him also knew how sophisticated and well-read he was. He taught Greek and Latin, and he was expert in many fields. He probably knew that the story he told was nonsense, but it was such a useful story for the college that he kept telling it until the day he died.

Robin Lindley: Your book deals with the legacy of Marcus Whitman at Whitman College in recent years and some of the controversy. You indicate that most of the Whitman students now don't know much about Marcus Whitman.

Blaine Harden: Penrose was around until the 1940s. After that, the school quietly walked back its commitment to the false story without formally denouncing it. It never said: we were wrong, and we built this institution on a lie. It has never said that to this day, but it backed away from it. Two historians who taught at Whitman said that the college could not have survived without the lie. In fact, they published papers and books to that effect and gave speeches at the school about it. No president of Whitman College has formally acknowledged this truth. Instead, the college quietly backed away from the lie. It stopped taking students from the campus out to the massacre site, where the National Park Service has honored the Whitmans. Most students from the sixties into the 21st century didn't learn much about Marcus Whitman. The massacre and its relationship to the college and to the land the school is built on were fuzzily understood. There was a deliberate plan by college administrators to move away from the myth, focusing instead on more global issues.

But in the past ten years students have become very much aware of the actual history. They've changed the name of the college newspaper from The Pioneer to the Wire. They've changed the name of the mascot from the Missionaries to the Blues. A portrait of Narcissa Whitman was defaced and a statue of her was quietly removed from campus. There's a statue of Marcus Whitman that the students want to remove; many professors hate it. In fact, a faculty committee located the statue on a far edge of the campus, near a railroad track. One professor said they put it there in the hope that a train might derail and destroy it. I asked the administration, as I finished the book, about any plans to change the name of the college or thoroughly investigate its historical dependence on a lie. The answer was no.

The college has, to its credit, become much more involved with the Cayuse, the Umatilla, and the Walla Walla tribes who live nearby on the Umatilla Reservation. They've offered five full scholarships to students from the reservation. They're also inviting elders from the tribes to talk to students. Students now are much more knowledgeable than they were in the past as a result of this raised awareness.

Robin Lindley: You detail the lives of white missionaries in the Oregon region. Most people probably thought of missionaries as well-intentioned, but you capture the bickering between the Protestant missionaries and their deep animosity toward Catholics. A reader might wonder about this feuding and the tensions when all the missionaries were Christians. Of course, as you stress, it was a time of strong anti-Catholic sentiment, xenophobia and nativism in America.

Blaine Harden: The Protestants and the Catholics were competing for Indian souls. In the early competition, the Catholics were lighter on their feet. They didn't require so much complicated theological understanding among the Native Americans before they would allow them to be baptized and take communion. Basically, if you expressed an interest, you were in. But the Protestants were Calvinists. They demanded that tribal members jump through an almost endless series of theological and behavioral hoops before they could be baptized. Very few Indians were willing to do what the Protestants required.

The Protestants saw the Catholics gaining ground, and they deeply resented it. In fact, they hated the Catholics, and they viewed the Catholic Church as controlled by a far-off figure in Rome. Catholics represented, in the minds of many of the Protestants, an invasion of immigrants. In the 1830s through the 1860s, the United States absorbed the biggest percentage of immigrants in its history, including Catholics from Italy and Ireland. Those immigrants not only came to East Coast cities, but they also came to Midwestern cities, and they also came west. If you were anti-Catholic, you were anti-immigrant. The anti-immigrant stance still has a resonance today.

Spalding was a Protestant who had gone to school at Case Western in Cleveland, where he was drilled in anti-Catholic madness. He created the Whitman lie to wrap all that prejudice, all that fear, into an appealing tale of Manifest Destiny with Catholics as villains.

Robin Lindley: And Spalding was in competition with Whitman, although they were both trying to convert people to the same Protestant sect.

Blaine Harden: To some extent, there was competition even among the Protestant missionaries, but the real competition was between the Protestants and the Catholics. The anti-Catholicism that was engendered by Spalding and by the Whitmans persisted in Oregon, where anti-immigrant and anti-Black provisions were written into the state constitution. Blacks were banned from the state of Oregon until the 1920s under state law. The Ku Klux Klan had a very receptive audience there and a fundamental hold on the politics of Oregon for many years. Oregon, in fact, banned parochial schools until that ban was overturned by the Supreme Court of the United States in a landmark case.

The spillover from this conflict between the Catholics and the Protestants in the 1830s, forties, and fifties lasted well into the 20th century. Even now, there's a strong strain of anti-immigrant, anti-Black, and anti-minority sentiment in Oregon. It’s a minority view, but it hasn’t disappeared.

Robin Lindley: You set out the context of the Whitman story at a time when Americans embraced Manifest Destiny and the white settler conquest of the West. You vividly describe how the settlers and missionaries treated the Indians. You detail the cycles of dispossession as the Cayuse and other tribes were displaced and attacked violently as whites overran the region.

Blaine Harden: The Cayuse and the Walla Wallas and the Umatillas controlled as their traditional lands an area about the size of Massachusetts, north and south of the Columbia River in what became the Oregon territory.

In 1855, the federal government sent out a governor of Washington Territory, Isaac Stevens, who was to negotiate the taking of Indian land. There was a large meeting right where the campus of Whitman College is today. All the tribes attended. At first, the Cayuse were offered nothing. They were told to move to a reservation that would be established for the Yakima nation, and they refused. They made it clear to Stevens and the 25 to 30 white soldiers who were with him that, unless they got a better deal, they might kill all the whites at the meeting.

Robin Lindley: Weren’t the Cayuse more belligerent than other tribes in the region?

Blaine Harden:  They were known for being tough and willing to resort to violence to get what they wanted. So Stevens recalculated and decided to give them a reservation on traditional Cayuse land with the Umatillas and the Walla Wallas. It is near what’s now Pendleton, Oregon. The three tribes had treaty-guaranteed control of this land after 1855, but white people living around the reservation coveted its best farmland and kept pressing the state and federal government to allow them to take it. White people didn't take all the reservation, but white leases and white ownership created a checkerboard of non-Indian landholdings. By 1979, the reservation had shrunk by nearly two-thirds.

From the 1850s all the way to the 1980s, the tribes who lived on that reservation were marginalized. They were poor. They were pushed around. They didn't have self-government. There was a lot of hopelessness. There was also a lot of alcoholism and suicide, and a lot of people left the reservation. It was a situation not unlike reservations across the West.

Robin Lindley: You recount the investigation of the Whitman murders, if you could call it that. Five members of the Cayuse tribe were eventually arrested and tried for killing the Whitmans and eleven other white people in the 1847 massacre.

Blaine Harden: By the time of the Whitman killings, there were more white people in the Oregon Territory than Native Americans. Most of that was because of disease. About 90 percent of Native Americans in the Pacific Northwest died in the first 50 years of white contact.

In any case, the white majority wanted vengeance and they wanted justice for the massacre. They wanted to round up the perpetrators and hang them. And they did. They sent an army unit to round up some suspects. They captured five Cayuse men. Whether all of them were involved in the killings is not clear. One of them almost certainly wasn't involved. At least two clearly were. The role of the other two is not so clear.

The detained men became known as the Cayuse Five, and they were tried eventually in Oregon City, now a suburb of Portland. They were convicted and hanged before a huge crowd. Thousands of people watched the hanging. Once the men were dead, the bodies were loaded on a wagon, taken to the edge of town, and buried in an unmarked grave that has since been lost. The loss of the bodies has tormented the Cayuse. They want to find those remains and bring them back to their land to gain some closure on this business. And they still are looking, and there's been some progress. The remains might be under a Clackamas County gravel yard used for snow mitigation, but that hasn't yet been resolved.

Robin Lindley:  From your book, it's unclear if the Cayuse defendants even understood what was going on at trial, let alone had an opportunity to confront their accusers.

Blaine Harden: They had lawyers, and the lawyers presented a couple of decent arguments. One argument was that the land where the killing of Marcus Whitman, his wife, and eleven others occurred was not US territory, and it also wasn't Indian territory under federal law. It was basically a no man's land governed by a treaty between the British and the Americans with no law enforcement infrastructure or legal authority. Because the killings occurred there and because the traditional laws of the Cayuse said that failed medicine men could be killed, the lawyers argued that no one could be prosecuted for the murders. The lawyers argued that American courts had no jurisdiction. The judge rejected the argument because he knew that if the Cayuse Five weren't legally convicted, they would be lynched. That was the sentiment of the time, so they were legally convicted and then quickly hanged.

Robin Lindley: And you provide the context of that verdict and sentence by recounting how often Native Americans were hanged throughout the country. And, in places like California, it seems white citizens were encouraged under law to exterminate the Indian population and were rewarded for it.

Blaine Harden: There was a pattern that repeated itself again and again and again in the South, in the Southwest, in California, in Minnesota and in Washington State. White people would come into an area where Native Americans were hunting and fishing and doing things that they'd been doing for hundreds and hundreds of years. And they would crowd out the Indians, push them away, and take the best land. Then sooner or later, a group of Native Americans would strike back. Sometimes they'd kill some white men, sometimes they'd kill some children, sometimes they'd kill women and children. And then, after that provocation, after the killing of whites, there was a huge overreaction by whites and disproportionate justice was enforced. Native people were rounded up, murdered, and hanged, and survivors were moved off their land.

In Minnesota, more than 300 Native Americans were arrested and condemned to death following a fierce war. About forty of them eventually were hanged and the entire Indian population was removed from Minnesota. It happened again in Colorado. It happened in California. Indians were punished with extreme prejudice.

The goal of the violence against Native Americans was to clear land for white settlement. The provocations that produced an Indian backlash were the perfect way to advance Manifest Destiny: They killed our women and children so we must kill them all or remove them all. The Whitman case is one of the earlier examples. The trial that occurred in Oregon City in the wake of the Whitman killings was well documented and covered by the press. There's a precise, verbatim record of what happened there that I used in the book.

Robin Lindley: The Spalding tale and the treatment of Native Americans, as you note, were examples of virulent racism. The Indians were seen as backward and inferior in a white supremacist nation.

Blaine Harden: Yes. There is a link between the way whites in the Pacific Northwest treated Native Americans and the way whites in the South treated Blacks after the Civil War. There was a defense of states' rights against an overbearing federal government. The defenders of states' rights were Civil War generals, and big statues of these Confederate generals went up throughout the early 20th century in every small, medium, and large city in the South.

In the same way, the statues of Marcus Whitman that went up in the Pacific Northwest and in Washington DC represented a whitewashed, politically accessible, self-congratulatory story about land taking. That's why the Whitman story persisted so long.

I think there is a certain sanctimony about living in the liberal Pacific Northwest—that we understand the power and the poison of racism, particularly when you look at it in the context of American South. But if you look at it in the context of whites and Native Americans around here, the legacy of racism is obvious and enduring. I do think, however, that there's been a sea change in the past ten or fifteen years in education in schools at all levels, and books like Murder at the Mission have proliferated, so I think there is a much more sophisticated understanding of racism in the West.

Robin Lindley: You conclude your book with how the Cayuse nation has fared in the past few decades and you offer some evidence of positive recent developments after a history of racism, exploitation, and marginalization.

Blaine Harden: Yes. The book ends with an account of how the Umatilla Reservation has had a Phoenix-like rebirth. That’s a story I didn't know when I started the book. And it's hopeful, and it speaks well for the character of the tribes and some of the people who engineered this transformation. It also speaks well for the rule of law in America, because the treaty that gave them the land promised certain rights under the law. The treaty was ratified by the US Senate, but its language was largely ignored for nearly a century. However, starting in the 1960s and 1970s, federal judges started to read the language of that treaty, and they decided that the tribes have guaranteed rights under the law, that we are a country of laws, and that we're going to respect those rights.

Slowly, self-government took hold on the reservation. Young people from the reservation were drafted to serve in World War II and then in Korea and Vietnam. There was also the Indian Relocation Act that moved a lot of young people to cities like Seattle, San Francisco, and Portland. While they were away, many got college educations. They learned about business management and land use planning. Some of the principal actors in the rebirth of the reservation became PhD students in land use planning. Others became lawyers. They understood their rights under state and federal law.

Starting in the seventies and eighties, these men and women went back to the reservation and started organizing self-governance. They started suing the federal government and winning some large settlements for dams that were built on the Columbia. With that money, they slowly started to assert their rights. They reclaimed the Umatilla River, which had been ruined for fish by irrigators who took all its water during the summer. Salmon couldn't swim up and down.

The big thing for the tribes happened in the early 1990s with the Indian Gaming Act, which allowed them to build a casino on the reservation. The reservation is on the only interstate that crosses northwest Oregon, east to west, and that highway runs right past the casino. The casino turned out to be spectacularly successful. By the time money started to flow, the tribe was well organized enough to use it to fund a whole range of improvements to life on the reservation, from better medical care to better educational facilities to broader opportunities for business.

Gambling money has been a remarkable boon to health. Since it started flowing into reservation programs, rates of suicide, alcoholism, smoking, obesity, and drug abuse have all declined. There are still serious problems, but no longer cataclysmic, existential threats to the survival of these reservations. Good planning and investment of the gaming money led to new offices, large-scale farming projects, and more. They’ve been successful. And the tribe has acquired major commercial operations around Pendleton. They own one of the biggest outdoor stores and a few golf courses. They are political players in eastern Oregon whom politicians statewide respect. This is recompense for past wrongs.

Robin Lindley: Thanks for sharing this story of the Cayuse and their situation now. You interviewed many tribal members for the book. How did you introduce yourself to the tribe and eventually arrange to talk with tribal elders? There must have been trust issues.

Blaine Harden: There were difficulties and it's not surprising. I'm a white guy from Seattle and my father had built dams that flooded parts of the Columbia plateau. My requests to have in-depth interviews with the elders were not looked upon with a great deal of favor for sound historical reasons.

They were distrustful, but I managed to make an acquaintance with Chuck Sams, who was at that point in charge of press relations for the reservation. He agreed to meet with me and, like many of the other influential people on the reservations, he had gone to college. He also had served in the military in naval intelligence. I wrote a book about an Air Force intelligence officer who was very important in the history of the Korean War. Sams read the book, and I think he thought I was a serious person who was trying to tell the truth. He slowly began to introduce me to some of the elders.

Finally, I sat down with two of the elders who were key players in the transformation of the reservation. They talked to me for hours and then I followed up by phone.

Robin Lindley: You were persistent. Have you heard from any of the Cayuse people about your book?

Blaine Harden: Yes. The key men that I interviewed have been in contact. They like the book. It’s now sold at a museum on the reservation.

Another thing about this book is that I asked several Native Americans to read it before publication. They helped me figure out where I'd made mistakes because of my prejudices or blind spots. Bobbie Connor, an elder of the tribe on the reservation and head of the museum there, pointed out hundreds of things that she found questionable. She helped me correct many errors. The book was greatly improved by her attention to detail.

Robin Lindley: It’s a gift to have that kind of support.

Blaine Harden: Yes. I hadn't done that with my five previous books but in this case, it really helped.

Robin Lindley: Your exhaustive research and astute use of historical detection have won widespread praise. You did extensive archival research and then many interviews, including the crucial interviews with Cayuse elders. Is there anything you’d like to add about your research process?

Blaine Harden: There have been many hundreds of books written about the Whitman story in the past 150 years. A lot of them are nonsense, but there is a long historical trail, and it goes back to the letters that the missionaries wrote back to the missionary headquarters in Boston that sent them west.

Many of the missionaries wrote every week, and these were literate people who weren't lying to their bosses in most cases. They were telling what they thought. All those letters have been kept and entered into a database, so you can search them by keyword. The research on Spalding particularly was greatly simplified because I could read those databases, search them, and then create chronologies that were informed by what these people wrote in their letters. And that really helped.

Robin Lindley: Didn’t many doubts arise about Spalding in these letters?

Blaine Harden: Yes, from a lot of the correspondence I reviewed. The correspondence database simplified the research, and it also made it possible to speak with real authority because the letters reveal that Spalding says one thing in the year it happened, and then 20 years later, he's telling a completely different story. It's clear that he's lying. There's no doubt because the primary sources tell you that he's lying, and that's why the professor at Yale in 1900 could say that Spalding’s story was nonsense. And looking back at the documents, you can see just how ridiculous it was.

I also studied the records that were kept by the people who raised money for Whitman College: how they did it, the stories they were selling, and then how they panicked when it became clear that their story was being questioned.

Robin Lindley: What did you think of the replacement of the Whitman statue at the US Capitol in Washington DC with a statue of Billy Frank, a hero for Native American rights?

Blaine Harden: There's poetic justice to having Billy Frank’s statue replace the statue of Whitman. It makes a lot of sense because he empowered the tribes by using the rule of law. Billy Frank Jr. went out and fished in many places and got arrested, but he fished in places where he was allowed to under federal and state law. He kept asserting his rights under the law that the federal government and the states had on the books. That's in effect what happened with the Umatilla and Cayuse reservation. They asserted their rights under the law and their conditions improved.

The one thing about Whitman is that he didn't make up the lie about himself. He was a man of his time who thought the way people of his time thought. And we can't blame him for that.

Robin Lindley: It seems that there's a lot of material in your book that hasn't been shared or widely known before.

Blaine Harden: Some of the actual story was widely known for a while, and then it just disappeared. It didn't become a part of what was taught in public schools. And into the 1980s, a phony version of history was taught officially in the state of Washington.

Robin Lindley: Thank you Mr. Harden for your patience and thoughtful remarks on your work and your revelatory book, Murder at the Mission. Your book provides an antidote to a foundational American myth and serves as a model of historical detection and investigation. Congratulations and best wishes on your next project.

Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr.  His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. He is currently preparing a book of selected past interviews. Robin’s email: robinlindley@gmail.com.  

Editor’s note: for more on the memorialization of Marcus Whitman, see this essay from 2020 by Cassandra Tate.

New York's Education Wars a Century Ago Show how Content Restrictions Can Backfire

Matthew Hawn, a high school teacher for sixteen years in conservative Sullivan County, Tennessee, opened the 2020-21 year in his Contemporary Issues class with a discussion of police shootings. White privilege is a fact, he told the students. He had a history of challenging his classes, which led to lively discussions among those who agreed and disagreed with his views. But word of that day’s discussion got back to a parent, who objected. Hawn apologized – but didn’t relent. Months later, with more parents complaining, school officials reprimanded him for assigning “The First White President,” an essay by Ta-Nehisi Coates, which argues that white supremacy was the basis for Donald Trump’s presidency. After another incident in April, school officials fired him for insubordination and unprofessional behavior.

Days later, Tennessee outlawed teaching like Hawn’s statewide, placing restrictions on what could be taught about race and sex. Students should learn “the exceptionalism of our nation,” not “things that inherently divide or pit either Americans against Americans or people groups against people groups,” Governor Bill Lee announced. The new laws also required advance notice to parents of instruction on sexual orientation, gender identity, and contraception, with an option to withdraw their children.

Over the past three years, at least 18 states have enacted laws governing what is and is not taught in schools. Restricted topics mirror Tennessee’s, focusing on race, gender identity, and sexual orientation. In some cases, legislation bans the more general category of “divisive concepts,” a term coined in a 2020 executive order issued by the Trump administration and now promoted by conservative advocates. In recent months, Florida has been at the forefront of extending such laws to cover political ideology, mandating lessons on how communism could lead to the overthrow of the US government. Even the teaching of mathematics has not escaped Florida politics, with 44 books banned for infractions like using race-based examples in word problems.

In a sense the country is stepping back a century to when a similar hysteria invaded New York’s schools during the “Red Scare” at the end of World War I, when fear of socialism and Bolshevism spread throughout the US.  New York City launched its reaction in 1918 when Mayor John Francis Hylan banned public display of the red flag.  He considered the Socialist Party’s banner “an insignia for law hating and anarchy . . .  repulsive to ideals of civilization and the principles upon which our Government is founded.”

In the schools, Benjamin Glassberg, a teacher at Commercial High School in Brooklyn, was cast in Matthew Hawn’s role. On January 14, 1919, his history class discussed Bolshevism. The next day, twelve students, about one-third of the class, signed a statement that their teacher had portrayed Bolshevism as a form of political expression not nearly so black as people painted it. The students cited specifics Glassberg gave them – that the State Department forbade publishing the truth about Bolshevism; that Red Cross staff with first-hand knowledge were prevented from talking about conditions in Russia; that Lenin and Trotsky had undermined rather than supported Germany and helped end the war. The school’s principal forwarded the statement to Dr. John L. Tildsley, Associate Superintendent of Schools, who suspended Glassberg, pending a trial by the Board of Education.

Glassberg’s trial played out through May.  Several students repeated the charges in their statement, while others testified their teacher had said nothing disrespectful to the US government.  Over that period, the sentiments of school officials became clear.  Dr. Tildsley proclaimed that no person adhering to the Marxian program should become a teacher in the public schools, and if discovered should be forced to resign.  He would be sending to everyone in the school system a circular making clear that “Americanism is to be put above everything else in classroom study.”  He directed teachers to correct students’ opinions contrary to fundamental American ideas. The Board of Education empowered City Superintendent William Ettinger to undertake an “exhaustive examination into the life, affiliations, opinions, and loyalty of every member” of the teachers union.  Organizations like the National Security League and the American Defense Society pushed the fight against Bolshevism across the country.

After the Board declared Glassberg guilty, the pace picked up. In June, the city’s high school students took a test entitled Examination For High Schools on the Great War. The title was misleading: the first question was designed to assess students’ knowledge of and attitude toward Bolshevism. The instructions to principals said this question was of greatest interest and that teachers should highlight any students who displayed an especially intimate knowledge of the subject. The results pleased school officials: only 1 in 300 students showed any significant knowledge of or leaning toward Bolshevism. The “self-confessed radicals” would be given a six-month course on the “economic and social system recognized in America.” Only if they failed that course would their diplomas be denied.

In September, the state got involved. New York Attorney General Charles D. Newton called for “Americanization,” describing it as “intensive instruction in our schools in the ideals and traditions of America.” Also serving as counsel to the New York State Legislative Committee to Investigate Bolshevism, commonly known as the Lusk Committee after its chairman, Newton was in a position to make it happen. In January 1920, Lusk began hearings on education. Tildsley, Ettinger, and Board of Education President Anning S. Prall all testified in favor of an Americanization plan.

In April, the New York Senate and Assembly passed three anti-Socialist “Lusk bills.” The “Teachers’ Loyalty” bill required public school teachers to obtain from the Board of Regents a Certificate of Loyalty to the State and Federal Constitutions and the country’s laws and institutions. “Sorely needed,” praised the New York Times, a long-time advocate for Americanization in the schools. But any celebration was premature. Governor Alfred E. Smith had his objections. Stating that the Teachers’ Loyalty bill “permits one man to place upon any teacher the stigma of disloyalty, and this even without hearing or trial,” he vetoed it along with the others. Lusk and his backers would have to wait for the gubernatorial election that November, when Nathan L. Miller beat Smith in a squeaker. After Miller’s inauguration, the Legislature passed the bills again. Miller signed them in May despite substantial opposition from prominent New Yorkers.

Over the next two years, the opposition grew.  Even the New York Times backed off its unrelenting anti-Socialist stance.  With the governor’s term lasting only two years, opponents got another chance in November, 1922, in a Smith-Miller rematch.  Making the Lusk laws a major issue, Smith won in a landslide.  He announced his intention to repeal the laws days after his inauguration.  Lusk and his backers fought viciously but the Legislature finally passed repeal in April.  Calling the teacher loyalty law (and a second Lusk law on private school licensing) “repugnant to the fundamentals of American democracy,” Smith signed their repeal.

More than any other factor, the experience of the teachers fueled the growing opposition to the Teachers’ Loyalty law. After its enactment, state authorities administered two oaths to teachers statewide. That effort didn’t satisfy Dr. Frank P. Graves, State Commissioner of Education. In April 1922, he established the Advisory Council on Qualifications of Teachers of the State of New York to hear cases of teachers charged with disloyalty. He appointed Archibald Stevenson, counsel to the Lusk Committee and arch-proponent of rooting out disloyalty in the schools, as one member. By summer the Council had earned a reputation as a witch hunt. Its activities drew headlines such as “Teachers Secretly Quizzed on Loyalty” and “Teachers Defy Loyalty Court.” Teachers and principals called before it refused to attend. Its reputation grew so bad that New York’s Board of Education asked for its abolition, and the President of the Board told teachers that they need not appear if summoned.

A lesson perhaps lies in that experience for proponents of today’s restrictions on what can be taught. Already, teachers, principals, and superintendents risk fines and termination for violating laws that are ambiguous about what is and is not allowed. The result has been a chilling environment in which educators simply avoid controversial issues altogether. Punishing long-time and respected teachers – like Matthew Hawn, whom dozens of his former students defend – will put faces on the fallout from these laws. How long before a backlash rears up, as it did in New York over Teachers’ Loyalty?

The Army Warned Troops in 1945 of the Danger of Fascism. That Warning Rings True Today

On March 25, 1945, the United States Army issued “Fact Sheet #64: Fascism!” to promote discussion among American troops about fascism as the war in Europe wound down. Discussion leaders were alerted that “Fascism is not the easiest thing to identify and analyze; nor, once in power, is it easy to destroy. It is important for our future and that of the world that as many of us as possible understand the causes and practices of fascism, in order to combat it.”

It is worth revisiting the Army’s warnings as Donald Trump and MAGA Republicans denounce legal due process and threaten civil war.

The Army fact sheet addressed four key points to be included in discussions:

(1) Fascism is more apt to come to power at a time of economic crisis;

(2) Fascism inevitably leads to war;

(3) It can come to any country;

(4) We can best combat it by making our democracy work.

The fact sheet described findings by war correspondent Cecil Brown, who toured the United States after leaving Europe. Brown discovered that most Americans he talked with were “vague about just what fascism really means. He found few Americans who were confident they would recognize a fascist if they saw one.” The War Department, concerned that ignorance about fascism could allow it to emerge in the United States, issued recommendations for how to prevent it.

As a simple definition, the War Department described fascism as the “opposite of democracy. The people run democratic governments, but fascist governments run the people. Fascism is government by the few and for the few.” Fascists remain in power through “skillful manipulation of fear and hate, and by false promise of security . . . At the very time that the fascists proclaimed that their party was the party of the ‘average citizen,’ they were in the pay of certain big industrialists and financiers . . . They played political, religious, social, and economic groups against each other and seized power while these groups struggled against each other.”

The War Department acknowledged that the United States had

native fascists who say that they are ‘100 percent American’ . . . [A]t various times and places in our history, we have had sorry instances of mob sadism, lynchings, vigilantism, terror, and suppression of civil liberties. We have had our hooded gangs, Black Legions, Silver Shirts, and racial and religious bigots. All of them, in the name of Americanism, have used undemocratic methods and doctrines which experience has shown can be properly identified as ‘fascist’.

The War Department warned,

An American fascist seeking power would not proclaim that he is a fascist. Fascism always camouflages its plans and purposes . . . Any fascist attempt to gain power in America would not use the exact Hitler pattern. It would work under the guise of ‘super-patriotism’ and ‘super-Americanism’.

The War Department identified three attitudes and practices that fascists share in common. Fascists pit “religious, racial, and economic groups against one another in order to break down national unity . . . In the United States, native fascists have often been anti-Catholic, anti-Jew, anti-Negro, anti-Labor, anti-foreign-born.” Fascists also “deny the need for international cooperation” and deny that “all people — regardless of color, race, creed, or nationality have rights.” They “substitute a perverted sort of ultra-nationalism which tells their people that they are the only people in the world who count.” Finally, for fascists, the “[i]ndiscriminate pinning of the label ‘Red’ on people and proposals which one opposes is a common political device.”

Learning to identify American fascists and detect their techniques was not going to be easy, but

it is vitally important to learn to spot them, even though they adopt names and slogans with popular appeal, drape themselves with the American flag, and attempt to carry out their program in the name of the democracy they are trying to destroy . . . In its bid for power, it is ready to drive wedges that will disunite the people and weaken the nation. It supplies the scapegoat — Catholics, Jews, Negroes, labor unions, big business — any group upon which the insecure and unemployed are willing to blame.

They become frightened, angry, desperate, confused. Many, in their misery, seek to find somebody to blame . . . The resentment may be directed against minorities — especially if undemocratic organizations with power and money can direct our emotions and thinking along these lines.

The goal of the fascist doctrine is to prevent “men from seeking the real cause and a democratic solution to the problem.”

Fascists may talk about freedom, but

freedom . . . involves being alert and on guard against the infringement not only of our own freedom but the freedom of every American. If we permit discrimination, prejudice, or hate to rob anyone of his democratic rights, our own freedom and all democracy is threatened.

Was a Utah District's Decision to Remove the Bible from Shelves a Win for the Anti-Anti-Woke? History Says Maybe Not

The latest twist in America’s culture wars saw crowds at the capitol in Salt Lake City this summer, protesting a book ban from the elementary and middle school libraries of Davis County, Utah. Such bans are increasingly prevalent in American public life, with issues of race and sexuality proving especially controversial. In this instance, though, contention arose because an unexpected book was deemed too “violent or vulgar” for children.

The Davis School District’s decision to ban the Bible has riled many, but Utah’s case is not unprecedented. Although the cultural context has changed, controversy over scripture in America’s public schools dates back to the “Bible Wars” of the 1840s, when use of the Protestant King James Bible came under fire. In cities throughout the United States, Protestants clashed with Catholics over the Bible’s place in the nation’s nominally secular but culturally evangelical public schools. In Philadelphia, rumors that Catholics sought to ban the King James Bible from city classrooms sparked deadly riots in 1844, with over twenty killed and dozens injured.

In Utah—at the time of writing—the controversy has not yet triggered physical violence, although today’s “Bible War” is entangled with broader conflict. The angry ambivalence of cancel culture is well illustrated in the placard of one protestor at the Utah Capitol, urging lawmakers to “Remove Porn Not the Holy Bible.”

The Davis School District Bible ban stems from H.B. 374—a “sensitive content” law enacted by Utah’s State Legislature last year. This legislation, backed by activist groups such as Utah Parents United, targets “pornographic or indecent material,” and provides a fast track for the removal of offending literature. Davis had already banned such books as Sherman Alexie’s The Absolutely True Diary of a Part-Time Indian and John Green’s Looking for Alaska when it received an anonymous complaint in December 2022. “Utah Parents United left off one of the most sex-ridden books around: The Bible,” asserted the complainant. “You’ll no doubt find that the Bible (under state law) has ‘no serious values for minors’ because it’s pornographic by our new definition.” Tellingly, the Davis school board upheld this objection and removed the Bible, although the decision is under appeal. A similar complaint has since been lodged within the district against the Book of Mormon.

Support for the Utah Bible ban comes from unexpected quarters. Republican state representative Ken Ivory, a co-sponsor of H.B. 374, initially criticized the removal but has since reversed his position. Ivory admitted that the Bible is a “challenging read” for children. More to the point, he questioned whether the school library was the best place for them to encounter scripture. “Traditionally, in America,” he added, “the Bible is best taught, and best understood, in the home, and around the hearth.” Doubling down on his broader skepticism of public education, Ivory demanded Utah school boards review “all instructional materials” for content, though he failed to address how such a sweeping assessment might work.

Ivory’s appeal to hearth and home hints at a deeper ideology, one that evokes the tradition of limited government and what Thomas Jefferson called the “wall of separation between church and State.” Such historical parallels, though beguiling, are misleading. Jefferson was neither the consistent partisan idealized by today’s libertarians, nor the die-hard secularist admired by critics of religion. On the contrary, his pragmatism was reflected in the Northwest Ordinance of 1787, which framed the territories between the Ohio River and Great Lakes as a political template for American expansion. This ordinance stated that “Religion, morality, and knowledge, being necessary to good government and the happiness of mankind, schools and the means of education shall forever be encouraged.” In so doing, it earmarked public lands for future schools and colleges, while accepting the generally porous boundaries then maintained between the pulpit and the classroom.

Even as the Northwest Ordinance established public education in the Midwest, immigrants from Catholic Europe challenged the region’s dominant Protestant culture by the 1830s. Tensions peaked in Cincinnati, the urban hub of the Ohio Valley and America’s sixth-largest city by 1840. Cincinnati largely escaped the ethnic violence experienced in Philadelphia, but nativist demagogues flooded in by the score. Among them was Lyman Beecher, the notorious New England evangelical who strove to redeem the frontier for Christ. In his 1835 anti-Catholic tract, A Plea for the West, Beecher declaimed: “We must educate! We must educate! Or we must perish by our own prosperity.” Beecher demanded militant Protestant nationalism to stanch the foreign influence of Catholicism. The growing rivalry between the secular public and Catholic parochial school systems, which developed side by side through the early nineteenth century, only intensified such demands.

Rivalry between Cincinnati’s public and parochial schools culminated shortly after the Civil War. In 1869, hoping to appeal to Catholic and Jewish parents, the public school board voted to ban the King James Bible, which had been assigned “without note or comment.” Outraged citizens took to the streets and to the courts in protest. In Minor v. Board of Education (1869), Cincinnati’s Superior Court upheld plaintiff John D. Minor’s assertion that the board acted illegally. Many children, Minor insisted, “receive no religious instruction or knowledge of the Holy Bible, except that communicated as aforesaid in said schools.” In a dissenting opinion, Judge Alphonso Taft defended the board’s position. “This great principle of equality in the enjoyment of religious liberty, and the faithful preservation of the rights of each individual conscience is important in itself ... But in a city and State whose people have been drawn from the four quarters of the world, with a great diversity of inherited religious opinions, it is indispensable.” Ohio’s Supreme Court later overruled Minor v. Board following appeal by the school district. The later ruling, Board of Education v. Minor (1873), “broke open the floodgates,” wrote historian Steven K. Green, “ushering in a national conversation about the meaning of separation of church and state.” Ohio became the first state to authorize (but not require) banning the Bible from public schools. The Buckeye State’s decision predated by nearly a century Abington School District v. Schempp (1963), whereby the U.S. Supreme Court banned Bible reading and the Lord’s Prayer in public schools, leading to complaints that “the Supreme Court has made God unconstitutional.”

Cincinnati’s Bible War exposed a nerve. In his Second Inaugural Address, a few years before, Abraham Lincoln reflected on the Civil War: “Both sides read the same Bible and pray to the same God; and each invokes His aid against the other.” Goaded and consoled by the same text, Americans slaughtered one another. As historian Mark A. Noll argued in America’s Book (2022), “the importance of the Bible for explaining the meaning of America,” and “the importance of America for explaining the history of the Bible” are tightly woven motifs. Following the Civil War, “the inability of Bible believers to find common ground in the Book they championed as the comprehensive guide to all truth” signaled the demise of a distinctly Protestant “Bible civilization,” among other consequences, heralding a more multicultural—apparently more secular—nation.

As Utah’s controversy suggests, the Bible may have fallen from grace, yet it remains a potent symbol. No longer assigned as a devotional text in America’s public schools, its mere presence on library shelves remains incendiary. The context surrounding its removal has shifted from nineteenth century sectarianism to twenty-first century culture wars, but continuities ignite destructive passions. Cynics might contend that Utah’s anti-woke warriors have been hoisted on their own censorious petard. However tempting this conclusion, we should also recognize the bitterness of old wine in a new wineskin, as the Bible once more becomes a focus of partisan discord.

The Unlikely Success of James Garfield in an Age of Division

An 1880 Puck Cartoon depicts Ulysses Grant surrendering his sword to James Garfield after being defeated for the Republican nomination.

The candidate, at first glance, seemed to have no business being his party’s nominee for the White House. In an era seething with political strife, he had long been viewed by peers in Washington as a pleasant but out-of-touch figure. Partisan warfare was not his strong suit; he cultivated friendships with civil rights opponents and election deniers alike. He enjoyed scrappy political debate but refused to aim any blows below the belt (“I never feel that to slap a man in the face is any real gain to the truth.”) What’s more, American voters seemed to be in a decisively anti-establishment mood, and this nominee had been a presence in Washington for almost two decades – the epitome of a swamp creature.

Yet, somehow, it added up to a winning formula: James Garfield, the nicest man remaining in a polarized Washington, would be elected America’s next president in 1880. His rise to power would be framed as a rare triumph of decency in the increasingly bitter political environment of late 19th century America. It resonates today as our country again navigates similarly divided politics.

Garfield’s election was the very first of the Gilded Age. It was a time defined by tremendous disparity emerging in America. Men like Andrew Carnegie and Jay Gould were ascendant members of a new ruling class of industrialists, the so-called “robber barons.” But their factories were grinding down the working class; America’s first nationwide strike had broken out in 1877. Meanwhile, Reconstruction had failed in the South, leaving Black Americans in a perilous spot. They technically possessed rights, but, in practice, had lost most of them after former Confederates returned to power and reversed the policies of Reconstruction.

Yet the period’s discord was most obvious in its politics. The last presidential election had produced what half of Americans considered an illegitimate result: poor Rutherford Hayes had to put up with being called “Rutherfraud” for his term in the White House. Meanwhile, the broader Republican Party had fractured into two vividly-named blocs (the “Stalwarts” and the “Half-Breeds”), each of which loathed Hayes almost as much as they loathed each other.

In this setting, Minority Leader James Garfield was a uniquely conciliatory figure – the lone Republican who could get along with all the fractious, squabbling members of his party. Stalwarts described him as “a most attractive man to meet,” while the leader of the Half-Breeds was, perhaps, Garfield’s best friend in Congress. President Hayes also considered Garfield a trustworthy legislative lieutenant. The overall picture was a distinctly muddled approach to factional politics: Garfield did not fall into any of his party’s camps but was still treated as a valued partner by each.

Much of this came naturally (“I am a poor hater,” Garfield was fond of saying). But there was also, inevitably, political calculus informing it – the kind that comes from decades spent in Washington, trying in vain to solve the nation’s most pressing issues.

Exceptional as Garfield’s political style was, his life story was more so. He had been born in poverty on the Ohio frontier in 1831 and raised by a single mother. A dizzying ascent followed: by his late twenties, James Garfield was a college president, preacher, and state senator; only a few years later, he had become not just the youngest brigadier general fighting in the Union Army, but also the youngest Congressman in the country by 1864.

His talent seemed limitless; his politics, uncompromising. The young Garfield was an ardent Radical Republican – a member of the most progressive wing of his party on civil rights and the need for an aggressive Reconstruction policy in the postwar South. “So long as I have a voice in public affairs,” Garfield vowed during this time, “it shall not be silent until every leading traitor is completely shut out of all participation in the management of the Republic.”

But he lived to see this pledge go unfulfilled. Garfield’s Congressional career was exceptionally long – stretching from the Civil War through Reconstruction and beyond – and his politics softened as events unfolded. Principle yielded to pragmatism during what felt like countless national crises. “I am trying to be a radical, and not a fool,” Garfield wrote during President Johnson’s impeachment trial. By the end of 1876, a young firebrand of American politics had evolved into a mature legislative chieftain – the Minority Leader of a fractious Republican Party. Younger ideologues of the Party had Garfield’s sympathy but not his support. “It is the business of statesmanship to wield the political forces so as not to destroy the end to be gained,” he would lecture them.

It is no wonder, then, that Garfield’s reputation as an agreeable Republican was not entirely a positive one. From Frederick Douglass to Ulysses Grant, friends tended to say the same thing: that Garfield’s flip-flopping and politeness indicated he “lacked moral backbone.” Garfield, in contrast, would argue that open-mindedness was a sign of inner strength rather than weakness.

This argument was put to the test in the election of 1880. Republicans entered their nominating convention with a handful of declared candidates who had no clear path to a majority of party support. They emerged behind a surprising choice – James Garfield, who had been picked (apparently against his will) off the convention floor as a compromise candidate. The rank-and-file rejoiced. “His nomination will produce perfect unison,” one celebrated, “because he has helped everybody when asked, and antagonized none.” Garfield was not so exuberant about the outcome. Over the course of his political career, he had learned to view the presidency with deep suspicion; none of the Administrations he had witnessed up-close ended well.

His reservations were well-placed. While trying to appease his party’s different blocs, President Garfield ultimately failed to keep the peace between them – kick-starting a chain of events that led to his murder. The result, ironically, was that the nation’s politics suddenly shifted to resemble his own. Americans made Garfield into a martyr and blamed the hyperpartisan political climate of the country for his death. A great period of correction began, but, in all the drama around Garfield’s assassination, his remarkable life was overshadowed by its own untimely end.

On his deathbed, President Garfield seemed to sense this would be the case. Turning to a friend, he asked if his name would “have a place in human history.” The friend’s affirmative answer appeared to relax him. “My work is done.”

 

Reading Peter Frankopan's Ambitious Planetary History

Desertification, village of Telly, Mali. Photo: Ferdinand Reus, CC BY-SA 2.0

The 24 main chapters of The Earth Transformed: An Untold History by British historian Peter Frankopan cover a longer period of history--from “the creation of our planet around 4.6 billion years ago” until late 2022--than any book I’ve read (it begins with a series of excellent maps, the first one displaying the Pangaea Supercontinent of 200 million years ago). Its introduction and conclusion focus on the problems of present-day climate change; Frankopan stresses that all his extensive historical research on human interaction with the environment has left him concerned about our climate future--and humanity’s fate within it.

How concerned? This sentence from the introduction sums it up nicely: “We live in a world teetering on the brink of disaster because of climate change.” And parts of that same section sum up our present climate predicament as well as anything I’ve seen:

Human impact on the natural environment has been devastating almost everywhere, in almost every way, from water contamination to erosion, from plastics entering the food chain to pressure on animal and plant life that has reached such a high level that the most recent United Nations report talks of declines in biodiversity at rates that are unprecedented in human history, and that threaten an erosion of “the very foundations of our economies, livelihoods, food security, health and quality of life worldwide.”

Or this 2019 quote from António Guterres, Secretary General of the United Nations: “Every week brings new climate-related devastation . . . . Floods. Drought. Heat waves. Wildfires. Superstorms.”

In his conclusion, Frankopan writes that the summer of 2022 was especially alarming—“Record heatwaves in Europe, the worst drought in many decades in Africa, nearly eight times the average rainfall in Pakistan . . . flash floods in Death Valley in the USA (caused by massive rainfall in three hours)…. the highest-ever recorded rate of rainfall in South Korea . . . the wettest year in Australia's modern history,” extremely high winter temperatures in Paraguay and in South Africa, “and a long and severe drought in China that followed the hottest summer on record, which was called the most severe heatwave ever recorded anywhere and was unparalleled in world climatic history.”

Yet, he marvels, many people continue to deny or minimize human-caused climate change. He does not deny that there has been some progress in various countries, and he stresses that our past and present climate problems have been solvable—if only the collective will coalesces into action. He also mentions the hopes some people place in geoengineering, including cloud seeding. But he cautions that human modification of natural weather systems risks (as one 2015 scientific report indicated) “significant potential for unanticipated, unmanageable, and regrettable consequences.” In a recent eight-part Apple TV+ fictional series, “Extrapolations,” a character played by Edward Norton in Episode 4 expresses similar sentiments: “We’ve treated this planet like an all-you-can-eat buffet for 250 years since we started burning fossil fuels. And changing the chemical composition of the atmosphere is not going to fix it.” Cloud seeding could lead to “changes in rainfall patterns that lead to crop failures and floods [and] . . . extreme weather events leading to mass migrations, social unrest, stress on infrastructure.”

Frankopan also raises the possibility that future unknown events, like nuclear war, could greatly alter our climate, and he concludes that the “biggest risk to global climate comes from volcanoes”—he often mentions the historical climate effects of volcanoes, especially their role in decreasing temperatures (see here for more on that effect). Regarding one eruption, which occurred in what is now Indonesia in 1257, he writes that its effects “were global,” affecting such far-away areas as England and “the western flank of the Americas.”

Frankopan believes that generally “we ignore climate and long-run climate patterns or changes altogether when we look at history.” His new book “attempts to integrate human and natural history,” including climate changes, because “it is fundamentally important if we are to understand the world around us properly.” Using a wide variety of sources--212 pages of endnotes are available on the publisher’s website--he connects environmental changes to all sorts of historical developments, including migrations, plagues, living arrangements, political structures, and religious beliefs. For example, he writes that “three of the most lethal pandemics” of the last 2,000 years followed “warmer springs and wetter summers [that] produced the bacterium that caused bubonic plague.” And, as he notes at the end of Chapter 12, “the fundamentals of ecological equilibrium and environmental sustainability underpinned the cultural, political, socio-economic, diplomatic and military histories of individual kingdoms, states or regions. Reliable food and water supplies were central at all times.”

At times, however, several pages may elapse without any mention of climate or the environment, as Frankopan details various political, social, or cultural developments in widely varied parts of the earth including Asia, Africa, and the Americas. As important as he thinks climate has been as a historical factor, he attempts not to overstate its significance. For example, in Chapter 9, he writes, “cities were far more lethal [because of unsanitary conditions] than changes to climate patterns.”

His first chapter is entitled “The World from the Dawn of Time (c.4.5bn–c.7m BC).” In this period before direct human ancestors (Homo-) existed, the author tells us that for about “half the earth’s existence, there was little or no oxygen in the atmosphere.” Still long before humans appeared, periods of extreme warming and cooling existed and one stage “brought about the extinction of 85 per cent of all species. . . . The single most famous moment of large-scale transformation in the past, however, was caused by an asteroid strike that impacted the earth 66 million years ago on the Yucatan peninsula” in Mexico.

In Frankopan’s second chapter, “On the Origins of Our Species (c.7m–c.12,000 BC),” he states that the timing of Homo sapiens’s origins is disputable: “Our own species may have started to emerge as distinct from Homo neanderthalensis [Neanderthals]. . . . though this is a matter of fierce debate.” Humans first appeared in Africa and then dispersed to other continents, for example, “into South-East Asia, China and beyond, reaching Australia by around 65,000 years ago.” “Most scholars date the arrival of the first modern humans in the Americas to around 22,000 years ago.”

The author intersperses these human movements with accounts of climate-change effects--besides volcanoes, he details all sorts of other causes of changes such as El Niño and La Niña. For example, “agriculture may not have been impossible before the Holocene,” a “long period of warmer, stable conditions” that began roughly 10,000 years ago, “but it suited conditions perfectly after its onset.”

Chapters 3 to 24 deal with a time span more familiar to historians--12,000 BC to AD 2022. But the book’s title, The Earth Transformed: An Untold History, correctly indicates that it is also unique: not a history of some portion of our planet, but a global history (i.e., of the earth), and “untold” because no previous history has integrated the human and environmental journeys over such a long time span.

Although Frankopan pays sufficient attention to the Americas and his native England, the part of the world he is most familiar with is the Eurasian Steppe, which runs from the Balkans to the Pacific Ocean. Two of his previous books, The Silk Roads (2017) and The New Silk Roads (2020), deal with that area. Here, in Chapter 8, he writes, “Some 85 per cent of large empires over more than three thousand years developed in or close to the Eurasian steppe.” Among the other observations he makes here is one that greatly affects demographics, a topic he often mentions: tropical climates often provide “a crucible in which infectious diseases could flourish.”

Later on in Chapter 13, “Disease and the Formation of a New World (c.1250–c.1450),” he returns to the Eurasian steppe when he considers the Mongol conquest of many areas. And he writes that it may have “created a perfect environment” for the spread of plague. In the late 1340s, the Black Death spread across “Europe, the Middle East, North Africa,” and probably other parts of Africa, killing “an estimated 40-60 percent of the population.”

Frankopan is not only a global and environmental historian, but also one quite critical of European and Western imperialism and racism, from the time of Columbus to the present. Considering the world around 1500, he writes, “what drove the next cycle in global history was the pursuit of profit,” mainly by Europeans. He also mentions the “‘Great Dying’” of the indigenous populations of the Americas, which was caused by violence, malnutrition and disease. Later, dealing with the half century after 1870, he states that “the dovetailing of evangelical ideas about racial superiority, religious virtue and capitalism was a core element of the way that Europeans, and the British above all, saw both themselves--and the rest of the world.” And in that same period, “the ecological implications of rapid transformation of landscapes that were motivated by the chase for a fast buck” were “severe and shocking.”

Like the earlier environmental critic E. F. Schumacher, he cites Gandhi on “the ravages of colonialism,” and suggests that modern economics should be based on a less materialistic approach to life. (Schumacher included his essay on “Buddhist Economics” in Small Is Beautiful.)

Regarding slavery, Frankopan estimates that in the 1780s “more than 900,000 souls were sent from the coast of Africa.” “The demand . . . was driven by the vast profits that could be generated from tobacco, cotton, indigo and sugar.” Even now, the aftereffects of the racism that helped produce slavery still impact us: U.S. counties that held large numbers of slaves in the early 1860s are “more likely today not only to vote Republican, but to oppose affirmative action and express racial resentment and sentiments towards black people.”

The author’s last two chapters prior to his Conclusion cover the period from about 1960 to 2022. From the publication of Rachel Carson’s Silent Spring (1962) until the present, environmental anxieties have continued, at first regarding various forms of pollution and later stressing the dangers of climate change.

Frankopan also reveals that the USA was already engaged in certain types of geoengineering, like cloud seeding, during the Vietnam War in the late 1960s, and that for some time the U.S. Department of Defense has been “the largest institutional producer of greenhouse gases.” Military conflicts, as he points out, come with a “very high” environmental cost--note, for example, Russia’s invasion of Ukraine.

Although there is plenty of blame to go around for what Frankopan considers a woeful minimization of the importance of climate change, in the USA he chiefly blames the Republicans. Despite more than 99 percent agreement among “scientists working in this [climate-change] field,” more than half of the Republican members of the 117th Congress (which ended in January 2023) “made statements either doubting or refusing to accept the scientific evidence for anthropogenic climate change.” In the last sentence of his book the author writes, “Perhaps we will find our way back there [to a sustainable planet] through peaceful means; a historian would not bet on it.”

The "Critical Race Theory" Controversy Continues

The Right's Political Attack on LGBTQ Americans Escalates

Mifepristone, the Courts, and the Comstock Act: Historians on the Politics of Abortion Rights

The Roundup Top Ten for June 23, 2023

Daniel Ellsberg's Moral Courage Was Unsparing, Even of Himself

by Erik Baker

Turning against Vietnam wasn't Daniel Ellsberg's great moral achievement. That was his realization that he had to choose between moral conviction and maintaining his place in the ranks of elite decisionmakers. 

Juneteenth has Gone National—We Must Preserve its Local Meanings

by Tiya Miles

Juneteenth celebrations have long been couched in local Black communities' preserved rituals that express particular ideas about heritage and the meaning of freedom. While a national commemoration of emancipation is welcome, history will be lost if local observances are swamped by a national holiday.

The War on Black Studies Isn't "Culture War" – It's Part of a Long Political Fight

by Robin D.G. Kelley

"Who’s afraid of Black Studies? White supremacists, fascists, the ruling class, and even some liberals. As well they should be. Black Studies was born out of a struggle for freedom and a genuine quest to understand the world in order to change it."

Moms For Liberty Event at Museum of American Revolution is a Betrayal of Historians and Democracy

by Jen Manion

"The Museum of the American Revolution has a responsibility to defend the history and practice of American democracy, not harbor those who seek to destroy it."

Clarence Thomas Took a Swipe at My Dissertation Research in a Decision. Here's Why He's Wrong

by Gregory Ablavsky

Historians' work played a huge role in a recent decision affecting Native American children. But the dissent by Clarence Thomas showed an appalling willingness to cherry-pick from the past that undermines originalism's own claims to legitimacy. 

Cormac McCarthy's Brutal Allegories of the American Empire

by Greg Grandin

"McCarthy demonstrated how the frontier wasn’t an incubator of democratic equality but a place of unrelenting pain, cruelty, and suffering."

Access to Mifepristone Could Hinge on Whether Pregnancy is "Illness"—What History Says

by Kristi Upson-Saia

Although antiabortion judges have mocked the idea that pregnancy is an illness, medical thought dating back to the ancient Mediterranean world has recognized the sharp downturn in women's health when they become pregnant.

The Dodgers Were Right to Honor the Sisters of Perpetual Indulgence

by Lily Lucas Hodges

Catholic groups charged that the LGBTQ group mocked their faith with their appropriation of nuns' dress. But the group more importantly defied Catholic teaching to promote life-saving health care and education about sexuality at the beginning of the AIDS epidemic. 

Texas's History is Under Ideological Attack—from the Right

by John R. Lundberg

A retired oil billionaire is trying to wrest control of the Texas State Historical Association from professional historians because they no longer support a vision of the state's history that gives white Anglo settlers pride of place in a diverse state. 

Texas Politicians Want to Erase What Happened Between Juneteenth and Jim Crow

by Jeffrey L. Littlejohn and Zachary Montz

Joshua Houston, long enslaved by Sam Houston, recognized that the collective work of securing freedom only began with the announcement of emancipation, and that teaching the state's history honestly was part of the struggle for an egalitarian society against people determined to stand against it. 

Can Canada Contain Conflagration?

Image: NOAA

Eleven thousand years ago, all of Canada, save the Yukon Valley, was under ice.  The country was literally a blank slate.  Then warming set in, the land vegetated, people entered, and white Canada evolved into a green Canada.  Lightning and torch kindled fires.  Green Canada burned—and has burned ever since.  Climate is the grandest of the themes that frame Canadian history.

The primordial rivalry between ice and fire has deepened in recent years: the fires are more savage and frequent, the ice is melting more swiftly.  This time the tempo is set not by Milankovitch cycles that regulate the intensity of solar radiation but by anthropogenic fire. People have escalated from burning living landscapes to burning lithic ones – once living, now fossilized into coal, gas, and oil. Canada began burning its combustion candle at both ends.

The serial ice ages that erased and repopulated Canada like a palimpsest have yielded to a fire age that is rewriting history with flame.  An interglacial period has morphed into an epoch in its own right.  Call it the Pyrocene.

You might think that a place so vast, and so combustible, would be filled with fire traditions, a fire literature and fire art, fire institutions, a fire culture in the fullest sense.  But while colonized Canada has a first-world fire establishment, and displays developed-world fire pathologies, it shows a remarkable disconnect between fire’s presence on the land and its manifestation in the culture. This is particularly true among elites who live in cities, not the bush. Only over the past couple of decades has fire entered common currency. Canada may be a firepower, but it punches below its weight internationally.  Its muted presence makes a striking contrast to Australia.

One explanation points to the character of boreal Canada, and two others to how fire fits with Canadian ideas and institutions.

Begin with the boreal environment.  The boreal forest, which houses the major burns, is a landscape informed by extremes, not averages, now exacerbated by global warming.  Burning obeys rhythms of boom and bust.  This makes planning, budgeting, and general bureaucratic operations difficult.  Yet they are expected to stand between the extremes of boreal fire and ordinary life.  Taking the pounding can make them unstable.  Still, Alaska has managed, and Mediterranean climates present similar challenges. 

The other explanations derive from the character of Canada as a confederation.  The great question of culture and politics (and national identity) was how to reconcile colonies that did not want to unite, but were compelled by force, geopolitics, or economics to join together. The Anglophone-Francophone bond was both the largest and softest of these welds. There was little reason (or bandwidth left) to contemplate the boreal backcountry, other than as part of a northern economy of fish, fur, minerals, and timber.  Fire control existed to keep the timber flowing.

The second is that the colonies, now provinces, retained control over their landed estate and its natural resources.  Those provinces carved out of Hudson’s Bay Company lands originally held a hefty proportion of dominion lands, which were organized into national forests on the American model.  In 1930 these lands were ceded to the provinces.  The Canadian Forest Service imploded, surviving as a research program. Fire protection resided with the provinces. A national fire narrative disintegrated into provincial and territorial subnarratives.  The closest American analogy to the provinces may be Texas (imagine a U.S. consisting of 10 clones of Texas).

The provinces struggled to muster enough resources to handle the big outbreaks of fire.  Nearly all had a northern line of control beyond which they let fires burn.  Not until a round of conflagrations between 1979 and 1981 pulverized western Canada and Ontario did pressure build to create an institution that would allow provinces to share firefighting resources on a national scale.  A mutual-aid agreement with the U.S. stumbled because the U.S. wanted to sign a treaty between nations, not between a nation and separate provinces. The compromise was to craft a Canadian Interagency Forest Fire Center as a corporation (not a government entity).  If the U.S. too readily nationalizes fire policies better left to local authorities, Canada fails to nationalize what exceeds provincial interests.

The Canada-U.S. Reciprocal Forest Fire Fighting Accord granted Canada access to a continental-scale cache of firefighting resources (and the U.S. access to Canadian aid when needed).  But the fires have grown meaner, larger, and more frequent, well beyond the capacity of even wealthy nations to corral.  Fires have blasted through Fort McMurray, Slave Lake, and Lytton.  They burrow into organic soils, the major reservoir of carbon in the boreal biome. Now their tag-team smoke palls have flooded Toronto and New York, Ottawa and Washington, D.C. The burning of lodgepole pine and tar sands is measurably perturbing the global climate.  Canada’s fire practices have leaped well beyond its provinces.

Image: Natural Resources Canada

The argument grows, moreover, that relying solely on fire suppression only aggravates the crisis, that excluding fire leaves more of the living landscape available to burn even as climate change powered by burning lithic landscapes bolsters its propensity to burn. Most analysts plead for better forest, land, and fire management programs that work with fire. Forty years ago Parks Canada began this transformation, and now fields a world-class fire program. The Canadian Forest Service continues to publish stellar science.  Most of the larger provinces have fire control organizations that rival anything of their peers; on fire technologies like pumps and aircraft, Canada excels. Ongoing reconciliations with First Nations promise a recovery of indigenous fire knowledge. Yet the whole seems less than its parts.

It’s not enough.  The Pyrocene is coming to Canada with the scale and power of the Laurentian ice sheet.  The country needs to find ways to leverage its many fire talents, not just to advance a global good but because Canada may become ground zero for a fire age.  A shift will mean burning some woods rather than logging them, and not burning the bitumen and oil that Canada has in the ground.  It means matching firefighting with fire tolerating or outright fire lighting. It means kindling in the minds of its literati and elites an appreciation that fire is an indelible – and fascinating – part of living in Canada, as recent books by John Vaillant, Alan MacEachern, and Edward Struzik demonstrate.  It means accepting that, in the global economy of carbon, Canada is a superpower that needs to craft a confederation of institutions, ideas, and tools that can grant it a cultural and geopolitical presence commensurate with its conflagrations.

On the Canadian Red River Exploring Expedition of 1857, Henry Hind observed of a massive fire that “It is like a volcano in full activity; you cannot imitate it, because it is impossible to obtain those gigantic elements from which it derives its awful splendor.”  We still can’t control those elements, though we have managed to disrupt them and so boosted their power.  But we can control our response. 

A Historian of Photographic Defacement in the USSR Faces His Own Erasure

On the day when the Russian human rights organization Memorial became a co-winner of the Nobel Peace prize last year, its activists had very little opportunity to process the news. October 7, 2022 found them huddled in a windowless hall of one of Moscow’s district courts, where Memorial was fighting the capture of its assets by the Russian state. Nine months earlier, the Russian Supreme Court liquidated Memorial under the so-called “foreign agent law,” so the new judicial decision to take over their assets was hardly a surprise. On the day the organization was recognized in Stockholm for its human rights and educational work on behalf of victims of Soviet state terror, it became essentially homeless in Moscow.

While formally the cause of Memorial’s legal woes was its failure to consistently brand itself as a foreign agent in all of its public statements, the words of the prosecutor left little doubt as to the real reason behind the state’s hostility. Memorial’s crime was simply its commitment to commemorating the victims of Soviet-era state violence. As state prosecutor Aleksei Zhafiarov put it during the hearings that led to Memorial’s liquidation, “Why should we, the descendants of victorious people, feel shame and repent, rather than pride ourselves on our glorious past?” The organization’s keen gaze at state terror, he made clear, was perceived by the Russian state as actively harmful in the present.

The same commitment to making Soviet-era state violence visible that landed Memorial in the Russian courts is what animates the groundbreaking volume by Denis Skopin entitled Photography and Political Repressions in Stalin’s Russia: Defacing the Enemy. The book, published by Routledge in 2022 in its History of Photography series, is a conceptually and empirically rich study of a corpus of 57 photographs from Stalinist Russia (many of them from Memorial’s archives), in which faces of people declared “enemies of the state” have been removed: scratched out, painted over or otherwise excised. While the existence of such photographs is well known (one such image, for example, was reproduced on the hardcover of Orlando Figes’ The Whisperers [2007]), Skopin is the first to probe this practice for what it reveals about both Soviet history and the ontology of photography — group photography in particular.

Photography and Political Repressions in Stalin’s Russia consists of five chapters. The first one lays out the historical context of Stalin’s terror before advancing to a close reading of the criminal cases centered on improper handling of representations of political leaders, the subject of Chapter Two. Chapters Two to Four are the theoretical heart of the book, in which Skopin lays out his theory of group portraiture and discusses specific cases of photographic defacement, before concluding this discussion with a fifth, empirical chapter on the defaced photographs in secret police archives. He argues persuasively that Soviet-era photographic destruction, while close in spirit to other acts of iconoclasm, such as damnatio memoriae, has its own distinctive logic. The removal of faces from photographs during the Great Terror, he suggests, was driven by a desire to purify the community depicted in these images by visually ejecting the newly discovered “enemy” from the group. Skopin’s theorization of group portraiture under Stalin takes inspiration from Gilbert Simondon’s concept of “transindividuality,” framing collective portraiture that was so prevalent throughout the Soviet period as a tool for cultivating the supra-individual nature of Soviet subjectivity. Such emphasis on the prevalence of the collective, and on one’s inherent entanglement with it, created problems for everyone who shared the photographic space with the people subsequently denounced as traitors. As the number of such bogus denunciations grew, more and more group photographs were swept into the vortex of a visual iconoclasm that targeted “less the portrait itself than its performative value, namely its power to create a relationship between the iconoclast and the enemy of the people” (141).

Given the immediate political danger posed by the visual evidence of association with an enemy of the state, why would so many such photographs be preserved? Instead of destroying the compromising evidence entirely, why would people keep the photos with the traces of violence done to them? Skopin’s hypothesis is that the images were simply too valuable to be disposed of, and this makes sense, especially given the outsized importance that belonging to a group carried for individual identities in the USSR. But it is also possible that these images were preserved precisely as records of the righteous indignation that drove their owners to iconoclasm. They were there to make that outrage visible and, in this way, to protect their owners by testifying to their loyalties.

The complex motivation behind photo tampering is especially evident in a particular subset of photographs – those of familial groups. Many of these more intimate portraits also carried the signs of violent modification aimed at dissociating the family from a member exposed as “an enemy of the people.” The book considers them congruent with the logic of group self-purification (the group in question this time being the family), but it also notes details that call for an investigation of how the practices around these images differed. While the visual editing of large-N collectives tended toward the dramatic, with faces violently blotted out or torn away, images of familial groups more frequently strove to conceal the signs of their modification, for example by painting a curtain over a figure, or even transforming the sitter’s clothing to hide a Tsarist-era uniform or military award as a preemptive precaution. The point of these images, it seems, was not so much to perform an act of outrage that ejected the traitor from the group as to shield a loved one from the violence of history in an act of care.

While the release of Skopin’s study would be an important intellectual event under any circumstances, the grim irony of its publication in 2022 is that this research is unlikely to reach its readers in Russia for many of the same reasons that fueled the dynamic of mass terror in the Stalin-era USSR. Indeed, in October 2022, just as the Memorial activists were busy fighting the state’s capture of the group’s assets, Skopin’s academic employer, St. Petersburg State University, terminated its contract with the author, citing “immoral behavior” as the reason for dismissal. The immoral act in question was Skopin’s participation in a public protest against the government’s mass mobilization of military-aged men to wage Russia’s criminal war on Ukraine. Skopin, who had to leave Russia after spending ten days in detention, is but one among many Russian academics forced in recent months to choose between silence and abandoning their students; the history his volume probes seems to have come full circle. The very least the rest of us can do is make this violence visible.

What to the Incarcerated Is Juneteenth?

Incarcerated laborers sew military uniforms under the UNICOR (Federal Prison Industries) program 

Juneteenth is a bittersweet day for Black people in prison holding onto the promise of freedom.

Let’s start with history. The Emancipation Proclamation -- issued by Abraham Lincoln on September 22, 1862, during the American Civil War -- declared that on January 1 all slaves in the Confederacy would be “forever free.” Unfortunately, that freedom didn’t extend to the four slaveholding states not in rebellion against the Union, and the proclamation was of course ignored by the Confederate states in rebellion. For the roughly 4 million people enslaved, Lincoln’s declaration was largely symbolic; it could be enforced only where and when the Union Army took control.

But the end of the fighting in April 1865 didn’t immediately end slavery everywhere. As the Union Army took control of more Confederate territory during the war, Texas became a safe haven for slaveholders. Finally, on June 19, 1865, Union General Gordon Granger rode into Galveston and issued General Order No. 3, announcing freedom for those enslaved. There were about 250,000 slaves in Texas when it became the last Confederate state to release African American bodies from the cruelest institution known to American history. By the end of that year, the 13th Amendment had abolished slavery, mostly (more on that soon).

Darrell Jackson: My understanding of Juneteenth developed in prison

Juneteenth has long been a special day in Black communities, but I didn’t learn about it until I went to prison.

In the early 2000s, prisoners at Washington State Penitentiary in Walla Walla decided to hold a Juneteenth celebration. Because the Department of Corrections didn’t treat the day as special, Black prisoners used the category of "African American Cultural Event" (which had usually been used to celebrate Black History Month) as a platform to celebrate Juneteenth. The spirit of liberation moved through the incarcerated population, motivating other prison facilities across the state to follow suit. 

It seems hard to believe, but prior to my incarceration, I knew nothing about Juneteenth. I had taken classes that included American history, but the event apparently wasn’t part of my school’s curriculum. I first heard about that history from other prisoners at Clallam Bay Corrections Center, where I was incarcerated in 2009, and that prompted me to learn more about the ways that African Americans mark Juneteenth. By the time I entered the prison system, Black prisoners had expanded the celebration to include family, friends, and community members, creating an opportunity for prisoners to connect with loved ones in ways that regular visiting did not permit. We were able to choose what foods we ate and provide our own entertainment, using creative ways to communicate inspiring messages to the African American prison population and their families.

One memorable moment for me came in June 2012. The Black Prisoners Caucus was hosting the event, which had not happened since 2007, and a friend and I were asked to perform a few songs. It was my first vocal performance and I was extremely nervous. When we finished, my friend's 7-year-old daughter shouted from the audience, "Daddy, they killed it!" Though I didn't have any family present at the event, that little girl's endorsement etched a long-lasting smile on my heart. Her words were a soothing balm in the face of the stress, self-doubt, depression, and anger I had felt as a prisoner throughout that year.

It’s important for the world to know that Juneteenth holds great significance for the Black bodies who are locked away in prison. But we shouldn’t forget that the 13th Amendment abolished slavery and involuntary servitude, “except as a punishment for crime whereof the party shall have been duly convicted.” For the men and women who are Black and in prison, that exception connects us to our ancestors who were in chains long ago. 

Antoine Davis: The conflict of celebrating freedom while in chains

I can only imagine the effect that June 19, 1865, had on the souls of those trapped in the most barbaric institution in American history. The hope for freedom, passed down from one generation to another, had finally come to pass. Tears from Black faces must have run like rivers, not from the pains of a master's lashes but from the joy of knowing that one’s momma and daddy, sons and daughters, family and friends would no longer live in bondage.

Such images of joy have run through my mind as I have celebrated Juneteenth with Black families and White families, all occupying the same space in the prison's visiting room. While our loved ones laugh and dance, eat and rejoice, the truth about what we celebrate creates for me a tension between joy and grief. The joy comes from recognizing how far we've come as a people. The grief comes from the reminder that while chattel slavery was abolished, a new form continues in a prison system that incarcerates African American people at an alarming rate.

Prisoners aren’t the only people who understand the injustice. For several decades, activists and academics have developed an analysis called “Thirteentherism,” which argues that the 13th Amendment created constitutional protection for the brutal convict-leasing system that former Confederate states created after Reconstruction and which evolved into today’s system of racialized mass incarceration.

Here’s just one statistic of many: In Washington state, 33 percent of prisoners serving a sentence longer than 15 years for an offense committed before their 25th birthday are Black. Black people make up 4.3 percent of the state's population.

The statistics that demonstrate the racialized disparities in prisons make me think of Devontae Crawford, who at the age of 20 was sentenced to 35 years in prison. Although he committed a crime with three white friends, Devontae ended up with more time than all his crime partners put together. Today, all three of them are free, and Devontae still has 30 years left to do in prison.

One of my closest friends, who asked to remain anonymous, also comes to mind. He was sentenced to 170 years in prison after his white friend shot a man during an altercation. Although his friend admitted to being the gunman, this prisoner remains incarcerated while his white friend was released after serving seven years.

Jackson/Davis: Still slaves to the system

As Black men in prison, we live the tension between celebrating the abolition of slavery and struggling inside the criminal justice system that replaced slavery. We prisoners who are left to deteriorate inside one of America's most inhumane systems are able to find joy in celebrating Juneteenth, but not without indignities.

For example, a number of us were told by the DOC that prisoners would have to pay an estimated $1,500 for this year's Juneteenth celebration -- $500 for food, not including the cost to our guests, and $1,000 to pay for additional guards. Juneteenth became a national holiday in 2021, and DOC officials decided that African American prisoners should cover the overtime and holiday pay for the extra guards deemed to be necessary for us to celebrate Juneteenth with our children. 

That’s a lot of money for any working folks, but consider what it means for people who make 42 cents an hour, maxing out at $55 a month. No matter what the job in prison, that’s our DOC wage. This means that if we African American prisoners want to include our children in celebrating the historical meaning behind June 19, we will be forced to give the prison facility roughly 3,600 hours of labor. The irony is hardly subtle: Prisoners in Washington state who work at near-slave wages will have to pay to celebrate a day that represents freedom.

We live with this injustice, but we African American prisoners find a way to maintain joy in the face of adversity. It's never easy to cope with the social conditions that are designed to diminish prisoners’ sense of their own value, but we keep on keeping on. If our ancestors were able not only to survive but also sometimes thrive in the face of enslavers’ disregard for their humanity, so can we. Something as simple as a 7-year-old girl shouting encouragement after a Juneteenth performance can be enough to keep us going.

African American prisoners have learned to embrace all the positives, celebrating freedom when we can while living in modern-day chains.

Maps are the Record of Humans' Imagination of the World

The Fra Mauro Map, 1460

One of the most significant world maps now hangs in the Correr Museum in Venice, Italy. This map is seven feet in diameter, inked on vellum, and covered with over 3,000 inscriptions in old Veneziano, the language of Venice. It was created by an obscure Venetian monk named Fra Mauro, who worked with a team of cartographers, artists, and calligraphers in the mid-1400s at the monastery island of San Michele just off the north shore of Venice. Finished in 1459, the map was a masterpiece of both cartography and artistry, and it is among the most important medieval maps to have survived.

This map was also an inflection point in human history. Fra Mauro’s map was the first to show definitively that a ship could round Africa’s southern tip and sail into the Indian Ocean, thereby opening sea trade between the West and the East. And it described people and goods across many cultures, pointing out to Westerners that there were many other lifeways around the world. But most of all, this map marked the first time that a cartographer moved away from religious mythology and ideology and embraced the science of geography. As such, Fra Mauro’s map foreshadowed Western culture’s passage from the insular Middle Ages into the enlightenment of the Renaissance and the beginning of the Scientific Revolution.

The ubiquitous nature of mapping suggests that diagramming our landscape is an ancient feature of human cognition and behavior, and that we owe much of our evolutionary success to that ability. Maps appeared in the human record before there was writing, and even before there were systems of numbers. Because these drawings were used to represent something else, they were a means of communication and memory and a way to bridge languages and cultures. Among the many maps created by people over time and across cultures, one mode stands out as the most imaginative and creative, and the least practical—the world map. These maps don’t show the way home, guide a traveler, or even record accurately what belongs to whom.

World maps are purely artistic in that they have always been made for grand effect. Mappae mundi are also products of their times: they chart the history of geography and other knowledge, and so these sweeping, impractical showpieces echo the societies in which they were produced. They are talismans of culture, the storytellers of human experience. Their story is our story, and that’s why they matter.

The first world map is a tiny bit of smashed-up clay called the Babylonian map of the world, about the size of a human hand when glued together, and it dates to between 700 and 500 B.C.E. The reconstructed tablet is composed of 8-10 pieces with a hole in the center, which presumably marks the center of the Babylonian Empire. It is incised with rays and circles representing the Euphrates River and a horizontal rectangle that represents the city of Babylon. The following centuries produced various world maps in Greece, the Roman Empire, the Arabic world, and Asia. These maps were made as ancient sailors and navigators began to travel long distances for exploration and trade, and they reflected how their cultures saw the world.

Eventually, the cartographers of Western culture used maps as propaganda to reinforce Christian beliefs and to instill fear of the unknown by portraying mystical creatures, warning about barbarians, and highlighting uninhabitable and presumably dangerous places. And of course, none of these early cartographers had any idea that there were two more continents on the other side of the globe, continents already inhabited by people who had walked, sailed, or rowed there long ago. These medieval Western world maps were encyclopedias of knowledge, but that knowledge was biased and narrow.

Fra Mauro’s map was constructed during the Late Middle Ages, an exciting time for Western culture. The West was just on the cusp of breaking out of its known geography and sailing to far-flung places. But this Age of Discovery (or Age of Exploration) was less about exploring new and interesting places than it was a purposeful financial move. When Europeans moved out of their geographic comfort zone, they were incited by nascent capitalism, that is, the desire to pick up goods and resources from foreign places and sell them back home or elsewhere at a profit.

That burgeoning capitalism was underwritten by a focus on technological improvements in trade ships and navigation. Because of Fra Mauro’s map, for example, one could now imagine rounding Africa and entering the Indian Ocean, which had previously been imagined as a closed sea. As a result, trade with the East could become far more efficient and profitable by rounding the tip of Africa rather than sticking to land routes across Asia. And the map visually described other sea routes for trade and how they might connect to form one vast trading network.

Fra Mauro's map also reflected the various intellectual revolutions that had begun to flower. Like no world map before it, this one was brimming with information from other places and cultures, suggesting there was a wide world out there waiting to be explored and understood. In that sense, Fra Mauro’s map was the first encyclopedia of the known world, and it pointed to a vast diversity of peoples and practices.

Fra Mauro’s map is not just a map of the known world in the mid-1400s. It is also a reflection of the tipping point that brought Western culture out of the Dark Ages into the light of modernity. His creation was a road map to expansion, discovery, trade, prosperity, and domination. And it gave birth to a long series of later world maps. In other words, this map was like a pebble thrown into a pond, sending out unpredictable but sizable rings from the point of impact; it changed world history, shaped how world maps have since been used, and helped establish the scientific discipline of cartography.

How Bob Dylan Ran Afoul of the FBI

James Baldwin and Bob Dylan at a dinner of the Emergency Civil Liberties Committee, where Dylan would give a notorious speech in acceptance of the organization's Tom Paine Award.

The Kennedy Assassination

On November 22, a little more than two weeks after the Newsweek article [a derogatory profile of Dylan], John F. Kennedy was assassinated in Dallas. On December 13, Bob Dylan received an award from the Emergency Civil Liberties Committee. Things did not go well.

Problems arose when Dylan, who had been drinking throughout the ceremony, gave a rambling acceptance speech that reads more like an out-loud, unfiltered internal monologue than a thought-through statement of views, let alone the expected thank-you at an awards ceremony. In part, he said:

So, I accept this reward — not reward [laughter], award on behalf of Phillip Luce who led the group to Cuba which all people should go down to Cuba. I don’t see why anybody can’t go to Cuba. I don’t see what’s going to hurt by going any place. I don’t know what’s going to hurt anybody’s eyes to see anything. On the other hand, Phillip is a friend of mine who went to Cuba. I’ll stand up and to get uncompromisable about it, which I have to be to be honest, I just got to be, as I got to admit that the man who shot President Kennedy, Lee Oswald, I don’t know exactly where — what he thought he was doing, but I got to admit honestly that I too — I saw some of myself in him. I don’t think it would have gone — I don’t think it could go that far. But I got to stand up and say I saw things that he felt, in me — not to go that far and shoot. [Boos and hisses]

Before ending his remarks, he scolded the crowd for booing (“Bill of Rights is free speech”) and said he accepted the award “on behalf of James Forman of the Student Non-Violent Coordinating Committee and on behalf of the people who went to Cuba.” That too was met with boos as well as applause.

Dylan’s internal thought process aside, in most situations in 1963 his comments on Cuba alone would have been enough to get him into trouble, but given the proximity to the Kennedy assassination, his remarks about Oswald were unequivocally verboten. As a result, he was forced to issue a public apology, though the apology, consistent with Dylan speaking for himself alone, reads as a further elaboration of his own internal thinking:

when I spoke of Lee Oswald, I was speakin of the times I was not speakin of his deed if it was his deed the deed speaks for itself.

Apology or not, the speech had repercussions. Among other things, the incident found its way into the FBI’s files — by way of his girlfriend Suze Rotolo. As a report in her file noted:

ROBERT DYLAN, self-employed as a folksinger appeared on December 13, 1963, at the 10th Annual Bill of Rights Dinner held by the ECLC at the Americana Hotel, New York City. At this dinner, DYLAN received the Tom Paine Award given each year by the ECLC to the “foremost fighter for civil liberties.” In his acceptance speech DYLAN said that he agreed in part with LEE HARVEY OSWALD and thought that he understood OSWALD but would not have gone as far as OSWALD did.

A more elaborate account of the incident showed up in the nationally syndicated column of Fulton Lewis, Jr., which ridiculed the entire event but made a point of getting Dylan’s remarks across. For example, Lewis characterized James Baldwin, also honored at the event, as a “liberal egghead whose books dot the best seller list,” and Robert Thompson, another attendee, as “the top-ranking Communist official once convicted of violating the Smith Act.” He then delivered his shot at Dylan:

The ECRC Tom Paine Award went to folksinger Bob Dylan, who wore dirty chinos and a worn-out shirt. He accepted the award “on behalf of all those who went to Cuba because they’re young and I’m young and I’m proud of it.” He went on to say that he saw part of Lee Harvey Oswald “in myself.”

What is striking about the column is that it reads as though Lewis were at the dinner, though he never says as much, nor does he cite any source for what is a very detailed description of the event. So either he failed to mention his attendance — his byline has him in Washington, the dinner was in New York — or he received a rather detailed report from an unnamed source.

All this might be explained by the fact that Lewis had a friendly relationship with the FBI. An FBI memo from October 1963 listed anti-communist writers “who have proved themselves to us,” including journalists Paul Harvey of ABC News, Victor Riesel of the Hall Syndicate, and Fulton Lewis Jr. of King Features Syndicate.

That particular mystery might be answered by information in the FBI file on Bob Dylan, which recent governmental releases show was indeed created. Specifically, there is an FBI report on the Emergency Civil Liberties Committee that includes a table-of-contents listing for a report on the dinner. Unfortunately, the actual report is not included in that document, though there is a notation on the informant — coded as T-3390-S — who supplied information on Dylan. Beyond that, there is a report from January 1964 that references a file on Dylan himself, though there he is called “Bobby Dyllon.” Bob Dylan, in other words, was the subject of a more particular kind of FBI attention.

While most writing on Dylan in this period focuses on his personal decisions and behavior, what is clear in looking at the concentrated events in his most political period is that he confronted a considerable amount of scrutiny and hostility. He was ridiculed in the media, kept from performing certain material on television, and had his spontaneous remarks used to justify the opening of an FBI file. Dylan, in other words, was up against more than he realized. In this, he was not alone.

Excerpted with permission from 

Whole World in an Uproar: Music, Rebellion & Repression, 1955-1972

Aaron J. Leonard

Repeater Books, 2023

Jared McBride Sheds Light on the Darker Parts of Ukraine's History

Ukrainian Auxiliary Police during Nazi occupation, c. 1943. Photo Bundesarchiv. 

Jared McBride, an Assistant Professor in UCLA’s History Department, sat down with HNN to discuss his research into 20th century violence in Ukraine. McBride specializes in the regions of Russia, Ukraine, and Eastern Europe, and his research interests include nationalist movements, mass violence, the Holocaust, interethnic conflict, and war crimes prosecution. His research has been funded by Fulbright-Hays, the Social Science Research Council, the Mellon Foundation, and the Harry Frank Guggenheim Foundation and has been published in Holocaust and Genocide Studies, Journal of Genocide Research, The Carl Beck Papers, Ab Imperio, Kritika, and Slavic Review. At present, McBride is completing a book manuscript titled Pathways to Perpetration: Violence and the Undoing of Multi-Ethnic Western Ukraine, 1941-1944 that focuses on local perpetrators and interethnic violence in Nazi-occupied western Ukraine.

Q. In 2017 you wrote an article for Haaretz, a leading Israeli newspaper, which condemned the “mythmaking” attempts in Ukraine (then led by President Poroshenko) to “whitewash” the involvement of nationalist Ukrainians during WWII in terrorism against Jews and members of the Polish minority in Western Ukraine. Now, some six years later, the government of Ukraine is headed by Volodymyr Zelensky. In recent years, have Ukrainian museums or local municipalities begun to acknowledge the role of local people in supporting the Nazi invaders in WWII?

Many scholars assumed the government of President Zelensky, which positioned itself as centrist and outside the usual divides in the Ukrainian political landscape, would mark a break from the more cynical memory politics regarding 20th century history employed by the Poroshenko government. Until the start of the new war in 2022, this appeared to be true. Crucially, one of the most common barometers of policy shifts concerning the past is how the often-controversial Institute for National Memory is staffed and how it orients its projects. In this case, Zelensky clearly opted for a more moderate and respected leader and for inclusive projects meant to bridge divides rather than create them. How the Russian invasion will ultimately shape these politics remains to be seen. Concerning museums and municipalities, the assessment remains mixed. Ukraine's Decommunization Laws led to the removal of many Soviet-era markers, which is certainly understandable, but the replacement of some of them with monuments to individuals who served in Nazi-led battalions and police forces has been met with less sympathy.

Still, it is important to note the latter does not represent most of the new memorialization efforts, many of which honor important and non-controversial Ukrainian figures from the last two centuries. In terms of other prominent and public spaces, we find similar tensions and growing pains. More controversial spaces like the Prison on Lontskoho in L’viv continue to operate, whereas Ukrainians have made progressive efforts to mark spaces in commemoration of where other ethno-national groups lived and died on Ukrainian soil. I’d therefore like to highlight the prolific work of the Rivne-based NGO Mnemonics, which has completed projects like memorializing the site of the Jewish ghetto and even laying stumbling stones (Stolpersteine) throughout the city, among a great deal of other work. Finally, the fate of the endlessly byzantine process around the Babyn Yar commemoration project in Kyiv remains to be seen, but it should say a lot about the future treatment of these issues in a new Ukraine.

Q. How did you first get interested in this subject?

During my first year of college at Northeastern University, I took a course taught by Professor Jeffrey Burds that focused solely on the Second World War on the Eastern Front. This course highlighted various aspects of the war in the East, including the intelligence front, partisan movements, local collaboration, the Holocaust, ethnic cleansing, and sexualized violence. In doing so, Dr. Burds exposed undergraduates to cutting-edge research on these topics through his own path-breaking work and that of others. When I took the course in the late 1990s, the field of study was rapidly developing, so it was the perfect time to be exposed to these themes.

Shortly after the course ended, I began studying Russian and I followed up by learning German, and eventually studying Ukrainian in graduate school. I was able to put my Russian to use in two undergraduate research trips to Russian archives in Moscow where I began to work with primary source materials on the war. These experiences motivated me to seek a PhD in Russian/Eastern European history.

Q. How have students reacted when you lecture on this topic? Your scholarly articles discuss mass killings and torturing of women and children. Have any students complained about being exposed to potentially traumatic descriptions and images?

My experience teaching on these topics, first at Columbia on a teaching post-doc, and second, at UCLA since 2016, has been overwhelmingly positive. My courses on Eastern Europe in the 20th century and the Soviet experience are always full. I find students are curious and enthusiastic to learn about some of the many difficult moments of the 20th century. Most do not seem to come to the classroom with the preconceived notions about the region, positive or negative, that I believe children of the Cold War, like me, had when we took these classes in the eighties or nineties. I also organize a team-taught course at UCLA each year on political violence and genocide, called Political Violence in the Modern World, for over two hundred first-year students. My experiences running this large course have been no different over the past four years – UCLA students can and do work through sensitive material in a respectful and engaged manner.

Q. In past years you were able to travel to Russia and Ukraine and get access to records. Has that availability changed because of the war in Ukraine?

I was able to complete most of my dissertation (and now book manuscript) research in Ukraine and Russia before the events of the 2014 Maidan Revolution, so I did not have any access issues at the time. Access to Soviet-era archival materials in Ukraine only improved after the revolution and the arrival of the Poroshenko government thanks, somewhat ironically, to a suite of controversial laws known as the Decommunization Laws. While controversial in terms of memory politics, the laws simplified access to the archives, including former KGB archives, and this was a boon for historians like myself. The war in Ukraine has not shuttered the archives – I know some colleagues continue to go, and I have been able to support seasoned research assistants who can still access materials – but the war has unquestionably hampered the ability of young Ukrainian scholars to complete their work. Russian missiles have also damaged some holdings, which is terrible for scholars.

Russia has been the inverse of Ukraine in recent years: its archives have grown more restricted, especially for foreigners. Accessing Russian archives will likely prove increasingly difficult, and though there have been recent efforts to create crowd-sourced digital repositories for scholars, nothing truly replaces the experience of working on-site. The future is concerning for Soviet studies and archival research in Russia and Ukraine, but ultimately what matters most is that the war ends, and Ukrainians can rebuild their lives and livelihoods. Scholarship is second to survival.

Q. Please tell us a little bit about the book you are working on, Pathways to Perpetration: Violence and the Undoing of Multi-Ethnic Western Ukraine, 1941-1944.

My book expands upon my earlier work on local perpetrators in multiethnic settings. It is a micro-level social and political history of the Nazi occupation of western Ukraine. It examines the motivations of those who participated in various arenas of violence during the war, including pogroms, the Holocaust, ethnic cleansing, and paramilitary violence. Throughout, I demonstrate how the social identities and group formations that are typically assumed to have caused the violence were instead caused by it, and that political choices, rather than being anchored in pre-existing ideologies and beliefs, were more dynamic and situational than previously argued.

My conclusions therefore challenge overriding nationalist and primordialist interpretations of the war and people’s decisions in it. This integrative account of local perpetrators and decision-making is based on 10-plus years of research in Russia and Ukraine using sources in five languages from eighteen archives, including post-war Soviet investigations, newly declassified KGB trials, German documents, and personal accounts.

The "Surreal" SCOTUS Case on Indian Adoptions A pivotal SCOTUS case, Haaland v. Brackeen, centers on the Indian Child Welfare Act (ICWA), landmark legislation passed in 1978. With a decision coming at any moment, I spoke with Matthew L.M. Fletcher, the Harry Burns Hutchins Collegiate Professor of Law at Michigan Law.

Professor Fletcher teaches and writes in the areas of federal Indian law, American Indian tribal law, Anishinaabe legal and political philosophy, constitutional law, and legal ethics. He sits as the Chief Justice of the Pokagon Band of Potawatomi Indians and the Poarch Band of Creek Indians.

Professor Fletcher and I discussed the history that led to ICWA’s passing: namely, the government’s removal of Native children from their homes dating back to the Civil War. A condensed transcript edited for clarity is below.

Ben: Professor Fletcher, thank you so much for being here.

MF: Thanks for asking me. 

Ben: Of course. Today I want to contextualize Haaland v. Brackeen, a case whose implications could reach far beyond Texas, where the dispute originated.

To kick us off, could you please give a quick summary of the case? 

MF: Sure. Haaland v. Brackeen is a challenge to the constitutionality of the Indian Child Welfare Act by the state of Texas and three couples who are attempting to adopt (and actually already have adopted) American Indian children.

Ben: To my understanding, there's a lot of history that precipitates this kind of action. Can you describe how the US government and US citizens began targeting Native children through schools in the 1800s, please? 

MF: Prior to the Civil War, there were a lot of boarding schools and day schools for Indian children funded by the United States government and operated mostly by religious institutions. Most of the schools were actually not that oppressive, but after the Civil War, President Ulysses S. Grant, who’d been the lead Union general, gave a lot of political appointments to his former military buddies. 

Suddenly, former military officers had enormous political power over Indian people. The boarding schools became like boot camps, and the US eventually made it mandatory for Indian children as young as four or five years old to be relocated, usually off their reservation, to these military-style schools.

Ben: Why did the US government force Native children into schools run by people whom it really doesn't sound like you'd want to go to school with?

MF: Throughout the 19th century, Indian tribes signed treaties with the US—not all 574 federally recognized tribes, but most tribes outside of Alaska and California did. And one of the things that tribal negotiators asked for was educational assistance.

So, peppered throughout the treaties were promises by the US to educate Indian kids in day schools: to teach them English as a second language, western math, and science—basically, the sort of knowledge that would help tribes integrate into the larger community while living in their homes.

But the US perverted this request for educational opportunities, transforming it into mandatory boarding schools. The reason, of course, was quite genocidal and ethnocentric. The schools were designed to destroy tribal communities by taking Indian kids away from their homes so that they’d forget about their language, culture, and religion; to dress them up as non-Indians and teach them menial labor so that there would be skilled, almost slave labor for many local farmers.

Ben: As egregious as it sounds, and worse. You've written interestingly about differing conceptions of children within American political philosophy and within Native communities. There’s a slight difference...

MF: A dramatic difference. Anishinaabe people, like a lot of Indigenous people, treat children almost like supernatural creatures when they're babies, infants, and toddlers wandering around. Many people ascribe to them sort of a mysticism. You don't mess with your children because they're still partly in the spirit world. And in essence, even if you don't believe in any of that stuff, it means that in Native cultures children are equal members of society. 

But when I went to law school, one of the first things we learned (to the extent that students learn this anymore) is that in US society, children are effectively the property of their parents. That’s one reason why boarding schools and the government viewed Native children as so easily removable.

Ben: How did placing Native kids in boarding schools lead to the adoption of Native kids?

MF: During the Great Depression, the government decided to get out of the business of educating Indian children. The boarding school practice was very expensive, so politicians turned most of the responsibility over to states and actually paid them to accept Indian children into their public schools.

This practice quickly turned into a project by states to remove Indian children from their reservation homes and adopt them out to non-Indians. States kept going after tribes for the same reason that the federal government did: to try and “kill the Indian and save the man,” so to speak.

Ben: How did the government’s policy of “termination” spur further adoptions after World War II? 

MF: During the war, the US zeroed out the Indian Affairs budget because all available resources went to the war effort. Interestingly enough, a lot of Indian people thrived during World War II because many Native men and women went to war and received regular paychecks and sent some money home.

But as World War II ended and the Cold War began, Congress began to rethink Indian affairs. We hear a lot about the Red Scare and McCarthyism, but the government went after Indian tribes, too, because many tribes owned their land communally. People were complaining to Congress that right in the heart of Indian Country was a bunch of communist nations. 

This unique and acute political pressure led the United States to adopt “termination.” Under this policy, the government “terminated” many tribes and sold off their property. Ironically, this kind of land expropriation was more of a communistic practice, but I guess there’s no irony in the world anymore.

Ben: I believe it was Yoda who said “the self-awareness has never been strong with Congress.”

MF: By the 70s, the adoptions were endemic. Investigations by the Association on American Indian Affairs found that over the preceding decades, 25-35% of all Indian children in the United States had been removed and adopted out. About 80-90% of their adoptive families were not Indians.

Ben: This leads us back to ICWA. How did the Association’s investigations spark the landmark legislation?

MF: Enormous kudos have to go to the Association. In the late 60s, they were really the first national organization to pay attention to the problem of Indian child welfare and removals. And you know, nobody wanted to talk about that sort of thing. Even in tribal communities, when you lose your children to the state, it's easy to just curl up and not think about it anymore—it's horrible. 

Also, Indian people didn't have lawyers. So the Association started providing them with lawyers and publicizing what was going on. The Association had its own lawyers, too, who drafted legislation that became the Indian Child Welfare Act in 1978. The law was designed to correct the litany of due process violations that separated Indian families endured, as well as to correct some of the racism and ethnocentrism in child welfare.

Ben: And now ICWA is under attack in Haaland v. Brackeen. Would you mind giving a few more details on the case? Who are the Brackeens? 

MF: The Brackeens are Christian evangelicals who wanted to adopt babies to save them. They fostered two babies before getting a third, who is Cherokee and Navajo. Since the child’s parents couldn’t rehabilitate themselves, his relatives in the Navajo Nation wanted to adopt him, but the Brackeens argued that they were the better parents. The Texas courts ruled in the Brackeens’ favor because, effectively, they have more money, and because many biases inherent to adoption cases worked in their favor.

Now, you might be wondering how the case made its way to the Supreme Court. Well, right before the order that said the Brackeens were going to win, their attorneys, who are movement conservatives, encouraged them to act as shills in a separate case in which the state of Texas sued the US and challenged ICWA.

So the Brackeens have custody of the child. The adoption fight is over. But the case that Texas filed on their behalf continues. 

Ben: Why did Texas seize the opportunity to challenge ICWA? Is it correct to suggest then that there are... greasier motivations at play?

MF: There are a few possible explanations. One of the plaintiffs’ claims is that ICWA violates the equal protection principle implied in the right to due process. They’re essentially saying the law is racially discriminatory towards non-Indians.

If SCOTUS agrees, it wouldn’t have much impact on ICWA, because the provision of ICWA that this claim challenges is rarely invoked. But it could weaken other laws protecting Indians, such as laws protecting tribal land. I think this legal rationale is kind of tenuous, but the fact that many oil and gas interests have filed briefs in the case does make you wonder why they care about child welfare.

There’s also a kind of stupid states’ rights issue at play. Texas is more or less saying Congress has forced it to abide by ICWA and treat Indian people like humans. In reality, the state has a terrible child welfare system, and its solution is to turn it over to private entities—to the same people who own private prisons. So that might be why you get groups like the Heritage Foundation and the Cato Institute involved in this case: they would like to privatize government, too.

The outcome really depends on what arguments attract the Court’s attention. I’ll say this: some of ICWA, if not all of it, will go away. This court is incredibly conservative, and they want those who supported their rise to power to know that they’ll help destroy civil rights statutes.

Ben: So to summarize, we're seeing a kind of classic mixture of conservative desires at play, whether it be privatization, possible resource exploitation, or just general striking down of civil rights and protections of Indian Country for the sake of striking those protections down.

MF: It's just so impossible to understand. There are centuries of precedents that don't seem to matter to the justices. It’s also surreal that we’re even talking about this. To reiterate, the Brackeens already won custody, and usually, SCOTUS kicks people out of the Court in that kind of scenario. You don't usually get to keep litigating to say you win again, but here we are.

Ben: Right, I have enough trouble winning anything once, let alone twice, so I’d probably stop there.

Professor Fletcher, this has been illuminating. I really appreciate your contextualizing 150 years of history and tribal law.

MF: Sure thing.

The Roundup Top Ten for June 16, 2023

Lady Vols Country

by Jessica Wilkerson

The author remembers Pat Summitt's championship women's basketball teams at the University of Tennessee as a demonstration of how sports "encompass a battleground for determining how gender manifests in the world, how women and girls can use their bodies, and who can access self-determination."

The Targeting of Bail Funds Is an Old Weapon in the Civil Rights Backlash

by Say Burgin and Jeanne Theoharis

Atlanta and Georgia law enforcement's arrest of the leaders of a fund dedicated to securing bail for protesters opposing "Cop City" shows that protest movements have long depended on bailing out activists, and the forces opposed to change have long known it. 

The Lesson of Germany's Most Famous Trans Woman? Freedom Requires Joy

by Samuel Huneke

As anti-trans and homophobic legislation and rhetoric pervade the political arena, this Pride month may feel less than celebratory. A historian of queer life under both Nazism and East German Communism says it's a mistake to embrace doom. 

Is Oklahoma's Religious Charter School Good News for Secularists?

by Jacques Berlinerblau

Oklahoma recently approved the first publicly-funded religious charter school in the United States. Is it possible that this ambitious move will backfire when schools representing all denominations and faiths demand equal treatment? 

Pat Robertson Helped Make Intolerance a Permanent Plank in the Republican Platform

by Anthea Butler

Pat Robertson lived at the intersection of public piety, apocalyptic rhetoric, and the pursuit of profit, and did as much as anyone to make the vilification of opponents as threats to the moral fiber of the nation a part of conservative politics. 

DEI Education in America Goes Back to the 18th Century

by Bradford Vivian

The pioneering Quaker educator Anthony Benezet implemented student-centered reforms that accounted for differences in background and experiences of injustice, reflecting the spirit, if not the language, of contemporary DEI principles in education and teacher training. 

Turning Universities Red

by Steve Fraser

American colleges were built to serve the children of elites and maintain the social order they dominated. Despite fears of liberal indoctrination on campus, growing labor movements that include all campus workers are the only way colleges will really help make a more egalitarian society. 

Indoctrination in Schools? How About a Century of Capitalist Propaganda?

by Jennifer Berkshire

A century ago, the electric power industry faced an existential crisis as government mulled over public rural electrification programs. Their solution was to provide teachers and schools with propaganda for the magic of private enterprise, the first wave of business's efforts to control curriculum.

The Forgotten History of Japanese Internment in Hawaii

by Olivia Tasevski

Although Hawaii is associated with the United States being the victim of foreign attack, the history of the internment of Japanese Americans on the islands should also remind us of the U.S. government's human rights abuses. 

The Daiquiri is the History of American Empire in a Cocktail

by Ian Seavey

"The daiquiri rose to prominence as a direct result of the American imperial project in the Caribbean during the burgeoning classic cocktail age from 1860 to 1920."

California's Collusion with a Texas Timber Company Let Ancient Redwoods be Clearcut

Old-growth redwoods, Headwaters Forest Reserve (CA)

In 1985, fresh out of college, I took a job as a reporter in my hometown of Guerneville, a small village set alongside the Russian River in western Sonoma County, California. Among my many beats was the North Coast timber industry. At first I focused on the logging of second-growth redwood forests by international timber giant Louisiana-Pacific. Yet just three months after I’d landed that reporter’s job, a Houston company called Maxxam completed one of the most consequential corporate buyouts of the era by leveraging $754 million in high-yield “junk bonds” to purchase the 117-year-old Pacific Lumber Company.

At the time, Pacific Lumber owned the very last private inventory of old-growth redwood forest still standing outside of parks. The company’s 200,000 acres of forestland included 8,000 acres of virgin redwood, and 56,000 acres of “residual” old-growth redwood forest. It was the single largest expanse of old-growth redwood remaining in the world, inside or outside of parks.

At first I investigated Maxxam’s redwood liquidation as a reporter. But after I visited the targeted forests I quit my job and moved to Humboldt County, a coastal outback two hundred miles north. Searching for a means of saving these ancient redwood groves, I co-founded a North Coast chapter of the radical environmental movement Earth First!. Our small group of determined activists camped high in the canopy of old-growth redwood trees, blockaded roads, and occupied the Golden Gate Bridge to draw national attention to what turned out to be the illegal destruction of the last ancient redwoods.

When I wasn’t dangling from trees, I was researching and publishing, which turned up several startling discoveries. It was shocking enough that a Houston oil and real estate company could employ the services of three soon-to-be-convicted financial felons—Ivan Boesky, Boyd Jeffries, and, at the notorious junk-bond firm of Drexel Burnham Lambert, Michael Milken—to secure a hostile takeover of, and immediately set to liquidating, the world’s last ancient redwoods. But what was especially jarring was the understanding that the California Department of Forestry (CDF), which was charged with enforcing statutes designed to protect soils, streams, and habitat, would approve every one of Maxxam’s destructive timber harvest plans (THPs), no matter that most of them violated several state and federal environmental laws.

Prior to the takeover of Pacific Lumber, Maxxam’s legal eagles no doubt understood that in 1970 California lawmakers had passed an unusually effective environmental law called the California Environmental Quality Act (CEQA). Three years later the state also passed the California Forest Practice Act (FPA). At the time the Forest Practice Act was touted as a means of curbing abusive logging practices, particularly in the redwoods. More accurately, the Forest Practice Act allowed timber companies to evade the far more restrictive requirements of CEQA—no matter that such evasion was illegal, as several court challenges would prove.

CEQA required that a company proposing a “project” in California must first provide an environmental analysis that takes into consideration the potentially “significant, cumulative, adverse environmental effects” of that project when considered alongside “past, present, or reasonably foreseeable future” development plans in the area. This was a very high bar, especially when applied to logging, among the most cumulatively destructive activities in California at the time. Properly enforced, CEQA would have prevented the scorched-earth logging that has always occurred in California, particularly in the state’s minuscule stands of remaining ancient forest. In contrast, the Forest Practice Act required that private timber companies submit just a short, check-box “timber harvest plan” whose level of environmental review was cursory at best. The Forest Practice Act reconstituted the California Department of Forestry as an enforcement agency, but in reality CDF served as a bulwark that assuaged the public and fended off challenges to destructive logging.

In 1985 the Humboldt County-based Environmental Protection Information Center (EPIC) penetrated the bulwark by securing a state appellate court decision against timber giant Georgia-Pacific Corporation and CDF for submitting and approving, respectively, a timber harvest plan designed to clear-cut the very last 75-acre grove of virgin redwood forest (called Sally Bell Grove) on California’s famous Lost Coast. The court ruled that timber companies and CDF must comply not just with the Forest Practice Act, but also the “cumulative effects” clause of CEQA.

Nonetheless, a confederacy of state officials—from the nine-member state Board of Forestry (made up primarily of loggers, ranchers, and academics loyal to the timber industry), to the Director of CDF in Sacramento, to CDF foresters on the ground—almost to a person refused to properly enforce CEQA, or even to consider it. Department of Forestry officials were largely culled from the same forestry programs—at Humboldt State University and UC Berkeley—as were corporate foresters. They all knew each other, and some of them were related. At the time of the Maxxam takeover of Pacific Lumber, the state foresters were also under pressure from Republican Governor George Deukmejian, a great friend of the timber industry, to maintain a heavy cut. State officials stuck with the weak provisions of the Forest Practice Act and approved every Maxxam timber harvest plan with dizzying haste.

At the end of 1986 I compiled a list of the fifty-two timber harvest plans that CDF had approved that year for Maxxam, for a total of 10,855 acres—more than double PL’s acreage of the previous year. Of this expanse, 9,589 acres contained old-growth forest, virgin and residual. There would be no more selective logging. This was an appalling rate of cut. Every standing old-growth tree—hundreds of thousands total—would be leveled as the amount of lumber that Maxxam extracted from the redwoods actually tripled. The cumulative environmental effects of this liquidation were certainly significant; more so, they were extreme. Whole watersheds began unraveling as ten timber crews carved abusive roads and denuded miles of terrain in the most rapid acceleration of old-growth redwood logging in history. Rivers and creeks filled with up to fifteen feet of sediment, and salmon, along with all manner of terrestrial wildlife, disappeared.

By 1986, thanks to the EPIC lawsuits against Georgia-Pacific, CDF foresters were now forced to provide at least a cursory CEQA review of logging plans. Cursory became Orwellian. In approving a Maxxam clear-cut of 125 acres of virgin redwood, a CDF forester addressed CEQA’s questions of “significant, cumulative, adverse environmental effects” in this way:

Tractor logging and new road construction will contribute to surface soil erosion, but it is unlikely to be significant at this time. Mass soil movement may happen but to say that it will be significant is to [sic] early. New road construction and tractor logging may somewhat decrease water quality but only for a short time period. It cannot be judged at this time if it will be significant. This stand of old-growth timber has direct access to the public, however it cannot be judged at this time if aesthetics would be significantly impacted. No endangered species were noted during the inspection. Old-growth timber has been noted to shelter all types of plants and animals. Unless one observes these species it cannot be judged if any significant impact would occur.

Department of Forestry officials specialized in throwing wildlife under the bulldozer. Incredibly, they often argued in Orwellian fashion that wildlife would “benefit” from the destruction of the extremely rare virgin redwood groves held by Pacific Lumber. “A general short-term improvement for wildlife habitat is seen from this plan,” CDF wrote of a 294-acre old-growth redwood logging plan on Chadd Creek, adjacent to Humboldt Redwoods State Park. Likewise, a “possible minor improvement to wildlife habitat” would result from an 88-acre clear-cut of old growth on Corner Creek, and “wildlife habitat may be improved” after 309 acres of old-growth clear-cutting on Larabee Creek. On Strongs Creek, where Pacific Lumber would clear 760 acres of old-growth redwood, “Some deer habitat will be improved.”

In May 1987 I attended a CDF “review team meeting,” during which state foresters examined two Maxxam timber harvest plans that proposed clear-cutting 274 acres out of the heart of 3,000-acre Headwaters Forest. I had discovered and named Headwaters Forest just two months earlier. Headwaters was the largest remaining island of virgin redwood forest still standing outside of parks; the value of its old-growth habitat was undeniable. Headwaters Forest stood as the only ancient forest remaining in the Humboldt Bay watershed, and it was one of the most important breeding areas for the endangered marbled murrelet.

During the review team meeting, I asked Stephen Davis, the Humboldt County-based CDF forester who reviewed Maxxam’s new logging plans, about the cumulative effects of logging Headwaters Forest. Davis was sanguine. He saw nothing amiss in the ecological dismantling of the grove. On paper, prior to the meeting, Davis had whipped through CDF’s “Cumulative Impacts Checklist” for the THP (positioned as an afterthought at the end of the review document after EPIC’s court victory) as if he were renewing a driver’s license. A checklist question asked whether the clear-cutting would cause “significant adverse cumulative environmental effects…to fish or wildlife or their habitat.” Davis had answered, “No. Minimal impacts will occur to these values; some wildlife may benefit.”

Davis was sitting directly across from me. I pulled out a cassette recorder, turned it on, and placed it on the table in front of Davis. Reading over Davis’s cumulative impacts checklist, I asked him, in the language of the California Environmental Quality Act, if clear-cutting nearly 10 percent of the world’s largest remaining unprotected grove of ancient redwoods wouldn’t significantly, cumulatively, and adversely impact the rare, threatened, and endangered species that depended on the grove for survival.

“I don’t think so,” said Davis.

“You don’t think so?” I asked.

“No.”

“What makes you not think so? Once the old-growth habitat is gone, how will the wildlife species that depend on that habitat survive?”

“What habitat are you speaking of?”

“The old-growth-forest habitat.”

“Who?”

“This”—I pounded my fingers into a THP map that was on the table—“this old-growth-forest habitat.”

Davis said, “I don’t think there’s a cumulative effect on those.”

“You don’t think so? You don’t think that by eliminating old growth in general, old-growth-dependent species will also be eliminated?”

“There’s plenty of habitat out there.”

The Department of Forestry quickly approved the THPs. Just as quickly, EPIC sued. The organization won this and nearly every lawsuit that it brought against Maxxam and CDF, exposing the willingness of both a voracious Houston corporation and California officials to violate state environmental laws (EPIC would also soon win a federal lawsuit against Maxxam that enforced the Endangered Species Act). But EPIC could only litigate against individual plans—the courts refused to consider a lawsuit targeting a company’s entire operation, no matter the obvious cumulative devastation.

In 1999, the state and federal governments paid Maxxam $480 million for 3,000-acre Headwaters Forest. It was an extraordinary sum in the face of lawsuits that had locked up the grove and rendered its actual value closer to $50 million. After the deal, Maxxam continued cutting virtually every remaining tree of value on its 200,000 acres until, as if on cue, in January 2007, Pacific Lumber declared bankruptcy.

In two decades Maxxam had liquidated nearly all of Pacific Lumber’s assets, valued at between $3 billion and $4 billion. Twelve hundred employees lost their jobs. Pacific Lumber still owed bondholders $714 million, virtually the same debt incurred at the time of the junk-bond takeover in late 1985. Nonetheless, Maxxam, thanks to profits realized in the liquidation of Pacific Lumber, had made the Fortune 500 list eight times between 1989 and 1998. It was a sordid, though long predicted, ending to a brief history of ancient redwood liquidation.

This excerpt from The Ghost Forest is published by permission of PublicAffairs.

Recovering the Story of the Empress Messalina After a Roman Cancellation

From "Messaline Dans La Loge de Lisisca," Agostino Carraci, 16th c., depicting the rumored moonlighting of the first-century empress in a Roman brothel.

Towards the end of 48 CE a workman carried his tools down into a tomb on the outskirts of Rome. Among the rows of niches, he found the urn holding the ashes of Marcus Valerius Antiochus. Antiochus had been a hairdresser and the freedman of the empress Valeria Messalina – a fact he had been proud enough of to record on his tombstone. The workman took out his tools; his job that day was to chisel off the empress’s name.

Messalina had been killed that autumn, in the midst of a scandal that had rocked Rome. She’d been accused of bigamously marrying one of her lovers and plotting to oust her husband, the emperor Claudius, from the throne. The real reason for Messalina’s fall probably lay more in the power plays of court politics than in some grand, mad, bigamous passion, but it didn’t matter. A succession of her alleged lovers were executed, and then, fearing that Claudius might be swayed by love for his wife, an imperial advisor ordered that Messalina herself be killed before she had the chance to plead her case.

Tacitus, the great historian of Roman tyranny, recorded that Claudius hardly reacted when the news of his wife’s death was brought to him at dinner — he simply asked for another glass of wine. Claudius seemed to want to forget completely, and the senate was willing to help him. They decreed that every trace of Messalina — every image of her, and every mention of her name — should be destroyed. It was only the second time in Roman history that an official order of this kind, now referred to as damnatio memoriae, had been passed. The decree applied to both the public and private sphere; statues of Messalina were dragged off plinths in town squares and domestic atria before being smashed, or melted down, or recut. Mentions of her name were rubbed off official records, and chiselled equally off honorific monuments and hairdressers’ epitaphs.

Damnatio memoriae has sometimes been referred to as a form of ancient Roman “cancel culture,” but this was a process utterly unlike modern cancellation — one that could not be replicated today. In the age of the internet someone might be unfollowed, their invitations to speak at official events rescinded; they might be attacked in op-eds. Their name might even become unmentionable in certain circles. But while the reach and influence of “the cancelled” might be reduced, the evidence of their existence and actions cannot be destroyed. Their government records and Wikipedia pages still record their birthdate; their tweets, however dodgy, are still cached in some corner of the internet. They can post videos of themselves crying and apologizing, tweet a glib brush-off, or publish ten-thousand-word tracts of self-justification. The cancelled might be dismissed, but they cannot be erased.

The situation was different in 48 CE. The sources of information about Roman political figures were less varied and more traceable than they are today — and the media through which such information was disseminated, generally more smashable.

The public image of imperial women like Messalina was carefully controlled. Official portrait types were developed, copies of which were sent off to cities throughout the empire, where they were copied and recopied again for public buildings, shop-windows, private houses. These statues, along with coin types and honorific inscriptions, were designed to present Julio-Claudian women as icons of ideal Roman femininity and imperial stability. Messalina’s best-preserved portrait is almost Madonna-like: she stands, veiled, balancing her baby son Britannicus, then heir to the empire, on her hip; coins minted in Alexandria depict the empress as a veiled fertility goddess, carrying sheaves of corn that promise the prosperity of imperially protected trade routes. Such a coherent image could be destroyed almost wholesale — especially when driven by an official, central edict rather than simply by a shift in popular consensus; there is only one surviving statue of Messalina that was not discovered pre-broken by the conscientious minor officials of the mid-1st century.

So where does this leave the historian? At first glance the situation is dire — our information about imperial Roman women is always limited, and in this case much of that information has been purposefully and systematically destroyed. On reflection, however, it is more complex; the destruction of Messalina’s images and honours created a vacuum, and an opportunity.

The official narrative of the Julio-Claudian rulers, expressed in stone and bronze, was always supplemented by a secondary, ephemeral narrative of rumor. This was a period in which politics moved away from the public arenas of the senate and the assembly and into the private world of the imperial palace, as power was increasingly concentrated in the figure of the emperor. The women of the Julio-Claudian family were central to this new dynastic politics; they had access to the emperor that senators could only dream of, and all the while they were raising a new generation of potential heirs to the imperial throne. As the opacity of the new court politics encouraged ever more frenzied speculation about the private lives and intrigues of its players, much of that speculation came to center on the women.

Messalina’s dramatic and sudden fall from grace had raised questions and, in leaving her memory and reputation unprotected, the process of damnatio memoriae allowed people to propose answers. Rumours of the empress’s political and sexual conduct — some of which may have been circulating during her life, some of which must have evolved after her death — could now be openly discussed, elaborated upon, and written about.

The result is an extraordinarily rich tangle of reality and myth. The sources are almost certainly right to accuse Messalina of orchestrating her enemies’ downfalls and deaths (no one could survive almost a decade at the top of the Julio-Claudian court without a little violence); their attribution of such plots to sexual jealousy and “feminine” passion rather than to political necessity is more suspect. Similarly, there is no reason to believe ancient writers totally unjustified in accusing Messalina of adultery; their claims that she slipped out of the palace nightly to work in a low-class brothel, or that she challenged the most notorious courtesan in Rome to a competition of who could sleep with more men in twenty-four hours (and won with a tally of twenty-five) are far more difficult to credit.

The unravelling of these stories is both the challenge and the joy of ancient history. The process is also revealing on two counts. The evaluation of these stories brings us closer to reconstructing the narrative of Messalina’s real life, her history, and her impact on her times. But even those tales that cannot be credited are of value. The stories and rumours that Rome constructed about its most powerful women, when given totally free rein, tell us a great deal about its contemporary culture and society — its anxieties, its prejudices, its assumptions, and its desires.

From "Shell Shock" to PTSD, Veterans Have a Long Walk to Health

"The 2000 Yard Stare", by Thomas Lea, 1944, WWII. The Army Art Collection, U.S. Army Center for Military History

Will Robinson, an American Iraq war veteran, languished for months with depression and post-traumatic stress disorder (PTSD) all alone at home in Louisiana. One day in March 2016, he watched the movie “Wild,” starring Reese Witherspoon as Cheryl Strayed. Strayed’s book of the same title told of her redemption from despair by hiking the wilderness of the 2,650-mile Pacific Crest Trail, which runs from Mexico to Canada. Robinson decided to follow Strayed’s example, packing up a tent and supplies a month later to duplicate her journey and, he hoped, its outcome.

He had nothing to lose. Forced into the army at the age of eighteen by a judge who promised to erase his conviction for petty theft if he served, he was deployed to South Korea in 2001 and Iraq in 2003. Six months in Iraq left him with injuries to his wrist, his knee and, more significantly, his mind. The army gave him a medical discharge for PTSD, but it offered little in the way of medical treatment. He attempted suicide with drugs the Veterans Administration issued him, surviving only because the pills made him vomit. Other vets of the war on terror were not so lucky; every day, an average of twenty-two take their lives rather than endure another moment of living hell. Robinson promised his mother he would not try again. Then she died, and he retreated into loneliness and depression.

It was during that dark time that Robinson saw “Wild” and took his first, literal, step towards recovery. He may not have known that he was following the advice of a British psychiatrist, Dr. Arthur J. Brock, who had prescribed similar solutions to soldiers traumatized in the First World War. The battles between 1914 and 1918 subjected young men to the unprecedented terrors of high explosive artillery shells, poison gas, flamethrowers, rapid machine-gun fire and claustrophobia in rat-infested trenches. Growing numbers of casualties carried to field hospitals had no physical wounds. At least, not wounds the doctors could see.

The soldiers suffered nervous breakdowns. They called their malady “shell shock,” a term introduced to the medical lexicon by psychologist Dr. Charles Samuel Myers after he visited the front in 1915. A high proportion of the victims were junior officers, who shared the troops’ fears but also led them in futile offensives against relentless enemy fire and felt a burden of guilt for their deaths. The military needed these officers, but the war had transformed them into paralysed, trembling, stuttering, blind or deaf wrecks unable to fight or to lead.

The British government was forced to open hospitals to aid them and, more importantly, make them fit to return to battle. Dr. Brock took up his post at Scotland’s Craiglockhart War Hospital for Officers when it opened in October 1916. His belief, based on his pre-war practice with mental breakdowns, was that “the essential thing for the patient to do is to help himself,” and the doctor’s only role “is to help him to help himself.” Brock blamed modern society as much as industrial warfare for severing people from the natural world and from one another, resulting in an epidemic of mental illness. His treatment for the soldiers was the same as it had been for civilians who broke down amid the struggle for survival in harsh economic times: reconnect to the world, especially the natural world. He encouraged his patients, including the poet Wilfred Owen, to explore the wild Pentland Hills near Craiglockhart. Many joined Brock’s Field Club to study nature and restore their pre-war relationship to it.

Symbolizing his method was a drawing on his consulting room wall. It depicted the mythological wrestling match between the hero Hercules and the giant Antaeus of Libya. Antaeus, son of the earth goddess Gaia, drew his strength from his mother earth as Samson did from his hair. As long as he was touching the ground, his strength was prodigious. Realizing this, Hercules lifted Antaeus into the air and broke his back. “Antaeus is civilisation,” Brock wrote, “and Hercules is the Machine, which is on the point of crushing it.” The war machine had crushed his patients’ minds. Some of them in fact had been hurled skywards and rendered unconscious by exploding shells. Brock urged them to find peace through nature.

Will Robinson made his connection to mother earth by trekking and sleeping rough on the Pacific Crest Trail, and later on other famous routes: the Tahoe Rim, the Arizona, the Ozark Highlands, the Continental Divide, and the Appalachian. He clocked over 11,000 miles, becoming the first African American man to do so. ESPN declared him “the trailblazing superstar of thru-hiking.” Not only did he come to understand and survive wild environments, he discovered something his life was lacking: community. “Thru-hiking has that community, and it’s why I love it so much,” Robinson told ESPN journalist Matt Gallagher. “People need to know they belong to something.”

Brock would have approved. Connecting to others, becoming part of a community, was as vital to mental health as relating to the earth. Robinson made friends on his treks and mentored younger hikers, including other veterans. He also met the woman who became his girlfriend, and they continue to wander America’s rural byways together. Robinson worked hard to traverse those miles and overcome his war demons. For other American vets, as for World War I’s shell-shocked warriors, there is no single cure for PTSD; what works for one will fail another. Robinson found his way, and it is up to the government that sent these veterans to war to find ways for the rest to resume their lives.

Modern American veterans have one advantage over Brock’s charges. The men whom Brock aided to health had to return to the trenches. Many broke down again or were buried in what war poet Rupert Brooke called “some corner of a foreign field/That is forever England.”

© Charles Glass 2023

Can the Left Take Back Identity Politics?

Members of the Combahee River Collective, 1974. Included are (back row, l-r) Margo Okazawa-Rey, Barbara Smith, Beverly Smith, Chirlane McCray, and Mercedes Tompkins;

(front row, l-r) Demita Frazier and Helen Stewart. 

The Combahee River Collective

“We were asserting that we exist, our concerns and our experiences matter,” said Black feminist activist Barbara Smith in an interview she gave almost four decades after the publication of the seminal Combahee River Collective Statement, credited as the first text in which the term “identity politics” is used. “We named that ‘identity politics’ because we said that it is legitimate to look at the elements of one’s own identity and to form a political analysis and practice out of it.”

The Combahee River Collective was a Black feminist lesbian socialist organization active in Boston from 1974 to 1980. The Collective got its name from a military expedition at the Combahee River in South Carolina planned and carried out by the abolitionist Harriet Tubman on June 2, 1863. The raid, which freed 750 slaves, was the first military campaign in American history led by a woman. When asked to describe her work with the Combahee Collective in Boston, Smith said, “I think it was really fated that I ended up there. In Boston there's something about the size and the scale of the city that made it more possible for those of us who were like-minded to find each other.”

But the Collective's impact extended far beyond the local activist scene, thanks to its widely circulated statement of principles. Written by Barbara Smith, her sister Beverly Smith, and Demita Frazier in 1977, the statement was published in 1979 in Zillah Eisenstein's anthology Capitalist Patriarchy and the Case for Socialist Feminism, and has since become one of the foundational texts of Black feminist thought:

Our politics initially sprang from the shared belief that Black women are inherently valuable. This focusing upon our own oppression is embodied in the concept of identity politics. We believe that the most profound and potentially most radical politics come directly out of our own identity ... In the case of Black women this is a particularly repugnant, dangerous, threatening, and therefore revolutionary concept because it is obvious from looking at all the political movements that have preceded us that anyone is more worthy of liberation than ourselves. We reject pedestals, queenhood, and walking ten paces behind. To be recognized as human, levelly human, is enough.

This was indeed a very different understanding of identity politics than the hollowed-out versions that dominate public debate today. First, it refused the idea of comparing and ranking oppressions, focusing instead on the particularity of each lived experience. “We actually believed that the way you come together is to recognize everyone fully for who they are,” Smith said, “as we work toward common goals of justice and liberation and freedom.” This opened the door to cooperation and coalition-building, including with those who don't resemble, or necessarily agree with, us. Second, it rejected single-issue politics by pointing to the “interlocking” nature of major systems of oppression. This was in fact the reason the Combahee statement was written in the first place: to point to the failure of the Civil Rights movement, Black nationalism and White feminism to sufficiently address the realities of Black lesbian women.

But the statement didn't prioritize the liberation of one group of people over any other, and proposed what was effectively a new model of social justice activism — foregrounding what would later be called “intersectionality.” Oppressions were multilayered and experienced simultaneously, and that required multi-issue strategies that reject a rights-only agenda. And third, the Combahee vision was unabashedly internationalist and anti-capitalist. The members of the Collective were actively involved in the anti-war movement, for they considered themselves to be, in the words of Barbara Smith, “third world women”: “We saw ourselves in solidarity and in struggle with all third world people around the globe.” Growing out of the organized Left, they defined themselves as socialists, and believed, as their statement put it, “that work must be organized for the collective benefit of those who do the work and create the products, and not for the profit of the bosses.”

Till Identity Do Us Part

But times have changed, and not for the better. A new type of identity politics was forged on university campuses, one that didn't fully grasp the connection between theory and practice, or engage with the bread-and-butter issues that affect all women. This narrow version “was used by people as a way of isolating themselves, and not working in coalition, and not being concerned about overarching systems of institutionalized oppression,” Barbara Smith said, expressing her discontent with the ways in which identity politics was reconfigured by the campus Left. “Trigger warnings and safe spaces and microaggressions — those are all real, but the thing is, that’s not what we were focused upon.” Like other groups of Black women who were organizing around Black feminism, Combahee was “community-activist based. Focusing on looking at real issues affecting all Black women, which includes poor Black women.”

Demita Frazier, another co-author of the Combahee statement, concurred. Part of the problem is “the commodification of everything,” including identity politics, which has been completely detached from its anti-capitalist origins. This was because of the way it was co-opted by academics, she added: “I wouldn’t say co-opted if it weren’t for the fact that there’s still this big divide between practice and theory, right? I mean, I’m glad that the children and the young’uns are getting educated, but it looks like a factory to me right now.”

This brief excursion into history, and the reflections of the veteran activists of the Combahee River Collective on the legacy of their statement, provide several insights into the problems that plague current understandings of identity politics. The radical identity politics of campus activists, Diversity, Equity and Inclusion trainers, and anti-racism gurus is everything that the identity politics of the Combahee River Collective is not. The new upgrade is profoundly narcissistic, and focuses on perceived individual harm at the expense of structural injustices; it establishes hierarchies of oppression by resuscitating the theological concept of “eternal sin,” which is then imputed to certain groups of people who are expected to devote a certain percentage of their daily lives to confessing and repenting (after all, no salvation without self-flagellation!); it interjects the term “intersectionality” here and there as a catchphrase, but treats identities as if they are fixed, insulated categories with no internal hierarchies or divisions; it disparages the idea of universal values or human rights, treating them as tools for domination invented by the powerful to maintain the status quo; it sees no allies, and it seeks no allies; it is thus “separatist,” in the sense in which Barbara Smith used the term. “Instead of working to challenge,” Smith said, “many separatists wash their hands of it and the system continues on its merry way.”

“This Bridge Called My Back”

For the Combahee women, identity politics was about politics, and identity was one way of doing politics and challenging hierarchies. For the campus Left, identity politics is about identity, and identity is beyond politics. It's a sacred value that needs to be preserved intact, at all costs. The questions of who defines a particular identity, or what causes harm, are left unanswered. In that sense, early critics of radical identity politics, Marxists and liberals alike, were right, but only partially. It's true that for the campus Left, “symbolic verbal politics” was the only form of politics that was possible. But today, even verbal politics is out of bounds. Terms are not discussed but dictated; truth, in an ironic twist, is no longer relative but absolute. Paradoxical as it may sound, the new identity politics is “anti-politics” — not only in the conventional sense of alienation from or distrust in mainstream politics but also in the broader sense of how we understand “the political,” as a space of contestation. The current obsession with privilege closes up that space, ruling out the possibility of dialogue and building alliances. In such a scheme, anyone who criticizes dominant progressive orthodoxies is branded as a “useful idiot,” advancing or unwittingly enabling a right-wing agenda. White progressives, Black conservatives, centrists, or bona fide liberals are considered to be more harmful to the cause of social justice than explicitly racist, modern-day Ku Klux Klanners. It may well be so. But what does this mean, politically speaking? Are we not supposed to reach out to fellow progressives or, indeed, regular people, and explain to them that in a society built on White values, colorblindness may not be the best way to achieve racial equality? And if we cannot even speak to the progressives, how are we going to convince the conservatives, reactionaries, or overt racists who still constitute a substantial part of any given society?

The Combahee women who coined the term identity politics knew the answer to these questions because they were doing political work and consciousness-raising in the real world, with women of all colors and walks of life, not peddling virtue in sterilized boardrooms or slick vodcasts. They were guided by the motto “This Bridge Called My Back” (later the title of a ground-breaking feminist anthology edited by Cherríe Moraga and Gloria E. Anzaldúa), which they saw as the key to success. “The only way that we can win — and before winning, the only way we can survive,” said Barbara Smith, “is by working with each other, and not seeing each other as enemies.”

Ayn Rand's Defense of an Anti-Union Massacre

Photo from records of LaFollette Committee, National Archives and Records Administration

In July 1943, former Hollywood screenwriter Ayn Rand was still tracking responses, critical and commercial, to her first major novel, The Fountainhead.  It had been published two months earlier by Bobbs-Merrill after being rejected by a dozen other companies.   Rand had written two previous novels, along with two stage plays, none of which proved successful.  Now The Fountainhead was off to a slow start with audiences and reviewers.

While this was transpiring, Rand received in the mail a set of galleys for the memoir (eventually titled Boot Straps) by Tom M. Girdler, chairman of Republic Steel, which operated several massive plants in the Midwest and Pennsylvania. Many Americans had probably already forgotten the most tragic incident that Girdler was associated with, almost exactly six years earlier.  If Rand was among them, her memory (and high estimate of Girdler) was surely revived in reading those galleys.  Soon she would model a key character in her most famous novel, Atlas Shrugged, partly on Girdler.

Near the end of May 1937, workers who had been on strike for several days at Republic Steel in Southeast Chicago called for a Memorial Day picnic, to build support, on the wide-open field several blocks from the plant entrance. Tom Girdler wouldn’t even recognize the union, famously vowing that he would retire and go back to growing apples before he’d do that. At least 1,500 workers and family members, including many women and children, turned out for the picnic. After the festivities, organizers called on the crowd to march to the gates of the plant where they might establish a mass, legal picket.

Halfway there, the marchers, at least 500 strong, were halted by a large contingent of Chicago police and ordered to disperse. A heated discussion ensued. A few rocks were thrown in the direction of the police. Suddenly, some of the police drew their pistols and opened fire on the protesters at point-blank range, and then kept firing as the marchers fled. They chased after the survivors, clubbing many of them.

Forty in the crowd were shot, with ten dead within two weeks. Dozens of the survivors were arrested and lifted into paddy wagons without medical attention.  Only a handful of police required treatment for minor injuries.  

Despite these one-sided results, local and national newspapers, right up to The New York Times and Washington Post, almost uniformly portrayed the marchers as a “mob” intent on rioting—that is, as the perpetrators of this tragedy.   Some falsely suggested that the unionists fired first. 

The only footage of the incident is quite graphic, showing the police shooting and then clubbing marchers; Paramount News, a leading newsreel company, suppressed it.

Then the Progressive Party senator from Wisconsin, Robert LaFollette, Jr., convened a sensational three-day hearing into the tragedy. The Paramount footage was screened in its entirety—and then in slow motion—providing more proof of police malfeasance. It emerged that Republic Steel had collaborated with police on this day, allowing them to set up headquarters inside their plant and supplying them with tear gas and axe handles to supplement their billy clubs.

When the LaFollette committee released its report (most of it, along with witness testimony, printed for the first time in my new book on the Massacre), it harshly criticized the police: “We conclude that the consequences of the Memorial Day encounter were clearly avoidable by the police. The action of the responsible authorities in setting the seal of their approval upon the  conduct of the police not only fails to place responsibility where responsibility properly belongs but will invite the repetition of similar incidents in the future.”

Ayn Rand clearly did not agree.  On July 12, 1943, she typed a five-page letter to Republic boss Girdler after reading his galleys.  “Allow me to express my deepest admiration for the way in which you have lived your life,” Rand wrote from New York City, “for your gallant fight of 1937, for the courage you displayed then and are displaying again now when you attempt a truly heroic deed—a defense of the industrialist….”  Then she offered to send him a copy of her novel.

“The basic falsehood which the world has accepted is the doctrine that altruism is the ultimate ideal,” she related.  “That is, service to others as a justification and the placing of others above self as a virtue.  Such an ideal is not merely impossible, it is immoral and vicious.  And there is no hope for the world until enough of us come to realize this.  Man’s first duty is not to others, but to himself…

“I have presented my whole thesis against altruism in The Fountainhead….Its hero is the kind of man you appear to be, if I can judge by your book, the kind of man who built America, the creator and uncompromising individualist.”

But Rand also admitted that “it shocked me to read you, a great industrialist, saying in self-justification that you are just as good as a social worker. You are not. You are much better. But you will never prove it until we have a new code of values.

“You had the courage to stand on your rights and your convictions in 1937, while others crawled, compromised, and submitted.  You were one of the few who made a stand.  You are doing it again now when you come out openly in defense of the industrialist.  So I think you are one of few men who will have the courage to understand and propagate the kind of moral code we need if the industrialists, and the rest of us, are to be saved.  A new and consistent code of individualism.” 

She concluded the letter “with deep appreciation for your achievement and that which you represent.”

Girdler replied on July 27, 1943, that he had just purchased The Fountainhead. A few months later, he met Rand in New York and told her that he had read and enjoyed the novel, which pleased her immensely, and he suggested they meet for lunch.

This apparently did not take place, but she would, a short time later, create one of the key characters in Atlas Shrugged, troubled steel industrialist Hank Rearden, based partly on Girdler.

Greg Mitchell’s new film Memorial Day Massacre: Workers Die, Film Buried, premiered on PBS stations in May and can now be watched by everyone via PBS.org and PBS apps. He has also written a companion book with the same title. He is the author of a dozen previous books.

The Power of Dependency in Women's Legal Petitions in Revolutionary America (Excerpt)

James Peale, "The Artist and His Family," 1795

Historians have spent decades investigating whether the American Revolution benefited women or provoked changes in women’s status. By and large, white women’s traditional political rights and legal status remained relatively stagnant in the wake of the American Revolution. In some ways, women’s legal status declined over the course of the long eighteenth century. Certain women’s private lives, however, did see some important shifts, especially with regard to family limitation and motherhood. Importantly, the Revolution politicized some women who participated in boycotts, contributed to and consumed Tory and Whig literature, and even acted as spies or soldiers themselves during the war. Women also carefully negotiated their political positions to manage the survival and safety of their families. In the postwar period, elite white women gained greater access to education, though ultimately in service of raising respectable republican sons and their worthy wives. In many ways, however, the lives of American women looked much the same in the postrevolutionary period as they had prior to the war. Despite Abigail Adams’s threat to “foment a rebellion” if women were not included formally in the new American body politic, there would be no great women’s revolution in the late eighteenth and early nineteenth centuries.

Asking whether the Revolution benefited women or brought meaningful changes in their social, legal, and economic statuses, however, cannot fully illuminate the war’s impact on women’s lives. In some ways, this framework is both anachronistic and problematic. Constructing our queries in this way asks too much from a historical period in which inequality and unfreedom were so deeply embedded in patriarchal law, culture, and society as to render such a sea change unlikely at best. Likewise, this line of inquiry presumes that revolutionary-era women collectively desired what first- and second-wave feminists sought for themselves. It also judges the consequences of the Revolution for women from a set of expectations codified as masculine. Certainly, there were a few noteworthy women who sought rights and freedoms for which liberal feminists of the nineteenth and twentieth centuries fought, but the Abigail Adamses, Mercy Otis Warrens, and Judith Sargent Murrays of the American revolutionary era were few and far between.

This long scholarly conversation about whether the American Revolution was centrally a moment of change, stagnation, or decline in women’s lives has framed many historical investigations from the wrong perspective. Ironically, we have been studying patriarchal oppression, resistance to it, and attempts to overcome it from a patriarchal standard all along. We must seek to understand the impact of the American Revolution on women’s lives by framing our inquiry around women’s own worldview, their own needs, aspirations, and desires, even when doing so is uncomfortable to our modern sensibilities. What function did the Revolution serve in women’s lives? How did women interpret the rhetoric of the Revolution? How did they make the disruption and upheaval of this historical moment work to their advantage, with the tools already at their disposal? How did they use the apparatus of patriarchal oppression—namely, assumptions of their subordination and powerlessness—to their advantage? What did they want for themselves in this period, and were they able to achieve it? When the impact of the Revolution is investigated with this shift in perspective, we are able to observe the ways in which women’s individual and collective consciousness changed, even if the Revolution was not radical enough to propel them from their unequal station in American society.

In Dependence asks these questions from a regionally comparative and chronologically wide-ranging perspective, focusing on three vibrant urban areas—Boston, Massachusetts; Philadelphia, Pennsylvania; and Charleston, South Carolina—between 1750 and 1820, or what I refer to broadly as the “revolutionary era.” These three cities serve as ideal locations for a study of early American women’s experiences as their laws, social customs, and cultures varied significantly. Boston, Philadelphia, and Charleston were three of the most populous cities in the American colonies and, later, the early republic, which provided inhabitants with access to burgeoning communities as well as the growing marketplaces of goods, printed materials, and ideas. Massachusetts’s, Pennsylvania’s, and South Carolina’s laws regarding marriage, divorce, and property ownership (and thus their demarcation of women’s rights and legal status) all differed a great deal during this period. I chose to focus my study on urban as opposed to rural areas so as to include in this work impoverished communities, whose members often turned for assistance to city almshouses and other local organizations. Women in each of these three cities had the opportunity to petition their state legislatures for redress, yet because of their varying experiences and racial and class identities, they did so for different reasons, with different access to seats of patriarchal power, and certainly with different outcomes.

The revolutionary era was a period in which ideas about the meanings of independence, freedom, and individual rights were undergoing dynamic changes. Dependence was a fact of life in colonial British America, defining relationships ranging from colonial subjects’ connections to the king to wives’ unions with their husbands. Both parties in these relationships had power—even dependents—and these relationships required a set of mutual obligations. Thus, dependence was not an inherently impotent status. The meaning of dependence shifted, however, with the adoption of the Declaration of Independence. Dependence ceased to be a construct with positive connotations in the American imagination, and likewise became imbued with a sense of powerlessness. The newly independent United States required the allegiance of its people, and adopted the concept of voluntary citizenship rather than involuntary subjectship. Accordingly, the law recognized women’s personhood and, to a certain degree, their citizenship, but it also presumed their dependence, which codified them as legally vulnerable and passive. Dependence, then, became highly gendered, and feminized. Women’s dependent status was likewise contingent on their socioeconomic status, their race, the legal jurisdiction in which they resided, and their relationship to men in power.

Importantly, dependence must not be viewed as the ultimate foil to independence. These terms are not strictly dichotomous to one another, but exist on a fluid spectrum. Situated on this continuum, women firmly asserted their dependence while expressing the “powers of the weak.” While a traditional understanding of “power” implies some form of domination of one party over another through possession, control, command, or authority, this conception obscures the meaning of the word itself while also negating the exercises and expressions of power that do not conform to these standards. If power is also understood as existing on a fluid spectrum, then, an analysis of women’s invocation of the language of dependence in their petitions to state legislatures, courts, local aid societies, and their communities becomes much different.

Notions of power and freedom in early America were contingent upon a person’s intersectional identities. Wealthy, white male enslavers, for example, had different understandings and experiences of freedom than did the Black women they enslaved, and because of the legal structure of the patriarchal state, these white male enslavers held a great deal of power over unfree, enslaved Black women. Like dependence and independence, freedom and unfreedom existed on different ends of the same spectrum. Race, gender, class, religion, region, status of apprenticeship, servitude, or enslavement, and other elements of an early American’s identity shaped their relationship to freedom and unfreedom. Notably, this continuum was deeply hierarchical. Even if enslaved women earned or purchased their legal freedom from the institution of slavery, that free status was still tenuous, as was the free status of any children they bore. Likewise, enslaved women would have viewed freedom differently than their white counterparts. Black women in particular often defined freedom as self-ownership, the ability to own property, to profess their faith freely, and to ensure freedom for their families. Freedom for many enslaved people was a matter of degrees, a game of inches, a process of constant negotiation for small margins of autonomy and independence in an otherwise deeply oppressive system. Even if they obtained documentation that declared them legally free from the institution of slavery, that did not guarantee their perpetual freedom, and it certainly did not grant them equality under the law; that freedom—even if it existed on paper—was tenuous. Additionally, American freedom did not evolve and expand in a teleological manner; in many cases, even in the revolutionary era, freedoms devolved and disappeared for certain marginalized groups of Americans. We must always consider the ways in which Americans’ experiences of their freedoms were not (and in many ways, still are not) equal.

Black women experienced multiple, layered dependencies that were compounded by their race and gender, and especially by the existence of the race-based system of chattel slavery that relied on Black women’s reproductive capacity to enhance the power of white patriarchs. Black women, therefore, were not endowed with the same legal protections, rights, and privileges as their white contemporaries were. Engaging with the sympathies of white patriarchs, for example, was not a functional or effective strategy for Black women, as it was for white women. In order to fully understand how Black women exploited the terms of their intersectional dependencies, then, we must examine the unique experiences of Black women from within these interlocking systems of oppression. The notion that women could—and can still—express power because of their subordinate status and the protection it offers indicates that women have never been completely powerless. Like other historically marginalized groups or individuals, women have been able to express a degree of power, autonomy, and agency over their own lives while still being overtly suppressed by a controlling authority. Thus, dependents expressed power in a variety of ways, including more subtle means such as claiming a public voice or becoming politically active via the submission of petitions. What is especially significant, however, is not that women found power through petitioning various authorities but that they found power in this way through public declarations of their dependent, unequal, and subordinate status.

This excerpt from In Dependence: Women and the Patriarchal State in Revolutionary America is published by permission of NYU Press. 

Comparing the Trump – DeSantis Race to the Republicans' 1912 Debacle is a Stretch... Right?

Leonard Raven-Hill, "For Auld Lang Syne" from Punch, 1912

And they’re off. With just a year and a half to go, Ron DeSantis has finally thrown his hat into the ring. Now the race for the GOP nomination truly begins. Nikki Haley, Asa Hutchinson, and a variety of other runners and riders are there but, for most people, this is a two-horse race between Donald Trump and DeSantis. This potential head-to-head has been a long time coming, and some think that DeSantis has left it too late. DeSantis was well ahead of Trump in GOP opinion polls shortly after the 2022 midterms, but now Trump has a commanding lead. However, we shouldn’t forget that a lot can change between now and November 2024.

Let’s go back a little, to see how polling looked for Trump in 2011 and 2015 (around a year and a half out from the presidential elections of 2012 and 2016).

In April 2011, Trump led a poll of GOP primary voters with 26 percent, more than ten points ahead of eventual nominee Mitt Romney. Much of this “Trump bump” was linked to his high-profile “birther” campaign demanding to see President Obama’s birth certificate. However, once Obama produced the document, Trump’s numbers swiftly dissipated, and he decided not to run. Conversely, in June 2015, Trump polled just 1 percent, leaving him in eleventh place out of 16 candidates in a survey of GOP primary voters. Of course, Trump looks to be in a much stronger position at the end of May 2023, with polls of primary voters putting him at over 50 percent and DeSantis trailing with around half of that. But remember, Trump won the nomination in 2016 despite polling at only 1 percent at the same stage of the nomination campaign, and four years earlier his numbers had collapsed even after he gathered a significant lead.

So, let’s imagine, just for argument’s sake, that DeSantis stays the course. We get a broad slate of candidates (as we did in 2015-2016), and Trump isn’t the outright winner come the Republican National Convention. Let’s stretch our imaginations even further to see the GOP Convention tightly contested so that, in the end, DeSantis gets the nomination by the narrowest of margins. Trump, spurned, storms out and decides to run independently under the banner of the “Truth Party.” Come November, Trump picks up a number of states he won in 2020, and DeSantis takes Florida and a handful of flyover states for the GOP. Meanwhile, Biden wins by a mile, as the divided Republican vote lands him easy wins in Pennsylvania, Georgia and Ohio, and he even sneaks by in Texas. It’s an outlandish scenario, but for those of you with a long memory, it’s not quite the frantic fever-dream of a teacher overcome by too much grading that it might seem. I take you back to 1912….

In February 1912 – election year – Theodore Roosevelt (Republican president from 1901-1909) formally challenged the incumbent Republican President William Howard Taft for the GOP nomination. In Roosevelt’s mind, he had made Taft’s career; TR had appointed Taft to his cabinet while president and handpicked Taft as his successor. Roosevelt campaigned for Taft in 1908, had photographs taken with him, went the whole nine yards. In many ways TR felt he won the election for Taft. Yet, while in office, Taft disappointed Roosevelt. Always a larger-than-life personality, TR was never really going to favor a quiet retirement.

The 1912 Republican nomination campaign turned nasty. Roosevelt launched several stinging attacks on Taft. Taft, a TR true-believer and former friend, was wounded and was slow to resort to the same sort of name-calling as Roosevelt. The stage was set for a close race, and the result went down to the wire. That June, though, Taft wrested the nomination from Roosevelt at the convention. Roosevelt cried foul play, a corrupt stitch-up! He stormed out of the convention and weeks later ran a third-party “Bull Moose” campaign under the banner of the Progressive Party.

The 1912 election became a three-horse race – though special mention should go to five-time presidential candidate for the Socialist Party, Eugene Debs, who received over 900,000 votes (from a total vote of just over 15 million). Roosevelt won 88 Electoral College votes, including large states like Pennsylvania, while Taft got the measly 8 votes of Utah and Vermont. Between them, the erstwhile allies got over 50 percent of the popular vote, but the Electoral College saw Democrat Woodrow Wilson win in a landslide, with 435 votes out of 531.

As ever with these historical parallels, there are innumerable other variables that don’t mirror the past anywhere near as well. However, this comparison aims not so much to suggest that 2024 might be a full repeat of 1912 as to offer a glimpse into the danger that a full split could cause for the GOP if DeSantis and Trump really did divide the vote come November 2024.

Trump and DeSantis started as allies. Many, including Trump, felt that Trump’s backing won the Florida gubernatorial race for DeSantis in 2018. DeSantis appeared to be a Trump true-believer. Trump did not want to retire quietly, and he doesn’t seem to like how DeSantis has “betrayed” him. They are now running against each other for the nomination, and Trump has been criticising “Ron DeSanctimonious” for months, while DeSantis has remained largely passive in his response. There is an echo of the past here for sure, even if it’s a faint one thus far.

However, if things were to go the distance, and the Convention looked like it might be decisive… if the remaining court cases against Trump were to go against him, and the Republican Party threw its weight behind DeSantis to narrowly deny Trump the nomination… then it does not seem quite so far-fetched that Trump could run as an independent. Maybe, just maybe, 2024 might see more echoes of 1912 when it arrives. If so, then President Biden will no doubt be happily ordering copies of James Chace’s 1912 or Lewis Gould’s Four Hats in the Ring, and merrily assessing his chances in a three-horse race come next November.

A Trip Through the Mind of Vlad the Conqueror: A Satire Blending Imaginary Thoughts with Historical Facts

Striding masterfully through St. George’s Hall of the Grand Kremlin Palace, Vlad the Conqueror pondered his role as a Man of Destiny.

“It’s not easy to measure up to the past leaders of Russia,” he brooded.  “Ivan the Terrible and Peter the Great slaughtered enormous numbers of people at home and abroad in building the largest nation on earth.”  Stalin, too, he noted, “showed the world what could be accomplished by a strong man with an unrelenting will to power.”  After all, Stalin “succeeded in murdering millions of Ukrainians through starvation, gobbling up portions of Eastern Europe through an alliance with Nazi Germany, smashing Hitler’s legions after the führer broke with him, and pushing Russian domination right into Central Europe during the early Cold War.  Now those were real men!”

Frowning, he added: “Of course, the Russian empire went downhill after that.  Stalin’s namby-pamby successors fumbled along, trying to hold it together through invasions of Hungary, Czechoslovakia, and Afghanistan.  And then, Gorbachev”―Vlad spat out the name―“that traitor, he wanted Russia to behave like a normal nation.  But it’s not a normal nation,” Vlad told himself heatedly.  “It’s a great nation.  And great nations require great leaders!”  Pausing briefly, he stopped to regard himself, fondly, in a diamond-encrusted mirror.

“And look at what I’ve already accomplished in restoring our nation’s grandeur―not only rebuilding our military forces and arming them with new nuclear weapons, but using Russian military power to obliterate Chechnya, secure footholds in Georgia and Moldova, annihilate resistance to Assad’s dictatorship in Syria, and launch Russian mercenary forces throughout Africa.”

He stopped and smiled.  “But the highpoint so far is surely my invasion of Ukraine.  I’ve leveled its cities, massacred many thousands of Ukrainians, sent millions more fleeing as refugees, and annexed its land into Russia.  As my long-time friend, Donald Trump, remarked of the invasion, ‘this is genius’!”  Pausing before another mirror, he again admired his profile.

“Alas,” he conceded, “not everyone recognizes greatness when they see it.  In the wake of my glorious invasion of Ukraine, 141 nations at the UN General Assembly voted to condemn it, though four wise nations did give it their support:  North Korea, Syria, Belarus, and Eritrea.  At home, too, many thousands of Russian subversives―betrayers of their Motherland (and of me!)―demonstrated and signed petitions against the war.  Fortunately, we’ve already arrested about 20,000 of them.  Also, perhaps a million Russians, losers all, fled to other lands.”  He groaned wearily.  “Well, they won’t be missed!”

“Furthermore, abroad, where I’m gratified to learn that I have many fans among rightwing and leftwing zealots, public opinion has turned against me.”  Vlad scratched his head in dismay.  “Even those segments of the Western peace movement that back my policies don’t seem to ‘get it.’  One busy bee who writes and speaks incessantly about the war in Ukraine almost completely ignores my role in it.  Instead, she chalks up the conflict to the policies of the United States and NATO.  Don’t I get any credit for anything?!”  He shook his head sadly.

“And then, of course, there are the damned Ukrainians who, instead of welcoming our invasion, destruction, and occupation of their country, are resisting!  This is surely another sign that they are unfit to govern themselves.”  He concluded, morosely:  “What a mess!”

“Yes, life is unfair to me,” Vlad sighed, as warm tears suddenly appeared and rolled lazily down his cheeks.  “And it has been for some time.”

He ruminated: “Things are not so easy when you’re a little guy―only 5 feet, 6 inches tall―in a big man’s world.  Peter the Great, a hero of mine, measured 6 feet, 8 inches.  So he certainly had an advantage there!  Also, on top of that, my puberty came late. To keep from being bullied by the other boys, I took up judo and, at the age of 19, became a black belt.  Then,” he laughed, “I joined the KGB, and people soon learned not to mess with me or with my new circle of friends.” 

“Naturally, as I moved up the Russian government hierarchy, I became known for my tough, masculine style and approach―riding bare-chested, muscles rippling, on horseback, imprisoning uppity women, and making even the mention of homosexuality punishable by imprisonment.  And I saw to it that my political opponents were packed off to prison camps―at least when they didn’t develop the nasty habit of getting poisoned or falling out of windows.”  Pounding his fist on a table inlaid with gold and ivory, Vlad chortled at his wit.

“Some say that I’m a cold person.  Actually, though, I can be warm and accommodating when it’s useful in forging friendly relationships with other great leaders―men of power like Xi Jinping, Donald Trump, Kim Jong Un, and Saudi Prince Mohammed bin Salman.  In 2018, when bin Salman was being snubbed by other government leaders at the G20 summit for ordering the dismemberment of Jamal Khashoggi, a dangerous journalist, I went right over to the prince and we exchanged joyful high-fives.  We’ve been great pals ever since.”

Smiling, Vlad remarked: “None of them, of course, has my sophisticated grasp of international relations, and they will ultimately recognize my superior wisdom as my mastery of world affairs and my power grow ever greater.  Even now, though, they are turning to me for leadership.”  Spotting another mirror, he gazed lovingly at his splendor.

Standing tall and throwing back his shoulders, he proclaimed:  “Yes, I’m no longer Little Vlad.  I’m the supreme commander of the biggest country on earth.  And, under my rule, it’s growing even bigger.  Today I am Vlad the Conqueror!  Look on my works, ye mighty, and despair!”

Then, glancing about the vast, ornate hall, he muttered: “Now where the hell is my Viagra?  Where did I put it?”

Trump Poised to Join Short List of 3-Time Presidential Nominees

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency (Rowman & Littlefield Publishers, 2015, Paperback Edition 2017).

As the presidential campaign of 2024 becomes the center of public attention, former president Donald Trump seems far ahead in the battle for the Republican presidential nomination; if he does win, Trump will join a select group of presidential nominees who have been on the ballot three or more times.

All by himself as the only four-time candidate is Franklin D. Roosevelt, the Democratic Party’s nominee in 1932, 1936, 1940, and 1944.  After World War II, the movement for a constitutional amendment limiting presidents to two elected terms (or a total of ten years if succeeding to the office) culminated in the 22nd Amendment, ratified in 1951, whose limits first applied to the presidency of Dwight D. Eisenhower. FDR also had the unique distinction of being on the presidential ballot as the vice presidential running mate in the failed campaign of Democrat James Cox in 1920.

Two three-time presidential nominees failed to be elected despite multiple attempts.  Henry Clay was on the ballot in the presidential elections of 1824, 1832, and 1844, and was a contender for the nomination in 1840 and 1848. The winning Whig candidates in those years (William Henry Harrison and Zachary Taylor) died early in their terms of office. William Jennings Bryan was the nominee of the Democratic Party in 1896, 1900, and 1908, and was bandied about as a possible nominee in 1912 before he threw his support to Woodrow Wilson, who went on to win two terms in the White House.

Thomas Jefferson and Andrew Jackson lost their first bids for the presidency in 1796 and 1824, respectively. In times of constitutional crisis and division, they were defeated by the father and son John Adams and John Quincy Adams. Jefferson and Jackson would defeat their Adams nemeses in the next elections, and each served two terms in the presidency.

Martin Van Buren, Jackson’s second-term vice president, would be the last sitting vice president to win the presidency by election (1836) until 1988, when George H. W. Bush succeeded Ronald Reagan. But Van Buren lost his reelection bid in 1840, and then ran as the candidate of the Free Soil Party in 1848, winning ten percent of the national popular vote.  If one counts his being on the ballot with Jackson in 1832, Van Buren was on the ballot more often than anyone except FDR.

Grover Cleveland was on the ballot three times, winning the popular vote all three times (1884, 1888, 1892) but losing the Electoral College in 1888 to Benjamin Harrison (whom he then defeated in 1892).  If Donald Trump ends up as the Republican nominee against Joe Biden, it would be the first rematch between a sitting president and the former president he defeated since 1892.

The final example of a three-time nominee was Richard Nixon, who lost to John F. Kennedy in 1960 but came back as the successful Republican nominee in 1968 and 1972.  He joined only Thomas Jefferson and Andrew Jackson as candidates who had lost and then come back to win two terms as president.

So Donald Trump might join a short list of three-time nominees, but his situation is also unique: he is the only president to have lost the national popular vote twice (although he was elected president in 2016).  If nominated, he would join only Thomas Jefferson (who ran before the popular vote was a factor in elections) and the three-time losers Henry Clay and William Jennings Bryan in being on the ballot three times without ever having won the popular vote.

The Roundup Top Ten for June 2, 2023

Determined to Remember: Harriet Jacobs and Slavery's Descendants

by Koritha Mitchell

Public history sites have the potential to spark intellectual engagement when they make embodied connections between people and the sites they visit—even when those connections evoke the cruelty of the past.

Commemoration of the Tulsa Massacre Has Put Symbolism Over Justice for the Victims

by Victor Luckerson

"The neighborhood’s historical fame has become a kind of albatross slung over Black Tulsans’ necks, as efforts at building concrete pathways toward justice are buried under hollow symbolism."

Dodgers' Controversial Invite to "Drag Nuns" Group Highlights Catholics' Selective Sense of Faith

by Kaya Oakes

Catholic groups expressing outrage at the team's recognition of the Sisters of Perpetual Indulgence overlook the centrality of mercy in the Gospels. 

Will the Debt Ceiling Deal Derail Environmental Justice?

by Robert Bullard and Larry Shapiro

The idea of permitting reform—easing the environmental constraints on building new energy infrastructure—has been a bargaining chip in the debt ceiling negotiations. Reforms could help bring a green energy grid online, but they could also put more polluting industry in poor and minority communities.

The Right to Dress However You Want

by Kate Redburn

New anti-transgender laws should prompt a legal response, but they also require a fundamental recognition: laws prescribing gendered dress codes infringe on everyone's freedom of expression. 

When Witches Took Flight

by Chris Halsted

The modern association of witches with flight in fact emerged from a relatively obscure corner of medieval church writings that gained prominence amid contemporary anxieties about women's political influence.

Why is the American Right so Thirsty for Generalissimo Franco?

by David Austin Walsh

Increasingly "respectable" conservative intellectuals are openly advocating for a dictator to enforce cultural traditionalism as part of a battle to control the politics of elite institutions.

Amid Anti-Woke Panic, Interdisciplinary Programs Inherently Vulnerable

by Timothy Messer-Kruse

Because standards of academic freedom like those of the AAUP tie that freedom to expertise within recognized professional communities of scholars, those doing interdisciplinary work and working in programs like ethnic studies have less institutional protection against political attacks. 

Can We Solve the Civics Education Crisis?

by Glenn C. Altschuler and David Wippman

Universal schooling created the potential for a unifying civic curriculum that, paradoxically, has been the subject of perpetual disagreement regarding its contents. A recent bipartisan roadmap for civics education that makes those disagreements central to the subject matter may be the only way to move forward. 

WGA Strike Latest Example of Cultural Workers Joining Together as Entertainment Technology Changes

by Vaughn Joy

The development of television and online content has historically forced multiple Hollywood unions to join forces to secure a share of the returns of new technology or risk being frozen out entirely.

The Modern Relics in Crow's Cabinet of Curiosities

Senator Sheldon Whitehouse (D-RI) points to a painting commissioned by Harlan Crow depicting a meeting at Crow's vacation retreat including Federalist Society head Leonard Leo (2nd from left), Justice Clarence Thomas (2nd from right) and Crow (far right)

Who is Harlan Crow? As questions mount about Supreme Court Justice Clarence Thomas’s alleged failure to disclose significant gifts (and attendant concerns about his integrity multiply), his principal benefactor has achieved a certain, curious fame. Until recently Harlan Crow, despite his enormous wealth and influence, remained a relatively obscure Dallas billionaire. Now, many want to know why he has lavished so many gifts on Justice Thomas, including a Bible once owned by the great abolitionist Frederick Douglass, an invaluable piece of Americana and an American relic.

For me, and for many others, the most fascinating aspect of Crow’s new celebrity is his controversial penchant for collecting rare—and sometimes disturbing—historical objects. These include things we might call “atrocious relics.” In my recent book, American Relics and the Politics of Public Memory, I wrestle with such matters. Why do we collect relic-like things? What do they mean? What do they “do” or “say”—to those who possess them and to those who view them? Relics can be whimsical, glorious, or sober, but they are also volatile and sometimes alarming and offensive.


What is a “relic”?

A relic is commonly defined as a material object held in reverence by believers because it is linked to a holy person. In medieval Christendom, relics—blood and bones of saints, pieces of the “true cross,” and other sacred traces—gave power to their possessors and access to the divine. Their presence elevated and sanctified churches and communities, helped mold worshippers’ identities, and fixed them in a larger Christian world.

In our more secular modern world, relics endure and perform some of the same functions. Prized vestiges of former times, souvenirs or mementos connect us directly to the past. They do not merely illustrate it; they physically embody it, its glory and triumph, sometimes its tragedy or even horror. Relics are the past, persisting in our present.

Important public relics seemingly possess an ability to speak firsthand, to communicate authentically, wordlessly, emotionally, compellingly. They are both the argument and the evidence, veritable “smoking guns.” Sometimes they look ordinary. Who cares about some old, unremarkable fountain pen, until we learn that Lincoln used it to inscribe the Emancipation Proclamation in 1863? What’s the big deal with some old, tattered book, until it’s revealed as the Bible once owned (before Crow and Thomas) by Frederick Douglass? Through such things, we are uncannily linked to “history.”

Crow’s Nest

Harlan Crow has accumulated lots of such stuff at his Highland Park estate—astonishing stuff—including (randomly) a letter written by Christopher Columbus, a silver tankard crafted by Paul Revere, the deed to George Washington’s Mount Vernon, Dwight D. Eisenhower’s helmet, adorned with five stars, a cannonball from the Battle of Gettysburg, and much, much more.

But mingled among these American treasures are linens, medallions, and other Nazi artifacts and memorabilia, as well as an autographed copy of Hitler’s hateful tome Mein Kampf and two of his paintings, landscapes distinctive because of their artist, not their artistry. The manor’s grounds include a sculpture park arrayed with statues of notorious Communist leaders, a so-called “garden of evil” populated by Marx, Lenin, Stalin, Tito, Castro, Ceausescu, and other villains perhaps more obscure but nonetheless malignant, such as Gavrilo Princip, the assassin of Archduke Franz Ferdinand who precipitated World War I.

Why would Harlan Crow harbor such things? Of course, they are rare and valuable commodities, which might command a considerable price if sold, and which conspicuously display the inestimable fortune of their possessor. They are the prizes of Crow’s wealth. But his collection is not merely an investment, uncurated, or randomly compiled. These things hold meaning beyond their financial valuation, and they help define the man who owns them. If Crow tells stories through them, they tell stories about him.

Maybe Crow’s despots in bronze and stone function like big game trophies, displaying dominance over one’s quarry or foes. Or maybe they are a snarky, conservative troll to antagonize liberal critics, representing Crow’s supremacy over his opponents. They allow him, literally, to crow. Defenders argue that such displays are benignly didactic, marking the triumph of good over evil and reminding us of what to hate. In fact, new sorts of institutions—memorial museums—emerged after the Second World War that were designed to confront evil, to teach, memorialize, and heal in the wake of cataclysms, the Holocaust most prominently. But these institutions commemorate victims, not perpetrators like those assembled by Crow. Despite the rationales, Crow’s garden of evil does not teach or heal. It pays implicit homage to the evildoers and their power, deadening viewers to the full measure of their horrific ideas and acts.

It’s not really possible to renovate disgraced public monuments, unlike structures or institutions saddled with an unfortunate name, which can be changed and repurposed. Fort Benning recently became Fort Moore; Fort Bragg, Fort Liberty; Fort Hood, Fort Cavazos. But a statue of Robert E. Lee or Josef Stalin is inescapably a statue of Lee or Stalin. Neither can be rehabilitated by unilaterally rechristening them Martin Luther King or Lech Walesa. Crow doesn’t try and likely doesn’t care.

Crow’s unnerving monuments and memorabilia connect us to a reprehensible past, revivifying that which is sinister and frightening and, even for Crow perhaps, sordid and shameful. As one visiting reporter noted, the Nazi artifacts are placed in cabinets, “out of the view of visitors,” controlling their ability to “say” indiscreet things. Such materials evoke the lynching postcards and other grisly souvenirs once prized by white supremacists, kept privately as racist talismans. Broader public scrutiny transformed them into appalling objects, atrocious relics. Recent revelations thus pose some uncomfortable questions. Has Crow collected Thomas? And what do his relics say about him, and about us?

What We Can Learn From—and Through—Historical Fiction

Novelist Anna Maria Porter, engraving from The Ladies' Pocket Magazine (1824)

Image from the New York Public Library Digital Collections (digital ID 1556053).

I have been a local historian for many years, but turned to historical fiction to tell a specific story for which there were no sources. There was a sense of going over to the “dark side” in doing so, yet at the same time I was able to illuminate things that do not appear in the historical record.  I suspect that there could be a lively debate online about what good historical fiction can accomplish—and also about the misuse of history by those who write it.

As a local historian I tried to be true to the sources I found, to be trusted by readers. In the case of the dozen women who crossed the country in 1842, members of the first overland company to set out for the Pacific Northwest, I could find little. With no verifiable facts, but knowledge that women were present, I turned to fiction to put women in the picture and wrote Lamentations: A Novel of Women Walking West (Bison Books, an imprint of the University of Nebraska Press, 2021). To someone like Gore Vidal, that made perfect sense; he thought history should not be left to the historians, “most of whom are too narrow, unworldly, and unlettered to grasp the mind and motive” of historical figures. E. L. Doctorow would agree, but more agreeably, writing that “the historian will tell you what happened,” while the novelist will explain what it felt like. The historian works with the verifiable facts—fiction is a step beyond.

Historical fiction is generally dated to Sir Walter Scott, beginning with Waverley in 1814. It turns out, however, that Scott was not the first historical novelist. Devoney Looser has just published Sister Novelists (Bloomsbury Press, 2022) about Maria (1778-1832) and Jane (1775-1850) Porter, sisters who, driven by poverty, wrote popular historical novels beginning in the 1790s. A Wall Street Journal reviewer in 2022 noted that “Maria was a workhorse, Jane a perfectionist. Between them they wrote 26 books and pioneered the historical novel.”

There have been only a few academic treatments of historical fiction. Ernest Leisy issued The American Historical Novel in 1950 and George Dekker wrote American Historical Romance in 1987; both were interested in chronological periods, but neither man created, or exhibited, much enthusiasm for the genre. Yet in 1911 James Harvey Robinson, in an essay titled “The New History” published in the Proceedings of the American Philosophical Society, observed that historians need to be engaging, even while “it is hard to compete with fiction writers.” He stated:

History is not infrequently still defined as a record of past events and the public still expects from the historian a story of the past. But the conscientious historian has come to realize that he cannot aspire to be a good story teller for the simple reason that if he tells no more than he has good reasons for believing to be true his story is usually very fragmentary and uncertain. Fiction and drama are perfectly free to conceive and adjust detail so as to meet the demands of art, but the historian should always be conscious of the rigid limitations placed upon him. If he confines himself to an honest and critical statement of a series of events as described in his sources it is usually too deficient in vivid authentic detail to make a presentable story.

The historian Daniel Aaron took the genre of historical fiction seriously in a 1992 American Heritage essay in which he castigated Gore Vidal. Aaron, however, conceded that “good writers, write the kind of history [that] good historians can’t or don’t write.”

Aaron quotes Henry James, who thought of historians as coal miners working in the dark, on hands and knees, wanting more and more documents, whereas a storyteller needed only to be quickened by a letter or event to see a way to share it with readers or use it to illuminate a point about the historical past. He recognized that genres of reading had changed. In the 19th century we read historical tomes, mostly about the classical world or of British and European war and political alignments, but in the last quarter of the 20th century “so-called scientific historians left a void that biographers and writers of fictional history quickly filled.” Aaron cites inventive novelists who have perverted history for a variety of reasons, using Gore Vidal as his prime example. Vidal thought of historians as squirrels, collecting facts to advance their careers. But Vidal does not get the last word.

Professor Aaron recognized that historical fiction had moved from a limited earlier model focused on well-known individuals to serious re-tellers of history who have “taken pains to check their facts and who possess a historical sensibility and the power to reconstruct and inhabit a space in time past.” What a lovely description of some of the best of our contemporary historical fiction.

But what of putting women into the past where they often do not appear? Addressing this issue, Dame Hilary Mantel noted in her 2013 London Review of Books essay “Royal Bodies” that

If you want to write about women in history, you have to distort history to do it, or substitute fantasy for facts; you have to pretend that individual women were more important than they were or that we know more about them than we do.

Despite my great admiration for Dame Hilary, I think we can deal with the issue of women in the past by honoring their lives in a way that does not turn them into twenty-first-century heroines, but presents them as women who found themselves in situations they might not have wished, did what they needed to do, thought about their circumstances, and dealt with what they found they had landed in. They, like us, were grounded in their own time, deserve credit for surviving, and should be appreciated for their observations of the life around them.

We should respect the historians’ knowledge of time and place and the novelists’ intuition, which is sometimes spot-on. An example: in trying to explore the moment when the buttoned-down eastern women of 1842 encountered a band of Lakota, then identified as Sioux, I wondered what the women might have thought of those bronzed warriors whose clothing left much of their chests and shoulders bare. When I read the paragraph I had written to an elderly friend, she went to her desk and pulled out a letter from an ancestor who had crossed Nebraska, walked over South Pass, and gone on into Oregon. And that ancestor, in the 1850s, had said exactly what I had imagined. Sometimes the imagined past is as we conceive it to be because we have grasped the knowledge of time and place on which to animate believable players.

My desire in Lamentations was to hear what the women were thinking, and sometimes saying to each other, but within the context of that century, when much that was unorthodox could not be said aloud. I wanted to show how a group of people traveling together would get to know each other, rather as students in a class know that one was from Ohio and another played hockey. We do not know others fully, but from the vantages we are given. I wanted to display how the women gained information, and then passed it along; how tragedies were dealt with; how personalities differed; and how, in the end, Jane matured. I wanted to bring women of different generations together, to show discord among sisters, to think about what was important when dismantling a home, and how women fit into the daily account of miles and weather and occasional events kept by the company clerk. I wanted to explore what it was like to answer a longing for new beginnings, for a journey when one is the first to make it. I am interested in names and what they mean, in the landscape and how one travels through it. I wanted to hear the women speak when the records do not.

Historians need to be conscious of the audience we hope to have, and perhaps can learn something about style and sense of place from the writers of historical fiction. Academic and local history can be told vividly; good history can have good narrative, and some historical fiction tells a story that a historian cannot. I have written this to praise historical fiction when it respects the line between our times and the past, when it adheres to the known truth and does not pervert it for excitement—or for book sales. I appreciate Daniel Aaron, who thought historical fiction was worth taking seriously, and all those writers who have brought the past alive in this form.

Fiction is not the only way to explore the past, but historical fiction can attract readers to wonder and speculate and then explore the past in other forms. A friend said that as a child, reading fiction of other times led her to read history and then become a historian. Aaron wrote that historical fiction gives “us something more than the historical argument.” It can bring alive an era, a person, a moment in time so that we meet the past as it was, not as we might want it to have been.


White House Speechwriter Cody Keenan on the Crucial 10 Days of the Obama Presidency

Cody Keenan (Photo by Melanie Dunea)

Other than being able to string a sentence together, empathy is the most important quality in a speechwriter. The ability or at least the attempt to understand your audience, to walk in their shoes for a little while, even if empathy will never be a perfect match for experience.—Cody Keenan, Grace

Ten days in June 2015 were some of the most intense during the presidency of Barack Obama. The president was awaiting US Supreme Court decisions on the fate of the Affordable Care Act and marriage equality. And, on June 17, a hate-fueled white supremacist shot to death nine African American worshippers at a historic church in Charleston, South Carolina.

Chief White House speechwriter Cody Keenan focuses on this extraordinary period in his revelatory and lively new book Grace: President Obama and Ten Days in the Battle for America (Mariner Books).

In response to this perfect storm of historic events, Mr. Keenan drafted memorable speeches and a heartfelt and now immortal eulogy for Reverend Clementa Pinckney and other victims of the Charleston violence. And that address moved beyond a eulogy with the president’s powerful plea for unity and reconciliation and his surprising segue as he led the congregation and the nation in singing “Amazing Grace.”

In Grace, Mr. Keenan recounts highlights of his career as a speechwriter as he describes the tumultuous ten days. The reader immediately senses the demands of working for a president who had been president of the Harvard Law Review and is among the most celebrated writers and orators of recent history. As Mr. Keenan puts it, “To be a speechwriter for Barack Obama is f---ing terrifying.” Mr. Keenan worked “to his limits” in his high-pressure position to provide President Obama with the best drafts possible. And it’s obvious from Grace that the two men were gifted collaborators who worked together with great mutual respect and admiration.

As he provides a behind-the-scenes perspective on White House operations, Mr. Keenan introduces key presidential aides such as Valerie Jarrett, Jen Psaki, Ben Rhodes, Jon Favreau and his speechwriting team. He also intersperses the book with the story of his romance with esteemed presidential fact-checker Kristen Bartoloni, who often challenged and corrected his writing. They married at the White House in 2016.

By 2015, President Obama had delivered more than a dozen eulogies for the victims of gun violence, including for those who died in the Arizona massacre in which Representative Gabby Giffords was seriously wounded and in the horrific murders of 20 children and six adults in Sandy Hook, Connecticut. Mr. Keenan wrote those eulogies as well as the president’s now famous speech honoring the fiftieth anniversary of the 1965 Selma voting rights march and the peaceful protesters, including civil rights icon Representative John Lewis, who endured a bloody attack by police.

Mr. Keenan writes powerfully of the pain and sorrow that he and the president experienced in addressing yet another mass shooting in June 2015, that time with the added dimension of racist violence. The description in Grace of the creation of the president’s address for the funeral of beloved Reverend Clementa Pinckney is a case study in collaboration in the speech drafting process.

During the same sad week, Mr. Keenan wrote statements for the president to deliver if the Supreme Court gutted the Affordable Care Act and ended marriage equality. We now know that those speeches on the Court decisions weren’t necessary. And the eulogy for Reverend Pinckney will be remembered as one of the great presidential addresses. Mr. Keenan concedes that this eulogy was his most difficult assignment after working on more than three thousand speeches for President Obama.

Mr. Keenan’s heartfelt and moving memoir Grace shows how a gifted president and his devoted team worked together tirelessly for a more fair, more tolerant, and more just nation.

Mr. Keenan is best known as an acclaimed speechwriter. He studied political science at Northwestern University and, after graduation, worked in the office of US Senator Ted Kennedy. After several years in that role, he earned a master's degree in public policy at the Harvard Kennedy School. He subsequently secured a full-time position with Barack Obama's presidential campaign in Chicago in 2008.

When President Obama took office in 2009, Mr. Keenan became deputy director of speechwriting in the White House. He was promoted to chief White House speechwriter during the president’s second term. He also collaborated with President Obama on writing projects from the end of his term in 2017 until 2020. He has said that he wrote his dream speech just four days before Obama left office—welcoming the World Champion Chicago Cubs to the White House.

Mr. Keenan is currently a partner at the speechwriting firm Fenway Strategies and, as a visiting professor at his alma mater Northwestern University, he teaches a popular course on political speechwriting. Today, he and Kristen live in New York City with their daughter, Grace.

Mr. Keenan graciously responded by email to a long series of questions on his new book and his work.

Robin Lindley: Congratulations Mr. Keenan on your engaging new book Grace, a revelatory exploration of your work as chief speechwriter for President Obama at an incredibly turbulent time. Before getting to that period, I wanted to ask about your background. You majored in political science at Northwestern University. What sparked your interest in politics?

Cody Keenan: Well, I enrolled at Northwestern as a pre-med student. I wanted to be an orthopedic surgeon after a football injury forced a knee reconstruction. Chemistry 101 weeded me right out, though. I just wanted to take biology.

But politics had always been an interest. My parents often argued about politics at the dinner table – my mom was a Kennedy Democrat from Indiana; my dad was a Reagan Republican from California – and whatever could make them so animated was something worth exploring. One value they both hammered into me, though, was the idea that I should do whatever I could to make sure more people had the same kind of opportunities I did growing up – and by the time I graduated from college, only one political party cared about that.

Robin Lindley: Did you have academic or other training in speechwriting?

Cody Keenan: No. Writing was something that always came naturally, and I think that came from being a voracious reader. I won every summer competition at the local public library. You can’t be a good writer without being a great reader.

Robin Lindley: You interned for legendary Senator Ted Kennedy after college. Did your duties in that role include speechwriting?

Cody Keenan: Not as part of the internship, or even the first position after that. Three months as an intern got me hired to answer his phones. I ended up working for him for almost four years in four different roles.

In 2004, when I was on his staff for the Committee on Health, Education, Labor, and Pensions, the Democratic National Convention was in Boston, his hometown. We all took a week off work to volunteer. I was on the arena floor the night that Barack Obama gave the speech that made him famous. He walked into the arena anonymous; he walked out 17 minutes later a global megastar. It shows you what a good speech can do.

Once we were back in Washington, I must have talked about that speech a lot, because that’s when my boss asked if I could write a speech. I don’t know if he meant did I have the time or did I know how, but it didn’t matter – I lied and said yes.

Robin Lindley: Senator Kennedy was known as a great legislator in the Senate who could work across the aisle. Did you work with him or his staff on any significant projects? What did you learn from that internship?

Cody Keenan: As an intern, one of my tasks was to read and route mail that came to the office. Perfect strangers were writing a senator – often one who wasn’t even their senator – to ask for help. There’s an act of hope involved in that. Even when it was a tough letter to read, even when you could see that the writer had wiped a tear from the page, they hoped that someone on the other end would care enough to help. I learned right away just how important this stuff is.

Later, as a staffer, I worked on all sorts of legislation. Kennedy was involved in everything. Health care, minimum wage, education, immigration, the Iraq War, the response to Hurricane Katrina, Supreme Court nominations – we were always busy. And with good mentors, I learned that just as important as the policy itself was often the way you communicated it.

Robin Lindley: What attracted you to working for President Obama during his first presidential campaign in 2007? Did you work as a speechwriter before his election?

Cody Keenan: Well, what struck me about that 2004 speech was that he described politics the way I wanted it to be – as this collective endeavor in which we could do extraordinary things that we couldn’t do alone. His only speechwriter at the time, Jon Favreau, called me early in the campaign and asked if I wanted to join the speechwriting team he was putting together. I said yes.

Robin Lindley:  What did you learn or do to prepare for work as a speechwriter for President Obama, one of our most celebrated American writers and thinkers even then? Did you go back and read works of some of the great White House writers such as Ted Sorensen, Bill Moyers, and Peggy Noonan? Did you read speeches by the likes of Lincoln, FDR, JFK, Churchill, and other memorable leaders?

Cody Keenan: I didn’t. I’d already read the canon of presidential hits, but to be a speechwriter for someone means writing for that specific person, helping him or her sound not like anybody else, but rather the best version of himself or herself.

Robin Lindley: I read that you didn’t personally meet President Obama until his first day at the White House in 2009. Yet, you had been working for him for a year and a half. What do you remember about your first meeting and your early days at the White House?

Cody Keenan: Yep – he visited Chicago headquarters maybe three times during the campaign. He was out campaigning! And when he did visit, it was for strategy sessions with his top aides and to address the entire staff at once, not to meet with his most junior speechwriter.

On our first day at the White House, he called me into the Oval Office because he’d seen my name at the top of speech drafts and he just wanted to put a face to the name. Those early days were drinking from a firehose: the economy was falling apart, millions of Americans had lost their jobs and their homes in just the four months before he took office, and millions more would in the first few months after. There was no honeymoon; we were busy trying to turn that firehose onto the fire.

Robin Lindley: Did you immediately start as a speechwriter once President Obama began work at the White House?

Cody Keenan: I did.

Robin Lindley: How does one prepare for a job that requires knowing the voice and propensities of the person they are writing for?

Cody Keenan: Well, I had a year and a half foundation from the campaign. I’d read his books to absorb his worldview, listened to the audio versions to absorb his cadence, and paid close attention to his edits. He was a writer. He was our chief speechwriter. And he was our top editor. I learned a lot just by poring over his edits to our drafts.

Robin Lindley: How did your relationship with President Obama evolve over his eight years in office? You wrote that working for this acclaimed writer could be terrifying. It seems he offered good advice to you such as having a drink and listening to Miles Davis or John Coltrane. Or reading James Baldwin. Did you see him as a kind of coach or mentor?

Cody Keenan: I was the junior writer on the team for the first two years, sitting across the driveway in the Eisenhower Executive Office Building. Then a series of high-profile speeches got me promoted to deputy director of speechwriting, and I moved into a West Wing office with Jon Favreau. Once he left after the second inaugural, I took over as chief speechwriter. So naturally, our relationship evolved – I went from seeing Obama every couple weeks to every week to every day.

I saw him as my boss. I guess as a writing coach of sorts. And sometimes even as an uncle or older brother who loved to dispense advice. He hosted my wife and our families and our best friends at the White House on our wedding day. It was his idea. He didn’t have to do that.

Robin Lindley: Are there other bits of President Obama’s advice that stick with you?

Cody Keenan: “Don’t impart motives to people.” That’s advice we could use more of.

Robin Lindley: Indeed. A big question, but can you give a sense of the speechwriting process? What sparks the process? Who is involved? What’s it like to collaborate with a team of writers and other staff?

Cody Keenan: He viewed speechwriting as a collaboration. He just wanted us to give him something he could work with. We wrote 3,477 speeches and statements in the White House, and believe it or not, he edited most of the speeches, even if lightly. But he couldn’t be deeply involved with all of them.

For any speech of consequence, though, we’d start by sitting down with him and asking “what’s the story we’re trying to tell?” Then the speechwriting team would talk over each speech, helping each other get started. Then we’d all go back to our own laptops and draft whatever speech we’d been assigned. The drafting was not a collaborative process. The revising was – with each other, but more importantly with him.

Robin Lindley: What’s the fact checking process for a speech draft before it goes to the president? It’s interesting that your future wife Kristen was one of the very diligent fact-checkers you relied on.

Cody Keenan: Yeah, she literally got paid to tell me I was wrong. Every day. For years. It was her team’s job to fireproof the president – to make sure he never said something he shouldn’t, with someone he shouldn’t be with, at a place he shouldn’t be visiting. They prevented countless alternate timelines where we’d have to do some cleanup in the press. They saved us from ourselves again and again.

Robin Lindley: Congratulations on your marriage to Kristen with the magnificent White House wedding. Your blossoming romance runs like a red thread through your book. You note that President Obama would stay up late at night to review and edit drafts of speeches he would give the next day. And you often received late night calls from him or met with him in the wee hours. How did those final hours work with a speech? It seems the president would often edit to the time of delivery.

Cody Keenan: He always edited in the wee hours of the morning. It’s when he preferred to work. It was rare that we were editing right up until delivery. If we were flying somewhere for a speech, he’d always go over it one or two final times on the plane. But he didn’t like chaos. In fact, the reason he edited so heavily, so often, was because he wanted the speech exactly the way he wanted it. Sometimes it was perfectionism. But it’s really just preparation.

Robin Lindley: What did you think when the president ad libbed or changed something from your draft as he spoke? I think you said something to the effect that he was a better speechwriter than all of his writing staff.

Cody Keenan: I loved it. I can’t think of a time I cringed at an adlib. He had a knack for it. It could be a little white-knuckled if he did it at the end of the speech when there’s no text for him to come back to. In that case, he’d have to build a new runway while he was speaking on which to land the plane.

Robin Lindley: When does humor come into the mix? Do you write for events such as the White House Correspondents Dinner? President Obama had some zingers for his eventual birther successor at these events.

Cody Keenan: Those were our most collaborative sets of remarks. The entire team would pitch jokes, and we’d reach out to professional comedy writers to solicit their help. We’d start out with about 200 jokes and whittle them down to the 20 funniest. Sometimes, none of your jokes would make the cut. You’ve got to have a thick skin.

Robin Lindley: And you and the other speechwriters did not use a template such as this speech is on the economy or this speech is political, so we’ll use the file template X or Y. You were responsible for more than three thousand speeches, yet it seems each speech was approached as a unique project.

Cody Keenan: Yes and no. We never used a template. But while each individual speech should tell a story, so should all speeches. What I mean by that is, we were mindful that every speech we wrote fit into a longer narrative arc – both of his presidency and his entire political career.

Robin Lindley: You worked for the president through his eight years in office. How did you come to focus on ten days in 2015 in Grace as the president dealt with the horrific 2015 mass murder of nine Black parishioners by an avowed white supremacist at Mother Emanuel Church in Charleston, South Carolina. The president then also was preparing to address two impending Supreme Court decisions that would determine the fate of the Affordable Care Act and marriage equality.  

Cody Keenan: Yeah. People will remember all of the stories and all of the events in this book. They won’t remember that they all happened in the same ten-day span. I mean, that in and of itself is a story that demands to be told. In addition to a massacre carried out by a self-radicalized white supremacist, there was a very real chance that the Supreme Court would say no, people who work two or three jobs don’t deserve help affording health insurance; no, gay Americans don’t get to get married like the rest of us; all of those people are now second-class citizens. And the first Black president has to serve as the public narrator and provide some moral clarity for all of this.

Someone once described it as ten days too implausible for an entire season of The West Wing. But it’s also what those events symbolized and how they fit in the broader, centuries-long story of America – whether or not we’re actually going to live up to the ideals we profess to believe in. Whether we’re going to stand up to white supremacy, and bigotry, and people who profit from inequality and violence. And that week, the answers were all “yes.”

Robin Lindley: With the Charleston massacre, the president had to address another mass shooting and he was tired of giving eulogies after the murders at Sandy Hook and all of the other heartbreaking mass shootings during his term in office. How was his speech at Mother Emanuel Church different from previous addresses? What was your role in creating this memorable speech? How did the speech go beyond a eulogy to become a message of reconciliation?

Cody Keenan: We had done over a dozen eulogies after mass shootings at that point. And this goes back a few years: the shooting in Newtown, Connecticut, where 20 little kids were murdered in their classrooms, along with six of their educators, came right after he’d been reelected.

And he put aside his second term agenda right out of the gate to try to do something about guns, because what an abdication of leadership that would be if he didn’t. And he had a little boost from Joe Manchin and Pat Toomey, an archconservative from Pennsylvania with an A-rating from the NRA. They both had one. They decided to work together on a background checks bill. And even though we knew the odds in the Senate would be long, that gives you something to try for. And so, we traveled the country for a few months. He made it a centerpiece of his State of the Union address. Big, emotional, powerful ending. And in the end, in April, Republicans blocked a vote on it with the parents of the Newtown kids watching from the gallery.

And that’s about as cynical as I’ve ever seen Barack Obama. Yet he went out and spoke in the Rose Garden with those families. I handed him a draft of the speech and he said, look, I’m going to use this as a template, but I’m just going to wing it. And after that speech he came into the outer Oval Office, which is this room just off the Oval where his assistants sit, and once the door closed he was almost yelling: “What am I going to do the next time this happens? What am I going to say? I don’t want to speak. If we’ve decided as a country that we’re not going to do anything about this, then I don’t want to be the one who closes the cycle every time with a eulogy that gives the country permission to move on.”

Ultimately, we did decide to do a eulogy after Charleston, and it was his idea to build the structure of the speech around the lyrics to “Amazing Grace.”

Robin Lindley: I think everyone was surprised and moved when President Obama sang “Amazing Grace” during the Charleston speech. Were you surprised or was that part of the plan for the speech?

Cody Keenan: That, too, was his idea. He told me on Marine One that morning that, if it felt right in the arena, he might sing it.

Robin Lindley: You now teach speechwriting at your alma mater, Northwestern University. Do you have any other advice for prospective speechwriters?

Cody Keenan: It’s fun, training a new generation of speechwriters and trying to convince them that public service is worth it. What I didn’t expect was that my students would end up teaching me quite a bit in return. There’s an impatience to their generation that mine didn’t have to have. Politics and the pace of change are now existential for them in a way they haven’t been since schoolkids were doing duck-and-cover drills during the Cold War. They’re doing those duck-and-cover drills again because of guns. They can see an end to their future because of climate change.

And let me tell you, when they see a party more into policing books than policing assault weapons; when they see a party more exercised about drag queens than about climate change – they feel a real disdain there. I want them to harness it, though, in a productive way. And part of that means telling them the truth. To tell them that change has always taken time isn’t fun. To tell them that they’re not always going to win isn’t fun. To tell them that even when they vote in every election, they’ll never elect a leader who delivers everything they want. Well, that’s just not inspiring. But it’s also true.

Nobody ever promised us these things. That’s democracy. But here’s the thing about democracy: we get to refresh it whenever we want. Older generations aren’t entitled to their full tenure. So, while I counsel patience and realism, I also fan the flames of their impatience and idealism. I tell them to join a campaign now, to start an advocacy group now, to run for office now. Stay at it not just until the people in power are more representative of what America actually is, but until they’re the ones in power themselves. Then make the system your own. Faster, smarter, more responsive to the needs of a modern, pluralistic democracy. And one way to do that is through my cardinal rule of speechwriting: help more leaders talk like actual human beings.

Robin Lindley: You also continue to work as a speechwriter and you note that you worked with President Obama after his tenure in office. Did you consult with the president on writing projects such as his monumental memoir A Promised Land?

Cody Keenan: I worked for him full-time for four years after we left the White House, ultimately leaving after the 2020 election so that I could devote my time to writing Grace.

Robin Lindley: What sorts of clients do you work with as a speechwriter now?

Cody Keenan: All kinds. Progressive candidates, nonprofit, academic, and corporate. Our rule is that each client has to be putting more into the world – hopefully much more – than it’s taking out. But the best part of it is to be surrounded by a team of idealistic young speechwriters again. I missed that over the four years after the White House.

Robin Lindley: Would you consider working with a president at the White House again?

Cody Keenan: Maybe. Depends on who it is. For a speechwriter, it really, really depends on who it is. Speeches require a deeper relationship than a lot of other staff positions. But I’m also older and have a young daughter. Both of those things make the grind of the White House much less attractive.

Robin Lindley: It seems we’re more divided now than during the Obama years. I never thought I’d see Nazi rallies in America in the 21st century. Where do you find hope for our democracy at this fraught time?

Cody Keenan: My students. While politics as it is may make them cynical, they’re not cynical about America and its possibilities. Somehow, they’re not as plagued by fear or suspicion as older generations; they’re more tolerant of differences of race, culture, gender, and orientation, not only comfortable navigating all these different worlds but impatient to make them all fairer, more inclusive, and just plain better. They’re consumed with the idea that they can change things. They just want to do it faster.

Robin Lindley: Is there anything you’d like to add for readers about your book or your work?

Cody Keenan: You’re going to love Grace. I wrote it because it’s a hell of a story and it’s the most intimate look at Obama’s approach to speechwriting that exists.

But I also wrote it, as I told Stephen Colbert when he had me on, to blow up people’s cynicism about our politics. Because politics isn’t some rigid system we’re trapped under. It’s us. It’s only as good as we are. That’s why I was so happy when Obama called it “an antidote to cynicism that will make you believe again.”

But I was just as happy to read a review that described it this way: “Grace is a refreshing departure from the flood of scandalous ‘literary’ flotsam that typically washes up in the wake of the transfer of power. This book might not make breaking-news headlines, but it just might restore a little faith in the presidency and the backstage men and women who work around the clock to fulfill the chief executive’s promises to the American people.” The publicist at the publishing house didn’t love the part about “breaking-news headlines,” because that’s what sells books – but I was proud to write it the way I did. There’s no sleazy tell-all in this book, but there are a bunch of great never-before-told stories about what it’s like to sit alone with Obama and unlock the right words for a fraught moment.

Robin Lindley: Thank you Cody for your generosity and thoughtful comments. Your book captures the reality of work in the tense and often exhilarating environment of the White House with a president who was devoted to creating a more just and tolerant nation. Best wishes on your continuing work and congratulations on Grace.

Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, BillMoyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. Robin’s email: robinlindley@gmail.com.

Dangerous Records: Why LGBTQ Americans Today Fear the Weaponization of Bureaucracy

Prisoners at Sachsenhausen concentration camp wear triangle badges indicating the nature of their offenses against Nazi social code (pink would indicate homosexuality). National Archives and Records Administration, 1938.

The recent rise of far-right political movements in the United States and globally has prompted historical comparisons to the Nazis. The atrocities committed by the Nazis have been studied widely, particularly in reference to the Jewish victims of the Holocaust, but it is also important to understand lesser-known victims and the ways that prior discrimination shaped their persecution. In focusing on the pre-war experience, it is crucial to understand how the Nazis relied on bureaucratic information to know whom to target, especially when the classification was not an obvious ethnic or religious one (as with assimilated and secular Jews, or with gay men, lesbians, and others persecuted for gender or sexual behavior). Today, there are important lessons to learn about the dangers that bureaucratic information gathering, combined with escalating prejudice and vilification, can present.

The rise of the Nazi party in Germany brought with it laws restricting access to literature and laws governing the treatment of what we today would refer to as LGBTQ+ people. Paragraph 175, a law criminalizing sexual relations between men, was established in 1871, but the Nazis revised it in 1935 to broaden the range of acts that could be punished. Queer men were targeted early in the Nazi regime, which placed heavy blame on them for the loss of the First World War; Nazi ideology justified discrimination and repression by claiming that a lack of masculinity was a contributing cause of the country’s downfall and economic depression. Though only about half of the 100,000 men arrested for the alleged crime of homosexuality were prosecuted, that figure is still large enough to raise an important question: how did the Nazis know whom to target, and where was the information coming from? Political factors appear to have been involved, because a majority were prosecuted within six weeks of Heinrich Himmler’s assumption of control of internal security in 1943. Each man was reported in a similar manner, whether through a private individual’s report, a police raid, or the “Pink List.”

The gathering of information about members of minority groups by bureaucratic organizations has a startling history of being used for oppressive ends, particularly by the Nazis. A clear example is the Nazis’ use of the “Pink List,” compiled from the records of support organizations such as the Scientific Humanitarian Committee and from reports by private individuals, and held by the police. The Scientific Humanitarian Committee aimed for “Justice Through Science” and espoused the biological theory of homosexuality, the idea that sexuality is an innate biological feature rather than a characteristic of weakness and psychological deviance. The SHC was targeted early in Hitler’s rise because of its advocacy for homosexuals. The committee kept lists of homosexual Germans for support and scientific purposes, but those lists were seized by the Nazis and then used to target the men named on them.

The story of Pierre Seel illustrates the danger that could befall a young gay man who interacted with the police on any other matter. Seel arrived at his local police station to report a stolen watch and, when questioned about the specific circumstances, revealed that he had come from Steinbach Square, a well-known hangout for gay men seeking each other’s company. After intense questioning he was released and assured that nothing would come of the compromising information, but three years later he was arrested as a suspected homosexual because of the list he had been placed on after leaving the police station. That list was compiled by police and security forces over the years and augmented by confessions extracted from imprisoned gay men, who were raped and tortured to compel them to add additional names. The Pink List shows how dangerous information that categorizes someone into a minority group can be, particularly in the hands of those in power with ill intentions.

While the Holocaust is an unmatched and exceptional example of cruelty and systematic persecution of social outgroups, it is nevertheless crucial to recognize similarities between those events and the present, especially where prejudice joins with bureaucratic state power. Today, transgender Americans are being framed as deviants, accused of undermining traditional gender roles, and described as “groomers” and child sex abusers. Armed vigilantes have harassed people attending drag performances, and activists are seeking to remove books about gender and transgender experiences from schools and libraries. When the power of the state aligns with these expressions of prejudice, and with the identification of outgroups as a threat to children, family, and society, there is real cause for concern.

Anti-LGBTQ+ sentiment has been particularly vociferous in Texas. Texas Attorney General Ken Paxton’s recent request for a list of individuals who have changed their gender on state-issued driver’s licenses and other departmental documents bears concerning similarities to the “Pink List” compiled by Nazi officials in 1930s Germany. The request itself made transgender Texans subjects of surveillance, implying that the state views them as dangerous. According to an email sent on June 30, 2022 by Sheri Gipson, the chief of the DPS’s driver license division, the Attorney General’s office “wanted ‘numbers’ and later would want ‘a list’ of names, as well as ‘the number of people who had a legal sex change’.” The first request produced over sixteen thousand results, but the state agencies found it difficult to meet the request: many gender changes simply corrected filing mistakes (a cisgender person’s sex had been recorded in error, and the change affirmed their identity). A subsequent attempt narrowed the data to court-ordered document changes, which would identify transgender people specifically. Although the agency could not accurately produce that data, the episode, alongside the various laws being introduced throughout the state, such as prohibitions on gender-affirming care and limits on LGBTQ+ lessons in school, raises the unsettling question of what damage such information gathering could do, not only now but years from now.

The weaponization of personal information held by state organizations should not be taken lightly. It has presented, and will continue to present, danger to those the state targets as threats. Laws aimed at transgender children, restricting their access to gender-affirming care or to affirming ideas in books, have become commonplace in several Republican-led states, but an explicit attack on adults raises the question of where this will stop and who will stop it. These laws send a clear message that the right does not want transgender people to have a presence in society, either in everyday life or in the media. Taken together, the proposed laws restricting gender-affirming care, classifying the parents of transgender children who receive such care as child abusers, limiting LGBTQ+ lessons in school, and banning books and media that showcase queer people attempt to erase the queer experience from modern life and from history alike.

All of these efforts depend on being able to identify those who are not living with the gender assigned to them at birth. Bureaucratic records may not strike the public as dangerous, but when government officials can access the records of people whose place in society they seek to erase, the consequences can be dire. Other vulnerable groups will be targeted in turn, and it is necessary to examine the historical implications and repercussions of the blatant targeting of these groups.

150 Years of "Zero-Sum Thinking" on Immigration

Last week Title 42, a Trump-era policy that had limited immigration for the past three years, expired. Still, the Biden administration warned people arriving at the border that “the border is not open” and that anyone arriving there would be “presumed ineligible for asylum.” In conversation, Dr. Carly Goodman revealed the 150-year-old history behind the US government’s restrictionist stance.

Specifically, Dr. Goodman explored this history through the lens of the Diversity Lottery. Not by coincidence, Dr. Goodman is the author of Dreamland: America’s Immigration Lottery in an Age of Restriction. She’s also a Senior Editor of Made by History at the Washington Post, which provides fantastic daily commentary from the nation’s leading historians.

A condensed transcript edited for clarity is below.

Ben: Dr. Goodman, thank you so much for being here.

CG: Thank you, Ben. 

Ben: Today I'd like to explore the history of the Diversity Visa as part of a broader exploration of US immigration history writ large.

Before we go back in time, could you please just give a quick overview of what the lottery is? 

CG: Sure. The Diversity Visa Lottery has been part of our immigration policies and laws since the Immigration Act of 1990. It's an annual lottery, open to almost every country. People from eligible countries can put their names in to register for the lottery, and if they are selected, they can then apply to become lawful permanent residents of the US. 

The first lottery was held in June of 1994, and it remains one of the very few pathways to legal status in the US. It's restrictive in some sense—you still have to apply for the visa and fit qualifications like having a high school diploma or its equivalent—but also much more expansive than many parts of our immigration system.

Ben: I think that’s a good segue into exploring the system's restrictive nature, beginning in the 1870s. What were the first immigration restrictions imposed at that time?

CG: I’ll mention that my colleague, historian Hidetaka Hirota, has written about state-level restrictions prior to the imposition of federal immigration controls.

However, when the US federal government started to impose regulations on immigration, it began in the 1880s by excluding almost all Chinese immigrants, who were seen as competing for work and land in the American West. This set an enduring pattern wherein immigration policy would be racialized.

Ben: The next big evolution in immigration policy occurred in the 1920s. What happened then?

CG: This time period is really relevant to the rise of the Diversity Lottery later on.

In the early 20th century, eugenicists looked around at growing numbers of immigrants from Europe—Italians, Poles, Jews (including my ancestors), for example—and they really didn't like how the American nation (as they perceived it) was changing.

So, officials came up with national origins quotas that imposed severe numerical restrictions on the entry of people they deemed undesirable, especially southern and eastern Europeans (as well as Asians), who were seen as almost a contagion on the nation.

The national origins quotas were explicitly eugenic in nature, and they remained in place from 1924 until a major immigration reform in 1965 finally dismantled them. The Immigration Act of 1965, also known as the Hart-Celler Act, instead emphasized family ties as one of the main ways to legally migrate.

Ben: You write that the shift toward family ties wasn’t purely altruistic.

CG: No, in some ways it was a compromise meant to mollify bigots who hoped that prioritizing family ties would lead to primarily white family members joining their relatives in the States.

Ben: Relatedly, you quote the famous (but problematic) historian Arthur Schlesinger Jr., who worried that the arrival of different immigrant groups in the US might “shift the balance from unum to pluribus.”

To continue speaking in Latin, did Schlesinger’s ad nauseating fear come to fruition?

CG: Well, in addition to creating the family-ties route to citizenship, Hart-Celler imposed the first numerical restrictions on immigration from the Western Hemisphere. There’d long been migration from Latin America, both because the US destabilized many countries there, leading people to leave, and because of the need for workers here.

After 1965, Latin Americans who’d been coming to the US were still coming, but now they ran up against numerical limits. As historian Mae Ngai discusses in her work, Hart-Celler thus created the problem of undocumented immigration. Some would say that's one of the most important legacies of the act.

Ben: Moving into the 80s, how did the Irish defy the growing conceptions of illegal immigration, and what reforms did they push for?

CG: There's a long, storied history of Irish immigration to the US. For example, I live in Philadelphia, and we have a vibrant Irish-American community here.

Ben: The Philly cheese steak is famously an Irish creation.

CG: Um, it's closer to Italian.

Ben: ...right.

CG: Anyway, that sense of heritage was foremost on Irish immigrants' minds in the 80s. They felt the injustice of having to leave Ireland amid an economic crisis, just as their grandparents had, but encountered the added injustice of restrictions on their access to the US. Many Irish people came as visitors and overstayed their visas to try and find work. They were undocumented and white, contrary to the more racially motivated stereotypes of people without legal status that burgeoned in the 70s.

Congress, meanwhile, had been working on passing immigration reform. In 1986, legislators passed bipartisan reform that combined new enforcement measures with a couple of legalization programs to help people gain status and a path to citizenship.

Most of the Irish hadn’t arrived in time to qualify for the legalization, so members of the Irish communities in major cities got together to push for legislation that would help them out. Basically, they identified as their problem the Immigration Act of 1965, which had reduced the number of visas available to them relative to the laws of the 1920s.

But it wasn’t cool to say, let’s bring back the eugenicist quotas that privilege white people. Instead, congresspeople close to the Irish community—Brian Donnelly and John Kerry from Massachusetts, for example—began asking: what if we could create pathways for countries that get very few visas these days? Countries like, oh, I don't know... how about Ireland?

There were all kinds of proposals for how to do this, but they came up with a lottery because it was the cheapest way to administer it. They opened it up to all countries in the world that had sent fewer than 50,000 immigrants to the US in the previous five years.

That’s how the Diversity Lottery began.

Ben: And surprisingly, African countries, long ignored or excluded in US immigration policy, may have benefited the most from the Irish-led reform. Is that right?

CG: Exactly. The lottery began in 1994. The following year, 6.5 million entries from around the world vied for just 55,000 visas. 
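For readers who want those odds worked out, here is a minimal sketch in Python. The uniform random draw and the synthetic 10,000-name pool are illustrative assumptions of mine, not the State Department’s actual procedure; only the entry and visa totals come from the interview.

```python
import random

# Illustrative only: models the draw as a uniform random selection.
ENTRIES = 6_500_000   # entries in the 1995 draw, per the interview
VISAS = 55_000        # visas available that year

# If every entry has an equal chance, the odds per entry are:
print(f"Odds per entry: {VISAS / ENTRIES:.2%}")  # about 0.85%

# A toy draw over a small synthetic pool at the same selection rate:
pool = [f"entrant_{i}" for i in range(10_000)]
winners = random.sample(pool, k=round(10_000 * VISAS / ENTRIES))  # ~85 names
```

On those numbers, fewer than one entry in a hundred could be selected, which is the element of pure chance Dr. Goodman describes next.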

I first learned about the lottery by speaking with people in places like Ghana, Nigeria, and Cameroon. It seemed to foster a sense of admiration for the US and for its openness and diversity. In some ways, the lottery format, relying on chance, disrupted people's perception that they were being turned away from the US because of their African passports and a racist system.

Ben: At the same time, you point out that when a person from an African country was lucky enough to win the lottery, they then encountered racism in the US. It’s like: you pack your bags, ready to embark on a new life, and then you have to face all of the US’s own baggage.

CG: Yep, and the lottery aside, the 90s turned out to be a time of more immigration restriction, not less. Levels of nativism reached heights not seen since the early 20th century, and politicians at the state and federal levels began to see what anti-immigrant demagoguery could do for them. Even policymakers who were supposedly pro-immigration, like Bill Clinton, relied on and expanded a growing system of immigrant detention.

After 9/11, restrictions only intensified. Under George W. Bush, the government began to view immigration as a security threat, and more and more money was put into border militarization and enforcement.

Ben: Bringing us into the present day, you talk about how Obama and then Biden effectively maintained continuity with the past in terms of restrictive immigration procedures. Biden of course rescinded Trump's African and Muslim travel bans, but he has also kept many of Trump’s policies in place at the border.

How do you view the lottery within this still-restrictive context?

CG: Well, there’ve been efforts to dismantle the lottery over the last 20 years, and a lot of critics’ arguments are really built around zero-sum thinking: around the idea that this was a weird policy created for the Irish, and we’re already pretty diverse, so can’t we use the visas for something better?

But, that’s zero-sum thinking. As it turns out, we could just create more visas for more people. This leads to one of the central points I’m trying to make: Since the 1870s, we’ve had a restrictionist, gatekeeping system, but it’s possible to widen access if we want to. 

The thing preventing us, as it’s always been, is racism. When Donald Trump called for the lottery to be replaced with a system based on what he calls “merit,” he meant white people, as he himself clarified. Policymakers know that any reform ending the lottery would diminish the number of visas available to Africans and cut off one of their few legal pathways to the US.

So, I study the lottery because it’s a part of our immigration system that we really never hear about, and it just works. It's operated fine for thirty years. I don't want to say that the lottery is a good thing for the people who feel that they have no choice but to enter, but I know that more inclusion and more access serve our communities in so many ways, despite our government’s best attempts to limit migration for the last 150 years.

Ben: A good concluding note. I might suggest that your book be called Dreamland: A Little More Pluribus, A Little Less Unum.

CG: Ha!

Ben: Thank you so much for your time today, Dr. Goodman. This has been a pleasure.

CG: Thank you for having me.

The Mexican War Suggests Ukraine May End Up Conceding Crimea. World War I Suggests the Price May Be Tragic if it Doesn't

"American Army Entering the City of Mexico" by Filippo Costaggini, 1885. Architect of the Capitol.

In April 1846, the United States invaded Mexico after a highly disputed incident at the border. Freshman Congressman Abraham Lincoln challenged President James Polk’s account of Mexican provocations as misleading and demanded to know the “spot” where they supposedly took place.

None of the major European powers got involved on either side. Great Britain remained officially neutral during the war, although it objected to an American blockade that interfered with its trade with Mexico. France was uninvolved but insisted that Mexico remain an independent nation.

By September 1847, American forces had captured the Mexican capital and forced a surrender. An overwhelmed Mexico signed the 1848 Treaty of Guadalupe Hidalgo, ending the war and transferring to the United States over half of its territory, including modern-day California, Nevada, and Utah, and most of present-day Colorado, New Mexico, and Arizona. Mexico was also forced to drop its claims to the former Mexican province of Texas and to accept the Rio Grande as the new border between the countries. In return, the United States paid Mexico a consideration of fifteen million U.S. dollars, worth between 500 and 600 million dollars in today’s money.
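A rough sanity check on that conversion follows; the 35-to-40 price multiplier is my assumption, drawn from commonly cited consumer-price estimates for 1848 to the early 2020s, not a figure from the article.

```python
# Scaling the 1848 payment by an assumed consumer-price multiplier.
payment_1848 = 15_000_000
for multiplier in (35, 40):
    print(f"x{multiplier}: ${payment_1848 * multiplier / 1e6:,.0f} million")
# x35: $525 million; x40: $600 million -- matching the article's range.
```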

Mexico is never going to receive its stolen territory back. The annual economic output of California alone is about $3.5 trillion, approximately three times that of Mexico.

Fast forward to 1913, when Europe was divided into two military alliances. The Triple Alliance of Germany, the Austro-Hungarian Empire, and Italy (the nucleus of the wartime Central Powers, which the Ottoman Empire and Bulgaria later joined) faced off against the Triple Entente of Great Britain, France, and the Russian Empire (later joined by the United States, and by Italy when it changed sides). The alliances provided some stability in Europe, much as NATO and the Warsaw Pact did during the Cold War, but they also set the conditions for events to spiral rapidly out of control.

On July 28, 1914, Austria-Hungary invaded Serbia after the assassination of the Austrian Archduke in Sarajevo, the capital of Bosnia, which Austria-Hungary had annexed in 1908. The assassins hoped to liberate Bosnia and Herzegovina from Austro-Hungarian rule. On August 8 Montenegro joined in the defense of Serbia, and on August 9 Russia, an ally of Serbia, attacked German and Austro-Hungarian positions. Meanwhile, Germany invaded Belgium, bringing France and Great Britain into the war. In the east, Russia collapsed, but in the west the two alliances fought to a stalemate. The war dragged on until the German collapse in the fall of 1918. Military and civilian casualties in World War I, deaths and injuries together, are estimated at 40 million people. The punitive treaty that ended the war became an underlying cause of World War II and the deaths of another 80 million people.

Fast forward again, this time to recent decades. With the collapse of the Soviet Union, Ukraine and Russia became independent countries, and the former Soviet Black Sea naval base ended up in Ukraine, Crimea having been administratively transferred from Russia to Ukraine in 1954. In 2014, a Ukrainian government allied with Russia was replaced by a westward-leaning one, and Russia seized Crimea and its warm-water naval base in violation of the international agreements, established after World War II, protecting the territorial integrity of nations. In response, western nations placed economic sanctions on Russia, and NATO expanded eastward and considered admitting Ukraine into the alliance. Russia responded by invading Ukraine with the goals of installing a friendly government there and annexing territories along the Russian-Ukrainian border. The invasion stalled when NATO, including the United States, armed the Ukrainian military with weaponry more sophisticated than that used by Russian forces. It is now a war between NATO and Russia, although still a limited one, not just a war between Ukraine and Russia.

Ukrainian President Volodymyr Zelensky continually presses NATO and the United States to provide Ukraine with more advanced weaponry. NATO has already agreed to deliver tanks, anti-missile systems, drones, and rockets, but Zelensky wants fighter jets that would allow Ukraine to move beyond a defensive war and attack targets deep inside Russia.

The United States and NATO face a serious dilemma. They are committed to supporting Ukraine and preserving its national integrity, but Zelensky is demanding that Russia return all occupied territory, including Crimea, and pay reparations to rebuild areas of Ukraine that were destroyed by the Russian invasion, demands that Russia will never accept. Russia will not return Crimea to Ukraine, just as the United States will never return California to Mexico.

If NATO and the United States deliver jet fighters and Ukraine uses them to attack Russian targets, including cities, the world faces an escalating domino effect similar to the one that started World War I and led to World War II. That is why, as a historian, I am deeply worried about events playing out in Ukraine. The only peaceful resolution I see is Ukraine agreeing to accept Russian control over Crimea and some of the disputed border areas in exchange for the NATO alliance rebuilding the areas destroyed by the war. NATO and Russia will then have to resolve their remaining differences, but I am not hopeful they will find an amicable solution.

The Roundup Top Ten for May 25, 2023

Why Historians Love Comparing Themselves to the Great Detectives

by Carolyn Eastman

The best point of comparison between Holmes and the historian isn't in solving the case but in the struggle to make sense of the facts.

Hollywood Strikers Carry the Legacy of Ned Ludd

by Gavin Mueller

Our techno-utopian society holds the Luddites in low regard, but their actual history helps explain what's at stake in the screenwriters' strike and in any labor conflict where new technology threatens workers' livelihoods.

Republican Push for More Capital Punishment Echoes Crime Panic of the 1980s

by Duncan Hosie

The Supreme Court decision in 1976 that allowed the states to resume executions coincided with a rise in anxiety over crime and pushed politicians to pledge more executions. 

After Dobbs, Abortion Politics are Straining the Republican Coalition

by Daniel K. Williams

When the party could focus on appointing anti-Roe judges, the Republicans could make abortion a political issue without having to decide matters of policy that inevitably leave parts of their coalition angry and disappointed. Have they lost by winning? 

"Return to Rigor" Isn't the Answer to Restoring Student Engagement

by Kevin Gannon

A post-COVID reaction to the improvisations made on grades, schedules, and deadlines supposes that students are suffering from too much flexibility, but a singular focus on rigor won't address the causes of disengagement.

How to Fight Back Against the Right's "Parents' Rights" Moral Panic

by Jennifer Berkshire

Parents' fears about losing control over their children have been the raw material for potent politically-motivated moral panics for a century and more. But those panics aren't irresistible, because parents everywhere still value public schools as democratic community institutions.  

Trump and DeSantis: Two Peas in a White Nationalist Pod

by Clarence Lusane

Any Republican candidate will need to lean in to the politics of white Christian nationalism ascendant on the right; Trump has needed the MAGA movement as much as it's needed him. 

"Salts" are Part of Labor's Fight to Organize. They were once Part of the Antiwar Movement

by Derek Seidman

Taking a job with the covert intention of organizing the workplace is a time-honored labor tactic that's back in the news. Some dedicated activists in the 1960s "salted" the U.S. military in the hopes of building an antiwar movement within the ranks. 

Coca-Cola Can't Go Green While Selling Drinks Cold

by Bart Elmore

If the worldwide beverage giant wants to reduce its carbon footprint, it's time for it to reverse its historical commitment to make its drinks available cold—in electric coolers—across the globe.

The Writers' Strike Opens Old Wounds

by Kate Fortmueller

The plot of each sequel of negotiations between the producers and writers has followed a formula of compromise for mutual self-preservation. Technological advances have convinced studio heads that they no longer need the labor of writers enough to keep compromising. 

Texas Judge Revives Anthony Comstock's Crusade Against Reproductive Freedom

In April, a Texas judge ruled invalid the Food and Drug Administration’s approval of a pill used in over half the abortions in America. Going further, he invoked the federal Comstock Act to declare the pill “nonmailable.” Twenty Republican attorneys general promptly warned pharmacy chains to halt its sale. Such sales, they argued, would violate a law initiated 150 years ago by a Connecticut farm boy turned dry-goods salesman at the start of his battle against reproductive rights.

From an early age, Anthony Comstock showed his moralistic zeal. At eighteen, he broke into a gin mill near his family’s farm and drained the liquor onto the floor. Enlisting after Gettysburg, he fought his fellow soldiers’ vices – liquor, lust, swearing, breaking the Sabbath – as vigorously as he did the Confederates. Moving to New York, he futilely tried to jail a smut dealer who loaned obscene books to schoolboys.

The “hydra-headed monster” of smut was where he made his first big kill. On March 2, 1872, he and a police captain raided booksellers along Manhattan’s Nassau Street, the heart of America’s smut industry. In one shop, he purchased The Confessions of a Voluptuous Young Lady of High Rank. In others, he bought Women’s Rights Convention and La Rose d’Amour. Evidence in hand, the pair secured warrants from a judge who agreed the books were obscene. Returning to Nassau Street, they arrested eight culprits and confiscated five bushels of obscene merchandise.

Later that month, Comstock targeted a crime catering more to women, one he considered an immeasurably greater evil. Smut merely inspired lust; this crime enabled it. His specific target was a man, Dr. Charles Manches, but the services Manches offered helped women overcome the safeguards God had built to control their passions: the fear that could make a woman on the brink stop and preserve her chastity.

Manches advertised his “French Imported Male Safes” as “a perfect shield against disease or conception.”  For ladies wishing to take matters into their own hands, he offered “Ladies Protectors,” commonly known as womb veils.  If those devices failed to prevent pregnancy, he promised “Ladies Cured at One Interview, with or without medicine, $5.”  He was one of over a hundred abortionists in the city, according to the New York Times.

With support from the YMCA, Comstock continued his raids. By mid-year, he had eight smut cases pending in New York courts, but prosecutors continually requested postponements. When one case finally proceeded, the defense didn’t contest Comstock’s testimony; it simply argued that the confiscated material was no more obscene than passages in the Bible. The argument wasn’t convincing: ten jurors voted to convict. But the two who didn’t meant the defendant walked. That proved the best outcome of his pending cases.

Frustrated under state law, Comstock changed tactics.  Seven years earlier, Congress had banned obscenity from first class mail.  The law was weak, narrowly defining obscenity and prohibiting postmasters from unsealing mail even if they knew a piece contained it.  Prosecutions had barely hit half a dozen.

Comstock began ordering smut by mail.  After receiving obscene goods, he obtained warrants in US Circuit Court.  Four dealers were convicted and sentenced to one year in jail and $500 fines – too lenient for Comstock, but the maximum the law allowed.

Raiding one dealer’s medical associate, he discovered the doctor’s teenage patient awaiting his third attempt to abort her fetus.  But abortion was a state crime.  A district attorney killed that case.

Dissatisfied, Comstock outlined ideas for a tougher federal law to Morris Jesup, the YMCA’s president. Jesup got US Supreme Court Justice William Strong to finalize a bill for Congress. In February 1873, Comstock visited the US Capitol to exhibit obscenities – books, sex toys, rubber goods. Attending senators declared they would accept any bill he wanted so long as it was constitutional. They could pass it before the current session closed for President Grant’s second inauguration on March 4.

New York Congressman Clinton Merriam introduced the bill in the House, expecting to pass it quickly under a suspension of the rules.  Connecticut Senator William Buckingham followed in the Senate.

An optimistic Comstock got a head start on enforcement. On Treasury Department letterhead, he contacted nine suspicious doctors. “I am an employee of the Treasury,” he wrote under the pseudonym Anna M. Ray. “I was seduced about four months ago, and I am now three months gone in the family way.” “Anna” begged each doctor to send something to relieve her condition: “For God’s sake do not disappoint a poor ruined and forsaken girl whose only relief will be suicide if you fail me.”

The optimism was premature.  With resisting legislators invoking rules and demanding changes, weeks passed.  On Saturday evening, March 1, the House met for its final session.  Comstock watched.  At midnight, unwilling to break the Sabbath, he gave up.  Leaving the Capitol, he spent a sleepless night too depressed even to pray.  Not until dawn could he accept the failure as God’s will. Only when he ran into the Senate’s chaplain did he learn the news.  “Your bill passed the House at two o’clock this morning,” the chaplain said.  It was immediately sent to the Senate and passed.  President Grant signed it the next day.

His bill launched Comstock’s four-decade career fighting smut dealers, abortionists, birth control advocates, artists, playwrights, and poets.  Its opening section foretold his war on reproductive rights, explicitly banning anything – device, medicine, tool, information, advertising – “for the prevention of conception” or “for causing unlawful abortion.”

Women bookended that career. As he was pushing his bill in Congress, Comstock had “Free Lover” Victoria Woodhull and her sister Tennie Claflin indicted for publishing an obscene article exposing the adultery of Reverend Henry Ward Beecher. The article might have been libelous had it been false, but it wasn’t obscene. Comstock, though, guessed the arrests would be a publicity coup that would help his bill pass. After a year of harassment, the sisters were acquitted.

Under his bill, Comstock quickly attacked abortionists—twelve in Chicago, seventeen in New York. But Chicago judges imposed trivial fines, and in New York only three served serious time. Through 1875, Comstock claimed 49 abortion arrests and 39 convictions, but even he acknowledged the difficulty of bringing the practitioners to justice. In 1878, he achieved one notable feat: he entrapped New York’s notorious abortionist Madame Restell, driving her to suicide. “A bloody ending to a bloody life,” he noted without remorse.

Months later, Comstock entrapped Dr. Sara Case.  She supplied syringes for post-coital douching with substances like vinegar and carbolic acid to prevent conception.  As their battle played out in the press, Case renamed her device the “Comstock Syringe.”  Sales soared.

The list went on until Comstock closed his career arresting birth control advocate Margaret Sanger.  She fled to Europe to escape his clutches.  Comstock resorted to convicting her estranged husband for handing out a birth control pamphlet.

Of course, the women he attacked directly were not the only victims of Comstock’s fight against reproductive rights. Others were the desperate women forced to bear children regardless of the risks to their health, their inability to support another baby, or their simple satisfaction with the family they already had.

With the Texas judge’s decision stayed and appeals underway, the battle over reproductive rights continues in Anthony Comstock’s shadow.

Forget "Finding Forrester"—Our Best Teaching Can Be Ordinary

Plato and Aristotle, detail from The School of Athens by Raphael (1509–1510), fresco at the Apostolic Palace, Vatican City.

Every few years there is a movie about a gifted young person striving to reach their potential and being formatively shaped by a teacher or mentor. Finding Forrester is a classic in this genre. The main character, Jamal, is a gifted young writer who meets a famous but reclusive novelist, William Forrester, who helps Jamal improve by challenging him and not being overly easy with the praise. In Whiplash, Miles Teller plays a gifted young drummer named Andrew Neiman whose music teacher, Terence Fletcher, is determined to draw out his genius. Fletcher’s approach is abusive and even somewhat insane. But Andrew wants so badly to be a musical legend on the level of Charlie Parker that he practices until his hands bleed and he endures the abuse.

Though university-level instruction should not involve the abusive behavior we see in Whiplash, and we probably have to be more orthodox in our teaching than an old novelist eating soup and pecking at a typewriter, we sometimes dream of working with the kind of student pictured in those films. This would be a young person with a natural gift and an unnatural drive to succeed. They want to be challenged. When you push them, they keep getting better. They go on to achieve remarkable things. You get to help launch them into the stratosphere.

In reality, very few students are going to resemble the characters in these movies. Some of your students aren’t interested in your class. Some are trying to decide if they are interested. Some are interested, but have other priorities. Some want to get better at whatever your discipline is, but do not believe that your course is part of their hero’s journey. Not everyone is going to read your comments on their paper. Not all who do will take the comments to heart. A few of your students will cheat on class assignments. Some of your students will certainly go on to greatness and many have significant abilities, but most of your students will not practice until their hands bleed.

There aren’t a lot of movies about doing an excellent job with normal students and getting normal outcomes. However, if it’s true that the process is more important than the product, those movies are missing something anyway. There’s quite a bit of true excellence in teaching that never gets associated with students who go on to win Nobel prizes or become MacArthur Fellows. Exceptional outcomes are not the only measure of excellence in teaching. An excellent teacher can teach all kinds of students. You can do meaningful work and inspire people without becoming the backstory of the next Stand and Deliver.

In films with bright students, those students arrive with the passion. Jamal is already a writer before he finds Forrester. Andrew Neiman has aspirations in the opening sequence. In real life, some college students are still searching for their passion, and some need that flame to be nourished. Even those with significant gifts are not always a half step from legendary excellence. Sometimes the role of the excellent teacher is to introduce a subject, or to guide the first steps along whatever path a student is pursuing. Sometimes what you impart is not even a passion for your own subject.

A lot of the wise mentors in movies are set in their ways and have a fixed, cantankerous approach to instruction. That may not slow down a gifted student who cannot be deterred from learning, but even then it may not be the best approach. Teaching excellence does not always take the form of pushing students to the extreme limits of their abilities. All students need to be challenged, but not all in extreme ways. Some also need to be encouraged. Struggle can help with growth, but sometimes students are struggling with things more important than our classes and don’t need provocatively difficult assignments to learn to push themselves in life. That doesn’t mean that every semester and every course has to be tailored to each individual student, or that everything should be easy, but it does mean that good teaching is much more than setting the bar at the correct height and then noting who makes it over and who doesn’t. There is a real art to setting meaningful but realistic expectations for students and ourselves.

One very unhelpful thing about films with amazing students is that they give us a distorted sense of impact. A good teacher’s legacy is not built on the genius of a single student helped along the way. A good teacher’s legacy includes people who became slightly better writers, casual readers of history, more critical viewers of documentaries, more knowledgeable citizens, and even people who just got better at passing college classes. A good legacy may even include helping direct a student to a better major for them. A good legacy is built on hundreds, thousands of recommendation letters, for all kinds of positions with varying degrees of prestige.

The reclusive novelist in Finding Forrester is roughly modeled on J.D. Salinger. Interestingly, Salinger’s novel Franny and Zooey has a relevant passage. Franny is a college student experiencing a kind of breakdown, and she judges her peers and professors along the way. Though she belongs to the Glass family, full of child geniuses, her brother Zooey suggests that she is not so much flexing her intellect as being snobbish. Both had been precocious kids on a radio quiz show, and Zooey reminds his sister that their older brother Seymour always encouraged them to do their best for the “Fat Lady”—for some unknown woman in the audience whom they imagined as really deserving and really listening. Zooey even shined his shoes, for the radio program, for the “Fat Lady.” He tells his sister:

“I don’t care where any actor acts. It can be in summer stock, it can be over a radio, it can be over television, it can be in a goddam Broadway theatre, complete with the most fashionable, most well-fed, most sunburned-looking audience you can imagine. But I’ll tell you a terrible secret—Are you listening to me? There isn’t anyone out there who isn’t Seymour’s Fat Lady. That includes your Professor Tupper, buddy. And all his goddam cousins by the dozens. There isn’t anyone anywhere that isn’t Seymour’s Fat Lady. Don’t you know that? Don’t you know that goddam secret yet? And don’t you know—listen to me, now—don’t you know who that Fat Lady really is?... Ah, buddy. It’s Christ Himself. Christ Himself, buddy.”

There are days it feels like we are doing the Broadway equivalent of teaching—students seem to be lighting up, they’re going on to bigger and better things, they’re asking for outside reading recommendations. It is easy to feel inspired. But there are days we are off-off-Broadway—monitoring low grades and repeating ourselves in class. It is our job to see all of our students as significant, whether or not they seem special to us when we first meet them. Even if they would rather be texting, it is our job to teach to the best of our abilities.

Excellence in teaching lies in meeting the challenge of real-life classrooms, filled with students of all abilities and producing all kinds of outcomes. Excellent teaching is not just about throwing down challenges to push great students on to more greatness. We don’t work on a film set; we work in a university classroom. We are great when we are consistently excellent, whether or not our students become famous or we experience moments that have the feel of movie magic.

Stronger Global Governance is the Only Way to a World Free of Nuclear Weapons

Some of the 800 members of Women Strike for Peace who marched at United Nations headquarters in Manhattan to demand UN mediation of the 1962 Cuban Missile Crisis

It should come as no surprise that the world is currently facing an existential nuclear danger.  In fact, it has been caught up in that danger since 1945, when atomic bombs were used to annihilate the populations of Hiroshima and Nagasaki.

Today, however, the danger of a nuclear holocaust is probably greater than in the past.  There are now nine nuclear powers―the United States, Russia, Britain, France, China, Israel, India, Pakistan, and North Korea―and they are currently engaged in a new nuclear arms race, building ever more efficient weapons of mass destruction.  The latest entry in their nuclear scramble, the hypersonic missile, travels at more than five times the speed of sound and is adept at evading missile defense systems. 

Furthermore, these nuclear-armed powers engage in military confrontations with one another―Russia with the United States, Britain, and France over the fate of Ukraine, India with Pakistan over territorial disputes, and China with the United States over control of Taiwan and the South China Sea―and on occasion issue public threats of nuclear war against other nuclear nations.  In recent years, Vladimir Putin, Donald Trump, and Kim Jong-Un have also publicly threatened non-nuclear nations with nuclear destruction.

Little wonder that in January 2023 the editors of the Bulletin of the Atomic Scientists set the hands of their famous “Doomsday Clock” at 90 seconds before midnight, the most dangerous setting since its creation in 1947.

Until fairly recently, this march to Armageddon was disrupted, for people around the world found nuclear war a deeply unappealing prospect. A massive nuclear disarmament campaign developed in many countries and gradually began to force governments to temper their nuclear ambitions. The results included bans on nuclear testing, curbs on nuclear proliferation, limits on the development of some kinds of nuclear weapons, and substantial nuclear disarmament. From the 1980s to today, the number of nuclear weapons in the world decreased sharply, from 70,000 to roughly 13,000. And with nuclear weapons stigmatized, nuclear war was averted.

But successes in rolling back the nuclear menace undermined the popular struggle against it, while proponents of nuclear weapons seized the opportunity to reassert their priorities.  Consequently, a new nuclear arms race gradually got underway.

Even so, a nuclear-free world remains possible.  Although an inflamed nationalism and the excessive power of military contractors are likely to continue bolstering the drive to acquire, brandish, and use nuclear weapons, there is a route out of the world’s nuclear nightmare.

We can begin uncovering this route to a safer, saner world when we recognize that a great many people and governments cling to nuclear weapons because of their desire for national security.  After all, it has been and remains a dangerous world, and for thousands of years nations (and before the existence of nations, rival territories) have protected themselves from aggression by wielding military might.

The United Nations, of course, was created in the aftermath of the vast devastation of World War II in the hope of providing international security. But, as history has demonstrated, it is not strong enough to do the job, largely because the “great powers,” fearing that significant power in the hands of the international organization would diminish their own influence in world affairs, have deliberately kept the world organization weak. Thus, for example, the UN Security Council, which is officially in charge of maintaining international security, is frequently blocked from taking action by a veto cast by one of its five powerful, permanent members.

But what if global governance were strengthened to the extent that it could provide national security?  What if the United Nations were transformed from a loose confederation of nations into a genuine federation of nations, enabled thereby to create binding international law, prevent international aggression, and guarantee treaty commitments, including commitments for nuclear disarmament? 

Nuclear weapons, like other weapons of mass destruction, have emerged in the context of unrestrained international conflict.  But with national security guaranteed, many policymakers and most people around the world would conclude that nuclear weapons, which they already knew were immensely dangerous, had also become unnecessary.

Aside from undermining the national security rationale for building and maintaining nuclear weapons, a stronger United Nations would have the legitimacy and power to ensure their abolition.  No longer would nations be able to disregard international agreements they didn’t like.  Instead, nuclear disarmament legislation, once adopted by the federation’s legislature, would be enforced by the federation.  Under this legislation, the federation would presumably have the authority to inspect nuclear facilities, block the development of new nuclear weapons, and reduce and eliminate nuclear stockpiles.

The relative weakness of the current United Nations in enforcing nuclear disarmament is illustrated by the status of the UN Treaty on the Prohibition of Nuclear Weapons.  Voted for by 122 nations at a UN conference in 2017, the treaty bans producing, testing, acquiring, possessing, stockpiling, transferring, and using or threatening the use of nuclear weapons.  Although the treaty officially went into force in 2021, it is only binding on nations that have decided to become parties to it.  Thus far, that does not include any of the nuclear armed nations.  As a result, the treaty currently has more moral than practical effect in securing nuclear disarmament.

If comparable legislation were adopted by a world federation, however, participating in a disarmament process would no longer be voluntary, for the legislation would be binding on all nations.  Furthermore, the law’s universal applicability would not only lead to worldwide disarmament, but offset fears that nations complying with its provisions would one day be attacked by nations that refused to abide by it.

In this fashion, enhanced global governance could finally end the menace of worldwide nuclear annihilation that has haunted humanity since 1945.  What remains to be determined is if nations are ready to unite in the interest of human survival.

AI the Latest Instance of our Capacity for Innovation Outstripping our Capacity for Ethics

The eagerness with which movie and television studios have proposed to use artificial intelligence to write content collides with the concern of Writers Guild members for their employment security and pay in the latest episode of technological innovation running ahead of ethical deliberation. 

Regarding modern technology, the psychologist Steven Pinker and the economist/environmentalist E. F. Schumacher have expressed opposite opinions. In his Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018), the former is full of optimism--e.g., “technology is our best hope of cheating death”--but many decades earlier Schumacher stated that technology was “the greatest destructive force in modern society.” And he warned, “Whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.”

Now, in 2023, looking over all the technological developments of the last century, I think Schumacher’s assessment was the more accurate. I base this judgment on recent developments in spyware and Artificial Intelligence (AI). They have joined the ranks of nuclear weapons, our continuing climate crisis, and social media in inclining me to doubt humans’ ability to control the Frankensteinian monsters they have created. The remainder of this essay will indicate why I have made this judgment.

Before taking up the specific modern technological developments mentioned above, let me state our main failing plainly: the structures that we have developed to manage technology are woefully inadequate. We have possessed neither the values nor the wisdom necessary to do so. Several quotes reinforce this point.

One is General Omar Bradley’s: "Ours is a world of nuclear giants and ethical infants. If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner."

More recently, psychologist and futurist Tom Lombardo has observed that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences.”

Finally, the following words of Schumacher are still relevant:

“The exclusion of wisdom from economics, science, and technology was something which we could perhaps get away with for a little while, as long as we were relatively unsuccessful; but now that we have become very successful, the problem of spiritual and moral truth moves into the central position. . . . Ever-bigger machines, entailing ever-bigger concentrations of economic power and exerting ever-greater violence against the environment, do not represent progress: they are a denial of wisdom. Wisdom demands a new orientation of science and technology towards the organic, the gentle, the nonviolent, the elegant and beautiful.”

“Woefully inadequate” structures to oversee technological developments. How so? Some 200 governments are responsible for overseeing such changes in their countries. In capitalist countries, technological advances often come from individuals or corporations interested in earning profits--or sometimes from governments sponsoring research for military reasons. In countries where some form of capitalism is not dominant, what determines technological advancements? Military needs? The whims of authoritarian rulers or elites? Show me a significant country where the advancement of the common good is seriously considered when contemplating new technology.

Two main failings leap out at us. The first, which Schumacher observed a half century ago, is capitalism’s emphasis on profits rather than wisdom. The second--connected with that lack of wisdom--is that too many “bad guys,” leaders like Hitler, Stalin, Putin, and Trump, have wielded tremendous power with poor values.

Now, however, on to the five specific technological developments mentioned above. First, nuclear weapons. From the bombings of Hiroshima and Nagasaki in 1945 until the Cuban Missile Crisis in 1962, concerns about the unleashing of a nuclear holocaust topped our list of possible technological catastrophes. In 1947, the Bulletin of the Atomic Scientists established its Doomsday Clock, “a design that warns the public about how close we are to destroying our world with dangerous technologies of our own making.” The scientists set the clock at seven minutes to midnight. “Since then the Bulletin has reset the minute hand on the Doomsday Clock 25 times,” most recently in January of this year when it was moved to 90 seconds to midnight--“the closest to global catastrophe it has ever been.” Why the move forward? “Largely (though not exclusively) because of the mounting dangers of the war in Ukraine.”

Second, our continuing climate crisis. It has been ongoing now for at least four decades. The first edition (1983) of The Twentieth Century: A Brief Global History  noted that “the increased burning of fossil fuels might cause an increase in global temperatures, thereby possibly melting the polar ice caps, and flooding low-lying parts of the world.” The third edition (1990) expanded the treatment by mentioning that by 1988 scientists “concluded that the problem was much worse than they had earlier thought. . . . They claimed that the increased burning of fossil fuels like coal and petroleum was likely to cause an increase in global temperatures, possibly melting the polar ice caps, changing crop yields, and flooding low-lying parts of the world.” Since then the situation has only grown worse.

Third, the effects of social media. Four years ago I quoted historian Jill Lepore’s highly-praised These Truths: A History of the United States (2018): “Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” By the 1990s, she observed that “targeted political messaging through emerging technologies” was contributing to “a more atomized and enraged electorate.” In addition, social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.”

Moreover, the Internet was “easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire.” The Internet also contributed to widening economic inequalities and a more “disconnected and distraught” world. Internet information was “uneven, unreliable,” and often unrestrained by any type of editing and fact-checking. The Internet left news-seekers “brutally constrained,” and “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. So, too, did Internet-related companies that fed people only what they wanted to see and hear. Further, social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

Similar comments came from the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.” 

Fourth, spyware. Fortunately, in the USA and many other countries independent media still exists. Various types of such media are not faultless, but they are invaluable in bringing us truths that would otherwise be concealed. PBS is one such example.

Two of the programs it produces, the PBS NewsHour and Frontline, have helped expose how insidious spyware has become. In different countries, its targets have included journalists, activists, and dissidents. According to an expert on the NewsHour:

“The use of spyware has really exploded over the last decade. One minute, you have the most up-to-date iPhone, it's clean, sitting on your bedside table, and then, the next minute, it's vacuuming up information and sending it over to some security agency on the other side of the planet.”

The Israeli company NSO Group has produced one lucrative type of spyware called Pegasus. According to Frontline, it “was designed to infect phones like iPhones or Androids. And once in the phone, it can extract and access everything from the device: the phone books, geolocation, the messages, the photos, even the encrypted messages sent by Signal or WhatsApp. It can even access the microphone or the camera of your phone remotely.” Frontline quotes one journalist, Dana Priest of The Washington Post, as stating, “This technology, it's so far ahead of government regulation and even of public understanding of what's happening out there.”

The fifth and final technological development to consider is Artificial Intelligence (AI). During the past year, media has been agog with articles on it. Several months ago on this website I expressed doubts that any forces will be able to limit the development and sale of a product that makes money, even if it ultimately harms the common good. 

More recently, this month, the PBS NewsHour again provided a public service when it conducted two interviews on AI. The first was with “Geoffrey Hinton, one of the leading voices in the field of AI,” who “announced he was quitting Google over his worries about what AI could eventually lead to if unchecked.”

Hinton told the interviewer (Geoff Bennett) that “we're entering a time of great uncertainty, where we're dealing with kinds of things we have never dealt with before.” He recognized various risks posed by AI such as misinformation, fraud, and discrimination, but there was one that he especially wanted to highlight: “the risk of super intelligent AI taking over control from people.” It was “advancing far more quickly than governments and societies can keep pace with.” While AI was leaping “forward every few months,” needed restraining legislation and international treaties could take years.

He also stated that because AI is “much smarter than us, and because it's trained from everything people ever do . . . it knows a lot about how to manipulate people,” it “might start manipulating us into giving it more power, and we might not have a clue what's going on.” In addition, “many of the organizations developing this technology are defense departments.” And such departments “don't necessarily want to build in, be nice to people, as the first rule. Some defense departments would like to build in, kill people of a particular kind.”

Yet, despite his fears, Hinton thinks it would be a “big mistake to stop developing” AI. For “it's going to be tremendously useful in medicine. . . . You can make better nanotechnology for solar panels. You can predict floods. You can predict earthquakes. You can do tremendous good with this.”

What he would like to see is equal resources put into both developing AI and “figuring out how to keep it under control and how to minimize bad side effects of it.” He thinks “it's an area in which we can actually have international collaboration, because the machines taking over is a threat for everybody.”

The second PBS interview on AI that month was with Gary Marcus, another leading voice in the field. He too perceived many possible dangers ahead and advocated international controls.

Such efforts are admirable, but are the hopes for controls realistic? Looking back over the past century, I am more inclined to agree with General Omar Bradley--we have developed “our technology without wisdom or prudence,” and we are “ethical infants.”

In the USA, we are troubled by divisive political polarization; neither of the leading candidates for president in 2024 has majority support in the polls; and Congress and the Supreme Court are disdained by most people. Our educational systems are little concerned with stimulating thinking about wisdom or values. If not from the USA, from where else might global leadership come? From Russia? From China? From India? From Europe? From the UN? The past century offers little hope that it would spring from any of these sources.

But both Hinton and Marcus were hopeful in their PBS interviews, and just because past efforts to control technology for human betterment were generally unsuccessful does not mean we should give up. Great leaders like Abraham Lincoln, Franklin Roosevelt, and Nelson Mandela did not despair even in their nations’ darkest hours. Like them, we too must hope for--and more importantly work toward--a better future.

John de Graaf on his Powerful Documentary on Stewart Udall, Conservation, and the True Ends of Politics

John de Graaf and Stewart Udall

We have, I fear, confused power with greatness.—Stewart Udall

Stewart Udall (1920-2010) may have been the most effective environmentalist in our history, considering his monumental accomplishments in protecting and preserving the environment and improving the quality of life for all citizens. Unfortunately, his tireless efforts for conservation and environmental protection and his gifts as a leader are not well known to the wider public today. His life offers inspiration and a model for public servants and citizen activists, among others.

As the Secretary of the Interior from 1961 to 1969 under Presidents John F. Kennedy and Lyndon Baines Johnson, Udall took the department in new directions as he crafted some of the most significant environmental policies and legislation in our history. With his talent for forging bipartisan alliances, he spearheaded the enactment of major environmental laws such as the Clean Air, Water Quality, and Clean Water Restoration Acts, the Wilderness Act of 1964, the Endangered Species Preservation Act of 1966, the Land and Water Conservation Fund Act of 1965, the National Trails System Act of 1968, and the Wild and Scenic Rivers Act of 1968.

Secretary Udall also led the expansion of federal lands, establishing four national parks, six national monuments, eight national seashores and lakeshores, nine national recreation areas, 20 national historic sites, and 56 national wildlife refuges, including Canyonlands National Park in Utah, North Cascades National Park in Washington, Redwood National Park in California, and more. A lifelong advocate for civil rights, Udall also desegregated the National Park Service.

After his term as Secretary of the Interior, Udall continued to work for decades as an attorney advancing environmental protection, worker health and safety, human rights, tolerance, Indigenous rights, racial equality, and justice.

Despite his many achievements, Udall seems to have faded from memory and most people today know little of his monumental legacy. His name doesn’t usually leap to mind when considering the great leaders on the environment and human rights.

To remind us of Udall’s remarkable life and legacy, acclaimed filmmaker and activist John de Graaf created a new documentary, Stewart Udall, The Politics of Beauty (The film is available through Bullfrog Communities: www.bullfrogcommunities.com/stewartudall).

The film charts the trajectory of Udall’s life as it introduces viewers to a history of the origins of the modern environmental movement. There’s the journey from Udall’s childhood in Arizona, his schooling, and his World War II combat duty, to his commitment to public service, his terms in Congress, and his achievements as Secretary of the Interior. The film further recounts his later life as a zealous attorney, author, and voice for beauty, simplicity, and peace as he warned about climate change, health hazards, rampant consumerism, and the dangers of polarization and extreme partisanship. Especially engaging are interviews with Udall and his family supplemented with family films as well as scenes with JFK and Lady Bird Johnson.

The film is based on exhaustive archival research as well as interviews with historians, family members, friends and colleagues of Udall. Personal films, photographs and papers were shared with Mr. de Graaf and his team. As the life of Udall unfolds, the film provides historical context illustrated with vivid scenes from the turbulence, environmental devastation, and movements for justice and peace in the sixties and seventies. There are also stunning sequences of natural beauty from the forests, seas, deserts and other sites that Udall sought to protect.

The story of Udall’s life may provide a way forward for younger people today who are skeptical of politics and disillusioned by stasis and polarization that prevent meaningful change for a better quality of life and a more livable world. Udall’s visionary pursuit of environmental and social justice came out of his cooperative nature and his belief in democracy. May his inspiring example create hope and fire the minds of citizens today.  

Mr. de Graaf is a Seattle-based award-winning filmmaker, author, and activist. He has said that his mission is to “help create a happy, healthy and sustainable quality of life for America,” and his documentary on Stewart Udall is an aspect of that desire. He has been producing and directing documentaries for public television for more than forty years. His nearly 50 films, including 15 prime time PBS specials, have won more than 100 regional, national and international awards.

Mr. de Graaf also has written four books, including the bestselling Affluenza: The All-Consuming Epidemic. The John de Graaf Environmental Filmmaking Award, named for him, is presented annually at the Wild and Scenic Film Festival in California. He is also co-founder and president of Take Back Your Time, co-founder of the Happiness Alliance, former policy director of the Simplicity Forum, and founder of the emerging organization, And Beauty for All. 

Mr. de Graaf graciously responded to questions about his background and his Udall documentary by phone from his Seattle office.

Robin Lindley: Congratulations John on your heartfelt and vivid Stewart Udall film. I appreciate the work you do and your persistence. Every documentary film must be a long haul.

John de Graaf: Thank you. I had a team of great people to work with, so I can't take all the credit.

Robin Lindley: Before we get to the Udall film, I wanted to give readers a sense of your background. What inspired you to work now as an activist, author and filmmaker?

John de Graaf: I was an activist first, and that led me to do quite a bit of writing and print reporting. That eventually led me to do a public affairs radio show at the University of Minnesota in Duluth. Doing that, I met a character I thought would make a great film. I then connected with a videographer at the University of Minnesota in Minneapolis, and we put together a film that aired on Minnesota Public Television in 1977. The film won a major PBS award, and that launched me.

Four years later I started doing freelance documentary production at Channel Nine, the PBS station in Seattle. I was there for 31 years basically, until they kicked me out in 2014, but I've continued. My film Affluenza was a big success on PBS, so I was asked by a New York agent to write a book. Then a California publisher put out the Affluenza book, and it took off like the film. It has sold nearly 200,000 copies in 10 or 11 languages internationally.

I also made a little film called What's the Economy for Anyway? and that led to another book. I also edited a book called Take Back Your Time that was connected with research and activism I was doing about overwork in America.

Robin Lindley: Congratulations on those projects aimed at illuminating social justice and environmental issues and at encouraging work to improve the quality of our lives.

John de Graaf: Yes. The quantity of our stuff, or the gross national product, or world power, or any of those things should not be the goal. Instead, the aim should be about the best quality of life for people. I think all of these themes connect with that.

Robin Lindley: Thanks for your tireless efforts. The title of your new documentary is Stewart Udall, The Politics of Beauty. What do you mean by the politics of beauty? It seems that expression ties in with your interests in the environment and nature as well as your efforts to promote happiness and a better quality of life.

John de Graaf: I think there is a lot of evidence that our common, even universal, love for beauty, especially nature’s beauty, can bring us together and reduce polarization. It’s no accident that the most bipartisan bill passed during the Trump administration was the Great American Outdoors Act. Beautiful cities can slow us down and reduce our levels of consumption and our use of the automobile. Parks and access to nature are a more satisfying substitute for material stuff. The response to my film confirms this for me. Stewart was aware of all of this.

Robin Lindley: What inspired you to make a film now about Stewart Udall, who seems to be an overlooked champion for the environment? He's not remembered in the same way as naturalist John Muir maybe, or author Rachel Carson or Sierra Club’s David Brower.

John de Graaf: Of course, John Muir was a huge figure in his time. His writing was known by everybody, and he stirred up such a movement, but he needed political figures like Teddy Roosevelt and, later, Udall to make his dream of the national parks come true.

Rachel Carson’s book Silent Spring was very powerful, but that was what she did, and she died soon afterwards. She wasn't able to accomplish a lot without people like Udall who actually created and passed legislation. I don't mean in any way to denigrate her. She was great, and Udall loved and appreciated her. He was a pallbearer at her funeral. Her book stirred a lot of interest and attention, and people like Udall got behind it, so it had a major effect.

In terms of environmental work, David Brower was exceedingly important because he was involved in so many things, including the Sierra Club. Aldo Leopold was another key figure with lasting impact. And there have been many, many others since then. Now you'd probably have to add Bill McKibben, Gus Speth, and people like that.

Robin Lindley: It seems, however, that Udall has been overlooked or forgotten. Was that one of the reasons you wanted to do a film about him?

John de Graaf: I was impressed years ago when I interviewed him, but I'd forgotten about him until I saw a newspaper story in 2020 that said “a hundred years ago today Stewart Udall was born.” I was struck by my memory of him, and I remembered he had given me a book, so I went to my shelf and pulled down the book he had signed for me when I interviewed him.

And then I started doing a little more research, first online and then ordering biographies of him. And I thought, what a fascinating character. I knew that he had created several national parks and some things like that, and I knew that he had stopped the Grand Canyon dams because that was what I'd interviewed him about. But I had no idea about his civil rights activity, his work for world peace, his work for the arts, and his support for victims of atomic fallout and uranium miners, and so many other things that he ended up doing. That came as a complete surprise to me, and I think made the film richer.

Robin Lindley: Udall seems a renaissance man. I didn't know much about his life, and your film was certainly illuminating. What do you think inspired him to get involved in environmental protection and then in environmental and social justice issues?

John de Graaf: Number one, he did spend a lot of time outdoors when he was a kid on a farm in Arizona and hiking in the nearby White Mountains. And he got very interested in the natural world and the beauty of the natural world when he was out hiking.

And then, he grew up in a Mormon family, but it was unusual because it was a very liberal Mormon family. His father impressed on all the kids that Mormons had been discriminated against and that's why they were in these godforsaken places in the desert. They'd been pushed out of Illinois and Missouri and other places, so they had to stand up for other people who were discriminated against. That especially included Native Americans, because they lived in the area where he was, and Black Americans, and so forth.

And then, he fought in World War II. He flew on 52 very dangerous bombing missions. He was very lucky to come back alive and he said that he must have been allowed to live for some reason. He decided, “I really need to be involved in public service in the best way that I know how.”

When he came back, he played basketball at the University of Arizona, and he was very committed to civil rights. He and his brother Mo both joined the Tucson chapter of the NAACP right after the war. And they’d had Black friends in the military and Mo had been a lieutenant with a division of Black troops. And they both fought to integrate the University of Arizona.

And Stewart was especially interested in the environment and protecting the beauty of the West. Later, that went beyond conservation, beauty and preservation to a much wider view of ecology and the environment and pollution.  

Robin Lindley: Udall’s probably best known for his role as the Secretary of the Interior under JFK and LBJ. How did he come to be appointed the Secretary of Interior? What brought him to the attention of the Kennedy administration?

John de Graaf: As a congressman, he worked with Senator John Kennedy. They worked on a number of bills together in the late fifties, and he was very impressed by Kennedy.

When Kennedy decided to run for president in 1960, Stewart got behind him. Stewart was a very influential and persuasive person in Arizona at that time, though nobody knew anything about him beyond Arizona. But he was able to convince Arizona's delegation to unanimously support Kennedy for president over Lyndon Johnson at the Democratic Convention. And Kennedy appreciated that.

Kennedy was also looking for somebody who knew something about the outdoors and somebody who was a westerner because it was traditional that the Interior Secretary be a westerner. Stewart Udall was the obvious choice for Kennedy at that time.

Robin Lindley: Did Udall have a close relationship with Kennedy during his short presidential term?

John de Graaf: I think Kennedy was distant and Stewart wanted a much closer relationship than Kennedy would allow with him, or I think with anyone else. But they were friends, of course, and Kennedy supported what Stewart was doing and Stewart supported what Kennedy was doing. He felt that Kennedy had a prodigious intellect and capacity for getting things done, but he was not a person who was easy to make friends with. Stewart was actually much better friends with Jackie, Kennedy's wife. She thought Stewart was such a gentleman and a fascinating character. She liked his personality and very much liked his wife. They were friends with his family.

Stewart didn't know how Johnson would be, but it turned out that Johnson was a much more social person than Kennedy, and much easier to be with and have a friendship with. And Johnson really loved nature and was committed to environmental protection in a stronger way than Kennedy had been. A lot of that came from Johnson’s wife, so Stewart cultivated his friendship with Lady Bird Johnson, who adored him, according to Johnson’s daughters.

Udall convinced Lady Bird Johnson that she should make a name for herself in conservation by first doing a beautification campaign and then through various other work. Lady Bird took up that Beautify America campaign and became a great advocate for the environment.

Robin Lindley: Didn’t Lady Bird and Udall also share a concern about impoverished urban areas?

John de Graaf: It didn't start with the impoverished areas. It started with the idea of beautifying America. But Lady Bird and Lyndon Johnson loved the cities that they visited in Europe, and they felt that Washington was a derelict place--a mess in comparison to the other capitals of the world. It was embarrassing to bring people to the United States capital.

They felt that they had to start their campaign in Washington DC itself, and that justice compelled them to begin in the poorest communities, which were African American communities. They decided to put money first into beautifying those areas before focusing on the neighborhoods that were already gentrified.

Robin Lindley: And that approach also ties into Udall’s interest in civil rights, which you stress in your documentary.

John de Graaf: Yes. He was very interested in promoting civil rights. One of his first discoveries as Secretary of Interior was that the Washington Redskins (now Commanders) football team wouldn't hire Black players. So, he basically gave them this ultimatum that, if they wanted to play in the National Stadium, which the Department of Interior controlled, they needed to hire Black players or Udall would not lease the stadium to them. And so, they hired Black players, and that changed football immensely. In fact, the Washington Redskins became a much better team. The Washington Post even suggested that Stewart Udall should be named NFL Coach of the Year because of what he’d done to improve the team.

Udall also discovered that the National Park Service, which he was in charge of, was segregated. They had Black rangers only in the Virgin Islands, which is primarily Black. He was determined to change that. He sent recruiters to traditionally Black colleges in order to do it.

His kids told me that he would watch the civil rights protests on television. And he would say things like “Look at those brave young people. They have so much dignity.” And these young people were getting slammed, and weren't violent. They were quite the opposite, and Stewart said, “These kids are what America should be all about.” He added, “We need kids like this in the National Park Service, and the National Park Service needs to look like America.”

Bob Stanton from Texas was one of the first Black park rangers, and he went to Grand Teton. He later became the head of the National Park Service. He's a wonderful guy and I’ve gotten to know him well. Bob's 83 now, but he has the deepest memories of all that happened and Stewart Udall's role in it.

Stewart also had to decide whether the 1963 March on Washington could happen, because it was planned for the National Park areas of the Lincoln Memorial and the Washington Monument. He had to grant a permit for the march to proceed, and there was enormous pressure on him not to approve it, coming from the Jim Crow Democratic senators in the South, who were also putting huge pressure on President Kennedy.

The march happened, and it was huge, and its impact was huge. Stewart watched it from the sidelines, but you could see in the photos of the march that National Park rangers were standing right near Martin Luther King when he spoke.

Robin Lindley:  Thanks for sharing those comments on Udall’s support of civil rights. Didn’t he leave the Mormon Church because of its racist policies?

John de Graaf: He wasn’t a Mormon anymore by then, but he always claimed that he remained a cultural Mormon--that he believed in the Mormon ideas of public service, of community and family, and all those things. And Mormons did have a real ethic of serving the community in those days. Those communities were tight, and people worked together. And Stewart believed in that.

World War II really cost him his faith because he just couldn't accept that, if there was a God, God would allow the things to happen that he saw in the war. He became basically an agnostic but he did not reject the church, and he did not openly criticize the church until the mid-1960s when he became concerned about the church's refusal to allow Blacks in its priesthood.

Udall thought that was astounding and terrible, so he finally wrote a letter to the church saying there was a Civil Rights Movement and the position of the Mormon Church was unacceptable. The church basically wrote back and said that it might agree with Udall, but it didn't make those decisions. God did. Until God gave a revelation to the head of the church, things had to stay as they were.

Ten years later, God gave a revelation to the head of the church, and they changed the policy. Stewart basically was out of the church and was not considered a Mormon, but he was never excommunicated and never really disowned in any sense by the church. In fact, some of the strongest supporters of this film today are Mormons, even though it's clear about Udall leaving the church. Some evangelicals believe that former members are betrayers, but the Mormons don't take that position at all. In fact, they very much honor Udall. I just spoke at Utah State University, and a young Mormon woman came up to me after the screening and said she wanted to show this film. She said she was a board member of the Mormon Environmental Stewardship Association, and she added, “We're proud of Stewart Udall.” It was very positive to see that attitude.

Robin Lindley: Thanks for explaining that, John. I didn't quite understand Udall’s interactions with the Mormon Church.

John de Graaf: The church's view was that Stewart had honest reasons for rejecting policies and for leaving the church, and that was respected. And it did not make him a bad person. You had to decide whether he was a good or bad person on the basis of the deeds that he did, which seems a good attitude.

Robin Lindley:  Yes. And Stewart Udall had a special gift for working both sides of the aisle to pass legislation including many important environmental measures. Wasn’t the Congress of the 1960s far less polarized than what we see now?

John de Graaf: It was, particularly after Kennedy's death, but there was still a lot of fighting, and it was hard for Stewart to move things through. He certainly had some very key Republican support, but he also faced major Democratic opposition, not only from the head of the Interior Committee, Wayne Aspinall, a Colorado Democrat, but also from southern Democrats who hated him because of his civil rights positions.

But after Kennedy was killed and Johnson was elected in a landslide, Congress came together around the idea of LBJ’s Great Society programs and civil rights laws. And Johnson did a much better job than Kennedy of getting things through Congress. Then you saw the Land and Water Conservation Fund, the Wilderness Act, and the Endangered Species List--major bills that passed because Congress and Johnson supported them.

But some environmental laws didn't get passed until Nixon came in, because of the huge protests on the first Earth Day in 1970. These bills were already in Congress, and Congress moved them ahead. And when Nixon was president, he had a Democratic Congress. The bills moved ahead, but there was never a veto-proof majority except on a couple of bills like the Wild Rivers Act. Nixon, though, with the pressure of Earth Day and all the environmentalist sentiment at that time, signed the bills.

Nixon himself had an environmental sensibility. He was terrible on civil rights issues and the war, but he was much more open about the environment. He realized the impact of pollution. He had seen the Santa Barbara oil spill and the polluted Cuyahoga River. Nixon felt comfortable signing the act creating the Environmental Protection Agency.

Robin Lindley: Is it fair to say that Stewart Udall was the architect of the EPA’s creation?

John de Graaf: It's fair to say that he was certainly one of the main architects. He didn't do it alone. He had key people in Congress who were supporting him, but he certainly pushed hard for it. I don't know if the idea was originally his, but he was probably the first who talked about it, and he certainly played a major role in it.

Stewart was also the first political figure to speak about global warming. He heard about it from his scientific advisor, Roger Revelle. Revelle was an oceanographer who worked with the Smithsonian and was one of the first scientists to look at how the oceans were heating up. He said we had a problem on our hands with global warming. Stewart was talking with him on a regular basis and then decided to go public with the threat. Other politicians knew about it, but they wouldn't go public. Stewart said this was a major problem, and he predicted the flooding of Miami and New York and the melting of the polar ice cap. And he was talking about global warming in 1967.

Robin Lindley: That surprised me. He was so prescient.

John de Graaf: Yes. There were smart scientists, but most politicians wouldn't dare touch it, even though many of the signs were already there. Daniel Moynihan gave a big public speech in 1969 about global warming as a major issue. More attention was probably paid to that speech than to Stewart, because Stewart wrote about the climate in books and articles rather than in speeches.

Robin Lindley: It was interesting that, in one of Johnson's major speeches on the Great Society, he spoke about civil rights and poverty, and he decided to add a section that Stewart had suggested on the quality of life, despite objections from some politicians.

John de Graaf: The speech was written by Richard Goodwin, the president’s speechwriter. But certainly, Goodwin had to have been reading what Stewart had written for LBJ because the language was exactly the same as much of Stewart's language.

Stewart had actually written short speeches for LBJ that had that language about quality of life and beauty. He wrote that when we lose beauty, we lose much that is meaningful in our lives.

That Great Society speech was interesting because Johnson was clearly influenced by Stewart and he agreed with his views about quality of life and nature. And Johnson told Richard Goodwin to have three themes in that speech: poverty, civil rights, and the quality of life and beauty. But then he told Goodwin to share the speech with the leaders of the House and Senate and get their opinions on it because he wanted them to like it and to support it. When Goodwin did that, he found that the Democratic leaders wanted him to take out the part about beauty and quality of life and to focus on the war on poverty and civil rights because they felt that these other things would distract from the main message that the president wanted to share.

The story is that Goodwin took those sections out of the speech and passed the speech back to LBJ who read the speech before giving it. He looked at Goodwin and he said, “What the hell happened to the stuff about quality of life?” Goodwin said, “You told me to show it to the House and Senate leaders. They said I should take it out because it was a distraction from your message.” And Johnson slammed his hand on the desk and said, “They don't write my speeches. That's just as important as the other stuff. Put that back in.” So that language on quality of life ended up being part of his incredible Great Society speech.

Robin Lindley: And I was surprised that Udall was working on a nuclear test ban treaty and was very concerned about nuclear proliferation.

John de Graaf: Yes. That was under Kennedy before the Test Ban Treaty of 1963 was signed by Kennedy and Khrushchev.

In 1962, Stewart was very concerned about nuclear war. He also had been very concerned about the dropping of the bomb on Japan. He felt, even as a soldier, that it was going beyond what he believed in. He believed that it was all right to bomb military installations but he did not believe that we should bomb civilians deliberately. He accepted that civilians would inadvertently be killed, but we should never target civilians. That was simply awful and against all notions of how we fight and against the Geneva Convention.

Udall went to the Soviet Union to discuss energy issues and he took poet Robert Frost along to meet Soviet poets like Yevtushenko because he knew that the Russians loved poetry. And at that time, Americans didn't pay much attention to it. So, he took Robert Frost, and he was able to get a meeting with Khrushchev where they discussed nuclear weapons and banning atmospheric nuclear testing, which was going on in both countries at that time.

Nothing immediately came of the talks because it was actually right before the Cuban Missile Crisis. But it apparently had some influence, because once that crisis was resolved and nuclear weapons were not used, the Russians came back to the table with Kennedy and agreed to ban atmospheric testing. They were able to do that and I think Stewart had some influence, although it's impossible to say for certain.

Robin Lindley: Thanks for that insight. Udall must have been haunted by his World War II experiences. Many veterans were.

John de Graaf: Yes. Among Mormons who were in the war, the stresses pushed quite a few into becoming smokers and drinkers, which the Mormon Church didn't allow. Many came back smoking and drinking to relieve stress, and Stewart was certainly one of them, because the war was such a tragic experience.

Robin Lindley: Didn’t Udall differ with Johnson about the war in Vietnam?

John de Graaf: Big differences. Initially Stewart shared some of the worries about the spread of communism as many people did at that time. Stewart was never really a far leftist, but he was a strong liberal and he was afraid of communism or any totalitarianism, especially after fighting the Nazis.

Initially, Udall believed that maybe we should try to stop the spread of communism and help Vietnam be democratic. But that didn't last for long. Once Johnson sent the troops and Udall started seeing what was happening to the people of Vietnam, Udall changed his mind, probably as early as late 1965. He tried to get Johnson to pull back.

And Secretary of Defense Robert McNamara was a close friend of Udall's. They hiked and backpacked together. Their kids knew each other. They always liked each other very much. But McNamara's son Craig told me that he didn't know Stewart was so against the war until he saw my film. He said he always liked Stewart and thought Stewart was a wonderful guy. And his dad liked him, he said, but his dad never talked about what other people thought about the war.

McNamara completely separated his work and family life, so he would not talk at home about anything going on with other cabinet members. So, McNamara's son had no idea that Stewart was so vociferously against the war, along with Nicholas Katzenbach, Johnson’s Attorney General, and a couple of others who criticized the war at cabinet meetings and to the president. Craig McNamara wrote to me saying that he wished his dad had listened to Stewart Udall.

Robin Lindley: After Udall left his post as Secretary of the Interior at the end of the Johnson administration, he worked as a lawyer on environmental justice and human rights issues. How do you see his life after his term as Secretary?

John de Graaf: He didn't know exactly what to do in Washington. He wanted to work as a consultant to improve cities, to make them more livable. He became very critical of the automobile and our use of energy. And he saw racism tear our cities apart.

Stewart was looking for things to do, but it was not easy. What kept him in Washington was that he and his wife wanted their kids to finish high school with their friends. After the kids were grown and off to college, the Udalls moved back to Arizona, to Phoenix. It took a while for Stewart to figure out what to do there after he’d been in a position of power and influence. He was 60 years old with so much behind him.

Robin Lindley: He practiced law after his years as Secretary of the Interior and focused on social justice and environmental issues. The film notes his work with “downwinders” who were ill from radiation as well as miners who faced work hazards. What do you see as some of his important accomplishments after he moved back to Arizona?

John de Graaf: Two things: certainly, his work for downwinders and uranium miners for more than ten years was the most significant. Then in 1989, he moved to Santa Fe and did a lot of research and writing. In all, he wrote nine books, the most significant being The Myths of August, an exploration of the terrible impacts of the nuclear arms race. He loved history, and several of his books are about the history of the American West.

Robin Lindley: You obviously did extensive research for the film. Can you talk about how the project evolved and some of the archival research and the interviews that surprised you? It seems that Udall’s family and colleagues were very enthusiastic and open to sharing their perceptions with you.

John de Graaf: The Udall family was wonderfully gracious and open to me. Much of the real research had been done by Udall’s biographers, so I just picked up on that. As I talked to people, I discovered that no one would say anything negative about him; even those who disagreed with his politics had total respect for his humility and integrity. That’s not common with political figures, especially in this polarized time. I was especially impressed by current Interior Secretary Deb Haaland’s insistence that “the politics of beauty lives on.” And I was stunned by the paintings of Navajo artist Shonto Begay, a wonderful guy. I use some of his paintings in the film. I had great cooperation from the University of Arizona in finding still photos.

Robin Lindley: Congratulations John on the film and its recent warm reception at the Department of Interior with Secretary Deb Haaland, the first Native American in that role.

John de Graaf: Yes. That was a wonderful event. We had about 300 people there, and Secretary Haaland spoke and talked about Stewart.

And we are getting a very good response to the film at other screenings. My biggest concern is that it's hard to get young people to come out to see it. But when they do, they like it, like the young Mormon woman I mentioned at Utah State. And a Hispanic student at the University of Arizona who is a leader of the students’ association there wants to present screenings to get students more active in politics. I think that's the way it's going to have to happen. The screenings already turn out faculty and the older community, but they don’t turn out students. But once they see it, they do respond to it. I've been very surprised at how many students come up to me afterwards and want to talk. They tell me that they never knew about any of this history. They didn't learn about it in school. We’ve also been treated very well by the media. We’ve done fairly well in festivals, though I’m disappointed that my own hometown Seattle International Film Festival didn’t take the film.

Robin Lindley: Thanks for your thoughtful comments, John, and again, congratulations on your intrepid work to create and now display this moving cinematic profile of Stewart Udall. I learned a lot, and the film brought back many memories of the sixties, those times of exuberance and turbulence. The film not only illuminates our history, but it's also inspiring. Udall’s example offers hope for our divided nation now.

Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. Robin’s email: robinlindley@gmail.com.  


The Roundup Top Ten for May 19, 2023

I'm Headed to Florida to Teach-In Against DeSantis's Education Policies

by Kellie Carter Jackson

On May 17, historians held a 24-hour teach-in in St. Petersburg, Florida, to protest the restrictions on curriculum, books, and ideas pushed by Governor Ron DeSantis and his allies. As a historian of abolition, the author stresses that denying people the pen may drive them to pick up the sword.

Bull Connor's Police Dogs Shocked the Nation in 1963, but they were an American Tradition

by Joshua Clark Davis

"In 1963 liberal critics condemned the Alabama city’s K-9 unit as a relic of the Old South. The harder truth to accept, however, was that it was actually a product of a new America."

MLK: Christian, Radical

by Jonathan Eig

Veneration has hollowed out Martin Luther King, Jr.'s legacy, and obscured the way that his political leadership always aimed at radical transformation of American society, argues the author of an acclaimed new biography. 

If it's Ineffective and Harmful, Why is Gay Conversion Therapy Still Around?

by Andrea Ens

Conversion therapies endure because their purpose is political, not therapeutic. They seek and symbolize the eradication of LGBTQ people from society and are promoted by groups who want that eradication to happen. 

Florida Just Banned Everything I Teach

by William Horne

Black historians during the Jim Crow era observed that the history taught in schools justified slavery, segregation, and lynching. A professor thinks that's where Ron DeSantis's vision of history is headed. Some politicians may think curriculum is a winning issue, but students and society will lose. 

Texas Shooting Highlights Long History of Anti-Black Violence in Latino Communities

by Cecilia Márquez

History shows that there have long been strains of anti-Black racism in Latino communities, and that the categories "white" and "Latino" are not mutually exclusive. Understanding today's far right requires attention to those details.

The Relevance of Common Law to Abortion Debate: How Did the Law Work in Practice?

by Katherine Bergevin, Stephanie Insley Hershinow and Manushag N. Powell

Samuel Alito's ruling in Dobbs claimed to ground itself in the English common law's treatment of pregnancy. But he focused on a small number of published treatises while ignoring the record of how the law actually treated pregnant women and fetuses. 

There's Never Been a Right Way to Read

by Adrian Johns

The intellectual work and play of reading has always competed with other demands on attention; only recently have science and commerce converged to sell remedies for distraction and proprietary methods for reading. 

China is Cutting the US Out of the Middle East with an Axis of the Sanctioned

by Juan Cole

Recent American policies have squandered an opportunity to engage productively with Iran and Saudi Arabia, and have instead pushed them toward stronger economic development relationships with China.

Henry Kissinger: A War Criminal Still at Large at 100

by Greg Grandin

Henry Kissinger was instrumental in Nixon's decision to undertake the illegal bombing of Cambodia. His foreign policy machinations also led him to push Nixon to the actions that led to Watergate and the president's downfall, though Kissinger has remained unaccountable. 
