History News Network – Front Page, Wed, 07 Jun 2023

Recovering the Story of the Empress Messalina After a Roman Cancellation

From "Messaline dans la loge de Lisisca," Agostino Carracci, 16th c., depicting the rumored moonlighting of the first-century empress in a Roman brothel.



Towards the end of 48 CE a workman carried his tools down into a tomb on the outskirts of Rome. Among the rows of niches, he found the urn holding the ashes of Marcus Valerius Antiochus. He had been a hairdresser and the freedman of the empress Valeria Messalina – a fact he had been proud enough of to record on his tombstone. The workman took out his tools; his job that day was to chisel off the empress’s name.


Messalina had been killed that autumn, in the midst of a scandal that had rocked Rome. She’d been accused of bigamously marrying one of her lovers and plotting to oust her husband, the emperor Claudius, from the throne. The real reason for Messalina’s fall probably lay more in the power plays of court politics than in some grand, mad, bigamous passion, but it didn’t matter. A succession of her alleged lovers were executed, and then, fearing that Claudius might be swayed by love for his wife, an imperial advisor ordered that Messalina herself be killed before she had the chance to plead her case.


Tacitus, the great historian of Roman tyranny, recorded that Claudius hardly reacted when the news of his wife’s death was brought to him at dinner –– he simply asked for another glass of wine. Claudius seemed to want to forget completely, and the senate was willing to help him. They decreed that every trace of Messalina –– every image of her, and every mention of her name –– should be destroyed. It was only the second time in Roman history that an official order of this kind, now referred to as damnatio memoriae, had been passed. The decree applied to both the public and private sphere; statues of Messalina were dragged off plinths in town-squares and domestic atria before being smashed, or melted down, or recut. Mentions of her name were rubbed off official records, and chiselled equally off honorific monuments and hairdressers’ epitaphs.


Damnatio memoriae has sometimes been referred to as a form of ancient Roman “cancel culture,” but this was a process utterly unlike modern cancellation –– one that could not be replicated today. In the age of the internet someone might be unfollowed, their invitations to speak at official events rescinded, they might be attacked in op-eds. Their name might even become unmentionable in certain circles. But while the reach and influence of “the cancelled” might be reduced, the evidence of their existence and actions cannot be destroyed. Their government records and Wikipedia pages still record their birthdate; their tweets, however dodgy, are still cached in some corner of the internet. They can post videos of themselves crying and apologizing, tweet a glib brush-off, or publish ten-thousand-word tracts of self-justification. The cancelled might be dismissed, but they cannot be erased.


The situation was different in 48 CE. The sources of information about Roman political figures were less varied and more traceable than they are today –– and the media through which such information was disseminated, generally more smashable.


The public image of imperial women like Messalina was carefully controlled. Official portrait types were developed, copies of which were sent off to cities throughout the empire, where they were copied and recopied again for public buildings, shop-windows, private houses. These statues, along with coin types and honorific inscriptions, were designed to present Julio-Claudian women as icons of ideal Roman femininity and imperial stability. Messalina’s best-preserved portrait is almost Madonna-like – she stands, veiled, balancing her baby son Britannicus, then heir to the empire, on her hip; coins minted in Alexandria depict the empress as a veiled fertility goddess, carrying sheaves of corn that promise the prosperity of imperially protected trade routes. Such a coherent image could be destroyed almost wholesale – especially when driven by an official, central edict rather than simply by a shift in popular consensus; there is only one surviving statue of Messalina that was not discovered pre-broken by the conscientious minor officials of the mid-1st century.


So where does this leave the historian? At first glance the situation is dire –– our information about imperial Roman women is always limited, and in this case much of that information has been purposefully and systematically destroyed. On reflection, however, it is more complex; the destruction of Messalina’s images and honours created a vacuum and an opportunity.


The official narrative of the Julio-Claudian rulers, expressed in stone and bronze, was always supplemented by a secondary, ephemeral narrative of rumor. This was a period that saw politics move ever more away from the public arenas of the senate and the assembly into the private world of the imperial palace as power was ever-increasingly concentrated in the figure of the emperor. The women of the Julio-Claudian family were central to this new dynastic politics; they had access to the emperor that senators could only dream of, and all the while they were raising a new generation of potential heirs to the imperial throne. As the opacity of the new court politics encouraged ever more frenzied speculation about the private lives and intrigues of its players, much of that speculation came to center on the women.


Messalina’s dramatic and sudden fall from grace had raised questions and, in leaving her memory and reputation unprotected, the process of damnatio memoriae allowed people to propose answers. Rumours of the empress’s political and sexual conduct –– some of which may have been circulating during her life, some of which must have evolved after her death –– could now be openly discussed, elaborated upon and written about.


The result is an extraordinarily rich tangle of reality and myth. The sources are almost certainly right to accuse Messalina of orchestrating her enemies’ downfalls and deaths (no one could survive almost a decade at the top of the Julio-Claudian court without a little violence); their attribution of such plots to sexual jealousy and “feminine” passion rather than to political necessity is more suspect. Similarly, there is no reason to believe ancient writers totally unjustified in accusing Messalina of adultery; their claims that she slipped out of the palace nightly to work in a low-class brothel, or that she challenged the most notorious courtesan in Rome to a competition of who could sleep with more men in twenty-four hours (and won with a tally of twenty-five) are far more difficult to credit.


The unravelling of these stories is both the challenge and the joy of ancient history. The process is also revealing on two counts. The evaluation of these stories brings us closer to re-constructing the narrative of Messalina’s real life, her history, and her impact on her times. But even those tales that cannot be credited are of value. The stories and rumours that Rome constructed about its most powerful women when given totally free rein tell us a great deal about its contemporary culture and society –– its anxieties, its prejudices, its assumptions, and its desires. 


https://historynewsnetwork.org/article/185813
From "Shell Shock" to PTSD, Veterans Have a Long Walk to Health

"The 2,000 Yard Stare," by Tom Lea, 1944 (World War II). The Army Art Collection, U.S. Army Center of Military History



Will Robinson, an American Iraq war veteran, languished for months with depression and post-traumatic stress disorder (PTSD), all alone at home in Louisiana. One day in March 2016, he watched the movie “Wild,” starring Reese Witherspoon as Cheryl Strayed. Strayed’s book of the same title told of her redemption from despair by hiking the wilderness of the 2,650-mile Pacific Crest Trail, which runs from Mexico to Canada. Robinson decided to follow Strayed’s example, packing up a tent and supplies a month later to duplicate her journey and, he hoped, its redemptive outcome.

He had nothing to lose. Forced into the army at the age of eighteen by a judge who promised to erase his conviction for petty theft if he served, he was deployed to South Korea in 2001 and Iraq in 2003. Six months in Iraq left him with injuries to his wrist, his knee and, more significantly, his mind. The army gave him a medical discharge for PTSD, but it offered little in the way of medical treatment. He attempted suicide with drugs the Veterans Administration issued him, surviving only because the pills made him vomit. Other vets of the war on terror were not so lucky; every day, an average of twenty-two take their lives rather than endure another moment of living hell. Robinson promised his mother he would not try again. Then she died, and he retreated into loneliness and depression.

It was during that dark time that Robinson saw “Wild” and took his first, literal, step towards recovery. He may not have known that he was following the advice of a British psychiatrist, Dr. Arthur J. Brock, who had prescribed similar solutions to soldiers traumatized in the First World War. The battles between 1914 and 1918 subjected young men to the unprecedented terrors of high explosive artillery shells, poison gas, flamethrowers, rapid machine-gun fire and claustrophobia in rat-infested trenches. Growing numbers of casualties carried to field hospitals had no physical wounds. At least, not wounds the doctors could see.

The soldiers suffered nervous breakdowns. They called their malady “shell shock,” a term introduced to the medical lexicon by psychiatrist Dr. Charles Samuel Myers after he visited the front in 1915. A high proportion of the victims were junior officers, who shared the troops’ fears but also led them in futile offensives against relentless enemy fire and felt a burden of guilt for their deaths. The military needed these officers, but the war had transformed them into paralysed, trembling, stuttering, blind or deaf wrecks unable to fight or to lead.

The British government was forced to open hospitals to aid them and, more importantly, make them fit to return to battle. Dr. Brock took up his post at Scotland’s Craiglockhart War Hospital for Officers when it opened in October 1916. His belief, based on his pre-war practice with mental breakdowns, was that “the essential thing for the patient to do is to help himself,” and the doctor’s only role “is to help him to help himself.” Brock blamed modern society as much as industrial warfare for severing people from the natural world and from one another, resulting in an epidemic of mental illness. His treatment for the soldiers was the same as it had been for civilians who broke down amid the struggle for survival in harsh economic times: reconnect to the world, especially the natural world. He encouraged his patients, including the poet Wilfred Owen, to explore the wild Pentland Hills near Craiglockhart. Many joined Brock’s Field Club to study nature and restore their pre-war relationship to it.

Symbolizing his method was a drawing on his consulting room wall. It depicted the mythological wrestling match between the hero Hercules and the giant Antaeus of Libya. Antaeus, son of the earth goddess Gaia, drew his strength from his mother earth as Samson did from his hair. As long as he was touching the ground, his strength was prodigious. Realizing this, Hercules lifted Antaeus into the air and broke his back. “Antaeus is civilisation,” Brock wrote, “and Hercules is the Machine, which is on the point of crushing it.” The war machine had crushed his patients’ minds. Some of them in fact had been hurled skywards and rendered unconscious by exploding shells. Brock urged them to find peace through nature.

Will Robinson made his connection to mother earth by trekking and sleeping rough on the Pacific Crest Trail, and later on other famous routes – the Tahoe Rim, the Arizona, the Ozark Highlands, the Continental Divide, and the Appalachian. He clocked over 11,000 miles, the first African American man to do so. ESPN declared him “the trailblazing superstar of thru-hiking.” Not only did he come to understand and survive wild environments, he discovered something his life was lacking: community. “Thru-hiking has that community, and it’s why I love it so much,” Robinson told ESPN journalist Matt Gallagher. “People need to know they belong to something.”

Brock would have approved. Connecting to others, becoming part of a community, was as vital to mental health as relating to the earth. Robinson made friends on his treks and mentored younger hikers, including other veterans. He also met the woman who became his girlfriend, and they continue to wander America’s rural byways together. Robinson worked hard to traverse those miles and overcome his war demons. For other American vets, as for World War I’s shell-shocked warriors, there is no single cure for PTSD. What works for one will fail another. Robinson found his way, and it is up to the government that sent them to war to find ways for the rest to resume their lives.

Modern American veterans have one advantage over Brock’s charges. The men whom Brock aided to health had to return to the trenches. Many broke down again or were buried in what war poet Rupert Brooke called “some corner of a foreign field/That is for ever England.”



© Charles Glass 2023

https://historynewsnetwork.org/article/185703
Can the Left Take Back Identity Politics?

Members of the Combahee River Collective, 1974. Included are (back row, l-r) Margo Okazawa-Rey, Barbara Smith, Beverly Smith, Chirlane McCray, and Mercedes Tompkins;

(front row, l-r) Demita Frazier and Helen Stewart. 



The Combahee River Collective


“We were asserting that we exist, our concerns and our experiences matter,” said Black feminist activist Barbara Smith in an interview she gave almost four decades after the publication of the seminal Combahee River Collective Statement, credited as the first text in which the term “identity politics” was used. “We named that ‘identity politics’ because we said that it is legitimate to look at the elements of one’s own identity and to form a political analysis and practice out of it.”


The Combahee River Collective was a Black feminist lesbian socialist organization active in Boston from 1974 to 1980. The Collective got its name from a military expedition at the Combahee River in South Carolina planned and carried out by the abolitionist Harriet Tubman on June 2, 1863. The raid, which freed some 750 slaves, was the first military campaign in American history led by a woman. When asked to describe her work with the Combahee Collective in Boston, Smith said, “I think it was really fated that I ended up there. In Boston there's something about the size and the scale of the city that made it more possible for those of us who were like-minded to find each other.”


But the Collective's impact extended much farther than the local activist scene, thanks to its widely circulated statement of principles. Written by Barbara Smith, her sister Beverly Smith and Demita Frazier in 1977, the statement was published in 1979 in Zillah Eisenstein's anthology Capitalist Patriarchy and the Case for Socialist Feminism, and has since become one of the foundational texts of Black feminist thought:


Our politics initially sprang from the shared belief that Black women are inherently valuable. This focusing upon our own oppression is embodied in the concept of identity politics. We believe that the most profound and potentially most radical politics come directly out of our own identity ... In the case of Black women this is a particularly repugnant, dangerous, threatening, and therefore revolutionary concept because it is obvious from looking at all the political movements that have preceded us that anyone is more worthy of liberation than ourselves. We reject pedestals, queenhood, and walking ten paces behind. To be recognized as human, levelly human, is enough.


This was indeed a very different understanding of identity politics than the hollowed-out versions that dominate public debate today. First, it refused the idea of comparing and ranking oppressions, focusing instead on the particularity of each lived experience. “We actually believed that the way you come together is to recognize everyone fully for who they are,” Smith said, “as we work toward common goals of justice and liberation and freedom.” This opened the door to cooperation and coalition-building, including with those who don't resemble, or necessarily agree with, us. Second, it rejected single-issue politics by pointing to the “interlocking” nature of major systems of oppression. This was in fact the reason the Combahee statement was written in the first place: to point to the failure of the Civil Rights movement, Black nationalism and White feminism to sufficiently address the realities of Black lesbian women.


But the statement didn't prioritize the liberation of one group of people over any other, and proposed what was effectively a new model of social justice activism — foregrounding what would later be called “intersectionality.” Oppressions were multilayered and experienced simultaneously, and that required multi-issue strategies that reject a rights-only agenda. And third, the Combahee vision was unabashedly internationalist and anti-capitalist. The members of the Collective were actively involved in the anti-war movement, for they considered themselves to be, in the words of Barbara Smith, “third world women”: “We saw ourselves in solidarity and in struggle with all third world people around the globe.” Growing out of the organized Left, they defined themselves as socialists, and believed, as their statement put it, “that work must be organized for the collective benefit of those who do the work and create the products, and not for the profit of the bosses.”


Till Identity Do Us Part


But times have changed, and not for the better. A new type of identity politics was forged on university campuses, one that didn't fully grasp the connection between theory and practice, or the bread-and-butter concerns that affect all women. This narrow version “was used by people as a way of isolating themselves, and not working in coalition, and not being concerned about overarching systems of institutionalized oppression,” Barbara Smith said, expressing her discontent with the ways in which identity politics was reconfigured by the campus Left. “Trigger warnings and safe spaces and microaggressions — those are all real, but the thing is, that’s not what we were focused upon.” Like other groups of Black women who were organizing around Black feminism, Combahee was “community-activist based. Focusing on looking at real issues affecting all Black women, which includes poor Black women.”


Demita Frazier, another co-author of the Combahee statement, concurred. Part of the problem is “the commodification of everything,” including identity politics, which was completely detached from its anti-capitalist origins. This was because of the way it was co-opted by academics, she added: “I wouldn’t say co-opted if it weren’t for the fact that there’s still this big divide between practice and theory, right? I mean, I’m glad that the children and the young’uns are getting educated, but it looks like a factory to me right now.”


This brief excursion into history, and the reflections of the veteran activists of the Combahee River Collective on the legacy of their statement, provide several insights into the problems that plague current understandings of identity politics. The radical identity politics of campus activists, Diversity, Equity and Inclusion trainers and anti-racism gurus is everything that the identity politics of the Combahee River Collective is not. The new upgrade is profoundly narcissistic, and focuses on perceived individual harm at the expense of structural injustices; it establishes hierarchies of oppression by resuscitating the theological concept of “eternal sin,” which is then imputed to certain groups of people who are expected to devote a certain percentage of their daily lives to confess and repent (after all, no salvation without self-flagellation!); it interjects the term “intersectionality” here and there as a catchphrase, but treats identities as if they are fixed, insulated categories with no internal hierarchies or divisions; it disparages the idea of universal values or human rights, treating them as tools for domination invented by the powerful to maintain the status quo; it sees no allies, and it seeks no allies; it is thus “separatist,” in the sense in which Barbara Smith used the term. “Instead of working to challenge”, Smith said, “many separatists wash their hands of it and the system continues on its merry way.”


“This Bridge Called My Back”


For the Combahee women, identity politics was about politics, and identity was one way of doing politics and challenging hierarchies. For the campus Left, identity politics is about identity, and identity is beyond politics. It's a sacred value that needs to be preserved intact, at all costs. The questions of who defines a particular identity, or what causes harm, are left unanswered. In that sense, early critics of radical identity politics, Marxists and liberals alike, were right, but only partially. It's true that for the campus Left, “symbolic verbal politics” was the only form of politics that was possible. But today, even verbal politics is out of bounds. Terms are not discussed but dictated; truth, in an ironic twist, is no longer relative but absolute. Paradoxical as it may sound, the new identity politics is “anti-politics” — not only in the conventional sense of alienation from or distrust in mainstream politics but also in the broader sense of how we understand “the political,” as a space of contestation. The current obsession with privilege closes up that space, ruling out the possibility of dialogue and building alliances. In such a scheme, anyone who criticizes dominant progressive orthodoxies is branded as a “useful idiot,” advancing or unwittingly enabling a right-wing agenda. White progressives, Black conservatives, centrists or bona fide liberals are considered to be more harmful to the cause of social justice than explicitly racist modern-day Ku Klux Klanners. It may well be so. But what does this mean, politically speaking? Are we not supposed to reach out to fellow progressives or, indeed, regular people, and explain to them that in a society built on White values, colorblindness may not be the best way to achieve racial equality? And if we cannot even speak to the progressives, how are we going to convince the conservatives, reactionaries, or overt racists who still constitute a substantial part of any given society?


The Combahee women who coined the term identity politics knew the answer to these questions because they were doing political work and consciousness-raising in the real world, with women of all colors and walks of life, not peddling virtue in sterilized boardrooms or slick vodcasts. They were guided by the motto “This Bridge Called My Back” (which later became the title of a groundbreaking feminist anthology edited by Cherríe Moraga and Gloria E. Anzaldúa), which they saw as the key to success. “The only way that we can win — and before winning, the only way we can survive,” said Barbara Smith, “is by working with each other, and not seeing each other as enemies.”

https://historynewsnetwork.org/article/185812
Ayn Rand's Defense of an Anti-Union Massacre

Photo from records of LaFollette Committee, National Archives and Records Administration



In July 1943, former Hollywood screenwriter Ayn Rand was still tracking responses, critical and commercial, to her first major novel, The Fountainhead.  It had been published two months earlier by Bobbs-Merrill after being rejected by a dozen other companies.   Rand had written two previous novels, along with two stage plays, none of which proved successful.  Now The Fountainhead was off to a slow start with audiences and reviewers.


While this was transpiring, Rand received in the mail a set of galleys for the memoir (eventually titled Boot Straps) by Tom M. Girdler, chairman of Republic Steel, which operated several massive plants in the Midwest and Pennsylvania. Many Americans had probably already forgotten the most tragic incident that Girdler was associated with, almost exactly six years earlier.  If Rand was among them, her memory (and high estimate of Girdler) was surely revived in reading those galleys.  Soon she would model a key character in her most famous novel, Atlas Shrugged, partly on Girdler.


Near the end of May 1937, workers who had been on strike for several days at Republic Steel in Southeast Chicago had called for a Memorial Day picnic on the wide open field several blocks from the plant entrance to build support.  Tom Girdler wouldn’t even recognize the union, famously vowing that he would retire and go back to growing apples before he’d do that.  At least 1500 workers and family members, including many women and children, turned out for the picnic.   After the festivities, organizers called on the crowd to march to the gates of the plant where they might establish a mass, legal, picket. 


Halfway there, the marchers, at least 500 strong, were halted by a large contingent of Chicago police and ordered to disperse. A heated discussion ensued. A few rocks were thrown in the direction of the police. Suddenly, some of the police drew their pistols and opened fire on the protesters at point-blank range, and then as the marchers fled. They chased after the survivors, clubbing many of them.


Forty in the crowd were shot, with ten dead within two weeks. Dozens of the survivors were arrested and lifted into paddy wagons without medical attention.  Only a handful of police required treatment for minor injuries.  


Despite these one-sided results, local and national newspapers, right up to The New York Times and Washington Post, almost uniformly portrayed the marchers as a “mob” intent on rioting—that is, as the perpetrators of this tragedy.   Some falsely suggested that the unionists fired first. 


The only footage of the incident is quite graphic, showing the police shooting and then clubbing marchers; it was suppressed by Paramount News, a leading newsreel company. 


Then the Progressive Party senator from Wisconsin, Robert LaFollette, Jr., convened a sensational three-day hearing into the tragedy. The Paramount footage was screened in its entirety – and then in slow motion – providing more proof of police malfeasance. It emerged that Republic Steel had collaborated with police on this day, allowing them to set up headquarters inside their plant and supplying them with tear gas and axe handles to supplement their billy clubs.


When the LaFollette committee released its report (most of it, along with witness testimony, printed for the first time in my new book on the Massacre), it harshly criticized the police: “We conclude that the consequences of the Memorial Day encounter were clearly avoidable by the police. The action of the responsible authorities in setting the seal of their approval upon the conduct of the police not only fails to place responsibility where responsibility properly belongs but will invite the repetition of similar incidents in the future.”


Ayn Rand clearly did not agree.  On July 12, 1943, she typed a five-page letter to Republic boss Girdler after reading his galleys.  “Allow me to express my deepest admiration for the way in which you have lived your life,” Rand wrote from New York City, “for your gallant fight of 1937, for the courage you displayed then and are displaying again now when you attempt a truly heroic deed—a defense of the industrialist….”  Then she offered to send him a copy of her novel.


“The basic falsehood which the world has accepted is the doctrine that altruism is the ultimate ideal,” she related.  “That is, service to others as a justification and the placing of others above self as a virtue.  Such an ideal is not merely impossible, it is immoral and vicious.  And there is no hope for the world until enough of us come to realize this.  Man’s first duty is not to others, but to himself…


“I have presented my whole thesis against altruism in The Fountainhead….Its hero is the kind of man you appear to be, if I can judge by your book, the kind of man who built America, the creator and uncompromising individualist.”


But Rand also admitted that “it shocked me to read you, a great industrialist, saying in self-justification that you are just as good as a social worker. You are not. You are much better. But you will never prove it until we have a new code of values.


“You had the courage to stand on your rights and your convictions in 1937, while others crawled, compromised, and submitted.  You were one of the few who made a stand.  You are doing it again now when you come out openly in defense of the industrialist.  So I think you are one of few men who will have the courage to understand and propagate the kind of moral code we need if the industrialists, and the rest of us, are to be saved.  A new and consistent code of individualism.” 


She concluded the letter “with deep appreciation for your achievement and that which you represent.”


Girdler replied on July 27, 1943, that he had just purchased The Fountainhead. A few months later, he met Rand in New York and told her that he had read and enjoyed the novel, which pleased her immensely, and he suggested they meet for lunch.


This apparently did not take place, but she would, a short time later, create one of the key characters in Atlas Shrugged, troubled steel industrialist Hank Rearden, based partly on Girdler.



Greg Mitchell’s new film Memorial Day Massacre: Workers Die, Film Buried, premiered over PBS stations in May and can now be watched by everyone via PBS.org and PBS apps.  He has also written a companion book with the same title.  He is the author of a dozen previous books.


https://historynewsnetwork.org/article/185782
The Power of Dependency in Women's Legal Petitions in Revolutionary America (Excerpt)

James Peale, "The Artist and His Family," 1795



Historians have spent decades investigating whether the American Revolution benefited women or provoked changes in women’s status. By and large, white women’s traditional political rights and legal status remained relatively stagnant in the wake of the American Revolution. In some ways, women’s legal status declined over the course of the long eighteenth century. Certain women’s private lives, however, did see some important shifts, especially in regard to family limitation and motherhood. Importantly, the Revolution politicized some women who participated in boycotts, contributed to and consumed Tory and Whig literature, and even acted as spies or soldiers themselves during the war. Women also carefully negotiated their political positions to manage the survival and safety of their families. In the postwar period, elite white women gained greater access to education, though ultimately in service of raising respectable republican sons and their worthy wives. In many ways, however, the lives of American women looked much the same in the postrevolutionary period as they had prior to the war. Despite Abigail Adams’s threat to “foment a rebellion” if women were not included formally in the new American body politic, there would be no great women’s revolution in the late eighteenth and early nineteenth centuries.


Asking whether the Revolution benefited women or brought meaningful changes in their social, legal, and economic statuses, however, cannot fully illuminate the war’s impact on women’s lives. In some ways, this framework is both anachronistic and problematic. Constructing our queries in this way asks too much from a historical period in which inequality and unfreedom were so deeply embedded in patriarchal law, culture, and society as to render such a sea change unlikely at best. Likewise, this line of inquiry presumes that revolutionary-era women collectively desired what first- and second-wave feminists sought for themselves. It also judges the consequences of the Revolution for women from a set of expectations codified as masculine. Certainly, there were a few noteworthy women who sought rights and freedoms for which liberal feminists of the nineteenth and twentieth centuries fought, but the Abigail Adamses, Mercy Otis Warrens, and Judith Sargent Murrays of the American revolutionary era were few and far between.


This long scholarly conversation about whether the American Revolution was centrally a moment of change, stagnation, or decline in women’s lives has framed many historical investigations from the wrong perspective. Ironically, we have been studying patriarchal oppression, resistance to it, and attempts to overcome it from a patriarchal standard all along. We must seek to understand the impact of the American Revolution on women’s lives by framing our inquiry around women’s own worldview, their own needs, aspirations, and desires, even when doing so is uncomfortable to our modern sensibilities. What function did the Revolution serve in women’s lives? How did women interpret the rhetoric of the Revolution? How did they make the disruption and upheaval of this historical moment work to their advantage, with the tools already at their disposal? How did they use the apparatus of patriarchal oppression—namely, assumptions of their subordination and powerlessness—to their advantage? What did they want for themselves in this period, and were they able to achieve it? When the impact of the Revolution is investigated with this shift in perspective, we are able to observe the ways in which women’s individual and collective consciousness changed, even if the Revolution was not radical enough to propel them from their unequal station in American society.

In Dependence asks these questions from a regionally comparative and chronologically wide-ranging perspective, focusing on three vibrant urban areas—Boston, Massachusetts; Philadelphia, Pennsylvania; and Charleston, South Carolina—between 1750 and 1820, or what I refer to broadly as the “revolutionary era.” These three cities serve as ideal locations for a study of early American women’s experiences because their laws, social customs, and cultures varied significantly. Boston, Philadelphia, and Charleston were three of the most populous cities in the American colonies and, later, the early republic, which provided inhabitants with access to burgeoning communities as well as the growing marketplaces of goods, printed materials, and ideas. Massachusetts’s, Pennsylvania’s, and South Carolina’s laws regarding marriage, divorce, and property ownership (and thus their demarcation of women’s rights and legal status) all differed a great deal during this period. I chose to focus my study on urban as opposed to rural areas so as to include in this work impoverished communities, whose members often turned for assistance to city almshouses and other local organizations. Women in each of these three cities had the opportunity to petition their state legislatures for redress, yet because of their varying experiences and racial and class identities, they did so for different reasons, with different access to seats of patriarchal power, and certainly with different outcomes.


The revolutionary era was a period in which ideas about the meanings of independence, freedom, and individual rights were undergoing dynamic changes. Dependence was a fact of life in colonial British America, defining relationships ranging from colonial subjects’ connections to the king to wives’ unions with their husbands. Both parties in these relationships had power—even dependents—and these relationships required a set of mutual obligations. Thus, dependence was not an inherently impotent status. The meaning of dependence shifted, however, with the adoption of the Declaration of Independence. Dependence ceased to be a construct with positive connotations in the American imagination, and likewise became imbued with a sense of powerlessness. The newly independent United States required the allegiance of its people, and adopted the concept of voluntary citizenship rather than involuntary subjectship. Accordingly, the law recognized women’s personhood and, to a certain degree, their citizenship, but it also presumed their dependence, which codified them as legally vulnerable and passive. Dependence, then, became highly gendered, and feminized. Women’s dependent status was likewise contingent on their socioeconomic status, their race, the legal jurisdiction in which they resided, and their relationship to men in power.


Importantly, dependence must not be observed as the ultimate foil to independence. These terms are not strictly dichotomous, but exist on a fluid spectrum. Situated on this continuum, women firmly asserted their dependence while expressing the “powers of the weak.” While a traditional understanding of “power” implies some form of domination of one party over another through possession, control, command, or authority, this conception obscures the meaning of the word itself while also negating the exercises and expressions of power that do not conform to these standards. If power is also understood as existing on a fluid spectrum, then, an analysis of women’s invocation of the language of dependence in their petitions to state legislatures, courts, local aid societies, and their communities becomes much different.


Notions of power and freedom in early America were contingent upon a person’s intersectional identities. Wealthy, white male enslavers, for example, had different understandings and experiences of freedom than did the Black women they enslaved, and because of the legal structure of the patriarchal state, these white male enslavers held a great deal of power over unfree, enslaved Black women. Like dependence and independence, freedom and unfreedom existed on different ends of the same spectrum. Race, gender, class, religion, region, status of apprenticeship, servitude, or enslavement, and other elements of an early American’s identity shaped their relationship to freedom and unfreedom. Notably, this continuum was deeply hierarchical. Even if enslaved women earned or purchased their legal freedom from the institution of slavery, that free status was still tenuous, as was the free status of any children they bore. Likewise, enslaved women would have viewed freedom differently than their white counterparts. Black women in particular often defined freedom as self-ownership, the ability to own property, to profess their faith freely, and to ensure freedom for their families. Freedom for many enslaved people was a matter of degrees, a game of inches, a process of constant negotiation for small margins of autonomy and independence in an otherwise deeply oppressive system. Even if they obtained documentation that declared them legally free from the institution of slavery, that did not guarantee their perpetual freedom, and it certainly did not grant them equality under the law; that freedom—even if it existed on paper—was tenuous. Additionally, American freedom did not evolve and expand in a teleological manner; in many cases, even in the revolutionary era, freedoms devolved and disappeared for certain marginalized groups of Americans. We must always consider the ways in which Americans’ experiences of their freedoms were not (and in many ways, still are not) equal.


Black women experienced multiple, layered dependencies that were compounded by their race and gender, and especially by the existence of the race-based system of chattel slavery that relied on Black women’s reproductive capacity to enhance the power of white patriarchs. Black women, therefore, were not endowed with the same legal protections, rights, and privileges as their white contemporaries were. Engaging with the sympathies of white patriarchs, for example, was not a functional or effective strategy for Black women, as it was for white women. In order to fully understand how Black women exploited the terms of their intersectional dependencies, then, we must examine the unique experiences of Black women from within these interlocking systems of oppression. The notion that women could—and can still—express power because of their subordinate status and the protection it offers indicates that women have never been completely powerless. Like other historically marginalized groups or individuals, women have been able to express a degree of power, autonomy, and agency over their own lives while still being overtly suppressed by a controlling authority. Thus, dependents expressed power in a variety of ways, including more subtle means such as claiming a public voice or becoming politically active via the submission of petitions. What is especially significant, however, is not that women found power through petitioning various authorities but that they found power in this way through public declarations of their dependent, unequal, and subordinate status.


This excerpt from In Dependence: Women and the Patriarchal State in Revolutionary America is published by permission of NYU Press. 

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185816 https://historynewsnetwork.org/article/185816 0
Comparing the Trump-DeSantis Race to the Republicans’ 1912 Debacle is a Stretch... Right?

Leonard Raven-Hill, "For Auld Lang Syne" from Punch, 1912



And they’re off. With just a year and a half to go, Ron DeSantis has finally thrown his hat into the ring. Now the race for the GOP nomination truly begins. Nikki Haley, Asa Hutchinson, and a variety of other runners and riders are there but, for most people, this is a two-horse race between Donald Trump and DeSantis. This potential head-to-head has been a long time coming, and some think that DeSantis has left it too late. DeSantis was well ahead of Trump in GOP opinion polls shortly after the 2022 midterms, but now Trump has a commanding lead. However, we shouldn’t forget that a lot can change between now and November 2024.

Let’s go back a little, to see how polling looked for Trump in 2011 and 2015 (around a year and a half out from the presidential elections of 2012 and 2016).

In April 2011, Trump led a poll of GOP primary voters with 26 percent, more than ten points ahead of eventual nominee Mitt Romney. Much of this “Trump bump” was linked to his high-profile “birther” campaign demanding to see President Obama’s birth certificate. However, once Obama produced the document, Trump’s numbers swiftly dissipated, and he decided not to run. Conversely, in June 2015, Trump polled just 1 percent, leaving him in eleventh place out of 16 candidates in a survey of GOP primary voters. Of course, Trump looks to be in a much stronger position at the end of May 2023, with polls of primary voters putting him at over 50 percent and DeSantis trailing with around half of that. But remember, Trump won the nomination in 2016 despite polling at only 1 percent at the same stage of the nomination campaign, while four years earlier his numbers collapsed even after he had built a significant lead.

So, let’s imagine, just for argument’s sake, that DeSantis stays the course. We get a broad slate of candidates (as we did in 2015-2016), and Trump isn’t the outright winner come the Republican National Convention. Let’s stretch our imaginations even further to see the GOP Convention tightly contested so that, in the end, DeSantis gets the nomination by the narrowest of margins. Trump, spurned, storms out and decides to run independently under the banner of the “Truth Party.” Come November, Trump picks up a number of states he won in 2020, and DeSantis takes Florida and a handful of flyover states for the GOP. Meanwhile, Biden wins by a mile, as the divided Republican vote lands him easy wins in Pennsylvania, Georgia and Ohio, and he even sneaks by in Texas. It’s an outlandish scenario, but for those of you with a long memory, it’s not quite the frantic fever-dream of a teacher overcome by too much grading that it might seem. I take you back to 1912….

In February 1912 – election year – Theodore Roosevelt (Republican president from 1901-1909) formally challenged the incumbent Republican President William Howard Taft for the GOP nomination. In Roosevelt’s mind, he had made Taft’s career; TR had appointed Taft to his cabinet while president and handpicked Taft as his successor. Roosevelt campaigned for Taft in 1908, had photographs taken with him, went the whole nine yards. In many ways TR felt he won the election for Taft. Yet, while in office, Taft disappointed Roosevelt. Always a larger-than-life personality, TR was never really going to favor the quiet path of retirement.

The 1912 Republican nomination campaign turned nasty. Roosevelt launched several stinging attacks on Taft. Taft, a TR true-believer and former friend, was wounded and was slow to resort to the same sort of name-calling as Roosevelt. The stage was set for a close race, and the result went down to the wire. That June, though, Taft wrested the nomination from Roosevelt at the convention. Roosevelt cried foul play, a corrupt stitch-up! He stormed out of the convention and weeks later ran a third-party “Bull Moose” campaign under the banner of the Progressive Party.

The 1912 election became a three-horse race – though special mention should go to five-time presidential candidate for the Socialist Party, Eugene Debs, who received over 900,000 votes (from a total vote of just over 15 million). Roosevelt won 88 Electoral College votes, including large states like Pennsylvania, while Taft got the measly 8 votes of Utah and Vermont. Between them, the erstwhile allies got over 50 percent of the popular vote, but the Electoral College saw Democrat Woodrow Wilson win in a landslide, with 435 votes out of 531.

As ever with these historical parallels, there are innumerable other variables that don’t mirror the past anywhere near as well. However, this comparison is not so much aiming to suggest that 2024 might be a full repeat of 1912, as to offer a glimpse into the danger that a full split could cause for the GOP if DeSantis and Trump really did divide the vote come November 2024.

Trump and DeSantis started as allies. Many, including Trump, felt that Trump’s backing won the Florida gubernatorial race for DeSantis in 2018. DeSantis appeared to be a Trump true-believer. Trump did not want to retire quietly, and he doesn’t seem to like how DeSantis has “betrayed” him. They are now running against each other for the nomination, and Trump has been criticizing “Ron DeSanctimonious” for months, while DeSantis has remained largely passive in his response. There is an echo of the past here for sure, even if it’s a faint one thus far.

However, if things were to run their course, and the nomination looked like it would be decided at the Convention… if the remaining court cases against Trump were to go against him, and the Republican Party threw its weight behind DeSantis to narrowly deny Trump the nomination… then it does not seem quite so far-fetched that Trump could run as an independent. Maybe, just maybe, 2024 might see more echoes of 1912 when it arrives. If so, then President Biden will no doubt be happily ordering copies of James Chace’s 1912 or Lewis Gould’s Four Hats in the Ring, and merrily assessing his chances in a three-horse race come next November.

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185814 https://historynewsnetwork.org/article/185814 0
A Trip Through the Mind of Vlad the Conqueror: A Satire Blending Imaginary Thoughts with Historical Facts



Striding masterfully through St. George’s Hall of the Grand Kremlin Palace, Vlad the Conqueror pondered his role as a Man of Destiny.

“It’s not easy to measure up to the past leaders of Russia,” he brooded.  “Ivan the Terrible and Peter the Great slaughtered enormous numbers of people at home and abroad in building the largest nation on earth.”  Stalin, too, he noted, “showed the world what could be accomplished by a strong man with an unrelenting will to power.”  After all, Stalin “succeeded in murdering millions of Ukrainians through starvation, gobbling up portions of Eastern Europe through an alliance with Nazi Germany, smashing Hitler’s legions after the führer broke with him, and pushing Russian domination right into Central Europe during the early Cold War.  Now those were real men!”

Frowning, he added: “Of course, the Russian empire went downhill after that.  Stalin’s namby-pamby successors fumbled along, trying to hold it together through invasions of Hungary, Czechoslovakia, and Afghanistan.  And then, Gorbachev”―Vlad spat out the name―“that traitor, he wanted Russia to behave like a normal nation.  But it’s not a normal nation,” Vlad told himself heatedly.  “It’s a great nation.  And great nations require great leaders!”  Pausing briefly, he regarded himself, fondly, in a diamond-encrusted mirror.

“And look at what I’ve already accomplished in restoring our nation’s grandeur―not only rebuilding our military forces and arming them with new nuclear weapons, but using Russian military power to obliterate Chechnya, secure footholds in Georgia and Moldova, annihilate resistance to Assad’s dictatorship in Syria, and launch Russian mercenary forces throughout Africa.”

He stopped and smiled.  “But the highpoint so far is surely my invasion of Ukraine.  I’ve leveled its cities, massacred many thousands of Ukrainians, sent millions more fleeing as refugees, and annexed its land into Russia.  As my long-time friend, Donald Trump, remarked of the invasion, ‘this is genius’!”  Pausing before another mirror, he again admired his profile.

“Alas,” he conceded, “not everyone recognizes greatness when they see it.  In the wake of my glorious invasion of Ukraine, 141 nations at the UN General Assembly voted to condemn it, though four wise nations did give it their support:  North Korea, Syria, Belarus, and Eritrea.  At home, too, many thousands of Russian subversives―betrayers of their Motherland (and of me!)―demonstrated and signed petitions against the war.  Fortunately, we’ve already arrested about 20,000 of them.  Also, perhaps a million Russians, losers all, fled to other lands.”  He groaned wearily.  “Well, they won’t be missed!”

“Furthermore, abroad, where I’m gratified to learn that I have many fans among rightwing and leftwing zealots, public opinion has turned against me.”  Vlad scratched his head in dismay.  “Even those segments of the Western peace movement that back my policies don’t seem to ‘get it.’  One busy bee who writes and speaks incessantly about the war in Ukraine almost completely ignores my role in it.  Instead, she chalks up the conflict to the policies of the United States and NATO.  Don’t I get any credit for anything?!”  He shook his head sadly.

“And then, of course, there are the damned Ukrainians who, instead of welcoming our invasion, destruction, and occupation of their country, are resisting!  This is surely another sign that they are unfit to govern themselves.”  He concluded, morosely:  “What a mess!”

“Yes, life is unfair to me,” Vlad sighed, as warm tears suddenly appeared and rolled lazily down his cheeks.  “And it has been for some time.”

He ruminated: “Things are not so easy when you’re a little guy―only 5 feet, 6 inches tall―in a big man’s world.  Peter the Great, a hero of mine, measured 6 feet, 8 inches.  So he certainly had an advantage there!  Also, on top of that, my puberty came late. To keep from being bullied by the other boys, I took up judo and, at the age of 19, became a black belt.  Then,” he laughed, “I joined the KGB, and people soon learned not to mess with me or with my new circle of friends.” 

“Naturally, as I moved up the Russian government hierarchy, I became known for my tough, masculine style and approach―riding bare-chested, muscles rippling, on horseback, imprisoning uppity women, and making even the mention of homosexuality punishable by imprisonment.  And I saw to it that my political opponents were packed off to prison camps―at least when they didn’t develop the nasty habit of getting poisoned or falling out of windows.”  Pounding his fist on a table inlaid with gold and ivory, Vlad chortled at his wit.

“Some say that I’m a cold person.  Actually, though, I can be warm and accommodating when it’s useful in forging friendly relationships with other great leaders―men of power like Xi Jinping, Donald Trump, Kim Jong Un, and Saudi Prince Mohammed bin Salman.  In 2018, when bin Salman was being snubbed by other government leaders at the G20 summit for ordering the dismemberment of Jamal Khashoggi, a dangerous journalist, I went right over to the prince and we exchanged joyful high-fives.  We’ve been great pals ever since.”

Smiling, Vlad remarked: “None of them, of course, has my sophisticated grasp of international relations, and they will ultimately recognize my superior wisdom as my mastery of world affairs and my power grow ever greater.  Even now, though, they are turning to me for leadership.”  Spotting another mirror, he gazed lovingly at his splendor.

Standing tall and throwing back his shoulders, he proclaimed:  “Yes, I’m no longer Little Vlad.  I’m the supreme commander of the biggest country on earth.  And, under my rule, it’s growing even bigger.  Today I am Vlad the Conqueror!  Look on my works, ye mighty, and despair!”

Then, glancing about the vast, ornate hall, he muttered: “Now where the hell is my Viagra?  Where did I put it?”

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185815 https://historynewsnetwork.org/article/185815 0
Trump Poised to Join Short List of 3-Time Presidential Nominees



Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency (Rowman & Littlefield Publishers, 2015, Paperback Edition 2017).



As the presidential campaign of 2024 becomes the center of public attention, former president Donald Trump seems far ahead in the battle for the Republican presidential nomination; if he does win, Trump will join a select group of presidential nominees who have been on the ballot three or more times.

All by himself as the only four-time candidate is Franklin D. Roosevelt, who was the nominee of his Democratic Party in 1932, 1936, 1940, and 1944.  After World War II, the move for a constitutional amendment to limit presidential longevity in office to two elected terms (or a total of ten years if succeeding to the office) was accomplished with the 22nd Amendment, which took effect beginning with the presidency of Dwight D. Eisenhower. FDR also had the unique distinction of being on the presidential ballot as a vice presidential running mate in the failed presidential campaign of Democrat James Cox in 1920.

Two three-time presidential nominees failed to be elected despite multiple attempts.  Henry Clay was on the ballot in the presidential elections of 1824, 1832, and 1844, and was a contender in 1840 and 1848. The winning Whig candidates in those years (William Henry Harrison and Zachary Taylor) died early in their terms of office. William Jennings Bryan was the nominee of the Democratic Party in 1896, 1900, and 1908 and was bandied about as a possible nominee in 1912, before he threw his support to Woodrow Wilson, who went on to win two terms in the White House.

Thomas Jefferson and Andrew Jackson both lost their first bids for the presidency in 1796 and 1824. In times of constitutional crisis and division, they were defeated respectively by the father and son John Adams and John Quincy Adams. Jefferson and Jackson would defeat their Adams nemeses in the next elections, and each served two terms in the presidency. 

Martin Van Buren, Jackson’s second-term vice president, would be the last vice president to succeed to the presidency by election (1836) until 1988, when George H. W. Bush succeeded Ronald Reagan. But Van Buren lost the election in 1840, and then ran as the candidate of the Free Soil Party in 1848, winning ten percent of the national popular vote.  If one counts his being on the ballot with Jackson in 1832, Van Buren was on the ballot more often than anyone except FDR.

Grover Cleveland was on the ballot three times, winning the popular vote all three times (1884, 1888, 1892), but losing the Electoral College in 1888 to Benjamin Harrison (whom he then defeated in 1892).  If Donald Trump ends up as the Republican nominee against Joe Biden, this would be the first such scenario of a rematch since 1892.

The final example of a three-time nominee was Richard Nixon, who lost to John F. Kennedy in 1960, but came back as the successful Republican nominee in 1968 and 1972.  He joined only Thomas Jefferson and Andrew Jackson as candidates who had lost, and then came back to win two terms as president.

So Donald Trump might join a short list of three-time nominees, but his situation is also unique: he is the only president to have lost the national popular vote twice (although he was elected president in 2016). Among three-time nominees, he would join only Thomas Jefferson (who ran before the popular vote was a factor in elections) and the three-time losers Henry Clay and William Jennings Bryan in appearing on the ballot three times without ever winning the national popular vote.

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/blog/154756 https://historynewsnetwork.org/blog/154756 0
The "Critical Race Theory" Controversy Continues

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/177258 https://historynewsnetwork.org/article/177258 0
The Right's Political Attack on LGBTQ Americans Escalates

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/182937 https://historynewsnetwork.org/article/182937 0
Mifepristone, the Courts, and the Comstock Act: Historians on the Politics of Abortion Rights

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/181169 https://historynewsnetwork.org/article/181169 0
The Roundup Top Ten for June 2, 2023

Determined to Remember: Harriet Jacobs and Slavery's Descendants

by Koritha Mitchell

Public history sites have the potential to spark intellectual engagement when they make embodied connections between people and the sites they visit—even when those connections evoke the cruelty of the past. 


Commemoration of the Tulsa Massacre Has Put Symbolism Over Justice for the Victims

by Victor Luckerson

"The neighborhood’s historical fame has become a kind of albatross slung over Black Tulsans’ necks, as efforts at building concrete pathways toward justice are buried under hollow symbolism."



Dodgers' Controversial Invite to "Drag Nuns" Group Highlights Catholics' Selective Sense of Faith

by Kaya Oakes

Catholic groups expressing outrage at the team's recognition of the Sisters of Perpetual Indulgence overlook the centrality of mercy in the Gospels. 



Will the Debt Ceiling Deal Derail Environmental Justice?

by Robert Bullard and Larry Shapiro

The idea of permitting reform—easing the environmental constraints on building new energy infrastructure—has been a bargaining chip in the debt ceiling negotiations. Reforms could help bring a green energy grid online, but they could also put more polluting industry in poor and minority communities. 



The Right to Dress However You Want

by Kate Redburn

New anti-transgender laws should prompt a legal response, but they also require a fundamental recognition: laws prescribing gendered dress codes infringe on everyone's freedom of expression. 



When Witches Took Flight

by Chris Halsted

The modern association of witches and flight in fact emerged from a relatively obscure corner of medieval church writings that gained prominence amid contemporary anxieties about women's political influence. 



Why is the American Right so Thirsty for Generalissimo Franco?

by David Austin Walsh

Increasingly "respectable" conservative intellectuals are openly advocating for a dictator to enforce cultural traditionalism as part of a battle to control the politics of elite institutions.



Amid Anti-Woke Panic, Interdisciplinary Programs Inherently Vulnerable

by Timothy Messer-Kruse

Because standards of academic freedom like those of the AAUP tie that freedom to expertise within recognized professional communities of scholars, those doing interdisciplinary work and working in programs like ethnic studies have less institutional protection against political attacks. 



Can We Solve the Civics Education Crisis?

by Glenn C. Altschuler and David Wippman

Universal schooling created the potential for a unifying civic curriculum that, paradoxically, has been the subject of perpetual disagreement regarding its contents. A recent bipartisan roadmap for civics education that makes those disagreements central to the subject matter may be the only way to move forward. 



WGA Strike Latest Example of Cultural Workers Joining Together as Entertainment Technology Changes

by Vaughn Joy

The development of television and online content has historically forced multiple Hollywood unions to join forces to secure a share of the returns of new technology or risk being frozen out entirely. 


Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185811 https://historynewsnetwork.org/article/185811 0
The Modern Relics in Crow's Cabinet of Curiosities

Senator Sheldon Whitehouse (D-RI) points to a painting commissioned by Harlan Crow depicting a meeting at Crow's vacation retreat including Federalist Society head Leonard Leo (2nd from left), Justice Clarence Thomas (2nd from right) and Crow (far right)



Who is Harlan Crow? As questions mount about Supreme Court Justice Clarence Thomas’s alleged failure to disclose significant gifts (and attendant concerns about his integrity multiply), his principal benefactor has achieved a certain, curious fame. Until recently Harlan Crow, despite his enormous wealth and influence, remained a relatively obscure Dallas billionaire. Now, many want to know why he has lavished so many gifts on Justice Thomas, including a Bible once owned by the great abolitionist Frederick Douglass, an invaluable piece of Americana and an American relic.


For me, and for many others, the most fascinating aspect of Crow’s new celebrity is his controversial penchant for collecting rare—and sometimes disturbing—historical objects. These include things we might call “atrocious relics.” In my recent book, American Relics and the Politics of Public Memory, I wrestle with such matters. Why do we collect relic-like things? What do they mean? What do they “do” or “say”—to those who possess them and to those who view them? Relics can be whimsical, glorious, or sober, but they are also volatile and sometimes alarming and offensive.


What is a “relic”?

A relic is commonly defined as a material object held in reverence by believers because it is linked to a holy person. In medieval Christendom, relics—blood and bones of saints, pieces of the “true cross,” and other sacred traces—gave power to their possessors and access to the divine. Their presence elevated and sanctified churches and communities, helped mold worshippers’ identities, and fixed them in a larger Christian world.


In our more secular modern world, relics endure and perform some of the same functions. Prized vestiges of former times, souvenirs or mementos connect us directly to the past. They do not merely illustrate it; they physically embody it, its glory and triumph, sometimes its tragedy or even horror. Relics are the past, persisting in our present.


Important public relics seemingly possess an ability to speak firsthand, to communicate authentically, wordlessly, emotionally, compellingly. They are both the argument and the evidence, veritable “smoking guns.” Sometimes they look ordinary. Who cares about some old, unremarkable fountain pen, until we learn that Lincoln used it to inscribe the Emancipation Proclamation in 1863? What’s the big deal with some old, tattered book, until it’s revealed as the Bible once owned (before Crow and Thomas) by Frederick Douglass? Through such things, we are uncannily linked to “history.”


Crow’s Nest

Harlan Crow has accumulated lots of such stuff at his Highland Park estate—astonishing stuff—including (randomly) a letter written by Christopher Columbus; a silver tankard crafted by Paul Revere; the deed to George Washington’s Mount Vernon; Dwight D. Eisenhower’s helmet, adorned with five stars; a cannonball from the Battle of Gettysburg; and much, much more.


But mingled among these American treasures are linens, medallions, and other Nazi artifacts and memorabilia, as well as an autographed copy of Hitler’s hateful tome Mein Kampf and two of his paintings, landscapes distinctive because of their artist, not their artistry. The manor’s grounds include a sculpture park arrayed with statues of notorious Communist leaders, a so-called “garden of evil” populated by Marx, Lenin, Stalin, Tito, Castro, Ceausescu, and other villains perhaps more obscure but nonetheless malignant, such as Gavrilo Princip, the assassin of Archduke Franz Ferdinand who precipitated World War I.


Why would Harlan Crow harbor such things? Of course, they are rare and valuable commodities, which might command a considerable price if sold, and which conspicuously display the inestimable fortune of their possessor. They are the prizes of Crow’s wealth. But his collection is not merely an investment, uncurated, or randomly compiled. These things hold meaning beyond their financial valuation, and they help define the man who owns them. If Crow tells stories through them, they tell stories about him.


Maybe Crow’s despots in bronze and stone function like big game trophies, displaying dominance over one’s quarry or foes. Or maybe they are a snarky, conservative troll to antagonize liberal critics, representing Crow’s supremacy over his opponents. They allow him, literally, to crow. Defenders argue that such displays are benignly didactic, marking the triumph of good over evil and reminding us of what to hate. In fact, new sorts of institutions—memorial museums—emerged after the Second World War that were designed to confront evil, to teach, memorialize, and heal in the wake of cataclysms, the Holocaust most prominently. But these institutions commemorate victims, not perpetrators like those assembled by Crow. Despite the rationales, Crow’s garden of evil does not teach or heal. It pays implicit homage to the evildoers and their power, deadening viewers to the full measure of their horrific ideas and acts.


It’s not really possible to renovate disgraced public monuments, unlike structures or institutions saddled with an unfortunate name, which can be changed and repurposed. Fort Benning recently became Fort Moore; Fort Bragg, Fort Liberty; Fort Hood, Fort Cavazos. But a statue of Robert E. Lee or Josef Stalin is inescapably a statue of Lee or Stalin; neither can be rehabilitated by unilaterally rechristening it Martin Luther King or Lech Walesa. Crow doesn’t try and likely doesn’t care.


Crow’s unnerving monuments and memorabilia connect us to a reprehensible past, revivifying that which is sinister and frightening and, even for Crow perhaps, sordid and shameful. As one visiting reporter noted, the Nazi artifacts are placed in cabinets, “out of the view of visitors,” controlling their ability to “say” indiscreet things. Such materials evoke the lynching postcards and other grisly souvenirs once prized by white supremacists, kept privately as racist talismans. Broader public scrutiny transformed them into appalling objects, atrocious relics. Recent revelations thus pose some uncomfortable questions. Has Crow collected Thomas? And what do his relics say about him, and about us?


Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185764 https://historynewsnetwork.org/article/185764 0
What We Can Learn From—and Through—Historical Fiction

Novelist Anna Maria Porter, engraving The Ladies' Pocket Magazine (1824)

This image is available from the New York Public Library's Digital Library under digital ID 1556053: digitalcollections.nypl.org



I have been a local historian for many years, but I turned to historical fiction to tell a specific story for which there were no sources. There was a sense of going over to the “dark side” in doing so, yet at the same time I was able to illuminate things that do not appear in the historical record. I suspect that there could be a lively debate online about what good historical fiction can accomplish—and also about the misuse of history by those who write historical fiction.


As a local historian I tried to be true to the sources I found; to be trusted by readers. In the case of the dozen women who crossed the country in 1842, members of the first overland company to set out for the Pacific Northwest, I could find little. With no verifiable facts but the knowledge that women were present, I turned to fiction to put women in the picture and wrote Lamentations: A Novel of Women Walking West (Bison Books, an imprint of the University of Nebraska, 2021). To someone like Gore Vidal, that made perfect sense; he thought history should not be left to the historians, “most of whom are too narrow, unworldly, and unlettered to grasp the mind and motive” of historical figures. E. L. Doctorow would agree, but more agreeably, writing that “the historian will tell you what happened,” while the novelist will explain what it felt like. The historian works with the verifiable facts—fiction is a step beyond.


Historical fiction is generally dated to Sir Walter Scott, beginning with Waverley in 1814. It turns out, however, that Scott was not the first historical novelist. Devoney Looser has just published Sister Novelists (Bloomsbury Press, 2022) about Maria (1778-1832) and Jane (1775-1850) Porter, who, driven by poverty, wrote popular historical novels beginning in the 1790s. A Wall Street Journal reviewer in 2022 noted that “Maria was a workhorse, Jane a perfectionist. Between them they wrote 26 books and pioneered the historical novel.”


There have been only a few academic treatments of historical fiction. Ernest Leisy published The American Historical Novel in 1950 and George Dekker wrote American Historical Romance in 1987; both were interested in chronological periods, but neither man generated, or exhibited, much enthusiasm for the genre. Yet in 1911, in an essay titled “The New History,” published in the Proceedings of the American Philosophical Society, James Harvey Robinson observed that historians need to be engaging, even while “it is hard to compete with fiction writers.” He stated


History is not infrequently still defined as a record of past events and the public still expects from the historian a story of the past. But the conscientious historian has come to realize that he cannot aspire to be a good story teller for the simple reason that if he tells no more than he has good reasons for believing to be true his story is usually very fragmentary and uncertain. Fiction and drama are perfectly free to conceive and adjust detail so as to meet the demands of art, but the historian should always be conscious of the rigid limitations placed upon him. If he confines himself to an honest and critical statement of a series of events as described in his sources it is usually too deficient in vivid authentic detail to make a presentable story.


The historian Daniel Aaron took the genre of historical fiction seriously in a 1992 American Heritage essay in which he castigated Gore Vidal. Aaron nevertheless conceded that “good writers write the kind of history [that] good historians can’t or don’t write.”


Aaron quotes Henry James, who thought of historians as coal miners working in the dark, on hands and knees, wanting more and more documents, whereas a storyteller needed only to be quickened by a letter or event to see a way to share it with readers or use it to illuminate a point about the historical past. He recognized that genres of reading had changed. In the 19th century we read historical tomes, mostly about the classical world or about British and European wars and political alignments, but in the last quarter of the 20th century “so-called scientific historians left a void that biographers and writers of fictional history quickly filled.” Aaron cites inventive novelists who have perverted history for a variety of reasons, using Gore Vidal as his prime example. Vidal thought of historians as squirrels, collecting facts to advance their careers. But Vidal does not get the last word.


Professor Aaron recognized that historical fiction had moved from a limited earlier model focused on well-known individuals to serious re-tellers of history who have “taken pains to check their facts and who possess a historical sensibility and the power to reconstruct and inhabit a space in time past.” What a lovely description of some of the best of our contemporary historical fiction.


But what of putting women into the past where they often do not appear? Addressing this issue, Dame Hilary Mantel noted in her 2013 London Review of Books essay “Royal Bodies” that


If you want to write about women in history, you have to distort history to do it, or substitute fantasy for facts; you have to pretend that individual women were more important than they were or that we know more about them than we do.


Despite my great admiration for Dame Hilary, I think we can deal with the issue of women in the past by honoring their lives in a way that does not turn them into twenty-first-century heroines but presents them as women who found themselves in situations they might not have wished for, did what they needed to do, thought about their circumstances, and dealt with what they had landed in. They, as we, are each grounded in our own time, deserve credit for surviving, and should be appreciated for their observations of the life around them.


We should respect the historians’ knowledge of time and place and the novelists’ intuition that is sometimes spot-on. An example: in trying to explore the moment when the buttoned-down eastern women in 1842 encountered a band of Lakota, then identified as Sioux, I wondered what the women might have thought of those bronzed warriors whose clothing left much of their chests and shoulders bare. What would the women walking west have thought about? When I read the paragraph I had written to an elderly friend, she went to her desk and pulled out a letter from an ancestor who had crossed Nebraska, walked over South Pass, and on into Oregon. And that ancestor, in the 1850s, had said exactly what I had imagined. Sometimes the imagined past is as we conceive it to be because we have grasped the knowledge of time and place needed to animate believable players.


My desire in Lamentations was to hear what the women were thinking, and sometimes saying to each other, but within the context of that century when much that was unorthodox could not be said aloud. I wanted to show how a group of people traveling together would get to know each other, rather as students in a class know that one was from Ohio and another played hockey. We do not know others fully, but from the vantages we are given. I wanted to display how the women gained information, and then passed it along; how tragedies were dealt with; how personalities differed, and how, in the end, Jane matured. I wanted to bring women of different generations together, to show discord among sisters, to think about what was important when dismantling a home, how women fit into the daily account of miles and weather and sometimes events kept by the company clerk. I wanted to explore what it was like to answer a longing for new beginnings, for a journey when one is the first to make it. I am interested in names and what they mean, in the landscape and in how one travels through it. I wanted to hear the women speak when the records do not.


Historians need to be conscious of the audience we hope to have, and perhaps we can learn something about style and sense of place from the writers of historical fiction. Academic and local history can be told vividly; good history can have good narrative, and some historical fiction tells a story that a historian cannot. I have written this to praise historical fiction when it respects the line between our times and the past, when it adheres to the known truth and does not pervert it for excitement—or for book sales. I am grateful to Daniel Aaron, who thought historical fiction was worth taking seriously, and to all those writers who have brought the past alive in this form.


Fiction is not the only way to explore the past, but historical fiction can attract readers to wonder and speculate and then explore the past in other forms. A friend said that as a child, reading fiction of other times led her to read history and then become a historian. Aaron wrote that historical fiction gives “us something more than the historical argument.” It can bring alive an era, a person, a moment in time so that we meet the past as it was, not as we might want it to have been.


Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185767 https://historynewsnetwork.org/article/185767 0
White House Speechwriter Cody Keenan on the Crucial 10 Days of the Obama Presidency

Cody Keenan (Photo by Melanie Dunea)


Other than being able to string a sentence together, empathy is the most important quality in a speechwriter. The ability or at least the attempt to understand your audience, to walk in their shoes for a little while, even if empathy will never be a perfect match for experience.—Cody Keenan, Grace




Ten days in June 2015 were some of the most intense during the presidency of Barack Obama. The president was awaiting US Supreme Court decisions on the fate of the Affordable Care Act and marriage equality. And, on June 17, a hate-fueled white supremacist shot to death nine African American worshippers at a historic church in Charleston, South Carolina.

Chief White House speechwriter Cody Keenan focuses on this extraordinary period in his revelatory and lively new book Grace: President Obama and Ten Days in the Battle for America (Mariner Books).

In response to this perfect storm of historic events, Mr. Keenan drafted memorable speeches and a heartfelt and now immortal eulogy for Reverend Clementa Pinckney and other victims of the Charleston violence. And that address moved beyond a eulogy with the president’s powerful plea for unity and reconciliation and his surprising segue as he led the congregation and the nation in singing “Amazing Grace.”

In Grace, Mr. Keenan recounts highlights of his career as a speechwriter as he describes the tumultuous ten days. The reader immediately senses the demands of working for a president who was himself the former editor of the Harvard Law Review and among the most celebrated writers and orators of the recent history. As Mr. Keenan puts it, “To be a speechwriter for Barack Obama is f---ing terrifying.” Mr. Keenan worked “to his limits” in his high-pressure position to provide President Obama with the best drafts possible. And it’s obvious from Grace that the two men were gifted collaborators who worked together with great mutual respect and admiration.

As he provides a behind-the-scenes perspective on White House operations, Mr. Keenan introduces key presidential aides such as Valerie Jarrett, Jen Psaki, Ben Rhodes, Jon Favreau and his speechwriting team. He also intersperses the book with the story of his romance with esteemed presidential fact-checker Kristen Bartoloni, who often challenged and corrected his writing. They married at the White House in 2016.

By 2015, President Obama had delivered more than a dozen eulogies for the victims of gun violence, including for those who died in the Arizona massacre in which Representative Gabby Giffords was seriously wounded and the horrific gunshot murders of 20 children and six adults in Sandy Hook, Connecticut. Mr. Keenan wrote those eulogies as well as the president’s now famous speech honoring the fiftieth anniversary of the 1965 Selma march for voting rights and the peaceful protesters, including civil rights icon Representative John Lewis, who endured a bloody attack by police.

Mr. Keenan writes powerfully of the pain and sorrow that he and the president experienced in addressing yet another mass shooting in June 2015, that time with the added dimension of racist violence. The description in Grace of the creation of the president’s address for the funeral of beloved Reverend Clementa Pinckney is a case study in collaboration in the speech drafting process.

During the same sad week, Mr. Keenan wrote statements for the president to deliver if the Supreme Court gutted the Affordable Care Act and ended marriage equality. We now know that those speeches on the Court decisions weren’t necessary. And the eulogy for Reverend Pinckney will be remembered as one of the great presidential addresses. Mr. Keenan concedes that this eulogy was his most difficult assignment after working on more than three thousand speeches for President Obama.

Mr. Keenan’s heartfelt and moving memoir Grace shows how a gifted president and his devoted team worked together tirelessly for a more fair, more tolerant, and more just nation.

Mr. Keenan is best known as an acclaimed speechwriter. He studied political science at Northwestern University and, after graduation, worked in the office of US Senator Ted Kennedy. After several years in that role, he earned a master's degree in public policy at the Harvard Kennedy School. He subsequently secured a full-time position with Barack Obama's presidential campaign in Chicago in 2008.

When President Obama took office in 2009, Mr. Keenan became deputy director of speechwriting in the White House. He was promoted to chief White House speechwriter during the president’s second term. He also collaborated with President Obama on writing projects from the end of his term in 2017 until 2020. He has said that he wrote his dream speech just four days before Obama left office—welcoming the World Champion Chicago Cubs to the White House.

Mr. Keenan is currently a partner at the speechwriting firm Fenway Strategies and, as a visiting professor at his alma mater Northwestern University, he teaches a popular course on political speechwriting. Today, he and Kristen live in New York City with their daughter, Grace.

Mr. Keenan graciously responded by email to a long series of questions on his new book and his work.


Robin Lindley: Congratulations Mr. Keenan on your engaging new book Grace, a revelatory exploration of your work as chief speechwriter for President Obama at an incredibly turbulent time. Before getting to that period, I wanted to ask about your background. You majored in political science at Northwestern University. What sparked your interest in politics?

Cody Keenan: Well, I enrolled at Northwestern as a pre-med student. I wanted to be an orthopedic surgeon after a football injury forced a knee reconstruction. Chemistry 101 weeded me right out, though. I just wanted to take biology.

But politics had always been an interest. My parents often argued about politics at the dinner table – my mom was a Kennedy Democrat from Indiana; my dad was a Reagan Republican from California – and whatever could make them so animated was something worth exploring. One value they both hammered into me, though, was the idea that I should do whatever I could to make sure more people had the same kind of opportunities I did growing up – and by the time I graduated from college, only one political party cared about that.

Robin Lindley: Did you have academic or other training in speechwriting?

Cody Keenan: No. Writing was something that always came naturally, and I think that came from being a voracious reader. I won every summer competition at the local public library. You can’t be a good writer without being a great reader.

Robin Lindley: You interned for legendary Senator Ted Kennedy after college. Did your duties in that role include speechwriting?

Cody Keenan: Not as part of the internship, or even the first position after that. Three months as an intern got me hired to answer his phones. I ended up working for him for almost four years in four different roles.

In 2004, when I was on his staff for the Committee on Health, Education, Labor, and Pensions, the Democratic National Convention was in Boston, his hometown. We all took a week off work to volunteer. I was on the arena floor the night that Barack Obama gave the speech that made him famous. He walked into the arena anonymous; he walked out 17 minutes later a global megastar. It shows you what a good speech can do.

Once we were back in Washington, I must have talked about that speech a lot, because that’s when my boss asked if I could write a speech. I don’t know if he meant did I have the time or did I know how, but it didn’t matter – I lied and said yes.

Robin Lindley: Senator Kennedy was known as a great legislator in the Senate who could work across the aisle. Did you work with him or his staff on any significant projects? What did you learn from that internship?

Cody Keenan: As an intern, one of my tasks was to read and route mail that came to the office. Perfect strangers were writing a senator – often one who wasn’t even their senator – to ask for help. There’s an act of hope involved in that. Even when it was a tough letter to read, even when you could see that the writer had wiped a tear from the page, they hoped that someone on the other end would care enough to help. I learned right away just how important this stuff is.

Later, as a staffer, I worked on all sorts of legislation. Kennedy was involved in everything. Health care, minimum wage, education, immigration, the Iraq War, the response to Hurricane Katrina, Supreme Court nominations – we were always busy. And with good mentors, I learned that just as important as the policy itself was often the way you communicated it.

Robin Lindley: What attracted you to working for President Obama during his first presidential campaign in 2007? Did you work as a speechwriter before his election?

Cody Keenan: Well, what struck me about that 2004 speech was that he described politics the way I wanted it to be – as this collective endeavor in which we could do extraordinary things that we couldn’t do alone. His only speechwriter at the time, Jon Favreau, called me early in the campaign and asked if I wanted to join the speechwriting team he was putting together. I said yes.

Robin Lindley:  What did you learn or do to prepare for work as a speechwriter for President Obama, one of our most celebrated American writers and thinkers even then? Did you go back and read works of some of the great White House writers such as Ted Sorensen, Bill Moyers, and Peggy Noonan? Did you read speeches by the likes of Lincoln, FDR, JFK, Churchill, and other memorable leaders?

Cody Keenan: I didn’t. I’d already read the canon of presidential hits, but to be a speechwriter for someone means writing for that specific person, helping him or her sound not like anybody else, but rather the best version of himself or herself.

Robin Lindley: I read that you didn’t personally meet President Obama until his first day at the White House in 2009. Yet, you had been working for him for a year and a half. What do you remember about your first meeting and your early days at the White House?

Cody Keenan: Yep – he visited Chicago headquarters maybe three times during the campaign. He was out campaigning! And when he did visit, it was for strategy sessions with his top aides and to address the entire staff at once, not to meet with his most junior speechwriter.

On our first day at the White House, he called me into the Oval Office because he’d seen my name at the top of speech drafts and he just wanted to put a face to the name. Those early days were drinking from a firehose: the economy was falling apart, millions of Americans had lost their jobs and their homes in just the four months before he took office, and millions more would in the first few months after. There was no honeymoon; we were busy trying to turn that firehose onto the fire.

Robin Lindley: Did you immediately start as a speechwriter once President Obama began work at the White House?

Cody Keenan: I did.

Robin Lindley: How does one prepare for a job that requires knowing the voice and propensities of the person they are writing for?

Cody Keenan: Well, I had a year and a half foundation from the campaign. I’d read his books to absorb his worldview, listened to the audio versions to absorb his cadence, and paid close attention to his edits. He was a writer. He was our chief speechwriter. And he was our top editor. I learned a lot just by poring over his edits to our drafts.

Robin Lindley: How did your relationship with President Obama evolve over his eight years in office? You wrote that working for this acclaimed writer could be terrifying. It seems he offered good advice to you such as having a drink and listening to Miles Davis or John Coltrane. Or reading James Baldwin. Did you see him as a kind of coach or mentor?

Cody Keenan: I was the junior writer on the team for the first two years, sitting across the driveway in the Eisenhower Executive Office Building. Then a series of high-profile speeches got me promoted to deputy director of speechwriting, and I moved into a West Wing office with Jon Favreau. Once he left after the second inaugural, I took over as chief speechwriter. So naturally, our relationship evolved – I went from seeing Obama every couple weeks to every week to every day.

I saw him as my boss. I guess as a writing coach of sorts. And sometimes even as an uncle or older brother who loved to dispense advice. He hosted my wife and our families and our best friends at the White House on our wedding day. It was his idea. He didn’t have to do that.

Robin Lindley: Are there other bits of President Obama’s advice that stick with you?

Cody Keenan: “Don’t impart motives to people.” That’s advice we could use more of.

Robin Lindley: Indeed. A big question, but can you give a sense of the speechwriting process? What sparks the process? Who is involved? What’s it like to collaborate with a team of writers and other staff?

Cody Keenan: He viewed speechwriting as a collaboration. He just wanted us to give him something he could work with. We wrote 3,477 speeches and statements in the White House, and believe it or not, he edited most of the speeches, even if lightly. But he couldn’t be deeply involved with all of them.

For any speech of consequence, though, we’d start by sitting down with him and asking “what’s the story we’re trying to tell?” Then the speechwriting team would talk over each speech, helping each other get started. Then we’d all go back to our own laptops and draft whatever speech we’d been assigned. The drafting was not a collaborative process. The revising was – with each other, but more importantly with him.

Robin Lindley: What’s the fact checking process for a speech draft before it goes to the president? It’s interesting that your future wife Kristen was one of the very diligent fact-checkers you relied on.

Cody Keenan: Yeah, she literally got paid to tell me I was wrong. Every day. For years. It was her team’s job to fireproof the president – to make sure he never said something he shouldn’t, with someone he shouldn’t be with, at a place he shouldn’t be visiting. They prevented countless alternate timelines where we’d have to do some cleanup in the press. They saved us from ourselves again and again.

Robin Lindley: Congratulations on your marriage to Kristen with the magnificent White House wedding. Your blossoming romance runs like a red thread through your book. You note that President Obama would stay up late at night to review and edit drafts of speeches he would give the next day. And you often received late night calls from him or met with him in the wee hours. How did those final hours work with a speech? It seems the president would often edit to the time of delivery.

Cody Keenan: He always edited in the wee hours of the morning. It’s when he preferred to work. It was rare that we were editing right up until delivery. If we were flying somewhere for a speech, he’d always go over it one or two final times on the plane. But he didn’t like chaos. In fact, the reason he edited so heavily, so often, was because he wanted the speech exactly the way he wanted it. Sometimes it was perfectionism. But it’s really just preparation.

Robin Lindley: What did you think when the president ad libbed or changed something from your draft as he spoke? I think you said something to the effect that he was a better speechwriter than all of his writing staff.

Cody Keenan: I loved it. I can’t think of a time I cringed at an ad-lib. He had a knack for it. It could be a little white-knuckled if he did it at the end of the speech when there’s no text for him to come back to. In that case, he’d have to build a new runway while he was speaking on which to land the plane.

Robin Lindley: When does humor come into the mix? Do you write for events such as the White House Correspondents Dinner? President Obama had some zingers for his eventual birther successor at these events.

Cody Keenan: Those were our most collaborative sets of remarks. The entire team would pitch jokes, and we’d reach out to professional comedy writers to solicit their help. We’d start out with about 200 jokes and whittle them down to the 20 funniest. Sometimes, none of your jokes would make the cut. You’ve got to have a thick skin.

Robin Lindley: And you and the other speechwriters did not use a template such as this speech is on the economy or this speech is political, so we’ll use the file template X or Y. You were responsible for more than three thousand speeches, yet it seems each speech was approached as a unique project.

Cody Keenan: Yes and no. We never used a template. But while each individual speech should tell a story, so should all speeches. What I mean by that is, we were mindful that every speech we wrote fit into a longer narrative arc – both of his presidency and his entire political career.

Robin Lindley: You worked for the president through his eight years in office. How did you come to focus in Grace on ten days in 2015, as the president dealt with the horrific mass murder of nine Black parishioners by an avowed white supremacist at Mother Emanuel Church in Charleston, South Carolina? The president then also was preparing to address two impending Supreme Court decisions that would determine the fate of the Affordable Care Act and marriage equality.

Cody Keenan: Yeah. People will remember all of the stories and all of the events in this book. They won’t remember that they all happened in the same ten-day span. I mean, that in and of itself is a story that demands to be told. In addition to a massacre carried out by a self-radicalized white supremacist, there was a very real chance that the Supreme Court would say no, people who work two or three jobs don’t deserve help affording health insurance; no, gay Americans don’t get to get married like the rest of us; all of those people are now second-class citizens. And the first Black president has to serve as the public narrator and provide some moral clarity for all of this.

Someone once described it as ten days too implausible for an entire season of The West Wing. But it’s also what those events symbolized and how they fit in the broader, centuries-long story of America – whether or not we’re actually going to live up to the ideals we profess to believe in. Whether we’re going to stand up to white supremacy, and bigotry, and people who profit from inequality and violence. And that week, the answers were all “yes.”

Robin Lindley: With the Charleston massacre, the president had to address another mass shooting, and he was tired of giving eulogies after the murders at Sandy Hook and all of the other heartbreaking mass shootings during his term in office. How was his speech at Mother Emanuel Church different from previous addresses? What was your role in creating this memorable speech? How did the speech go beyond a eulogy to become a message of reconciliation?

Cody Keenan: We had done over a dozen eulogies after mass shootings at that point. And this goes back a few years: the shooting in Newtown, Connecticut, where 20 little kids were murdered in their classrooms, along with six of their educators, was right after he’d been reelected.

And he put aside his second term agenda right out of the gate to try to do something about guns, because what an abdication of leadership it would be if he didn’t. And he got a little boost from Joe Manchin and Pat Toomey, an arch conservative from Pennsylvania with an A-rating from the NRA. They both had one. They decided to work together on a background checks bill. And even though we knew the odds in the Senate would be long, that gives you something to try for. And so, we traveled the country for a few months. He made it a centerpiece of his State of the Union address. Big, emotional, powerful ending. And in the end, in April, Republicans blocked a vote on it with the parents of the Newtown kids watching from the gallery.

And that’s about as cynical as I’ve ever seen Barack Obama. Yet he went out and spoke in the Rose Garden with those families. I handed him a draft of the speech and he said, look, I’m going to use this as a template, but I’m just going to wing it. And he came in after that speech into the outer Oval Office, which is this room just off the Oval where his assistants sit, and once the door closed he was almost yelling. He said, “What am I going to do the next time this happens? What am I going to say? I don’t want to speak. If we’ve decided as a country that we’re not going to do anything about this, then I don’t want to be the one who closes the cycle every time with a eulogy that gives the country permission to move on.”

Ultimately, we did decide to do a eulogy after Charleston, and it was his idea to build the structure of the speech around the lyrics to “Amazing Grace.”

Robin Lindley: I think everyone was surprised and moved when President Obama sang “Amazing Grace” during the Charleston speech. Were you surprised or was that part of the plan for the speech?

Cody Keenan: That, too, was his idea. He told me on Marine One that morning that, if it felt right in the arena, he might sing it.

Robin Lindley: You now teach speechwriting at your alma mater, Northwestern University. Do you have any other advice for prospective speechwriters?

Cody Keenan: It’s fun, training a new generation of speechwriters and trying to convince them that public service is worth it. What I didn’t expect was that my students would end up teaching me quite a bit in return. There’s an impatience to their generation that mine didn’t have to have. Politics and the pace of change are now existential for them in a way they haven’t been since schoolkids were doing duck and cover drills during the Cold War. They’re doing those duck and cover drills again because of guns. They can see an end to their future because of climate change.

And let me tell you, when they see a party more into policing books than policing assault weapons; when they see a party more exercised about drag queens than about climate change – they feel a real disdain there. I want them to harness it, though, in a productive way. And part of that means telling them the truth. To tell them that change has always taken time isn’t fun. To tell them that they’re not always going to win isn’t fun. To tell them that even when they vote in every election, they’ll never elect a leader who delivers everything they want. Well, that’s just not inspiring. But it’s also true.

Nobody ever promised us these things. That’s democracy. But here’s the thing about democracy: we get to refresh it whenever we want. Older generations aren’t entitled to their full tenure. So, while I counsel patience and realism, I also fan the flames of their impatience and idealism. I tell them to join a campaign now, to start an advocacy group now, to run for office now. Stay at it not just until the people in power are more representative of what America actually is, but until they’re the ones in power themselves. Then make the system your own. Faster, smarter, more responsive to the needs of a modern, pluralistic democracy. And one way to do that is through my cardinal rule of speechwriting: help more leaders talk like actual human beings.

Robin Lindley: You also continue to work as a speechwriter, and you note that you worked with President Obama after his tenure in office. Did you consult with the president on writing projects such as his monumental memoir A Promised Land?

Cody Keenan: I worked for him full-time for four years after we left the White House, ultimately leaving after the 2020 election so that I could devote my time to writing Grace.

Robin Lindley: What sorts of clients do you work with as a speechwriter now?

Cody Keenan: All kinds. Progressive candidates, nonprofit, academic, and corporate. Our rule is that each client has to be putting more into the world – hopefully much more – than it’s taking out. But the best part of it is to be surrounded by a team of idealistic young speechwriters again. I missed that over the four years after the White House.

Robin Lindley: Would you consider working with a president at the White House again?

Cody Keenan: Maybe. Depends on who it is. For a speechwriter, it really, really depends on who it is. Speeches require a deeper relationship than a lot of other staff positions. But I’m also older and have a young daughter. Both of those things make the grind of the White House much less attractive.

Robin Lindley: It seems we’re more divided now than during the Obama years. I never thought I’d see Nazi rallies in America in the 21st century. Where do you find hope for our democracy at this fraught time?

Cody Keenan: My students. While politics as it is may make them cynical, they’re not cynical about America and its possibilities. Somehow, they’re not as plagued by fear or suspicion as older generations; they’re more tolerant of differences between race and culture and gender and orientation, not only comfortable navigating all these different worlds but impatient to make them all fairer, more inclusive, and just plain better. They’re consumed with the idea that they can change things. They just want to do it faster.

Robin Lindley: Is there anything you’d like to add for readers about your book or your work?

Cody Keenan: You’re going to love Grace. I wrote it because it’s a hell of a story and it’s the most intimate look at Obama’s approach to speechwriting that exists.

But I also wrote it, as I told Stephen Colbert when he had me on, to blow up people’s cynicism about our politics. Because politics isn’t some rigid system we’re trapped under. It’s us. It’s only as good as we are. That’s why I was so happy when Obama called it “an antidote to cynicism that will make you believe again.”

But I was just as happy to read a review that described it this way: “Grace is a refreshing departure from the flood of scandalous ‘literary’ flotsam that typically washes up in the wake of the transfer of power. This book might not make breaking-news headlines, but it just might restore a little faith in the presidency and the backstage men and women who work around the clock to fulfill the chief executive’s promises to the American people.” The publicist at the publishing house didn’t love the part about “breaking-news headlines,” because that’s what sells books – but I was proud to write it the way I did. There’s no sleazy tell-all in this book, but there are a bunch of great never-before-told stories about what it’s like to sit alone with Obama and unlock the right words for a fraught moment.

Robin Lindley: Thank you Cody for your generosity and thoughtful comments. Your book captures the reality of work in the tense and often exhilarating environment of the White House with a president who was devoted to creating a more just and tolerant nation. Best wishes on your continuing work and congratulations on Grace.


Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. Robin’s email: robinlindley@gmail.com.  

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/blog/154750 https://historynewsnetwork.org/blog/154750 0
Dangerous Records: Why LGBTQ Americans Today Fear the Weaponization of Bureaucracy

Prisoners at Sachsenhausen concentration camp wear triangle badges indicating the nature of their offenses against Nazi social code (pink would indicate homosexuality). National Archives and Records Administration, 1938.



The recent rise of far right political movements in the United States and globally has prompted historical comparisons to the Nazis. The atrocities committed by the Nazis have been studied widely, particularly in reference to the Jewish victims of the Holocaust, but it is also important to understand lesser-known victims and the ways that prior discrimination shaped their persecution. In examining the pre-war experience, it is crucial to understand how the Nazis relied on bureaucratic information to know whom to target, especially when the classification was not an obvious ethnic or religious one (as with assimilated and secular Jews, or gay men, lesbians, and others persecuted for gender or sexual behavior). Today, there are important lessons to learn about the dangers that bureaucratic information gathering, combined with escalating prejudice and vilification, could present.

The rise of the Nazi party in Germany brought laws restricting access to literature and laws governing the treatment of what we today would refer to as LGBTQ+ people. Paragraph 175, a law criminalizing sexual relationships between men, had been on the books since 1871, but the Nazi party revised it to broaden the range of acts that could be punished. Queer men were targeted early in the Nazi regime, which placed heavy blame on them for the loss of the First World War; Nazi ideology justified discrimination and repression by claiming that a lack of masculinity was a contributing cause of the country’s downfall and economic depression. Though only about half of the 100,000 men arrested for the alleged crime of homosexuality were prosecuted, the figure is still large enough to raise a pressing question: how did the Nazis know whom to target, and where did the information come from? Political factors appear to have been involved, since a majority of prosecutions came within six weeks of Heinrich Himmler’s assumption of control over internal security in 1943. Each man came to the authorities’ attention in a similar way: a report by a private individual, a police raid, or the “Pink List.”

The gathering of information about members of minority groups by bureaucratic organizations has a startling history of being put to oppressive ends, particularly by the Nazis. A clear example is the Nazis’ use of the “Pink List,” compiled from the records of support organizations such as the Scientific Humanitarian Committee and from reports by private individuals, and held by the police. The Scientific Humanitarian Committee aimed for “Justice Through Science” and espoused the biological theory of homosexuality: the idea that sexuality is an innate biological feature rather than a mark of weakness or psychological deviance. The SHC was targeted early in Hitler’s rise because of its advocacy for homosexuals. It kept lists of homosexual Germans for support and scientific purposes, but those lists were seized by the Nazis and used to target the men named on them.

The danger that could befall a young gay man who interacted with the police on any other matter is illustrated by the story of Pierre Seel. Seel arrived at his local police station to report a stolen watch and, when questioned about the circumstances, revealed that he had come from Steinbach Square, a well-known place for gay men to seek each other’s company. After intense questioning he was released and assured that nothing would come of the compromising information, but three years later he was arrested as a suspected homosexual because of the list he had been placed on after leaving the station. That list was compiled by police and security forces over the years and augmented by confessions extracted from imprisoned gay men, who were raped and tortured to compel them to add more names. The Pink List shows how dangerous information that assigns someone to a minority group can be, particularly in the hands of those in power with ill intentions.

While the Holocaust is an unmatched and exceptional example of cruelty and systematic persecution of social outgroups, it is nevertheless important, even crucial, to recognize similarities between those events and the present, especially where prejudices join with bureaucratic state power. Today, transgender Americans are being framed as deviants, accused of undermining traditional gender roles, and described as “groomers” and child sex abusers. Armed vigilantes have harassed people attending drag performances, and activists are seeking to remove books about gender and transgender experiences from schools and libraries. When the power of the state aligns with these expressions of prejudice and the identification of outgroups as a threat to children, family, and society, there is real cause for concern.

Anti-LGBTQ sentiment has been particularly vociferous in Texas. Texas Attorney General Ken Paxton’s recent request for a list of individuals who have changed their gender on state-issued driver’s licenses, as well as other departmental documents, has concerning similarities to the “Pink List” compiled by Nazi officials in 1930s Germany. The request for the list itself made transgender Texans subjects of surveillance, implying the state views them as dangerous. According to an email sent on June 30, 2022 by Sheri Gipson, the chief of the DPS’s driver license division, the Attorney General’s office “wanted ‘numbers’ and later would want ‘a list’ of names, as well as ‘the number of people who had a legal sex change’.” This first request produced over sixteen thousand results. Unfortunately for the Attorney General, it was difficult for the state agencies to meet his request. One issue involved gender changes made to correct filing mistakes (a cisgender person’s gender had been recorded inaccurately, and the change affirmed their identity). A subsequent attempt narrowed the data to court-ordered document changes only, which would identify transgender people specifically. Although the agency could not accurately produce this data, the episode, alongside the various laws being introduced throughout the state, such as the prohibition of gender-affirming care and the limiting of LGBTQ+ lessons in school, raises the startling question of what damage such information gathering could do, not only now but in years to come.

The weaponization of personal information available to state organizations should not be taken lightly. It has presented, and will continue to present, danger to those the state targets as threats. Laws aimed at transgender children, restricting their access to gender-affirming care or to affirming ideas in books, have become commonplace in several Republican-led states, but an explicit attack on legal adults raises the question of where it will stop and who will stop it. These laws send a clear message that the right does not want transgender people to have a presence in society, both in everyday life and in the media surrounding them. The proposed laws restricting gender-affirming care, classifying the parents of transgender children who receive such care as child abusers, limiting LGBTQ+ lessons in school, and banning books and media that showcase queer people all attempt to erase the queer experience from modern life as well as from history.

All of these efforts depend on being able to identify those who are not living with the gender assigned to them at birth. Bureaucratic records may not be considered dangerous by the public, but the ability of government officials to access the records of those whose place in society they are seeking to erase can lead to dangerous consequences in the future. Other vulnerable groups will be targeted, and it is necessary to examine the historical implications and repercussions of the blatant targeting of these groups.


Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185765 https://historynewsnetwork.org/article/185765 0
150 Years of "Zero-Sum Thinking" on Immigration

Last week Title 42, a Trump-era policy that has limited immigration for the last three years, expired. Still, the Biden administration warned people arriving at the border that “the border is not open” and anyone arriving there would be “presumed ineligible for asylum.” In conversation, Dr. Carly Goodman revealed the 150-year-old history behind the US government’s restrictionist stance.

Specifically, Dr. Goodman explored this history through the lens of the Diversity Lottery. Not by coincidence, Dr. Goodman is the author of Dreamland: America’s Immigration Lottery in an Age of Restriction. She’s also a Senior Editor of Made by History at the Washington Post, which provides fantastic daily commentary from the nation’s leading historians.

A condensed transcript edited for clarity is below.

Ben: Dr. Goodman, thank you so much for being here.

CG: Thank you, Ben. 

Ben: Today I'd like to explore the history of the Diversity Visa as part of a broader exploration of US immigration history writ large.

Before we go back in time, could you please just give a quick overview of what the lottery is? 

CG: Sure. The Diversity Visa Lottery has been part of our immigration policies and laws since the Immigration Act of 1990. It's an annual lottery, open to almost every country. People from eligible countries can put their names in to register for the lottery, and if they are selected, they can then apply to become lawful permanent residents of the US. 

The first lottery was held in June of 1994, and it remains one of the very few pathways to legal status in the US. It's restrictive in some sense—you still have to apply for the visa and fit qualifications like having a high school diploma or its equivalent—but also much more expansive than many parts of our immigration system.

Ben: I think that’s a good segue into exploring the system's restrictive nature, beginning in the 1870s. What were the first immigration restrictions imposed at that time?

CG: I’ll mention that my colleague, historian Hidetaka Hirota, has written about state-level restrictions prior to the imposition of federal immigration controls.

However, when the US federal government started to think about imposing regulations on immigration, it began by excluding almost all Chinese immigrants in the 1880s, who were seen as competing for work and land in the American West. This set an enduring pattern wherein immigration would be racialized.

Ben: The next big evolution in immigration policy occurred in the 1920s. What happened then?

CG: This time period is really relevant to the rise of the Diversity Lottery later on.

In the early 20th century, eugenicists looked around at growing numbers of immigrants from Europe—Italians, Poles, Jews (including my ancestors), for example—and they really didn't like how the American nation (as they perceived it) was changing.

So, officials came up with national origins quotas that imposed severe numerical restrictions on the entry of people they deemed undesirable, especially southern and eastern Europeans (as well as Asians), who were seen as almost a contagion on the nation.

The national origins quotas were explicitly eugenic in nature, and they remained in place from 1924 until a major immigration reform in 1965 finally dismantled them. The Immigration Act of 1965, also known as the Hart-Celler Act, instead emphasized family ties as one of the main ways to legally migrate.

Ben: You write that the shift toward family ties wasn’t purely altruistic.

CG: No, in some ways it was a compromise meant to mollify bigots who hoped that prioritizing family ties would lead to primarily white family members joining their relatives in the States.

Ben: Related, you quote the famous (but problematic) historian Arthur Schlesinger Jr. who worried that the arrival of different immigrant groups in the US might “shift the balance from unum to pluribus.”

To continue speaking in Latin, did Schlesinger’s ad nauseating fear come to fruition?

CG: Well, in addition to creating the family ties route to becoming citizens, Hart Celler imposed the first numerical restrictions on immigration from the Western Hemisphere. There’d long been migration from Latin America, both because the US destabilized many countries there, leading people to leave, and because of the need for workers here.

After 1965, Latin Americans who’d been coming to the US were still coming, but now they ran up against numerical limits. As historian Mae Ngai discusses in her work, Hart-Celler thus created the problem of undocumented immigration. Some would say that's one of the most important legacies of the act.

Ben: Moving into the 80s, how did the Irish defy the growing conceptions of illegal immigration, and what reforms did they push for?

CG: There's a long, storied history of Irish immigration to the US. For example, I live in Philadelphia, and we have a vibrant Irish-American community here.

Ben: The Philly cheese steak is famously an Irish creation.

CG: Um, it's closer to Italian.

Ben: ...right.

CG: Anyway, that sense of heritage was foremost on Irish immigrants' minds in the 80s. They felt the injustice of having to leave Ireland amid an economic crisis, just as their grandparents had, but encountered the added injustice of restrictions on their access to the US. Many Irish people came as visitors and overstayed their visas to try and find work. They were undocumented and white, contrary to the more racially motivated stereotypes of people without legal status that burgeoned in the 70s.

Congress, meanwhile, had been working on passing immigration reform. In 1986, legislators passed bipartisan reform that combined new enforcement measures with a couple of legalization programs to help people gain status and a path to citizenship.

Most of the Irish hadn’t arrived in time to qualify for the legalization, so members of the Irish communities in major cities got together to try to get legislation passed that would help them out. Basically, they identified the Immigration Act of 1965, which had reduced the number of visas available to them relative to the laws from the 1920s, as their problem.

But it wasn’t cool to say, let’s bring back the eugenicist quotas that privilege white people. Instead, congresspeople close with the Irish community—Brian Donnelly and John Kerry from Massachusetts, for example—began asking, what if we could create pathways for countries that get very few visas these days? Countries like, oh, I don't know... how about Ireland?

There were all kinds of proposals for how to do this, but they came up with a lottery because it was the cheapest way to administer it. They opened it up to all countries in the world that had sent fewer than 50,000 immigrants to the US in the previous five years.

That’s how the Diversity Lottery began.

Ben: And surprisingly, African countries, long ignored or excluded in US immigration policy, maybe benefitted the most from the Irish-led reform, is that right?

CG: Exactly. The lottery began in 1994. The following year, 6.5 million entries from around the world vied for just 55,000 visas. 

I first learned about the lottery by speaking with people in places like Ghana, Nigeria, and Cameroon. It seemed to foster a sense of admiration for the US and for its openness and diversity. In some ways, the lottery format, relying on chance, disrupted people's perception that they were being turned away from the US because of their African passports and a racist system.

Ben: At the same time, you point out that when a person from an African country was lucky enough to win the lottery, they then encountered racism in the US. It’s like: you pack your bags, ready to embark on new life, and then you have to face all of the US' own baggage.

CG: Yep, and the lottery aside, the 90s turned out to be a time of more immigration restriction, not less. Levels of nativism reached points not seen since the early 20th century, and politicians on the state and federal levels began to see what anti-immigrant demagoguery could do for them. Even policymakers who were supposedly pro-immigration, like Bill Clinton, were relying on and expanding a growing system of immigrant detention.

After 9/11, restrictions only intensified. Under George Bush, the government began to view immigration as a threat. More and more money was put into border militarization and enforcement. 

Ben: Bringing us into the present day, you talk about how Obama and then Biden effectively maintained continuity with the past in terms of restrictive immigration procedures. Biden of course struck down Trump's African and Muslim travel bans, but he's also kept in place lots of Trump’s policies at the border.

How do you view the lottery within this still-restrictive context?

CG: Well, there’ve been efforts to dismantle the lottery over the last 20 years, and a lot of critics’ arguments are really built around zero-sum thinking; around the idea that this was a weird policy created for the Irish, and we’re already pretty diverse, so can’t we use the visas for something better?

But, that’s zero-sum thinking. As it turns out, we could just create more visas for more people. This leads to one of the central points I’m trying to make: Since the 1870s, we’ve had a restrictionist, gatekeeping system, but it’s possible to widen access if we want to. 

The thing preventing us, as it’s always been, is racism. When Donald Trump called for the lottery to be replaced with a system that would be based on what he calls “merit,” he meant white people (which he clarified). Policymakers know that any reform to end the lottery would diminish the number of visas available to Africans and limit one of their few legal pathways to coming to the US.

So, I study the lottery because it’s a part of our immigration system that we really never hear about, and it just works. It's operated fine for thirty years. I don't want to say that the lottery is a good thing for the people who feel that they have no choice but to enter, but I know that more inclusion and more access serve our communities in so many ways, despite our government’s best attempts to limit migration for the last 150 years.

Ben: A good concluding note. I might suggest that your book be called Dreamland: A Little More Pluribus, A Little Less Unum.

CG: Ha!

Ben: Thank you so much for your time today, Dr. Goodman. This has been a pleasure.

CG: Thank you for having me.

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/blog/154749 https://historynewsnetwork.org/blog/154749 0
The Mexican War Suggests Ukraine May End Up Conceding Crimea. World War I Suggests the Price May Be Tragic if it Doesn't

"American Army Entering the City of Mexico" by Filippo Costaggini, 1885. Architect of the Capitol.


In April 1846, the United States invaded Mexico after a highly disputed incident at the border. Freshman Congressman Abraham Lincoln challenged President James Polk’s account of Mexican provocations as misleading and demanded to know the “spot” where they supposedly took place.

None of the major European powers got involved on either side. Great Britain remained officially neutral during the war, although it objected to an American blockade that interfered with its trade with Mexico. France was uninvolved but insisted that Mexico remain an independent nation.

By September 1847, American forces had captured the Mexican capital and forced surrender. An overwhelmed Mexico signed the 1848 Treaty of Guadalupe Hidalgo, ending the war and transferring to the United States over half of its territory, including modern day California, Nevada, Utah, and most of present day Colorado, New Mexico, and Arizona. Mexico was also forced to drop its claims to the former Mexican province of Texas and accept the Rio Grande as the new border between the countries. In return, the United States paid Mexico a consideration of fifteen million U.S. dollars, worth between 500 and 600 million dollars in today’s money.

Mexico is never going to receive its stolen territory back. The annual economy of California today alone is $3.5 trillion, approximately three times that of Mexico.

Fast forward to 1913, when Europe was divided into two military alliances. The Central Powers (Germany, the Austro-Hungarian Empire, and Italy, later joined by the Ottoman Empire and Bulgaria) faced off against the Triple Entente (Great Britain, France, and the Russian Empire, later joined by the United States and by Italy when it changed sides). The alliances provided some stability in Europe, much like the NATO and Warsaw Pact alliances did during the Cold War, but they also set conditions for the situation in Europe to rapidly spiral out of control.

On July 28, 1914, Austria-Hungary invaded Serbia after the assassination of the Austrian Archduke in Sarajevo, in territory that Austria-Hungary had annexed in 1908. The assassins hoped to liberate Bosnia and Herzegovina from Austro-Hungarian rule. On August 8 Montenegro joined in the defense of Serbia, and on August 9 Russia, an ally of Serbia, attacked German and Austro-Hungarian positions. Meanwhile, Germany invaded Belgium, bringing France and Great Britain into the war. In the east, Russia collapsed, but in the west the two alliances stalemated. The war dragged on until the German collapse in the fall of 1918. An estimated 40 million people were killed or wounded, military and civilian, during World War I. The punitive treaty that ended the war became an underlying cause of World War II and the deaths of another 80 million people.

Fast forward again, this time to more recent decades. With the collapse of the Soviet Union, Ukraine and Russia became independent countries, with the former Soviet Black Sea naval base now located in Ukraine after Crimea was administratively transferred from Russia to Ukraine in the 1950s. In 2014, a Ukrainian government allied with Russia was replaced by a westward leaning government, and Russia seized Crimea and its warm water naval base in violation of international agreements established after World War II protecting the territorial integrity of nations. In response, western nations placed economic sanctions on Russia, and NATO expanded eastward and considered admitting Ukraine into the alliance. Russia responded by invading Ukraine with the goals of putting a friendly government into power there and annexing territories on the Russian-Ukrainian border. The invasion stalled when NATO, including the United States, armed the Ukrainian military with modern weaponry more sophisticated than that used by Russian forces. It is now a war between NATO and Russia, although still a limited war, not just a war between Ukraine and Russia.

Ukrainian President Volodymyr Zelensky continually pressures NATO and the United States to provide Ukraine with more advanced weaponry. NATO has already agreed to deliver tanks, anti-missile systems, drones, and rockets, but Zelensky wants fighter jets that would allow Ukraine to shift from a defensive war to an offensive one, attacking targets deep inside Russia.

The United States and NATO face a serious dilemma. They are committed to supporting Ukraine and preserving its territorial integrity, but Zelensky is demanding that Russia return all occupied territory, including Crimea, and pay reparations to rebuild areas of Ukraine destroyed by the Russian invasion, demands that Russia will never accept. Russia will not return Crimea to Ukraine, just as the United States will never return California to Mexico.

If NATO and the United States deliver jet fighters and Ukraine uses them to attack Russian targets, including cities, the world faces an escalating domino effect similar to the one that started World War I and led to World War II. That is why, as a historian, I am deeply worried about events playing out in Ukraine. The only peaceful resolution I see is Ukraine agreeing to accept Russian control over Crimea and some of the disputed border areas in exchange for the NATO alliance rebuilding areas destroyed by the war. NATO and Russia will then have to resolve their own differences, but I am not hopeful they will find an amicable solution.

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185766 https://historynewsnetwork.org/article/185766 0
The Roundup Top Ten for May 25, 2023

Why Historians Love Comparing Themselves to the Great Detectives

by Carolyn Eastman

The best point of comparison between Holmes and the historian isn't in solving the case but in the struggle to make sense of the facts. 


Hollywood Strikers Carry the Legacy of Ned Ludd

by Gavin Mueller

Our techno-utopian society holds the Luddites in low regard, but their actual history helps explain what's at stake in the screenwriters' strike and any labor conflict where new technology threatens workers' livelihoods. 



Republican Push for More Capital Punishment Echoes Crime Panic of the 1980s

by Duncan Hosie

The Supreme Court decision in 1976 that allowed the states to resume executions coincided with a rise in anxiety over crime and pushed politicians to pledge more executions. 



After Dobbs, Abortion Politics are Straining the Republican Coalition

by Daniel K. Williams

When the party could focus on appointing anti-Roe judges, the Republicans could make abortion a political issue without having to decide matters of policy that inevitably leave parts of their coalition angry and disappointed. Have they lost by winning? 



"Return to Rigor" Isn't the Answer to Restoring Student Engagement

by Kevin Gannon

A post-COVID reaction to the improvisations made on grades, schedules and deadlines supposes that students are suffering from too much flexibility, but a singular focus on rigor won't address the causes of disengagement. 



How to Fight Back Against the Right's "Parents' Rights" Moral Panic

by Jennifer Berkshire

Parents' fears about losing control over their children have been the raw material for potent politically-motivated moral panics for a century and more. But those panics aren't irresistible, because parents everywhere still value public schools as democratic community institutions.  



Trump and DeSantis: Two Peas in a White Nationalist Pod

by Clarence Lusane

Any Republican candidate will need to lean in to the politics of white Christian nationalism ascendant on the right; Trump has needed the MAGA movement as much as it's needed him. 



"Salts" are Part of Labor's Fight to Organize. They were once Part of the Antiwar Movement

by Derek Seidman

Taking a job with the covert intention of organizing the workplace is a time-honored labor tactic that's back in the news. Some dedicated activists in the 1960s "salted" the U.S. military in the hopes of building an antiwar movement within the ranks. 



Coca Cola Can't Go Green While Selling Drinks Cold

by Bart Elmore

If the worldwide beverage giant wants to reduce its carbon footprint, it's time for it to reverse its historical commitment to make its drinks available cold—in electric coolers—across the globe.



The Writers' Strike Opens Old Wounds

by Kate Fortmueller

The plot of each sequel of negotiations between the producers and writers has followed a formula of compromise for mutual self-preservation. Technological advances have convinced studio heads that they no longer need the labor of writers enough to keep compromising. 


Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185763 https://historynewsnetwork.org/article/185763 0
Texas Judge Revives Anthony Comstock's Crusade Against Reproductive Freedom




In April, a Texas judge ruled invalid the Food and Drug Administration’s approval of a pill used in over half the abortions in America.  Going further, he invoked the federal Comstock Act to declare it “nonmailable.” Twenty Republican Attorneys General promptly warned pharmacy chains to halt its sale.  Such sales would violate a law initiated 150 years ago by a Connecticut farm boy turned dry goods salesman beginning his battle against reproductive rights.


From an early age, Anthony Comstock showed his moralistic zeal.  At eighteen, he broke into a gin mill near his family’s farm and drained the liquor onto the floor. Enlisting after Gettysburg, he fought his fellow soldiers’ vices – liquor, lust, swearing, breaking the Sabbath – as vigorously as the Confederates.  Moving to New York, he futilely tried to jail a smut dealer loaning obscene books to schoolboys.


The “hydra-headed monster” of smut is where he made his first big kill.  On March 2, 1872, he and a police captain raided booksellers along Manhattan’s Nassau Street, the heart of America’s smut industry.  In one shop, he purchased The Confessions of a Voluptuous Young Lady of High Rank. In others, he bought Women’s Rights Convention and La Rose d’Amour.  Evidence in hand, the pair secured warrants from a judge who agreed the books were obscene.  Returning to Nassau, they arrested eight culprits and confiscated five bushels of obscene merchandise.

Later that month, Comstock targeted a crime catering more to women, one he considered an immeasurably greater evil.  Smut merely inspired lust.  This crime enabled it.  His specific target was a man, Dr. Charles Manches.  But the services Manches offered helped women overcome the safeguards God had built to control their passions:  the fear that could make a woman on the brink stop and preserve her chastity.


Manches advertised his “French Imported Male Safes” as “a perfect shield against disease or conception.”  For ladies wishing to take matters into their own hands, he offered “Ladies Protectors,” commonly known as womb veils.  If those devices failed to prevent pregnancy, he promised “Ladies Cured at One Interview, with or without medicine, $5.”  He was one of over a hundred abortionists in the city, according to the New York Times.


With support from the YMCA, Comstock continued his raids.  By mid-year, he had eight smut cases pending in New York courts.  But prosecutors continually requested postponements.  When one case finally proceeded, the defense didn’t contest Comstock’s testimony.  It simply argued the material confiscated was no more obscene than passages in the bible.  The argument wasn’t convincing.  Ten jurors voted to convict.  But the two who didn’t meant the defendant walked.  That proved the best outcome of his pending cases.


Frustrated under state law, Comstock changed tactics.  Seven years earlier, Congress had banned obscenity from first class mail.  The law was weak, narrowly defining obscenity and prohibiting postmasters from unsealing mail even if they knew a piece contained it.  Prosecutions had barely hit half a dozen.


Comstock began ordering smut by mail.  After receiving obscene goods, he obtained warrants in US Circuit Court.  Four dealers were convicted and sentenced to one year in jail and $500 fines – too lenient for Comstock, but the maximum the law allowed.


Raiding one dealer’s medical associate, he discovered the doctor’s teenage patient awaiting his third attempt to abort her fetus.  But abortion was a state crime.  A district attorney killed that case.


Dissatisfied, Comstock outlined ideas for a tougher federal law to Morris Jesup, the YMCA’s President.  Jesup got US Supreme Court Justice William Strong to finalize a bill for Congress.  In February 1873, Comstock visited the US Capitol to exhibit obscenities – books, sex toys, rubber goods.  Attending senators declared they would accept any bill he wanted so long as it was constitutional.  They could pass it before the current session closed for President Grant’s second inauguration March 4.


New York Congressman Clinton Merriam introduced the bill in the House, expecting to pass it quickly under a suspension of the rules.  Connecticut Senator William Buckingham followed in the Senate.


An optimistic Comstock got a head start on enforcement.  On Treasury Department letterhead, he contacted nine suspicious doctors.  “I am an employee of the Treasury,” he wrote under the pseudonym Anna M. Ray, “I was seduced about four months ago, and I am now three months gone in the family way.”  “Anna” begged each doctor to send something to relieve her condition.  “For God’s sake do not disappoint a poor ruined and forsaken girl whose only relief will be suicide if you fail me.”


The optimism was premature.  With resisting legislators invoking rules and demanding changes, weeks passed.  On Saturday evening, March 1, the House met for its final session.  Comstock watched.  At midnight, unwilling to break the Sabbath, he gave up.  Leaving the Capitol, he spent a sleepless night too depressed even to pray.  Not until dawn could he accept the failure as God’s will. Only when he ran into the Senate’s chaplain did he learn the news.  “Your bill passed the House at two o’clock this morning,” the chaplain said.  It was immediately sent to the Senate and passed.  President Grant signed it the next day.


His bill launched Comstock’s four-decade career fighting smut dealers, abortionists, birth control advocates, artists, playwrights, and poets.  Its opening section foretold his war on reproductive rights, explicitly banning anything – device, medicine, tool, information, advertising – “for the prevention of conception” or “for causing unlawful abortion.”


Women bookended that career.  As he was pushing his bill in Congress, Comstock indicted “Free Lover” Victoria Woodhull and her sister Tennie Claflin for publishing an obscene article exposing the adultery of Reverend Henry Ward Beecher.  While the article might have been libelous were it not true, it wasn’t obscene.  But Comstock guessed the arrests would be a publicity coup that would help his bill pass.  After a year of harassment, the sisters were acquitted.


Under his bill, Comstock quickly attacked abortionists—twelve in Chicago, seventeen in New York.  But Chicago judges imposed trivial fines. In New York only three served serious time.  Through 1875, Comstock claimed 49 abortion arrests with 39 convictions, but even he acknowledged the difficulty of bringing the practitioners to justice.  In 1878, he achieved one notable feat.  He entrapped New York’s notorious abortionist Madame Restell, driving her to suicide.  “A Bloody ending to a bloody life,” he noted without remorse.


Months later, Comstock entrapped Dr. Sara Case.  She supplied syringes for post-coital douching with substances like vinegar and carbolic acid to prevent conception.  As their battle played out in the press, Case renamed her device the “Comstock Syringe.”  Sales soared.


The list went on until Comstock closed his career arresting birth control advocate Margaret Sanger.  She fled to Europe to escape his clutches.  Comstock resorted to convicting her estranged husband for handing out a birth control pamphlet.


Of course the women he attacked directly were not the only victims of Comstock’s fight against reproductive rights.  Others were the desperate women forced to bear children, no matter the risks to their health, their inability to support another baby, or simply satisfaction with the family they already had.


With the Texas judge’s decision stayed and appeals underway, the battle over reproductive rights continues in Anthony Comstock’s shadow.




Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185707 https://historynewsnetwork.org/article/185707 0
Forget "Finding Forrester"—Our Best Teaching Can Be Ordinary

Plato and Aristotle, detail from The School of Athens by Raphael (1509–1510), fresco in the Apostolic Palace, Vatican City.



Every few years there is a movie about a gifted young person striving to reach their potential and being formatively shaped by a teacher or mentor. Finding Forrester is a classic in this genre. The main character, Jamal, is a gifted young writer who meets a famous but reclusive novelist, William Forrester, who helps Jamal improve by challenging him and not being overly easy with the praise. In Whiplash, Miles Teller plays a gifted young drummer named Andrew Neiman whose music teacher, Terence Fletcher, is determined to draw out his genius. Fletcher’s approach is abusive and even somewhat insane. But Andrew wants so badly to be a musical legend on the level of Charlie Parker that he practices until his hands bleed and he endures the abuse.


Though university level instruction should not involve the abusive behavior we see in Whiplash, and we probably have to be more orthodox in our teaching than an old novelist eating soup and pecking at a typewriter, we sometimes dream of working with the kind of student pictured in those films. This would be a young person who has a natural gift and an unnatural drive to succeed. They want to be challenged. When you push them, they keep getting better. They go on to achieve remarkable things. You get to help launch them into the stratosphere.


In reality, very few students are going to resemble the characters in these movies. Some of your students aren’t interested in your class. Some are trying to decide if they are interested. Some are interested, but have other priorities. Some want to get better at whatever your discipline is, but do not believe that your course is part of their hero’s journey. Not everyone is going to read your comments on their paper. Not all who do will take the comments to heart. A few of your students will cheat on class assignments. Some of your students will certainly go on to greatness and many have significant abilities, but most of your students will not practice until their hands bleed.


There aren’t a lot of movies about doing an excellent job with normal students and getting normal outcomes. However, if it’s true that the process is more important than the product, those movies are missing something anyway. There’s quite a bit of true excellence in teaching that never gets associated with students who go on to win Nobel prizes or become MacArthur Fellows. Exceptional outcomes are not the only measure of excellence in teaching. An excellent teacher can teach all kinds of students. You can do meaningful work and inspire people without becoming the backstory of the next Stand and Deliver.


In films with bright students, those students arrive with the passion. Jamal is already a writer before he finds Forrester. Andrew Neiman has aspirations in the opening sequence. In real life, some college students are still searching for their passion. Some of them need that flame to be nourished. Even those with significant gifts are not always a half step from legendary excellence. Sometimes the role of the excellent teacher is an introduction to a subject or guiding the first steps along the path of whatever it is that a student is pursuing. Sometimes what you impart is not even a passion for your own subject.


A lot of the wise mentors in movies are set in their ways and have a pretty fixed and cantankerous approach to instruction. That may not slow down a gifted student who cannot be deterred from learning, but, even then, it may not be the actual best approach. Teaching excellence does not always take the form of pushing students to the extreme limits of their abilities. All students need to be challenged, but not all in extreme ways. Some also need to be encouraged. Struggle can help with growth, but sometimes students are struggling with things that are more important than our classes and don’t need provocatively difficult assignments to learn to push themselves in life. That doesn’t mean that every semester, every course, has to be catered to each individual student, or that everything should be easy, but it does mean that good teaching is much more than setting the bar at the correct height and then noting who makes it over and who doesn’t. There is a real art to setting meaningful but realistic expectations for students and ourselves.


One very unhelpful thing about films with amazing students is that they give us a distorted sense of impact. A good teacher’s legacy is not built on the genius of a single student helped along the way. A good teacher’s legacy includes people who became slightly better writers, casual readers of history, more critical viewers of documentaries, more knowledgeable citizens, and even people who just got better at passing college classes. A good legacy may even include helping direct a student to a better major for them. A good legacy is built on hundreds, thousands of recommendation letters, for all kinds of positions with varying degrees of prestige.


The reclusive novelist in Finding Forrester is roughly modeled on J.D. Salinger. Interestingly, Salinger’s novel Franny & Zooey has a relevant passage. Franny is a college student experiencing a kind of breakdown, and is judging her peers and professors along the way. Though they are part of the Glass family, full of child geniuses, her brother Zooey suggests that she is not necessarily flexing her intellect as much as she is being snobbish. Both had been precocious kids on a radio quiz show and Zooey reminds his sister that their older brother Seymour always encouraged them to do their best for the “Fat Lady”—to do their best for some unknown woman in the audience that they imagined as really deserving and really listening. Zooey even shined his shoes, for the radio program, for the “Fat Lady.” He tells his sister:


“I don’t care where any actor acts. It can be in summer stock, it can be over a radio, it can be over television, it can be in a goddam Broadway theatre, complete with the most fashionable, most well-fed, most sunburned-looking audience you can imagine. But I’ll tell you a terrible secret—Are you listening to me? There isn’t anyone out there who isn’t Seymour’s Fat Lady. That includes your Professor Tupper, buddy. And all his goddam cousins by the dozens. There isn’t anyone anywhere that isn’t Seymour’s Fat Lady. Don’t you know that? Don’t you know that goddam secret yet? And don’t you know—listen to me, now—don’t you know who that Fat Lady really is?... Ah, buddy. It’s Christ Himself. Christ Himself, buddy.”


There are days it feels like we are doing the Broadway equivalent of teaching—students seem to be lighting up, they’re going on to bigger and better things, they’re asking for outside reading recommendations. It is easy to feel inspired. But there are days we are off-off-Broadway—monitoring low grades and repeating ourselves in class. It is our job to see all of our students as significant, whether or not they seem special to us when we first meet them. Even if they would rather be texting, it is our job to be teaching to the best of our abilities.


Excellence in teaching is in meeting the challenge of real-life classrooms, filled with students of all abilities, and resulting in all kinds of outcomes. Excellent teaching is not just about throwing down challenges to push great students on to more greatness. We don’t work on a film set, we work in a university classroom. We are great when we are consistently excellent, whether or not our students are famous or we are experiencing moments that have the feel of movie magic.   

Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185704 https://historynewsnetwork.org/article/185704 0
Stronger Global Governance is the Only Way to a World Free of Nuclear Weapons

Some of the 800 members of Women Strike for Peace who marched at United Nations headquarters in Manhattan to demand UN mediation of the 1962 Cuban Missile Crisis


It should come as no surprise that the world is currently facing an existential nuclear danger.  In fact, it has been caught up in that danger since 1945, when atomic bombs were used to annihilate the populations of Hiroshima and Nagasaki.

Today, however, the danger of a nuclear holocaust is probably greater than in the past.  There are now nine nuclear powers―the United States, Russia, Britain, France, China, Israel, India, Pakistan, and North Korea―and they are currently engaged in a new nuclear arms race, building ever more efficient weapons of mass destruction.  The latest entry in their nuclear scramble, the hypersonic missile, travels at more than five times the speed of sound and is adept at evading missile defense systems. 

Furthermore, these nuclear-armed powers engage in military confrontations with one another―Russia with the United States, Britain, and France over the fate of Ukraine, India with Pakistan over territorial disputes, and China with the United States over control of Taiwan and the South China Sea―and on occasion issue public threats of nuclear war against other nuclear nations.  In recent years, Vladimir Putin, Donald Trump, and Kim Jong-Un have also publicly threatened non-nuclear nations with nuclear destruction.

Little wonder that in January 2023 the editors of the Bulletin of the Atomic Scientists set the hands of their famous “Doomsday Clock” at 90 seconds before midnight, the most dangerous setting since its creation in 1947.

Until fairly recently this march to Armageddon was disrupted, for people around the world found nuclear war a very unappealing prospect.  A massive nuclear disarmament campaign developed in many countries and, gradually, began to force governments to temper their nuclear ambitions.  The results were banning nuclear testing, curbing nuclear proliferation, limiting development of some kinds of nuclear weapons, and fostering substantial nuclear disarmament.  From the 1980s to today the number of nuclear weapons in the world sharply decreased, from 70,000 to roughly 13,000.  And with nuclear weapons stigmatized, nuclear war was averted.

But successes in rolling back the nuclear menace undermined the popular struggle against it, while proponents of nuclear weapons seized the opportunity to reassert their priorities.  Consequently, a new nuclear arms race gradually got underway.

Even so, a nuclear-free world remains possible.  Although an inflamed nationalism and the excessive power of military contractors are likely to continue bolstering the drive to acquire, brandish, and use nuclear weapons, there is a route out of the world’s nuclear nightmare.

We can begin uncovering this route to a safer, saner world when we recognize that a great many people and governments cling to nuclear weapons because of their desire for national security.  After all, it has been and remains a dangerous world, and for thousands of years nations (and before the existence of nations, rival territories) have protected themselves from aggression by wielding military might.

The United Nations, of course, was created in the aftermath of the vast devastation of World War II in the hope of providing international security.  But, as history has demonstrated, it is not strong enough to do the job―largely because the “great powers,” fearing that significant power in the hands of the international organization would diminish their own influence in world affairs, have deliberately kept the world organization weak.  Thus, for example, the UN Security Council, which is officially in charge of maintaining international security, is frequently blocked from taking action by a veto cast by one of its five powerful, permanent members.

But what if global governance were strengthened to the extent that it could provide national security?  What if the United Nations were transformed from a loose confederation of nations into a genuine federation of nations, enabled thereby to create binding international law, prevent international aggression, and guarantee treaty commitments, including commitments for nuclear disarmament? 

Nuclear weapons, like other weapons of mass destruction, have emerged in the context of unrestrained international conflict.  But with national security guaranteed, many policymakers and most people around the world would conclude that nuclear weapons, which they already knew were immensely dangerous, had also become unnecessary.

Aside from undermining the national security rationale for building and maintaining nuclear weapons, a stronger United Nations would have the legitimacy and power to ensure their abolition.  No longer would nations be able to disregard international agreements they didn’t like.  Instead, nuclear disarmament legislation, once adopted by the federation’s legislature, would be enforced by the federation.  Under this legislation, the federation would presumably have the authority to inspect nuclear facilities, block the development of new nuclear weapons, and reduce and eliminate nuclear stockpiles.

The relative weakness of the current United Nations in enforcing nuclear disarmament is illustrated by the status of the UN Treaty on the Prohibition of Nuclear Weapons.  Voted for by 122 nations at a UN conference in 2017, the treaty bans producing, testing, acquiring, possessing, stockpiling, transferring, and using or threatening the use of nuclear weapons.  Although the treaty officially went into force in 2021, it is only binding on nations that have decided to become parties to it.  Thus far, that does not include any of the nuclear armed nations.  As a result, the treaty currently has more moral than practical effect in securing nuclear disarmament.

If comparable legislation were adopted by a world federation, however, participating in a disarmament process would no longer be voluntary, for the legislation would be binding on all nations.  Furthermore, the law’s universal applicability would not only lead to worldwide disarmament, but offset fears that nations complying with its provisions would one day be attacked by nations that refused to abide by it.

In this fashion, enhanced global governance could finally end the menace of worldwide nuclear annihilation that has haunted humanity since 1945.  What remains to be determined is if nations are ready to unite in the interest of human survival.





Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185705 https://historynewsnetwork.org/article/185705 0
AI the Latest Instance of our Capacity for Innovation Outstripping our Capacity for Ethics

The eagerness with which movie and television studios have proposed to use artificial intelligence to write content collides with the concern of Writers Guild members for their employment security and pay in the latest episode of technological innovation running ahead of ethical deliberation. 




Regarding modern technology, the psychologist Steven Pinker and the economist/environmentalist E. F. Schumacher have expressed opposite opinions. In his Enlightenment Now: The Case for Reason, Science, Humanism, and Progress (2018), the former is full of optimism--e.g., “technology is our best hope of cheating death”--but many decades earlier Schumacher stated that it was “the greatest destructive force in modern society.” And he warned, “Whatever becomes technologically possible . . . must be done. Society must adapt itself to it. The question whether or not it does any good is ruled out.”


Now, in 2023, looking over all the technological developments of the last century, I think Schumacher’s assessment was more accurate. I base this judgment on recent developments in spyware and Artificial Intelligence (AI). They have joined the ranks of nuclear weapons, our continuing climate crisis, and social media in inclining me to doubt humans’ ability to control the Frankensteinian monsters they have created. The remainder of this essay will indicate why I have made this judgment.


Before taking up the specific modern technological developments mentioned above, our main failing can be stated: The structures that we have developed to manage technology are woefully inadequate. We have possessed neither the values nor wisdom necessary to do so. Several quotes reinforce this point.


One is General Omar Bradley’s: "Ours is a world of nuclear giants and ethical infants. If we continue to develop our technology without wisdom or prudence, our servant may prove to be our executioner."


More recently, psychologist and futurist Tom Lombardo has observed that “the overriding goal” of technology has often been “to make money . . . without much consideration given to other possible values or consequences.”


Finally, the following words of Schumacher are still relevant:

“The exclusion of wisdom from economics, science, and technology was something which we could perhaps get away with for a little while, as long as we were relatively unsuccessful; but now that we have become very successful, the problem of spiritual and moral truth moves into the central position. . . . Ever-bigger machines, entailing ever-bigger concentrations of economic power and exerting ever-greater violence against the environment, do not represent progress: they are a denial of wisdom. Wisdom demands a new orientation of science and technology towards the organic, the gentle, the nonviolent, the elegant and beautiful.”


“Woefully inadequate” structures to oversee technological developments. How so? Some 200 governments are responsible for overseeing such changes in their countries. In capitalist countries, technological advances often come from individuals or corporations interested in earning profits--or sometimes from governments sponsoring research for military reasons. In countries where some form of capitalism is not dominant, what determines technological advancements? Military needs? The whims of authoritarian rulers or elites? Show me a significant country where the advancement of the common good is seriously considered when contemplating new technology.


Two main failings leap out at us. The first, Schumacher observed a half century ago--capitalism’s emphasis on profits rather than wisdom. Secondly--and it’s connected with a lack of wisdom--too many “bad guys,” leaders like Hitler, Stalin, Putin, and Trump, have had tremendous power yet poor values.


Now, however, on to the five specific technological developments mentioned above. First, nuclear weapons. From the bombings of Hiroshima and Nagasaki in 1945 until the Cuban Missile Crisis in 1962, concerns about the unleashing of a nuclear holocaust topped our list of possible technological catastrophes. In 1947, the Bulletin of the Atomic Scientists established its Doomsday Clock, “a design that warns the public about how close we are to destroying our world with dangerous technologies of our own making.” The scientists set the clock at seven minutes to midnight. “Since then the Bulletin has reset the minute hand on the Doomsday Clock 25 times,” most recently in January of this year when it was moved to 90 seconds to midnight--“the closest to global catastrophe it has ever been.” Why the move forward? “Largely (though not exclusively) because of the mounting dangers of the war in Ukraine.”


Second, our continuing climate crisis. It has been ongoing now for at least four decades. The first edition (1983) of The Twentieth Century: A Brief Global History noted that “the increased burning of fossil fuels might cause an increase in global temperatures, thereby possibly melting the polar ice caps, and flooding low-lying parts of the world.” The third edition (1990) expanded the treatment by mentioning that by 1988 scientists “concluded that the problem was much worse than they had earlier thought. . . . They claimed that the increased burning of fossil fuels like coal and petroleum was likely to cause an increase in global temperatures, possibly melting the polar ice caps, changing crop yields, and flooding low-lying parts of the world.” Since then the situation has only grown worse.


Third, the effects of social media. Four years ago I quoted historian Jill Lepore’s highly praised These Truths: A History of the United States (2018): “Hiroshima marked the beginning of a new and differently unstable political era, in which technological change wildly outpaced the human capacity for moral reckoning.” She observed that by the 1990s “targeted political messaging through emerging technologies” was contributing to “a more atomized and enraged electorate.” In addition, social media, expanded by smartphones, “provided a breeding ground for fanaticism, authoritarianism, and nihilism.”


Moreover, the Internet was “easily manipulated, not least by foreign agents. . . . Its unintended economic and political consequences were often dire.” The Internet also contributed to widening economic inequalities and a more “disconnected and distraught” world. Internet information was “uneven, unreliable,” and often unrestrained by any type of editing or fact-checking. The Internet left news-seekers “brutally constrained,” and “blogging, posting, and tweeting, artifacts of a new culture of narcissism,” became commonplace. So, too, did Internet-related companies that fed people only what they wanted to see and hear. Further, social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right. . . . The ties to timeless truths that held the nation together, faded to ethereal invisibility.”

Similar comments came from the brilliant and humane neurologist Oliver Sacks, who shortly before his death in 2015 stated that people were developing “no immunity to the seductions of digital life” and that “what we are seeing—and bringing on ourselves—resembles a neurological catastrophe on a gigantic scale.” 

Fourth, spyware. Fortunately, in the USA and many other countries, independent media still exist. Such media are not faultless, but they are invaluable in bringing us truths that would otherwise be concealed. PBS is one such example.

Two of the programs it produces, the PBS Newshour and Frontline, have helped expose how insidious spyware has become. In different countries, its targets have included journalists, activists, and dissidents. According to an expert on the Newshour,

“The use of spyware has really exploded over the last decade. One minute, you have the most up-to-date iPhone, it's clean, sitting on your bedside table, and then, the next minute, it's vacuuming up information and sending it over to some security agency on the other side of the planet.”

The Israeli company NSO Group has produced one lucrative type of spyware called Pegasus. According to Frontline, it “was designed to infect phones like iPhones or Androids. And once in the phone, it can extract and access everything from the device: the phone books, geolocation, the messages, the photos, even the encrypted messages sent by Signal or WhatsApp. It can even access the microphone or the camera of your phone remotely.” Frontline quotes one journalist, Dana Priest of The Washington Post, as stating, “This technology, it's so far ahead of government regulation and even of public understanding of what's happening out there.”

The fifth and final technological development to consider is Artificial Intelligence (AI). During the past year, media has been agog with articles on it. Several months ago on this website I expressed doubts that any forces will be able to limit the development and sale of a product that makes money, even if it ultimately harms the common good. 

More recently (this month) the PBS Newshour again provided a public service when it conducted two interviews on AI. The first was with “Geoffrey Hinton, one of the leading voices in the field of AI,” who “announced he was quitting Google over his worries about what AI could eventually lead to if unchecked.”

Hinton told the interviewer (Geoff Bennett) that “we're entering a time of great uncertainty, where we're dealing with kinds of things we have never dealt with before.” He recognized various risks posed by AI such as misinformation, fraud, and discrimination, but there was one that he especially wanted to highlight: “the risk of super intelligent AI taking over control from people.” It was “advancing far more quickly than governments and societies can keep pace with.” While AI was leaping “forward every few months,” needed restraining legislation and international treaties could take years.

He also stated that because AI is “much smarter than us, and because it's trained from everything people ever do . . . it knows a lot about how to manipulate people,” and “it might start manipulating us into giving it more power, and we might not have a clue what's going on.” In addition, “many of the organizations developing this technology are defense departments.” And such departments “don't necessarily want to build in, be nice to people, as the first rule. Some defense departments would like to build in, kill people of a particular kind.”

Yet, despite his fears, Hinton thinks it would be a “big mistake to stop developing” AI. For “it's going to be tremendously useful in medicine. . . . You can make better nanotechnology for solar panels. You can predict floods. You can predict earthquakes. You can do tremendous good with this.”

What he would like to see is equal resources put into both developing AI and “figuring out how to keep it under control and how to minimize bad side effects of it.” He thinks “it's an area in which we can actually have international collaboration, because the machines taking over is a threat for everybody.”

The second PBS interview on AI in May was with Gary Marcus, another leading voice in the field. He also perceived many possible dangers ahead and advocated international controls.

Such efforts are admirable, but are the hopes for controls realistic? Looking back over the past century, I am more inclined to agree with General Omar Bradley--we have developed “our technology without wisdom or prudence,” and we are “ethical infants.”

In the USA, we are troubled by divisive political polarization; neither of the leading candidates for president in 2024 has majority support in the polls; and Congress and the Supreme Court are disdained by most people. Our educational systems are little concerned with stimulating thinking about wisdom or values. If not from the USA, from where else might global leadership come? From Russia? From China? From India? From Europe? From the UN? The past century offers little hope that it would spring from any of these sources.

But both Hinton and Marcus were hopeful in their PBS interviews, and just because past efforts to control technology for human betterment were generally unsuccessful does not mean we should give up. Great leaders like Abraham Lincoln, Franklin Roosevelt, and Nelson Mandela did not despair even in their nations’ darkest hours. Like them, we too must hope for--and more importantly work toward--a better future.


John de Graaf on his Powerful Documentary on Stewart Udall, Conservation, and the True Ends of Politics

John de Graaf and Stewart Udall


We have, I fear, confused power with greatness.—Stewart Udall


Stewart Udall (1920-2010) may be the most effective environmentalist in our history, considering his monumental accomplishments in protecting and preserving the environment and improving the quality of life for all citizens. Unfortunately, his tireless efforts for conservation and environmental protection and his gifts as a leader are not well known to the wider public today. His life offers inspiration and a model for, among others, public servants and citizen activists.

As the Secretary of the Interior from 1961 to 1969 under Presidents John F. Kennedy and Lyndon Baines Johnson, Udall took the department in new directions as he crafted some of the most significant environmental policies and legislation in our history. With his talent for forging bipartisan alliances, he spearheaded the enactment of major environmental laws such as the Clean Air, Water Quality, and Clean Water Restoration Acts, the Wilderness Act of 1964, the Endangered Species Preservation Act of 1966, the Land and Water Conservation Fund Act of 1965, the National Trail System Act of 1968, and the Wild and Scenic Rivers Act of 1968.

Secretary Udall also led in expanding federal lands, establishing four national parks, six national monuments, eight national seashores and lakeshores, nine national recreation areas, 20 national historic sites, and 56 national wildlife refuges, including Canyonlands National Park in Utah, North Cascades National Park in Washington, and Redwood National Park in California. A lifelong advocate for civil rights, Udall also desegregated the National Park Service.

After his term as Secretary of the Interior, Udall continued to work for decades as an attorney advancing environmental protection, worker health and safety, human rights, tolerance, Indigenous rights, racial equality, and justice.

Despite his many achievements, Udall seems to have faded from memory and most people today know little of his monumental legacy. His name doesn’t usually leap to mind when considering the great leaders on the environment and human rights.

To remind us of Udall’s remarkable life and legacy, acclaimed filmmaker and activist John de Graaf created a new documentary, Stewart Udall, The Politics of Beauty (The film is available through Bullfrog Communities: www.bullfrogcommunities.com/stewartudall).

The film charts the trajectory of Udall’s life as it introduces viewers to a history of the origins of the modern environmental movement. There’s the journey from Udall’s childhood in Arizona, his schooling, and his World War II combat duty, to his commitment to public service, his terms in Congress, and his achievements as Secretary of the Interior. The film further recounts his later life as a zealous attorney, author, and voice for beauty, simplicity, and peace as he warned about climate change, health hazards, rampant consumerism, and the dangers of polarization and extreme partisanship. Especially engaging are interviews with Udall and his family supplemented with family films as well as scenes with JFK and Lady Bird Johnson.

The film is based on exhaustive archival research as well as interviews with historians, family members, friends and colleagues of Udall. Personal films, photographs and papers were shared with Mr. de Graaf and his team. As the life of Udall unfolds, the film provides historical context illustrated with vivid scenes from the turbulence, environmental devastation, and movements for justice and peace in the sixties and seventies. There are also stunning sequences of natural beauty from the forests, seas, deserts and other sites that Udall sought to protect.

The story of Udall’s life may provide a way forward for younger people today who are skeptical of politics and disillusioned by stasis and polarization that prevent meaningful change for a better quality of life and a more livable world. Udall’s visionary pursuit of environmental and social justice came out of his cooperative nature and his belief in democracy. May his inspiring example create hope and fire the minds of citizens today.  

Mr. de Graaf is a Seattle-based award-winning filmmaker, author, and activist. He has said that his mission is to “help create a happy, healthy and sustainable quality of life for America,” and his documentary on Stewart Udall is an aspect of that desire. He has been producing and directing documentaries for public television for more than forty years. His nearly 50 films, including 15 prime time PBS specials, have won more than 100 regional, national and international awards.

Mr. de Graaf also has written four books, including the bestselling Affluenza: The All-Consuming Epidemic. The John de Graaf Environmental Filmmaking Award, named for him, is presented annually at the Wild and Scenic Film Festival in California. He is also co-founder and president of Take Back Your Time, co-founder of the Happiness Alliance, former policy director of the Simplicity Forum, and founder of the emerging organization, And Beauty for All. 

Mr. de Graaf graciously responded to questions about his background and his Udall documentary by phone from his Seattle office.


Robin Lindley: Congratulations John on your heartfelt and vivid Stewart Udall film. I appreciate the work you do and your persistence. Every documentary film must be a long haul.

John de Graaf: Thank you. I had a team of great people to work with, so I can't take all the credit.

Robin Lindley:  Before we get to the Udall film, I wanted to give readers a sense about your background. What inspired you to work now as an activist, author and filmmaker?

John de Graaf:  I was an activist first, and that led me to do quite a bit of writing, to print reporting. And that eventually led me to do a public affairs radio show at the University of Minnesota in Duluth. Doing that, I met a character that I thought would make a great film. And then I connected with this videographer at the University of Minnesota Minneapolis, and we put a film together that was then aired on Minnesota Public Television in 1977, and the film won a major PBS award and that launched me.

Four years later I started doing freelance documentary production at Channel Nine, the PBS station in Seattle. I was there for 31 years basically, until they kicked me out in 2014, but I've continued. My film Affluenza was a big success on PBS, so I was asked to write a book by a New York agent. Then a California publisher put out the Affluenza book, and that took off like the film. It has sold nearly 200,000 copies in 10 or 11 languages internationally.

I also made a little film called What's the Economy for Anyway? and that led to another book. I also edited a book called Take Back Your Time that was connected with research and activism I was doing about overwork in America.

Robin Lindley: Congratulations on those projects aimed at exposing social justice and environmental issues and at encouraging work to improve the quality of our lives.

John de Graaf: Yes. The quantity of our stuff, or the gross national product, or world power, or any of those things should not be the goal. Instead, the aim should be about the best quality of life for people. I think all of these themes connect with that.

Robin Lindley: Thanks for your tireless efforts. The title of your new documentary is Stewart Udall, The Politics of Beauty. What do you mean by the politics of beauty? It seems that expression ties in with your interests in the environment and nature as well as your efforts to promote happiness and a better quality of life.

John de Graaf: I think there is a lot of evidence that our common, even universal, love for beauty, especially nature’s beauty, can bring us together and reduce polarization.  It’s no accident that the most bipartisan bill passed during the Trump administration was the Great American Outdoors Act.  Beautiful cities can slow us down, reduce our levels of consumption, and use of the automobile.  Parks and access to nature are a more satisfying substitute for material stuff.  The response to my film confirms this for me.  Stewart was aware of all of this.

Robin Lindley: What inspired you to make a film now about Stewart Udall, who seems to be an overlooked champion for the environment? He's not remembered in the same way as naturalist John Muir maybe, or author Rachel Carson or Sierra Club’s David Brower.

John de Graaf: Of course, John Muir was a huge figure in his time. His writing was known by everybody and he stirred such a movement but he needed political figures like Teddy Roosevelt and later, Udall, to make his dream of the National Parks come true.

Rachel Carson’s book Silent Spring was very powerful, but that's what she did and she died soon afterwards. She wasn't able to accomplish a lot without people like Udall who actually created and passed legislation. I don't mean to in any way denigrate her. She was great and Udall loved and appreciated her. He was a pallbearer at her funeral. Her book stirred a lot of interest and attention, and people like Udall got behind it, and so it had a major effect.

In terms of environmental work, David Brower was exceedingly important because he was involved in so many things including the Sierra Club. Aldo Leopold was a key figure with his impact. And there have been many, many others since then. Now you'd have to probably add Bill McKibben, Gus Speth, and people like that.

Robin Lindley: It seems, however, that Udall has been overlooked or forgotten. Was that one of the reasons you wanted to do a film about him?

John de Graaf: I was impressed years ago when I interviewed him, but I'd forgotten about him until I saw a newspaper story in 2020 that said “a hundred years ago today Stewart Udall was born.” I was struck by my memory of him, and I remembered he had given me a book, so I went to my shelf and pulled down the book he had signed for me when I interviewed him.

And then I started doing a little more research, first online and then ordering biographies of him. And I thought, what a fascinating character. I knew that he had created several national parks and some things like that, and I knew that he had stopped the Grand Canyon dams because that was what I'd interviewed him about. But I had no idea about his civil rights activity, his work for world peace, his work for the arts, and his support for victims of atomic fallout and uranium miners, and so many other things that he ended up doing. That came as a complete surprise to me, and I think made the film richer.

Robin Lindley: Udall seems a renaissance man. I didn't know much about his life, and your film was certainly illuminating. What do you think inspired him to get involved in environmental protection and then in environmental and social justice issues?

John de Graaf: Number one, he did spend a lot of time outdoors when he was a kid on a farm in Arizona and hiking in the nearby White Mountains. And he got very interested in the natural world and the beauty of the natural world when he was out hiking.

And then, he grew up in a Mormon family, but it was unusual because it was a very liberal Mormon family. His father impressed on all the kids that Mormons had been discriminated against and that's why they were in these godforsaken places in the desert. They'd been pushed out of Illinois and Missouri and other places, so they had to stand up for other people who were discriminated against, and that included especially Native Americans, because they lived in the area where he was, and Black Americans, and so forth.

And then, he fought in World War II. He flew on 52 very dangerous bombing missions. He was very lucky to come back alive and he said that he must have been allowed to live for some reason. He decided, “I really need to be involved in public service in the best way that I know how.”

When he came back, he played basketball at the University of Arizona, and he was very committed to civil rights. He and his brother Mo both joined the Tucson chapter of the NAACP right after the war. And they’d had Black friends in the military and Mo had been a lieutenant with a division of Black troops. And they both fought to integrate the University of Arizona.

And Stewart was especially interested in the environment and protecting the beauty of the West. Later, that went beyond conservation, beauty and preservation to a much wider view of ecology and the environment and pollution.  

Robin Lindley: Udall’s probably best known for his role as the Secretary of the Interior under JFK and LBJ. How did he come to be appointed the Secretary of Interior? What brought him to the attention of the Kennedy administration?

John de Graaf: He worked with Senator John Kennedy as a congressman. They worked on a number of bills together in the late fifties, and he was very impressed by Kennedy.

When Kennedy decided to run for president in 1960, Stewart got behind him. Stewart was a very influential and persuasive person in Arizona at that time, though nobody knew anything about him beyond Arizona. But he was able to convince Arizona's delegation to unanimously support Kennedy for president over Lyndon Johnson at the Democratic Convention. And Kennedy appreciated that.

Kennedy was also looking for somebody who knew something about the outdoors and somebody who was a westerner because it was traditional that the Interior Secretary be a westerner. Stewart Udall was the obvious choice for Kennedy at that time.

Robin Lindley: Did Udall have a close relationship with Kennedy during his short presidential term?

John de Graaf: I think Kennedy was distant and Stewart wanted a much closer relationship than Kennedy would allow with him, or I think with anyone else. But they were friends, of course, and Kennedy supported what Stewart was doing and Stewart supported what Kennedy was doing. He felt that Kennedy had a prodigious intellect and capacity for getting things done, but he was not a person who was easy to make friends with. Stewart was actually much better friends with Jackie, Kennedy's wife. She thought Stewart was such a gentleman and a fascinating character. She liked his personality and very much liked his wife. They were friends with his family.

Stewart didn't know how Johnson would be, but it turned out that Johnson was a much more social person than Kennedy, and much easier to be with and have a friendship with. And Johnson really loved nature and was committed to environmental protection in a stronger way than Kennedy had been. And a lot of that came from Johnson’s wife, so Stewart cultivated his friendship with Lady Bird Johnson, who adored him, according to Johnson’s daughters.

Udall convinced Lady Bird Johnson that she should make a name for herself in conservation by first doing a beautification campaign and then through various other work. Lady Bird took up that Beautify America campaign and became a great advocate for the environment.

Robin Lindley: Didn’t Lady Bird and Udall share a concern about impoverished urban areas also?

John de Graaf: It didn't start with the impoverished areas. It started with the idea of beautifying America. But Lady Bird and Lyndon Johnson loved the cities that they visited in Europe, and they felt that Washington was a derelict place-- a mess in comparison to the other capitals of the world. It was embarrassing to bring people to the United States capital.

They felt that they had to start their campaign addressing cities in Washington DC, and that justice compelled them to start in the poorest communities, which were African American communities. They decided to put money first into beautifying those areas before focusing on the neighborhoods that were already gentrified.

Robin Lindley: And that approach also ties into Udall’s interest in civil rights, which you stress in your documentary.

John de Graaf: Yes. He was very interested in promoting civil rights. One of his first discoveries as Secretary of Interior was that the Washington Redskins (now Commanders) football team wouldn't hire Black players. So, he basically gave them this ultimatum that, if they wanted to play in the National Stadium, which the Department of Interior controlled, they needed to hire Black players or Udall would not lease the stadium to them. And so, they hired Black players, and that changed football immensely. In fact, the Washington Redskins became a much better team. The Washington Post even suggested that Stewart Udall should be named NFL Coach of the Year because of what he’d done to improve the team.

Udall also discovered that the National Park Service, which he was in charge of, was segregated. They had Black rangers only in the Virgin Islands, which is primarily Black. He was determined to change that. He sent recruiters to traditionally Black colleges in order to do it.

His kids told me that he would watch the civil rights protests on television. And he would say things like “Look at those brave young people. They have so much dignity.” And these young people were getting slammed, and weren't violent. They were quite the opposite, and Stewart said, “These kids are what America should be all about.” He added, “We need kids like this in the National Park Service, and the National Park Service needs to look like America.”

Bob Stanton from Texas was one of the first Black park rangers, and he went to Grand Teton. He later became the head of the National Park Service. He's a wonderful guy and I’ve gotten to know him well. Bob's 83 now, but he has the deepest memories of all that happened and Stewart Udall's role in it.

Stewart also had to decide whether the 1963 March on Washington could happen because it was planned for the National Park areas of the Lincoln Memorial and the Washington Monument. He had to grant a permit for the march to proceed, and there was enormous pressure for him not to approve the permit that came from the Jim Crow Democratic Senators in the South who were also putting huge pressure on President Kennedy. 

The march happened, and it was huge, and its impact was huge. Stewart watched it from the sidelines, but you could see in the photos of the march that National Park rangers were standing right near Martin Luther King when he spoke.

Robin Lindley:  Thanks for sharing those comments on Udall’s support of civil rights. Didn’t he leave the Mormon Church because of its racist policies?

John de Graaf: He wasn’t a Mormon anymore by then, but he always claimed that he remained a cultural Mormon--that he believed in the Mormon ideas of public service, of community and family, and all those things. And Mormons did have a real ethic of serving the community in those days. Those communities were tight, and people worked together. And Stewart believed in that.

World War II really cost him his faith because he just couldn't accept that, if there was a God, God would allow the things to happen that he saw in the war. He became basically an agnostic but he did not reject the church, and he did not openly criticize the church until the mid-1960s when he became concerned about the church's refusal to allow Blacks in its priesthood.

Udall thought that was astounding and terrible, so he finally wrote a letter to the church saying there was a Civil Rights Movement and the position of the Mormon Church was unacceptable. The church basically wrote back and said that it might agree with Udall but it doesn’t make those decisions. God does. Until God gives a revelation to the head of the church, things must stay as they are.

Ten years later, God gave a revelation to the head of the church and they changed the policy. Stewart basically was out of the church and was not considered a Mormon, but he was never excommunicated and never really disowned in any sense by the church. In fact, some of the strongest supporters of this film today are Mormons, even though it’s clear about Udall leaving the church. Some evangelicals believe that former members are betrayers, but the Mormons don't take that position at all. In fact, they very much honor Udall. I just spoke at Utah State University, and a young Mormon woman came up to me after the screening and said she wanted to show this film. She said she was a board member of the Mormon Environmental Stewardship Association, and she added that “We're proud of Stewart Udall.” It was very positive to see that attitude.

Robin Lindley: Thanks for explaining that, John. I didn't quite understand Udall’s interactions with the Mormon Church.

John de Graaf: The church's view was that Stewart had honest reasons for rejecting policies and for leaving the church, and that was respected. And it did not make him a bad person. You had to decide whether he was a good or bad person on the basis of the deeds that he did, which seems a good attitude.

Robin Lindley:  Yes. And Stewart Udall had a special gift for working both sides of the aisle to pass legislation including many important environmental measures. Wasn’t the Congress of the 1960s far less polarized than what we see now?

John de Graaf: It was, and particularly after Kennedy's death, but there was a lot of fighting and it was hard for Stewart to move things through. He certainly had some very key Republican support, but he also had some major Democratic opposition, not only from the head of the Interior Committee, Wayne Aspinall, a Colorado Democrat, but he also had southern Democrats who hated him because of his civil rights positions.

But after Kennedy was killed, and Johnson was elected in a landslide, that brought the Congress together around the idea of LBJ’s Great Society programs and civil rights laws. And Johnson did a much better job of getting things through Congress than Kennedy. Then you saw the Land and Water Conservation Fund, and the Wilderness Act, and Endangered Species List--major bills that passed because Congress and Johnson supported them.

But some environmental laws didn’t get passed until Nixon came in, because of the huge protests on the first Earth Day in 1970. These bills were already in Congress, and Congress moved them ahead. And when Nixon was president, he had a Democratic Congress. The bills moved ahead but there was never a veto-proof majority except on a couple of bills like the Wild Rivers Act. Nixon, though, with the pressure of Earth Day and all the environmentalist sentiment at that time, signed the bills.

Nixon himself had an environmental sensibility. He was terrible on civil rights issues and the war but he was much more open about the environment. He realized the impact of pollution. He had seen the Santa Barbara oil spill, the polluted Cuyahoga River. Nixon felt comfortable in signing the act creating the Environmental Protection Agency.

Robin Lindley: Is it fair to say that Stewart Udall was the architect of the EPA’s creation?

John de Graaf: It's fair to say that he was certainly one of the main architects. He didn't do it alone. He had key people in Congress who were supporting him, but he certainly pushed hard for it. I don't know if the idea was originally his, but he was probably the first who talked about it, and he certainly played a major role in it.

Stewart was also the first political figure to speak about global warming. He heard about it from his scientific advisor, Roger Revelle.  Revelle was an oceanographer who worked with the Smithsonian and was one of the first scientists to look at how the oceans were heating up. He said we have a problem on our hands with global warming. Stewart was talking with him on a regular basis and then decided to go public with this threat.  Other politicians knew about it, but they wouldn't go public, but Stewart said this was a major problem and he predicted flooding of Miami and New York and melting of the polar ice cap. And he was talking about global warming in 1967.

Robin Lindley: That surprised me. He was so prescient.

John de Graaf: Yes. There were smart scientists, but most politicians wouldn't dare touch it, even though the signs of much of it were already there. Daniel Moynihan gave a big public speech in 1969 about global warming as a big issue. More attention was probably paid to that speech than to Stewart, because Stewart wrote about the climate in books and in articles rather than in speeches.

Robin Lindley: It was interesting that, in one of Johnson's major speeches on the Great Society, he spoke about civil rights and poverty, and he decided to add a section that Stewart had suggested on the quality of life, despite objections from some politicians.

John de Graaf: The speech was written by Richard Goodwin, the president’s speechwriter. But certainly, Goodwin had to have been reading what Stewart had written for LBJ because the language was exactly the same as much of Stewart's language.

Stewart had actually written short speeches for LBJ that had that language about quality of life and beauty. He wrote that when we lose beauty, we lose much that is meaningful in our lives.

That Great Society speech was interesting because Johnson was clearly influenced by Stewart and he agreed with his views about quality of life and nature. And Johnson told Richard Goodwin to have three themes in that speech: poverty, civil rights, and the quality of life and beauty. But then he told Goodwin to share the speech with the leaders of the House and Senate and get their opinions on it because he wanted them to like it and to support it. When Goodwin did that, he found that the Democratic leaders wanted him to take out the part about beauty and quality of life and to focus on the war on poverty and civil rights because they felt that these other things would distract from the main message that the president wanted to share.

The story is that Goodwin took those sections out of the speech and passed the speech back to LBJ who read the speech before giving it. He looked at Goodwin and he said, “What the hell happened to the stuff about quality of life?” Goodwin said, “You told me to show it to the House and Senate leaders. They said I should take it out because it was a distraction from your message.” And Johnson slammed his hand on the desk and said, “They don't write my speeches. That's just as important as the other stuff. Put that back in.” So that language on quality of life ended up being part of his incredible Great Society speech.

Robin Lindley: And I was surprised that Udall was working on a nuclear test ban treaty and was very concerned about nuclear proliferation.

John de Graaf: Yes. That was under Kennedy before the Test Ban Treaty of 1963 was signed by Kennedy and Khrushchev.

In 1962, Stewart was very concerned about nuclear war. He also had been very concerned about the dropping of the bomb on Japan. He felt, even as a soldier, that it was going beyond what he believed in. He believed that it was all right to bomb military installations but he did not believe that we should bomb civilians deliberately. He accepted that civilians would inadvertently be killed, but we should never target civilians. That was simply awful and against all notions of how we fight and against the Geneva Convention.

Udall went to the Soviet Union to discuss energy issues and he took poet Robert Frost along to meet Soviet poets like Yevtushenko because he knew that the Russians loved poetry. And at that time, Americans didn't pay much attention to it. So, he took Robert Frost, and he was able to get a meeting with Khrushchev where they discussed nuclear weapons and banning atmospheric nuclear testing, which was going on in both countries at that time.

Nothing immediately came of the talks because it was actually right before the Cuban Missile Crisis. But it apparently had some influence, because once that crisis was resolved and nuclear weapons were not used, the Russians came back to the table with Kennedy and agreed to ban atmospheric testing. They were able to do that and I think Stewart had some influence, although it's impossible to say for certain.

Robin Lindley: Thanks for that insight. Udall must have been haunted by his World War II experiences. Many veterans were.

John de Graaf: Yes. The stresses of the war pushed quite a few Mormons who served into smoking and drinking, which the Mormon Church didn't allow. But many came back smoking and drinking to relieve stress, and Stewart was certainly one of them because the war was such a tragic experience.

Robin Lindley: Didn’t Udall differ with Johnson about the war in Vietnam?

John de Graaf: Big differences. Initially Stewart shared some of the worries about the spread of communism as many people did at that time. Stewart was never really a far leftist, but he was a strong liberal and he was afraid of communism or any totalitarianism, especially after fighting the Nazis.

Initially, Udall believed that maybe we should try to stop the spread of communism and help Vietnam be democratic. But that didn't last for long. Once Johnson sent the troops and Udall started seeing what was happening to the people of Vietnam, Udall changed his mind, probably as early as late 1965. He tried to get Johnson to pull back.

And Secretary of Defense Robert McNamara was a close friend of Udall. They hiked and backpacked together. Their kids knew each other. They always liked each other very much. But McNamara's son Craig told me that he didn’t know that Stewart was so against the war until he saw my film.  He said he always liked Stewart and thought Stewart was a wonderful guy. And his dad liked him, he said, but his dad never talked about what other people thought about the war.

McNamara completely separated his work and family life, so he would not talk at home about anything going on with other cabinet members. So, McNamara's son had no idea that Stewart was so vociferously against the war, along with Nicholas Katzenbach, Johnson’s Attorney General, and a couple of others who criticized the war at cabinet meetings and to the president. Craig McNamara wrote to me saying that he wished his dad had listened to Stewart Udall.

Robin Lindley: And after Udall left his post as Secretary of the Interior, he worked as a lawyer on environmental justice and human rights issues. How do you see his life after his term as Secretary?

John de Graaf: He didn't know exactly what to do in Washington. He wanted to work as a consultant to improve cities, to make cities more livable. He became very critical of the automobile and our use of energy. He also saw racism tear our cities apart.

Stewart was looking for things to do, but it was not easy. What kept him in Washington was that he and his wife wanted to allow their kids to finish high school with their friends. After the kids were adults and off to college, the Udalls moved back to Arizona and to Phoenix. It took a while for Stewart to figure out what to do there after he’d been in a position of power and influence. He was 60 years old with so much behind him.

Robin Lindley: He practiced law after his years as Secretary of the Interior and focused on social justice and environmental issues. The film notes his work with “downwinders” who were ill from radiation as well as miners who faced work hazards. What do you see as some of his important accomplishments after he moved back to Arizona?

John de Graaf: Two things: certainly, his work for downwinders and uranium miners for more than ten years was the most significant.  Then in 1989, he moved to Santa Fe and did a lot of research and writing.  In all, he wrote nine books, the most significant being The Myths of August, an exploration of the terrible impacts of the nuclear arms race.  He loved history and several of his books are about the history of the American West.

Robin Lindley: You obviously did extensive research for the film. Can you talk about how the project evolved and some of the archival research and the interviews that surprised you? It seems that Udall’s family and colleagues were very enthusiastic and open to sharing their perceptions with you.

John de Graaf: The Udall family was wonderfully gracious and open to me.  Much of the real research had been done by Udall’s biographers so I just picked up on that.  As I talked to people, I discovered that no one would say anything negative about him; even those who disagreed with his politics had total respect for his humility and integrity.  That’s not common with political figures, especially in this polarized time.  I was especially impressed by current Interior Secretary Deb Haaland’s insistence that “the politics of beauty lives on.” And I was stunned by the paintings of Navajo artist Shonto Begay, a wonderful guy.  I use some of his paintings in the film.  I had great cooperation from the University of Arizona in finding still photos.

Robin Lindley: Congratulations John on the film and its recent warm reception at the Department of Interior with Secretary Deb Haaland, the first Native American in that role.

John de Graaf: Yes. That was a wonderful event. We had about 300 people there, and Secretary Haaland spoke and talked about Stewart.

And we are getting a very good response to the film at other screenings. My biggest concern is it's hard to get young people to come out to see it. But when they do, they like it, like the young Mormon woman who I mentioned at Utah State. And a Hispanic student at University of Arizona who is a leader of the students’ association there wants to present screenings to get students more active in politics. I think that's the way it's going to have to happen. The screenings already turn out faculty and the older community, but they don’t turn out students. But once they see it, they do respond to it. I've been very surprised at how many students come up to me afterwards and want to talk. They tell me that they never knew about any of this history. They didn't learn about it in school. We’ve also been treated very well by media.  We’ve done fairly well in festivals, though I’m disappointed that my own hometown Seattle International Film Festival didn’t take the film.

Robin Lindley: Thanks for your thoughtful comments, John, and again, congratulations on your intrepid work to create and now display this moving cinematic profile of Stewart Udall. I learned a lot, and the film brought back many memories of the sixties, those times of exuberance and turbulence. The film not only illuminates our history, but it's also inspiring. Udall’s example offers hope for our divided nation now.


Robin Lindley is a Seattle-based attorney, writer, illustrator, and features editor for the History News Network (historynewsnetwork.org). His work also has appeared in Writer’s Chronicle, Bill Moyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, visual culture, and art. Robin’s email: robinlindley@gmail.com.  



Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/blog/154745
The Roundup Top Ten for May 19, 2023

I'm Headed to Florida to Teach-In Against DeSantis's Education Policies

by Kellie Carter Jackson

On May 17, historians held a 24-hour teach-in in St. Petersburg, Florida, to protest the restrictions on curriculum, books, and ideas pushed by Governor Ron DeSantis and his allies. As a historian of abolition, the author stresses that denying people the pen may influence them to pick up the sword.


Bull Connor's Police Dogs Shocked the Nation in 1963, but they were an American Tradition

by Joshua Clark Davis

"In 1963 liberal critics condemned the Alabama city’s K-9 unit as a relic of the Old South. The harder truth to accept, however, was that it was actually a product of a new America."



MLK: Christian, Radical

by Jonathan Eig

Veneration has hollowed out Martin Luther King, Jr.'s legacy, and obscured the way that his political leadership always aimed at radical transformation of American society, argues the author of an acclaimed new biography. 



If it's Ineffective and Harmful, Why is Gay Conversion Therapy Still Around?

by Andrea Ens

Conversion therapies endure because their purpose is political, not therapeutic. They seek and symbolize the eradication of LGBTQ people from society and are promoted by groups who want that eradication to happen. 



Florida Just Banned Everything I Teach

by William Horne

Black historians during the Jim Crow era observed that the history taught in schools justified slavery, segregation, and lynching. A professor thinks that's where Ron DeSantis's vision of history is headed. Some politicians may think curriculum is a winning issue, but students and society will lose. 



Texas Shooting Highlights Long History of Anti-Black Violence in Latino Communities

by Cecilia Márquez

History shows that there have long been strains of anti-Black racism in Latino communities, and that the categories "white" and "Latino" are not mutually exclusive. Understanding today's far right requires attention to those details.



The Relevance of Common Law to Abortion Debate: How Did the Law Work in Practice?

by Katherine Bergevin, Stephanie Insley Hershinow and Manushag N. Powell

Samuel Alito's ruling in Dobbs claimed to ground itself in the English common law's treatment of pregnancy. But he focused on a small number of published treatises while ignoring the record of how the law actually treated pregnant women and fetuses. 



There's Never Been a Right Way to Read

by Adrian Johns

The intellectual work and play of reading has always competed with other demands on attention; only recently have science and commerce converged to sell remedies for distraction and proprietary methods for reading. 



China is Cutting the US Out of the Middle East with an Axis of the Sanctioned

by Juan Cole

Recent American policies have squandered an opportunity to engage productively with Iran and Saudi Arabia, and instead pushed them toward stronger economic development relationships with China.



Henry Kissinger: A War Criminal Still at Large at 100

by Greg Grandin

Henry Kissinger was instrumental in Nixon's decision to undertake the illegal bombing of Cambodia. His foreign policy machinations also led him to push Nixon to the actions that led to Watergate and the president's downfall, though Kissinger has remained unaccountable. 


Wed, 07 Jun 2023 19:44:10 +0000 https://historynewsnetwork.org/article/185702