Transgender Youth Have Doubters. They Also Have History

Politicians and pundits decrying the “trend” of transgender children currently dominate American airwaves. From Matt Walsh’s documentary What is a Woman? to proponents of state restrictions on gender-affirming healthcare, political and cultural debates in 2022 consistently entertain the argument that trans children are coming out of the blue, with no historical precedent. These talking points are not new; historically speaking, neither are trans youth.

On April 12, 2022, the Los Angeles Times published an article in its "World and Nation" section (not the Opinion section) interviewing the ex-president of the World Professional Association for Transgender Health (WPATH), Dr. Erica Anderson. In her remarks, Anderson argued that increasing rates of transition-related medical care—hormone replacement therapy, gender-affirming surgery, and psychological care—are a product of the "popularity of alternative identities" and "peer pressure." As Anderson, herself a California-based psychologist and trans woman, states, "I think it's gone too far [...] A fair number of kids are getting into it because it's trendy."

Anderson is not the first (or only) medical practitioner, let alone trans one, to critique this phenomenon, referred to in viral Twitter threads and TikTok videos as "transtrendiness." But arguments like these aren't just fueling medical gatekeeping; they also miss the historical point. Anderson's claim that trans numbers today represent an "increase" demonstrates the holes in her trans history. To argue that trans youth rates are "on the rise" would require that prior rates had not been artificially depressed to an unknowable degree. "On the rise" is a comparison to history—a comparison that ignores the trans children whose existences can be charted both within and outside of medical records.

As trans historian Jules Gill-Peterson details in the 2018 monograph Histories of the Transgender Child, trans youth, and trans populations in general, did not suddenly start existing en masse in the 1980s. Sixty years before Anderson's interview, a young trans woman named Agnes stole her mother's estrogen tablets to prove to the University of California-Los Angeles's Gender and Sexuality Clinic that she would do anything for treatment—a story detailed in this year's Sundance documentary Framing Agnes. Later that decade, the families of transgender children petitioned Johns Hopkins Medical School to create its own clinic. Even earlier, trans youth were given access to the restrooms that matched their gender presentation—and were accepted by their community in the process—decades before Republican lawmakers would stake their political hopes on the "transgender threat."

Not all trans childhoods can be found in medical archives. As Gill-Peterson's research explores, some trans individuals avoided interaction with the medical industry entirely; underground queer and trans communities played chemist with DIY hormone treatment. Others more surreptitiously navigated medical intervention and didn't become national headlines. Many, many more were denied treatment by prominent, seemingly sympathetic gender clinic staff at USC, Johns Hopkins University, or other facilities. Many trans childhoods in the 20th century ended in suicide notes.

In all likelihood, Dr. Anderson has not read every book, seen every movie, or learned every historical nuance of US trans history—that field of study is dense, and, all too often, incredibly bleak. She likely has her own timeline of trans history—the national coverage of Christine Jorgensen's landmark sex reassignment surgeries in the early 1950s, the HIV/AIDS crisis, and the comparatively greater access to trans resources in the last decade that allowed Anderson herself to come out and transition four years ago, at the age of 59. There's no reason Anderson would be an expert on the history of trans youth, having approached transition from the material realities of a wealthy doctor and not as a child with fewer medical rights under law.

Regardless, Dr. Anderson's remarks carry consequences. WPATH, which prescribes diagnostic frameworks and best practices for individual physicians, proposed new care standards in 2021 setting more detailed age restrictions for hormonal treatment and surgeries amidst conservative scrutiny. Anderson is that organization's former president.

Anderson’s remarks to the Times also provide fuel for future state policy. In February 2022, Texas Governor Greg Abbott and Attorney General Ken Paxton attempted to create legal frameworks to reinterpret parental assistance of youth receiving transition-related care as “child abuse.” Though a Biden Administration executive order stymied these initial plans, amidst the tumult of the midterm election cycle, Florida’s state medical board voted to ban gender-affirming healthcare for trans minors. One day later, Tennessee Republicans introduced a bill doing the same, invoking a legislative obligation to “protect the health and welfare of minors.” Its title: the Protecting Children from Gender Mutilation Act.

Officials in Texas and Florida and California psychologist Erica Anderson overlap not in their positions on the moral permissibility of trans existence, but in their rhetoric that trans youth today are being influenced by social media and are not properly thinking through the ramifications of "life-altering procedures." When Anderson, conservatives, and those whipped into moral frenzies call trans youth a trend, they issue warnings that, unintentionally but importantly, overlap.

But modern conservatives did not invent this strategy. Detailing the case of an anonymous child denied hormone replacement therapy multiple times (after multiple gender dysphoria diagnoses) on the basis that the child might regret the choice later, trans historian Jules Gill-Peterson describes the strategy behind delaying medical intervention in order to gatekeep and ultimately deny it: "Keep deferring until there is nothing left to defer." Without historical context, trans children will continue to be treated as modern-day anomalies, and the deferral will continue.

Does Novelist Robert Keable Deserve a Reappraisal?

One hundred years ago the publication of Simon Called Peter, the first novel of Robert Keable, an English priest and former missionary, caused a sensation. It was a war novel with no fighting, telling the story of a priest going to Le Havre as a chaplain and discovering the realities of life behind the front lines. Slammed by critics in England, it received a more mixed reaction in America when published in 1922. The New York Herald commended it "for presenting a graphic and undoubtedly veracious picture of actual conditions," but F. Scott Fitzgerald called it "utterly immoral" and later mocked it in The Great Gatsby. The novel became an international bestseller, reaching the top five in America in 1922. There were multiple efforts to ban it and, after it featured in the Hall-Mills murder case, a New York magistrate agreed "it is a nasty book and particularly objectionable because written by a clergyman." [In 1922, Episcopal priest Edward Wheeler Hall and Eleanor Mills, with whom he was having an affair, were murdered in New Jersey. Hall's wife and her brothers were acquitted of the killings in 1926—ed.]

Robert Keable moved to Tahiti in 1922 but regularly visited America, where he was feted as a celebrity. A stage adaptation of Simon Called Peter was performed on Broadway, and the sequel was made into a Hollywood movie. Other novels sold well. However, Robert died young of kidney disease in 1927, and he is virtually unknown today. I believe he was a significant figure whose life and works need to be reassessed.

It has been suggested that Simon Called Peter was simply autobiographical, and that time should not be given to a priest who had an affair during the war and then wrote about it with no remorse. Robert did have an affair during the war, but the circumstances were very different from those described in his novel – perhaps worse. He was a married priest and his lover a 19-year-old lorry driver working for the Canadian Forestry Corps. More importantly, though, his war experience was far more significant than that.

Robert was ordained in 1911 and spent eighteen months in Zanzibar, working as a missionary for the Universities' Mission to Central Africa. Unable to become a military chaplain in 1914, he was given a parish in Basutoland (now Lesotho). When the South African Native Labour Corps (SANLC) was established in 1916 to provide black laborers to help with the war effort, Robert helped recruit. He was devastated when two of his parishioners were amongst the 600 SANLC laborers who died when the troopship SS Mendi sank after an accidental collision in the English Channel. By then he had already enlisted as one of only twelve white chaplains who accompanied the men – and was soon based in Le Havre, where the men were employed unloading the ships.

Robert witnessed the appalling treatment of the men he had encouraged to volunteer. They were forced to live in closed camps, in dreadful conditions, working twelve-hour shifts and facing racist abuse from their officers. His experiences in Africa, working in multicultural Zanzibar and among a large black congregation in Basutoland, had not prepared him for the overt racism. He heard an SANLC officer in "an applauding [officer's] mess" say "the Basuto are growing too well educated and too numerous and ought to be 'thinned out'." Robert was powerless to help, although he did write a book, The First Black Ten Thousand, which SPCK agreed to publish in 1918. He hoped it would shed light on the men's shocking treatment, but it was blocked by the British censor at the last minute and never published.

Robert is known today because of the success of Simon Called Peter but he wrote many other books which deserve re-examination, and his life story is remarkable – giving us a real insight into the first three decades of the twentieth century.

While still at school Robert was already teaching at his father’s Sunday school and open-air preaching in the slums of South London. He won a scholarship to Cambridge University where he knew Rupert Brooke – their poetry appeared together in an anthology. In Zanzibar he set up the first multi-faith scout troop in Africa, mixing Christians, Muslims, and others. His book City of Dawn, describing his time there, was very well reviewed.

In Basutoland Robert re-energized his parish, concentrating on the care of the local population and dramatically increasing the number who attended his church. The local white population denigrated him, especially after he returned from the war, because he was so critical of the racism he witnessed. He campaigned against forced labor practices introduced by the British after the war.

In Tahiti, Robert was the leading English celebrity on the island. He renovated, and lived in, Post-Impressionist Paul Gauguin's old house before building his own on the other side of the island. His book Tahiti, Isles of Dreams presented an honest and very unromanticized portrait of Tahiti in the 1920s. He supported James Norman Hall, co-author of Mutiny on the Bounty, through his writer's block, employing him as his secretary. He mixed with the American expats who had made their lives on the island and invited visitors such as Zane Grey to stay. And he married a Tahitian princess, who bore him a second son.

One last reason Robert Keable's life should be reassessed is that his views, which were much mocked at the time, are taken more seriously today. Robert wrote and campaigned on marriage reform. He left the priesthood, partly because of his adulterous affair, but always maintained that the church was at fault and not him. He argued the church's insistence on the sanctity of marriage did not allow for mistakes, that many decided to marry for the wrong reasons, and that the sin was to continue living with someone you did not love. In his last book, The Great Galilean, he championed the loving philosophy of what he called the "historic Christ," but questioned the doctrines attached to the traditional Christ.

Robert Keable was famous in the 1920s. One hundred years on, it is the right time to look again at his contribution and the significance of his writing.

An Insider's Look at Congress With Former Rep. Jim McDermott

For more than 40 years, Jim McDermott has worked tirelessly on behalf of the people of Washington State. As a state legislator, he helped pass laws that offered healthcare to unemployed and low-income Washingtonians, the first such program in the nation. In the United States Congress, he continued to be a much-needed voice for his most vulnerable constituents. Across America, you'll find families that are better off because Jim McDermott was fighting for them.

 – President Obama in 2016 on Congressman Jim McDermott’s impending retirement.

For many of us, politicians are a confounding and often frustrating mystery. What makes these people tick? What do they want? Why can’t they get along? Is politics merely a game for money or love or power? Can an idealist with a vision for the common good survive in politics?

Thanks to former Seattle Congressman Jim McDermott’s engaging, candid and often humorous new book Money, Love and Power: A Guide to Understanding Congress, readers are treated to the perspective of a veteran legislator with the instincts of a psychiatrist on the motivations of politicians and the complexities of their profession.

In his book, Congressman McDermott confronts the contentious issues of corruption, greed, and hypocrisy in politics, but he also shares accounts of his mentors and friends, his accomplishments to help millions of citizens, his hopes for our fragile democracy, and his clear and humane vision for rising from our current tribalism and polarization. He also describes the personal challenges in balancing a life as a career legislator with demands of family, colleagues, and constituents.

For nearly a half century, Democratic Congressman Jim McDermott was a powerful force in Washington State and national politics. He served 14 terms (1989-2017) in Congress representing Seattle and environs in Washington’s 7th Congressional District.

He began his legislative career in 1970 when he was elected to the Washington State House of Representatives. Four years later, he was elected state senator and served in that role until 1987 when he left the legislature for a position as a medical officer in Zaire for the US Foreign Service. He returned to politics in 1988 when he answered the call to run for Congress.

Renowned as the “Liberal Lion” of the House of Representatives, Congressman McDermott has devoted his career to working tirelessly for a healthier, safer, more peaceful and more just nation and world. As a medical doctor with a specialty in psychiatry, he was especially focused on health care reform for accessible and affordable medical care for all citizens regardless of financial situation. He led a successful effort to provide health care for the most vulnerable citizens of Washington. In Congress, he was a principal architect of the Affordable Care Act and he also addressed other complex issues such as expanding Medicare coverage, HIV/AIDS funding, elder care, and more.

Congressman McDermott also mastered foreign policy in his work to improve and deepen US ties with our allies and the developing world. As he sponsored legislation to improve the safety net for citizens at home, he also advanced laws to relieve poverty and suffering in the Third World.

His staunch opposition to American "Forever Wars" in the Middle East is legendary. He brought a unique perspective to Congress as a veteran Navy psychiatrist who served during the Vietnam War and saw firsthand the brutal toll of the war on soldiers, Marines, and sailors. He remembered the losses and veterans of that disastrous conflict as he questioned the costs and wastefulness of twenty-first century military campaigns in Iraq and Afghanistan. At the same time, Congressman McDermott honored the service of those who fought in the recent Mideast wars, and repeatedly reminded Washington State residents of their fellow citizens who had fallen. As detractors insulted and mocked Congressman McDermott as "Baghdad Jim," most of his constituents and many other citizens admired his courage and hard-fought efforts to protect our troops and our nation with alternatives to a militant foreign policy.

As only the second psychiatrist ever to serve in Congress, he presents specialized understanding and thoughtful consideration of that august body and its members with his new guide for citizens, Money, Love and Power. (Congressman McDermott acknowledges pioneering physician Dr. Benjamin Rush as the first psychiatrist in Congress. Dr. Rush, a signatory of the Declaration of Independence and a member of the Continental Congress, became the face of the logo of the American Psychiatric Association because he finished his career caring for patients in the mental hospitals and prisons of 18th-century Philadelphia. He is seen as a founder of American psychiatry long before that medical discipline was widely recognized.)

Congressman McDermott generously responded by email to questions about his book and more from his home in France. His persistent efforts to connect from his remote village are greatly appreciated. There were some hiccups, but he generously observed how “hilarious” it was for two old guys to do an interview by email. Indeed. LOL, as the tech savvy young people might add.

Robin Lindley: Thank you Congressman McDermott for discussing your new book Money, Love, and Power. Congratulations on this engaging, thought-provoking and often humorous work. You served in Congress from 1989 until 2017. What inspired your book now on understanding Congress?

Congressman Jim McDermott: When I entered the state legislature in 1971, I was a psychiatrist who had spent many hours of training and experience in working in groups and watching how information was shared. I also had the experience of being totally unaware of the situation that I was coming into and, realizing my ignorance, I had to listen to all kinds of people to put together an understanding of what was going on. As I did that, I began to realize I was gathering the material for a storybook about the education of a member of a legislative body. That process started in January of 1971 and I irregularly dictated or jotted down or wrote in long-hand the stories I heard over the course of my 46 years in the legislative process from 1971 to 2017. I had a hard time knowing when I had written enough.

As I began to write a book, I tried to organize it but I could never get it done. By the time I got my manuscript into the hands of an editor I had written more than 150 chapters.

One other aspect of my book may be of some interest. The Irish, because of the oppression by the British, were forbidden to speak or write their own language—Gaelic, or Irish—and therefore all their history was oral history. The most respected person in any village was the oral historian. The term in Gaelic for that person is seanchai. My father was a seanchai and my brothers as well. I saw storytelling as a natural way of expressing wisdom and history at the same time.

Robin Lindley: You warn that your book is not a memoir, but I wondered about your background before your political career. It seems you’ve had a long-term interest in international relations. You include in the book your photo in Ghana in 1961 with a group called Operation Crossroads-Africa. You also stress your long-term desire to serve “the underprivileged of the world.” What sparked your early interest in world affairs and in serving the less fortunate?

Congressman Jim McDermott: When one reviews one's life it becomes clear that the roots of what you are as an adult started when you were a kid.

I was raised in a fundamentalist Christian evangelical home where the values were around following in Jesus’ footsteps. Missionaries often stayed at our home. I decided as an eight-year-old child to become a medical missionary to the Chinese. That decision was made in 1944, after watching slide presentations by returning missionaries from China about the atrocities that were committed against the Chinese by the Japanese in their attempt to conquer China before the Second World War began.

The world order changed dramatically during the 1940s and 1950s. China was no longer an option for missionary work, but the impulse to take care of those who were least able to take care of themselves was deeply embedded in my upbringing.

I went to Wheaton College, which was a center of Christian evangelical fundamentalism. The Christians I met had been sent there to prepare themselves for service. The things that I was raised to believe were simply cemented into my life by that experience.

My first trip to the Third World came in my sophomore year of medical school, when I went with Operation Crossroads-Africa and lived in a village [in Ghana] at the level of the people of the country. That meant boiling water to drink, using a latrine, eating native food, and sleeping under a net to avoid being infected with malaria by the ever-present mosquitoes. That was perhaps the most formative experience of my adult life, for it exposed me to the real problems that most of the world faces.

Robin Lindley: What a powerful experience. What inspired you to become a physician and then to choose to specialize in psychiatry?

Congressman Jim McDermott: My decision to become a physician, as I look back on it, was driven from my own experience of having asthma as a child. I was taken to the doctor's office when I was unable to breathe and received a shot of adrenaline, which relieved the spasm in my lungs. My mother also had asthma and I saw her in extremis on several occasions. These two experiences I think drove my desire to be a physician so that I could prevent my mother from dying and also help people like myself who had had this horrible experience of fearing never to have another breath.

My role model was a military general practitioner who had been in the Second World War. His name was Bernard Rodkinson and I'll never forget it. He gave me a shoulder patch from the First Cavalry Division. Dr. Rodkinson was a gruff but warm and caring MD who knew how to talk to scared little nine-year-old kids. That shoulder patch emblazoned with a black horse on a yellow background was our bonding.

And, like most students in the 1950s and 1960s, I was driven educationally toward what I thought would be a useful way to spend my life. I became more and more interested in the human mind, and I decided to go into psychiatry when I was 15 years old. I had read widely, and when asked to write what my career would be when I was older, I said that I was going to be a psychiatrist. I'm sure I didn't really understand everything about psychiatry or what it meant, but I was driven by the desire to understand the human mind.

Robin Lindley: You were precocious. I decided to be a brain surgeon in sixth grade and interviewed the two neurosurgeons then working in Spokane, but my deficiencies in chemistry and mathematics became apparent in high school and ended that dream. Patients should be thankful.

Your book is candid and often self-deprecating and wryly humorous. You’ve taken on the task of explaining how politicians think and behave. What is most important for readers to understand about the people who seek public office?

Congressman Jim McDermott: My book is not a tell-all designed to destroy this or that personality or tell stories that would be demeaning or hurtful, but rather an effort to help a student or someone who's interested in the subject understand what goes on in the mind of a congressman or congresswoman.

I don't think I've ever met anybody who didn't have some idea about what he or she wanted to accomplish as a congressperson. The election process is one in which people choose whether they want to send a person who has this goal or that goal to represent them in the national legislative assembly. The book is about the things that happen to people as they participate in that experience, and much of it is a surprise to the person who enters into the process.

Robin Lindley: What motivated you to seek public office, first at the state level and later with Congress?

Congressman Jim McDermott: I came out of the United States Navy absolutely enraged by what I had seen through the eyes of the soldiers and sailors and Marines that I treated in my clinic. I saw the insanity of the war in Vietnam and wanted to do everything I could to stop it.

Before I went into the military, I actually thought of leaving the country, and was offered a job in a psychiatric hospital in Tromsø in northern Norway, above the Arctic Circle. I thought long and hard about leaving my country and finally decided that I wanted to stay and try and change it. That's really the driving force that took me from the Long Beach Naval Station on June 30th to a state legislative race in the 43rd district of Washington on July 1, 1970.

Robin Lindley: Fred Rogers—Mister Rogers—advised troubled kids to look for “the helpers.” In looking at your career of service, it seems you fit that helper mold. What do you think?

Congressman Jim McDermott: The old joke says that when you find yourself in a deep hole, the first thing to do is stop digging. That principle applies to anybody who gets into something they don't understand.

I looked for helpers to help me avoid the problems that inevitably come in the legislative process. I, in turn, did all I could to help people, both Republicans and Democrats, work the system in the way that made the most sense to them. I often told members when they asked me, "You should vote yes on this issue, but I think you're wrong." By saying that, I let them know I wouldn't mislead them and that I actually knew what was the best vote for them, even if it didn't fit for me. I learned that one should never mislead another legislator for a momentary advantage. It'll come back to bite you.

The other part of being a helper in going to the legislature was that I saw that, as a member of Congress or of the state legislature, I could help all the people in the society if I could figure out an adequate and effective way to make their lives better. Being a doctor, I could help people one at a time, but being a legislator gave me more power.

Having been trained as a pediatric psychiatrist, I saw the world very much from the eyes of children and wondered how problems were going to affect kids and their families and their development for the future. Much of my effort was spent on trying to make a better place for them to live in the future.

Robin Lindley: What lessons did you bring from your medical training and career as a psychiatrist that were helpful in politics?

Congressman Jim McDermott: I only ran once as Doctor Jim McDermott and most of my career I was just plain Jim.

The fact that I was a psychiatrist made many people uneasy and was one of the reasons I lost the governor's race in 1980. The campaign against me advertised: "Do you want a liberal Seattle psychiatrist to be your next governor?" Sounded like a good idea to me, but the people of Washington weren't ready for me. That's when I learned that the election was not about me but about the people making the choice of the person they felt would most likely represent them.

Coming to understand that elections are not about you is probably the most important thing you can learn from elections. A friend of mine who was Prime Minister of Ireland said the most important thing to have in politics is humility.

And, after experiencing that critique, I was very careful not to use psychiatric jargon or in any way make people uneasy because of my background. I couldn't change who I was, and I used the skills that I had learned of observing people and looking for evidence to explain why certain things happen, but I tried not to give people the impression that they were all patients. I wanted them to think of me as a colleague in the pursuit of good public policy.

Robin Lindley: You contend that most politicians do not set out to be scoundrels but are motivated by money or love or power—or a combination of those factors. Where do you see yourself in that mix? How do you see your evolution as a politician?

Congressman Jim McDermott: It was St. Ignatius who said, “if you give me a boy to the age of six, you could have him for the rest of his life.” The meaning of that aphorism is that the things that we build into children by the time they're six or eight years old are the roots of the way they will behave for the rest of their lives. That is why preschool education is so important.

My parents were very religious and strove very hard to live their lives as they thought Jesus wanted them to live. I never agreed with much of the trappings of religious denominations whether Catholic or Protestant or Jewish, but the roots had been planted at a very early age and followed me through my entire life.

Money was important of course—to have enough to live, but the accumulation of money was not something to be honored. What was to be honored was selfless service to other people: that one was able to do as Christ had done for the world by giving himself as a savior for all of us.

As most people do, I took those parts of the religion that made sense to me and carried them with me for 46 years, and only now, as I look back, can I see some of the connections to the religious upbringing that I had.

If one's view of life is swayed by the acquisition of money or power, one is in danger of wandering into becoming a scoundrel. My favorite verse in the Bible is, "What shall it profit a man if he gains the whole world and loses his own soul? Or what shall he give in exchange for his soul?" Faust is just around the corner.

Robin Lindley: You ran for governor of Washington State three times as a neophyte politician. You vividly describe your losses in those statewide races. What happened?

Congressman Jim McDermott: I was a greenhorn in politics but determined to have an impact on the way things went in the world. I was angry at what I had seen in my clinic about the war, and I wanted to change the world. I decided to run for governor, which would make me one of 50 governors rather than one of 435 members of Congress. It seemed like I would have a better chance to talk to the president about getting us out of Vietnam. As I look back on it, I can't believe how naive I was.

Robin Lindley: Before running for Congress, you had a successful career as a state legislator. What do you see as your major triumphs in state politics?

Congressman Jim McDermott: Reaganomics turned the country upside down in 1981. The state of Washington was in deep financial trouble.

I lost the election for governor in 1980 but became the Ways and Means chairman in 1981. My job as the chairman of the most powerful committee in the legislature was to control the expenditure of money and provide for the safety net and education for all the citizens.

In 1984 I ran for governor for the third time and was defeated, and it was clear to me that my progressive politics, my campaign style, or my personality was never going to be accepted by the entire state. I returned to the state Senate, put together the Washington Basic Health Plan and the funding for the cleanup of Puget Sound, and held hearings on reform of our nursing homes and so forth.

I got frustrated with the inability to reach a greater level of power because our congressional seats were taken and the governor's race was closed, so I decided to go back to medicine. That's when I became a Regional Medical Officer for the US State Department based in Kinshasa, Zaire. I had always loved medicine.

Robin Lindley: Thank you for those comments on your work at the state level. Weren’t you a reluctant candidate for Congress in 1988 when you first ran? You were tapped to run when you were working in Zaire as a medical officer with the State Department.

Congressman Jim McDermott: I had a great job working in Africa for the State Department and, for the first time in my life, I had a stable income so that I could put two kids in college and pay for their education. I told my kids that the only thing that I would leave for them for a legacy would be a free college education so that, when they came out of college, they could do whatever they wanted without thinking about having to retire a monstrous debt.

I had already had the experience of picking myself up off the ground after losing three races for governor, and I knew what it would be like to come back from Africa having given up a great job and to have to start all over again, so I was scared. I wasn't sure that I could get elected to the United States House of Representatives.

Robin Lindley: What are a few of your proudest accomplishments as a member of Congress?

Congressman Jim McDermott: Every time I try to answer this question, I give different answers because I remember stuff I forgot after 28 years.

My biggest desire was to see national health care guaranteed to all Americans. Obamacare was a down payment. While I was there, I played a big part in the response to HIV-AIDS. HOPWA (Housing Opportunities for Persons With AIDS) was very important in keeping people alive until treatment for AIDS came along.

My subcommittee of the Ways and Means Committee was charged with rewriting the provisions of the 1935 Social Security Act. I upgraded unemployment insurance, care for foster kids (500,000 of them), and welfare for families. I also wrote the African Growth and Opportunity Act (AGOA), which was the first trade policy for Africa. At home, I finished the process of getting Seattle's Cedar River Watershed completely protected. I also passed the bill in the Natural Resources Committee to resolve the land claim between the Puyallup tribe and the Port of Tacoma so the port could reach its present size.

I did lots of other smaller things, but it is important to say: none of this I did alone. I worked with members from all over the country and, despite our political and social differences, we got stuff done. I also must say my staff did most of the work. It really came to a screeching halt in 1995 when Newt took control and tried to take all the power for himself. It was almost impossible to get a Republican co-sponsor on a bill, so then you couldn't even get a hearing on an idea. You had to get very creative and seed an idea in a Republican mind to get it to go anywhere.

Robin Lindley: You mention the cautionary words of legendary physician William Osler, “Listen to the patient.” How did that advice inform your work in politics?

Congressman Jim McDermott: If one believes in democracy and the principle that you are elected to represent the people of your district, it is imperative that you listen to what 700,000 people are trying to tell you. It's not always clear and it's never easy to do but that is your goal.

The aphorism that I learned in medicine in tandem with William Osler, was what one of my supervisors in psychiatry said to me. “Jim, listen to the patient no matter how crazy he sounds or she sounds because buried in their confusing communication is the truth that you need to know in order to help them.” He finished by saying, “a stopped clock is right twice a day. Your job is to figure out when the clock is telling the right time.” I thought of that whenever my staff would say to me, “You spent too much time listening to so and so.” I was waiting to find out what time it was.

Robin Lindley: You served in Congress for 28 years. How would you compare your Congressional legislative experience to what you did at the state level?

Congressman Jim McDermott: I'm sorry, but the only way I can answer this question is by an analogy to baseball. The Washington State Legislature was like being in the Pacific Coast League in baseball. I batted about .375 and hit about 40 home runs. In 1989 when I went up to the Major Leagues as a congressman in Washington DC, I was lucky to hit .290 and hit 15 home runs. The game is the same as in the state legislature in that the ball is the same shape and size and the distances between the bases and the pitcher’s mound and home plate are the same, but the players have skills that you have never seen before. Congress is a collection of the best players from all over the country who all have different styles. You have to learn a whole new game.

Robin Lindley: Who were some of your significant helpers in your career in Congress as you fought for average citizens, for better health care and education and housing, and more?

Congressman Jim McDermott: Let me start with Tom Foley, who was the Speaker of the House. He and his wife Heather took me under their wings and taught me innumerable things. The rest of the Washington delegation were helpful, even [Republican] Sid Morrison, whom I had known from the state legislature, where we had worked together on writing the bilingual education bill in 1973. When Sid left the Congress, he sold me his car.

I came to Congress with Nancy Pelosi, and so I knew her from practically the first day I started. She and her colleague from California, George Miller, were perhaps my most helpful guides in how to proceed in Washington DC. Danny Rostenkowski and Dick Durbin shared my Chicago background and were very helpful.

To begin to name other members is to recite the membership of the House because, from each of them, I learned things that helped me while I was there. Jim Ford, the chaplain of the House, was also a great friend.

Robin Lindley: You mention that the revered civil rights activist, the late Congressman John Lewis, was also a good friend of yours. How do you see your friendship and his approach to working in Congress?

Congressman Jim McDermott: John and I talked almost every day. I was senior to him on the Ways and Means Committee, so I usually spoke before him. He would always be complimentary to me and often take my ideas and make them more acceptable to the committee. He took some of the edge off my style, but he was never afraid to take ideas and improve on them. He "made trouble" with a velvet glove after I used my fist to make a point. He and I shared the deaths in our families and our aches and pains as we got older. He was a friend that I admired enormously as I watched him deal with the racism of the world. He was made of cool steel.

Robin Lindley: Thanks for sharing those memories of Congressman Lewis. In your book, you share stories of ineptitude and corruption and vicious political schemes in Congress. That has to be frustrating when you’re working to help the people in your district and beyond. Are there a couple of particularly egregious incidents that especially surprised you?

Congressman Jim McDermott: I was raised in the Chicago area, served in the state legislature for almost 16 years, and had worked as a psychiatrist in a number of different settings in which I saw human foibles.

Coming to Congress simply exposed me to a wider variety of the things that people can do to get themselves in trouble because they didn't think about how it would look or how it would affect other people.

Robin Lindley: Right now, our country and Congress are extremely polarized, but you point out how this polarization in Congress really began in the 1990s with Republican Speaker Newt Gingrich and his "Contract for (on?) America." That was years before the Tea Party's far-right extremists. How do you see Gingrich's role in creating today's divided Congress?

Congressman Jim McDermott: The election of Lyndon Johnson and his promise to bring the Great Society to the country threatened the conservatives and they planned, beginning in 1970, on how they were going to change the country. There are a dozen books you can read about what they did, but Newt Gingrich was simply the first very visible and most powerful prophet of our time to turn the Right away from democracy. He was unabashed in his willingness to bend the rules or to ignore the customs of the democracy in which we lived. It's hard to measure how important he was, but one can say that had he not led the Congress in 1995 in the way he did, we would not be anywhere near where we are today. The conservative wave was coming in the country and Newt Gingrich rode the wave like a surfer. And everyone saw him do it and began to follow suit.

Robin Lindley: Compromise in Congress is virtually impossible now. And you advise, “If you want to be a star, forget bipartisanship.” How does that work?

Congressman Jim McDermott: Two principles are essential for democracy to work. One principle is that of compromise. Everyone, every day, compromises in multiple ways to make this society work. People very rarely get life exactly as they want it, whether it's in their work or their profession or their personal life. Autocrats don't have to compromise.

The second principle is that there has to be honest debate over an agreed-upon set of facts. You cannot have a useful discussion with someone if you cannot agree on the fundamental facts of the problem. You can have your own view about why the facts are the way they are, but you and I have to agree on the facts if we're going to have a sensible conversation about the problems.

When one side of the political process decides that compromise is not acceptable, partisanship is a direct result of that. Both sides of an argument go to their corner and never talk to the other side.

Diplomacy between nations is the attempt by leaders to bridge this gap between what one side thinks the facts are and what the other side thinks the facts are. If they can come to an agreement on the facts, they can probably work out an agreement. For example, the Good Friday Agreement in Ireland settled 800 years of dispute between the British and the Irish. If either side had walked away from the table, you would still have mass killings in the streets of London, Dublin and Belfast.

Robin Lindley: Thanks for those words on the value of compromise. As you detail, it’s expensive to run for Congress and, if you win, it’s challenging to maintain homes in your district and Washington DC, and address other pressures. What did you learn and what would you advise prospective members of Congress regarding expenses?

Congressman Jim McDermott: Becoming a public official is not a way to become wealthy! If you want to be wealthy, go into business or a profession where the compensation is high.

If you go into politics and expect to come out wealthy, you are in grave danger because the temptations to enrich yourself directly or indirectly are everywhere. You will operate on a daily basis with people who have much more of the world's wealth than you do, and as you begin to envy what they have, you open yourself up more and more to be corrupted.

When I came to Congress, buying a house in Washington DC in certain areas was not prohibitive but now housing is out of sight for an ordinary member of Congress. If you're wealthy before you come, it's no problem. You just have a second house in Washington DC. Absent that good luck, you are going to be stretched in your living style. There should be housing provided for members just as we do for diplomats or military personnel so poorer members could bring their families to DC.  The divorce rate might even go down then.

Robin Lindley: You detail in your book the Republican vendetta against you regarding release of an audiotaped recording—and the resulting complex litigation. It seems that you violated no laws but you lost several appeals, etc. What lessons did you take from this experience? As a result of your situation, you argue that all members of Congress have surrendered their First Amendment rights. Why is that?

Congressman Jim McDermott: I would advise anyone going into Congress to begin by studying the genesis and the operation of the Federalist Society. I was an ordinary citizen who had a couple parking tickets and a speeding ticket but that's all I knew about the justice system when I got to Congress.

The first goal of authoritarian leadership is to wipe out any aspect of free speech. You'll hear lots of speeches about support for the First Amendment, but the actions taken in the process gradually take away the right of any citizen to speak freely against the government. It's called the First Amendment because it is the first principle of any democracy. Behind that is a government that wants its activities to be as invisible as possible, because the scrutiny of the people will lay bare the way things are actually operating.

Congressional oversight is the essence of making democracy fair. Congress tries to pass laws that members think are fair and deal with problems. The executive branch writes the rules and regulations to implement the laws that have been written. That process of writing rules and regulations is arcane and opaque and requires vigilance on the part of both Congress and the people. There are many slips between the cup and the lip, as the saying goes.

Whistleblowers are citizens who are exercising their right of free speech. If they see something in the process that they disagree with, they talk about it with someone. At that moment they have become a problem for the government and can be dealt with in a multiplicity of ways, none of them pleasant.

My 11-year legal involvement over the release of a tape to The New York Times was the result of an open attempt by the Republican leadership in the House to hide their duplicity from the world. Newt Gingrich said he would not do something and signed an agreement, and then got on the phone and did exactly the opposite of what he had said in his declaration. The fact that I became aware of that phone call was truly accidental. I never went looking for it. It was brought to me, and I thought the people had a right to know what the behavior of this Speaker of the House really was. Understandably, he was quite angry and said I was "violating his privacy."

At that time, the man who is now King Charles had been exposed by an intercepted cell phone call to his now-wife. Everyone in the world knew that you could capture cell conversations. And the Speaker put everyone on his leadership team in jeopardy by making a call. He gave John Boehner the responsibility of pursuing a lawsuit, and the lawsuit was decided with a very novel interpretation of the Constitution. The court decided that I had broken the rules of the House. The Constitution specifically separates the House and its rules from the purview of the courts. That means that when you become a member of the House, you are subject to a set of rules that can be written by the majority, with no consultation with the minority, by a simple vote on the floor of the House, and that you can be deprived of your First Amendment rights. That is a dangerous precedent to have been set in the House.

Robin Lindley: You have been outspoken on our "Forever Wars" in the Middle East. Many voters appreciate your views on war and peace and the bloated military budget. You mentioned that your toughest vote was a yes vote on the US response to the September 11, 2001 attacks on America, a declaration of a war on terror. Only Representative Barbara Lee voted against the bill in the House. What did you do at this perilous time?

Congressman Jim McDermott: If there is one vote I would like to take again with the benefit of hindsight, it is that one. A group of 30 or so of us went round and round on Yes or No. I gave away my right to object fully to what followed.

The invasion of Iraq and problems with Syria and Yemen and Afghanistan all came from that decision.  What we see today in Ukraine and China and Russia is the result of what happens with countries whose economies are built on continual war making.  

Robin Lindley: You dedicate the book to the late Senators Wayne Morse and Ernest Gruening as well as acclaimed author Karl Marlantes. Morse and Gruening voted against US involvement in the Vietnam War, so I think I understand that part of the dedication. Why did you also mention Mr. Marlantes? I realize he has written vividly about the human cost of war.

Congressman Jim McDermott: Karl and I are good friends and I know the struggles he had to get his novel Matterhorn published. He said to me once, "Jim, if I had a nickel for everyone who has told me, 'Karl, I feel a book in me,' I could retire forever." He encouraged me through the process of writing, editing, and finding a publisher. I finally had to self-publish, and he was an inspiration for my struggle with the book in me. He said, "No matter what happens you can be proud because you did it. Two hundred or two million copies make no difference. You made it to the top."

Robin Lindley: Thanks for that nod to Karl Marlantes. The election of Donald Trump to the presidency had to be heartbreaking for you. He and his allies worked to undermine American democracy and the rule of law throughout his four-year term. What’s your sense of our fragile democracy today and where do you find hope in these fraught times?

Congressman Jim McDermott: I am worried sick about the midterms because I listen to Kevin McCarthy and I believe he is honest about his intentions.  What he says should turn your blood cold.

I listen to people I dislike or distrust as carefully as I do my pals. I find hope in the ability of the people to see what’s happening.  Look at Kansas. It’s a red state that added abortion protection into its Constitution by a vote of the people.

Robin Lindley: Thanks for that optimistic perspective. What do you miss about Congress since your retirement in early 2017?

Congressman Jim McDermott: My friends and all the trappings of power. People answer your cell calls or emails. I had 16 staff members to help with a thousand details of life.

But what is hardest is the realization that it is so hard for an ordinary citizen to get a change in policy. I pick up The New York Times every day and see something my friends should be aware of or know exists, but wonder how I can get their attention. It is analogous to being a parent and seeing the peril of your child of whatever age and being unable to get them to see it before the fall.

Robin Lindley: You’re living in France now. Do you still have a home in Seattle? What drew you to France?

Congressman Jim McDermott: I have an apartment in the city of Seattle. I still believe Seattle is the best place to live in the USA since I chose to leave Chicago in 1966. I never looked back.

I came to France after I retired to go to a French cooking school. I was single and no one asked me out to dinner every night, so I decided to learn a new skill. I wanted to be able to ask them to dinner.

I came to the Medoc wine region to a town of 661 people and found peace and tranquility.  I don’t speak French, so I was not bothered by press and radio and TV.  I could unwind after racing at breakneck speed for 46 years. It was a place I could actually think about my life as I walked around in 400,000 acres of Merlot and Cabernet Sauvignon grapes and could concentrate and finish my book.

Medoc wines are all blends of these two grapes with some Petit Verdot. I liked it so much that I bought one hectare of vines (approximately 2.5 acres). I invite my American friends to come taste my vintage.

I have made a whole community of new friends and I'm gradually learning to speak French. The best part, though, is becoming a part of French society. I prefer Liberté, Egalité, Fraternité to life, liberty, and the pursuit of my happiness only. Within six months of arriving here, I had full health coverage with a Carte Vitale, which every French person has. Even a long-term visa (one year) person like me gets health coverage. I tried to guarantee this to Americans for 28 years and I never got it done. Here I got it within six months.

They say the French work to live while Americans live to work. It is true. The schedule is 8 AM to 12:30 PM, then closed for lunch, then open again from 2:00 to 7 PM. Then home to dinner. I could go on and on. Do they have problems? Of course, but the only guns are for hunting. On Saturday hunters go out to get wild boar from the fields. You hear a gun but that's it. Election campaigns take two weeks or so. No huge and long expenditure of money to buy your eyes.

Finally, foie gras, fromage and vin rouge.     

Robin Lindley: What’s your typical day like now?

Congressman Jim McDermott: Every day starts the same in retirement.  You decide what you will do. You take a blank sheet of paper and write down what you are going to do. You are the only one who writes on the paper.  I start each day with a cup of coffee, a chocolatine, and The International New York Times followed by a walk in the vineyards. Amazing what a 6000-7000-step walk does for your soul. The rest of the day is yours to enjoy.

Robin Lindley: I read that you are also an artist and enjoy sumi-e painting (an ink-wash technique involving applying black ink to paper in varying concentrations with a brush—ed.). Have you enjoyed making art since childhood? Who inspired your interest in art?

Congressman Jim McDermott: During the Vietnam War, I got interested in Zen Buddhism.  I couldn’t meditate but I gradually taught myself to do Sumi-e painting which was a part of understanding the religion. I didn’t think I had any talent until a friend who was an artist encouraged me at age 45.

Tom Foley had given me the responsibility of attending legislative exchange meetings with Japan when I came to Congress in 1989, and so I travelled to Japan more than 40 times over the years. I became good friends with a number of Diet members. One told me to bring some of my paintings to Tokyo. He took me to one of the emperor's scroll makers. His artistry in turning my paintings into scrolls raised me to another level, and I began giving my paintings as auction items, and on and on it went. The Confucian and Buddhist philosophy behind the paintings stirred my soul. "The bamboo is the perfect gentleman; he bends in the wind but never breaks."

Robin Lindley: You’ve also been teaching at the University of Washington in recent years. What’s the focus of your courses?

Congressman Jim McDermott: I teach because I am recruiting the best and the brightest into government. Our best and most thoughtful should go into public service. I try to teach the things that are in my book. I have had lots of experience and like to share with students because they make me think about what I really am sure of.  They challenge me as I challenge them.

Robin Lindley: You could rest on your laurels. What inspired you to teach now?

Congressman Jim McDermott: Keeping your mind alive by continually challenging yourself has been a lifelong occupation, and I intend to keep offering what I know to those who are interested. It is fun now because I have nothing to prove and don't need to compete with others. Gandalf the Wizard said, "You must be the change you want to see in the world." Or was it Gandhi? Ha ha.

Robin Lindley: Legislators and other politicians are increasingly facing death threats and actual violence. Consider the recent attack at Speaker Pelosi's home, in which her husband was assaulted and severely injured. What advice do you have for prospective public servants who are concerned about the escalation of hatred and violence in the political arena?

Congressman Jim McDermott: This question is very tough to answer succinctly. I had three death threats made on me.  The first was by mail in a governor’s race.  I told the state police and I don’t know the resolution.

The second threat was by phone from someone upset with my advocacy of a tax on a trust baby’s income.  I was in Congress and the caller was very helpful in that he called twice and left his name and phone number.  He was convicted and served a period in jail.

 

The third threat occurred because I would not change my support for Hilary Clinton to Bernie Sanders in 2016.  The threat was very graphic and was delivered over the phone and then the caller followed up with an appearance at my office where he terrified the employees who were there. My office had bullet proof glass as a result of the previous threats. The police arrested him as he stood outside my office trying to break in by beating on the windows. He was also convicted of a felony.

 

I think it was von Clausewitz who said,  “War is simply the continuation of political intercourse with the addition means.” If you enter the political arena, you must be aware that you are going to war. Many citizens have no idea where the interface between diplomacy and conflict begins and, if you enter the arena, there is always the possibility that physical harm may be the result. 

I never had, nor was I offered, physical protection. We took precautions like door locks and bulletproof glass.

Many people don’t enter the arena because of fear for themselves or their families. Courage is a prerequisite for a candidate.

Would I run again? Yes, without hesitation. 

 

Robin Lindley: Thanks for your years of service. I really appreciate you bearing with me, Congressman McDermott. Is there anything you’d like to add about your book or your life in Congress and since then?

 

Congressman Jim McDermott: I have probably said too much already, but I just read Hold the Line by Michael Fanone [A police officer injured by pro-Trump rioters in the January 6, 2021 attack on the Capitol].  It is a terrific book and made me think that my whole career from 1970 to Trump was trying to hold the line for democracy.  We are in real peril today.

Since I wrote the first draft, I have been reading a fabulous book about a real French heroine. Not Joan of Arc. Agent Josephine: American Beauty, French Hero, British Spy by Damien Lewis. If you read it, you will begin to see why I live in France today.

Robin Lindley: Thank you for your patience and thoughtfulness, Congressman McDermott, and for the many ways you have served the country through the years. And congratulations on your lively and insightful new book—a reference on politics and human behavior for the ages. Best wishes on your very active life in retirement.

Congressman Jim McDermott: Best wishes to us all. The world is looking for its soul. Materialism, science, and technology have left a void. The turmoil we see around us is evidence of people looking for something to believe in.

À bientôt.

 

Robin Lindley is a Seattle-based attorney, writer and features editor for the History News Network (historynewsnetwork.org). His work has also appeared in Writer’s Chronicle, BillMoyers.com, Re-Markings, Salon.com, Crosscut, Documentary, ABA Journal, Huffington Post, and more. Most of his legal work has been in public service. He served as a staff attorney with the US House of Representatives Select Committee on Assassinations and investigated the death of Dr. Martin Luther King, Jr. His writing often focuses on the history of human rights, social justice, conflict, medicine, art, and culture. Robin’s email: robinlindley@gmail.com.

James M. Scott's "Black Snow" Traces the Line from Tokyo to Hiroshima

Tokyo, after the firebombing of March 9-10, 1945

 

 

A former Nieman Fellow at Harvard, James M. Scott is the author of four books about World War II, including Target Tokyo: Jimmy Doolittle and the Raid that Avenged Pearl Harbor; Rampage: MacArthur, Yamashita and the Battle of Manila; and The War Below. In his latest book, Black Snow: Curtis LeMay, the Firebombing of Tokyo and the Road to the Atomic Bomb, Scott details the most destructive air attack in history, the firebombing of Tokyo on the night of March 9-10, 1945, which claimed more than 100,000 lives. His narrative includes the troubled development of the B-29 bomber and the rise of General Curtis LeMay, who developed the low-altitude firebombing strategy. Here, in an interview with History News Network, Scott discusses how and why he wrote the book.

 

Q.  I am curious about how you got started writing history and why you are particularly interested in the U.S. vs. Japan struggle in World War II.  

My first job out of college was as a public-school English teacher in Japan, where I taught middle school in a small town of about 20,000 residents on the main island of Honshu. I also volunteered to teach a course at night at my town’s community center. In that class, some of my students, who were children during the war, talked to me about the B-29 campaign and having to escape Japan’s cities. It was the first time I had ever really learned about that part of the war, and I was fascinated.

During my time there, I also made a trip to Hiroshima. A couple of weeks later, I flew to Hawaii to meet my parents, who were there on vacation. My father had served in the Navy, so the first place we visited was Pearl Harbor. Thus, in the span of about two weeks, I experienced what for America represented the beginning and the end of World War II. Needless to say, I was hooked.

Q. Some of the most fascinating parts of Black Snow are the first-hand accounts of the Japanese survivors in Tokyo. These voices are often lacking in other WW II histories.  How did you access those records and how did that influence your writing of this history?

I don’t speak much Japanese, unfortunately, despite having studied it while living there. I have been very fortunate over the years to develop some great friendships and contacts in Japan, who have always been so gracious in helping me with my research, from pulling records to arranging interviews and translators.

The Japanese side of the story is one that I feel has long been overlooked in the examination of the firebombing of Tokyo, despite the fact that there are voluminous materials available to researchers in Japan. There is an entire museum in Tokyo dedicated to the March raid, where survivors often give lectures to school children and visitors. There are great historical accounts and numerous secondary sources as well. All of them, however, are in Japanese, which has kept many American researchers from using those important sources.

For me, I was particularly interested in what that firestorm was like and how people survived it. I wanted to capture that visceral experience for readers, to take them inside that inferno and show them what it looked like, smelled like, and even sounded like. In short, I wanted to recreate it for readers.

 

Q. I was surprised to learn that General Curtis LeMay made the decision to begin the firebombing raids on his own initiative at his Pacific base, without explicit approval from Washington D.C. Was this kind of “lone wolf” action a lesson for General Marshall and other war commanders? Did the firebombing raid decision affect the atomic bomb decision process?

LeMay benefited from the unorthodox structure set up to govern the B-29 campaign. Army Air Forces commander Gen. Hap Arnold, from the outset, was adamant that his bombers function as independent operators and not be pulled into Douglas MacArthur’s or Chester Nimitz’s orbit. To do this, he convinced the Joint Chiefs of Staff to allow him to create the 20th Air Force under his direct control. He would then report directly to the Joint Chiefs. Arnold, of course, worked in Washington, so LeMay was his operator in the Marianas.

Arnold, however, suffered a major heart attack in January 1945, and ended up in Florida convalescing. In his absence, LeMay was left with Arnold’s chief of staff, Lauris Norstad. LeMay not only outranked Norstad by a star, but he also didn’t really trust him. This led LeMay, who by nature was a pretty solitary individual, to hold his cards close. He didn’t tell Norstad of his plans to firebomb Tokyo until the day of the mission, when Norstad landed in the Marianas for a visit. 

As for the atomic attacks, LeMay’s operation really served as an important trial balloon to see how the American public would respond to the mass killing of enemy civilians, particularly since this attack occurred soon after the firebombing of Dresden, which remains controversial even today. To the surprise of many in Washington, however, the American public voiced no real objection. “Properly kindled,” Time magazine wrote in 1945, “Japanese cities will burn like autumn leaves.”

 

Q. Most Americans are aware of how the U.S. dropped the first atomic bomb on Hiroshima, but many may not be aware that the firebombing raids on Japanese cities killed more civilians. What lessons about military decision making, if any, would you like readers to take away from Black Snow?

You are exactly right. So many folks know of the atomic attacks, but are stunned to learn of the incineration of Tokyo, Osaka, Nagoya, and dozens of other Japanese cities in the waning months of the war. The atomic attacks, however, did not happen in a vacuum. Those strikes were the last stop on America’s march toward total war.

 

Q.  As you note in the Epilogue, General LeMay’s reputation greatly suffered after he agreed in 1968 to be the vice presidential candidate in Governor George Wallace’s campaign for president. In Black Snow you paint a very favorable picture of LeMay as a strategic thinker and a military commander who cared about his air crews. Did your research change your view of him?

Absolutely. Like so many others, I began my project far more familiar with LeMay based on the controversial end of his career than on his wartime service. As I dug into his life, my view of him really changed. He was an incredibly hard worker, who gave tremendously of himself to the war. He rarely saw his family, and he was lucky to sleep more than four hours a night.

He had studied engineering in college, and was a natural problem solver, which is what you need in war. What was also fascinating was to read his efficiency reports in his personnel file, where amazing aviators, like Jimmy Doolittle, wrote that he was one of the best combat commanders produced by the war. You can’t get any higher praise than Jimmy Doolittle. 

 

Q.  What do most Japanese today think about the firebombing?  Is it taught in history books?  Will your book be published in Japan? The U.S. and Japan are, of course, close allies today, but do some Japanese feel the firebombing was a mistake or unnecessary?  

In Japan, the firebombing, much like here in the United States, has been overshadowed by the atomic attacks. That is evident in the beautiful national museum in Hiroshima compared to the small, privately funded one in Tokyo.

That said, there remains interest in the Tokyo firebombing, and many books have been published on it by survivors and historians. Many of the survivors I talked to place a lot of blame on Japan’s leaders at that time, who unnecessarily prolonged the war even after it was obvious Japan was defeated. That delay in surrender cost hundreds of thousands of civilian lives.

I would love to see Black Snow published in Japan. I think the book fairly captures the story from all sides and could be a great resource for both Japanese readers and historians.  

Professor Adam Winkler on Limitless Political Spending

To understand how political spending spiraled out of control—projections estimate $9.7 billion spent on ads alone during the 2022 election cycle—I spoke to Adam Winkler, a professor at the UCLA School of Law. His book We the Corporations: How American Businesses Won Their Civil Rights was a finalist for the National Book Award.

We chatted about changes to election processes in the 1880s and 1890s, and how a political operative named Mark Hanna introduced corporate spending into politics. Professor Winkler connects Hanna’s innovations to the political campaigns we know and don’t love today.

A condensed transcript edited for clarity is below. Paying subscribers can access audio of the full conversation here.

Ben: It’s a pleasure to have you with us, Professor Winkler.

AW: Thanks for having me on.

Ben: Absolutely. In We the Corporations, you trace the growth of corporate spending in politics, but how did candidates finance elections before corporations came into play?

AW: Well, back in the 1800s, campaigns were generally funded by the personal wealth of the candidate or the candidate’s family, or sometimes by very wealthy individuals in the community who supported a particular candidate.

But election funding was much less important then than it is today because you didn't have these mass-market campaigns where you had to spend a lot of money to get people to support a candidate. Often, in fact, elections were really determined by patronage and by local machine politics.

Ben: An example of a political machine being something like Tammany Hall, is that right?

AW: Tammany Hall is a good example. And when a machine like Tammany Hall wanted to get its candidates elected, it would often distribute ballots to members of local communities for that particular candidate.

At the time, the government didn’t issue ballots. Voters would appear at the polls with their own ballot that they would then cast. 

Ben: Does that mean you could bedazzle your own ballot?

AW: Sort of. You came and you took a piece of paper with the name of your candidate and you put it in a box. Political machines printed out their own ballots for their own candidates and distributed them among people who lived nearby. Those voters were expected to go to the ballot booth and cast the ballot that had been given to them by the machine.

Ben: So the machine in some ways is literally just a printing machine. Assuming that the federal government begins issuing ballots, when does that change take place?

AW: The government started printing its own ballots around the 1880s. Prior to then, we didn't have a secret ballot. You would just go into a room and drop your vote into a large box.

But in the 1880s, we saw a real reform of how elections were managed, and one of the developments was the rise of secret voting. So you could vote in private behind a booth and you had a government-printed ballot that had all of the candidates on it. You could always write in your vote for another candidate, but instead of relying on self-printed ballots, the government decided to rely on government-printed ballots, for ease of counting and for administration.

Ben: Am I right in assuming that the government official who devised the secret ballot was always picked last for his kickball team in PE?

AW: No, I think it's more likely that the government official who devised this reform was modeling the electoral reforms of the laws of Australia. This was known as the Australian ballot and the goal was to clean up the electoral process, prevent corruption, and make elections more meaningful as a reflection of the popular will.

Ben: Interesting. Moving forward to the 1890s, is there someone in particular who takes advantage of the new reforms?

AW: No one understood the implications of the transformation in the political process more than Marcus Alonzo Hanna, kind of the Karl Rove of his day. He was the political mastermind behind a twice-successful Republican presidential candidate: in Hanna's case, William McKinley, who won the White House in 1896 and was reelected in 1900. 

Hanna was raised in Cleveland, where he was a high school classmate of John D. Rockefeller, and Hanna had enjoyed a successful career in the coal and steel industries, but his true passion was politics. In 1895, he handed off his company to his brother and turned his full attention to electing William McKinley, a fellow Ohioan, to the presidency.

Hanna used his businessman's instinct for innovation and marketing to revolutionize how election campaigns raised and spent money, and for the first time brought significant amounts of corporate cash into the electoral process. 

Ben: Hanna’s nickname becomes Dollar Mark, is that right?

AW: That's right.

Ben: How did Dollar Mark change the way that campaigns were run at the time?

AW: Hanna was the chairman of the Republican National Committee in the 1896 presidential election, and he understood that he needed to raise more money than any previous presidential campaign in history.

The Democrats had nominated a fiery populist named William Jennings Bryan, who was an outspoken opponent of corporate power and who drew broad support from farmers and the working class. Early in the campaign, Bryan had a lot of momentum. To counter him, Hanna decided to undertake an exhaustive and systematic publicity campaign to educate voters.

Among Hanna's innovations was to centralize control of the presidential campaign. Traditionally, state-based party committees managed local campaigns, even for national candidates like the president, but Hanna centralized all of the state committees under his authority. He also created the first nationwide advertising campaign to market a presidential candidate and produced over a hundred million pieces of campaign literature printed in German, Spanish, French, Italian, Danish, Swedish, Norwegian, and even Hebrew to appeal to immigrants. 

He also promoted William McKinley through the creation of things like buttons, posters, billboards, cartoons, and leaflets that were manufactured by the carload. Hanna hired thousands of people to go out and distribute these buttons, post these posters, and promote McKinley in every competitive district.

It was the kind of campaign that today we take for granted but had really never before occurred in American history.

Ben: I suppose there were a few side effects to this campaign, though? 

AW: Right, so Hanna's new methods of electioneering required far more money than the campaigns of old. Hanna found a lot of that money in corporate America. He was someone who perhaps more than most appreciated the role of money in politics. He once famously quipped, “There are two things that are important in politics. The first is money, and I can't remember what the second one is.”

Ben: The obvious answer is lawn signs.

AW: Yeah, exactly. But you need money to make the lawn signs.

To raise the money for McKinley's campaign, Hanna thought the corporate giants of the era were the perfect contributors. 

Business leaders at the time were very fearful of the potential economic consequences of William Jennings Bryan being elected to the presidency. So Hanna went to those business leaders and said, it's time for you to put your money behind a business-friendly candidate like William McKinley. He went to banks and said, you should give one-quarter of 1% of your capital. He went to large industrial corporations and recommended that they give five- and six-figure amounts. Standard Oil, the economic giant run by Hanna's schoolmate, John D. Rockefeller, was asked to give $250,000 (at the time an enormous amount) to the McKinley campaign.

Hanna really systematized the fundraising for political campaigns the way no political operative had ever done, and the overall fundraising haul that Hanna generated for McKinley was estimated to be $7 million, more than ten times the amount that was spent by William Jennings Bryan. It was the most ever spent for a presidential candidate. 

Ben: Wow. To better understand the magnitude of that figure, could you please convert it into Bitcoin for our audience?

AW: Ha! The way to conceptualize how big it was is this: although election campaigns tend to cost more every single season, Hanna's haul in 1896 was so huge that no presidential campaign would equal it for nearly half a century. This was really an unprecedented effort to raise and spend money on an election campaign.

But no one really knew about it! There were no disclosure laws back in the 1890s. So today Americans worry about dark money, about money that comes from unidentified donors, but virtually all of the money in the 1896 election was dark money and we didn't get the first federal laws requiring campaigns to disclose funders until 1910.

To learn more about how the first campaign finance laws came to be, check out the episode of Skipped History on the Great Wall Street Scandal of 1905.

Ben: Ha! I appreciate the plug. On a concluding note, how should we consider Mark Hanna’s reforms when viewing today’s election cycles?

AW: Well, if we want to know why our election campaigns are so out of control, we have to go back to Mark Hanna and the 1896 campaign. We're still living in the world that Mark Hanna built, where the wealthiest of Americans and big businesses are expected to support campaigns, and where campaigns use advertising techniques that business corporations were developing around the turn of the century to market political candidates. So many of the ills of our political system can be traced back to Marcus Alonzo Hanna.

Ben: An important, discouraging, and illuminating connection to make. Thank you so much for being here and for this captivating history lesson. 

AW: Thanks so much for having me.

Projecting the Next Presidential Winner from the Midterm Results is a Fool's Bet

Florida Governor Ron DeSantis is the latest politician to be anointed president-in-waiting after the midterms. A lot can change in two years. 

 

 

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

 

With the midterm elections of 2022 now completed, the projection game around the presidential election of 2024 has begun, with the early prognosis being that Florida Governor Ron DeSantis is on the way to the White House. This is due to his massive reelection victory in the third largest state (and the largest one likely to be in play in the Electoral College), but it is folly to put betting money on DeSantis, just as it has been after every midterm election in modern American history. Two years is an eternity in politics, and betting on a presidential winner this early has never worked out in the modern history of the presidency, going back a century.

At the end of 1910, Woodrow Wilson, the president of Princeton University, had been elected Governor of New Jersey, but was not considered to be a factor in the presidential election of 1912.  The Democratic Party had been in the political wilderness nationally, having only won the presidency twice since the Civil War, with Grover Cleveland’s victories in 1884 and 1892 separated by a defeat.  But the Republican Party under William Howard Taft was split between conservatives and progressives, and when former President Theodore Roosevelt chose to challenge his own anointed successor in 1912, it created an unusual opportunity for the Democrats.  However, it would take 46 ballots at the Democratic National Convention before Wilson, with less than two years in elected office, was chosen as a true dark horse nominee.  In a four-way race, including Socialist Eugene Debs who won 6 percent of the national popular vote (an all-time high for any American party with “socialist” in its name), Wilson was able to win the presidency and 40 states, despite garnering the second-lowest winning share of the popular vote (42 percent) in American history (only Abraham Lincoln, in another four-way race in 1860, took office with a lower share of the vote).

At the end of 1918, Warren G. Harding was a first-term Republican Senator from Ohio. He had accomplished nothing significant in office, but ended up a true dark horse nominee in 1920, chosen at the Republican National Convention on the 10th ballot, after other candidates with greater reputations and accomplishments were passed over.  This was an election shaped by the desire to get away from the aftermath of America’s entry into the First World War in 1917, and it demonstrated the nation’s urge to return to isolationism.

At the end of 1926, the American economy was seemingly purring under President Calvin Coolidge, who had succeeded to the presidency upon Harding’s death in August 1923.  After winning his own election easily in 1924, it was assumed that Coolidge would run again in 1928. Few anticipated Coolidge’s decision not to run, nor did many imagine that a cabinet member, Secretary of Commerce Herbert Hoover, would defeat a more public figure like a governor or US Senator to succeed Coolidge in 1928.  But Hoover had made a strong impression, and is still considered one of the most successful and significant cabinet members in American history. With the economy seemingly in great shape, it was assumed that he would have the upper hand in 1932. However, the Great Depression would change the course of history.

At the end of 1930, while the Great Depression was worsening rapidly, there were few observers who would have bet on New York Governor Franklin D. Roosevelt, a paraplegic due to polio, to overcome either Democratic rivals or the incumbent Hoover, who had been a close friend during the Wilson administration (FDR promoted Hoover as a potential Democratic successor to Wilson in 1920).  After FDR had been the losing vice presidential nominee alongside Democratic Governor James Cox of Ohio in 1920, no one in their right mind would have imagined that he would have any political future. Well-known columnist Walter Lippmann had described FDR as a pleasant man but not possessing any important qualifications to be president—a gross underestimation in hindsight, but conventional wisdom at the time.  But FDR went on to serve an unprecedented three full terms as president, winning election in 1932, 1936, 1940 and 1944.

At the end of 1942, Senator Harry Truman had gained notice as the head of a committee investigating wasteful military spending occurring during World War II.  Henry A. Wallace was FDR’s third term vice president. No one then expected that Roosevelt would seek a fourth term, or that Wallace would antagonize Southern Democrats, who would oppose his renomination for vice president.  Yet Truman would become a surprise vice president who assumed the presidency after 82 days in office with the death of FDR on April 12, 1945.

At the end of 1946, after the Republican Party won control of both houses of Congress, President Truman’s poll ratings made many think he would not seek election for a full term.  And when Thomas E. Dewey became the Republican nominee in 1948, every public opinion poll anointed him as president-elect before the election.  Instead, in the upset of the century, Truman went on to a surprise victory, going out to the country in a whistle-stop rail campaign and gaining the nickname “Give ‘Em Hell Harry” for his aggressive attacks on a “Do Nothing Congress.” This triumph occurred despite the fact that Henry A. Wallace ran on the Progressive Party line and Governor Strom Thurmond of South Carolina won four southern states campaigning against Truman’s civil rights initiatives on the States’ Rights Party ticket.  Ironically, Truman had replaced Wallace on the 1944 ticket precisely because Truman had never spoken up on civil rights as Wallace had done.

At the end of 1950, with the nation engaged in the Korean War, speculation was rampant that Senator Robert A. Taft of Ohio, the standard-bearer of conservatism and son of former President William Howard Taft, was the likely Republican nominee in 1952. However, moderate Republican Senator Henry Cabot Lodge, Jr. recruited General Dwight D. Eisenhower to challenge Taft. Eisenhower had refused an earlier entreaty to run; a desperate President Truman suggested in 1948 that Ike run as the Democratic nominee, with Truman going back to the second spot on the ticket. In 1952, Eisenhower would defeat Taft for the nomination, and serve two terms in the White House; Taft sadly passed away of a brain tumor six months into Eisenhower’s first term.

At the end of 1958, Senator John F. Kennedy of Massachusetts had just won a smashing reelection victory, and many observers thought he would seek the presidency, but the fact that he was a Roman Catholic was perceived by others as an insurmountable problem. An earlier Catholic nominee, Alfred E. Smith in 1928, had been unable to overcome the religious barrier.  But JFK would overcome the disadvantage, and go on to victory in a close election result in 1960; his running mate, Lyndon B. Johnson, was a major factor in keeping the Southern states in the Democratic column.

Lyndon B. Johnson had sought the presidency in 1960, but the reality that a Southerner had not been elected president since Zachary Taylor in 1848 was seen as a barrier.  His succession to the Presidency after the assassination of Kennedy, however, gave Johnson the upper hand in 1964, and LBJ would go on to the greatest popular vote percentage in American history, 61.1 percent, defeating right wing Republican Senator Barry Goldwater.

At the end of 1966, with the war in Vietnam beginning to split the nation, very few would have thought that Richard Nixon, the loser in both the presidential election of 1960 and the 1962 California governor’s race, would have an opportunity to make a comeback.  Governor George Romney of Michigan was thought to be the Republican frontrunner, and to many observers, recovery from two defeats seemed impossible for a former presidential loser.  But Nixon’s foreign policy expertise was a plus, and he surprised everyone with his resilience, defeating Vice President Hubert Humphrey and third-party candidate former Governor George Wallace of Alabama.

After Nixon’s reelection in 1972, few expected the resignation of Vice President Spiro Agnew in October 1973. This led to the first use of the 25th Amendment, under which both houses of Congress approved Republican House Minority Leader Gerald Ford to replace Agnew as VP.  Ford had never had the ambition to run for national office, but eight months into his time as vice president, Ford would succeed to the presidency after Nixon’s resignation in the midst of the Watergate scandal.

As Ford became president in the last few months of 1974, Georgia Governor Jimmy Carter was finishing the one term allowed by state law, and had gained some national attention as a “New South” governor.  But when he announced for president at the end of 1974, few observers saw him as more than a footnote, as many other Democrats in Congress were perceived as more likely and stronger potential candidates.  Carter was labeled as “Jimmy Who?”, but totally surprised the nation with his outstanding campaign strategy and organization. The idea that he would be elected to the White House in 1976 seemed unlikely, but he defeated President Ford in a close election.

At the end of 1978, Ronald Reagan was well known as a former Hollywood movie actor and two-term California governor, but he had failed to stop the nomination of Gerald Ford at the Republican convention in 1976.  Speculation was that he was too old to run in 1980, as he was nearing the age of 70, Eisenhower’s age when he left office in 1961; Ike had suggested that no one older than himself should be considered for president.  But Reagan overcame the skepticism, along with rivals including George H. W. Bush.  Deciding that Bush’s foreign policy expertise was useful, Reagan put him on the ticket, and the team of Reagan and Bush became the winning combination in 1980; with high inflation, a brief recession, the Iran hostage crisis, and the Soviet invasion of Afghanistan plaguing President Carter, Reagan would go on to a massive victory in the 1980 general election.

At the end of 1986, speculation about Bush being nominated to succeed his boss was met with skepticism, as no vice president had succeeded the president he served in 152 years, since Martin Van Buren followed Andrew Jackson in 1836.  Bush had to deal with challengers including Senator Bob Dole of Kansas, who had run with Gerald Ford for vice president in 1976, but he overcame opposition, and was elected president by a substantial margin over Michael Dukakis in 1988.

At the end of 1990, and especially after the brief first Gulf War against Saddam Hussein’s Iraq in the early months of 1991, Bush looked invincible. Two leading Democrats, Al Gore and Mario Cuomo, decided not to run in 1992.  A so-called second tier of candidates emerged, including Arkansas Governor Bill Clinton, who had bored the 1988 Democratic National Convention with a long, uninspiring speech formally nominating Michael Dukakis.  When evidence of his womanizing emerged, it seemed clear that Clinton’s candidacy was in trouble, particularly since a revealed affair had derailed the candidacy of Colorado Senator Gary Hart in 1987.  However, Clinton overcame the scandal, and in 1992 had the fortune of a recession that harmed the Bush administration, along with the independent candidacy of H. Ross Perot, which took 18.9 percent of the vote.  Bush, despite enjoying the highest approval ratings of any president in American history during the Gulf War, ended up losing with a lower percentage of votes (37.5 percent) than any earlier president except William Howard Taft in 1912.  For his part, Clinton had the third-lowest vote percentage of any presidential winner (43 percent), better only than Abraham Lincoln in 1860 and Woodrow Wilson in 1912.

At the end of 1998, speculation was that Arizona Senator John McCain would be the Republican favorite, although Texas Governor George W. Bush, son of former President George H. W. Bush, was considered a contender.  But to the surprise of many, Bush went on to triumph over McCain, and although outgoing Vice President Al Gore won the national popular vote in 2000, Supreme Court intervention in Florida’s vote count would make Bush the first president to win the election while losing the popular vote since Benjamin Harrison in 1888.

At the end of 2006, Illinois Senator Barack Obama had drawn a lot of attention since his 2004 Democratic National Convention speech, but when he announced his presidential candidacy, it was thought that former First Lady and New York Senator Hillary Clinton had the upper hand for the nomination in 2008.  But after a long, hard-fought battle, Obama emerged as the nominee. The Great Recession undermined Republican nominee John McCain, and led to Obama’s substantial victory.

At the end of 2014, as Republicans began planning presidential campaigns for 2016, real estate mogul Donald Trump, who had flirted with running before, was seen as unlikely either to announce or to overcome seasoned politicians in his party, including Ohio Governor John Kasich, former Florida Governor Jeb Bush, New Jersey Governor Chris Christie, and others.  Trump overcame all of his Republican opponents, but was still seen as unlikely to defeat Democratic nominee Hillary Clinton. But for the second time in 16 years, a candidate who lost the national popular vote went on to win the Electoral College, making him the fifth president to do so. This was certainly the biggest surprise since the upset victory of Harry Truman over Thomas E. Dewey in 1948; to many, it was even more startling.

At the end of 2018, as the 2020 Democratic presidential campaign began, former Vice President Joe Biden, who had passed on running in 2016 after the death of his son Beau, decided that he should run to work against the authoritarianism and destructive domestic and foreign policies of Trump. Biden lost the Iowa Caucuses and New Hampshire Primary, but then won the South Carolina Primary, and was able to overcome a multitude of competitors in other primary contests, including, among others, Senators Amy Klobuchar of Minnesota, Bernie Sanders of Vermont, and Elizabeth Warren of Massachusetts.  Biden would then go on to defeat President Trump by flipping five states that had aided Trump’s 2016 win—Georgia, Pennsylvania, Michigan, Wisconsin, and Arizona—and become the 46th President of the United States, taking office despite the shocking attack on the Capitol by Trump supporters on January 6, 2021.

So now at the end of 2022, Florida Governor Ron DeSantis is getting all the hype, but he will have many challengers in the Republican Party, likely including Donald Trump.  And the question of whether Joe Biden, who will turn 82 two weeks after the 2024 election, will run, or who might challenge him within the Democratic Party, is wide open.

So to project who will be president on January 20, 2025 is a pure guessing game, with the likelihood that it could be one of a multitude of alternative candidates, including Joe Biden himself.  After the surprising midterm election of 2022, with Biden emphasizing the emerging threat to American democracy and Democrats performing much better than expected of the party in power at the midterm, who can say that he might not have a good chance for a second term?

The Roundup Top Ten for November 18, 2022

Crypto Collapse Shows No Lessons Learned From Enron Episode

by Gavin Benke

The mythos embraced by both Jeff Skilling and Sam Bankman-Fried – that opaque and incomprehensible business practices signalled entrepreneurial genius – has only become more entrenched in the social media era. 

 

Monuments to the Unthinkable

by Clint Smith

German and European memorials to the Holocaust contrast starkly with an American memorial culture where the Confederate dead are revered, former slave plantations are tourist attractions, and state legislatures are seeking to ban the teaching of the nation's history in full. 

 

 

Why Direct Democracy is The Best Protection for Abortion Rights

by Rachel Rebouché and Mary Ziegler

Given the chance to vote directly on abortion rights, voters have been swayed by personal experience and shared stories to protect reproductive freedom and leave the choice in the hands of women, not politicians. 

 

 

A New AP Course Advances the Teaching of Race in American History (While the Right Seeks to Restrict It)

by Michelle A. Purdy

"The nation’s public schools have long been caught in a tug of war over whether they should be used to conserve or disrupt existing social arrangements."

 

 

Immigrant Merchants and Law-and-Order Politics in Detroit

by Kenneth Alyass

The Chaldean community of Detroit became a significant middleman-minority through the operation of small stores in working-class and majority-Black neighborhoods. As white flight and disinvestment created increasingly dire conditions, they also became a constituency for aggressive policing. 

 

 

Lunchtime in Italy: Work, Time and Civil Society

by Jonathan Levy

The Italian lunchtime insists that time be organized around communal rituals and sustenance, not work. Does the utter foreignness of this attitude in America help explain the current national derangement? 

 

 

Mike Davis Forced Readers to Embrace Specificity

by Gabriel Winant

The recently deceased radical scholar never allowed the particularity of historical moments to disappear under theoretical abstraction, which made his work powerful and compelling. 

 

 

Mormon Support for Same-Sex Marriage isn't a Total Surprise

by Benjamin E. Park

A historian of the Latter Day Saints explains that the church has become more willing to tolerate general expansions of rights for LGBTQ Americans at the same time as it reserves the right to dictate sexual mores within its own ranks. 

 

 

Tenured Faculty are the 18% – What Will They Do for the 82?

by Claire Potter

"Go to any faculty meeting, and you will hear what the 18% really believes: that if contract and contingent faculty deserved tenure-stream positions, they would have them."

 

 

When Christmas Started to Creep

by Bill Black

The story of "Christmas Creep" is not a linear encroachment of Yuletide on the rest of the calendar, and hinged on political decisions made during the Great Depression and World War II. 

 

A Loss for Bolsonaro is a Win for the Amazon and the Planet

 

 

President-elect Luiz Inácio Lula da Silva’s victory over Jair Bolsonaro in Brazil represents a historic chance to begin undoing some of the great harm that was inflicted on Brazil’s Amazon rainforest over the last four years. Since taking office in January 2019, Bolsonaro has ravaged the earth for short-sighted gains, turning back environmental regulations that any thinking human being would wish to preserve in the face of such unprecedented global degradation. Bolsonaro systematically dismantled environmental protections so that those who could not care less about the environment would be free to clear the land and turn it into pastures without any accountability.

The unfolding crisis of the Amazon is a catastrophe for climate change, biodiversity, Indigenous people of the region, and the untold wonders that human science has yet to understand. A 2020 study published in the journal Nature has shown that if the systematic destruction of the Brazilian Amazon continues unabated, much of it could become an arid savannah, or even “dry scrubland,” within decades given the rate of deforestation, largely due to deliberate and illegal fires that are meant to permanently convert forest into pastureland.

With the devastation of the rainforests has also come the devastation of those Indigenous people whose homelands and livelihoods are being destroyed by deforestation. Just imagine: between August 2020 and July 2021, over 5,000 square miles of rainforest were lost in the Brazilian Amazon – an area larger than the land area of Connecticut. In fact, under Bolsonaro the rate of destruction reached a ten-year high, as his administration turned a blind eye to illegal logging, the deforestation of Indigenous land, and, as Amnesty International notes, the “violence against those living on and seeking to defend their territories.”

Under Bolsonaro’s reckless and corrupt rule, his government deliberately “weakened environmental law enforcement agencies, undermining their ability to effectively sanction environmental crime or detect exports of illegal timber,” as Human Rights Watch describes. Fines for illegal logging in the Brazilian Amazon were suspended by presidential decree at the beginning of October 2019. Illegal seizures of land on Reserves and Indigenous territories in Brazil’s Amazon became routine, as Bolsonaro slashed the budget of agencies that protected the jungle from unauthorized clearing. Criminal organizations, aptly called “rainforest mafias,” allow cattle ranchers to operate with impunity, and according to the US State Department possess the “logistical capacity to coordinate large-scale extraction, processing, and sale of timber, while deploying armed men to protect their interests.”

It is hard to fathom the sheer scale of destruction that was wreaked by Bolsonaro upon the Amazon. Such rampant deforestation is tragic on many levels — it is destroying habitats and pushing countless species to the brink of extinction when we are already in the midst of a mass extinction of this planet’s animals, insects, and plants. It is hastening the onslaught of climate change when we are already facing the dire effects of a warming planet. And it is obliterating the lands of Indigenous people who have already suffered and been persecuted and murdered for decades.

To be sure, the extent of devastation of the rainforest under Bolsonaro was so enormous that we can barely begin to comprehend the loss to humanity, to science, and to our knowledge of undiscovered plants and animals that hold the answers to questions of which we have not even dreamt. This is a shameful loss to the entire world and to generations hence. The Bolsonaro government failed miserably to act as a responsible custodian of the Amazon and the Pantanal (the world’s largest tropical wetland, located mostly within Brazil, which along with the Amazon contains some of the world’s most biologically diverse ecosystems) — instead it helped in every way it could to hasten this unimaginable devastation. Dr. Michelle Kalamandeen, a tropical ecologist who studies the Amazon rainforest, observed that “When a forest is lost, it is gone forever. Recovery may occur but never 100% recovery.”

We must bring this travesty to a halt. By this wanton and dismally short-sighted decimation of the rainforests we are depriving humanity of knowledge that could alter medicine, improve our lives, and transform the world, from the way we build our cities to the ways we make our homes. Plant and animal species inspire new technologies, new forms of architecture, new kinds of design and materiality. Yet probably less than 1 percent of rainforest trees and plants have been studied by science — though no less than 25 percent of Western pharmaceuticals are derived from rainforest ingredients. By allowing rampant deforestation to continue, we are doing ourselves and future generations untold and unconscionable harm.

Let us remember that the Amazon does not simply belong to the countries in which it happens to be found – it is not the exclusive resource of those companies that are able to exploit it, appropriate its resources, and destroy it with impunity. The Amazon is part of our collective patrimony, a heritage beyond price which we are duty-bound to pass on to future generations, regardless of the profits that we may yield from its systematic rape. And let us make no mistake, or mince words—the Amazon is being raped hour by hour, month by month, year by year, and the world is watching in silence as this violation is repeated daily.

Time is running out for us to act in a meaningful way to stop this mindless decimation of one of the world’s greatest natural wonders. With the election of Lula as President of Brazil, we now have a historic opportunity to support and encourage him to immediately start working on a plan to reverse Bolsonaro’s disastrous policies in three main areas: the environment, public security, and scientific discovery.

First, President Lula should start by prohibiting deforestation, illegal logging, and land grabbing. To that end, he must stop at nothing to pass a new law, enshrined in the Brazilian constitution, that puts an end to the systematic destruction of the rainforest. The law should include mandatory prison sentences as well as heavy fines to prevent cattle ranchers and illegal loggers from ever again committing such crimes with impunity.

Second, he must develop a comprehensive plan to protect the human rights of Indigenous communities from the criminal networks that use violence, intimidation, and terror to cow the locals into silence. He should make such a plan the center of his domestic policy while improving security and providing the necessary funding for environmental agencies to perform their tasks with zeal.

Third, President Lula should invite the global scientific community to further study the wonders of the Amazon and, in partnership with them, initiate scores of scientific projects from which the whole world would benefit, while preserving the glory of the Amazon as one of the central pillars in the fight against climate change.

Finally, President Biden, who understands full well the danger that climate change poses, should provide political support and financial assistance to President Lula to help him reverse some of the damage that was inflicted on the Amazon by his predecessor. President Lula must view his rise to power and the responsibility placed on his shoulders as nothing less than a holy mission that will help save the planet from the looming man-made catastrophes of climate change.

A Hundred Years On, Tutankhamun's Alleged Curse Still Captivates

Howard Carter examines the inner coffin of Tutankhamun, c. 1925

 

 

Four months after the discovery of Tutankhamun’s tomb in November 1922, the expedition’s sponsor, Lord Carnarvon, died unexpectedly, and the tabloid press in both the UK and US had no doubt what was to blame. Claims were already widespread that an engraved plaque had been found in the burial chamber, reading “Death comes on swift wings to he who disturbs the tomb of the pharaoh.” The intruders had been affected by an ancient curse, the papers said, and the Earl’s death proved it.

 

Evidence mounted: at the exact moment Lord Carnarvon died, the lights went out across Cairo and his pet terrier dropped dead back home in Highclere Castle. A canary belonging to archaeologist Howard Carter had been swallowed by a cobra – a symbol of Egyptian royalty – on the night the tomb was first breached. The Daily Express newspaper reported that all over Britain, people had begun donating their Egyptian antiquities to museums for fear of being affected by a curse.

 

The story was further fuelled a month later when American railroad executive George Jay Gould caught a cold and died shortly after visiting the tomb. Was it the curse? Or was there a more scientific reason? The Express in Britain and New York World in the US speculated that the tomb had been impregnated with ancient poisons that had somehow remained toxic for over three thousand years.

 

Thereafter, every death associated with the tomb made headlines. Sir Archibald Douglas Reid, who X-rayed the mummy, died the following year of an unexplained illness. Egyptologist Professor Hugh Evelyn-White committed suicide and left a note saying there was a curse on him. Another team member, Arthur Mace, was a victim of suspected arsenic poisoning. Howard Carter’s secretary was smothered with a pillow in an elite gentlemen’s club. American Egyptologist Aaron Ember died in a house fire. And there were more.

 

Lord Carnarvon’s daughter, Lady Evelyn Herbert, believed the curse story. She told the Philadelphia Inquirer that she had offered to release her fiancé Brograve Beauchamp from their engagement, since she had entered the tomb with her father and feared that Brograve might become affected by the curse through marrying her. Brograve gallantly refused, saying that if she were cursed, she needed him all the more so he could protect her.

 

 

From the earliest curses to modern conspiracies

 

Throughout the Bible, the Koran and the Hindu Vedas, there are stories of curses placed on those who’ve sinned. In a famous example, Cain is condemned to a life of eternal wandering after he murders his brother Abel. Curse tablets and dolls have been found in tombs throughout the ancient world, invoking deities and promising a range of psychological and physical repercussions if certain conditions are breached. Some of these repercussions were pretty dire: Dido was so furious when Aeneas betrayed her that she cursed his descendants, swearing that Carthage and Rome would be forever enemies, and, according to legend, that’s what caused the Punic Wars, in which up to two million died.

 

The idea of curses in mummies’ tombs first took hold after Napoleon’s army stumbled across the Valley of the Kings in 1798 and looted its treasures. An early Victorian stage act involved two comics unwrapping mummies that then came back to life. The trope was adopted by umpteen Gothic writers, who churned out stories of mummies strangling, poisoning, or possessing their victims. Louisa May Alcott wrote one in 1867 called “Lost in a Pyramid, or The Mummy’s Curse”; Sir Arthur Conan Doyle wrote one in 1892 about a reanimated mummy. As a fictional subject, mummies and curses were pure Egyptian gold.

 

The Grimaldis of Monaco were said to have been cursed in the 14th century by a witch who swore they would never find happiness in marriage. Sure enough, there have been umpteen family divorces in the 20th century alone, plus the tragic death of Princess Grace in a car accident in 1982. Was that the witch’s fault?

 

The Kennedys have lost twelve family members to tragedy since 1944. Of Rose and Joe Kennedy’s nine children, four predeceased their parents – including Joseph and Kick, who died in plane crashes, and John F. and Robert, who were assassinated – and the deaths have continued in the 21st century. It’s vaguely asserted that someone Joe Kennedy crossed in business during the 1920s was responsible for the curse, but in that case the punishment seems disproportionately harsh.

 

JFK’s assassination is famously cited by conspiracy theorists, who can’t accept that Lee Harvey Oswald acted alone. Piecing together random aspects of the event in Dallas, they conclude it was a carefully planned murder carried out by the FBI/the Russians/organized crime/some other sinister force. Similarly, there are those who refuse to believe that nineteen men acting on behalf of a terrorist organisation few had heard of in 2001 were able to hijack four planes on 9/11, causing thousands of deaths. There were too many coincidences; it had to be an inside job.

 

We seek someone, or something, to blame for particularly shocking and terrifying events; otherwise, we’d have to accept that tragedy is the luck of the draw and we could be next.

 

Apophenia

 

“In the absence of perceived control, people become susceptible to detecting patterns in an effort to regain some sense of organisation,” says experimental psychologist Bruce Hood in Psychology Today. The human brain is resourceful at finding what seem like meaningful patterns in a random universe. This phenomenon has a name: apophenia. It leads some to attribute misfortunes to curses, conspiracies, past lives, ghosts or psychic phenomena, instead of accepting they are almost always the result of chance, human behaviour, or both.

So what about the ‘curse’ of Tutankhamun? Contrary to news stories, there was no warning inscribed on the walls of the burial chamber or elsewhere in the tomb. There were a few in other Ancient Egyptian sites – it made sense for priests to spread fear amongst the population to deter tomb robbers – but not here.

 

What accounts for all the deaths amongst those who visited King Tut’s tomb? Did the Ancient Egyptians have the technology to implant a long-lasting poison in 1323 BC? None was found, but modern scientists detected black mold and types of bacilli that could conceivably have caused congestion and bleeding in the lungs if inhaled in large quantities – but no evidence has emerged that anyone who died after entering the tomb was affected by them.

 

Another theory was that bat droppings in the tomb could have poisoned those who visited. But the tomb was sealed and airless so no bats could have survived within. Was it a poison transmitted by mosquitoes? That argument didn’t work because there was no water in the desert, and therefore no mosquitoes.

 

Sir Arthur Conan Doyle, a strong proponent of the curse theory, got around these technical difficulties by saying that “an evil elemental” had caused Lord Carnarvon’s death – a claim so unspecific that it’s hard for science to disprove.

 

In fact, Lord Carnarvon had suffered poor health since a car accident in 1909 when he incurred chest injuries that exacerbated his asthma. In March 1923 he was bitten on the cheek by a mosquito while on a trip down the Nile to Aswan, then he accidentally nicked the bite with his razor. Blood poisoning set in and, in those pre-antibiotic days, for a man with weak health, it proved fatal. Human behaviour combined with mosquito behaviour led to his death.

 

The truth is that of the twenty-six people present at the official opening of the tomb, only six had died a decade later. Those who spent most time working with the mummified body lived to an average age of seventy, which was beyond life expectancy at the time. Howard Carter – the archaeologist who devoted years of his life to excavating the tomb – should surely have been the main victim of any curse but he lived for seventeen years after its discovery. Lady Evelyn Herbert, said to have been the first to crawl inside, lived till 1980.

 

Given the conditions in Egypt at the time, with poor sanitation, snakes, crocodiles, and swarms of disease-bearing insects, the inside of Tutankhamun’s tomb was probably one of the safest places in the country. Lord Carnarvon’s sudden death, while sad, was entirely explicable by medical science. Reporters knew all this back in the 1920s – but far be it from them to spoil a good story.

The Wartime Service and Postwar Activism of One Latino Veteran

Henry Romo, on Leyte Island. Photo found in the 1990s by Henry Romo, Jr.

All photos accompanying this essay from Ricardo Romo collection.

 

 

On Veterans Day every November 11, we recognize the men and women who have served our country in war and in peacetime. World War II stands out as a particularly challenging period, when well-coordinated and well-armed armies led by the sociopathic leaders of Germany, Japan, and Italy threatened the very existence of Western civilization.

My dad, Enrique [Henry] Romo, served in World War II, and some seventy-five years ago he stored a batch of his war photos in a small cigar box at the bottom of my mom’s cedar chest. Until three weeks ago, I had never seen these pictures, and my mom, Alicia Saenz Romo, never mentioned the box of photos. Several other photos were placed in a small bag, which we discovered a few years before he died in 2005.

 

My dad, Henry Romo, with a captured Japanese flag.

The cigar box photos are unusual because they show soldiers in various combat zones in the Pacific theater, combat activities the soldiers could not have shared until after the war. But even after the war, some of the veterans chose not to discuss their experiences. The story of why many veterans kept their military experiences to themselves says much about the horrors of war and the desire to put those difficult times behind them.

World War II required a massive mobilization effort. Millions of Americans, whether engaged in essential duties of manufacturing arms and equipment or active in fierce naval, air, or ground battles, contributed to a victorious outcome for the Allied Forces. Many who served in the U.S. military returned to their homes after the war committed to the economic and political development as well as civic betterment of their communities. Others who had fought valiantly lost their lives in this conflict. 

My dad, along with thousands of American soldiers fighting in the Pacific region, witnessed some of the bloodiest battles of the war. Military historians estimate that more than 30 million soldiers and civilians were killed in the Pacific conflict during the course of WWII, compared with 15 million to 20 million killed in the conflict in Europe. 

Journalist Tom Brokaw’s prize-winning book The Greatest Generation is a brilliant account of men and women “who came of age during the Great Depression and the Second World War and went on to build modern America.” Brokaw interviewed hundreds of veterans who served in World War II and concluded that the generation “was united not only by a common purpose, but also by common values--duty, honor, economy, courage, service, love of family and country, and, above all, responsibility for oneself.”

 

Henry Romo on a bombed out railway car. 

 

My dad is one of those Americans who answered the call, as Brokaw described, “to save the world from the two most powerful and ruthless military machines ever assembled, instruments of conquest in the hands of fascist maniacs.” Many American historians are convinced that the men and women who fought in WWII were in fact a remarkable generation “because they succeeded on every front. They won the war; they saved the world.” 

My dad, a veteran and lifetime resident of San Antonio’s Westside, seldom mentioned his World War II service to his country, and never thought of himself as belonging to America’s “Greatest Generation.” 

Dad volunteered for the Army Air Corps shortly after the attack on Pearl Harbor. He saved the documents showing that he officially joined the Air Corps, a precursor to the U.S. Air Force, on October 23, 1942. A year earlier, at age 23, married and the father of one son, Dad had graduated from high school. He had dropped out of Lanier High School at age 15 and returned to classes in his early twenties to finish his education in the San Antonio Technical and Industrial night school program. He studied radio repair, and upon enlisting in the military he was assigned to the United States Army Air Corps radio repair training program. A photo he saved, showing the group of new recruits, may have been taken on the steps of his old high school.

 

Henry Romo, with a friend on the remains of a downed Japanese airplane.

 

My dad and my mom, Alicia, along with their newborn son Henry Jr., were sent to Austin in early 1943. My dad was to continue his radio repair course. Over the next six months dad took classes taught in a University of Texas barracks loaned to the U.S. military. Dad became highly proficient in repairing radios and hoped that his radio skills would enable him to fly with the bomber squads as a radio technician. 

He never got to fly with the bomber squads. He was fortunate: the vast majority of the bombers that flew out of the Philippines never returned to base. I once heard him remind a war buddy who served with him in the same squadron that after the company’s cook [Pfc. Miles H. Clayton] was killed in a bombing raid, my dad was assigned to help out in the base kitchen. Dad was modest and laughed the episode off. In reality, the photos he saved show that he was also engaged in field battles in the Philippine jungles.

 

Henry Romo and buddy in combat. 

 

Dad spent some time in New Guinea, but he was deployed for the last two years of the war to Leyte, a major island of the Philippines. Scholars have called the Battle of Leyte the last major battle of the Pacific, and the defeat of the Japanese forces there demonstrated the superiority of U.S. forces and weapons. Brig. General J.V. Crabb, Commander of the 5th Bomber Command, distributed a booklet to all the men who served with him. The booklet, designed and published at the base, included photos, drawings, and written accounts of some of the battles described by members of my dad’s squadron. Several of the photos and stories give observers a close-up look at how the U.S. military fought.

 

Henry Romo and friend on a captured Japanese tank.

Brokaw’s interest lay in writing about the men and women at war as well as what their lives were like when they returned home at the end of the war. Henry Romo came home from the war to find that discrimination against Latinos had not abated during his time away. After a Latino family learned that the local funeral home in Three Rivers, Texas, would not handle the burial of Pvt. Felix Longoria—because the local white community might object—my dad joined Dr. Hector Garcia in a newly formed Latino organization, the American G.I. Forum. The Forum successfully gained the support of the newly elected U.S. Senator Lyndon B. Johnson, who arranged for Pvt. Longoria, who had been killed in the Philippines, to be buried at Arlington National Cemetery outside Washington, D.C. My dad remained active with the G.I. Forum for many years, serving on the National Board and as director of the San Antonio regional chapter.

 

Henry Romo, on Leyte Island. 5th Bomber Squad. Photo found in the 1990s by Henry Romo, Jr.

 

My dad was proud of serving his country. He was not unusual as a veteran in not talking about his war experience, especially to his family. He never discussed his military service at any family gathering. The one time we heard about his WWII experience was when a former military buddy dropped by to visit with him at his small grocery store in the Westside of San Antonio. On Veterans Day, I am happy to share some of the WWII photos that my dad treasured but never discussed with his family. 

 

Are the Modern Stoics Really Epicureans?

Zeno of Citium (c. 335-c. 263 BCE) and Epicurus (341-270 BCE)

 

 

Modern Stoicism has saturated the philosophical market—seminars, apps, podcasts, retreats, bestseller lists, psychotherapy. As a specialist in ancient Greek philosophy, I admit that I’m pleased to see so many people take an interest in what I study for a living. Stoicism has a lot going for it, and many of my students are powerfully drawn to its core commitments. All that is to say, I can see the allure.

 

My aim here, though, is to convince readers, especially those committed to evolutionary science and modern physics, to learn more about Epicureanism, Stoicism’s oldest and greatest rival. Cards on the table—I prefer Epicureanism, and I have recently published a book on Epicureanism as a way of life.  That said, I think even devoted, forever members of the Stoic caucus have good reason to study Epicureanism, if only because taking your rivals seriously is a sign of intellectual virtue, an indication that you have not grown complacent. As a more controversial point, I suspect that many Modern Stoics are already Epicureans, at least by the standards of the Roman Emperor Marcus Aurelius. Let me explain.

 

The Epicureans and Stoics were at loggerheads from their very beginnings. Epicurus (341-270 B.C.E.) and Zeno of Citium (c. 335-c. 263 B.C.E.), Stoicism’s founder, were fellow residents of Athens, and their schools developed in response to the same set of political and material circumstances. Specifically, Athens was a city in decline. It found itself knocked down a peg after many years of power and democratic self-rule, the city having been conquered by the Macedonians, their fellow Greeks to the north. Athens had lost control of its major port, the Piraeus, the gateway to its once abundant wealth. The region suffered periodic military incursions and occasional food shortages. Many scholars (though not all) think this precarious environment explains why Greek philosophy during the Hellenistic Era seems to give greater attention to developing resilience in circumstances of scarcity, discord, and powerlessness.

 

Both Stoicism and Epicureanism, then, were designed to help followers manage periods of uncertainty and adversity, and in that respect they both perform commendably. Notable Stoics even tended to praise Epicureanism in this respect. The Roman Emperor Marcus Aurelius, who identified himself as a Stoic, admired Epicurus the man, especially his moderation, his tolerance of pain, and his noble death: “In sickness, then, if you are sick, or in trouble of any other kind, be like Epicurus.” Seneca, also a Roman Stoic, likewise praised Epicurus’ self-control. It makes sense that Marcus and Seneca would respect many features of Epicureanism. Both Stoicism and Epicureanism teach the importance of using reason to control our desires, that misfortune need not put happiness out of reach, and that an important part of living well is preparing to die well. They both suggest ways to preserve joy when things go south.

 

So what sets Epicureanism and Stoicism apart? One way to get at the answer is to consider why Marcus himself opposed Epicureanism. In short, Marcus objected to Epicurus’ natural science and his advocacy of hedonism, the view that humans achieve tranquility through strategic pursuit of pleasure and avoidance of pain. That sounds like two objections—natural science and hedonism—but it’s really one. The Epicureans were intellectually-refined hedonists because of their science.

 

The most striking feature of Epicurean natural science is how much the Epicureans got right. Epicurus thought the cosmos in its entirety is composed of atoms and void. The atoms collide, come together, and break apart according to fixed causal principles, with some occasional indeterminacy of the sort postulated in particle physics. The opening quote of Neil deGrasse Tyson’s Astrophysics for People in a Hurry comes from the Roman poet Lucretius, whose account of Epicurean atomism inspired major figures in the Scientific Revolution.

 

The Epicureans were also, thousands of years before Darwin, proponents of an evolutionary account of the origin of species. They thought the species that populate our earth are the ones that have proven the fittest in a fight for survival. On this evolutionary account, animals seek pleasure and avoid pain by nature (i.e., they are hedonists). Humans distinguish themselves from other animals by our powers of reason and our sense of ourselves in time, which allow us to deliberate about what will most effectively produce pleasure and minimize pain over the long term. We are animals with a souped-up practical reason.

 

Marcus rejected these Epicurean views wholeheartedly because he considered the divine creation of a providential universe essential to the Stoic project, as did other Roman Stoics like Epictetus and their Greek predecessors. For the Stoics, human rationality is a manifestation of God’s generosity to humans, not a sophisticated animal capacity. Marcus insists that “the whole divine economy is pervaded by Providence.” When he writes, “If not a wise Providence, then a jumble of atoms,” he means to offer two options: “If not Stoicism, then Epicureanism.” In fact, Marcus admits that if Epicurean natural science were right, he would fall into despair. Without providence, he asks, “Why care about anything?”

 

I have good friends, students, and close relatives who fall on opposing sides of the providential creation divide, and I understand that people have their own reasons for choosing one commitment over the other. Modern Stoics, though, cannot simply set aside the fact that the Stoics fell squarely on the side of providence without risk of undermining some of Stoicism’s core tenets. Stoicism’s emblematic acceptance of suffering follows from the Stoics’ ability to reconceive it as divine providence, as God working in “mysterious ways.”  Marcus writes that someone who suffers something “unpalatable” should “nevertheless always receive it gladly” because Zeus designed individual suffering “for the benefit of the whole.” Even Stoicism’s deep, admirable commitment to caring for all humankind, the notion that we are all “citizens of the cosmos,” is fundamentally grounded in the view that all human beings are manifestations of God.

 

Epicureans, by contrast, build their practical philosophy on a natural science that denies a cosmic significance to suffering, and they see their endorsement of hedonism as an outgrowth of treating humans as sophisticated animals rather than as expressions of a divine rational nature. Epicurus was not an atheist, but he denied a providential God who created the universe or intervenes in its events. Perhaps, then, many Modern Stoics should consider reading more about Epicureanism, since Marcus admired Epicurus’ resilience, temperance, and approach to death, which all grew out of a science many Modern Stoics already accept (and that the Stoics vehemently opposed).

 

Doug Mastriano's Political Mad Libs

Doug Mastriano wears a Confederate uniform in a 2013 U.S. Army War College faculty photo

 

 

On Sept. 16, Doug Mastriano, the Republican nominee for governor in Pennsylvania, held a rally in Chambersburg, Penn., featuring Donald Trump, Jr. But the rally’s viral moment occurred when Lance Wallnau, a self-proclaimed evangelical leader and futurist, asked the crowd to put their right hands in the air while he commemorated the Battle of Gettysburg. It was a historical mad libs moment, combining a motion associated with Nazi Germany with Civil War nostalgia. Meanwhile, Mastriano himself has been photographed wearing a Confederate uniform. A Mastriano adviser accused those critical of Mastriano’s Confederate uniform of seeking to “erase history” and invited reporters to go on a Gettysburg tour with the candidate, suggesting that they would “learn a lot.”

 

By playing with this sort of historical quotation—in word and deed—politicians and activists invite people who might embrace extremism to do so. The use of such symbols is a warning sign that democratic norms are crumbling.

 

But those who would defend democracy in 2022 can draw on history, too. We can learn from past assaults on democracy, whether during the Reconstruction-era United States or the Weimar Republic in interwar Germany, that such symbols matter.

 

After the Civil War, U.S. forces occupying the South banned the wearing of Confederate uniforms in an effort to quash the continued armed resistance to federal authority. While Congress worked out the details of Section 3 of the 14th Amendment, which barred former rebels from holding office, the Army recognized that the performance of Confederate rituals such as wearing the uniform was part of a campaign to intimidate Black voters and threaten the Black officeholders building local governments. The government determined that these were not empty rituals but were powerfully connected to the willingness to resort to White supremacist violence.

 

Ex-Confederates also recognized the power of symbols. And they sought to co-opt Lincoln’s memory. They justified their opposition to the civil rights of African Americans, for example, by portraying the murdered president as sympathetic to the plights of poor southern Whites who were “crushed” by Reconstruction policies — the “Great Heart” as he was remembered in the racist 1915 movie, “Birth of a Nation.”

 

After Reconstruction ended, symbols became powerful tools in the hands of those working to reverse the progress made toward a biracial democracy. Confederate veterans and women’s groups promoted a Lost Cause version of history that sought to establish white supremacy as the Civil War’s legacy. Their progress toward this end could be measured by the monuments and Confederate flags that began appearing everywhere in the early twentieth century. Reviving these once-outlawed symbols was dangerous then, and it is dangerous now.

 

In the 20th century, democracy’s defenders continued to recognize the power of symbols. For example, as the Nazi threat emerged in Germany in the early 1930s, public displays were subject to similar scrutiny.

 

In June 1930, the state of Prussia (of which Berlin was part) banned the wearing of Nazi stormtrooper (SA) uniforms. This regulation sought to restrict the visible performance of Nazi force and to control the violence that often accompanied such demonstrations. Even after the 1930 ban, police in Berlin regularly encountered young men wearing khaki trousers or white shirts. At what point did their sartorial decisions cross over into a banned political act? Records in the Berlin State Archive testify that police ultimately decided that whenever three or more people gathered in similar clothing that “diverged from a civilian norm,” they should be cited as violating the Prussian state’s restrictions on the Nazi paramilitary organization.

 

German authorities’ opposition to the Nazis proved rather half-hearted. Within two years, uniformed stormtroopers were legally back on the streets, and German conservatives used an emergency decree to take over the Prussian State. They justified their assault on a democratic government as a defense of national order. But their tolerance of Nazi violence helped foster a public embrace of a politics that would culminate in genocidal war.

 

During a 1973 performance at Cologne’s annual Carnival, a West German comedian warmed up the crowd by leading it through a series of familiar call and response rituals. When he concluded by yelling “Sieg,” the crowd obligingly (unthinkingly?) responded, “Heil!” As the crowd laughed, not all that uneasily it seems, he expressed ironic surprise that there were so many “old comrades” in the audience. Although he quickly moved on to his next bit, the power of his humor was its ability to call out the implicit reality of 1970s West Germany: the rituals of Nazi Germany remained disturbingly familiar to many of its citizens.

 

These examples are a reminder that how people choose to present themselves for public interpretation matters, and historians should draw attention to actions that tend to elude scrutiny in the relentless churn of the news cycle.

 

After the Mastriano rally, Wallnau dismissed critics who identified his actions as fascist. He claimed that the viral clip had merely captured him leading the attendees in a prayer to commemorate Joshua Chamberlain’s famous bayonet charge down Little Round Top on July 2, 1863. Audience members raised their arms as Wallnau recited, “Pennsylvania will be like Little Round Top and America will have a new birth of liberty.”

 

His words called to mind President Abraham Lincoln’s visit to Gettysburg to dedicate the cemetery there four months after the battle, when Lincoln urged listeners to defeat the Confederacy and give the nation “a new birth of freedom.” The freedom Lincoln referenced was emancipation. But Wallnau’s choice to replace “freedom” with “liberty” plucked the battle out of the context of a nineteenth-century war over human freedom and placed it into a twenty-first-century contest over supposed government overreach and the right of (violent) revolution.

 

The power of symbols — and of symbolic gestures — is that they never possess just one meaning. Lance Wallnau can always claim, with a veneer of plausible deniability, that the action he was orchestrating at that rally in Chambersburg was not a visible evocation of the Nazi salute. But we can’t permit Wallnau or Mastriano simultaneously to claim historical expertise and historical ignorance.

 

Like that Cologne crowd in 1973, many Americans in 2022 may be willing to discount as unimportant that many in the United States are willing to don Confederate uniforms, to fly Confederate and Nazi flags or to offer up some version of a Nazi salute. But these symbolic gestures remain much more than familiar rituals. They evoke a political movement that repudiated the freedom that Chamberlain fought for, and that Lincoln celebrated in his Gettysburg Address.

 

Gathering in a parking lot 25 miles west of Gettysburg, Wallnau cosplayed the Battle of Gettysburg to channel a specific kind of male fantasy, one that aspires to fight not just in an imagined past, but on behalf of a racist and authoritarian future. In that kind of role-playing, a Nazi salute fits right in. Ultimately, if American political movements decide to mimic Nazis, we should take them at their word.

Monkeypox Has Been Around for Decades; This Outbreak is a Product of Neglect

A juvenile monkeypox patient photographed in Liberia, 1971.

Photo: Centers for Disease Control and Prevention

 

 

Monkeypox is a zoonotic DNA virus first described in humans in 1970, when a 9-month-old patient unvaccinated against smallpox presented to a hospital in the Democratic Republic of the Congo (DRC) with pus-filled bumps resembling smallpox. The resemblance reflects the fact that both causative viruses belong to the same orthopox genus. Despite the confirmed monkeypox case count rising to 47 in sub-Saharan Africa by 1979, with seven cases proving fatal, the World Health Organization (WHO) Global Commission for the Certification of Smallpox Eradication dismissed its importance, citing ineffectual human-to-human transmission. Monkeypox subsequently migrated to other continents and had infected 65,415 people globally as of September 20, 2022. As WHO Director-General Tedros Adhanom Ghebreyesus now asseverates, “this virus has been circulating and killing in Africa for decades. It’s an unfortunate reflection of the world we live in that the international community is only now paying attention to monkeypox because it has appeared in high-income countries.”

 

When the DRC encountered the world’s first monkeypox case, the nation was already tackling civil and economic turbulence, as well as waning population immunity amid a concurrent HIV outbreak. Neglect from wealthy countries and institutions of power cemented the ensuing shortage of vaccines, medical care, and research funding. The Global North even hoarded vaccines after the eradication of smallpox in preparation for potential leaks from biological laboratories, contributing to a shortage of orthopox vaccines across the DRC. According to Harvard Medical School professor Bruce Walker, “[e]very person that becomes infected with the virus gives the virus another chance to mutate. Aside from just the humanitarian issues, there’s an additional issue of trying to prevent a worse pandemic from arising.” Pandemics thus offer a lesson policymakers never learn: isolationist responses to global health threats might appear to benefit powerful nations. In reality, however, such myopic strategies endanger everyone.

 

Monkeypox went international in 2003, when a Ghanaian exotic animal shipment housing infected rodents arrived in America. An Illinois animal distributor housed these animals alongside prairie dogs, which became infected and transmitted the virus to 70 humans. Instead of redirecting resources to tackle monkeypox in Africa, American epidemiologists focused on the potential domestic impacts of this real foreign problem. For example, American scientists concentrated on checking animals in their own country without bothering to locate the original wild animal reservoir in the resource-poor source country. Though epidemiologist Anne Rimoin notes that “we narrowly escaped having monkeypox establish itself in a wild animal population,” this intense focus on the potential for infection in wealthy countries rather than its reality in Africa reflects a colonial understanding of disease as belonging within certain bodies and borders. By centering Western welfare and failing to intervene in the vulnerable areas of active infection, the continued inaction of the Global North overseas kept the engine of monkeypox variant production running at full speed.

 

Monkeypox is now endemic in 10 countries across Sub-Saharan Africa, and is present in over 50 countries worldwide. With powerful nations finally threatened, the WHO has begun its race to catch up to this virus, recognizing the disease as a “public health emergency of international concern.”

 

Too late.

 

Even after declaring monkeypox a public health emergency, WHO member nations stockpiled 31 million orthopox vaccine doses and refused to distribute them to African nations, citing the potential for unforeseen side effects. The real side effect of this policy? The Central African monkeypox strain’s mortality rate of about 10% now dwarfs the corresponding 1-3% rate of the strain endemic to West Africa. In the same vein, wealthy countries only acknowledged the recent COVID-19 pandemic when it reached their shores, and again hoarded boosters from unvaccinated countries across the Global South. Despite a surplus of vaccines, this similarly lackluster international effort to vaccinate Sub-Saharan Africa enabled the spread of the more easily transmissible Omicron variant. We keep paying the price of neglect, but refuse to learn from our mistakes.

 

This apathy toward the region is reminiscent of the slow and uncoordinated global response to the mosquito-borne Zika virus outbreak that spread from Brazil beginning in 2015, which caused abnormal brain development in the fetuses of infected pregnant women. As the New York Times reported in 2017, “while tourists were warned away from epidemic areas, tens of millions of women living in them were left unprotected.” Scarcity of clean drinking water, lack of healthcare in rural environments, and sexual and reproductive restrictions all amplified these adverse health outcomes.

 

Though the Brazilian government was roundly criticized for inaction, the broader international community must be held similarly responsible for the spread of Zika virus. According to Lawrence Gostin, director of the O’Neill Institute for National and Global Health Law at Georgetown University, “Latin America was pretty much left to its own devices.” This them-and-us apathy from wealthy countries might have been due to the reliance of the Zika virus on mosquito vectors, which prevail in regions with stagnant water, humid climates, and poor infrastructure. Powerful public health organizations and nations without these conditions subsequently adopted a colonial understanding of disease as belonging to peripheral countries oceans away from them.

 

Beyond devoting scant resources to vaccine development, the WHO and Centers for Disease Control and Prevention (CDC) did not issue formal guidance for women to postpone pregnancy or seek an abortion if an ultrasound revealed fetal microcephaly or brain damage. This lack of advice might have been a defensive tactic to minimize criticism from the political left over violating women’s autonomy, or from the right over religious values. As a result of this limited guidance, many gynecologists felt compelled to deviate from official recommendations and privately suggest abortion to these patients. Despite such interventions on the individual level, an estimated 2,500 Brazilian children were born with life-threatening microcephaly during the Zika pandemic, with evidence of mosquito-transmitted infection reported in 86 other countries. Authorities tackling other pandemics like monkeypox must now reflect on this failure and allocate resources swiftly and generously to resource-poor countries to prevent undue suffering everywhere.

 

For the 21 million immigrants streaming into Ellis Island between 1892 and 1924, among the most feared government officials were the physicians of the U.S. Public Health Service. After disembarking, a physician would inspect bodies for atherosclerosis, rheumatic fever, and lung problems. As the immigrants continued walking, another physician would watch intently for posture, muscle weakness, and lameness. Later, yet another physician would inspect people’s faces and necks for asymmetry, defects, and goiters; their nails, skin, and scalps for fungal infections; and their mental acuity or psychopathic tendencies. Finally, the most feared exam would come: an inspection of the eyelids for trachoma. Using a hook, doctors would check for the blindness-producing disease, easily cured today with antibiotics. Then, with one word, immigrants, ultimately totaling 200,000, would be returned to their countries of origin, their hopes of a better future dashed. Because of this medical neglect of outsiders to the US, these returned immigrants were free to spread the communicable diseases observed at Ellis Island within their home countries, which likely did not share a similarly robust healthcare infrastructure with the US.

 

Modern viral molecular phylogenetics reveals that HIV arrived in New York City around 1969 from Haiti. Yet it was not significantly discussed until the 1980s or significantly addressed until the 1990s. HIV had emerged sometime between the 1920s and 1940s in the Congo, primarily impacting the rural communities under Belgian rule. Because of the lack of medical infrastructure and neglect by Belgian officials, the viral spread went unnoticed, despite an unusual spike in deaths of young and otherwise healthy individuals. NYU Professor Dr. Joseph Osmundson noted in an NYU news story that a country that provides its citizens access to healthcare “would notice” a rise in unexpected deaths due to rare cancers and rarely lethal pathogens. Of note, when the Congolese gained independence in 1960, there were no Congolese doctors, owing to a history of colonial legislation forbidding Congolese natives from pursuing education beyond the fifth grade. As a result, foreign health interventions require education initiatives in target communities to produce more informed citizens and healthcare leaders. These societies, if enjoying greater education, might prove more receptive to foreign interventions while remaining trained and ready to deter the potential encroachment of power from the Global North.

 

With global warming increasing the prevalence of pandemics, we must better prepare ourselves to fight these disasters. Despite this rising pressure to cooperate, countries in the Global North remain more willing to vaccinate away any possibility of infection on domestic shores before offering such life-saving supplies to resource-poor viral epicenters to combat variant generation overseas. We thus encourage wealthy countries and influential organizations to broaden their worldviews beyond the Global North and crush future pandemics at the source. Global interests cannot continue operating reactively and defensively, hoarding vaccines and ignoring threats until they arrive at domestic shores. A properly active and offensive strategy entails prioritizing funding for researchers, educators, and communities on the front lines.

 

On the battlefield of public health, everyone is conscripted to fight a common viral enemy. As infected bullets fly from the barrels of noses and viral gas attacks drift over crowded areas, saving others first might not be a popular option among anxious soldiers and the generals who risk mutiny with every choice. Regardless, past pandemic missteps have revealed the booby traps lining the path forward with monkeypox: if we continue failing to support the world’s most vulnerable regiments, inaction will continue to drive variant generation and heave the world into unnecessary suffering.

"Divisive Concepts" Bans Will Undermine Teaching Some of the Values Conservatives Claim to Uphold

From Jacob Lawrence, "Douglass Argued Against John Brown's Plan to Attack the Arsenal at Harper's Ferry"

Image: National Archives and Records Administration (NARA)

 

 

How to teach Black history in K-12 schools has been a contentious issue in recent years, especially because of the cultural impact of the George Floyd protests. Republican politicians are alleging a vast conspiracy, cooked up by radical left-wingers, to corrupt traditional narratives of American history through what they describe as “critical race theory.” Right-wing activists have framed it as an attempt to indoctrinate students across the country into “Marxist” thought, “destroying” the foundations on which the United States was built.

Such talk has provoked state legislatures across the country to ban “critical race theory” from being taught in their schools. Many of these bans, passed and proposed, are ambiguously worded, possibly putting the teaching of Black history in jeopardy. They purport to ban the teaching of any history deemed “divisive” or “controversial” to certain groups.

There is, of course, no reason that teaching Black history will automatically evoke those feelings. Take, for example, the history of the abolitionist movement in the United States, and specifically Frederick Douglass. The movement, with Douglass at the forefront, led the effort to end the institution of slavery in this country, and it met considerable resistance throughout the country, culminating in a deadly and destructive civil war. While studying this era may bring up painful thoughts, figures of inspiration and patriotism also emerge in a careful reading of this time.

Frederick Douglass is a complex figure who offers traits that could find admiration on both the right and left. Born into slavery, he escaped bondage and became a noted writer and orator who attracted crowds, Black and white, from across the country. Despite enduring a childhood of oppression, Douglass came to embrace the principles embedded in the Constitution, arguing that the document was a defense of the freedom of all peoples. At a time when many free Blacks were arguing that brighter futures for the race lay outside the country, Douglass argued fiercely against that notion. He believed that Black Americans should stay within the United States and fight for their freedoms and liberties here, even encouraging Blacks to voluntarily enlist in the Union Army during the Civil War. In a recruiting speech in Philadelphia in 1863, he said that he “[held] that the Federal Government was never, in its essence, anything but an anti-slavery Government…such is the Government, fellow-citizens, you are now called upon to uphold with your arms.”

The idea that the Constitution ultimately would doom slavery is a common refrain among many conservative academics today, and preventing such a speech from being studied in schools would ironically disadvantage their cause. With respect to recruiting Black soldiers, Douglass stated to his crowd that “this is no time for hesitation…the hour has arrived, and your place is in the Union Army…in your hands [the] musket means liberty.” Second Amendment enthusiasts, who are often behind these bans on “critical race theory,” would do well to listen to Douglass’ words here and see what kind of figure he was. His liberal appeals for reform offer something for the left, and his dedication to constitutionalism offers something for the right.

Notably, near the end of the Civil War, abolitionists held a convention in Syracuse, New York, to decide the best course for Blacks in America once the war concluded. Douglass spoke of “[promoting] the freedom, progress, elevation, and perfect enfranchisement, of the entire colored people of the United States; to show that, though slaves, we are not contented slaves, but that, like all other progressive races of men, we are resolved to advance in the scale of knowledge, worth, and civilization, and claim our rights as men among men.” A resolution was also made stating “that as natives of American soil, we claim the right to remain upon it: and that any attempt to deport, remove, expatriate, or colonize us to any other land, or to mass us here against our will, is unjust; for here were we born, for this country our fathers and our brothers have fought, and here we hope to remain in the full enjoyment of enfranchised manhood, and its dignities.” Those abolitionists who participated saw that their future was in the United States, and that the best course for Blacks was to use the Constitution to fight for their rights and liberties here.

The end of the Civil War did not bring all aspirations for equality into reality for Black Americans, but Douglass still spoke of upholding the Constitution’s values. In a speech reflecting on the Civil War, amid the rise of the Lost Cause of the Confederacy, Douglass emphasized the importance of remembering what the war had been fought over. He stated, “the American people will, in any great emergency, be true to themselves. The heart of the nation is still sound and strong, and as in the past, so in the future, patriotic millions, with able captains to lead them, will stand as a wall of fire around the Republic, and in the end see Liberty, Equality, and Justice triumphant.” Even when history was seemingly making a wrong turn, with white supremacy resurgent as a result of the deliberate undermining of Reconstruction policies, Douglass saw unity between all Americans regardless of race on the horizon. Republicans today promote themselves as being for unity and equality even in the face of adversity, and if they wish to make a valid case for that argument, they can harken back to Douglass’ words on this matter.

Abolitionists embraced what the United States had to offer despite difficult times, and it would be a shame if their history were subsumed under these Republican-proposed bans. Frederick Douglass certainly offers a unifying vision for those on each side of the political spectrum. Anyone arguing that preventing grade school students from learning this history is “protecting” them from harm would do well to read more into what they are trying to ban. It is a disservice to our young to prevent them from learning how our founding principles were embraced by those free and enslaved, and to teach us all how we can work together to build a brighter future for all Americans.

The Roundup Top Ten for November 11, 2022

Why King Oil Rules American Politics

by Meg Jacobs

Politicians have repeatedly faced imperiled election prospects because of high gas prices, yet remain unwilling to take serious steps to unwind the oil dependency built into the American environment. 

 

Asian Americans Helped Build Affirmative Action; Today Some are Working to Dismantle It

by Ellen Wu

The midcentury rise of fascism and struggles for Black civil rights gave some Japanese American activists an opening to argue for principles of proportionality in ethnic representation in politics, education, employment, and other areas, key support for the group of policies that became affirmative action. 

 

 

The 1968 Tuskegee Student Uprising and the Moral Force of the Black University

by Brian Jones

"In a moment when Black studies and Black history are widely under attack, it can be useful to remember that Black student activists have historically been at the center of fights to democratize higher education and expand the curriculum."

 

 

Miami Was Once a Model for Diversity Training—But it was Always Controversial

by Catherine Mas

Beginning in the 1970s, Miami-Dade County led the way in exploring ways to train health and human services professionals to interact with the area's diverse populations with respect and effective communication. The conflicts exploited by the "Stop WOKE" Act date back to this period. 

 

 

Human Beings Make Elections Work – Were You Kind to Poll Workers Today?

by Amel Ahmed

Overworked, underpaid, and now harassed and even threatened: election workers are the backbone of democracy and the nation can't afford to have them pushed out of their jobs. 

 

 

Critical Minerals and Geopolitical Competition

by Gregory Brew and Morgan Bazilian

Can developed nations decarbonize without exacerbating the geopolitics of resource extraction, as demand for critical minerals conflicts with local labor, environmental, and human rights protections? 

 

 

In "God Forbid" the Falwells Epitomize Christian Hypocrisy

by Anthea Butler

The story of the Falwells and the scandal that undid their empire isn't about sexuality, but about the Evangelical compulsion to push back social change and the elevation of power over morality. 

 

 

The Fascism Debate is Over; Fascism Won

by Jonathan M. Katz

Academic hair-splitting about the applicability of the F-word to the MAGA phenomenon has not served the cause of democracy well. 

 

 

"Bolsonarismo" After Bolsonaro: Lula's Return and Antifascist Organizing in Brazil

by Sean Purdy

Entrenched support for Bolsonaro in Brazil's police, military, and institutions means that the left will need to sustain grassroots mobilization to prevent the right from sweeping back into power. 

 

 

No, Liberal Historians Can't Tame Nationalism

by Eran Zelnik

Liberal historians confronted with both right-wing nationalism and renewed "history wars" have tried to thread a needle by telling a positive story of nationalism. The author contends the exclusionary and belligerent aspects of nationalism can't be domesticated by surrounding them with the right narrative.

 

Russian Soldiers' Calls Home Echo Moral Injury Testimony of Vietnam Vets

Members of Vietnam Veterans Against the War offer testimony at a panel on drug abuse at Faneuil Hall, Boston, October 10, 1971. 

Albertson, Jeff. Vietnam Veterans Against the War Winter Soldier Investigation: Faneuil Hall audience from behind panel, October 10, 1971. Jeff Albertson Photograph Collection (PH 57). Special Collections and University Archives, University of Massachusetts Amherst Libraries

 

 

On this sixty-eighth Veterans Day, the American citizenry will thank its veterans for risking their lives to protect the United States and its freedoms.

Countless veterans also deserve the nation’s thanks for serving wholly voluntary tours of duty since reentering civilian life. 

Many veterans have provided, and continue to provide, other veterans with life-saving mental health assistance.  Others have become, and are still serving as, antiwar activists.  In the past half-century, a great number have done both, as best evidenced by the actions and activism of the organization Vietnam Veterans Against the War (VVAW).  By 1970, and with help from psychiatrists like Dr. Jonathan Shay, the VVAW membership came to understand that stopping future carnage is the only way to alleviate crushing feelings of guilt.  At great personal risk to their relationships, their reputations, and even their job prospects, these Vietnam veterans bravely stepped out from behind the warrior image that pervades our popular culture and revealed their broken hearts. 

What the VVAW membership said at the organization’s Winter Soldier Investigation from January 31 to February 2, 1971, and in front of the Senate Foreign Relations Committee on April 22 of that year is all the more striking today when we consider how closely their public revelations match what Russian soldiers recently said in private phone calls to their loved ones about President Putin’s invasion of Ukraine and assault on its capital city of Kyiv. (These phone calls were intercepted by Ukrainian officials and, on September 28, 2022, printed in translation by The New York Times.)

Once again, soldiers are learning that the casualties of war are never just the fighting forces who are injured or killed.  The soldiers who physically survive suffer what Dr. Shay calls a moral injury which, although imperceptible on the outside, can be life-threatening. 

Here are some of the moral injuries enumerated by Vietnam veterans and recently suffered by Russian soldiers. 

 

Warning: soldiers often use graphic language to describe their experiences.

 

Soldiers are asked or forced to kill civilians.

Michael Kenny, US Marines: “Circumstances would come up where there would be a patrol walking along, a single person or a small group of persons would be sighted at a distance of anywhere from, like, one to maybe five hundred meters. The standard procedure was to holler ‘Dong Lai!’ which is ‘Stop.’ A lot of times the civilians or Vietnamese couldn't hear at that distance and if they didn't respond immediately, the procedure was to have the squad or platoon open up on these people. Upon approaching the bodies, it was usually found that these people had no weapons at all; that the only reason they hadn't stopped was that they hadn't heard or were frightened.”

Sergey, a Russian soldier in Ukraine: “They told us that, where we’re going, there’s a lot of civilians walking around.  And they gave us the order to kill everyone we see….  I’ve never seen so many corpses in my fucking life.  It’s just completely fucked.  You can’t see where they end….  I’ve already become a murderer.  That’s why I don’t want to kill any more people, especially ones I will have to look in the eyes.”

 

Ineffective military leadership leaves soldiers vulnerable.

John Kerry, US Navy: “Where is the leadership? We are here to ask where are [former Defense Secretary (1961-1968) Robert] McNamara, [former National Security Adviser (1966-1969) Walt] Rostow, [former National Security Adviser (1961-1966) McGeorge] Bundy, [former Deputy Defense Secretary (1961-1964) Roswell] Gilpatric and so many others….  These are commanders who have deserted their troops, and there is no more serious crime in the law of war.”

Aleksandr, a Russian medic sent to Ukraine with the 237th Airborne Regiment: “Putin is a fool.  He wants to take Kyiv.  But there’s no way we can do it.”

Roman, a Russian soldier in Ukraine: “Fucking higher-ups can’t do anything.  Turns out, they don’t really know anything.  They can only talk big in their uniforms.”

 

Military leaders consider soldiers dispensable.

Christopher Soares, US Marines: “We lost, I'll be very conservative, at least 50% of these 2,000 men in this operation [Operation Dewey Canyon, an American military invasion of the neutral country of Laos]….   I remember an incident in which… two squads got ambushed one right after the other and wound up with three men killed and fourteen wounded and not one enemy soldier killed. And that's the way we fought in Laos. I mean, like, just everybody was being killed, left and right, and they called this operation a success.”

Unnamed Russian soldier in Ukraine: “Dear, I really want to go home.  I'm so fucking tired of being afraid of everything.  They brought us to some fucking shithole.  What are we fucking waiting for? To be fucking killed?”

Sergey: “There were 400 paratroopers. And only 38 of them survived.… Because our commanders sent soldiers to the slaughter.”

 

Soldiers are expected to treat the people they were sent to conquer as less than human.

Scott Camil, US Marines: “When you shot someone you didn't think you were shooting at a human. They were a gook or a Commie and it was okay. And anything you did to them was okay because, like, they would tell you they'd do it to you if they had the chance.”

Nikita, a Russian soldier sent to Ukraine with the 656th Regiment of the National Guard: “Everything was fucking looted.” 

 

There is no justifiable reason for the violence.

Sean Newton, US Marines: “The Communist threat was brought up time and time again [during Basic Training], like, you had to go over there and do this thing so that they wouldn't come invading the United States, make a beach landing, or something or other.”

Sergey: “Mom, we haven’t seen a single fascist here....  This war is based on a false pretense. No one needed it.  We got here and people were living normal lives….”  

 

These moral injuries destroy soldiers’ faith in their country and its institutions.

Don Duncan, US Special Forces: “The [American] men keep getting killed. And every day the rage builds up, and the hate grows a little harder. And that rage must vent itself. And who do we blame this rage upon? The captain that gave the order to attack? The people that sent them over there so the captain could give them that order? Or the people who are shooting at you? The Vietnamese are shooting at you, and fuck it, you'll kill Vietnamese, that's what you're in Vietnam for. So that terrible hatred spills out. And the whole thing not only destroys Vietnamese. It destroys the people who are destroying the Vietnamese.” 

Vlad, a Russian soldier in Ukraine: “Fuck the army.”

 

President Nixon was unable to fulfill his goals in Southeast Asia because scores of American soldiers on the ground evaded or refused orders and thousands of others who made it home chose to reveal the truth about the illegal and immoral nature of the United States’ aims and its military strategy.  The intercepted phone calls published in The New York Times are reason to hope for a similar backlash in Russia.

]]>
Thu, 01 Dec 2022 04:15:49 +0000 https://historynewsnetwork.org/article/184347 https://historynewsnetwork.org/article/184347 0
The Wartime Service and Postwar Activism of One Latino Veteran

Henry Romo, in Leyte Island. Photo found in the 1990s by Henry Romo, Jr.

All photos accompanying this essay from Ricardo Romo collection.

 

 

On Veterans Day every November 11, we recognize the men and women who have served our country in war and in peacetime. World War II stands out as a particularly challenging period when well-coordinated and well-armed armies led by sociopathic leaders of Germany, Japan, and Italy threatened the very existence of the Western civilization.

My dad, Enrique [Henry] Romo served in World War II and seventy-five years ago he stored a batch of his war photos in a small cigar box at the bottom of my mom’s cedar chest. Until three weeks ago, I had never seen these pictures, and my mom, Alicia Saenz Romo, never mentioned the box of photos. Several other photos were placed in a small bag which we discovered a few years before he died in 2005. 

 

My dad, Henry Romo, with a captured Japanese flag.

The cigar box photos are unusual because they show soldiers in various combat zones in the Pacific theater, combat activities the soldiers could not have shared until after the war. But even after the war, some of the veterans chose not to discuss their experiences. The account of why many veterans kept the military experiences to themselves says much about the horrors of war and the desire to put those difficult times behind them.

World War II required a massive mobilization effort. Millions of Americans, whether engaged in essential duties of manufacturing arms and equipment or active in fierce naval, air, or ground battles, contributed to a victorious outcome for the Allied Forces. Many who served in the U.S. military returned to their homes after the war committed to the economic and political development as well as civic betterment of their communities. Others who had fought valiantly lost their lives in this conflict. 

My dad, along with thousands of American soldiers fighting in the Pacific region, witnessed some of the bloodiest battles of the war. Military historians estimate that more than 30 million soldiers and civilians were killed in the Pacific conflict during the course of WWII, compared with 15 million to 20 million killed in the conflict in Europe. 

Journalist Tom Brokaw’s prize winning book The Greatest Generation is a brilliant account of men and women “who came of age during the Great Depression and the Second World War and went on to build modern America.” Brokaw interviewed hundreds of veterans who served in World War II and concluded that the generation “was united not only by a common purpose, but also by common values--duty, honor, economy, courage, service, love of family and country, and, above all, responsibility for oneself.”

 

Henry Romo on a bombed out railway car. 

 

My dad is one of those Americans who answered the call, as Brokaw described, “to save the world from the two most powerful and ruthless military machines ever assembled, instruments of conquest in the hands of fascist maniacs.” Many American historians are convinced that the men and women who fought in WWII were in fact a remarkable generation “because they succeeded on every front. They won the war; they saved the world.” 

My dad, a veteran and lifetime resident of San Antonio’s Westside, seldom mentioned his World War II service to his country, and never thought of himself as belonging to America’s “Greatest Generation.” 

Dad volunteered for the Army Air Corps shortly after the attack on Pearl Harbor. He saved the documents showing that he officially joined the Air Corps, a precursor to the U.S. Air Force, on October 23, 1942. A year earlier, at age 23, married and the father of one son, dad graduated from high school. He dropped out of Lanier High School at age 15 and returned to high school classes in his early twenties to finish his education at the San Antonio Technical and Industrial night school program. He studied radio repair and upon enlisting in the military, he was assigned to the United States Army Air Corps radio repair training program. A photo he saved may have been taken on the steps of his old high school showing the group of new recruits.

 

Henry Romo, with a friend on the remains of a downed Japanese airplane.

 

My dad and my mom, Alicia, along with their newborn son Henry Jr., were sent to Austin in early 1943. My dad was to continue his radio repair course. Over the next six months dad took classes taught in a University of Texas barracks loaned to the U.S. military. Dad became highly proficient in repairing radios and hoped that his radio skills would enable him to fly with the bomber squads as a radio technician. 

He never got to fly with the bomber squads. He was fortunate because the vast majority of the bombers that flew out of the Phillipines never returned to base. I once heard him remind a war buddy who served with him in the same squadron that after the company’s cook [Pfc. Miles H. Clayton] was killed in a bombing raid, my dad was assigned to help out in the base kitchen. Dad was modest and laughed the episode off. In reality, the photos he saved show that he was also engaged in field battles in the Philippine jungles.

 

Henry Romo and buddy in combat. 

 

Dad spent some time in New Guinea, but he was deployed the last two years of the war to the Island of Leyte, a major island of the Philippines. Scholars have declared the Battle of Leyte as the last major battle of the Pacific. The defeat of the Japanese forces at Leyte demonstrated the U.S. superior forces and weapons. Brig. General J.V. Crabb, Commander of the 5th Bomber Command, distributed a booklet to all the men who served with him. The booklet, designed and published at the base, included photos, drawings, and written accounts of some of the battles described by members of my dad’s squadron. Several of the photos and stories give observers a close-up look at how the U.S. military fought.

 

Henry Romo and friend on a captured Japanese tank.

Brokaw’s interest lay in writing about the men and women at war as well as what their lives were like when they returned home at the end of the war. Henry Romo came home from the war to find that discrimination against Latinos had not abated during his time away. After a Latino family learned that the local funeral home in Three Rivers, Texas would not handle the burial of Pvt. Felix Longoria—because the local white community might object—my dad joined Dr. Hector Garcia in a newly formed Latino organization, the American G.I. Forum. The Forum successfully gained the support of the newly elected U.S. Senator Lyndon B. Johnson, who arranged for Pvt. Longoria, who had been killed in the Philippines, to be buried at Arlington National Cemetery outside Washington, D.C. My dad remained active with the G.I. Forum for many years, serving on the National Board and as director of the San Antonio regional chapter.

 

Henry Romo, on Leyte Island, 5th Bomber Squad. Photo found in the 1990s by Henry Romo, Jr.

 

My dad was proud of serving his country. He was not unusual as a veteran in not talking about his war experience, especially to his family. He never discussed his military service at any family gathering. The one time we heard about his WWII experience was when a former military buddy dropped by to visit with him at his small grocery store in the Westside of San Antonio. On Veterans Day, I am happy to share some of the WWII photos that my dad treasured but never discussed with his family. 

 

Doug Mastriano's Political Mad Libs

Doug Mastriano wears a Confederate uniform in a 2013 U.S. Army War College faculty photo

 

 

On Sept. 16, Doug Mastriano, the Republican nominee for governor in Pennsylvania, held a rally in Chambersburg, Penn., featuring Donald Trump, Jr. But the rally’s viral moment occurred when Lance Wallnau, a self-proclaimed evangelical leader and futurist, asked the crowd to put their right hands in the air while he commemorated the Battle of Gettysburg. It was a historical Mad Libs moment, combining a motion associated with Nazi Germany and Civil War nostalgia. Meanwhile, Mastriano himself has been photographed wearing a Confederate uniform. A Mastriano adviser accused those critical of Mastriano’s Confederate uniform of seeking to “erase history” and invited reporters to go on a Gettysburg tour with the candidate, suggesting that they would “learn a lot.”

 

By playing with this sort of historical quotation—in word and deed—politicians and activists invite people who might embrace extremism to do so. The use of such symbols is a warning sign that democratic norms are crumbling.

 

But those who would defend democracy in 2022 can draw on history, too. We can learn from past assaults on democracy, whether during the Reconstruction-era United States or the Weimar Republic in interwar Germany, that such symbols matter.

 

After the Civil War, U.S. forces occupying the South banned the wearing of Confederate uniforms in an effort to quash the continued armed resistance to federal authority. While Congress worked out the details of Section Three of the 14th Amendment, which barred former rebels from office, the Army recognized that the performance of Confederate rituals such as wearing the uniform was part of a campaign to intimidate Black voters and threaten Black officeholders building local governments. The government determined that these were not empty rituals but were powerfully connected to the willingness to resort to White supremacist violence.

 

Ex-Confederates also recognized the power of symbols. And they sought to co-opt Lincoln’s memory. They justified their opposition to the civil rights of African Americans, for example, by portraying the murdered president as sympathetic to the plight of poor southern Whites who were “crushed” by Reconstruction policies — the “Great Heart” as he was remembered in the racist 1915 movie, “Birth of a Nation.”

 

After Reconstruction ended, symbols became powerful tools in the hands of those working to reverse the progress made toward a biracial democracy. Confederate veterans and women’s groups promoted a Lost Cause version of history that sought to establish white supremacy as the Civil War’s legacy. Their progress toward this end could be measured by the monuments and Confederate flags that began appearing everywhere in the early twentieth century. Reviving these once-outlawed symbols was dangerous then, and it is dangerous now.

 

In the 20th century, democracy’s defenders continued to recognize the power of symbols. For example, as the Nazi threat emerged in Germany in the early 1930s, public displays were subject to similar scrutiny.

 

In June 1930, the state of Prussia (of which Berlin was part) banned the wearing of Nazi stormtrooper (SA) uniforms. This regulation sought to restrict the visible performance of Nazi force and to control the violence that often accompanied such demonstrations. Even after the 1930 ban, police in Berlin regularly encountered young men wearing khaki trousers or white shirts. At what point did their sartorial decisions cross over into a banned political act? Records in the Berlin State Archive testify that police ultimately decided that whenever three or more people gathered in similar clothing that “diverged from a civilian norm,” they should be cited as violating the Prussian state’s restrictions on the Nazi paramilitary organization.

 

German authorities’ opposition to the Nazis proved rather half-hearted. Within two years, uniformed stormtroopers were legally back on the streets, and German conservatives used an emergency decree to take over the Prussian State. They justified their assault on a democratic government as a defense of national order. But their tolerance of Nazi violence helped foster a public embrace of a politics that would culminate in genocidal war.

 

During a 1973 performance at Cologne’s annual Carnival, a West German comedian warmed up the crowd by leading it through a series of familiar call-and-response rituals. When he concluded by yelling “Sieg,” the crowd obligingly (unthinkingly?) responded, “Heil!” As the crowd laughed, not all that uneasily, it seems, he expressed ironic surprise that there were so many “old comrades” in the audience. Although he quickly moved on to his next bit, the power of his humor was its ability to call out the implicit reality of 1970s West Germany: the rituals of Nazi Germany remained disturbingly familiar to many of its citizens.

 

These examples are a reminder that how people choose to present themselves for public interpretation matters, and historians should draw attention to actions that tend to elude scrutiny in the relentless churn of the news cycle.

 

After the Mastriano rally, Wallnau dismissed critics who identified his actions as fascist. He claimed that the viral clip had merely captured him leading the attendees in a prayer to commemorate Joshua Chamberlain’s famous bayonet charge down Little Round Top on July 2, 1863. Audience members raised their arms as Wallnau recited, “Pennsylvania will be like Little Round Top and America will have a new birth of liberty.”

 

His words called to mind President Abraham Lincoln’s visit to Gettysburg to dedicate the cemetery there four months after the battle. Lincoln urged listeners to defeat the Confederacy and give the nation “a new birth of freedom.” The freedom Lincoln referenced was emancipation. But Wallnau’s choice to replace “freedom” with “liberty” plucked the battle out of the context of a nineteenth-century war over human freedom and placed it into a twenty-first-century contest over supposed government overreach and the right of (violent) revolution.

 

The power of symbols — and of symbolic gestures — is that they never possess just one meaning. Lance Wallnau can always claim, with a veneer of plausible deniability, that the action he was orchestrating at that rally in Chambersburg was not a visible evocation of the Nazi salute. But we can’t permit Wallnau or Mastriano simultaneously to claim historical expertise and historical ignorance.

 

Like that Cologne crowd in 1973, many Americans in 2022 may be willing to discount as unimportant that many in the United States are willing to don Confederate uniforms, to fly Confederate and Nazi flags or to offer up some version of a Nazi salute. But these symbolic gestures remain much more than familiar rituals. They evoke a political movement that repudiated the freedom that Chamberlain fought for, and that Lincoln celebrated in his Gettysburg Address.

 

Gathering in a parking lot 25 miles west of Gettysburg, Wallnau cosplayed the Battle of Gettysburg to channel a specific kind of male fantasy, one that aspires to fight not just in an imagined past, but on behalf of a racist and authoritarian future. In that kind of role-playing, a Nazi salute fits right in. Ultimately, if American political movements decide to mimic Nazis, we should take them at their word.

Understanding the Political Power of Nixon's "Silent Majority"

 

 

Whenever someone wants to represent "the Sixties" in a movie or historical documentary, they'll exhibit a stock gallery of sounds and pictures:  civil rights marches; Haight-Ashbury hippies; Woodstock; anti-war rallies; the Jefferson Airplane's "White Rabbit."  The scenes and the songs have been widely disseminated and are now taken as a universal précis of the entire period.  But as in any depiction of reality, something is lost between the actual and the image, and one of the most celebrated and certainly most influential critiques of this came not from sages of mass communications like Marshall McLuhan or Neil Postman, but from the President of the United States.

Richard Nixon's address from the Oval Office on November 3, 1969, is remembered today for his appeal to the "great, silent majority" of Americans for their backing of a continued US military commitment in Vietnam.  "I know it may not be fashionable to speak of patriotism or national destiny these days," he said, "but I feel it is important to do so on this occasion."  In the decades since the speech - which seems to have been mostly written by its speaker - critics have accused Nixon of making a veiled threat from the many against the few, or of firing an opening shot in the culture wars, in which citizens are increasingly divided by calls to "take back" their country from others who've supposedly stolen it.  "With what seemed like a mere throwaway line," assessed historian Scott Laderman in the Washington Post fifty years later, "Nixon gave birth to a moniker that quickly came to encapsulate the modern era's burgeoning reactionary movement."  Yet it's just possible to reinterpret Nixon's message not as a cynical political stratagem but as an unintentionally prophetic mix of demographic analysis and media theory.

Nixon's key insight in the Silent Majority speech of 1969, perhaps, lay not in his noun but in his adjective.  Amid the din of headlines, top stories, special reports, and all manner of publicity - already apprehended by, among others, Daniel Boorstin in his landmark 1962 book The Image:  A Guide to Pseudo-Events in America - he suggested that it was easy to forget the hard numbers that indexed overall popular sentiment.  This was an idea he'd previously raised when accepting the Republican nomination in August 1968, citing "the forgotten Americans, the non-shouters, the non-demonstrators" as the party's base.  That year Nixon had eked out a narrow win over his Democratic rival Hubert Humphrey (31,783,783 votes to 31,271,839), and a Gallup poll of November 1969 showed 64 percent of American adults and 50 percent of college students approved of the president's Vietnam policies.  Not overwhelming figures, of course, but they arguably delivered a clearer gist of the national outlook than anything obtained from the nightly news, the op-ed page, or Top 40 radio.  The President may have been overconfident in his belief that most Americans were on his side. He was on to something, however, in his doubt that most of them were on television.

As more and more people since Gutenberg have learned about the world not through immediate experience but from information conveyed first through books, newspapers, and magazines, and then cinema, broadcasting, and sound recordings, and now the multimedia maelstrom of the internet, thinkers have warned that the conveyed information might distort the truth and deceive the people with slanted or sensationalized versions of what's really out there.  For a long time the warnings came from the left, via scholars like Noam Chomsky, who charged that the platforms of what we now call legacy media were tools of a ruling elite. It's a great irony of our age that this thesis has migrated from circles of academics and the urban hip to truck blockaders and small-town senior citizens.  Indeed, the TV networks and press empires we weren't supposed to trust in 1985 - because they were brainwashing the gullible hordes and manufacturing consent - have apparently become bastions of responsible journalism in 2022, according to the journalists they employ.  Meanwhile, the purportedly liberating potential of social media during the Obama presidency and the Arab Spring has curdled into panics over disinformation and cancel culture.   

This is the epistemic crisis, in which there's no longer one authoritative source of knowledge everyone can agree on, and where one person's Fair and Balanced is another person's partisan propaganda.  At the same time, we can reflect that even bestselling books and blockbuster films draw a minority of the potential market:  some entertainment may be a lot more successful than others, but Star Wars and Harry Potter are still probably known to but a fraction of the planet's population.  "Half of all advertising budgets are wasted," runs a Madison Avenue saying, "but no one knows which half."  The point is that at some level, we all make distinctions between the stories we are told by our analog or digital technologies, and what empirical evidence tells us directly.  

Here's where Richard Nixon comes in.  Rightly or wrongly, he charged in November 1969 that dissent was being manufactured and broad consensus suppressed.  For all his calculation, the Silent Majority speech may have just been his frustrated vent at what he saw as a skewed portrait of public opinion.  As an elected politician, he was obliged to serve a constituency, but how could he (or anyone) be sure that the constituency's mood was being accurately relayed?  Rock festivals and peace protests got a lot of attention, but how to measure the preferences of the people who didn't attend them?  If news is not made when a dog bites a man, but when a man bites a dog, who'll warn that the first happens a lot more than the second?  The American model of democracy had been designed in a quieter time, when vox populi was audible mostly at the ballot box.  By 1969, though, it was heard in newsstand sales, box office grosses, Nielsen ratings, and Billboard charts, and it's gone on to reverberate in Google hits, YouTube views, and Facebook shares.  That's a lot of noise drowning out any underlying signal.  Inevitably, a Daniel Boorstin, a Neil Postman, a Noam Chomsky - or a Richard Nixon - will remind us of the discrepancy.

Maybe Nixon's forgotten Americans weren't a genuine plurality of the American electorate (and, despite invocations by modern conservatives, maybe they're less so today), yet there is in any case a segment of the population that's rendered mute every time the interests of another segment, possibly smaller or less illustrative of the whole, are amplified by the media.  Specialists who crunch the data of air time or column inches might quantify the degree to which a given position or subject is disproportionately covered relative to another, while ordinary readers or viewers can themselves intuit when there's a politicized thumb on the scale. All of it undermines trust in the fairness of government and its responsibility to the common good. A twenty-first-century Richard Nixon would go further, adding that lopsided and demonstrably false characterizations of the societal collective - whatever's behind them - only serve to lead on, let down, and leave out the great, silent majority.

The Schlesinger Diaries—A Gift to Historians that Keeps Giving

 

 

Fourteen years after the passing of Arthur Schlesinger, Jr., his diaries continue to provide historians with important new information. The latest beneficiary is John A. Farrell, whose biography of Ted Kennedy contains disturbing new details concerning the Chappaquiddick cover-up, which Farrell obtained by gaining access to unpublished sections of Schlesinger’s diaries.

 

My own experiences with Schlesinger and his diaries concerned a different American political leader, President Franklin D. Roosevelt. The information that emerged was deeply troubling, to say the least.

 

“We Have No Jewish Blood”

 

My first encounter with Schlesinger was related to a meeting that President Roosevelt held on August 4, 1939, with a political ally, Sen. Burton Wheeler (D-Montana). They discussed possible Democratic candidates for president and vice president in the event FDR did not seek re-election in 1940; Wheeler composed a memo for his private files recounting their conversation.

 

According to the memo, FDR dismissed the idea of vice president Jack Garner as the party’s presidential nominee on the grounds that he was too conservative: “[Roosevelt] said ‘I do not want to see a reactionary democrat nominated.’ The President said, ‘I love Jack Garner personally. He is a lovable man,’ but he said, ‘he could not get the n—- vote, and he could not get the labor vote’.” (Wheeler did not use the dashes.)

 

The president also expressed doubt about the viability of a ticket composed of Secretary of State Cordell Hull for president and Democratic National Committee chairman Jim Farley for vice president. Sen. Wheeler wrote:

 

I said to the President someone told me that Mrs. Hull was a Jewess, and I said that the Jewish-Catholic issue would be raised [if Hull was nominated for president, and Farley, a Catholic, was his running mate]. He [FDR] said, “Mrs. Hull is about one quarter Jewish.” He said, “You and I, Burt, are old English and Dutch stock. We know who our ancestors are. We know there is no Jewish blood in our veins, but a lot of these people do not know whether there is Jewish blood in their veins or not.”

 

The memo is located in Wheeler’s papers at Montana State University. The file also contains two letters sent to Wheeler from Arthur Schlesinger, Jr. in 1959. At the time, Schlesinger was working on The Politics of Upheaval, the final installment of his three-volume history of the New Deal. According to the letters, Sen. Wheeler sent Schlesinger a copy of his 1939 memorandum on the “Jewish blood” conversation with FDR. Schlesinger, after reviewing the memo, wrote to Wheeler that the document “offer[s] valuable sidelights on history.”

 

Nevertheless, Schlesinger never quoted FDR’s remarks about “Jewish blood” in any of the many books and articles he subsequently wrote about Roosevelt and his era. Ironically, in one of those articles (published in Newsweek in 1994), Schlesinger specifically defended FDR against any suspicion that he was unsympathetic to Jews; he approvingly quoted Trude Lash, a friend of the Roosevelts, as saying, “FDR did not have an anti-Semitic bone in his body.”

 

I wrote to Schlesinger, in 2005, to ask why he had withheld Roosevelt’s “Jewish blood” statement from public view. Schlesinger insisted he had done nothing wrong, since, in his view, Roosevelt’s remark was not antisemitic. “FDR’s allusion to ‘Jewish blood' does not seem to me incompatible with Trude Lash’s statement,” Schlesinger wrote me. “It appears to me a neutral comment about people of mixed ancestry.”

 

But if that were the case--if Roosevelt’s remark about Jewish blood was indeed “neutral” and not an expression of bigotry--then why did Schlesinger decide to suppress it? Why didn’t Schlesinger mention it in one of his published writings about FDR? After all, it certainly sheds interesting light on Roosevelt’s thought process in considering whether to run for a third term in 1940. Why didn’t Schlesinger at least acknowledge FDR’s statement when he himself raised the antisemitism issue in his Newsweek essay, and let readers judge for themselves?[1]

 

Bombing Auschwitz

 

My second Schlesinger experience involved his diaries. This episode concerned an exhibit at the United States Holocaust Memorial Museum, in Washington, regarding the refusal of the Roosevelt administration to bomb the railways and bridges leading to Auschwitz, or the gas chambers and crematoria in the camp itself.

 

When the museum opened, in 1993, the text accompanying that exhibit stated, accurately, that “American Jewish organizations repeatedly asked the U.S. War Department to bomb Auschwitz.” Historians have documented calls by thirty officials of Jewish organizations or publications for such bombings, as well as one instance in which a Jewish official supported bombing the railways but urged using ground troops instead of air strikes on the gas chambers for fear of hitting prisoners by accident.

 

Three years later, however, the museum quietly changed the text of that panel to: “A few Jewish leaders called for the bombing of the Auschwitz gas chambers; others opposed it….No one was certain of the results…” The museum never provided evidence from historians to support making that change. In fact, the change would never even have come to light had a reporter for a Jewish newspaper not caught wind of it.

 

An accurate text in the museum had been changed to an inaccurate one. Not only was the new wording inaccurate, it carried a significant broader implication: if only “a few” Jewish leaders favored bombing, while “others” (which sounds like a comparable number) opposed it, and if nobody could be “certain” of the results, then nobody today can reasonably criticize the Roosevelt administration for not bombing the death camp. In short, changing the panel got FDR off the hook.

 

It is important to emphasize that the positions taken by Jewish leaders concerning bombing Auschwitz had no actual impact on U.S. policy. The Roosevelt administration decided in February 1944—four months before the first Jewish request for bombing— that it would not use military resources “for rescuing victims of enemy oppression.” That U.S. policy decision never wavered. But that fact has not stopped some contemporary defenders of FDR from citing alleged Jewish opposition to bombing in order to exonerate the Roosevelt administration.

 

 That said, a glaring question presents itself: why did the museum change its exhibit? The answer would emerge, years later, from Arthur Schlesinger’s diaries.

 

“A Successful Campaign”

 

During the three years between the opening of the U.S. Holocaust museum and the revision of the bombing panel text—between 1993 and 1996—something curious happened. A retired nuclear engineer in Seattle, named Richard H. Levy, suddenly took an interest in the bombing issue. Although he had no professional training as a historian and no publications on the topic, Levy wrote a lengthy memorandum in which he argued that it was “beyond the power” of the Allies to interrupt the Holocaust by bombing Auschwitz or the railway lines leading to it; that many prominent Jewish leaders opposed such bombing; and that the museum should change its exhibit on the bombing issue to reflect these assertions. Those positions contradicted the findings of every serious historian who had researched the subject. Yet Levy and his arguments were championed in a series of articles and speeches by William vanden Heuvel, then president of the Roosevelt Institute. The institute’s mission is "to carry forward the legacy and values of Franklin and Eleanor Roosevelt.” 

 

Remarkably, Levy was invited to speak at the U.S. Holocaust Museum; his bombing memorandum was published in a book of proceedings from a conference that he did not attend, and the memo was then reprinted, in 1996, in the museum’s journal, Holocaust and Genocide Studies—despite the journal’s own policy of considering only submissions that have “not been published and will not be simultaneously submitted or published elsewhere.”

 

There was more: in a 1997 newspaper article, vanden Heuvel wrote that Levy “met with representatives of the U.S. Holocaust Memorial Museum” and persuaded them to change the bombing exhibit. Vanden Heuvel did not indicate that he had played any role in the process.[2]

 

His account raised more questions than it answered. How, without any outside influence, could a memo about Auschwitz by a retired nuclear engineer—a memo that was at odds with the findings of the experts in that field—convince a major museum to change an exhibit?

 

The answer came from Schlesinger’s diaries, large portions of which were published posthumously in 2007 by Penguin Press. Schlesinger was a close friend of vanden Heuvel and had collaborated with him in efforts to publicly defend FDR’s Holocaust record. In a diary entry dated August 21, 1996 (p.789), Schlesinger celebrated the conclusion of what he described as “Bill [vanden Heuvel]’s successful campaign to persuade the Holocaust Museum to revise a most tendentious account of the failure to bomb Auschwitz.” 

 

So there HAD been a “campaign” (behind the scenes) by the president of the Roosevelt Institute to change the bombing exhibit. Evidently, the change was not—as vanden Heuvel had claimed—the result of Levy’s persuasive powers; nor was it the result of the museum’s historians discovering errors in the exhibit. The original exhibit, in fact, was accurate; it was changed--according to Schlesinger--because of the Roosevelt Institute’s “campaign.” Precisely what type of pressure vanden Heuvel’s campaign employed was not specified in Schlesinger’s diary. But the outcome of the campaign certainly indicated that the pressure had worked.

 

Obviously, history museums update their exhibits from time to time. If a reputable historian points out an inaccuracy, a correction is made. Or if the museum staff itself uncovers new information, an exhibit will be revised. But it is another matter entirely to change an accurate text to an inaccurate one, in response to pressure from the president of an institute that has an agenda--in this case, the agenda of protecting the “legacy” and image of its namesake, Franklin D. Roosevelt.

 

“There Was Pressure”

 

Schlesinger’s posthumous revelation went unnoticed by the news media and the academic community for more than two years. Finally, in 2009, New York Times reporter Patricia Cohen raised it. In the course of preparing an article about the bombing issue, Cohen interviewed Dr. Michael Berenbaum, who had been research director of the U.S. Holocaust Museum at the time the exhibit was changed. Cohen asked Berenbaum whether vanden Heuvel indeed had pressured the Museum to make the change, as Schlesinger’s diary entry indicated. Berenbaum replied (as quoted on Cohen’s blog on October 5, 2009): “There was pressure from the Roosevelt Foundation and we paid no attention whatsoever to that pressure.”

 

Berenbaum’s acknowledgement that “there was pressure” was significant. It contradicted the narrative that vanden Heuvel had presented in his lectures and articles; according to vanden Heuvel, Levy had, on his own, managed to persuade the Museum to make the change.

 

As for Dr. Berenbaum’s statement that he and his colleagues “paid no attention” to the pressure--one can only note that they made the very change the Roosevelt Institute pressed them to make, despite the fact that the proposed change had no basis in the historical record.

 

An Error Remains Uncorrected

 

In the autumn of 2009, my colleagues and I provided the U.S. Holocaust Museum with new research identifying the 30 Jewish officials who advocated bombing Auschwitz or the railways. We also asked that the original caption in the bombing exhibit be restored.

More than two years later, in early 2012, the Holocaust Museum leadership responded with a ten-page memorandum in which they agreed that at least 26 Jewish officials had supported bombing.

That directly contradicted the statement in the museum’s exhibit that only “a few” Jewish leaders called for bombing. Twenty-six is not “a few.”

 

Nonetheless, today, thirteen years later, the inaccurate caption is still on display at the museum.[3] The erroneously-described position of the Jewish leaders continues to be used to, in effect, exonerate the Roosevelt administration on the issue of bombing Auschwitz. Presumably Arthur Schlesinger, Jr. would be pleased. But it does a disservice to the historical record.

 

[1] “Confidential - Memo on conference at the White House with the President---August 4, 1939,” Burton K. Wheeler Papers, Box 11: File 18, Montana State University, Bozeman, MT; Schlesinger to Wheeler, November 30, 1959 and December 22, 1959, Wheeler Papers, Box 11: File 18; Arthur Schlesinger Jr., “Did FDR Betray the Jews?,” Newsweek, April 18, 1994, p.14; Schlesinger to Medoff, September 4, 2005.

[2] “FDR Did Not Abandon European Jewry,” Washington Jewish Week, February 27, 1997.

[3] Wyman and Medoff to Luckert, September 21, 2009; Luckert to Wyman and Medoff, January 10, 2012.

Eying Return to Power, Conservatism Learns to Love the Administrative State

Rod Dreher at the National Conservatism conference, Rome, February 2020

 

 

Beneath and beyond the January 6 insurrection and the right-wing populist surge that’s expected this Tuesday, American conservative thinking is taking some confused and confusing turns. One of them involves backing away from familiar “supply-side” dogmas and moving instead toward seizing the power of the administrative state to restore order and public virtue to Silicon Valley technocratic elites and to unruly masses, all under the tutelage of a “truly” conservative ruling elite. 

 

These thinkers aren’t flirting with Bernie Sanders’ democratic socialism or Joe Biden’s new New Deal. They’re edging closer to the Roman Catholic “common good Constitutionalism” of Harvard Law Prof. Adrian Vermeule and several Supreme Court justices, or to the old Ivy-Protestant, “Good Shepherd” guardianship of the republic, or even to the nineteenth-century German Chancellor Otto von Bismarck’s authoritarian, ethno-nationalist welfare statism, which presaged the “national socialism” of a German political party that incorporated that phrase into its name and its public promises.

 

It's a complex development, but let me try to make it as comprehensible as it is reprehensible, because it may be hard upon us after this Tuesday's elections.

“We Need to Stop Calling Ourselves Conservatives,” writes John Daniel Davidson, a senior editor of The Federalist, a conservative publication (unaffiliated with the judiciary-focused, right-wing Federalist Society). Davidson praises and echoes an argument by Jon Askonas, a professor of politics at The Catholic University of America, who writes in Compact, another conservative site, that “the conservative project failed” because it “didn’t take into account the revolutionary principle of technology, and its intrinsic connection to the telos [or over-determined trajectory] of sheer profit.”

 

Both writers want a conservative-led counter-revolution against a corporate technocracy whose fixation on maximizing profit has trapped Americans like flies in a spiderweb of come-ons that grope, goose, track, and indebt us, bypassing our brains and hearts on the way to our lower viscera and wallets. But are conservatives who lament such developments truly urging a revolution within “free market” conservatism itself? Or are they making only a tactical shift in their strategy to support the scramble for sheer profit and accumulated wealth, glossed by religiously inflected public discipline?

 

American conservatives have long ridden every national gold rush, blaming liberals and progressives for trying to stop such stampedes by desperate mobs and greedy plutocrats. Most of Davidson’s articles have pounced, in sync with conservative media, on any opportunity to lambaste liberals for disrupting plutocracy’s accumulation of wealth (see “The Next GOP Congress Should Impeach Merrick Garland” and “It’s Not Crazy to Think Biden Sabotaged Nord Stream to Deepen U.S. Involvement in the Ukraine War”).

Yet now Davidson is warning that conservatives themselves have undermined their small-r republican virtues and freedoms by surrendering more than they’re conserving. He’s accusing them of accommodating themselves to “woke” liberals’ efforts to redress income inequality, sexual and racial grievances, and markets’ amoral reshaping of society. In so doing, he warns, conservatives, too, have disfigured civic and institutional order.

 

Once upon a time, Davidson explains,

 

conservatism was about maintaining traditions and preserving Western civilization as a living and vibrant thing. Well, too late. Western civilization is dying. The traditions and practices that conservatives champion… do not form the basis of our common culture or civic life, as they did for most of our nation’s history.

 

So, conservatives must seize power instead of sharing it. They must restore moral and social order, even if doing so requires using big government to break up a few monopolies and redistribute income a little to Americans whom conservatives have claimed to champion even while protecting the powers and processes that have left them behind.

 

Davidson and Askonas reproach fellow-conservatives for buying into “woke” corporate capital’s intrusive, subversive technologies, which treat citizens as impulse-buyers whose “consumer sovereignty” suffocates deliberative, political sovereignty. The irony in conservatives making this critique is that profit-crazed media such as Rupert Murdoch’s assemble and disassemble audiences on any pretext—sensationalistic, erotic, bigoted, nihilistic—that might keep them watching the ads and buying whatever they’re pitching.

 

Conservative jurisprudence that protects consumer marketing’s algorithmically driven pitching—by pretending that the business corporations engaging in it are persons deserving of the First Amendment-protected speech of self-governing citizens—only hands bigger megaphones to managers of swirling whorls of anonymous corporate shareholders, leaving truly deliberative citizens with laryngitis from straining to buck the telos of sheer profit.  

 

*   *   *

 

It's no small thing for conservatives to acknowledge that they can’t reconcile their claim to cherish traditional communal and family values with their knee-jerk obeisance to every whim and riptide of conglomeration or financialization. Ivy League graduates often try to finesse the contradiction gracefully and persuasively to most Americans. John F. Kennedy and the two George Bushes have arguably done so, but, perhaps, they knew better than to persuade themselves. “We are poor little lambs who have lost our way… damned from here to eternity,” Yale’s Whiffenpoof songsters croon, clinging to lost civic virtue in formal white ties and tails but acknowledging, humorously and ruefully, the soulless life awaiting them in Dad’s firm or at J.P. Morgan or in poring over spreadsheets as corporate lawyers and business consultants.

 

Although Davidson and Askonas are more candid than the Whiffenpoofs about the costs of facing both ways, they stop short of crediting Whittaker Chambers’ warning to William F. Buckley, Jr. Chambers, the ex-Communist, took a cue from Marx in advising that “you can’t build a clear conservatism out of capitalism, because capitalism disrupts culture,” as Sam Tanenhaus, a biographer of Chambers, put it in a lecture at the American Enterprise Institute in 2007. Liberal Democrats, too, have stopped short of challenging neoliberal capitalism’s relentless dissolution of civic-republican virtue, voting instead for “the pro-corporate and anti-worker policies that made Trump,” as Robert Kuttner reminds readers in an American Prospect column in which he filleted the centrist liberal writer Anand Giridharadas’s effort to rescue liberalism without indicting or significantly reconfiguring corporate capitalism.

 

Democrats celebrate their breaking of glass ceilings to install “the first” Black and/or female or gay CEO, but they do little to reconfigure those structures’ foundations and walls. While they’ve been breaking glass ceilings, they’ve also been breaking laws and regulations like Glass-Steagall, which restrained the investment banking, private equity, and hedge fund rampages that bamboozle and dispossess millions of Americans. They’ve even accepted the Supreme Court’s orchestration of George W. Bush’s ascent to the presidency and its decimation, via the Citizens United ruling, of campaign-finance laws that curbed corporate capital’s sway over the election of officials who are supposed to regulate corporate capital itself.

 

In Kuttner's view (and mine; see Liberal Racism), liberal Democrats who wave banners of ethno-racial and sexual identity to cover for their complicity in all this have given conservatives excuses to divert a resentful public's attention from the right’s even-more deceitful complicity in fomenting our republican crisis. Instead of offering alternatives to inequality and decay, conservatives have dined out so compulsively on liberals’ follies that they’ve forgotten how to cook for themselves and the rest of us and have abandoned the kitchen to Donald Trump.

 

*   *   *

 

Peddling demagoguery and finding themselves soulless, some conservatives have turned to religion for cover and succor, if not salvation. But religion should scourge them, as Moses scourged the fabricators of the Golden Calf at the foot of Mt. Sinai; as Jesus did the money-changers whom he drove from the Temple; and even as the conservative theologian Richard John Neuhaus did Senator Bob Dole, who’d condemned cultural decadence in Hollywood and had challenged Bill Clinton in the 1996 election but later made TV commercials for Pfizer, testifying that Viagra helped him cope with his erectile dysfunction. “The poor fellow looks like he’s restraining the impulse to unzip and show us the happy change,” Neuhaus barked.

 

When I noted Dole’s folly in "Behind the Deluge of Porn, a Conservative Sea Change," an essay for the journal Salmagundi, the conservative Christian editor Rod Dreher, then at The Dallas Morning News, republished my essay in that newspaper, explaining to the conservative Catholic magazine GodSpy that although I made “an impassioned case” that “’the pornification of the public square’ is destroying any kind of civic-republican ethos,” I would never see my dreams realized through liberalism because “only religious faith has the power to resist our very powerful commercial culture.”

 

Religious conservatives such as Dreher and New York Times columnist Ross Douthat have indeed sought in faith an escape hatch of sorts from conservatism’s imprisonment in the telos of sheer profit in our fallen world. Religion served the purpose, too, for William F. Buckley, Jr., who was a wealthy heir to part of the fortune his father had accumulated as an oil prospector and industry developer who meddled in Mexican politics during the military dictatorship of Victoriano Huerta. In 1951, Bill Jr.’s book God and Man at Yale summoned that college’s presumptively Christian gentlemen alumni to rout the godless socialism of its professors.

 

 *   *   *

 

Buckley’s conservative movement has been at it ever since his passing in 2008. The lavishly funded William F. Buckley Jr. Program at Yale characterizes itself as a champion of “viewpoint diversity” instead of color-coded diversity, and it claims to oppose “intellectual and moral conformity” on campus. Its website features Buckley’s observation that “liberals claim to want to give a hearing to other views, but then are shocked and offended to discover that there are other views.” Actually, the program isn’t above trying to shock left-of-center students into making censorious protests which conservative media then spotlight and lampoon, as I recounted in Salon.

 

Whittaker Chambers would have responded to conservatives’ zeal to own the libs with a shrug and “the sly half-smile of a melancholy man who knows better,” as Tanenhaus put it. The fuller truth is that “viewpoint diversity” doesn’t make much headway against the Buckley Program’s own carefully managed internal conformity, as I discovered in September 2021, when its president, having read a column of mine about Yale’s star-crossed venture to establish a liberal-arts college with the tightly run city-state of Singapore, invited me to speak with the Buckley student fellows, writing that “we are looking to host in-person events with Yale affiliates. Please let me know if you are and I would be happy to follow up.”

 

“I'd be delighted to talk with and listen to Buckley Program participants,” I responded.

 

My criticisms of Yale College (which I've defended at times against certain outside conservative critics) are themselves somewhat "conservative," in that I try to protect old civic-republican virtues that I think Yale should continue to nurture. I agree with conservatives that Yale doesn't do enough of that. But… I believe that… finance capital… undermines what's best and necessary in a traditional liberal education….

 

I could also discuss broader dilemmas that Yale faces in its role as a crucible or training center for civic-republican leadership. Again, I've been severely critical of some conservative critics of Yale (try this, for example! --how's that for "viewpoint diversity"?!). But the older and wiser I become, the more convinced I am that each side of the political spectrum needs the best of the other side in certain ways, and, in this time of increasing polarization, that can't be stated often or clearly enough.

 

I'd be glad to explain what I mean by this, and I'd be more than willing to listen for a long time to the Buckley student fellows' own thoughts about this….

 

I never heard back from the Buckley president or anyone else in the program. Ironically, my disinvitation may have been prompted by my depiction of some conservatives’ stagey condemnations of liberals’ “disinvitations” of conservative speakers. I described Buckley board chairman Roger Kimball’s introduction of the columnist George Will to Buckley student fellows at a “Disinvitation Dinner” staged by the program to dramatize Scripps College’s cancellation of its speaking invitation to Will after Will had made disparaging remarks about a “rape culture” of supposedly inflated accusations and cries of victimization. “Our colleges and universities, though lavishly funded and granted every perquisite which a dynamic capitalist economy can offer, have become factories for the manufacture of intellectual and moral conformity,” Kimball thundered, oblivious of the conformity he was enforcing on the 20-year-olds seated before him in formal wear at Will’s “Disinvitation Dinner” in an elegant hotel.

 

More telling than this reeking strain of hypocrisy has been the conduct of the Yale Law School’s chapter of The Federalist Society, some of whose alumni guided Trump in deciding his appointments to the Supreme Court and other federal judicial benches. Here I commend a brilliant exposé of the Federalist Society’s “free speech” hypocrisy by Jack McCordick, a Yale undergraduate at the time, now a researcher-reporter for The New Republic.

 

Firebrands in the Buckley undergraduate program and the Federalist Society’s law school chapter succeed at times in baiting left-leaning students (and, sometimes, university administrators) into committing or suborning excesses that the national conservative media eagerly denounce. But when the Law School’s Federalist Society chapter did manage to sponsor a straightforward debate—“Income Inequality: Is it Fair or Unfair?”—between the progressive Yale Law Professor Daniel Markovits and libertarian writer Yaron Brook of the Ayn Rand Institute, Markovits wiped the floor with Brook. See for yourself how an outbreak of “viewpoint diversity” at the behest of the Federalist Society flummoxed its organizers.

 

A similar embarrassment became public when editors and board members of the conservative Manhattan Institute’s City Journal denied its writer Sol Stern the freedom to criticize Donald Trump at all. Stern, who’d been writing for that magazine and institute for years, outed them in an article -- "Think Tank in the Tank" -- for the left-liberal DEMOCRACY Journal that’s as telling as McCordick’s exposé of the Federalist Society.

 

*  *  *

 

It's almost enough to make one sympathize with some conservatives’ religious escapes -- Rod Dreher’s embrace of what he calls the “Benedict Option” comes to mind. But it’s not enough to make me sympathize with the secular cris de coeur of Davidson. “Put bluntly,” he writes, “if conservatives want to save the country, they are going to have to rebuild and in a sense re-found it, and that means getting used to the idea of wielding power, not despising it.” Why? Because accommodation or compromise with the left is impossible.

 

“One need only consider the speed with which the discourse shifted on gay marriage, from assuring conservatives ahead of the 2015 Obergefell decision that gay Americans were only asking for toleration, to the never-ending persecution of Jack Phillips,” the baker who has indeed been hard-pressed to defend himself legally several times for refusing to decorate a cake with words congratulating a gay couple on a wedding. “The left will only stop when conservatives stop them,”

 

Davidson continues, warning that

 

conservatives will have to discard outdated and irrelevant notions about ‘small government’…. To those who worry that power corrupts, and that once the right seizes power it too will be corrupted, they certainly have a point,

 

he concludes.

 

If conservatives manage to save the country and rebuild our institutions, will they ever relinquish power and go the way of Cincinnatus? It is a fair question, and we should attend to it with care after we have won the war.

 

But when have conservatives ever shied from wielding power, except when they’ve been embarrassed or forced into relinquishing it by the brave civil disobedience of a Rosa Parks and the civil-rights movement, or by the disciplined, decisive strikes and protests and electoral organizing of labor unions and social movements?

 

If conservatives really want to “attend with care” to the examples set by Cincinnatus and George Washington, who relinquished power so that the public interest would continue to be served more lastingly and effectively by others, they’ll have to enable American working people to resist the “telos of sheer profit” that’s stressing and dispossessing them and that’s displacing their anger and humiliation onto scapegoats under the ministrations of Fox News and right-wing demagogues.

 

How about taking seriously Davidson’s proposal that government offer “generous subsidies to families of young children” -- a heresy to Grover Norquist, the anti-tax zealot who said he wants to shrink government to a size where he could drag it into a bathtub and drown it? How about banishing demagoguery from their midst, as they often claim that Buckley banished John Birchite anti-Semitism? How about disassociating themselves, as I think Buckley would have done, from The Claremont Institute, the hard-right think tank that’s been so deeply “in the tank” for President Trump that he gave it a National Humanities Medal and followed the advice of its senior fellow John Eastman in attempting to overturn the 2020 election?

 

Davidson proposes that “to stop Big Tech… will require using antitrust powers to break up the largest Silicon Valley firms.” But he also proposes that “to stop universities from spreading poisonous ideologies will require state legislatures to starve them of public funds.” He writes that conservatives “need not shy away from [big-government policies] because they betray some cherished libertarian fantasy about free markets and small government. It is time to clear our minds of cant.”

 

American conservatives hoping to wield big-government power are moving toward something like European conservatism, which has long mixed capitalism and welfare-state spending to advance nationalist, imperialist, and even racialist purposes. That dark, dangerous tradition began with Bismarck and metastasized into Nazi “national socialism” half a century later.

 

Americans should look carefully into the Pandora’s Box that they’re opening, and those who crave a more-“godly” relation to power would do well to ponder the observation by John Winthrop, the founder and first governor of the Puritan Massachusetts Bay Colony in 1630, in his essay-sermon “A Modell of Christian Charity,” that “it is a true rule, that particular estates cannot subsist in the ruin of the public.” In other words, not even wealthy estates can survive for long in a society that’s being disintegrated by capitalism. It’s getting very hard to imagine America’s conservative “fundamentalists,” be they religious or secular, finding it in themselves to escape the English poet Oliver Goldsmith’s foreboding of doom in 1770: “Ill fares the land, to hastening ills a prey, where wealth accumulates, and men decay.”

From Torch to Tunis to El Alamein: Events 80 Years Ago Made the Modern Middle East  

 

 

The Middle East is again in the news. With nationwide protests in Iran, the schism in U.S.-Saudi relations and Israel’s fateful political election, it appears the region is again convulsed in tumultuous change. That makes it an appropriate moment to pause and reflect on how the Middle East we know today owes itself to events exactly eighty years ago, when the region passed through one of the most consequential, if largely forgotten, weeks of the last century.

 

The region has known many seismic moments over the past hundred years. From Allenby’s entry to Jerusalem (ok, nearly 105 years ago) to the world-shaking Suez Crisis to Khomeini’s overthrow of the Shah, the Middle East has been a setting for events that defined global politics. Each one of these moments – along with many others – not only changed the course of history but helped define the contours of the contemporary Middle East.

 

But there have never quite been so many historic events crammed into a single week as what happened eighty years ago. Students of Middle East history along with practitioners of Middle East policymaking should take a moment to consider the cumulative impact of three oft-forgotten events that all occurred within just days of each other, moments that forever altered the region’s landscape.

 

*   *   *

 

The fateful week begins at dawn on Sunday, November 8, with the launching of Operation Torch, the Anglo-American landing in Algeria and Morocco that was, at the time, the greatest amphibious invasion in human history. The first major U.S. combat operation of World War II, Torch marked the opening of a second front against Nazi Germany and a critical turning point in the war.

 

But did Torch leave a lasting imprint on the Middle East? Most observers say no. After all, American forces came and went, transiting through North Africa on their way to Italy, never to look back. But that’s precisely the point.

 

Torch underscored a strategic lesson that few contemplated when the United States declared war on Hitler’s Germany the day after Pearl Harbor – that victory in Europe would run through the sands of the Middle East. The region mattered little to America in the 120 years since the Barbary pirates gave us the Marines’ celebrated landing on the “shores of Tripoli,” but Torch marked the dawn of a new era in which the Middle East was a critical theater of strategic competition.

 

Here, Torch’s lesson was that America would henceforth ignore the region at its peril. Indeed, the Biden Administration may be the third in a row looking for ways to diminish America’s role in the region but like his two predecessors, the incumbent president is finding that even with hot war in Europe and brewing conflict in Asia, America cannot escape the hold the region has on American interests and attention.

 

At the same time, America’s wartime leaders eighty years ago carved another principle into subsequent U.S. strategy by wanting little to do with the local politics of this obscure, distant and little-understood region. Washington showed itself especially allergic to involvement in the messiness of Middle East governance.

 

In retrospect, one could have imagined a different approach. After all, fifteen months earlier, U.S. and British leaders had issued the Atlantic Charter, declaring their common war aims to be to “respect the right of all peoples to choose the form of government under which they will live [and] to see sovereign rights and self-government restored to those who have been forcibly deprived of them.” Within hours of landing in Algiers, the capital of French North Africa, U.S. commanders had the option of implementing those principles in the very first territory they captured from Axis forces.

 

But instead of bringing liberty to the liberated lands, they made a deal that empowered local Vichy administrators to maintain the reins of power in exchange for safe passage for U.S. troops. Rather than endorsing self-government and bestowing rights on “those who have been forcibly deprived of them,” U.S. commanders were eager to leave the business of local management to the Vichy colonialists they just defeated.

 

Those early U.S. decisions in Algiers in the hours after Torch laid the foundation for what became a core element of American Middle East policy. This is the instrumentalist idea that the region was a way-station to other U.S. policy aims, such as the defense of oil or the protection of critical sea lanes, but not intrinsically worthy of much political investment. Ultimately, this left the Middle East as the region of the world where America relies the most on friendly despots to help secure its interests.

 

For decades, through the Cold War and the “end of history” era that followed, this system worked well enough, though massaging the built-in pressures of these relationships was never easy. Now, as both ends of our domestic politics – the progressive left and the America First right – call for a shift in how we conduct this policy, that past uneasiness will pale beside the stress and anxiety likely to emerge from clashes with these traditional authoritarian partners. In this regard, current U.S.-Saudi tensions may just be a harbinger of things to come.

 

*   *   *

 

The second critical event of this pivotal week eighty years ago came on Monday, November 9, when Germany responded to Torch by invading Tunisia. Few today recall the six-month period after Torch when the most hotly contested territory in the European theater of war was not in Europe at all but in Tunisia, where Hitler deployed hundreds of thousands of troops ordered to hold the North African beachhead at all costs.

 

Fewer still recall Tunisia as the only Arab country to suffer a full-scale German occupation during the war. This included the dispatch of SS units, under the ruthless Walter Rauff, who applied his experience with mobile death gas vans used to kill Jews in Europe to implement round-ups, hostage-taking, concentration camps, deportations and executions of Jews in Tunisia. Despite the fact that the Germans never controlled more than a third of Tunisian territory, faced repeated military attacks by allied troops and were targeted almost nightly by allied bombers, the Nazis persisted in building the foundation for what would have been, if they kept their hold on Tunisia, the full persecution and execution of its Jews – the Final Solution was core to their mission here just as it was in Europe.


Driving the Germans from Tunisia was a tough slog. It took nearly as long for U.S., British, and Free French troops to take Tunisia as it did, a year later, to move from the beaches of Normandy to the Battle of the Bulge. But after some missteps, the Allies eventually routed the Germans in April and May 1943, taking about a quarter-million prisoners and setting up the invasion of Sicily and then, the following September, of mainland Italy.


Few physical reminders remain of the German occupation. But the half-year of German control left a deep scar on the psyche of many Tunisian Jews, who only in recent years have emerged to tell their stories and demand their rightful chapter in the history of the Holocaust. As for broader Tunisian society, the German occupation ended with the return of the French protectorate and proved to be a brief detour on the road to eventual independence in 1956.


Yet here, too, there is more to the story, for the occupation marked the arrival of fascism in Arab lands. Indeed, the most lasting impact of the Nazi presence in Tunisia was to give Arabs an up-close look at a model of all-powerful government infused with supremacist ideology. Along with the 1941 arrival in Berlin of the Jerusalem mufti Hajj Amin al-Husseini and the Iraqi putschist Rashid Ali, both forced to flee from Baghdad, the Tunisia experience would play a role in building two movements that competed for power in the Middle East for decades to follow – the radical Arab nationalism of Gamal Abdul Nasser and Saddam Hussein and the Islamist extremism of Osama bin Ladin and Abu Bakr al-Baghdadi. Whether both of these movements have been flushed from the Arab political system – or are just passing through a period of reassessment, retrenchment, and rebirth – is one of the region’s most profound uncertainties.


*   *   *


That Nazism itself did not spread throughout the Levant was thanks to the third event of this historic week: Britain’s success in the Second Battle of El Alamein, which concluded on Wednesday, November 11. This was the legendary battlefield victory of the British general Bernard Montgomery over the Desert Fox, German field marshal Erwin Rommel – what Churchill famously called, after suffering through three disastrous years of German victories, “the end of the beginning.”


As recent scholarship shows, the Germans had designs on Egypt and the Levant that went beyond the purely strategic objectives of controlling the Suez Canal, the eastern Mediterranean, and the oil fields of Arabia. In fact, there is convincing evidence that the Nazis planned to follow Rommel’s expected sweep into Cairo, and then on to Jerusalem, with the extermination of the Jewish communities of Egypt, Palestine, and beyond. Had the panzers not been defeated in the Western Desert, more than 600,000 Jews would likely have been added to the Holocaust death toll.


This would have extinguished any hope of the Zionist dream of a “Jewish national home” in the historic homeland of the Jewish people. The near annihilation of the Jews of Europe fed the desire for Jewish sovereignty; the annihilation of the Jews of the Levant would have killed it. Israel would never have been.


*   *   *


Torch, Tunisia, El Alamein – to understand how the Middle East we know today came to be and to appreciate the origin and evolution of American policy in a region that has loomed so large in our national consciousness in recent decades, let’s pause to remember the events of the second week of November 1942.

Lindsey Fitzharris on the Pioneering Facial Reconstruction Surgeon Who Remade the Faces of Great War Veterans


Today in northeastern France, cemeteries and monuments scattered across the rolling green landscape honor the Allied dead of World War I, but the suffering of the wounded, particularly those with cruelly disfigured faces, has often been ignored or discreetly hidden.

World War I produced an astonishing 40 million casualties, including some 10 million military dead and 20 million wounded. The brutality of the war, the first large-scale conflict fought with machine guns, accurate rapid-firing artillery, and poison gas, caught political leaders, generals, and medical authorities by surprise. Artillery caused two-thirds of all military injuries and deaths. Soldiers’ bodies, when not entirely obliterated by high-explosive shells, were dismembered, losing arms, legs, noses, ears, and even entire faces. While losing a leg or an arm made a man a hero, a devastating head wound that left him without a nose or jaw made him a pariah, a grotesque object of pity, unfit for society.

By the armistice in November 1918, 280,000 men from France, Germany and Britain alone had sustained some form of facial trauma. At the beginning of the war, no one knew how to treat these horrifying injuries. As one battlefield nurse wrote home, “the science of healing stood baffled before the science of destroying.”

In The Facemaker, Lindsey Fitzharris, a medical historian, reveals the suffering endured by these soldiers and details the accomplishments of Dr. Harold Gillies, a London surgeon who pioneered dozens of plastic surgery procedures. Gillies’ innovative techniques gave thousands of severely disfigured soldiers a new life, with reconstructed faces that allowed them to return home to their families.

Dr. Harold Gillies

Gillies, born in New Zealand, had graduated from medical school in Cambridge and become a successful ear, nose and throat (ENT) surgeon in London when the war came.

Summoned by the Royal Army Medical Corps, he was charged with treating men with gaping holes in their faces, some of whom were missing jaws and had to be fed through a straw. He soon put together a team of physicians and nurses at Queen’s Hospital outside London, where up to a thousand patients could be undergoing treatment at once.

The word plastic in plastic surgery means "reshaping" and comes from the Greek plastikē. A very primitive form of reconstructive surgery, nose reshaping, was practiced as early as 800 B.C. in India, but relatively few advancements in this specialized field were made for the next 2000 years.

However, by 1914 great progress had been made in other areas of medicine. Antiseptic techniques, such as sterilizing wound sites and frequent washing of hands, were standard in British medical facilities (though antibiotics were still in the future). As the war developed, Allied armies assembled fleets of motorized ambulances and portable triage centers where the wounded could be examined with X-ray machines. Blood transfusion, although not yet perfected, was sometimes administered to those with severe blood loss.

Gillies pioneered the use of multiple skin grafts on the face, with the tissue excised from other parts of the body. He invented the tubed pedicle skin graft, also known as the walking-stalk skin flap, in which the skin for the target site is folded into a flap and attached at both ends to preserve the blood supply. Gillies’ team also pioneered the epithelial outlay technique, which made it possible to construct new eyelids for men whose own had been burned off. The physicians at Queen’s Hospital also introduced new techniques in anesthesia and blood transfusion.

Gillies, concerned about the psychological health of his patients, encouraged visits from family members and provided training programs for those soldiers who left the hospital with disabilities.

A long recitation of medical procedures can grow tedious, but Fitzharris skillfully weaves in stories of the young men who were wounded on the battlefield and were fortunate enough to make it to the London hospital.

Private Percy Clare 

Private Percy Clare was advancing at the Battle of Cambrai in 1917 when a bullet passed through his cheek, ripping a large hole in the left side of his face. Blood gushed down his tunic and he collapsed on the ground as the battle raged around him. A medical orderly came by and, eyeing the large head wound and pools of blood, muttered “that sort always dies soon” and moved on. After hours of passing in and out of consciousness, Private Clare was recognized by a friend, who promised to get help. More hours passed as the friend tried to locate a team of stretcher-bearers who would venture out to help Clare. Finally, he was rescued and eventually returned to England, where he underwent a series of operations at Queen’s Hospital, which Dr. Gillies ran. After several hospital stays and a painful recuperation, he was discharged from the army in 1918 and returned to his wife and son. He died in 1950, at age sixty-nine.

By the end of the war, Gillies and his colleagues had performed more than 11,000 operations on some 5,000 patients. In 1920 he published a book, Plastic Surgery of the Face, which became an essential textbook for surgeons in many nations.

In World War II, Gillies was again called upon by the British government and organized plastic surgery units at a dozen hospitals. He continued to pioneer plastic surgery techniques. In 1946, he performed one of the first female-to-male sex reassignment surgeries, on Michael Dillon (born Laura Dillon). In 1951, he headed a team that completed a male-to-female reassignment surgery.

For American readers, one disappointment will be the absence of any discussion of the American Expeditionary Force and its medical corps. Fitzharris, who holds a doctorate in the history of science and medicine from the University of Oxford, naturally focuses her research on archives in the U.K.

World War I was a world-shattering disaster that ended in an uncertain peace. As Robert Kirby, a historian at Keele University, observed, “Nobody won the last war but the medical services. The increase in knowledge was the sole determinable gain for mankind in a devastating catastrophe.”

The Roundup Top Ten for November 4, 2022

The Line Between Rhetoric and Political Violence is Fading Fast

by Matthew Dallek

By rhetorically signaling contempt for the government, public institutions, and their opponents, the leaders of the Republican Party are failing to maintain the separation of politics and violence.


Broken Faith: What Must Christians Do About the White Nationalists Among Them?

by Anthea Butler

"Three major forces have combined to lead us perilously close to disaster: conspiracy theories, racial and historical panics, and the increasing language of spiritual warfare."


The Robber Barons Had Nothing on Musk

by David Nasaw

Like the Gilded Age robber barons, Elon Musk cultivates a self-made mythos that hides the government subsidies supporting his businesses. Unlike them, he has the wherewithal to move financial markets to his advantage through Twitter.


"A League of Their Own" Update Engages Lives of Queer Women in the 1940s

by Lauren Gutterman

"The series’ portrait of queer life amid World War II might seem unrealistic to some, but history reveals that queer women and trans men — from butch to femme and married to unmarried — often found opportunities to act on their desires and build queer communities."


Musk Just Latest in Right's Push To Acquire Media Platforms

by A.J. Bauer

The history of conservative media acquisitions reflects the anxiety on the right that its ideas are broadly unpopular.


A Warning from Weimar: The Danger of Courts Hostile to Democracy

by Samuel Huneke

Far from being guardrails for democracy, Weimar courts were implacably hostile to it, and they paved the way for its overthrow through leniency toward right-wing political violence.


The Past and Present of Christian Nationalism in America

by Eric McDaniel

"Christian nationalism is a religious and political belief system that argues the United States was founded by God to be a Christian nation and to complete God’s vision of the world. In this view, America can be governed only by Christians, and the country’s mission is directed by a divine hand."


The Tyranny of the Maps: Rethinking Redlining

by Robert Gioielli

The four-color mortgage security maps created by New Deal-era bureaucrats and bankers have become a widely known symbol of housing discrimination and the racial wealth gap. But does public familiarity with the maps obscure the history of housing discrimination? And what can historians do about that?


Can Americans Understand the Divisions in Latino Politics?

by Geraldo Cadava

Despite the lip service both parties pay to welcoming (and deserving) the growing Latino vote, do their non-Latino leaders actually understand the complexities of this large demographic category? Do they want to? 


Small Nations, Big Feelings: America's Favored European Nations Before Ukraine

by Madelyn Lugli

"Feeling patriotism for a foreign country is, when you think about it, odd."

