UCLA Historian Carla Pestana Debunks Myths About the Pilgrims and the Plymouth Colony

 

 

 

In the traditional view, one told in many 20th-century textbooks, the Pilgrims came to Massachusetts on the Mayflower to make a break with old England and start a unique society. 

 

In her new book, The World of Plymouth Plantation, Dr. Carla Gardina Pestana corrects the “false impression that Plymouth was disconnected from the world that gave rise to it.”

 

In fact, the Plymouth settlers were “deeply connected” to many other places through England’s transatlantic trading network. They envisioned creating an English “plantation,” and carried with them many traditional English customs. The term “Pilgrims” was bestowed upon them in the 1700s in order to distinguish them from other settlers, particularly the Puritans of Boston.

 

Dr. Pestana, the Joyce Appleby Endowed Chair of America in the World in UCLA’s History Department, is the author of four previous books including The English Atlantic in an Age of Revolution, 1640-1661.

 

Pestana discussed her work with the History News Network.   

 

Q. While we may think of Thanksgiving as a uniquely American holiday, you describe the gathering that took place in the autumn of 1621 as one that “invoked the seasonal festivals of English rural culture.” The event was not labeled “thanksgiving” until the 1800s. What shaped our modern conception? 

 

Edward Winslow supplied a brief description of this gathering, the only mention we have of it. His account makes clear that the event marked the first harvest with a meal enjoyed in a community gathering. The martial display that kicked off the event may have attracted the 90 warriors whom Winslow mentions as showing up, but in the end, they supplied venison and took part in the feast. To mark a harvest with a community festival was a standard celebration in any English village, and, from what Winslow describes, that sort of event was their model. It occurred probably in September, a more appropriate time for a harvest gathering in New England, and it represented a chance to rest and celebrate after the hard work of bringing in the crops. The women who prepared the food did not experience a rest from their labors, however, a fact that remains true of most American Thanksgiving celebrations to this day. 

 

The label “Thanksgiving” got added later, and it invoked a different colonial practice, getting it slightly wrong too. In colonial New England, in later years, days of Thanksgiving occurred irregularly, prompted by specific occurrences. Local leaders called for such a day when they wanted to thank God for some good turn of events. A successful harvest could prompt such a day, but so did many other occasions: good news about events in England, success in war, a possible calamity (such as a fire) averted. 

 

Thanksgiving days had a counterpart in days of humiliation. When events went against the community, leaders called for such an occasion, in which fasting and prayer brought everyone together to beg God for forgiveness. They saw communal suffering as judgment for past sins and successes as a reward or a mercy. In this providential world view—in which God was thought to take a regular interest in the community’s affairs—New Englanders (and Christians elsewhere) responded to what they perceived to be the message behind each major change in their fortunes. 

 

Thanksgiving as a national holiday denoted, at the time of its 19th-century beginnings, hard work, sacrifice for the common good, and what we would call family values. Abraham Lincoln wanted to emphasize these values at the height of the Civil War, and they remain central to the holiday’s image. If today we have gotten far from a harvest festival, and farther still from a one-off moment to thank God for a specific event, family remains at its heart. 

 

Q. You write that the Plymouth settlers considered themselves “planters” intent upon “transplanting the society they knew as well as the household work regimens that made it possible.” For many Americans, “plantation” invokes images of southern aristocrats lording over enslaved workers.

 

The idea of plantation today means a large-scale agricultural undertaking that used enslaved laborers and (usually) produced a single crop like sugar, tobacco, or cotton. Planters were individual landowners who privately owned both the plantation itself and the slaves who worked it. Plantation is thus associated with the worst abuses of racial slavery, and activists today are understandably interested in eliminating that language. 

 

Yet those who named Plymouth a plantation (or for that matter the recently renamed Rhode Island) were not thinking of slavery. Rather, they understood plantation to mean transplanting English people, their culture and ways of organizing society, into another location. It was, in other words, a synonym for colony or settlement, using a then-common meaning for the term.  Ireland had plantations dating from the 16th century, created when lands seized from Irish landowners were turned over to English (or later Scottish) migrants who supported the English conquest of the island. In both the Irish case and the North American one, plantations introduced what we would today call settler colonialism, with the goal of displacing native peoples and replacing them with transplants.  

 

That idea of plantation contained a certain arrogance, as those who did the planting clearly believed they had some right to occupy the lands of others. While the enterprise was not blameless, it was not guilty of what modern critics of the term assume. 

 

Q. The Mayflower Compact is often cited as the “birth of American democracy,” but you believe the men who wrote it in 1620 had another purpose in mind.

 

The men who signed the compact (which they did not call the Mayflower Compact) did so because they found themselves unexpectedly in a place with no government. They had hoped to land in what was then northern Virginia (around modern-day Delaware). Had they done so, they would have come under the jurisdiction of the Virginia Company. Having landed instead in New England, they decided they needed to curb an impulse toward anarchy that some leaders feared would overtake them. The document they drafted placed the group firmly under the authority of the English monarch, King James I. They agreed to create their own laws and ordinances, and pledged everyone’s “Submission and Obedience.” 

 

The signers considered it a stop-gap measure until they could arrange a better basis for government, in particular a charter from the king. They never attained that charter, and they had to fall back on a ragged collection of inadequate documents, including a land grant from a defunct company and their shipboard agreement. Far from celebrating democracy, the agreement aimed to suppress individual impulses, granting more duties than rights. The document opens with allegiance to the king and ends with promising that they would submit and obey because those were its main concerns. 

 

Q. You are skeptical that the “Plymouth Rock” memorialized in contemporary Plymouth, Massachusetts was ever used as a docking point for the Mayflower passengers. 

 

Plymouth Rock was identified as the possible landing site more than a century after 1620, when a group of interested townsmen took an elderly man out to the beach to question him about what he knew.  If anyone had earlier worried about the precise landing site or discussed it, we have no evidence that was the case. There is little reason to believe that the elderly man (who could not have been an eyewitness) knew anything about the matter. Once he spoke, however, the townsmen dubbed the rock the landing site. 

 

In a surreal twist, they then dragged this random boulder around, breaking it in the process, and installed it in various locations in town. It came to rest much later in the classical temple by the highway, a spot far from where it originally sat. So the boulder that is supposed to mark the landing site is not at its original location, nor is it likely to have had anything directly to do with the landing in any case. 

 

A boulder as a landing site—often depicted as a sort of natural dock in various later images—makes no sense. Wooden boats steer away from boulders, not toward them. In addition, we have descriptions of the passengers wading through the surf, dampening their clothing, when they disembarked. Clearly the sailors knew better than to deposit them on boulders, and the idea that they had done so may have been fueled by drink. I envision those who dragged the boulder around and broke it as being the worse for having imbibed, in a sort of early version of a fraternity prank.  

The Devil and Mary Lease

 

 

 

The Black Lives Matter movement precipitated a national discussion over who and what should be honored with statues, monuments and place names. Recently the New York City Parks Department announced ten “park spaces” would be named in “honor of the Black experience in New York City.” The newly named park spaces recognized national figures like Langston Hughes, James Baldwin, Elston Howard, and Ella Fitzgerald, who all had New York ties, as well as local community leaders. 

 

There is a statue of Mary E. Lease in Wichita, Kansas, erected in 2001 by a Kansas women’s club she founded in 1886. As someone active in the push to take down statues and rename places, I have been thinking a lot about that statue and Lease’s role in history.

 

As a high school United States history teacher, I introduced students to Lease as a leader of the late 19th century Populist movement that introduced many democratic reforms into American politics, as a dramatic speaker who had the ability to mobilize a mass movement, and as an example of a powerful woman whose contributions to American history are often ignored.

 

Mary E. Lease was a late 19th century Populist leader who campaigned for the rights of farmers, workers and women. She was a leader of both the women’s suffrage movement and the movement for the prohibition of alcohol, and was active in the labor movement as a member of the Knights of Labor. As a Populist, Lease led a crusade against the power of the corporate monopolies and banks that dominated the American and global economy. She is credited with advising Kansas farmers to “raise less corn and more hell.” 

 

Lease is featured as a representative of Populism in The Americans (Houghton-Mifflin, 2005), one of the standard texts used in United States high schools. In the chapter on “Farmers and the Populist Movement” there is a photograph of Lease, a mini-biography, and a quote from one of her speeches (425). She is described as a “spellbinding” speaker (427). 

 

Unfortunately, Lease was also an anti-Semite who used anti-Jewish tropes and direct anti-Semitic references to stir up her audiences. In her standard stump speech, Lease warned:

 

this is a nation of inconsistencies. The Puritans fleeing from oppression became oppressors. We fought England for our liberty and put chains on four million of blacks ... Wall Street owns the country. It is no longer a government of the people, by the people, and for the people, but a government of Wall Street, by Wall Street, and for Wall Street. 

 

She also accused the Vice-President of the United States of being a “London banker,” a veiled reference to supposed ties with the Rothschild banking firm. This speech is quoted in Howard Zinn’s A People’s History of the United States, but the reference to “London banker” is left out.

 

In another speech, this one to the Women’s Christian Temperance Union, Lease claimed “the government, at the bid of Wall Street, repudiated its contracts with the people . . . in the interest of Shylock,” the stereotypical Jewish villain in Shakespeare’s The Merchant of Venice.

 

On August 11, 1896, the New York World reported on Lease’s “seductive oratory” at a Cooper Union “free silver mass-meeting” in Manhattan. Lease “attacked the entire social system” and “every reference to wealth and its owners [was] received with wild delight.” 

 

During her speech, “Every mention of gold or wealth was greeted with shouts and jeers, and the names of Whitney and Cleveland, of Vanderbilt and Rothschild were hailed with hisses and cat-calls.” Whitney was William C. Whitney, financier, coal baron and political insider. Cleveland was President Grover Cleveland. Vanderbilt was Cornelius Vanderbilt II, who inherited his family’s control of the New York Central Railroad. Rothschild was the European Rothschild banking house, reportedly the wealthiest in world history. In the World article, Lease did not identify the Rothschilds as Jews. However, the New York Times also reported on Lease’s speech at the Populist rally. The Times article quotes Lease: “We are paying tribute to the Rothschilds of England, who are but the agents of the Jews.” The speech was “received with close attention, and was heartily applauded.”

 

In her book, The Problem of Civilization Solved (1895), Lease wrote: 

 

“Our commercial system would be sadly disturbed if our government granted a monopoly of gallons, bushels and yards to a company of Jews. Then the man who conducts a wholesale or retail business would be compelled to hire a bushel, gallon or a yardstick from the Hebrew before waiting upon his impatient customers. Hunger, haste and pressing necessity alike would have to wait the pleasure and interest of the Jew.”

 

She asked, 

 

“why should money be conceded the quality of intrinsic value? Because tradition and superstition have invested it with an artificial nobility, similar to that of the Divine right of kings; because gold and silver are commodities which can be manipulated in their value by speculators and pirates of the financial world, and because, in the hands of the Rothschilds and their imitators the measure of values may be expanded or contracted to suit their interest.”  

 

Later, Lease described the Rothschilds as the “hook-nose harpies of the House of Heber,” and she accused President Cleveland, “Grover the First,” of being “the agent of Jewish bankers.”

 

I am a descendant of the people Mary E. Lease called “hook-nose harpies of the House of Heber.” However, like the ancestors of most American Jews, my Eastern European forebears were impoverished village peddlers and factory workers, not global bankers. There is no question that Mary E. Lease believed in and used anti-Semitic tropes. As the Black Lives Matter movement challenges statues and place names celebrating the Confederacy and racists, what do we do about Mary E. Lease?

 

In an October 1962 article in the American Historical Review (68/1, pp. 76-80), Norman Pollack of Michigan State University challenged “The Myth of Populist Anti-Semitism.” Based on his research in archival collections of the papers of Henry Lloyd, William Jennings Bryan, and Ignatius Donnelly, Pollack argued that incidents of Populist anti-Semitism were “infinitesimal.” He dismissed accusations leveled at Lloyd and Bryan and described Donnelly’s anti-Semitism as “ambivalent and complex.” According to Pollack, Populist newspapers were principally concerned with the Rothschilds as international bankers, not as Jews. Mary E. Lease’s use of anti-Semitic references in her speeches raises serious doubts about Pollack’s conclusions.

 

Unfortunately, anti-Semitism has a powerful and disturbing history in the United States and, through the speeches of Mary E. Lease, a connection with late 19th century Populism. Three decades after the peak of the Populist movement, quotas were imposed that severely limited Jewish immigration to the United States. Prior to World War II, President Franklin D. Roosevelt, fearing political repercussions if he were seen as aiding European Jewry, refused to allow Jewish refugees fleeing Nazi Germany to enter the United States, and during the war he would not order U.S. forces to bomb rail lines transporting millions of Jews to their deaths in Nazi concentration camps. In 2017, at the Unite the Right rally in Charlottesville, Virginia, marchers, who were later defended by President Trump, chanted anti-Semitic slogans.

 

"The Devil and Daniel Webster" is a short story by Stephen Vincent Benét first published in the Saturday Evening Post in October 1936. It became an Academy Award winning movie in 1941. 

In the story, Daniel Webster, the famed senator from Massachusetts (and a New Hampshire native), is the lawyer for a farmer accused of selling his soul to the Devil. The Devil acts as prosecutor and produces a jury of assorted villains that he expects to convict the farmer and honor the contract. But Webster is very persuasive, and the Devil is incredulous when the farmer is acquitted.

 

Mary E. Lease, populist and feminist, sold her soul to an anti-Semitic Devil. If I were on her jury, I would vote to let her statue remain but insist that a plaque, and all textbook references, include the disturbing fact of her anti-Semitism.

Nuclear Deterrence and Things Left to Chance

David P. Barash is professor of psychology emeritus at the University of Washington. His latest book is Threats: Intimidation and its Discontents (2020, Oxford University Press). 

 

 

USAF General Jack D. Ripper (Sterling Hayden) and RAF Group Captain Lionel Mandrake (Peter Sellers) overcome the credibility gap in Stanley Kubrick's Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb (1964) 

 

 

An ancient dilemma faced by leaders throughout history has been how to prevent — deter — attacks on their realm from outside (invasions) or from inside (rebellions). And an ancient answer, albeit not the only one, has been to threaten that any such perpetrators will be punished.  The most prominent alternative has been attempted deterrence by denial, which has experienced mixed success, from the Great Wall of China to the Maginot Line of 20th century France. 

 

Deterrence by punishment gets particular attention, not only because it underlies nuclear deterrence (there being no effective deterrence by defense), but because its consequences have been so horrific. Some of the most riveting accounts of murderous cruelty come down to us from Bronze Age kings, who famously flayed their opponents and made mountains out of human skulls, often as a “lesson” to would-be opponents. 

 

A more recent but nonetheless hair-raising statement of this perspective came from Sir John Fisher, First Sea Lord, Admiral of the Fleet, and widely regarded as the most important British naval figure after Horatio Nelson. Speaking as the British naval delegate to the First Hague Peace Conference in 1899, Fisher emphasized that deterrence by punishment is likely to be effective in proportion as the threatener has a fearsome reputation: 

 

“If you rub it in both at home and abroad that you are ready for instant war . . . and intend to be first in and hit your enemy in the belly and kick him when he is down and boil your prisoners in oil (if you take any), and torture his women and children, then people will keep clear of you.”

 

Connoisseurs of deterrence by punishment have long struggled with how to make the concept work, challenged not only by a desire to be something less than incorrigibly bloodthirsty, but also — especially in the Nuclear Age — by deep worry about how to make an incredible threat credible. Here is one of the more intriguing and incredibly dangerous “solutions.”

 

Even if you’re not a mountain climber, imagine for a moment that you are. Moreover, you’re roped to another climber, both of you standing by the edge of a crevasse. You’re having a heated argument, trying to get the other to do something that she doesn’t want to do — or alternatively, trying to get her to keep from doing something that she wants to do. The details don’t matter; what does matter is that the two of you disagree, strongly, about what should happen next.

 

How can you get your partner/opponent to bend to your will?

 

This sets the stage for an imaginary situation developed by the late Thomas C. Schelling, one of the leading theoreticians of nuclear deterrence and co-recipient of the 2005 Nobel Prize in economics. In his book, Arms and Influence, Schelling used a mountaineering model to explain what he called the “threat that leaves something to chance.” He proposed it as a way to get around the problem of credibility when it comes to the use of nuclear weapons. It is a very big dilemma, one that has bedeviled nuclear strategists for decades and that despite efforts by the best and brightest (including Schelling) remains unsolved to this day. 

 

When it comes to nuclear deterrence, the credibility gap is easy to state, impossible to surmount: Nuclear weapons are so destructive and their use is so likely to lead to uncontrollable escalation and thus, to unacceptable consequences for all sides, that the threat to use them is inherently incredible. And this, in turn, presents an immense difficulty for strategists hoping to use the threat of nuclear war as a way of either coercing another side to do something they’d rather not (say, withdraw from a disputed region), or to refrain from doing something that they might otherwise do (e.g., attack the would-be deterrer). So, let’s return to those disputatious climbers. 

 

If you simply announce your demand, the other might well reject it. What, then, might you do if you really, really want to get your way? You could threaten to jump, in which case both of you would die; remember, you’re roped together. Because of its suicidal component, however, such a threat would lack credibility, so the other person might well refuse to take it seriously. But suppose you move right to the edge, becoming not only more insistent but also increasingly erratic in your movements. 

 

What if you start leaping up and down, or shuffling your feet wildly? Your credibility would be enhanced, not because falling in would then be any less disastrous, but because you would have increased the prospect of shared calamity. By emphasizing your threat with a soupçon of potentially lethal unpredictability over which you have no control, your literal brinkmanship just might kill both of you: not on purpose, because, as we already saw, that threat would lack credibility, but because chance factors — a sudden loss of balance, a gust of wind — might do what prudence would otherwise resist. Thus, according to Schelling, unpredictability — leaving something, somewhat, to chance — would surmount the problem of incredibility. This terrifying loss of control wouldn’t be a bug, but a feature.

 

In the world of nuclear strategy, the problem of credibility is like that of an adult, trying to deal with a child who refuses to eat her vegetables; the frustrated parent might threaten “Eat your spinach or I’ll blow up the house.” (Don’t try this at home; first of all, it probably won’t work.) Or consider a police officer, armed with a backpack nuclear weapon, who confronts a bank robber by demanding “Stop, in the name of the law, or I’ll blow up you, me, and the whole city.” It’s what led a NATO general to complain during the Cold War, when the West’s nuclear weapons were deployed to deter the Red Army from over-running Europe, that “German towns are only two kilotons apart.” And what led to interest in neutron bombs (designed to kill troops but leave buildings intact), as well as in doctrines (“limited nuclear war-fighting”) and devices (battlefield nuclear weapons), designed to be usable and thus, credible.

 

But the downside of dancing on the edge of a crevasse in order to make your threat credible by leaving it somewhat to chance is that, well, it leaves that thing — and a rather important one at that — to chance! By the same token, making nuclear deterrence more credible by deploying weapons that, because of their size and ease of employment, are more usable means that they must in fact be more usable, a paradoxical situation for weapons whose ostensible sole purpose is to make sure that they won’t be used!

 

In an earlier book, The Strategy of Conflict, Schelling had discussed the means whereby one side might coerce another, despite the fact that it cannot credibly threaten nuclear war, by employing “the deliberate creation of a recognizable risk of war, a risk that one does not completely control, deliberately letting the situation get somewhat out of hand . . . harassing and intimidating an adversary by exposing him to a shared risk.” 

 

Schelling’s hair-raising mountain metaphor may have been inspired by the word “brinkmanship,” which seems to have been first used by Democratic Party presidential candidate Adlai Stevenson, who, at a campaign event in 1956, criticized Republican Secretary of State John Foster Dulles for “boasting of his brinkmanship — the art of bringing us to the edge of the nuclear abyss.” At about this time, Henry Kissinger (then a little-known university professor) began developing both the notion of “limited nuclear war” as a way of circumventing the credibility problem, and the concept of the “security dilemma,” in which “the desire of one power for absolute security means absolute insecurity for all the others.”

 

The idea of security dilemmas has typically been applied to the problem of reciprocal arms races, whereby a country’s effort to counter a perceived military threat by building up its arsenal results in its rival feeling threatened, which leads that rival, in turn, to build up its arsenal — and so on. As a result, both sides end up less secure than they were before. Brinkmanship, à la Dulles and Schelling, introduces yet another dilemma: when a lack of credibility leads to various stratagems intended to enhance credibility, they may well succeed in doing so, but in the process reduce security on all sides.

 

So, the next time you find yourself tethered to an adversary at the edge of a crevasse — whether in a thought experiment or reality — you might want to recall the advice offered by the supercomputer in the 1983 movie WarGames: “the only winning move is not to play.”

Biden will Confront Systemic Conservatism Despite a Mandate for Change

William Thornton's Design for the U.S. Capitol, 1796

 

 

On the Sunday after the November 3rd presidential election, Utah Senator Mitt Romney, the 2012 Republican presidential candidate, congratulated President-elect Joe Biden but insisted that the overall election was an endorsement of conservative principles. He pointed to the gains Republicans made in the House, though they are still in the minority, and the failure of the Democrats to capture control of the Senate, at least so far. Romney found further evidence in the Democrats’ inability to flip GOP-controlled statehouses.

Romney, however, is mistaken in his basic assertion. First of all, Biden won by more than 5 million popular votes, nearly 4 percent more than Trump’s total. The president-elect obtained the highest number of popular votes in the nation’s history. Biden’s margin of victory, contrary to Romney’s claim, is not a mandate for conservatism. Rather, at the very least, the election was a referendum on President Trump’s leadership, which of course Trump used to promote conservative ideas concerning tax cuts for the wealthy and the relaxation of business and environmental regulations.                  

No presidential election outcome reflects any single issue, and it remains for the experts to crunch the numbers and analyze the ingredients that secured the Biden-Harris victory. Yet we already have sufficient evidence that the majority of the American people favor progressive positions on many issues. Surveys by Gallup, Pew, and other reliable organizations consistently show that a significant majority of Americans favor “Medicare for All,” tighter gun safety restrictions, and the freedom of women to have abortions in most situations. A number of states, including Florida, have voted for the $15 minimum wage. Last but not least, polls show that a majority of Americans believe that racial discrimination continues to exist and should be addressed. These are progressive, not conservative, principles, and they are sustained by the Biden-Harris victory.

Nevertheless, Romney is correct in one sense. In the United States what I call “systemic conservatism” continues to prevail. I draw this phrase from our reawakened realization of “systemic racism,” and the two are related. By systemic conservatism I mean the institutional barriers created by the founding fathers to limit popular democracy. One obvious example is the Electoral College. Presidential candidates have to win a majority of electoral rather than popular votes. Had it been otherwise, we would not have had a President George W. Bush or a President Donald J. Trump. Each state’s electoral vote total equals the number of seats it holds in the U.S. House and Senate (the non-state of Washington D.C. receives three electors). The choice of an Electoral College to decide the presidency resulted from efforts of small state and slave state delegates at the Constitutional Convention to ensure their ongoing power. Most troubling, under the three-fifths compromise slave states increased their electoral votes. They did so by securing the constitutional right to count 60 percent of their enslaved people for purposes of representation in Congress and the Electoral College.

In addition, the Senate created rules to frustrate a majority of its members. Until the 1960s, southern senators used the filibuster rule, which allowed unlimited debate in the absence of a supermajority vote, to frustrate attempts to pass civil rights legislation. Republican Majority Leader Mitch McConnell has used this rule to thwart progressive legislation for the past ten years. Even if the Democrats wind up gaining two seats in Georgia, resulting in a Senate tie, they will need sixty votes to enact legislation unless the filibuster rule is changed. And if they manage to do so, a conservative majority on the Supreme Court can still overturn that legislation.

The federal system has often blocked the effects of progressive policies initiated at the national level. The post-Reconstruction Jim Crow era that lasted into the 1960s saw the southern states eviscerate the Fourteenth and Fifteenth Amendments in a variety of ways. Even when the Supreme Court struck down racial segregation in schools in 1954, southern states adopted so-called freedom of choice plans to sidestep the court’s ruling for another two decades. President Franklin D. Roosevelt’s New Deal legislation was instrumental in combating the Great Depression, but it had to be administered through the states. This gave states, particularly in the South, the opportunity to reinforce racial segregation within these programs and also ensure that agricultural subsidies benefitted plantation owners to the detriment of their tenant farmers and sharecroppers, a disproportionate percentage of whom were African American. 

Progressive change does happen within our political system, but it faces serious obstacles. The abolition of slavery and the extension of citizenship and voting rights to African Americans required a Civil War. It took the Great Depression to achieve Social Security, minimum wages, and anti-child labor laws. The Civil Rights Movement was necessary to re-enfranchise African Americans and people of color, just as the Women’s Suffrage Movement had been necessary, decades earlier, to extend the vote mainly to white women. Democratic Party victories following the 2008 Great Recession provided for a short time the majorities needed to move incrementally toward universal health insurance.

There is also ample precedent within the federal system of states serving as laboratories for progressive policies, as was the case in Wisconsin during the early twentieth century. Under the leadership of Governor Robert M. La Follette, Wisconsin joined government officials together with academic advisers to create a reform agenda that was copied throughout the nation. In the early 2000s, Massachusetts under the leadership of Governor Mitt Romney created a system of statewide health insurance that became the model for President Barack Obama’s Affordable Care Act. However, in 2020, with most state legislatures in the hands of Republican majorities, the prospects for reform measures bubbling up from the bottom to the top of the political mainstream are dim. 

Just as President-elect Biden will have to confront systemic racism he will also have to deal with systemic conservatism. It does not look like he will have the necessary legislative majority to achieve his programs. At best, incremental rather than sweeping change is more likely.  

A Surprise Encounter with Zora Neale Hurston

Fred Zilian (zilianblog.com; Twitter: @FredZilian) is an adjunct professor of history and politics at Salve Regina University.

Take a Lesson from the Persistence of the Founder of Modern Thanksgiving

I Dare Call it Treason

Steve Hochstadt is a writer and an emeritus professor of history at Illinois College.

 

 

We are now witnesses to the most dangerous act of selfishness from the King of the Self. Trump knows the evidence for any form of election fraud is silly fantasy. His electoral deficits are beyond challenge: 74 Electoral College votes and more than 5 million popular votes. Yet he repeats his denunciations of American elections, the bedrock of any democracy, denunciations that began when he was only a candidate. In October 2016, he called the election “one big, ugly lie.”

 

His disastrous character flaws are obvious to everyone. We, and here I mean all those who care about the real world around us, must now go beyond psychological analysis to political clarity. Trump is a traitor.

 

That most despised word is not necessarily synonymous with committing treason as defined in law. In the midst of a war against a foreign oppressor, our founders explicitly limited treason in Article III: “Treason against the United States, shall consist only in levying War against them, or in adhering to their Enemies, giving them Aid and Comfort.” For centuries, our judicial system has hewed to that original language, standing on “only”.

 

The label traitor was used in a wider sense by Joseph McCarthy and the legions of supporters for his conspiracy theories. He asserted in 1954 that Democrats were the “party of treason” and the entire administrations of FDR and Harry Truman were guilty of “twenty years of treason”. The foreign enemy was obvious then, even without a declaration of war. Although many Democrats supported the anti-communist witch hunts associated with McCarthy’s name, Republicans have been using accusations of communism, and thus collaboration with the enemy, in elections ever since. John Stormer’s 1964 book “None Dare Call It Treason”, asserting that America had been thoroughly infiltrated by Communists, played an important role in Barry Goldwater’s presidential campaign and in influencing Ronald Reagan’s political philosophy.

 

Some of Trump’s foreign acts have skirted the line to treason, and I expect we will eventually learn more about details of his relationship with Russia that do involve giving aid to an enemy of America. But Trump has been the only President to routinely accuse the other major political party, which represents the majority of voting Americans, of being traitors, even for such petty offenses as not applauding his State of the Union address in 2018.

 

To properly judge Trump’s current behavior, we must think about treason inclusively. A useful conception of treason should include attempts to overthrow our constitutional system from within. Pinochet, the junta in Argentina, Mussolini, and Hitler all conspired to overthrow their own democratic governments. They were traitors to their fundamental laws.

 

Trump embarked months ago on an attempt to bring down the American republic during this election. Claiming that the 2016 vote that put him in the White House was rigged was typical of his unscrupulous boorishness and political egotism, but it was in a different category from asserting that the Democrats were from the beginning plotting to steal this election and are now doing so in front of our eyes. If there were any truth to this accusation, Democrats would be committing treason. But there isn’t, and that is abundantly clear to all but those Americans who believe every word of the President.

 

I might consider taking up arms against any group who was trying to take over my country, and I don’t even own arms. It is not surprising that Trump believers have brought weapons to intimidate elected officials, have plotted to kill Democratic Party leaders, and are now threatening revolt. Trump is purposely destroying the faith of millions of Americans in the legitimacy of their system of government. Afterwards he plans to return to his golden existence. But those supporters are making themselves outcasts in America, armed, angry, and woefully misinformed.

 

Commentators have searched for the proper words to label Trump’s current actions. Thomas B. Edsall asked many election experts about Trump’s behavior, and their responses include the words narcissism, sociopathy, dangerous, irrationalism, delusion, and norm-busting. The political scientist Bryan Garsten wrote in a New York Times opinion piece (November 9) that we should label Trump a “demagogue” as a way to protect our country from the next one. Sean Wilentz, professor of history at Princeton, was more politically pointed: if Trump rejected these election results, “it would be an act of disloyalty unsurpassed in American history except by the southern secession in 1860-61, the ultimate example of Americans refusing to respect the outcome of a presidential election.”

 

Margaret Sullivan wondered in a Washington Post column on Nov. 12 how journalists should “navigate this tricky path”: “How do you cover something that, at worst, lays the groundwork for a coup attempt and, at best, represents a brazen lie that could be deeply damaging to American democracy?”

 

It matters what words we use to label Trump. If the combination of Republican voter suppression and Republican rejection of election results is not called out for what it is, Trump will not just be busting norms, but destroying our still imperfect union. The proper label for that crime is treason.

 

Bank robbers who sign their hold-up notes may be incompetent, but they are still committing robbery. That Trump’s actions are clumsy, petulant, and unlikely to succeed is no reason not to label their criminality properly.

 

And Trump is succeeding, at least in public relations. Seven in 10 Republicans now say the 2020 election was not free and fair: 48 percent of Republicans say it “definitely” was not free and fair, and another 22 percent say it “probably” was not. That’s twice the share of Republicans just before the election who said the race would not be free and fair. Before November, 68 percent of GOP voters said they had some trust in our elections. Now that has dropped to 34 percent.

 

The greatest security risk we face is not an external enemy. Only one-third of Republicans believe in American democracy. More than one-third think it is likely that the results we have heard will be overturned.

 

Trump’s tweet two days after the election, “STOP THE COUNT!”, has been widely misunderstood. Election officials do not heed his tweeted instructions. He was urging his supporters to stop those officials from counting ballots. That was incitement to insurrection.

 

Bringing a prosecution against Trump for treason would be futile and counter-productive. The flag-hugger’s treason is less a legal matter than a rhetorical issue. The recent willingness of news networks to label Trump’s lies as lies may not have swayed many of his supporters, but it has brought clarity to our thinking. Calling his efforts to overturn this election “treason” clarifies the significance of Trump’s anti-American actions.

 

Having a traitor in the White House was the fantasy of the TV thriller “Designated Survivor”. Now life imitates bad art.

 

Steve Hochstadt

Jacksonville IL

November 12, 2020

HNN Will Be Off for the Week

Thanksgiving greetings to the HNN community! I'm personally thankful to have spent most of 2020 connecting with this community of writers and readers and promoting the value of history in the public square. 

HNN will be observing the holiday by taking the week off. New op-eds will return on December 6, and history-related news and opinion from around the web will be reposted starting on November 30. Newsletters for email subscribers will return on December 2.

Also, HNN's annual fund drive will begin after Thanksgiving. The cajoling, badgering and hectoring will commence then. For now, I wish all of you a restful and safe holiday. 

--Michan Connor, HNN Editor in Chief

The Roundup Top Ten for November 20, 2020

Trump’s Big Election Lie Pushes America Toward Autocracy

by Timothy Snyder

"A claim that an election was illegitimate is a claim to remaining in power. A coup is under way, and the number of participants is not shrinking but growing. Few leading Republicans have acknowledged that the race is over."

 

Trump Continuing to Fight on Could Cost American Lives, Jobs and More

by Eric Rauchway

Herbert Hoover's refusal to support FDR's policy agenda during his lame duck period delayed economic recovery at the cost of jobs, homes and lives. 

 

 

Patsy Takemoto Mink Blazed The Trail For Kamala Harris – Not Susan B. Anthony

by Judy Tzu-Chun Wu

Patsy Takemoto Mink, elected in 1964 as the first woman of color in Congress, deserves recognition as a pioneering advocate for gender equity and the rights of Americans in Caribbean and Pacific territories, and for preparing a path for Kamala Harris's election as Vice President. 

 

 

The GOP Test

by Sean Wilentz

If Mitch McConnell, Kevin McCarthy, and their respective caucuses persist, they will have tainted their party far beyond what Trump already has.

 

 

What the Greatest Generation had that the Covid Generation Lacks

by Nicole Hemmer

"The crisis, it turns out, is not the selfishness of this generation of Americans. It's the selfishness of the administration and its allies in Congress."

 

 

Effi Eitam Leading Yad Vashem Disgraces the Memory of the 6 Million

by Derek Penslar and Susannah Heschel

"The politicization and radicalization of the institution will rob it of its legitimacy. Yad Vashem cannot fulfill its responsibilities with Effi Eitam at its helm."

 

 

Trump’s Presidential Library Will Be A Shrine To His Ego

by Paul Musgrave

If former presidents aren't interested in hosting official presidential papers at a library center, there is little oversight of how they tell the president's story. Any future Trump presidential center will likely not function as a library at all, but as a propaganda organ for Trumpism. 

 

 

Against Returning to Normal

by David Walsh

Liberal pleas to return to a "normal" defined by bipartisan consensus ignore the long legacy of ideological conflict and the pursuit of division as a political strategy by the conservative movement. 

 

 

American Democracy Was Never Supposed to Work

by Richard Kreitner

"Merely ousting Trump is not enough without addressing more fundamental weaknesses in our political system, especially an outdated Constitution that continues to serve a minority of wealthy and white citizens and to curb any movements that might threaten their wealth and power."

 

 

Ruby Bridges’ School now Reflects another Battle Engulfing Public Education

by Connie L. Schaffer, Martha Graham Viator and Meg White

The New Orleans school integrated by Ruby Bridges is now operated by a private charter school company, part of a trend that three education scholars say jeopardizes the survival of the entire system of public education in the United States.

 

Reckoning with Marcus Whitman and the Memorialization of Conquest

 

The National Statuary Hall Collection in the Capitol in Washington DC is crowded with embodiments of white supremacy. The collection, which consists of two statues donated by each state, includes Jefferson Davis, president of the Confederacy, installed by Mississippi in 1931. Alexander Hamilton Stephens of Georgia, vice president of the Confederacy, stands opposite Davis. Stephens described the Confederacy as founded on “the great truth that the negro is not equal to the white man.” John C. Calhoun, who represents South Carolina, defended slavery as a “positive good” that had elevated “the black race.”

No fewer than eleven former heroes of the Confederacy are enshrined in the Statuary Hall and other parts of the Capitol. They share space with many other defenders of white supremacist ideology, including Andrew Jackson, one of Tennessee’s honorees, who pushed Congress to pass the Indian Removal Act of 1830, forcing more than 100,000 Indians to leave their homelands in the South to make room for white settlers. Junipero Serra, an eighteenth-century friar, imprisoned and abused indigenous people who refused to convert to Christianity. A bronze statue of him, a cross held high in one hand, was placed in the Capitol by California in 1931.

And then there’s Marcus Whitman, a nineteenth-century Protestant missionary, promoter of the colonization of the American West, and, inexplicably, the hunkiest denizen of Statuary Hall. Donated by Washington state in 1953, Whitman’s statue depicts him as a ripped, muscular frontiersman, nine feet of gleaming bronze atop a seven-ton block of polished granite. He appears to be striding resolutely along an unbroken trail, one foot higher than the other, buckskins taut against linebacker thighs. His strong jaw is neatly bearded, his flowing locks topped by a beaver-skin hat; he holds a Bible in one hand, saddlebags and a scroll in the other.

The statue personifies Whitman’s place in the mythology of the West, not the realities of his life. The only feature that can be verified as historically accurate is the saddlebags, which were copied from a pair he used as an itinerant physician in upstate New York in the early 1830s. He left them behind when he was appointed a missionary in 1835. They ended up in the Presbyterian Historical Society in Philadelphia. Sculptor Avard Fairbanks studied them when he was designing what is otherwise a fanciful imagining of Marcus Whitman.

In 1836, Whitman and his wife, Narcissa, traveled west with another missionary couple to what was then called Oregon Country -- a vast region consisting of the present-day states of Oregon, Washington, Idaho, and parts of Montana and Wyoming. The Whitmans established a mission on Cayuse Indian land near today’s Walla Walla, Washington. Relations between the couple and their hosts were initially cordial but soured over time. The Whitmans expected the Cayuse to be eager to convert to Christianity, take up farming, and live like white people. The Indians were interested in some aspects of the newcomers’ culture and religion, but only to supplement, not replace, their traditional beliefs and way of life.

The Whitmans eventually gave up trying to “save” Indians and shifted their focus to promoting white emigration to Oregon Country. Whitman put it bluntly in a letter to relatives in New York in 1844. “I have no doubt our greatest work is to be to aid the white settlement of this country,” he wrote. 

The Cayuse watched with increasing alarm and resentment as more and more whites moved through their territory, using up scarce firewood, killing game without permission, and depleting grasses needed for Indian horses and cattle. Between 1843 and 1847, nearly 10,000 settlers traveled overland to Oregon Country in rut-making wagon trains. “The Indians are amazed at the overwhelming numbers of Americans coming into the country,” Narcissa Whitman wrote to her parents in the summer of 1847. “They seem not to know what to make of it.”

More than 70 people had settled in for the winter at the Whitman Mission in November 1847, most of them emigrants who had arrived that fall and were too weary, sick, or destitute to continue their journey west until the spring. Their arrival coincided with a virulent epidemic of measles among the Indians. An estimated 30 of the 50 or so Cayuse living in a village close to the mission died between the first week of October and the end of November. Most of the dead were children. The tribe as a whole lost perhaps a third of its members to measles. The disease also swept through the mission community, sickening several young adults and at least a dozen children, but only one (a six-year-old girl) died. Whitman tried to treat everyone who was sick, but the fact that nearly all of his white patients recovered while his Indian patients died convinced some Cayuses that he was deliberately poisoning Indians in order to give their land to white settlers.

On November 29, 1847, a group of Cayuse -- fourteen to eighteen by most estimates -- attacked the mission, killing the Whitmans and 11 male emigrants in what became known as the Whitman Massacre. Congress responded to news of the attack by passing a long-delayed bill establishing the Territory of Oregon. The bill had been stalled for more than two years by a debate over whether slavery would be permitted in the new territory. (In the end, it was not). The “massacre” served as a rallying cry for a two-year war of harassment and retribution against not only the Cayuse but any Indians suspected of being allies or sympathizers. Finally, in the spring of 1850, five Cayuse surrendered to the territorial government and were hanged. 

The Whitmans were virtually canonized in the years after their deaths. Local historical societies and community boosters placed plaques, monuments, and roadside markers honoring the couple in ten states and the District of Columbia. Whitman himself became one of the most memorialized figures in the Northwest. A county, a college, a national forest, half a dozen public schools, and numerous other enterprises carry his name. His former mission was preserved as a National Historic Site.

The pinnacle came in 1953, when the bronze Whitman was installed in Statuary Hall. Supreme Court Justice William O. Douglas gave the keynote speech. Douglas (later a liberal icon) was a graduate of Whitman College, established in Walla Walla in 1859 as a “living monument” to the missionary. He praised Whitman’s role in promoting white settlement, called him a martyr, and deplored the “treachery” of the Indians who killed him.

Now, during a national reckoning over who and what should be commemorated in America, Whitman’s halo has slipped. A Seattle legislator introduced a bill last year to replace the statue with a more contemporary hero. The bill died in committee but supporters have vowed to bring it up again. A bronze copy of Whitman’s statue on the outskirts of Whitman College has been vandalized several times. The city of Walla Walla, which owns the statue, is wrestling with the question of what to do with it.

Last summer, the House of Representatives voted to remove all the Confederate statues from the Capitol and send them back to their home states or to the Smithsonian. The measure passed with an unusual display of bipartisanship: 305 to 113. Senate Majority Leader Mitch McConnell (R-Kentucky) has thus far refused to introduce it to the Senate, calling it an attempt to “airbrush the Capitol.” Still, the momentum seems to be on the side of banishing symbols of white supremacy from the public square.

But what to do with the statues, once they are forklifted away? Perhaps they could all be moved into a special museum. Call it the Hall of Fallen Heroes. Exhibit text would explain who the ex-honorees were and when, why, and by whom they were celebrated. Context is everything. Most of the Confederate statues are products of the Jim Crow and early Civil Rights eras. They romanticized the Confederacy as a noble and heroic “Lost Cause” and downplayed the role of slavery in the Civil War. They also served as not very subtle tools of intimidation to black and brown people who were trying to assert their civil rights. 

That same period (roughly the 1890s through the 1950s) saw the proliferation of monuments idealizing the virtues and sacrifices of white colonizers in the American West, including the Whitmans. These memorials ignored the impact of conquest on indigenous peoples. Glorifying the pioneers was a way to justify what had been done in the past and perhaps ease anxieties about the future -- the solidity of stone and metal suggesting that the sons and daughters of the pioneers would continue to prevail.

There’s a delicate balance between preserving monuments to the past and respecting the changing values of the present. But the memorial landscape is haunted by the ghosts of slavery and the legacy of colonialism. It’s past time to move toward a more honest and nuanced accounting of American history.

 


Recovering Acts of Progressive Patriotism: Teaching Through Protest Music

Anna Klobuchar Clemenc and the Italian Hall of Calumet, MI

 

 

The concept of patriotism has received a great deal of attention during the Trump administration, with the president and his supporters doubling down on the celebratory and mythic varieties.  Lacking any formal ties to military service, the president is left with what scholar Ben Railton refers to as exclusionary mythologizing, which is not patriotism at all, but rather blind nationalism (read Ben Railton’s essay on patriotism for HNN—ed).  At the top level, it involves bullying and bombast.  On the ground, it includes hawking of merchandise, owners adorning their trucks with both American and Trump flags, highway overpasses commandeered by fanatic supporters flying similar flags, and a whole lot of divisiveness.  

Regardless of the outcome of the election, such branding will no doubt appeal to certain politicians and segments of the population. Yet Railton sees an opportunity to highlight other types of patriotism in America’s past, including participatory and critical patriotism, underscoring the importance of recovering stories and histories that are often overshadowed or forgotten. While teaching a recent course on music history titled “Roots, Rock, Rebels,” my students and I had the opportunity to explore two such cases of critical patriotism that made specific use of the American flag. In both cases, however, hope gave way to backlash or defeat, at which point musicians offered yet another type of protest.

The course came together through a mix of inspirations, starting with the first piece of the puzzle, when our department hosted authors Doug Bradley and Craig Werner during their lecture tour promoting the book We Gotta Get Out of This Place: The Soundtrack of the Vietnam War. Their presentation included conversations with multiple veterans in the audience and did not shy away from the theme of critical patriotism. They played numerous samples of music, with Werner suggesting that a favorite class activity was listening to Marvin Gaye's What's Going On album from start to finish. Werner is also the author of A Change is Gonna Come: Music, Race, and the Soul of America, a book not only suitable for the class, but also containing multiple examples of participatory and critical patriotism.

 

As these ideas were coalescing, the final piece of the puzzle came together with the publication of Daniel Wolff's Grown Up Anger: The Connected Mysteries of Bob Dylan, Woody Guthrie, and the Calumet Massacre of 1913. Wolff's study of a tense labor dispute and the tragedy that followed became the core research focus of our course. Numerous primary sources guided our study, along with Steve Lehto's comprehensive book Death's Door: The Truth Behind Michigan's Largest Mass Murder. Publicly, Lehto was critical of Wolff's scholarship, but that criticism misses the significance of Wolff's book: Wolff shows a recurring pattern in which a struggle occurs, those holding power win out, and the victors attempt to extinguish the hope of those seeking progressive change, sweeping both the strugglers and their place in America's identity into the dustbin of history.

Musicians, however, stood as historical witnesses to forgotten or frustrated histories and were, according to Wolff, willing to "tell the truth" through anger. Channeling anger instead of folkish hope, Woody Guthrie looked back on the Calumet massacre in writing the topical ballad 1913 Massacre in 1941. Bob Dylan later performed the song and used the same melody in writing Song to Woody. It was Dylan's classic Like a Rolling Stone, however, that Wolff references as the ultimate truth-telling song, full of anger and not hope. Using the historical skill of sourcing that historians employ, one realizes the song was released in 1965, two years after the progressive hope of the March on Washington, the event most often associated with Martin Luther King Jr.'s iconic speech.

Although Wolff does not make a big deal of it, my students and I zeroed in on two very noticeable uses of the American flag as we reviewed the interconnected histories of Woody Guthrie, Bob Dylan, and Mahalia Jackson, along with the similar themes of hope and critical patriotism displayed first by immigrant laborers and later by African Americans across the first seven decades of the twentieth century.

The first came in our study of the Calumet massacre that angered Guthrie many years later, so called because, after an intense standoff between striking miners and the Calumet & Hecla Mining Company, seventy-three individuals, most of them children, lost their lives in a crushing stampede in the stairwell of the town's Italian Hall after someone opened the door and yelled "fire." After the false alarm was sounded, the door was barred shut, and many believed mine ownership and their police force were behind the tragedy, adding insult to injury. Throughout the buildup and aftermath of the tragedy, however, labor leader Annie Clemenc stood tall. In fact, she was often referred to as "Big Annie." She was also not afraid to use the American flag to support the cause of first- and second-generation laborers seeking better wages and working conditions.

 

Annie Klobuchar was born in 1888 to Slovenian parents living in Calumet, Michigan, a town in northern Michigan's copper country that drew many first- and second-generation immigrants to work the rich copper vein. In 1913, tensions boiled over between the Calumet & Hecla Mining Company and its laborers. By mid-summer, the laborers were on strike. This did not stop the owners from hiring replacement workers and security forces to harass the strikers. In the midst of the turmoil, Annie started leading daily marches on behalf of the striking workers, putting the full force of their progressive demands for better wages and working conditions behind a ten-foot American flag, which she carried at the head of each day's march. Over time, she was joined by defense attorney Clarence Darrow; John Mitchell, president of the United Mine Workers of America; and labor activist Mother Jones. The movement's leaders were proud to call themselves socialists, or at the very least to read from socialist newspapers. They were also proud to use the ideals of the American flag to support their progressive hopes as they aimed constructive criticism at the mine ownership who held the balance of power.

 

Using replacement labor, force, and even time, ownership won out as 1913 wound down and winter approached. Looking to add a bit of cheer to their lives, labor leadership hosted a Christmas party for many of the children in the area. The ensuing disaster further broke their spirit, yet Annie did not abandon the flag. On December 28, 1913, Clemenc led a parade of mourners, once again carrying her ten-foot flag at the head of the procession. This time it was draped in black crepe, and it was later positioned prominently at the cemetery as eulogies were read in Finnish, German, Italian, English, and Croatian for individuals soon forgotten in the American story. That is, until Woody Guthrie's anger sought to "tell the truth," while Guthrie himself was working out definitions of Americanism against the backdrop of World War II.

The second prominent example of the American flag as part of a progressive cause came later in the course, when we studied singer Mahalia Jackson in Werner's book A Change is Gonna Come. Jackson was a musical icon who brought gospel from the church to mass audiences. After meeting King in 1956, she agreed to sing at a fundraising rally in support of the Montgomery Bus Boycott. Thereafter, she remained loyal to King, lending not only her talents to his speaking engagements, but also motivation. She was at the March on Washington, where images show her framed by two American flags as she made her way to the stage, and framed by flags again as she performed. Playing the audio of the performance while projecting an image of Jackson in front of a flag was a poignant reminder that those marching and taking to the stage identified with the ideals of the flag and dared to dream of inclusion in America's story—past, present, and future.

 

The art of historical detection also indicates that Mahalia Jackson played a critical role in encouraging King's "I Have a Dream" speech. King took to the stage with a pair of metaphors in mind. At a point when his prepared speech, which did not include the lines about the dream, hit a bit of a snag, Jackson leaned in, as she had many times before, and offered the words, "Tell them about the dream, Martin! Tell them about the dream!" Flags fluttering in the backdrop were met with words of hope as King, Jackson, and others standing at the base of the Lincoln Memorial participated in a protest of critical patriotism that dreamed of an inclusive America. Less than a year later, the Civil Rights Act followed. Despite the hope and the subsequent policy change, racism and profiling persisted. So did violence, as the second half of the 1960s stretched toward a chaotic end, betraying the hope of the first.

Historical detection also tells us Bob Dylan participated in the March on Washington, as protestor and performer, taking the stage, albeit somewhat awkwardly, to sing Only A Pawn in Their Game. Already aware of historical injustices invoked by his hero Woody Guthrie, Dylan added to the catalog of injustices, increasingly willing to get at the truth through anger. In 1964, the same year as the Civil Rights Act, he released The Lonesome Death of Hattie Carroll, chronicling 1960s racism, on his album The Times They Are a-Changin'. One year later, he released Like a Rolling Stone on his Highway 61 Revisited album. Wolff argues this song found Dylan at his most angry, as if outrage were the only way to respond to the world. Over the following years, Dylan's music changed, even prompting critic Greil Marcus to muse "What is this shit?" in a Rolling Stone review.

In the mid-1970s, a rejuvenated Dylan resurfaced in time to celebrate America's birthday by organizing the Rolling Thunder Revue. It is no coincidence that the first stop on the tour was Plymouth, Massachusetts, a location where rebels once settled, but also one that assumes meaning in the malleable mythmaking of America. One of the more electric songs in the set was Dylan's recent release Hurricane, a protest song telling the story of Rubin "Hurricane" Carter, a tale of racial profiling and injustice that would sadly fit 2020 America.

The Rolling Thunder Revue also included a stop at the Tuscarora Reservation in New York, where Dylan led a rousing cover of Peter La Farge's The Ballad of Ira Hayes, proving more truth needed to be told. Hayes' post-World War II story ended tragically following his act of participatory patriotism at Iwo Jima, serving his country as a Pima Native American and, most iconically, standing among the handful of Marines raising the flag over the island in victory. The flag-raising is part of the American myth, while Hayes' larger story, and even the service of Native Americans to America, is often forgotten. Tellingly, the Rolling Thunder Revue ended each night with a rollicking rendition of Woody Guthrie's This Land is Your Land, a song many view as an inclusive alternative to the national anthem.

The connections explored here unearth histories of critical patriotism expressed through progressive protest. Throughout, the flag appears as the ideals of fairness, liberty, and justice are applied to its meaning. As such, these are protests not against the flag, or America, but against injustices, with the hope of finding a better, more equitable and inclusive America. When these hopes are swept under the rug, and struggles are ignored and discarded into the dustbins of history, the injustices can reach the point of being simply overwhelming. Woody Guthrie's 1913 Massacre and Bob Dylan's Like a Rolling Stone (along with his mid-1970s material) are songs of anger and truth-telling, with the musicians using what was familiar to them.

Contextualizing events from the past in which hopes and dreams that rallied behind the flag were crushed, ignored, or met with backlash is critical in framing current discussions and broadening definitions of patriotism. For Guthrie and Dylan, that meant using music to express disappointment and even anger after hope was ignored. Might we consider Colin Kaepernick's act of kneeling during the anthem, along with the fellow athletes who have used their platforms in following his lead, in a similar vein? Theirs is not a protest against a flag, but against injustices that exclude people from all the positive and progressive meanings the flag could represent.

A Medieval Perspective on the Public Acceptance of Women as Leaders

Jeanne de Penthièvre, Duchess of Brittany (1319-1384). Image Arras Bibliothèque Municipale. CC BY-NC 3.0

 

 

 

It has recently been announced that America’s next vice president will be a first: Kamala Harris, a Black woman, has been elected to the second most powerful office in the United States. Her achievement seems fitting given the increasing prominence of women on the political stage in recent years, from Hillary Clinton’s winning the popular vote for President of the United States in 2016, to Jacinda Ardern’s recent landslide reelection in New Zealand, to the progressive leadership of the Squad (all four of whom have just won another term in the US House of Representatives).

 

The historical significance of these victories, however, lies not only in the power that these women exercise, but in how they got to their positions. In practice, women across the globe have held positions of political power for all of human history (a handful of their stories have been creatively illustrated and retold by Jason Porath over at Rejected Princesses!). But in the twenty-first century, women of all kinds still have major inroads to make into the tradition of 'pale, male, and stale' leadership that has dominated Western democracies (meanwhile, the only countries whose legislatures have passed 50% women are Rwanda, Cuba, and Bolivia). Democratic political systems, in fact, raise a more complicated question than whether women can exercise power: why should we choose women to be in charge?

 

Inheritance vs. Appointment

 

This is a question which people have struggled with for a very long time, as a case of disputed succession from fourteenth-century France shows. In 1341, the duke of Brittany died without any children to inherit the duchy. In the years leading up to his death, the general assumption had been that his niece, Jeanne de Penthièvre, would succeed him to the ducal title. But at the same time, people had worried that the duke’s younger half-brother, Jean de Montfort, would challenge his niece’s claim and try to seize Brittany for himself—which, when the time came, is precisely what happened.

 

It was far from the first time that an unscrupulous uncle had snatched power from the hands of a younger female relative in line to inherit. Just a few decades earlier, for example, Philippe Capet had taken the crown of France in 1316 by denying the inheritance of his late brother’s daughter Jeanne. Ironically, six years later, his own daughters were similarly disinherited in turn by their uncle, Philippe’s younger brother Charles.

 

This double disinheritance helped establish the principle that the kingdom of France could not be inherited by a woman (or even, it was soon determined, by a woman’s male descendants). But the French throne was somewhat unusual in this: most of its neighbors, as well as all the duchies, counties, and other lordships that made up the kingdom itself, had no such restrictions. As a result, Jean de Montfort’s attempt to take Brittany from Jeanne de Penthièvre didn’t turn out quite the same way.

 

To arbitrate the dispute, both sides presented legal arguments in support of their claims to the king's court in Paris. These cases raised a number of different issues, but a central pillar of Jean de Montfort's argument rested on the issue of Jeanne's gender. A woman could not inherit such an important and powerful polity as the duchy of Brittany, his case claimed, because being a duke meant exercising military and judicial powers essential to the wellbeing of the kingdom as a whole. Women were, however, too weak to bear arms and too capricious to exercise judgment, nor were they supposed to wield authority over men. By this reasoning, Jeanne was intrinsically disqualified from the succession.

 

Jean's lawyers had not invented any of these ideas, which had established currency in medieval society, and they're not particularly interesting in themselves. The problem was that they didn't match up with reality, and medieval people knew this too. Women could be, and indeed had been, duchesses of Brittany in their own right, as Jeanne de Penthièvre's own legal team was quick to point out.

 

More importantly, there were equally convincing reasons why this was perfectly acceptable. If a duchess was called on to do something of which she, as a woman, was not considered capable—such as performing military service—then she could send her husband or another substitute in her place. This was standard practice, not only for women but also for priests, the elderly, and anyone else whose condition did not allow them to fight. While the lawyers therefore did not challenge the misogynistic premise of the original argument, their solution recognized that the problem of military service was not, fundamentally, an issue of gender.

 

But it's in their discussion of justice and authority that Jeanne's lawyers made a particularly telling distinction. They conceded (again) that one would not deliberately choose a woman to be a judge, but stated uncompromisingly that when women inherited, they could exercise justice. It was only natural that officers appointed for public administration be men, for a good ruler would necessarily select the 'fittest' candidate. But when it came to noble patrimonies, in which power was passed by blood rather than by appointment, women's rights to public power were as valid as men's.

 

Perhaps somewhat paradoxically, the king of France sided with Jeanne and granted her the duchy of Brittany (albeit primarily for political, rather than legal, reasons). And, to conclude this story, she retained it for more than twenty years until its conquest by the son of Jean de Montfort—also called Jean—in 1364. But the issues debated in court in 1341 show that from one point of view, it was less what a woman could do with power that mattered, than how she got it in the first place. Or, to put it another way, they moved the problem away from women leaders themselves, to the idea of choosing women leaders.

 

Women and Power by the Numbers

 

This distinction had implications reaching far beyond the courtroom. While the administrations of medieval rulers, from the royal council down to their regional representatives, were composed entirely of men, the social landscape of lordship was more varied.

 

It can be hard to estimate broad social trends in the Middle Ages, but some sources allow us to get pretty good samples of what was happening on the ground. One such opportunity comes from 1389, when King Charles VI made a grand tour of his domains in the south of France. During this trip, more than four hundred members of the local aristocracy came to perform homage to him—it was a legal requirement for them to retain their lands, but also a rare opportunity to meet (and be seen with) the king in person. The king’s accountants kept careful track of who showed up and the lands they held from the king, giving us an unusually detailed snapshot of the local elite.

 

These records highlight a number of ways women could come to exercise lordly power. Some did so as guardians of underage children, usually their sons and/or daughters. Others held lands as widows: women were usually left a 'dower' property to serve as a sort of retirement pension, over which they were effectively lord until their death. But the most common means of becoming a lord, just as it was for their male counterparts, was through direct inheritance. And, all told, women made up between one-fifth and one-quarter of these lords (22%).

 

Similar calculations have been made for other regions of France. In the twelfth and thirteenth centuries, about 20% of the fiefholders of the county of Champagne were women. Likewise, English lordships were inherited by women about 20% of the time. These figures may seem surprisingly high, because we’re used to picturing medieval lordship as an exclusively masculine world. But they’re less surprising when put in the context of what inheritance meant. While daughters were rarely on an equal legal footing with sons in the competition for their parents’ wealth, social and biological factors meant that for 20% of families, there weren’t any sons to worry about—a pattern which holds true for modern societies with similar patterns of marriage and inheritance.

 

What is more telling, however, is how this surprisingly steady ratio compares to current-day patterns of women in elected positions. Globally, the UN reports that ‘[o]nly 24.3 per cent of all national parliamentarians were women as of February 2019, a slow increase from 11.3 per cent in 1995’. While in Western Europe this average is somewhat higher at 33.4%, the US House of Representatives is still at 23.2% and the Senate at 26%. And these rates have been climbing. Around when I, a Millennial, was born, women made up a mere 5% of the US Congress—a proportion that compares quite unfavorably with the number of women who did homage to the king of France 600 years before.

 

While the role of a modern legislator cannot be simply analogized to that of a medieval lord, the comparison demonstrates yet another side of why the democratization of politics has not been a straightforward improvement for women. And the problem is exacerbated the higher the office in question. For instance, only 6.6% of elected heads of state were women in 2019, and this number had in fact decreased from two years prior. Likewise, the proportion of women appointed to cabinet positions lags slightly behind their representation in the legislature, at 20.7%, particularly when considering the more prestigious posts.

 

This is not a new problem, and again, the patterns visible in historical practices of inheritance can shed light on the sticking points of our elected offices. The women (and men) who did homage to King Charles in 1389 were generally minor landholders, and the low stakes made it relatively uncomplicated for women to inherit. But as we saw earlier, the Breton succession dispute expressed anxieties about what would happen if a woman inherited an entire principality. And, around the same time, female heirs were barred from the throne of France itself (though queens consort and queens regent could and did continue to exercise considerable, publicly recognized authority). The exclusion of women from positions of highest authority reinforces the ideal of leadership itself as male rather than female, which only makes choosing women leaders harder in turn. It's a cycle we have not yet managed to break.

 

The more things change?

 

Without denying the progress that has undoubtedly been made, and is being made, in women's political participation through formal office, the loose parallels of the Middle Ages offer new perspective on the still disproportionately low representation of women in government. Women could, historically, 'default' into positions of power through the flukes of inheritance even if social convention would not have selected them for the position. But the opening of elected positions to women has not produced as substantial a rebalancing of the political landscape as we might have expected.

 

We must, therefore, consider not only the availability of paths to power for female leaders, but also work to shift the social perception of female candidates as lesser alternatives to their male counterparts. Our unwillingness to see women as a good choice for leadership remains a real problem in how we, as a society, continue to allocate power—indeed, a problem that becomes even more salient in (ostensibly) democratic societies. While we are all taught the story of the long fight for women's suffrage and their right to choose leaders, the ongoing fight for women to be chosen as leaders themselves is not so well-defined a story. And just as non-elective processes meant that some lordly power was filtered to medieval women, measures such as EU quotas for women in politics are merely a minimum step in counteracting our ongoing reluctance to see female politicians as viable candidates on their own.

 

Obviously, the election of women is not an end in and of itself, just as the presence of female lords did not make medieval society more fair. For one thing, consideration only of gender (in a binary view, at that) leaves aside other biases (especially race and, more specifically, misogynoir) that affect whom we choose to govern. And one need only look to recent events to see why the appointment of the wrong woman can be disastrous for women's rights—though even in less extreme cases, some women have always been systematically privileged at others' expense. The goal, therefore, must not be to reinforce our present oligarchies through women's participation in them, replicating the social solidarity that bound medieval noblewomen more closely to their male peers than to their female social inferiors. Nevertheless, the act of choosing female leaders helps reshape the long-entrenched narrative that women should only follow, and in so doing demands that we recognize that they have always been capable of leading.

How John Hersey Exposed the Human Face of Nuclear War: Lesley Blume on Her New Book "Fallout: The Hiroshima Cover-Up and The Reporter Who Revealed It to The World"

 

The atomic bomb embodied the absolute evil of war, transcending lesser distinctions such as Japanese or Allies, attacker or attacked.

Kenzaburo Oe, Nobel Prize Laureate for Literature (1994)

 

 

“Little Boy” was the innocuous code name for the uranium-235 atomic bomb that fell on Hiroshima, Japan, on August 6, 1945, at 8:15 AM, Japan Standard Time. The bomb exploded about 2,000 feet above the ground with the force of 20,000 tons of TNT and incinerated much of the once thriving city. 

At detonation and in the ensuing months, Little Boy killed more than 100,000 people, at least 90 percent of whom were civilians. Estimates of the total deaths from the blast range as high as 280,000 people by the end of 1945, but exact figures could never be determined because of the immediate chaos and because so many people were cremated in the firestorm.

Initial news reports on the bomb indicated that it was powerful but similar to a large conventional bomb. The American public read sanitized reports and statistics on the tremendous toll of the bomb. Papers and magazines ran black and white photos of the mushroom cloud, aerial views of the remains of the city, and damaged buildings, and reported figures on dwellings, warehouses, factories, bridges, and other structures that were destroyed. 

However, the reports to the American public following the atomic bombings of both Hiroshima and then Nagasaki contained little information on how the destructive new devices affected the human beings trapped under the mushroom clouds. Indeed, the US government celebrated the new weapons while suppressing reports on agonizing radiation injuries and poisoning, complicated thermal burns, birth defects, illnesses, and other novel and horrible medical consequences of nuclear war. And, after the war ended, the military closed the atomic cities to reporters.

Legendary reporter John Hersey, already a Pulitzer Prize-winning novelist and a renowned journalist by 1945, set out to learn about the human face of the Hiroshima bombing. His resulting August 1946 article for the New Yorker became a classic of journalism and eventually a book for the ages. By telling the story from the perspective of six survivors—a young mother, a female clerk, a minister, two doctors, and a German priest—Hersey's report captured readers with a new form of journalism, one that moved beyond cold facts and statistics to detailed personal accounts of witnesses, vividly conveying the moments leading up to a historic catastrophe and its aftermath.

In her new book Fallout: The Hiroshima Cover-up and the Reporter Who Revealed It to the World (Simon & Schuster), acclaimed author and journalist Lesley M.M. Blume recounts the story of the atomic bombing of Hiroshima; government efforts to hide the nature of the terrible new weapon; and John Hersey's journey to reveal the reality of the atomic bomb and how he came to write "Hiroshima," a report of meticulous journalistic detail as well as an admired work of art that elevated human voices above the soulless statistics and gray wire photos.

Ms. Blume writes vividly as she details this hidden history and demonstrates the value of independent journalism in holding the powerful to account. Her meticulous research included interviews and archival work that revealed new findings on postwar government press relations and on official actions to hide the reality of nuclear war from the public. Her revelations include the never-before-reported role of Manhattan Project director General Leslie Groves in reviewing Hersey's provocative article.

Ms. Blume is a Los Angeles-based journalist, author, and biographer. Her work has appeared in Vanity Fair, The New York Times, The Wall Street Journal, and The Paris Review, among many other publications. Her last nonfiction book, Everybody Behaves Badly: The True Story Behind Hemingway’s Masterpiece The Sun Also Rises, was a New York Times bestseller, and she has written several other nonfiction books and books for children. Ms. Blume has also worked as a newspaper journalist and as a reporter-researcher for ABC News. And she has a lifelong interest in history. She earned a B.A. in history from Williams College and a master's degree in historical studies from Cambridge University as a Herchel Smith scholar. Her graduate thesis concerned the US government and press relations during the 1991 Gulf War.

Ms. Blume generously discussed her interest in history and her new book by telephone from her office in Los Angeles.

 

Robin Lindley: Congratulations on Fallout, your new book on author John Hersey and his classic account of the human face of atomic warfare, “Hiroshima.” Before I get to the book, I noticed that you have an advanced degree in history and that you often write about the past. What is your background in studying and writing about history?

Lesley M.M. Blume: I've always been a history obsessive, since I was a little girl. I read a lot of fiction then but, as I grew up, I gravitated toward nonfiction. I remember one time, when I was around eleven, one of my parents' friends came over and I was curled up in a corner and reading. She asked what I was reading, likely thinking that it was something like Babysitters Club, and I showed her the book cover. It was The Diary of Anne Frank. I've just always gravitated to history, especially World War II. 

I studied history at Williams College, like my dad did before me, and my focus there was 20th century history with a concentration on World War II. Then I went to Cambridge University for a graduate degree in historical studies. By then, I had become keenly interested in newsroom history and war reporting, and I did a master's thesis on the American media during the Gulf War in 1991. I looked at how that story had been rolled out to the public, and where that fell in the larger scheme of relations between the US government and the press corps and how that relationship had evolved since World War II. The thesis was about patriotism and war reporting and how patriotism waxes and wanes from conflict to conflict, along with the level of cooperation between the press and the military. 

Over the decades, I have had a continued interest in World War II and in war reporting and wartime newsrooms. So, in many ways, Fallout was the culmination of decades of study and interest in war history and reporting. 

Robin Lindley: What inspired your deep dive into the story of John Hersey and his book Hiroshima?

Lesley M.M. Blume: I knew I wanted to do a big, historical newsroom narrative, and there was also a personal motivation. 

The press has been under unprecedented attack in this country since 2015, and I have been disturbed and quite disgusted by the relentless attacks and the designating of journalists as enemies of the people. It was quite a shock when that vernacular first started to surface in 2015 and really got underway in 2016. 

I wanted to write a historical news narrative about America that would show readers the extreme importance of our free press in upholding our democracy and serving the common good. As these attacks have accelerated, not enough people have been defending the press or understanding what would happen to them specifically, not just to the country, but to them individually, if we didn't have a free press. 

It’s curious: the Hersey story found me as much as I found it. I was nosing around the European theater of World War II for a newsroom story before I came to this Pacific theater narrative. And, when I found Hersey's story, it seemed the purest example of the life or death importance of good, independent investigative journalism. I couldn't believe that the story, in the way that I ultimately approached it, hadn't been told yet. And, when a historian or journalist finds an untold story like that, you leap on it. 

Robin Lindley: The story is very timely and a tribute to the role of the free press in a democratic society. And there are many parallels now to handling of the deadly global COVID-19 pandemic as the administration attacks the press and spreads lies and misinformation about a health threat to all citizens, as tens of thousands die.

Lesley M.M. Blume: The pandemic is a global existential threat, which is exactly what I'm detailing in Fallout. Now, the administration is downplaying and covering up an existential threat just as the government in 1945 kept the American public in the dark about the reality of the bombs that were created in secret and detonated in their name. The parallels are uncanny and disturbing. 

Robin Lindley: That’s instructive on the role of the press. How did the book evolve for you? Is it the book now that you initially imagined?

Lesley M.M. Blume: The research surprised me, especially the extent of the coverup, and how concerted it was. 

I first approached the story from the point of view of a journalist covering another journalist. I asked how on earth did Hersey cover a nuclear attack zone in 1945? I was interested in how he got into Hiroshima and how he got people to speak with him. And then, when I started to really dig into the story, I realized that other scholars who preceded me had documented the coverup without really celebrating the critical role that Hersey played in revealing it. Nobody else had connected the dots in this way before.

Robin Lindley: What was your research process?

Lesley M.M. Blume: When I began the project, I told my agent and my editor not to expect to hear from me for months because I would be reading. I dug up a ton of reporter memoirs before I started with archival data. It was background, background, background. I read biographies of important figures such as General Douglas MacArthur and Manhattan Project head General Leslie Groves. 

I also reached out early to people to interview because, when researching people of Hersey's era, I had to get to those who knew him fast. There were a few of Hersey's friends and colleagues whom I spoke with a few years ago who are no longer with us. But there's also a disadvantage in seeing them early: because I wasn't yet as steeped in the material and in Hersey's world, I wasn't approaching them from a position of assured expertise.

After the initial reading and interviews, I had a better idea of what to look for in the archival records.

Robin Lindley: Thanks for sharing your process. I noticed that you also traveled to Hiroshima. That must have been very moving. 

Lesley M.M. Blume: It was one of the most extraordinary experiences in my life, and one of the most disturbing. Hiroshima is now a fully rebuilt city, with around three million inhabitants. It was almost completely destroyed and there is very little left to indicate what it had been like before the bombing. 

When I got off the train station and a sign read “Welcome to Hiroshima,” I almost crawled out of my own skin. It’s a vibrant, modern metropolis, yet Hiroshima’s leaders and residents definitely see the city as a witness to nuclear Holocaust.  But they also see the city as a Phoenix that has risen from the ashes, and as a monument to human resilience. I respect the latter view, but going to that city was almost a traumatic experience for me. I couldn't eat or sleep almost the entire time that I was there researching—knowing what happened there. 

I interviewed the Governor of Hiroshima Prefecture and he admitted that they still find human remains every time they dig for a new development there. He said that, if you dig three feet, you hit human bones, so it’s a city that's built on a graveyard. I'll never forget that trip.  

Robin Lindley: That had to be haunting. Didn't you also speak with some survivors of the bombing? 

Lesley M.M. Blume: I did, including the last surviving central protagonist of Hersey's book: Koko Tanimoto, the daughter of Reverend Kiyoshi Tanimoto, who was one of Hersey's six protagonists. She and her mother also appeared in his article. Koko had been eight months old when the bomb went off; she and her mother were in their family home, not far from the point of detonation, and the house collapsed on them. Somehow, they survived, and her mother was able to dig them out of the rubble just before a firestorm consumed their neighborhood. It was an absolute miracle that they survived.

Koko was 73 or 74 when I met her. We walked together through central Hiroshima and we went to the monuments there. She showed me where the exact point of detonation had been, which is actually quite an under-visited site.  There's only a modest marker there, but it's in front of a low-rise medical building and a 7-11, of all things. I don't know if I would have found it without her.

It was very emotional to walk through the city with Koko. Ironically, she considers America to be almost like a second home at this point. Her father, Reverend Tanimoto, had become an antinuclear advocate over the years, and she did a lot of traveling with him. She's also a peace advocate and has spent a lot of time in the US. For her to have been on the receiving end of nuclear attack at the hands of America, yet still have such generous feelings toward us, was astonishing to me.  Fallout is dedicated to her.

Robin Lindley: Your memories of Hiroshima are striking. Did you find any surprises or new government information in your archival research?

Lesley M.M. Blume: I'll try to be concise on this topic, but the short answer is yes. When I was doing my last book on Hemingway, coming across new information was like scratching water from rocks, but there was break after break with this book. The research gods favored this project. I don't know what I did to deserve it, but I'm grateful to them. 

My Leslie Groves revelation was huge – at least, to me. That came from a misfiled document in the New York Public Library’s New Yorker archives. I had very slim expectations about finding anything new in that archive because the New Yorker has had several biographical books written about it, and its editors have all had biographies, except for William Shawn. 

The very last day I was in that archive, I went through a file that I thought was irrelevant; it contained documents pertaining to stories that the magazine had submitted to the War Department for censorship – but in earlier years of the war. Hersey was reporting on Hiroshima in 1946, but I was curious to see how the magazine had interacted with censorship officials at the War Department, and how cozy the relationship had been. That's when I found the first document that indicated that Hersey's article "Hiroshima" had been submitted not only to the War Department for vetting, but to General Leslie Groves – head of the Manhattan Project – himself. I freaked out right in the middle of the archive. I stared at this document and couldn't believe it. I sent a phone photo of it immediately to one of my research associates and asked, 'Am I reading this right?' Yes, I was. I had a call right away with my editor because it changed everything in this book. It changed Hersey's "Hiroshima" from a subversive piece of independent journalism researched under the nose of Occupation officials to almost a piece of sanctioned access journalism.

And then I found confirming evidence in Leslie Groves' records – both at NARA [National Archives and Records Administration] and in the independent files of Groves biographer Robert Norris, who was helping me – that this vetting had taken place. That set off a whole new realm of research for me in terms of assessing Groves' position at that time, why he would have agreed ultimately to release the article, and how the administration and the War Department's aims had evolved. They had been suppressing information about the bombing since that previous August, but a year later, they were finding new utility for accounts of the nuclear aftermath in Hiroshima. And so that was huge.

I was also able to call up, through the Freedom of Information Act, documents from the War Department, CIA, and FBI, which detailed how they tracked Hersey when he was in Japan and their attitude toward Hersey after the reporting came out. I was especially curious to see the CIA and FBI records because I wanted to know if there had been any move to try to discredit Hersey after "Hiroshima" came out, because the reporting had embarrassed the government.

While it transpired that the FBI did investigate and question Hersey a few years later, in the McCarthy era, it doesn't appear from what was released to me that there were any immediate efforts to discredit him or his sources in Japan. The government took a different approach: downplaying. They largely ignored the story, and then, when it was clear that the furor caused by "Hiroshima" wasn't going to calm down, government officials put out their own counterpoint narrative, in an article in Harper's Magazine, asserting that the bombs had been necessary and trying to dismiss Hersey's revelations as sentimentality.

Robin Lindley: Thanks for your extensive research. I didn't realize that you found that new material on Groves’ review of the Hersey article. That was a coup. Congratulations.

Lesley M.M. Blume: That was me. I will not tell you what I yelled in the middle of that silent archive, but it's a miracle that they didn't kick me out.

Robin Lindley: What an incredible find. You write extensively about Hersey’s background. Could you say a few things about John Hersey for readers who may not know his work? 

Lesley M.M. Blume: Yes, absolutely. He's an interesting and unique protagonist for sure. Hersey in 1945 was 31 years old, movie-star handsome, and already a celebrated writer. He had been covering the war since 1939 for Time, Inc. Henry Luce, the head of Time, Inc., had been grooming him to take over the managing editorship of Time, but they parted ways because Hersey couldn't abide Luce's chauvinistic, hyperpatriotic views. Hersey was also a recognized war hero for helping to evacuate wounded Marines while he was covering battles between the Japanese and the Americans in the Solomon Islands. And he had won the Pulitzer Prize for his 1944 novel A Bell for Adano. Hersey was incredibly well known by the end of the war, and was living what seemed like a glamorous life. There were invites to the White House and he was mentioned in gossip columns. But he was not entirely comfortable being a public figure. He was the son of missionaries. He grew up in China. He was always a kind of outsider when the family moved back to the United States, even though he had a very celebrated life. He'd gone to Hotchkiss and Yale, where he was in the exclusive Skull and Bones society, but still, even when he was accepted among ultimate insiders, he always felt like an outsider.

Robin Lindley: And you write about Hersey’s view of the Japanese during the war. 

Lesley M.M. Blume: He had covered the Japanese during the war and, like most Americans, he had been outraged by Pearl Harbor and by the stories of Japanese atrocities in China and Manila, and he was appalled by the battles in the Pacific theater. He said later that he had personally witnessed the tenacity of Japanese troops; that's a word that comes up again and again when American military veterans and journalists of that period described the Japanese, whom they expected to fight down to the last man in the Pacific theater and in Japan, if it were invaded.

Robin Lindley: How did Hersey react to the atomic bombing of Hiroshima and then the second atomic bomb drop on Nagasaki three days later?

Lesley M.M. Blume: He was really quite appalled by the Nagasaki bombing. He was chagrined by Hiroshima, but he felt that it would speed the end of the war. But he thought that the atomic bomb used after Hiroshima was a war crime – a "totally criminal action," is how he put it later. He realized before most people the implications of humanity having violently entered the atomic age. He said to his editor at the New Yorker, William Shawn, that if humans could not see the humanity in each other – if they continued to dehumanize one another as they had during the Second World War – civilization had no chance of surviving the atomic age.

Again, Hersey had covered everything from combat to concentration camps during the war. He had personally seen how the Japanese had dehumanized the Americans and the Chinese, among others, and how the Germans had dehumanized practically everybody. And in the bombing of Nagasaki, he saw an active American dehumanization of the largely civilian population in Japan.

And so, he was able somehow to overcome his rage at the Japanese military to document what had happened to the civilian population who were the first humans in history on the receiving end of nuclear warfare. That was not a popular mindset, to go into Japan and say, I'm going to humanize this population for Americans – but Hersey was extraordinary in his perspective. 

Robin Lindley: Was it Hersey’s idea or Shawn’s to cover what actually happened on the ground at Hiroshima?

Lesley M.M. Blume: Hersey and his editor, William Shawn at the New Yorker, met for lunch at the end of 1945, when Hersey was about to do a big reporting trip to Asia. He was going to China, but from there, he planned to try to get into Japan. 

When he and Shawn were discussing Japan, they talked about the fact that the public had been shown in the press images and descriptions of the landscape destruction in Hiroshima, and pictures of the mushroom clouds.  But Americans had been seeing such rubble pictures of devastated cities around the globe for years, and the Hiroshima landscape photos didn’t seem that differentiated. And we can't forget that, when Truman first announced that the atomic bomb had been dropped on Hiroshima, he immediately cast it in conventional terms saying that the bomb was the equivalent of 20,000 tons of TNT. 

There was very little mention or reporting on what had happened to the human beings under those mushroom clouds, or on how these experimental bombs were unique, and this really disturbed Hersey and Shawn. For them, there was a suspicious and disturbing lack of reporting on the human consequences of the bombs – even though major American news operations had had bureaus in Tokyo since the earliest days of the occupation, or, at the very least, correspondents stationed in Japan.

Robin Lindley: What did Hersey sense that the government was hiding from the American people?

Lesley M.M. Blume: Hersey and Shawn knew something was going on about how the bombs affected humans. How could you have such a huge press presence, but have the hugest story of the war being under told or covered up? They decided that if places like the New York Times and the Associated Press and other big players either wouldn't or couldn't get that story, Hersey would try to get into occupied Japan and go to Hiroshima to investigate the story. 

Robin Lindley: Right after the bombing, General Groves said that the bomb was “a pleasant way to die.” That left the impression that tens of thousands of people died in a flash and they were merely statistics. But the atomic bomb continued to kill long after detonation. 

Lesley M.M. Blume: Yeah, that's exactly right. At first the administration and the occupying forces were reinforcing the narrative that the bomb was a conventional military weapon. A bigger piece of artillery, is how Truman would long characterize it. The U.S. government initially said that accusations of radiation sickness or radiation poisoning killing survivors were “Tokyo tales”—Japanese propaganda to create sympathy among the international community. 

Initially, there were a few original press accounts by Allied journalists who’d managed to get into Hiroshima and Nagasaki, during the earliest, chaotic days of the occupation. A couple came out of Hiroshima that indicated that a sinister new “disease X” was ravaging blast survivors there.  One account ran in the UP and the other in London’s Daily Express.  After that, another journalist tried to file a report to the Chicago Daily News from Nagasaki, confirming that a horrific affliction was killing off survivors there too. That report was intercepted by the Occupation censors under General MacArthur and supposedly “lost.”  The occupation forces clamped down on the foreign and Japanese press alike after that – and quickly. Those kinds of reports stopped coming out of Hiroshima – until Hersey got in. 

In the meantime, General Groves had personally spearheaded a PR campaign downplaying and denying radiation poisoning and portraying the bombs as humane. Privately, he and his team were scrambling to study the aftermath and aftereffects of the bombs, even as they publicly insisted that this aftermath was not so bad.

General Groves also commented, privately, during this time, that perhaps there was something about the composition of Japanese blood that was making them react especially badly to radiation absorbed into their bodies at the time of the bombing. That was an astonishing mindset. 

Robin Lindley: That's incredible. Hersey was cleared to go into Hiroshima for two weeks in 1946, and he collected information from survivors on the human consequences of the bomb and on how the damage to humans differed from that caused by a conventional bomb. And he chose to tell the story mainly through six survivors of the atomic bombing.

Lesley M.M. Blume: Yes. By the time he left Japan, he also had radiation studies that had been undertaken by the Japanese, and Japanese studies on the damage to the city.  He had initial casualty counts, and an initial study on how the bombs might have affected the earth and botanical landscape in the atomic cities. He even had the hospital blood charts of one of his protagonists. 

In his subsequent article, Hersey wrote in excruciating detail, not just about the minutes, hours, and days after August 6, 1945, but also about the eight or nine months that had passed by the time he entered Hiroshima. He wrote about how the atomic bomb kept on killing well after detonation. Several of the protagonists he profiled were critically ill, suffering from extreme hair loss, relentless fevers, total enervation, and vomiting, and were in and out of hospitals. Hersey was so detailed in recounting their experiences that there would be no denying, after his report came out, the true medical effects of atomic bombs. Never again could atomic bombs be billed either as a pleasant way to die or as conventional megaweapons.

This was a turning point, not just in America but around the world, and a wakeup call about the reality of nuclear warfare and what these bombs do to human beings. 

Robin Lindley: As you revealed for the first time, General Groves reviewed and surprisingly approved Hersey’s heart-wrenching account with only a few minor changes. Why did Groves approve publication of the story?

Lesley M.M. Blume: That was an astonishing revelation. By the time Hersey got into Japan in May 1946 and wrote his story that summer, General Groves was already anticipating a time when America would no longer have the nuclear monopoly and would need to prepare for a possible nuclear attack on our own population. Both he and General MacArthur were anticipating this future landscape, and saw studying Hiroshima's fate as a way to create an infrastructure here to prepare ourselves for nuclear attack. For example, they saw how Hiroshima suffered because all the hospitals were concentrated in the city center. Therefore, the U.S. should take care to spread its city hospitals out, so they couldn't all be taken out in one bombing. Hiroshima suddenly had enormous utility in terms of trying to figure out how to medically treat future survivors of nuclear attack. I came to realize that the U.S. military's and the government's policies toward, and uses for, the information coming out of Hiroshima had evolved significantly since the early days of ham-fisted cover-up and suppression of information about the bombings' aftermath.

But what really blew my mind was coming across the evidence that Hersey’s “Hiroshima” article had been submitted to Groves for pre-publication approval and vetting, and was approved. I was just trying to understand the mentality. 

A year after the bombing, the official approach to the narrative of Hiroshima and Nagasaki was becoming more nuanced. There were two developing considerations. First, we had to show the Soviets what we had. We still had a nuclear monopoly and wanted to keep them in their place. The more they saw us as a threat, the better. The Russians saw Hersey's report as propaganda and hated him and "Hiroshima" accordingly.

Second – and again – General Groves and others in the US government and military were anticipating a moment when we didn't have the nuclear monopoly anymore. And so, if Americans were reading "Hiroshima" and seeing New York or Detroit or San Francisco or Toledo, Ohio, in the place of Hiroshima, they might have thought, 'We need to ban nuclear weapons.' Which was the reaction that Hersey hoped for.

Or, they might think that we needed to build and maintain a superior arsenal, because someday the Soviets would get the bomb too, and likely others as well. And this was the thinking that helped set off the arms race. Leslie Groves, at that point in 1946, was already arguing that it was imperative for the US to maintain its nuclear advantage. He may have read Hersey's article in the most cynical way possible: as an unlikely way of drumming up public support for the continued development of a superior nuclear arsenal.

Robin Lindley: And Americans and people around the world were reading the Hersey article in the August 31, 1946 New Yorker, with its graphic descriptions of the ghastly medical and other human consequences of an atomic bomb attack. How do you see the reception and the influence of Hersey’s report? 

Lesley M.M. Blume: It wasn't a foregone conclusion that it was going to be well received because, when you think about the American attitude toward the Japanese then, most Americans hated the Japanese. They remembered Pearl Harbor and Nanking and Manila and the Pacific theater. They were bloody memories. 

When the article came out, Hersey left town. Maybe he feared for his life, because humanizing Japanese victims – people who had died in a hugely popular military victory – for an American audience was a dicey proposition, to say the least.

As it turned out, the impact of the article was instantaneous and global. People everywhere stopped to read this 30,000-word story – and even if they hadn't read it, they knew about it and were talking about it. A survey of the article's readers later revealed that the vast majority said that "Hiroshima" was not just fine reporting, but that it served the greater common good by revealing the truth about what had happened in Hiroshima and the truth about nuclear weapons. And, even if people weren't feeling sympathetic toward the Japanese victims, they were definitely seeing the perilous reality of the world in which they now lived: the atomic age. It was an enormously effective wake-up call.

The article was syndicated in its entirety in publications across the country and around the world. And it was covered on at least 500 radio stations in America. It was read over four consecutive nights in its entirety on ABC, and later on the BBC. Within a year, the article was translated into practically every language around the world, from Spanish to Hebrew to Bengali. It was even in braille. You can hardly imagine an article today getting this much attention or having this much of an impact.

Robin Lindley: I remember reading Hiroshima in book form decades ago, when I was in high school. I still recall the graphic depictions of the dead and the injured, the pain and suffering. The article must have had an especially strong effect on people who read it for the first time and didn’t know of the human toll of the atom bomb. 

Lesley M.M. Blume: Yes. And it was extraordinary that Hersey was able to get people to read it when there was little incentive to do so, because, again, it humanized the Japanese. And while there may have been morbid curiosity about what it was like under the mushroom cloud, at the same time it was hugely disturbing material. The fact that Hersey was able to get people to stop, and to bring the country almost to a halt for a few days after the article came out, was just an enormous and astonishing accomplishment.

One of the things that made the story unputdownable was Hersey's writing: he made it read like a novel, complete with cliffhangers between the testimonies of the six protagonists. It draws you in; you're totally engrossed. "Hiroshima" basically became mandatory reading for the reading public across the country and around the world.

Robin Lindley: And wasn’t Hersey’s innovative approach to the article perhaps a precursor of the New Journalism by telling the story of this historical atrocity through the eyes of several witnesses, rather than writing a straight journalistic account? 

Lesley M.M. Blume: The style and approach of "Hiroshima" were inspired, in part, by an earlier novel, The Bridge of San Luis Rey [by Thornton Wilder], which Hersey had read while he was sick in China before going to Japan. At that point, Hersey knew generally that he wanted to tell the story of the bombings from individual vantage points, but he borrowed an idea from Wilder's novel, which detailed the lives of a handful of people at the moment of a shared disaster.

In Bridge, those individuals all died when the bridge broke; in Hersey's story, it would be a handful of people -- everyday people -- whose lives intersected in real life, and who all experienced and survived the Hiroshima bombing together. Each of Hersey's protagonists is documented going about his or her morning routine on August 6, 1945, when the flash comes and their city and lives are destroyed. It differed widely from the other journalistic accounts that followed in the days after the bombing, which, again, largely cited clinical casualty statistics and described landscape devastation. But those accounts and that approach to the story of Hiroshima hadn't really penetrated the global consciousness, and just didn't hit on a visceral level the way Hersey's account did.

In terms of “Hiroshima” being a forerunner to the immersive approach taken by “New Journalists” – well, it’s sometimes cited as such, but Hersey really disliked the approach of people like Tom Wolfe and Norman Mailer and other later journalists who made themselves the center of their stories.  Hersey thought it was an awful and dangerous journalistic trend.  And if you look at “Hiroshima,” you’ll see that Hersey totally absented himself from that reporting: no opinions, no rage; the voice of the story is very nothing-but-the-facts, and intentionally so.  

Plus, Hersey did not personally promote “Hiroshima” and had a lifelong aversion to self-promotion.  He felt that his work should speak for itself.  He never put himself on center stage.  Although he did leave a lot of documentation behind for historians like me to tell his story much later.

Robin Lindley: I appreciate those comments on Hersey’s approach to writing. Your book also demonstrates that you have a gift for storytelling and lively writing as well as research. Who are some of your influences as a writer?

Lesley M.M. Blume: Well, thank you for the compliment. First of all, I have to say that I have a vicious editor who kept me on the straight and narrow, or the book probably would have been twice as long as it is. 

On specific influences, at the risk of sounding like a cliché, I've been greatly influenced by both of the men whom I've documented in my two main nonfiction books, Hemingway and Hersey. Both stripped down their writing to what was essential to the story. Hemingway's tip-of-the-iceberg storytelling approach is still so damned relevant, so important. Hemingway is more stylized, but Hersey's approach was honed with the New Yorker editors to a dispassionate recounting of fact. That has also been hugely instructive.

In terms of other major journalistic accounts that I've read that absolutely floored me, there was David Remnick's incredible account of the Bolshoi ballet when it was about to unravel. He reported on his protagonists just in their own words, but the characters were so outlandish and insane, and the cross-weaving of the hallowed Bolshoi history and the modern-day antics was unbelievable. It was written in a masterly way. Something that all of these writers have in common is telling a big story through individual characters.

Robin Lindley: It’s also obvious that, like Hersey, you care about the human story behind statistics and other facts when you're writing or researching a story. 

Lesley M.M. Blume: It's all-important, and I've always known it, but this project has really brought that home: it always comes back to the human story. I wrote an op-ed for the Wall Street Journal a few weeks ago about how Hersey's approach gives journalists today a tool for telling the story of other catastrophes, including the story of the pandemic. We're now over 200,000 deaths in this country -- more than three times the number of Americans who died in Vietnam -- and over a million global deaths. How do you deal with these statistics, how do you fathom the toll and the tragedy behind the numbers? It's relentlessly important to bring it down to the human lives behind this unfolding tragedy -- or any mass casualty situation.

For example, my favorite Hemingway book isn't The Sun Also Rises, which I documented in my earlier book, but rather For Whom the Bell Tolls, which captured the horror of a war that presaged World War II. In it, he depicted the interactions among individual people in a small town as that war unfolded, and the cruelties they inflicted on each other. If you can bring a story down to a handful of people who are experiencing a globe- or country-rocking event, then there's a better chance your readers will comprehend the enormity of the event. Ironically, the more granular and human-focused the account, the greater the comprehension.

Robin Lindley: That’s powerful advice for all writers. I also appreciated your quote toward the end of the book where you said “Nuclear conflict may mean the end of life on this planet. Mass dehumanization can lead to genocide. The death of an independent press can lead to tyranny and render a population helpless to protect itself against a government that disdains law and conscience.” That was powerful and heartfelt. We’re at a time when our free press is under threat and the administration is actively hiding information. Where do you find hope now?

Lesley M.M. Blume: In Dr. Anthony Fauci. As long as we can hear from him, we will get guidance on how to get through this time, and we'll have a sense of where we really stand. 

To be honest, this is a bleak moment. I have enormous trepidation in the lead-up to the election. Every day there's evidence that our society’s battle over information is basically the battle of our times. This battle will determine how things shake out for human civilization and the democratic experiment, not just for this country, but for all of the world. 

I try to remember that our ancestors stared down and overcame enormous existential threats, and I look to the World War II period not for hope, but for strength. Can you imagine being in London during the blitz, or being in that country just after Dunkirk, and having to find the strength to carry on? There were such dark moments during that conflict, yet there was an end.

Today, as then, we do not have the luxury of being exhausted or demoralized. You just have to see what is right, pursue it relentlessly, and try to find the energy to do so.

I’m trying to find pleasure in everyday things also. I have a young daughter who is smart and strong and hilarious. Being a parent is extremely motivating to keep fighting because, if you bring a human into this world, you damn well better try your best to be the best version of yourself, and help make the world as just as possible. 

I’m also reading a lot of “Talk of the Town.” And I'm doing an Alfred Hitchcock movie marathon, which has been fun and stylish. Quarantine stress briefly led me to consume a daily gin and tonic, but I’ve weaned off them because they’re too fattening. I'd like to maintain some semblance of a jawline. 

It’s discouraging that right now we go to bed each night and we don't know what is going to unfold the next day. But we have to remember that we're not the only humans who have felt that way, and we just have to fight because there's no other choice.  Exhaustion and surrender are not options. 

Robin Lindley: Thanks Ms. Blume for those words of encouragement and inspiration. Readers are sure to appreciate your thoughts and all the careful work you've done on this story. Thank you for this opportunity to discuss your work and congratulations on your groundbreaking new book Fallout on the intrepid John Hersey and his classic account of the bombing of Hiroshima.

 

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, NW Lawyer, ABA Journal, Re-Markings, Real Change, Huffington Post, Bill Moyers.com, Salon.com, and more. He has a special interest in the history of human rights, conflict, medicine, and art. He can be reached by email: robinlindley@gmail.com.  

Affirmative Action Goes Down to Defeat in Deep Blue California

To the chagrin of California’s liberal establishment, Proposition 16, which would have ended a 24-year ban on affirmative action, was rejected, 56% to 44%. This came despite the measure’s supporters outspending opponents almost 2-1 (some $20 million to $2 million) and Joe Biden carrying the state 66% to 33% over President Trump.

The ballot measure was supported by Governor Gavin Newsom, Senator Kamala Harris and many labor unions. Major donations to pass the ballot initiative came from a number of high-tech billionaires including Netflix founder Reed Hastings and former Microsoft CEO Steve Ballmer.

Pre-election polls showed that white voters strongly opposed the measure, Black voters strongly supported it, and Latino and Asian-American voters were split 50-50 on it.

According to Thomas Saenz, president of the Mexican American Legal Defense Fund and a co-chair of the “yes” campaign, many young Latino voters did not support it because “they have no personal experience” of affirmative action in their lifetimes.

The wording of the measure on the election ballot may also have contributed to voters’ opposition. The ballot wording stated that a “no” vote keeps Proposition 209 (the 1996 measure banning affirmative action) in place and confirms that “government and public institutions cannot discriminate against or grant preferential treatment to persons on the basis of race, sex, color or national origin.”

Ward Connerly, a Black businessman and former U.C. regent who was a leader of the “No on 16” campaign, said that proponents of affirmative action “have not persuaded the people that it is OK to discriminate against one group of people in the interest of trying to benefit another.”

Connerly added that “This was the political establishment going against the people. And the people kicked their butts.”

Affirmative Action Created in 1961

One reason that Black voters strongly supported it while Latino voters were ambivalent may be that affirmative action is strongly associated with specific efforts to assist African-Americans in the competition for jobs and education that began in the 1960s.

In a January 2020 article in The New Yorker, Louis Menand noted that the term was created in 1961 by an African-American lawyer, Hobart Taylor, Jr. Vice President Lyndon Johnson, who was to lead the President’s Committee on Equal Employment Opportunity, had asked Taylor for help in drafting goals for the new group. When Taylor suggested the term, it was quickly incorporated because it had a vague meaning yet sounded like it would “do something.”  Johnson also liked the alliteration.

According to Menand, affirmative action is a paradox because “we took race out of the equation only to realize that, if we truly wanted not just equality of opportunity for all Americans but equality of result, we needed to put it back in.”  

It also became associated with adopting racial quotas in education and employment. However, in the famous Bakke decision, the U.S. Supreme Court declared racial quotas unconstitutional.

Menand said that after Bakke, affirmative action programs had to avoid “any suggestion of the Q-word.” This meant that institutions could use “terms like ‘targets’ and ‘goals,’ both of which are constitutionally legit.” But they could not set a specific number for a “critical mass” of under-represented groups because that meant a quota. 

In the wake of Proposition 16’s defeat, the media looked for explanations. 

The Los Angeles Times coverage of the measure’s defeat reflected the state’s ambivalence. In a post-election editorial, the editors called the rejection of Proposition 16 “a shame” and a “missed opportunity.”

However, Gustavo Arellano, the paper’s Latino columnist, noted in a prominent column that it was no surprise that Latinos did not vote as a bloc for Proposition 16 in California or for Joe Biden in Florida, because there is no single Latino community.

“We were supposed to be the phalanx in the war against Donald Trump. An immovable mass of multihued tribes hurtling like an unstoppable force to smash white supremacy in the name of democracy,” he said, mocking white liberals’ false perceptions.

Arellano pointed out that there are many different Latino communities with varied interests, ranging from Cuban Americans in Florida to Puerto Ricans to third-generation Mexican Americans in South Texas.

He said the Democratic Party needed to stop taking Latinos for granted and should reject the assumption that “Latinos are mostly liberal with a few regional anomalies.”

The party will be making a mistake if Biden wins and Democrats go “back to patting Latinos on the back for a job well done,” he concluded. 

Can the COVID Crisis Create a New Civilian-Military Trust in Argentina?

Pirámide de Mayo covered with photos of the desaparecidos by the Mothers of the Plaza de Mayo in 2004

Photo WikiLaurent, CC BY-SA 3.0

It haunts us still. Almost forty years have passed since Argentines reclaimed democracy from their last military government. Yet, the spectre of dictatorship persists. Trials of alleged human rights abusers from the 1976-83 period continue apace. In late October, Argentina registered 30,000 COVID-19 deaths. On that occasion, to a torrent of social media admonitions, the journalist Eduardo Feinmann invoked the 30,000 civilians killed by the last dictatorship to compare what he views as the disastrous handling of the pandemic by the current Argentine government to the horrors of military rule. And in late August, former president Eduardo Duhalde warned that the current Argentine political and economic crisis might lead once again to a coup d’état. Then, in October, he declared that the country was “in a pre-anarchic process.”  

 

Argentine democracy is among the most vibrant anywhere, providing a dynamic forum in which the most pressing social and political problems are debated openly, from equal rights for LGBTQIA+ people to abortion rights. Why, then, is the country still making constant and pained reference to a dictatorship that ended almost forty years ago?

 

In part, the answer rests in an affliction common to many polities today, the growing divisions between political movements on the left and right, and the heightened alarm that has generated on all sides. But more important, despite dramatic changes in Argentine military cultures over the past forty years, people remain uneasy that the Armed Forces have found no post-dictatorship role in civic life. They remain caught in time, unable to shake their legacy of state terror.

 

It wasn’t always like that. A hundred and fifty years ago, the Armed Forces occupied a crucial place in the development of civil society. In 1870-71, Argentina faced an unprecedented public health threat. A Yellow Fever epidemic killed thousands. Under the direction of the federal government, and working with civilian public health officials, the Armed Forces successfully advanced a rapid and multifaceted response to the crisis. During the War of the Triple Alliance (1865-70), the Army public health service operated an extensive network of clinics and medical stations behind the front lines of its conflict with Paraguay. At the end of the war, that military presence and expertise in rural areas was quickly adapted to the spread of Yellow Fever with Army personnel treating large numbers of infected civilians. The Army sent physicians to Corrientes province to tend to those in impoverished villages without other access to medical care. The Navy closed Argentine ports to ships from Paraguay to stop the potential arrival of infected passengers. And from April 12 to May 15, 1871, the Army maintained a quarantine declared by civil authorities in Buenos Aires.

 

One of dozens of Army physicians sent to infected zones, Caupolicán Molina was both the principal surgeon of the Army and the chief of sanitary services for the city of Buenos Aires. Responsible for hundreds of patients, Molina died of the fever in 1871. Born in 1795, Francisco Javier Muñíz also represented the fluidity of interaction between civil and military authorities. As a disciple of the physician Cosme Argerich, founder of the Military Medical Institute, Muñíz graduated as a military surgeon and Army officer. He attended to the wounded during Argentina’s war with Brazil (1825-28) and during the War of the Triple Alliance. In 1828, based in the town of Luján (Province of Buenos Aires), he treated both soldiers and civilians in a peacetime setting. At the same time, he conducted geological and paleontological studies in the area, worked on a cure for Scarlet Fever, and wrote books on regional vocabularies and the ñandú (Patagonian Ostrich). In 1870, at seventy-five years of age, he volunteered to help combat the epidemic. A year later, while treating both civilian and military patients, he died of the disease.

 

The relationship between the Armed Forces and civil society changed dramatically at the end of the nineteenth century with the consolidation of the nation state. In contexts of social unrest and political uncertainty, the military began to intervene in politics, leading to the first modern coup d’état in 1930, which ushered in half a century of increasingly violent military overthrows of civilian rule. Beginning with the return to democracy in 1983, Argentines became firmer in their rejection of military intervention in governance as anathema to core principles of civil society and democratic rule. Pilloried since the 1980s for their abuse of power and crimes during the dictatorship, in the 1990s the Armed Forces saw their power base weakened by sharp salary reductions for officers and the privatization of weapons plants under their control. Today, legislation blocks the Armed Forces from undertaking internal security-related functions. Soldiers play an outsized role in United Nations peacekeeping missions and are trained in domestic and international human rights law. But figuratively and literally, except for rare ceremonial occasions, they have been confined to the barracks – until now.

 

Could the armed forces’ fight against the 1871 Yellow Fever epidemic, in a manner that dovetailed with civil society public health initiatives, serve as a model for a renewal of military functions in Argentine society? It bears noting that the Argentina of 1871 was no democracy and that while Armed Forces physicians were fighting the virus in rural clinics, the Army was launching a series of brutal military campaigns to destroy First Peoples.

 

But in the current pandemic, we have a glimpse into the parameters of the possible. Without great fanfare or reference to 1871, beginning in April 2020, Defense Minister Agustín Rossi has ordered the military into dozens of operations to mitigate the impact of COVID-19. In a massive campaign, Hercules aircraft transported emergency medical supplies to Córdoba province, carried Army nurses to Neuquén province, and delivered respirators to multiple destinations.

 

It’s too early to tell, but perhaps COVID-19 might have a positive impact over the medium term to balance, if only in small measure, the enormous destruction of the new coronavirus. The first vast military infectious disease-related initiative since the Yellow Fever epidemic may allow for imagining a reintegration of the Armed Forces and civil society, giving Argentina a renewed opportunity to come to terms with its past dictatorship while reaffirming its repudiation of longstanding militarist fantasies of power over civil society. 

Why the Odds Are Good that Biden Can Reduce Our Political Polarization

A few weeks ago HNN posted Joe Renouard’s “Post-Election America Will Still Be Deeply Divided.” Although he makes some good points, the odds seem good (at least to me) that president-elect Biden will be able to reduce our political divisiveness. This will be true whether or not the Republicans retain dominance in the Senate, which we will not know until January. After reviewing our history of conflict versus consensus, I’ll indicate six major reasons why I think Biden can reduce our present polarization.

The Civil War was our period of greatest conflict, but later periods of consensus also existed. In her Inventing the "American Way": The Politics of Consensus from the New Deal to the Civil Rights Movement (2008), Wendy Wall maintains that the roots of a “consensus culture can be found . . . in the turbulent decade that preceded U.S. entry into World War II.”  (For a similar conclusion about the period Wall identifies, see Robert Putnam on the “Great Convergence,” in The Upswing: How America Came Together a Century Ago and How We Can Do It Again (2020)).

Indeed, after three Republican presidents, beginning with Warren Harding’s election in 1920, the Great Depression helped elect the Democrat Franklin Roosevelt in 1932, a year in which much polarization existed. Robert Dallek in his biography of Franklin Roosevelt states that there “was the deep cultural divide between urban and rural Americans, or modernists and fundamentalists, as they were described.” Moreover, “rural folks who aggressively supported ideas and traditions largely in harmony with their established way of life” felt threatened by the growing dominance of the big cities and southern and eastern European immigrants who had come to the cities before the 1924 National Origins Act discriminated against them and Asians. 

But FDR’s resounding victory in 1932, when he carried 42 of 48 states, indicated that a consensus was beginning, that the old Republican preference for limited federal action was inadequate. In 1936, Roosevelt upped his success by winning in 46 states and helping Democrats increase their majorities in both houses of Congress. In 1940 and 1944, although his margin of victory declined from that of the 1930s, he still was easily reelected. 

Although Harry Truman proved less popular in 1948 than Roosevelt had been before his death in 1945, many historians believed that a general U.S. consensus remained. The major exception was over the question of civil rights, as southern segregation continued to blemish our nation.

The success of former General Dwight Eisenhower in 1952 and 1956 indicated to many that the broad U.S. consensus endured. As Scott Spillman noted in a 2017 article on conflict and consensus, historian Daniel Boorstin “represented perhaps the most unequivocally celebratory expression of what was known as the ‘consensus school,’ which dominated how a generation of historians after World War II thought about their country and its past.” And to Boorstin, Eisenhower, who once said “extreme positions are always wrong,” was “the living embodiment of . . . American consensus.”

But then, as historian Jill Lepore writes, “What no one could quite see, in 1960 [Eisenhower’s last full year in office], was the gathering strength of two developments that would shape American politics for the next half century. Between 1968 and 1972, both economic inequality and political polarization, which had been declining for decades, began to rise. . . . A midcentury era of political consensus had come to an almost unfathomably violent end. After 1968, American politics would be driven once again by division, resentment, and malice.”

In the last half-century various individuals and factors have increased polarization, among them former Republican House Speaker Newt Gingrich (1995-1999) and the conservative Tea Party movement, which emerged in 2009. Lepore also adds that Internet media feeds people only what they want to see and hear, and social media “exacerbated the political isolation of ordinary Americans while strengthening polarization on both the left and the right.”

Despite this half-century of increasing polarization there were some noteworthy exceptions. Most notable was the ability of Ted Kennedy, one of our nation’s most liberal senators, to work with conservative colleagues like Sen. Orrin Hatch of Utah to pass bills dealing with such matters as HIV-AIDS, children’s health insurance, and volunteer national service. After Kennedy died in August 2009, Hatch stated, “We can all take a lesson from Ted’s 47 years of service and accomplishment. I hope that America’s ideological opposites in Congress, on the airwaves, in cyberspace, and in the public square will learn that being faithful to a political party or a philosophical view does not preclude civility, or even friendships, with those on the other side. . . . We must aggressively advocate for our positions but realize that in the end, we have to put aside political pandering, work together and do what is best for America.”

While Ted Kennedy was still alive, an Illinois state senator named Barack Obama delivered the keynote address at the 2004 Democratic National Convention and called for overcoming Red-state-Blue-state divisions, for overcoming “those who are preparing to divide us.” Later, as president, Obama continued to signal his willingness to work in a bipartisan spirit. Unfortunately, however, this pragmatic president, temperamentally so well equipped to work with Republicans to achieve the common good, discovered little reciprocity from the likes of John Boehner and Mitch McConnell. John McCain, who died in 2018, was one of the few Republican senators who demonstrated a desire to lessen polarization.

By late 2020, after almost four years of President Trump’s angry polarizing effects, President-elect Joe Biden is following the example of Obama, under whom he served for eight years as vice president. Here are the six reasons I think he can reduce our polarization.

1. Reducing it is a major Biden goal. He made this clear a month before the 2020 election when at Gettysburg he stated that he wished to “revive a spirit of bipartisanship in this country, a spirit of being able to work with one another.” A New York Times essay a few days after the election stated “his principal theory of governance . . . [is] that compromise is good and modest progress is still progress.” At about the same time, in Delaware, with ballots still being counted, he said, “The purpose of our politics, the work of the nation, isn’t to fan the flames of conflict, but to solve problems, to guarantee justice, to give everybody a fair shot, to improve the lives of our people. . . . The vast majority of the 150 million [voting] Americans, they want to get the vitriol out of our politics. . . . It’s time for us to come together as a nation and heal. . . . My responsibility as president will be to represent the whole nation. And I want you to know that I will work as hard for those who voted against me as for those who voted for me. . . . We don’t have any more time to waste on partisan warfare.”

2. Biden already has won over some former Republicans and conservatives (see here and here) who prefer him to Trump, and coming from a much poorer background than Trump, he is well positioned to win over some former white working-class Trump supporters. In September 2020, CNBC posted an article entitled “Scranton vs. Park Avenue: Biden leans into his working-class roots to draw a contrast with Trump.”

3. Trump’s polarizing effects as the chief divisive flamethrower will gradually recede, especially after he leaves office in January. Although his false claims that the election has been stolen from him continue to rile up some of his most diehard supporters, others who voted for him think he is behaving too much like a sore loser. Trump’s continuing refusal to concede will just alienate some additional supporters--maybe not many, but at least some. In addition, as a lame-duck president his influence over Republican congresspeople and media outlets like Fox News will dim, his reelection failure reducing his power over both them and their constituents. (This does not mean Trump will, to paraphrase Dylan Thomas, go gently into the good night. It is possible he will continue to battle to remain a major shaper of the Republican Party.) 

4. The first major issues Biden will need to face -- “COVID, the economy, racial justice, climate change,” the four he mentioned in his above-cited Delaware remarks -- are ones amenable to bipartisan compromise. As COVID cases and deaths continue to increase in Trump’s lame-duck presidency, his pandemic leadership failures will become even more apparent. More previous coronavirus skeptics will have to admit, albeit grudgingly, that Trumpian measures were inadequate and a better federal response is needed. Some Republicans, like Maryland Governor Larry Hogan, have already acknowledged this. Their number will increase. Enough bipartisanship existed to pass a first economic stimulus package. Given the public mood, a second package seems inevitable, either before or after Biden’s inauguration. Because Republicans are much less likely than Democrats to acknowledge existing racial injustices, steps to improve this situation are likely to prove more contentious. But, as Hannah Arendt once observed, truth possesses a stubborn staying power. Slowly, way too slowly for many, racial discrimination has decreased. Black Americans are no longer slaves, no longer overwhelmingly denied the vote in southern states. Martin Luther King Day is a national holiday. Regarding climate change, largely caused by humans, there can be no compromise on the need to address it, and to do so vigorously. And most Americans, even if they won’t admit that Trump’s environmental policies were disastrous, recognize that they were inadequate (see, e.g., the Pew poll of mid-2020 that found that two-thirds of those polled thought our nation should be doing more regarding climate change).

5. There is a great longing among the electorate for reducing polarization. As Biden said four days after the election, he believed he had a “mandate from the American people. They want us to cooperate.” After recognizing all the polarization that the election revealed, a post-election Pew analysis concluded: “Voters across the political divide . . . want the next president to govern in a unifying way. In October, 89% of Biden supporters and 86% of Trump supporters said their preferred candidate should focus on addressing the needs of all Americans, even if it means disappointing some supporters. Only around one-in-ten in both camps said their candidate should focus on the concerns of those who voted for him without worrying too much about the concerns of those who didn’t.”

6. Both Republicans and Democrats, conservatives and progressives, can agree on the need (as Joe Biden expressed it) “to put away the harsh rhetoric. To lower the temperature. To see each other again. To listen to each other again.” More than a century ago the Republican Theodore Roosevelt stated something similar with the words “Neither our national nor our local civic life can be what it should be unless it is marked by the fellow-feeling, the mutual kindness, the mutual respect, the sense of common duties and common interests, which arise when men take the trouble to understand one another.”  

The six reasons detailed above do not ignore all the roadblocks and possible impediments ahead. But my title only states that the “Odds Are Good that Biden Can Reduce Our Political Polarization.” Dial it back, not completely end it. As our president-elect said, cooperating or not is “a choice we make. And if we can decide not to cooperate, then we can decide to cooperate.”

 

Editor's Note: We echo Professor Moss's encouragement to read Joe Renouard's essay on the climate of polarization President Biden will likely encounter. 

Who Will Form the Biden Cabinet?

Joseph Biden with former National Security Adviser Susan Rice

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.

With the victory of President-Elect Joe Biden, it is time for speculation on the potential membership of the 46th president’s cabinet.

The US Senate must consider all cabinet appointments. The reality is that the Democrats, at their most fortunate, might have a 50-50 Senate with Vice President Kamala Harris constituting a tiebreaking vote. More likely is a Republican Senate majority after the special elections for the two Georgia Senate seats are decided on January 5, so the selection process must consider realities.

Any nominee perceived as too far to the left will inevitably come to grief in a Senate vote, and any nomination of a current Democratic Senator to the cabinet must consider how the new Senate vacancy would be filled, and by whom.

So the reality of having to negotiate with the Republican opposition, and avoid alienating far left Democrats, will require a balancing act and cautious steps on the part of President Elect Biden.

I propose this cabinet as one that is ideologically balanced and politically achievable.

Secretary of State---Former National Security Advisor and United Nations Ambassador Susan Rice deserves the step up to the State Department. Rice was a finalist in the vice presidential selection process, and gets along with and knows Joe Biden better than almost anyone who worked with him during the Obama years. She will be excellent in restoring a sensible, rational foreign policy and promoting stronger ties with NATO and democratic allies around the world, while being capable of dealing with authoritarian governments, including Russia, China, Iran, and North Korea, in a tough and reasonable manner.

Secretary of the Treasury---Former Ohio Republican Governor and Congressman John Kasich was head of the House Budget Committee from 1995-2001, has good relations and friendship with Biden, and also knows how to deal with Wall Street and powerful corporations. He would be an excellent choice. While left-wing Democrats would be furious, emphasizing the need to rein in Wall Street influence, in realistic terms, the new president will not want to go to war with Wall Street, while working on policy goals including raising taxes on the wealthy and reversing deregulation and other actions taken by the Trump administration. Every Democratic president since Franklin Roosevelt has had one Republican in his cabinet, and since Kasich was against Trump from the beginning and has real credibility and substance, his appointment makes sense.

Secretary of Defense---Illinois Senator Tammy Duckworth, a double-amputee Iraq War veteran, has already served in the department of Veterans Affairs as an Assistant Secretary under Obama. Duckworth has a reputation of being tough minded and capable of meeting the challenge of updating and reorganizing the Pentagon. Since the Governor of Illinois is a Democrat, it would be safe for Duckworth to leave the Senate.

Attorney General---Outgoing Alabama Senator Doug Jones impressed everyone with his prosecution of Ku Klux Klan members involved in the infamous Birmingham Church bombing in 1963, securing indictments and convictions in 2001 as US Attorney in Alabama. And Jones has made an excellent impression on civil rights and civil liberties matters in his three years in the US Senate, a stark contrast to decades of Alabama senators.

Secretary of the Interior---Washington State Governor Jay Inslee came across very impressively as a Democratic Presidential 2020 contender, and was the most outspoken of all the primary challengers on environmental matters.  His selection would inspire those who wish to reverse the tremendous damage done by the Trump Administration.

Secretary of Agriculture---Ohio Congressman Tim Ryan, another 2020 contender for the White House, would be a good fit as a Midwesterner with concerns about the difficulties that face farm communities in a time when trade and tariff policies can undermine agricultural interests.

Secretary of Commerce---Businessman Tom Steyer, another contender in 2020, made a good impression during his Presidential campaign, and would work to promote commercial revival from the terrible pandemic the nation is suffering through.

Secretary of Labor---Former Massachusetts Governor Deval Patrick, also a 2020 contender, would be a good fit to deal with the problems of working men and women after two distinguished terms as governor.

Secretary of Housing and Urban Development---Former Georgia gubernatorial candidate Stacey Abrams would really make a commitment to improvement of urban life and improvement of housing, an essential need for all Americans.

Secretary of Transportation-- Former Housing and Urban Development Secretary and San Antonio Mayor Julian Castro, also a 2020 presidential contender, would be eager to work on promotion of infrastructure, which is desperately needed but has been ignored by the Trump Administration despite frequent promises of an “Infrastructure Week.”

Secretary of Health and Human Services---New Jersey Senator Cory Booker, also a 2020 presidential contender, would work to promote an expansion of the Affordable Care Act, and his seat would be filled by the Democratic Governor of his state, not changing party control.

Secretary of Education---Colorado Senator Michael Bennet, also a 2020 presidential contender, was Superintendent of Denver Public Schools, and his seat would be filled by the Democratic Governor of his state.

Secretary of Energy---Former Texas congressman Beto O’Rourke of El Paso, also a 2020 presidential contender, would fit this post in an exceptional way with his knowledge and background about sources of energy, with Texas a crucial state in that regard.

Secretary of Veterans Affairs---Former South Bend, Indiana mayor and presidential contender Pete Buttigieg served in the military, and has a real commitment and emotional involvement with issues veterans face.

Secretary of Homeland Security---Former Massachusetts Senator, Secretary of State, and 2004 Democratic Presidential nominee John Kerry, with his military experience, and his involvement with foreign policy and diplomacy over many years, would be the perfect candidate to work to improve American national security.

United Nations Ambassador---Andrew Yang, another 2020 presidential contender, with his business and personal skills, would be a very good representative of the need for greater interaction and cooperation with that international organization, restoring the faith and importance of international diplomacy. 

This combination of talented people would include ten former presidential contenders from 2020; a former presidential nominee; five people who served in the US Senate; three people who have served as governors; three who have served as mayors; two who have served in the House of Representatives; two businessmen; one prominent Republican; and two well qualified diplomats.

Additionally, this 16-member Cabinet would include 7 members of color; 9 white members; 13 men and 3 women; 2 who are Jewish; and 1 gay member. So overall, it would be an extremely talented Cabinet of great expertise and would be able to help President Biden and Vice President Kamala Harris achieve their goals for the nation!

Roundup Top Ten for November 13, 2020

Kamala Harris Shows Women Can Thrive In Politics Doing Things Their Own Way

by Kimberly A. Hamlin

Kamala Harris's candidacy shows a new path for women in public life: being judged as an autonomous human being, rather than as a wife or mother. This will be a radical change if it sticks. 

 

I Was a Detroit Poll Challenger. The GOP Came to Make Havoc.

by Danielle L. McGuire

"The counting hadn’t even started yet; the ballots hadn’t yet arrived at the counting boards and already it was clear that the GOP challengers were there to sow confusion and suspicion."

 

 

Will the Trump-Biden Election Disaster Finally Convince Us to Scrap the Electoral College?

by Kevin M. Kruse

Abolishing the Electoral College isn't a radical idea. It had bipartisan support in the 1960s as a reform consistent with the Supreme Court's rulings that established "one person, one vote" as the core principle of representation in a constitutional democracy.

 

 

Millions of Americans have Risen Up and Said: Democracy Won't Die on Our Watch

by Carol Anderson

Every maneuver by Trump and his enablers to block voting was met with a more powerful and effective counter-maneuver by citizens. It had to be.

 

 

The Keystone State is Ringing

by Ed Simon

“Far more capable tyrants than Trump have been felled by Pennsylvania. This vanquishing feels like George Meade turning back Pickett’s Charge at Gettysburg.”

 

 

Voting Trump Out Is Not Enough

by Keeanga-Yamahtta Taylor

The results of the 2020 election show that the Democratic Party will fail unless it is willing to abandon a futile effort to woo Republicans to the center and embrace popular policies that meet the needs of Democratic constituents. 

 

 

Black Feminists Taught Democrats To Go Broad And Win Big

by Erica R. Edwards and Sherie M. Randolph

Since the late 1960s, Black feminist activists have viewed grass-roots participatory democracy as means of radically reversing systemic inequalities by enfranchising the disenfranchised and engaging the people who are routinely seen as politically untouchable in debate and consensus-building.

 

 

Although Now Required by California Law, Ethnic Studies Courses Likely to be Met with Resistance

by Nolan L. Cabrera

A scholar who studies racial dynamics on college campuses argues that the benefits of required ethnic studies courses outweigh their liabilities.

 

 

Measuring the Health of Our Democracy

by Heather Cox Richardson

Donald Trump saw the fading of his power to control political narratives as news organizations labeled his charges of election fraud as baseless. 

 

 

Trump's Latest Executive Order is a Head Scratcher to Historians

by Jim Grossman

"There is no shortage of contentious publications and conversations among professional historians about concepts like critical race theory or arguments like those advanced in the 1619 Project. But neither constitutes “child abuse,” which is a serious crime."

 

How Two French Introverts Quietly Fought the Nazis

Whenever Lucy Schwob and Suzanne Malherbe went into town to do some shopping, they also snuck messages to the Nazi occupation forces.

 

Suzanne pulled a small note typed on a piece of thin colored paper from inside the pocket of her Burberry overcoat and stuck the message onto the windshield of a German staff car.  Lucy gingerly placed another on a cafe table as they walked down the street.  Working together, they tucked one inside a magazine for a soldier to discover.  Sometimes, Suzanne snuck up alongside a German and, with a trembling hand, slipped a note into his pocket knowing that a bump or a misstep could lead to time in a prison camp.  

 

In 1937, Lucy and Suzanne had left Paris and moved to the island of Jersey, one of the British Channel Islands just off the coast of Normandy. Both were in their late 40s and ready to start a new chapter of life on the beautiful island after living for so long at odds with the world around them. They fell in love as teenagers, but being lesbian partners in a conservative turn-of-the-century France was sometimes painful. Lucy’s father’s family was Jewish at a time of growing anti-Semitism. By the 1920s, they achieved some success in Paris as avant-garde artists; they are known today for their Surrealist photography. The new gender-neutral artistic names Claude Cahun (Lucy) and Marcel Moore (Suzanne) allowed them to create new identities that crossed the boundaries between masculinity and femininity. They associated with communists and flirted with radical politics.

 

But Lucy and Suzanne were always the quiet sort.  Although they opened their apartment to creative friends and socialized in Parisian cafés, they remained most closely connected to one another.  When Paris became too noisy and politically polarized, the women decided to leave it all behind.  

 

The Nazis took control of the Channel Islands in July 1940.  This archipelago in the English Channel was the only British soil the German army ever conquered.  It was also strategically important because it became the leading edge of Hitler’s “Atlantic Wall,” a series of fortifications along the European coastline designed to prevent Allied attack.  No dissent could be tolerated there.

 

Working together in the dim light each evening, Lucy and Suzanne wrote subversive messages by pushing cigarette paper, pages torn out of a ledger, or scraps they found on the side of the road into their Underwood typewriter.  They tapped out songs and imaginary dialogues designed to undermine morale.  Sometimes they made fun of Nazi leaders with bawdy jokes or included a simple summary of forbidden BBC news reports.  Written in German in the voice of an anonymous soldier, the notes proclaimed that the troops would pay the ultimate price for Hitler’s futile war.  

 

Then Lucy and Suzanne crammed the notes down deep into the pockets of their overcoats and headed out on another mission.  They always worked alone.

 

As the war escalated, the women became more extreme. Lucy snuck out at night and dashed through the cemetery near their house to place crosses bearing the pacifist message “For him the war is over” onto the graves of soldiers who had died during the occupation. Inside a church where soldiers worshipped, they hung a banner claiming that Hitler believed he was greater than Jesus.

 

Back in Paris, Lucy had formulated a vision of resistance she called “indirect action.”  Art that tried to be revolutionary would not work, but a thoughtful poem, provocative story, or surprising photograph could have a profound effect on its audience, burrowing inside the mind and germinating into a new perspective.  This was not propaganda, she claimed, but an attempt to challenge people to think.  It was the introvert’s way of fighting.

 

The German secret police hunted them for four years, and during their trial, the chief judge told them why. He accused them of being guerrilla warriors who were more dangerous than soldiers. “With firearms,” he claimed, “one knows at once what damage has been done, but with spiritual arms, one cannot tell how far reaching it may be.” It was a perfect summary of Lucy’s “indirect action.”

 

These notes might have seemed small and insignificant, but they demonstrated that the Nazis could not colonize the human heart.  And Lucy and Suzanne’s story shows that quiet, persistent rewriting of the narrative of oppression can be a powerful means of fighting back.

There is Nothing Sacred About the Military Vote

Outgoing Military Absentee Ballots, Camp Arifjan, Kuwait, 2008.

More than half of states allow military absentee ballots received after election day to still be counted. On social media on the night of October 30, in response to the 8th Circuit’s decision to bar Minnesota from counting ballots received after election day, Senator Brian Schatz asked a valid question: would this ruling lead Minnesota to “invalidate a bunch of overseas military ballots?”  

 

Some commenters assumed that Republicans would find a way to allow military votes while disqualifying others, but don’t count on it. Parties comfortable with using extraordinary measures including disfranchisement to win elections won’t be swayed by appeals to their conscience; only mass voter turnout combined with national outrage have proven effective.  

 

The first American soldiers able to vote while deployed were California residents deployed during the Civil War. They helped push Republican Abraham Lincoln to reelection in 1864. During World War I, several states legislated absentee balloting for military personnel, while the U.S. military issued general orders to help deployed servicemen vote absentee if their states permitted. Several states did not establish absentee military balloting though, while others deliberately manipulated the soldier vote in an attempt to retain power. 

 

In 1918, Oklahoma’s anti-suffrage Governor Robert L. Williams joined Election Board Chairman W.C. McAlester to ensure the defeat of a state woman suffrage amendment. For an amendment to pass under Oklahoma’s 1910 Silent Voter Clause (passed the same year as its infamous Grandfather Clause), it had to have more yeas than nays and abstentions combined; an Oklahoma voter who did not vote on the amendments effectively voted against them.

 

Anti-suffragists argued that woman suffrage would be insulting to WWI servicemen: “[S]hould woman suffrage carry in Oklahoma, those [soldier] boys will return to find as their reward that their ballot is worth just one-half as much as when they went away.” But the antis proved to be nothing more than fair-weather friends to the soldiers and their ballots. 

 

After using the 1918 flu pandemic as an excuse to prevent suffrage talks, and after a failed attempt to keep the suffrage amendment off the ballot on a technicality, anti-suffrage politicians came up with a fairly ingenious idea. State Election Board Chairman McAlester and the Election Board printed the suffrage amendment on a separate ballot to be sent alongside the original ballots to the polls, but they only printed half the number needed. Half of Oklahoma voters would not receive a ballot containing the amendment and would be counted as voting “no” on suffrage because of the silent vote clause. Local election officials learned of the discrepancy and printed the needed ballots, but, not to be outwitted entirely, McAlester and the Election Board had ballots purposely printed without the amendments sent to military training bases in Oklahoma. In doing so, they effectively voted “no” on behalf of every serviceman in the state -- all 4,197 of them. Parties willing to disfranchise don’t consider military ballots off limits.

 

Despite these tactics, Oklahomans approved the suffrage amendment, but the Elections Board ordered a recount based on sealed returns, including mutilated and spoiled ballots that would increase the silent votes. When that failed, anti-suffragists filed a formal protest against certification of the referendum, provoking a national response and a torrent of letters and telegrams. Finally, on November 14th Governor Williams admitted the suffrage amendment had passed, though the Elections Board still refused to certify the election. Voter turnout and national outrage had secured the results. 

Oklahoma’s neighbor to the south offers another example of soldier disfranchisement. Texas completely disfranchised military personnel for the length of their service through 1954, and it did so by accident. In its original 1845 constitution, Texas sought to prevent military service in the state from making one eligible to vote. Instead, the poorly phrased clause they used disfranchised all servicemen (and later women) in the state for the length of their service. There is nothing sacred about the military vote in American history.

Texas enacted the poll tax in 1902 to bar Black and poor white voters, but veteran disfranchisement was an unintended consequence of the law. The poll tax had to be paid each year by February 1st; paying it after that deadline still left a voter disfranchised. At the conclusion of WWI, military personnel who were discharged and returned to the state after February 1, 1919 remained disfranchised for the year. 

Disfranchising efforts often have unintended consequences. Texas politicians were not concerned until the war ended shortly before a May 1919 referendum on a state woman suffrage amendment requiring voters to be citizens. Seeking voters to help their cause, politicians passed a law allowing returning soldiers to vote on their discharge papers. However, they didn’t have the votes to enact it without a waiting period, meaning it wouldn’t go into effect until after the May election. In what became a public relations nightmare, moderate reformer Governor Hobby vetoed the soldier-voting bill after the state Attorney General advised that it was unconstitutional. Furious wounded soldiers at Fort Sam Houston refused to participate with Governor Hobby in a scheduled parade to advertise the Liberty Loan. In the end, the soldiers reluctantly participated, using the phrase “in spite of any Texas governor” as their slogan.

Adding to the chaos, a curious set of editorials by conservative judges began to appear arguing that the poll tax was never intended to bar veterans from voting. The editorials, encouraging what was, if not illegal, then certainly extralegal voting, convinced progressive groups to lobby for a new soldier-voting bill. Texas suffrage leader Minnie Fisher Cunningham privately admitted that she feared the editorials were intended to ensure lax enforcement of election laws in preparation for bringing Mexican President “Carranza’s Army” into South Texas to illegally vote against suffrage. Acting primarily out of fear that voter fraud would be used against them and in response to public outcry, rather than a concern for returning soldiers, Governor Hobby called the legislature into special session. He asked for a bill to enfranchise discharged soldiers, “prevent the slacker or imposter” from dressing as a soldier to illegally vote, and standardize the rules in “each and every county…” The successful bill went into effect less than two weeks before the election. 

 

A party bent on winning by any means necessary will not stop short of disfranchising the military or manipulating their votes. Only massive voter turnout and national pressure forced state officials to accept the Oklahoma election results. Only their own interests and public outcry convinced the Texas legislature to enfranchise WWI veterans. The military vote has not been well-protected historically, and laws and campaigns intended to disfranchise one group can easily disfranchise others or trigger reactions that those in power never anticipated.

 

And lest we assume that manipulating military votes is a phenomenon of the distant past, take the case of Val Verde, Texas. In 1997, when 800 military absentee votes combined with low voter turnout secured narrow Republican victories over two incumbent Latinos, Democrats accused Republican strategists of recruiting military voters, some of whom hadn’t resided in Val Verde in decades. Republicans accused Democrats of wanting to disfranchise the military. As many of the military voters hadn’t lived in the area for years, they were likely unaware that one of the Republican candidates, whom they helped elect in a majority-Latino county, was a former leader in the Ku Klux Klan. The larger story over Texas’s lack of residency requirements for military absentee ballots and possibly manipulating military voters themselves never gained traction, though one of the candidates was forced to decline his seat when photos of him in Klan regalia were made public.

 

While appeals to conscience alone won’t stop parties bent on manipulating or denying ballots, including military ballots, mass voter turnout and public outrage have proven effective. Americans have already taken care of the former, turning out during early voting in numbers heretofore unheard of. Now we must keep a watchful eye on efforts to throw out or manipulate those ballots, and, when needed, let our voices be heard once again. 

Will Trump's Last Fight be Against Howard Zinn (and America's History Teachers)?  

 

President Trump’s speech at his recent White House Conference on American History slandered both history teachers and historians whose writings foregrounded the role of racial and class conflict in American history. Without evidence, he accused teachers of promoting a “twisted web of lies in our schools,” indoctrinating their students in a version of our nation’s history that led them to hate America. This was, he claimed, “a form of child abuse in the truest sense of those words.” Referring to the mostly non-violent nationwide Black Lives Matter protests against racist police violence as “left-wing mobs” fomenting “violence and anarchy,” Trump charged, again without evidence, that this “left wing rioting and mayhem are the direct result of decades of left wing indoctrination in our schools.” The first source of this alleged indoctrination named by Trump was the late Howard Zinn, author of the best-selling A People’s History of the United States (1980), whom he depicts not as a historian but as a propagandist: “Our children are instructed from propaganda tracts, like those of Howard Zinn, that try to make students ashamed of their own history.”

This presidential tirade against history teachers and Howard Zinn demonstrates Trump’s ignorance of both. American history teachers in our nation’s schools do not indoctrinate their students; they educate them via state-mandated curricula and textbooks that are at least as respectful of the Founders of our republic as any mainstream politician, and they promote democratic citizenship, not mob violence. Howard Zinn’s People’s History of the United States is a work of history, not propaganda. Zinn’s book does stress the role of racial and class conflict and militarism in the American past, and so is considered too radical to be adopted as an official textbook by most school districts. 

Trump gets the reality of school history education completely backwards. Far from being radical or so provocative as to seed leftist mob violence, most history instruction is actually too conservative, so lacking in controversy that it too often leaves students bored. This is why innovative teachers bring Zinn’s work into schools, aiming not to indoctrinate but to engage students in authentic historical thought via debates contrasting select chapters from Zinn’s iconoclastic history with their official, conventional US history textbooks. There are in Howard Zinn’s papers at NYU hundreds of letters from high school students and teachers attesting that history came to life in their classrooms when they used the competing interpretations offered by Zinn and their textbooks to argue about Columbus’ bravery as an explorer vs. the brutality of his conquests, Andrew Jackson’s democratic politics vs. his responsibility for the Trail of Tears, whether the Mexican-American War was an unjust US war of aggression, whether the atomic bombing of Hiroshima and Nagasaki was necessary to end World War II, and other important, thought-provoking debates about American history. 

Trump’s charge that Zinn makes students feel “ashamed” of their history is false. Zinn had no interest in fostering shame; rather, he wrote his People’s History to promote critical thinking, especially about class, racial, and gender inequality and war-making in America. Zinn sought to arouse in his readers a quality Trump is famed for lacking: empathy for the oppressed. Or as Zinn put it, “in a world of conflict, a world of victims and executioners, it is the job of thinking people, as Albert Camus suggested, not to be on the side of the executioners.” Zinn so admired people’s protest movements in their struggles for a more just and democratic America that his People’s History valorized those movements. In fact, the book offered dramatic accounts of Americans resisting oppression, a narrative that was at times inspiring, promoting not shame but pride in this ongoing and unfinished struggle for social justice. 

It is absurd to think that the president read Zinn’s 688-page People’s History before knocking it, or that he would do so now to see how wrong he is about Zinn. But he could get to know Zinn better without reading a thing. 

Zinn’s People’s History inspired theater events and then a movie, The People Speak, in which famed actors such as Danny Glover, Marisa Tomei, Viggo Mortensen, and many others read the words of great American dissenters, from Frederick Douglass to Emma Goldman to Martin Luther King, Jr. This movie, when aired on the History Channel, attracted some nine million viewers, leaving reviewers impressed with the eloquence of America’s radical social activists. Those speeches, and the protest music sung by Bob Dylan, Bruce Springsteen, Randy Newman, and John Legend, fostered greater understanding of the role of dissent in the American past. Had this movie been screened at the White House Conference on American History, Trump’s talk about Zinn making students feel ashamed of their history would have been refuted right before his eyes.

Like Howard Zinn, the best history teachers recognize that if students are to be prepared for democratic citizenship, they must learn to think critically about the world, and confront the reality that, as James Baldwin put it, “American history is longer, larger, more various, more beautiful, and more terrible than anything anyone has ever said about it.” Serious study of our nation’s past involves understanding not only America’s inspiring democratic ideals but its failures to live up to them, and studying the historical struggles waged on behalf of those ideals. This kind of critical reckoning with historical reality is far more accurate, humane, and true to our democratic faith than the flag-waving, propagandistic version of the American past touted at the White House Conference on American History. 


The End of an Era? Athens After Empire


Athens: the most powerful city in ancient Greece; the birthplace of democracy; home to the great tragedies of Aeschylus, Sophocles, and Euripides and to philosophers such as Socrates, Plato, and Aristotle; brimming with large monuments and temples; people packing into the Agora to do business and discuss current affairs; life there anchored in freedom and autonomy; a great imperial power. This is the Classical-era city that resonates with us the most when we think back to ancient Greece.

 

But what was Athens like after the Classical period? Alexander the Great’s father Philip II had imposed Macedonian rule over Greece in 338 when he defeated a coalition Greek army at Chaeronea (in Boeotia). That rule continued for almost two centuries until Macedonia fell to the new power of the Mediterranean, Rome, which made Greece part of its empire in 146 (dates are BC). During the period commonly called “Hellenistic,” from Alexander’s death in 323 to Rome’s conquest of Egypt in 30 (which gave Rome mastery of the entire Mediterranean), Athens is normally viewed as a sad relative of its Classical self – subjected to outside rule; its democracy curtailed; its military next to nothing; its economy shattered; its population cowed; its intellectual and cultural life stifled; even aspects of its culture appropriated by the Romans for their own needs.

 

That dreary picture of decline and fall is wrong.

 

It’s wrong, as I show in my Athens After Empire, because of the chronological parameters imposed on “Hellenistic” Athens. The problem with periodization – breaking up a long historical period into a series of distinct smaller ones – is that the beginning and end points chosen for these smaller stages are not necessarily the right ones. Scholars commonly divide Greek history into three broad eras: Archaic (750-478), Classical (478-323), and Hellenistic (323-30). Yet these dates are arbitrary – why, for example, should the Persian Wars (490-478) end the Archaic period rather than start the Classical, when they heralded a new era of warfare, not to mention political, social, and cultural changes throughout Greece? After all, the Greeks did not have these terms, and they thought about history differently.

 

The term “Hellenistic” was devised in 1836 by the German historian Johann Gustav Droysen, who saw the three centuries from Alexander the Great’s death (323) to Roman rule in the east (30) as characterized by the spread of Greek language and culture there (thanks to Alexander’s conquests) – Hellenism. Yes, there was this spread, as Alexander’s campaign in Asia opened up new social, cultural, and economic contacts as never before. But this terminal date implies that history somehow ended then for all concerned. It did not, nor did the spread stop.

 

Within this “Hellenistic period” there were multiple independent kingdoms and areas going about their daily business, interacting and going to war with each other until Roman imperialism put an end to that. Yet the Romans conquered those places and made them part of their empire at different times – Greece in 146, for example, the Seleucid Empire (Syria) in 63, and Egypt in 30, when Cleopatra VII’s suicide ended the last Hellenistic dynasty, Ptolemaic Egypt. 

 

The year 30, then, was a significant one for Rome – and for Egypt, for that matter – as Rome could now boast mastery over east and west. But what about Greece, especially its leading city Athens, which by 30 had been in the Roman Empire for over a century? After the battle of Actium in 31, when the Roman admiral Agrippa defeated Antony and Cleopatra, Octavian (later Augustus) visited Athens. He was now ruler of the Roman world. He brought gifts of needed grain to the city and to the Greeks, and made peace with them. 

 

Yet the only novelty stemming from Actium was a new Roman master, another in a line that went back to the likes of Julius Caesar and more recently Mark Antony (who had twice lived in Athens). Octavian foisted no new constitution on Athens, no new taxes, no new anything affecting daily life. His actions also followed a trend, for Caesar had forgiven the Athenians for their support of Pompey, as had Antony for their support of Brutus after Caesar’s assassination, and like Caesar Octavian had bestowed gifts on the city. 

 

It is hard to imagine, then, that the Athenians (or the Greeks) would have seen Octavian’s actions, and Egypt’s fall to Rome a year later, as the end of an era for them. Why? Because there was minimal change but maximum continuity. It is that continuity that leads me to challenge the accepted terminal date of 30 for “Hellenistic” Athens and to extend it an entire century later, to the emperor Hadrian in AD 132. In the process I show the pitfalls of periodization and bring Athens out of the shadow of its more famous predecessor.

 

Why Hadrian? Throughout the collapse of the Roman Republic and the first century of the emperors, Athens was still the dominant city in Greece. It remained a cultural and intellectual center, for philosophy and rhetoric among other things, attracting Romans to visit and study there. In AD 132 Hadrian drew on Athens’ cultural standing to make it the center of the Panhellenion, a new league of Greek cities in the east that he had fashioned. 

 

In doing so, he catapulted Athens to prominence again in the Greek world. He had also constructed a number of monuments in the city (his magnificent Library, for example) and finished – after centuries – the massive temple of Olympian Zeus close to the Acropolis. And in that same year (AD 132) the Athenians erected an Arch to him (which is still standing, adjacent to the temple of Olympian Zeus), inscribing on its west side: “This is Athens, the ancient city of Theseus,” and on its east: “This is the city of Hadrian, and not of Theseus” (the city’s legendary founder).

 

When, however, we assess Athens from its fall to Philip II of Macedonia to its rise again under Hadrian as a single time frame, as I do for the first time, we see that the city was far from the mere postscript it is commonly held to be. Alongside its continuing diplomatic and cultural history is an even stronger pattern, one lost when its “Hellenistic” history ends in 30: the people’s resilience – by extension a lesson for their own time, and for us, in the face of oppression. Athenians fought their Macedonian masters when they could, no matter the odds, and sided with foreign rulers against Rome, in the hope of regaining freedom and restoring democracy. They were subjected to Macedonian garrisons in the port of Piraeus and in the city itself, and under Rome were forced to share the Acropolis, the sacred home of their patron deity Athena, with the goddess Roma as part of an imperial cult. 

 

Ending the city’s history in 30 does not bring out how the people’s resilience was a defining feature of their history under Macedonia and Rome. In the 80s they sided with Mithridates VI of Pontus (on the Black Sea) against Rome, suffering terrible losses in 86 when the Roman general Sulla sacked the city. But the people did not quietly fall away into subservience, as ending Hellenistic history in 30 suggests. In 21, for example, ahead of a visit from Augustus, they defiantly daubed Athena’s statue with blood and turned it to face westward: even the goddess, it seems, was spitting on Rome! 

 

Throughout the five centuries from Philip II to Hadrian, Athens carefully navigated its masters to retain its own identity, even as it appeared to transform into a Roman provincial city. One inscription on Hadrian’s Arch might well say Athens was no longer Theseus’ city but Hadrian’s, but the other tellingly states that it is still Theseus’ after all. And then came the Panhellenion, which turned Athens into the second city of the Roman Empire. Likewise, its – and Greek – cultural and intellectual life seduced the Romans; as the Roman poet Horace put it: “Captive Greece captured her rude conqueror.” We do not appreciate this history if we end it too early. 

 

“Hellenistic” Athens may not shine as brightly as Classical Athens, but it has lived unfairly in the shadow of its famous predecessor. It’s time it emerged from that shadow.

Blaming the Messenger: Trump, the KKK, and the War on Historians

Hiram Evans leads the KKK in Washington, DC, 1926


During the lead-up to the election, President Trump and other right-wing populists have attacked American historians who question traditional – some would say outdated – narratives of American history, meaning those that stress unthinking patriotism and unquestioning national pride. Hoping to energize his political base, Trump convened the 1776 Commission, a panel tasked with developing a “patriotic education” curriculum to supplant the alleged “left-wing indoctrination” of innocent schoolchildren. His initiative reflects a larger backlash against multiculturalism and against historians who expose the darker corners of American history while urging their students to create their own historical narratives through open-minded inquiry. One of Congress’s most strident conservative voices, Senator Tom Cotton of Arkansas, introduced a bill to strip federal funding from school districts that teach the 1619 Project, an ambitious attempt to reframe American history around issues of slavery, racism, and Black contributions to national life. Rising to this bait, conservative outlets have renewed their longstanding assault on the radical historian Howard Zinn, who died ten years ago, as a symbol of everything wrong with historians who dare question traditional narratives. 

 

The president and his ideological peers are likely unaware that their goals, motivations, and methods recall those used by patriotic groups and such populist, right-wing organizations as the Ku Klux Klan in the 1920s.

 

In a campaign that peaked around 1923, the American Legion, the Sons of the American Revolution, the Klan, and other groups sponsored a wave of bills designed to impose a simplistic, triumphalist vision of American history on students. Like today’s conservative culture warriors, they insisted that the “proper teaching” of American history could reverse the perceived decline of traditional values, blunt the appeal of left-wing radicalism, sever immigrants’ attachments to their former countries, and replace multiculturalism with good old-fashioned Americanism.

 

This loosely coordinated campaign, like its 2020 counterpart, reflected broader concerns for the future of a country in flux. As is the case today, many 1920s-era Americans imagined that previous generations were more homogenous, more virtuous, and more patriotic than themselves. To their minds, divisions, struggles, and persistent inequalities were errant stitches, best ignored, on a broader tapestry of national greatness. In 1923, California’s state commissioner of secondary schools, A. C. Olney, in recommending appropriate American history textbooks, warned against assigning authors who treated “old and dead” controversies such as the Civil War “in such a way as to perpetuate animosities.” From this perspective, the purpose of studying the past was not to pursue broader truths or to appreciate the contributions of diverse groups, but rather to unify present-day Americans behind celebratory interpretations of historical events.

 

National unity seemed like a matter of national survival then, as it does for many 2020 conservatives. World War One-era calls for 100% Americanism, backed by federal laws such as the Espionage and Sedition Acts, silenced dissent and inflated fears of un-American immigrants and heterodox opinions. The post-World War One dread of communist infiltration sparked a full-blown Red Scare marked by harassment, vigilante violence, and deportations. Fears that multiculturalism would overwhelm traditional American values helped inspire the 1921 Emergency Immigration Act and the 1924 National Origins Act, laws that restricted immigration from so-called undesirable ethnic groups.

 

Militant hyperpatriotism, Red baiting, nativist rhetoric, and anti-diversity campaigns fueled the right-wing backlash against American historians then as well as now. The Klan, in particular, hoped to exploit education reform (a term used here in a value-neutral way) to broaden its popular appeal while undermining perceived threats to white, Protestant dominance. The 1920s-era Klan was a prominent social organization that claimed a membership of four million, probably an exaggeration but one it could make with a straight face. Unlike its Reconstruction-era predecessor, it furthered its agenda through political pressure, economic boycotts, intimidation, and harassment rather than naked violence, although violence and threats of violence remained part of its toolkit. Its anti-Catholic, anti-immigrant, “pro-American” program resonated with white Protestants who feared losing social and economic status during a tumultuous time.

 

KKK Imperial Wizard Hiram Evans advocated the creation of a federal department of education that could oversee “the re-Americanization of our common Republic.” Local Klan chapters marched behind parade floats depicting little red schoolhouses, the symbol of sturdy, old-fashioned Americanism. Klansmen distributed American flags and Bibles to public schools, lobbied school boards to fire Catholic teachers, and introduced bills requiring public schools to hire public-school graduates as teachers, thereby undercutting Catholic parochial schools. With limited success, the Klan pushed its influence onto college campuses in an attempt to shape history curricula. Their most notable victory came in Oregon, which in 1922 passed the Klan-backed Compulsory Education Act. The law, aimed at destroying parochial schools that the Klan denounced as undemocratic, required all students to attend public schools (the U.S. Supreme Court later struck down the law).

 

A crucial element of the Klan’s education program, one it shared with other right-wing groups, was stifling American historians whose interpretations clashed with their own. For California’s A. C. Olney, that meant histories that treated “any part of the American history in a disloyal or unpatriotic manner” or minimized “the best patriotism of American tradition.” Olney’s use of “the” in “the American history” was no accident; many traditionalists used this defiantly singular construction to imply that there was only one correct version of their nation’s story.

 

Today’s culture warriors would recognize and even parrot perspectives from the 1920s, including that of Wisconsin state senator John Cashman, who declared in 1923 that “un-American professors can do more harm in ten months than a hundred ship loads of reds could do in ten years.” That same year, two years before Tennessee’s Butler Act banned the teaching of evolution, Cashman sponsored a bill to ban history textbooks that “cast aspersions upon the heroes of the American Revolution or the War of 1812.” A good American history class, he argued, “tells the truth from an American point of view,” thereby driving “anti-American propaganda” from the schools and enabling children to mature into patriotic adults. Cashman’s bill passed easily and was signed into law.

 

New York, Oregon, and other states considered copycat bills targeting “anti-American propaganda.” California legislators introduced a bill making it a fireable offense for history teachers to criticize the Constitution or the Founding Fathers. In Arkansas, a Klan-dominated state legislature passed a law requiring all college students to take a course in American history and government not as a path toward greater enlightenment, but rather as a means of engineering what they saw as the right kind of education: one that instilled blind patriotism.

 

President Trump and his allies would no doubt applaud these measures, some of which are still on the books. Like twenty-first-century traditionalists who want only to learn a (singular) truth about the past, early twentieth-century critics spread fundamental mistruths about the study of history. Rather than accept that a historian’s individual interpretation of documents and events is key to the craft, 1920s-era traditionalists cried for “strictly unbiased” texts without ever explaining what those would look like. While denouncing biased texts, they, much like President Trump’s 1776 Commission, pressured American historians to present positive interpretations of the past. “The young of today should be taught that there has been and is more virtue than vice, more strength than weakness, more noble aspirations than ignoble deeds, in the transcending story of American life,” the Marshfield (WI) News-Herald opined. “Our greatest men have had their faults,” seconded Judge Wallace McCamant of Oregon, national chairman of the Committee on Patriotic Education, “but why cast the spotlight upon the flaws while, at the same time, leaving the more important features in darkness?”

 

Decades later, the multiculturalist revolution of the 1960s rewrote American history by incorporating a wider range of voices into an increasingly fractured national narrative. In the ensuing years, Black people, women of all races, and ethnic minorities have assumed ever more prominent positions in both textbooks and classrooms. Their stories, and the more complex and nuanced views of history they have inspired, banished Judge McCamant’s one-dimensional version to the historical dustbin. 

 

Or so the historians argue. Beyond the realm of academia exists a yearning to restore that simpler story, along with its message of inevitable progress, (Protestant) white male dominance, and traditional American values.

 

Whether for ideological or political ends, President Trump and other conservative culture warriors are waging this same battle for restoration, using the same tools as the 1920s-era Klan: intimidation, bullying, and public shaming. Whereas this harassment once involved boycotts and cross burnings, it now exists on social media and the internet, where armies of outrage, inflamed by such sites as Infowars and such groups as Campus Reform, denounce history teachers whose heterodox interpretations clash with their own. Honest debate and intellectual freedom are not their aims. Instead, similar to the past century’s Klansmen and their allies, they seek to either coerce universities into firing supposed radicals or, barring that, to bludgeon them into silence. Should they succeed, their victory will strike a blow against diversity, tolerance, and multiculturalism – precisely the goals of a century earlier.

 

The fate of President Trump’s 1776 Commission hangs in the electoral balance. What is certain is that, even should the president lose his re-election bid, the right-wing populist campaign that he helped recharge, a movement with deep roots in our past, will keep attacking historians whose professional interpretations clash with their own understandings of American history.

Eisenhower’s Election Day Crisis Reminds Us What Presidents Must Do


Dwight Eisenhower was worried on Election Day in 1956, but not about winning the presidency for the second time. “I don’t give a darn about the election,” he said that day. Eisenhower was busy trying to avoid World War III.

An invasion of Egypt launched days earlier by Israel, France and Great Britain threatened to spiral into a much wider conflict. The fighting had erupted over control of the Suez Canal, a valuable shipping waterway in Egypt connecting the Red Sea and the Mediterranean Sea. Britain and France had owned the canal, but Egypt had seized and nationalized it that summer to help pay for a dam-building project. The Suez crisis turned violent at the end of October, when Israel, France and Britain used military force to resolve the issue. Eisenhower was furious.

The Cold War rival Soviet Union warned it would come to the aid of Egypt, and this prospect increased the dangers of a bigger war. On Election Day morning (November 6, 1956) Eisenhower received intelligence briefings on the fighting and the possibility of Soviet intervention. Eisenhower told CIA director Allen Dulles, “If the Soviets should attack Britain and France directly, we would of course be in a major war.” Adding to this frightful prospect was Admiral Arthur Radford’s intelligence briefing that the Soviets’ most feasible means of intervention included “long-range air strikes with nuclear weapons.” It was an unlikely but horrifying prospect nonetheless. Such was the danger of conflict in the age of nuclear weapons. This made peace an even greater imperative.

There was a way out, which Eisenhower had set in motion at the United Nations the previous week. A ceasefire resolution, presented by Secretary of State John Foster Dulles to the General Assembly, had passed overwhelmingly. The ceasefire called for the withdrawal of troops on all sides and a ban on arms shipments in the area. Canadian foreign minister Lester Pearson proposed the idea of a UN peacekeeping force to keep the canal safe and open for all.

As Eisenhower told the American people in a broadcast speech on the crisis, “we do not accept the use of force as a wise or proper instrument for the settlement of international disputes.” The U.S. was not going to get involved in the conflict, nor enable it through arms or military support. Israel accepted the resolution. Now Eisenhower was waiting on Britain and France, and that was his big worry on Election Day. Eisenhower eventually received word from British Prime Minister Anthony Eden that they would accept the ceasefire. The fighting ended.

After the ceasefire, Eisenhower made it a priority to “provide to the area, wherever necessary, surplus foods, and so on, to prevent suffering.” All too often the hungry and poor victims of war are forgotten; even when the guns cease firing, hunger continues its assault. Eisenhower’s Food for Peace program would help feed the Suez war victims. Eisenhower also directed millions in food aid to Hungary to help victims of a brutal Soviet attack on civilians in the aftermath of an uprising against the Communist government there. The Hungarian crisis occurred at the same time as Suez, creating major conflict on two fronts in the Cold War.

The Suez crisis reminds us that we need presidents who seek to avoid war. Military actions have awful consequences, as Eisenhower understood deeply. Yet today so many wars are ongoing, with U.S. involvement, direct or indirect. 
President Trump has provided military support and arms sales to the Saudi-led coalition fighting in Yemen’s civil war. This has prolonged the conflict and the hunger crisis, and placed Yemen on the brink of famine. Save the Children says over 85,000 Yemeni children have died from hunger and disease since the war began.

The World Food Program feeds over 12 million Yemenis a month with life-saving aid, and funding is difficult to obtain. Some areas are cut off from aid, making the need for a nationwide ceasefire urgent. The International Rescue Committee is calling for “a major push by the international community to use all diplomatic leverage to bring warring parties to the negotiating table to agree to a nationwide ceasefire and political settlement to the conflict. The US, UK, France, and others, knowing that they continue to authorize the use of their weapons in this conflict, must end all support to warring parties.”

America should be bringing peace to Yemen, instead of enabling the combatants there. That is what we must expect of any president: to build peace and seek to avoid war. 

The New York Con


There’s a well-known saying that originated in New York, used to challenge someone’s naiveté: if you believe that, I have a bridge to sell you. Many people are unaware that the expression became popular because a con artist actually made a fortune selling the Brooklyn Bridge to unsuspecting marks.

Around the dawn of the 20th century, the impostor George C. Parker managed to sell the iconic span between Manhattan and Brooklyn again and again, sometimes twice in one week. He specialized in marketing New York monuments, including Grant’s Tomb and the Statue of Liberty. 

Methodical in his grandiose schemes, he presented unwitting buyers with elaborately forged deeds to public property. Some realized they’d been swindled only when they attempted to erect toll booths to charge pedestrians and the police intervened, ending their brief delusion. Parker cheated people in small and large ways for decades.

The duplicitous can lie with a smile on their face. The self-righteous, deluded, bullying loudmouth who has no care but his own fortune and aggrandizement is secretly concocting sinister plans. When he is not smooth-talking, he is vulgar and abusive, mocking and degrading those he would steal from.

Grifters who gain the confidence of the innocent are usually narcissistic deceivers, enticing their prey in a cloud of self-righteousness. The desperate or gullible don’t realize what they are about to lose until their valuables are gone.

The scammer and his felonious cronies conspire to steal by deception. Cons depend on pretending to offer what they can never deliver. The most cunning are able to fool the personnel at small and large businesses, including banks. Sometimes they run for public office.

Cons persevere, finding victims willing to embrace their seductive fabrications. They believe they never will be caught. But occasionally, these clever thieves are apprehended and prosecuted for their corrupt operations.

Parker was a popular figure with those who were oblivious to the impact of his darkest deeds. And cons seem attractive and entertaining even when we don’t believe them. But nefarious crimes should remind us that the consequences of thievery, small and large, are both economically and emotionally traumatic. The law eventually caught up with Parker, and he spent his final years in Sing Sing prison for multiple fraud convictions.

Cons are recognizable once you know their ilk. And New Yorkers know their own best. From the small time crook on the street to a racketeer in a high-rise office, there is a ruthless class of character from New York who stands out like a sore thumb. We know a lying huckster when he appears and want nothing to do with this black hole of a human. 

Parker was a novice compared to Donald Trump. 

Trump’s devious enterprises, both during his term as president and in the years before he held office, are much grander and more complex. His appeal and behavior fooled millions of people, enough to gain him the presidency, making him one of the greatest hustlers of all time. 

Trump’s chicanery as a real estate and casino magnate — and his tough guy TV image — presented the classic caricature of a con, including his legacy of causing pain and devastation. There are literally thousands of examples of his scurrilous behavior. 

As with the rest of his lot, the details of how he operates are much more revealing than anything he says.

The recent revelation of Trump’s Chicago skyscraper fiasco is a window into a world where he manipulates a financial situation in his favor by whatever means necessary. When he couldn’t pay back the loans taken out for the seven-hundred-million-dollar project, he sued Deutsche Bank for predatory lending practices. The bank countersued and eventually agreed to a settlement greatly favoring Trump. The cancelled debt was taxable; the details of how he evaded paying those taxes are just part of the reason he has legal problems on the horizon. 

Trump’s con of the banking industry has a repeating pattern. He defaults on huge debts and then either declares bankruptcy or looks for any angle to reduce or eliminate the responsibility for the millions he owes. When at risk, caught or cornered, he takes outrageously antagonistic measures to weasel his way out of trouble. 

Trump’s involvement in over 3,500 legal cases throughout his business career, before becoming president, reveals that his intentions and their effects are, at a minimum, contentious, and certainly a great burden on those involved. 

His powerful new office provided the leverage to raise his con to a more heinous level. And it took Trump’s rise to the presidency to finally end his relationship with the city that knows him for who he is. He claimed he rejected New York, but well before his decision to call Palm Beach his residence, most people from his hometown had had enough of his lies. His new national pulpit amplified his bluster and confidence, destroying all possibility of staying connected to the community that once tolerated him. 

When he felt he had extracted all he could from the city, and was no longer welcome, Trump packed his bags, figuring his illicit trade in Washington would be more lucrative. He realized that, after he attained the highest office in the land, the majority of his fellow New Yorkers saw him as the consummate con man, and he attempted to cut his losses.

Upon departure, he summed up his situation, “I have been treated very badly by the political leaders of both the city and state. Few have been treated worse,” adding, “best for all concerned.”

Trump’s con is beginning to unravel, particularly in his former home. Both the district attorney of New York City, Cyrus R. Vance, Jr., and the attorney general of the State of New York, Letitia James, are looking into a number of illegal matters that will lead to indictments of Trump and his corrupt organization. 

The concept of fraud minimizes the depth of his alleged criminal activities. No amount of legal delays, arm twisting, bribes or federal pardons will protect him from the prosecutions that will come. He will eventually be forced to return to a place that will shine a bright light on the truth.

With a suite in Sing Sing awaiting him, the president has been highly motivated to say or do anything to hold on to the protections of his office to avoid a legal and personal nightmare. But he has burned his bridges, and many of his Washington facilitators are starting to act like they never knew him. His frantic push to save himself is futile; his house of cards is about to fall. 

Trump’s facade is crumbling; his true persona grows nakedly apparent to a wider circle. He is the self-described woman-grabbing misogynist, the wily operator who paid less in taxes than a fireman in the year he was elected, the mean-spirited racist bully who makes fun of the disabled, the classic hypocrite who, in the name of protecting American values, oversees the separation of immigrant parents and their children, and the false patriot who incites violence and civil unrest while demeaning members of the armed forces. Above all, he is the lying swindler whose priority remains lining his pockets as a deadly pandemic surges and the world faces an unprecedented environmental catastrophe. 

Parker repeatedly sold a half-mile bridge; Trump is still selling a thousand-mile wall, symbolic of his xenophobic vision for America.

Trump convinced his victims that he could provide security from an imagined demon, solely to con them for their votes. He relentlessly appealed to their emotions, threatening that any opponent would ruin their lives; he promised things he could never deliver. He’s the seller of a phony cure for his exaggerated, deteriorating vision of the country. His fear-mongering, hate-inciting presidency and re-election campaign only further revealed his despicable nature. 

The damage Trump has done to the United States has yet to be fully appraised.

He managed to con almost half of all Americans in the recent election and there are still people who believe him. But Trump’s supporters will eventually realize that they voted for a man who despises them. However slowly, an increasing number of people are recognizing who he really is; his game is over. There is no such thing as Trumpism, only degrees of how deeply he was able to con his entranced followers.

Like the rest of his deplorable breed, Trump is only a taker; he has nothing to give. His lifetime of deception and bitter conflict, with an increasing number of enemies, will bring him down.

With his confidence scams falling apart and his world in a tailspin, he is predictably saying the election was stolen from him, ramping up his hostile and abusive rhetoric, and furiously claiming he is a law-abiding and righteous citizen maligned by his enemies. 

Trump still insists he has been an inspiring and great leader of the United States.

If you believe that, I have a bridge to sell you.

Life during Wartime 525

Biden's Years of Experience in Public Service are Second to One

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015). A paperback edition is now available.


With the election of Joe Biden to the presidency, we are about to inaugurate the second most experienced president in terms of years of public service in American history, with five years fewer than John Quincy Adams.

If one looks at the years of public service of the 44 men who held the 45 presidencies before Inauguration Day 2021, four have résumés of extensive elected or appointed service comparable to Biden’s: Adams, James Buchanan, Lyndon B. Johnson, and Gerald R. Ford. Only Adams had more years of service than Joe Biden, although 23 of his total years came by appointment to diplomatic positions rather than by election to office.

John Quincy Adams (the 6th president, 1825-1829), served as US Ambassador to five European nations, including the Netherlands, Portugal, and Prussia from 1794-1801; to Czarist Russia from 1809-1815; and finally to Great Britain from 1815-1817, making a total of 15 years in foreign diplomacy followed by eight years as Secretary of State under President James Monroe from 1817-1825. Adams is considered one of the best Secretaries of State in American history.  Additionally, he served in public office as US Senator from Massachusetts from 1803-1808 before returning to diplomacy the following year. So when he was elected President in 1825, he already had 28 years of public service. After his one term as President, he was elected by the people of Boston to the US House of Representatives from 1831-1848. Adams ultimately had in total more than 49 years in public office, though 26 years came in elected positions versus 23 by appointment.

James Buchanan (the 15th president from 1857-1861) served as a member of the US House of Representatives from Pennsylvania for ten years from 1821-1831, and was chairman of the House Judiciary Committee in the last two years of his service.  He also was a US Senator from 1834-1845.  He had extensive diplomatic experience as Ambassador to Czarist Russia from 1832-1833; and to Great Britain from 1853-1856. Buchanan served as Secretary of State under James K. Polk from 1845-1849, and sought the presidential nomination of the Democratic Party a number of times before finally being elected in 1856.  If one adds up all his service, he was in office for a total of 33 years, with eight of those years as an appointed diplomat and 25 in elected office.

Lyndon B. Johnson (the 36th president, 1963-1969) served in Congress for 24 years from 1937-1961, twelve years in the House and then twelve years in the Senate, including service as Senate Majority Whip from 1951-1953, as Senate Minority Leader from 1953-1955, and as the most prominent and influential Senate Majority Leader in American history from 1955-1961. Then he was Vice President of the United States from 1961-1963. After assuming the presidency upon the assassination of John F. Kennedy in 1963, Johnson was elected to a full term from 1965-1969, accomplishing the greatest series of domestic reforms in American history under the slogan “The Great Society.” He had planned for another term, and announced for it in 1968, but the Vietnam War morass led him to withdraw his candidacy and retire in January 1969, having served in government for 32 years.

Gerald R. Ford (the 38th president, 1974-1977) served in the US House of Representatives from Grand Rapids, Michigan for 25 years, from 1949 to late 1973, including eight years as House Minority Leader of the Republican Party. He was then tapped by President Richard Nixon to replace the resigning Spiro Agnew as Vice President under the terms of the 25th Amendment. Ford served as Vice President for eight months, from December 1973 to August 1974, and then, when Nixon himself resigned in August 1974, served out the remaining two and a half years of Nixon’s term as president. Ford’s bid for reelection ended in defeat to Jimmy Carter, and he left public office in 1977, having served 28 years.

No other president matched John Quincy Adams, James Buchanan, Lyndon B. Johnson, and Gerald R. Ford in years of federal public service, but now President-elect Joe Biden surpasses all of them except Adams.

Joe Biden has served 44 years: eleven more than Buchanan, twelve more than Johnson, and sixteen more than Ford, an absolutely amazing record of public service! Further, if only service in elected office is considered, Biden (44) is ahead of Johnson (32), Ford (28), Adams (26), and Buchanan (25) in that statistic as well. 
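For readers who want to check the comparison, the arithmetic behind these totals can be tallied from the figures cited in this essay (partial years for the Johnson and Ford terms are rounded to whole years):

\begin{align*}
\text{Biden:} \quad & 36~\text{(Senate)} + 8~\text{(Vice President)} = 44\\
\text{Adams:} \quad & 26~\text{(elected)} + 23~\text{(appointed)} = 49\\
\text{Buchanan:} \quad & 25~\text{(elected)} + 8~\text{(appointed)} = 33\\
\text{Johnson:} \quad & 24~\text{(Congress)} + 3~\text{(Vice President)} + 5~\text{(President)} = 32\\
\text{Ford:} \quad & 25~\text{(House)} + 1~\text{(Vice President)} + 2~\text{(President)} = 28
\end{align*}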

Biden was elected to the US Senate from Delaware in 1972, a few weeks before his 30th birthday, and was sworn in amid tragedy: six weeks after his election, and a month after his 30th birthday, his first wife and daughter were killed, and his two sons injured, in an auto accident. But he overcame the tragedy, marrying his second wife, Dr. Jill Biden, and having a daughter with her, and he overcame adversity again with two brain surgeries for an aneurysm in 1988; he would also endure the loss of his son Beau in 2015. 

Joe Biden served six terms as a Senator, a total of 36 years, making him the 18th longest-serving Senator. Had he not become Vice President, Biden most likely would have served a seventh term until 2014, and might now be finishing an eighth, which would have made a total of 48 years. That would put him behind only Robert Byrd of West Virginia (51 and a half years) and Daniel Inouye of Hawaii (nearly 50 years).  

While in the Senate, Biden served as Chairman of the Senate Judiciary Committee from 1987-1995 and as Chairman of the Senate Foreign Relations Committee from 2001-2003 and 2007-2009. He was a headliner in the news during controversial Supreme Court nominations under Presidents Ronald Reagan and George H.W. Bush.

Biden made two unsuccessful attempts to run for President, in 1987 and 2008. He served as Vice President under President Barack Obama for two terms, and was extremely engaged, active, and influential, in a manner unlike any other Vice President except Walter Mondale (1977-1981) under President Jimmy Carter.

So, among American presidents, Joe Biden surpasses all but John Quincy Adams in years of service to his country. Yesterday, Biden fulfilled the dream of his son and became the 46th President of the United States. 

Hooray for the Greens (But Not This Year)!


Hooray for the Green Party! Before there was a Green Party in the US, we rooted for the Green Party in Germany. They were a tiny group of committed environmentalists, with wonderfully advanced social and economic ideas. Unlike the US, but like most other democratic systems across the globe, third parties in Germany, and fourth and fifth parties, can shove the whole political system in the direction they advocate, as long as they can convince enough voters. Over the decades since their founding in 1980, Die Grünen have persuaded large minorities and in some places majorities of voters that they should govern. They are now coalition partners in the governments of 11 of Germany’s 16 states, and Winfried Kretschmann is the Minister-President of Baden-Württemberg, the equivalent of governor.

 

The American political system at nearly every level is hostile to third parties. The American Greens show no signs of being able to overcome these disadvantages. It is a symptom of our system that the Greens are widely viewed as a spoiler party, unfairly stealing votes from one of the Big 2. The best performance of a Green presidential ticket was in 2000, when Ralph Nader won 2.9 million votes, 2.7% of the total. If more than half of those Greens in Florida had voted for Al Gore, who lost the state by 537 votes, he would have won.

 

Since then, they have not reached 1%, except for 1.1% in 2016, when both major-party candidates were so unpopular. Their membership reached over 300,000 in 2004, but has declined to about 250,000 now. This year they are not even on the ballot in 19 states. Their national candidates are Howie Hawkins and Angela Walker, both also nominees of the Socialist Party USA, neither of whom, I believe, has ever held elective office.

 

The structural obstacles to Green success are not their only problem. Perhaps in overreaction to the label of spoiler, the Green Party is barely more polite than the Republican Party about those who don’t vote for them. Here’s what Virginia Rodino, co-chair of the Maryland Green Party, and Kevin Zeese, recently deceased former Attorney General in the “Green Shadow Cabinet,” say about the American voting public. The two major party candidates are “a certifiable, lying, murdering war criminal and a racist mass incarcerator,” who are representatives of the “two parties of the millionaires governing the United States since its founding.” Republican and Democratic voters “do not have faith in ordinary people” because they “remain shackled election after election to the corporate parties.”

 

“Voting for Biden is immoral in the current electoral reality. Trump is the worst president in our lifetimes and has to be removed from office. . . . Only voting for Howie Hawkins and Angela Walker makes sense in 2020 with two terrible candidates — the worst president of our lives and one of the worst corporate Democrats of our lives.”

 

Joe Biden has committed himself to the most liberal or progressive or even leftist program of any Democratic nominee in my lifetime. I don’t expect the Greens to even admit that, much less give him any credit. But I’d appreciate it if they gave me some credit, along with the millions of voters who have voted or will vote for Biden. They can’t even get their rhetoric straight: Trump “has to be removed,” but voters for Biden are themselves “immoral” because we “do not have faith in ordinary people.” I don’t find these Green leaders persuasive when they claim to be the only party with faith in ordinary people, but think that about half of American voters are committing a politically immoral act. I always thought most of us are ordinary people.

 

At another point in their diatribe, Rodino and Zeese narrow their focus: “people who live in states like New York, California, Maryland, and the other 30 solidly Democratic states would be wasting their vote if they vote for Biden.” According to the National Public Radio polling analysis, these 33 states in which a vote for Biden is immoral include 4 which lean Republican, 7 toss-ups, and the battleground states of Pennsylvania, Michigan and Wisconsin. If many potential Democratic voters took the Greens’ advice and decided not to “waste” their votes, they would hand a second term to Trump. The Green argument depends on lies about Biden as a candidate, insults to all Democrats, and electoral calculations which would hand another term to Trump.

 

The Green Party’s website takes a more positive approach, and is not so insulting to those who are not convinced that voting Green is the best way to get rid of Trump. Nowhere do they explain, however, why being Green necessarily means voting for Socialists. Their inability to come up with their own candidates this year shows their weakness and makes them even less likely to become a vital force for change.

 

I’m glad the Greens exist as another choice for voters. But this year, they are not a good choice.

 

Steve Hochstadt

Jacksonville IL

November 3, 2020

 

Roundup Top Ten for November 6, 2020

An Embarrassing Failure for Election Pollsters

by W. Joseph Campbell

Pollsters’ problems predicting the 2020 election deepened the embarrassment for a field that has suffered through – but has survived – a variety of lapses and surprises since the mid-1930s. 

 

We Must Do More to Honor the People and Places Lost to Violent Racism

by Walter Greason

Teaching a course about collective racial violence in the United States showed a professor the extent to which this history is both integral to the nation and completely hidden from the majority of Americans. 


Trumpania, U.S.A.: Making Federal Buildings Fascist Again

by Ed Simon

Trump's obsession with establishing neoclassical architecture as the default style for federal buildings echoes the delusional plan of Adolf Hitler to rebuild bombed Berlin in a monumental style purged of "decadent" modernism. 


Good TV Demands Results on Election Night, But that’s Bad for Democracy

by Kathryn Cramer Brownell

Election night 2020 promises to test whether the media has learned from failures of the past. 


Aaron Sorkin’s Inane, Liberal History Lesson

by Charlotte Rosen

Aaron Sorkin's Chicago 7 film strips away the radical, anti-imperialist, anti-racist, anti-capitalist politics of the 1960s New Left to make the defendants heroic defenders of liberal democratic politics. 


Monstrous Men: The Medusa #MeToo Monument Has an Oedipal Complex

by Erin Thompson and Sonja Drimmer

A New York statue of Medusa erected as a monument to the #MeToo movement of identifying sexual abusers of women is in fact yet another instance of fighting among male artists using women's bodies as symbolic weapons. It also garbles the myth of Medusa, draining it of its relevance to #MeToo.


What Modern Voter Suppression Looks Like In Florida

by Julio Capó Jr. and Melba V. Pearson

"The result of legal maneuvering in Florida is a 21st-century version of Jim Crow, now matured into James Crow Esq. The intent — to restrict minority community access to the ballot box — is the same, but the methods of voter suppression have become more sophisticated."


President Trump’s False Claims about Election Fraud are Dangerous

by Sid Bedingfield

Elected officials' use of the media to claim election fraud has resulted in violence in the past; the news media must take responsibility to avoid fanning the flames. 


A Large Portion of the Electorate Chose the Sociopath

by Tom Nichols

By picking Trump again, those voters are showing that they are just like him: angry, spoiled, racially resentful, aggrieved, and willing to die rather than ever admit that they were wrong.


The Racist Lady with the Lamp

by Natalie Stake-Doucet

"Nursing historiography is centered on whiteness. Even worse, nursing history revolves largely around a single white nurse: Florence Nightingale. This, unfortunately, doesn’t mean nurses understand who Nightingale was."

 
