History News Network - Front Page
Sat, 17 Apr 2021

Can Space Exploration Restore American Faith in Science?

Yuri Gagarin prepares for the first manned space flight, April 12, 1961


April 12 marks both the 60th anniversary of the first manned space flight and the 40th anniversary of the first U.S. space shuttle launch. These anniversaries might pass with little fanfare, but as NASA hopes to put the first woman on the moon by 2024 with its Artemis mission, Americans might want to ask themselves how so much of the wonder of space exploration has faded in just six decades — and faith in science and our institutions with it.

On so many levels, the era that produced American space travel will seem like a foreign country, even to those who once lived there. This was a time when massive government expenditures on new projects and bureaucracies were not only unremarkable but exciting. It was a time when the public and private sectors were tightly enmeshed, yet it was the public sector and the guiding hand of government that were dominant, and no one thought to call it socialism. It was a time when we implicitly trusted our government.

The space program was a symbol of our incredible faith in both our government and science. The scourge of polio had just been conquered as Sputnik brought new fear into our lives, and our faith in the government and scientists to protect us was natural. Americans would breathlessly listen to reports from an American scientist with an incredibly thick German accent and a name, Wernher von Braun, that suggested he was a recent immigrant. His rehabilitation from Nazi to German-American was a simple matter of suspending our disbelief, which was easy because we had faith in our institutions. Whether it was curing polio or conquering space, Americans believed that men in white coats would point the way to a better future. Even with the breathtaking progress of research into the pandemic, American faith in science is far from its halcyon 1950s-1960s high.

The Challenger tragedy shattered our dreams, but more than three decades later, there are few signs America is ready to once more trust or even take seriously the science of space flight.

Today, we do not have the patience to follow the advice of scientists in the face of the greatest health crisis America has ever faced. We do not have the political will to make investments that will pay off in the future. So how will we be able to delay our gratification to make the kind of wise, patient investment that the space program needs? John F. Kennedy made a promise in 1962 that NASA kept in 1969, just under the wire. What promises will Americans make to our future selves?

I would guess that a lot of Americans, if asked to think of the future of space travel, would think of Elon Musk, and just maybe Virgin Galactic — both of these have a certain futuristic flair to them and a lot of profit motive. The incredible image of the earth in space, the “big blue marble” taken on December 7, 1972, briefly united a world divided by Cold War and hot wars, a world haunted by the prospect of impending famine and disease. That image reminded us that we were all united after all. That sight could soon be available to the highest bidders as a selfie background. It remains to be seen if a celebrity influencer could have the same gravitas as an astronaut. It won’t be easy to remind us of our shared humanity and the miracle of science.

Space tourism and space hotels, a space entrepreneur who seems part Tony Stark, part P.T. Barnum — this is what Americans think of as the future of space. While the private sector focuses on such frivolity as sending a Tesla to space, the scientists, mathematicians and engineers trudge on, mapping the trajectories, designing the spacecraft and software, and waiting for the stars to align again. The astronauts train, dreaming of the chance to go up.

Our rocket scientists are some of the most brilliant people in the world. Instead of getting rich — and please notice the parking lots of their workplaces are not filled with luxury cars — they are using their brilliance to take mankind further than it has gone before. And the astronauts? They are part rocket scientist, part superhero. The best of the best in so many physical, mental, and emotional categories. They risk their lives for the dream of exploration. They used to be household names and international heroes. Maybe they can be again.


Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179854 https://historynewsnetwork.org/article/179854 0
Making Religious Peace in Afghanistan

The Hanging, Jacques Callot, ca. 1632.


Nearly twenty years ago, on October 7, 2001, the United States, supported by a broad international coalition, started what the Bush administration called a War on Terrorism; it began with an aerial attack on Afghanistan, whose radical Islamist Taliban government had harbored those responsible for the devastating 9/11 attacks. Two weeks earlier President George W. Bush had described the coming conflict as the necessary response to “a new kind of evil. . . . This crusade, this war on terrorism is going to take a while, and the American people must be patient.”

The President’s aides quickly tempered Bush’s rhetorical impulse, insisting that this was not a “crusade” against Islam, but a defense of freedom and democracy. Still, Bush was right in at least one respect: The War on Terrorism has taken a while. At this point, we still don’t know how and when our war in Afghanistan will end, but the Taliban and “radical Islam” are still generally considered to be the principal obstacles to peace. As the Biden administration struggles with the questions of whether and how soon to withdraw the remaining US military forces from Afghanistan, it is critical to recognize the religious dimensions of our “forever war” and to accept the challenge of making religious peace possible.

The day the aerial attack on Afghanistan began, Andrew Sullivan declared, in an essay in the New York Times Magazine, “This is a Religious War,” not unlike Europe’s religious wars. As a scholar of Europe’s religious wars, I appreciated Sullivan’s sense of historical recognition, which is still useful today. The problems and enmities that underwrote Europe’s religious wars as well as the War on Terrorism were religious in the sense that the forces in conflict recurrently and often insistently identified their enemies in terms of religious ideas, behaviors, or affiliations. While some observers framed the War on Terrorism as a global struggle between Islam and the (Judeo-Christian) West, Sullivan framed it as a “war of fundamentalism against faiths of all kinds that are at peace with freedom and modernity.” After twenty years of “religious” war, however, religious fundamentalism has not been defeated in Afghanistan. It’s long past time to make religious peace.

But how do we shift from prosecuting religious war to making religious peace? Here the historical analogy with the religious wars in Europe is particularly useful. During more than a century of intermittent and increasingly destructive religious wars, Europeans learned to accept and manage their religious differences, thereby establishing the foundations of modern religious pluralism. This European religious peace, which I have described as complex and messy, has since been disrupted by revolution, nationalism, authoritarianism, and world war, but so far it has survived even the mass religious migrations of the last decades without descending to the coordinated destruction of religious war.

To learn anything useful from this history, however, we must shift our focus from contentious ideas to political action. Ideas, theologies, and ideologies provide useful clues for understanding the motives and intentions of those who prosecute wars, but it is a much broader array of political actors and actions that makes war and peace possible, as often as not quite unintentionally. This is because the outcomes of large historical processes – like the cycles of religious conflict, violence, and war evident in early modern Europe and in the world today – are the product of contentious human interactions, which do not yield clear winners and losers. Indeed, European history shows that if the essential foundation of religious war is ideological intransigence, the essential foundation of religious peace is political compromise.

During Europe’s Age of Religious Wars, most of the wars ended with a political compromise that took the form of a truce, an edict, or a treaty. From the First National Peace (Landfrieden) in Switzerland in 1529 to the Peace of Westphalia in 1648, each of these agreements was founded on three essential principles: mutual recognition, security guarantees for all parties to the agreement, and mediation to prevent the escalation of future disagreements into the coordinated destruction of war. Many of these political compromises failed when they were not accompanied by the demilitarization of the contending religious parties, but the ones that were successful, like the Peace of Augsburg, the Edict of Nantes, and the Treaties of Westphalia, earned the grudging consent of those who controlled the means of coercion and warmaking. That consent, invariably implicit, entailed the recognition that war was the problem, not the solution to the “problem” of religious difference, and that durable religious coexistence or diversity was the necessary condition for a more peaceful future.

Formal peace agreements can end wars – what we might call negative peace – but they do not suddenly create new, more peaceful conditions on the ground – which we might consider positive peace, or that which makes peace much more than the absence of war. The durable forms of religious coexistence that were the foundation of Europe’s religious peace emerged prior to, survived during, and were already firmly in place at the end of the military conflicts. And they had been created by a motley crew of political actors: often intolerant rulers and frequently dissenting subjects as well as competing claimants to religious authority and external allies and enemies. What the peace settlements did was to validate diversity that already existed for some time and then, over time, to protect that diversity in law and political institutions, both for groups and individuals.

In early modern Europe, the religious diversity that many considered problematic or even unacceptable was the legacy of a long-term process of religious fragmentation or pluralization within Christianity that began with the Reformation and was consummated by the permanent diminution of papal power. Thus, their religious wars did not represent a global struggle between Protestantism and Catholicism but were a myriad of struggles among fragmented communities of Christians and the ethnicities and political formations affiliated with them.  

Similarly, in our current cycle of religious violence and war, in the Middle East, North Africa, and South Asia, and in Afghanistan, in particular, the religious diversity that many consider problematic or even unacceptable is the legacy of a long-term process of religious fragmentation or pluralization within Islam that began with religious fundamentalist criticisms of the Ottoman Empire in the eighteenth century and was consummated by the abolition of the Ottoman caliphate in 1924. Thus, the religious struggle in Afghanistan is not merely an echo of a global struggle between Islam and the West or between fundamentalism and more tolerant faiths, but a local struggle among many fragmented communities of Muslims and the ethnicities and political associations affiliated with them, including the Taliban and the current Afghan government.

What this means for Afghanistan is that making religious peace is both straightforward and an enormous political challenge. The goal is not to defeat a fundamentalist ideology, but to broker a political agreement that validates religious diversity and protects that diversity in law. The Taliban has already stated that religious diversity can be protected under Islamic law, although their interlocutors do not trust their sincerity. But trust is not necessary; neither is religious dialogue or reconciliation. What is essential is that mutual recognition, security, and mediation be built into a political agreement that entails the demilitarization of the religious parties to the agreement and the protection of religious diversity in law.

Without recognition of the religious dimensions of the conflict and explicit protection of religious diversity, including the fundamentalism of the Taliban, peace is likely to elude us once again. But one thing all sides need to accept, however grudgingly, is that more war will accomplish nothing.

Gordon Liddy and the Greek Connection to Watergate

Did this Democratic National Committee filing cabinet contain damaging evidence of an illegal contribution to the 1968 Nixon campaign by the Greek military dictatorship, rooted out by journalist Elias Demetracopoulos?


Obituaries of G. Gordon Liddy, one year shy of the 50th anniversary of the bungled Watergate break-in that he masterminded, remind us how history morphs with new information and changing attitudes.

Just what was Liddy seeking in the offices of Democratic National Committee chairman Larry O’Brien? It remains a mystery. After he transformed himself from a tight-lipped, Hitler-admiring political operative to a self-promoting right-wing media entertainer, Liddy embraced some far-fetched conspiracy theories touted by Nixon admirer Roger Stone and others. These fanciful tales were designed to implicate others in the Watergate crimes and exonerate Nixon and himself.

But in his more contemporaneous 1980 autobiography, Liddy said the purpose of the June 17 break-in, as ordered by Nixon aide Jeb Stuart Magruder, was to “find out what O'Brien had of a derogatory nature about us, not for us to get something on him or the Democrats.”

According to then-White House counsel John Dean, it was a fishing expedition. Magruder told Liddy to “photograph whatever you can find,” and Howard Hunt, Liddy’s political sabotage co-conspirator, told the burglars to “look for financial documents—anything with numbers on them,” especially if it involved “foreign contributions.”

That search would likely have included evidence O’Brien possessed concerning a large illegal transfer of cash, nearly $4 million in today’s dollars, from the Greek dictatorship to the 1968 Nixon campaign. The bagman for that payoff was Greek-American tycoon and uber-GOP fundraiser Tom Pappas, later named on the Watergate tapes as “the Greek bearing gifts.”

Pappas had convinced the Greek military junta, which had recently overthrown its democratic government, that underwriting the Nixon-Agnew campaign would be a good investment. The whistleblower in 1968 was Elias Demetracopoulos, a controversial and scoop-hungry Greek journalist whose exposés had so angered American officials that the CIA and State Department had long tried disinformation campaigns to destroy his reputation. After the junta took over in 1967, Demetracopoulos escaped to the United States. Changing from journalist to activist, he sought to build American opposition to the junta, working from Washington to restore Greek democracy. After Nixon’s running mate Spiro Agnew endorsed the junta in September 1968, breaking a promise of neutrality, Demetracopoulos investigated, uncovered the secret Greek money trail, and met with O’Brien twice in October at the Watergate, trying unsuccessfully to get him to expose the plot.

At the time, support was soft for all three candidates: Nixon, Hubert Humphrey and George Wallace. This lack of enthusiasm meant a higher-than-usual possibility of last-minute switches spurred by a late campaign disclosure. The Greek money revelation would have exploded the so-called “new Nixon” image campaign. That alone could have changed the outcome of the second-closest presidential contest of the 20th century. The victory margin was less than one percent. A shift of fewer than 42,000 votes in only three states would have thrown the outcome into the Democratic-controlled House of Representatives.

The Nixon people knew fragments about Demetracopoulos’s 1968 disclosure to O’Brien. For years it caused them great anxiety. In 1969, Jack Caulfield, handling wiretaps and political surveillance for Nixon, sent John Ehrlichman a confidential memorandum titled, “Greek Inquiry.” In 1970 John Dean became aware of anti-Demetracopoulos smears. In July 1971, Elias testified before the House Foreign Affairs Committee, against the Greek dictatorship -- and against Pappas.

In that hearing, he hinted that he would disclose more about the 1968 money. So, Pappas and his allies tried different approaches to get both the Greek and American governments to attack Demetracopoulos. Nixon’s hatchet man Charles Colson took Elias to lunch in September 1971 to warn him not to criticize Pappas. Pappas menaced him directly. In early 1972, Attorney General John Mitchell publicly threatened Demetracopoulos. He tried to have him deported, which would surely have led to his torture and likely death.

The evidence Demetracopoulos gathered and presented in 1968 was still in O’Brien’s files in June 1972, and there is strong circumstantial evidence that it was part of what the burglars were looking for in the Watergate break-in. Liddy told me he was aware of Demetracopoulos, and his co-conspirator Hunt knew him from his days with the CIA in Greece. Jeb Magruder admitted to historian Stanley Kutler that information on the Greek money would have been part of what the burglars were seeking. Years later Harry Dent, White House counsel and an architect of Nixon’s Southern Strategy, told Demetracopoulos that the Greek-money origins of Watergate “makes sense.” Congressional investigations to explore these connections were repeatedly blocked. Before we are inundated with memorial reflections on the significance of the events of June 1972, we should consider that the roots of Watergate likely extend back to the 1968 Nixon campaign. Timely disclosure of the 1968 illegal money transfer could have meant a Hubert Humphrey victory, meaning no President Nixon, no Watergate break-in, and a different course of history.

A Reason Republicans May Not Wish to Proclaim Themselves the Party of Lincoln


The Republican Party may soon live up to its moniker, “The Party of Lincoln,” though not in a way that bodes well for the GOP.  Abraham Lincoln, of course, was the first Republican president.  While he is heralded today as one of our greatest chief executives, he was never very popular in his own day.  Nor was his party.

It is worth remembering that the Republican Party was not a “national” party from its inception; it drew its support exclusively from the North.  Born in 1854 in Jackson, Michigan, the party in its earliest days was organized around opposition to the extension of slavery into the western territories.  Concern over slavery in the West erupted in the wake of the Kansas-Nebraska Act of 1854, which opened up those two territories to slavery after over thirty years of prohibition there.  Energized over containing slavery, the party attracted anti-slavery activists, Northern Whigs, and ex-Free Soilers, who similarly wanted the West kept as “free soil for free men.” 

Early Republicans knew that their fledgling party needed to broaden its appeal beyond the slavery issue.  So they advocated support for internal improvements, what today we would call infrastructure.  As New York Tribune editor Horace Greeley wrote in 1860, “An Anti-Slavery man per se cannot be elected.”  But, “a Tariff, River-and-Harbor, Pacific Railroad, Free Homestead man, may succeed although he is Anti-Slavery.”  Containing the “peculiar institution” alone would not be enough to secure victory, but adding other planks to the party’s platform could result in electoral success.

For obvious reasons, the Republican Party held no appeal for southerners.  During the 1860 election, Lincoln’s name did not even appear on the ballot of ten southern states.  Although he was able to amass a majority of the electoral college votes, Lincoln won only 39.7% of the popular vote.  Six out of every ten Americans voted for someone else, as the nation descended into civil war.

Four years later, some Republicans wanted to dump Lincoln for Salmon P. Chase or John C. Fremont.  While Republicans renominated Lincoln, they replaced Hannibal Hamlin as the vice presidential nominee in favor of Andrew Johnson, a War Democrat.  Republicans knew their base alone would not be enough to secure victory.  The party of Lincoln had such limited appeal that Lincoln himself needed the support of Democrats to win reelection.

Fast-forward one hundred and sixty years to the present, and the Republican Party again risks becoming a party of limited appeal.  Under Donald Trump’s leadership, a deep fracture has grown within the GOP.  While a majority of Republicans remain loyal to the former president, some have grown weary of his mendacious ways.  Even after election results were counted and recounted, Donald Trump would not concede defeat.  And as he continues to falsely claim victory, he demands that Republicans similarly proclaim “the lie.” 

Even before the election, fissures in the Grand Old Party were apparent.  Some Republican leaders, especially those who had or were about to retire from office, rejected Trump.  It was a strange and telling moment when Ohio’s former governor John Kasich, a life-long Republican, spoke at the Democratic National Convention in support of Joe Biden’s candidacy.

Nonetheless, the influence Trump has over Republican officeholders is so great that the vast majority do not dare suggest that the emperor has no clothes—that Trump lost the election fair and square.  Instead, they embrace and perpetuate the lie.  To do anything less would invite Trump’s wrath and a primary challenge.  Even after January 6, only a small number of Republicans in Congress have shown the political will to challenge Trump’s deceit.  So most voted for acquittal again when he stood trial for inciting the Capitol riot.  In their shortsighted effort to save their own political skins, those Republicans are fundamentally transforming their party. 

In stark contrast to the majority of House Republicans who supported efforts to overturn the Electoral College vote, ten Republicans voted for impeachment.  In the Senate, though the handful of Republicans who mustered the courage to vote to convict might seem small, they represent a growing number who reject Trumpism and the lie.  Once alone, Mitt Romney was joined by Lisa Murkowski, Susan Collins, Ben Sasse, Pat Toomey, Bill Cassidy, and Richard Burr.  Indicative of the depth of Republican divisions, several of them were censured by their own state party committees. 

The Republican electorate is similarly split.  A full three out of every four Republicans believe that there was widespread voter fraud in 2020, handing the election to Biden.  Though most Republican voters remain loyal to Trump and his lie, a growing number are re-assessing their fealty, repulsed by the events of January 6.  For them, there was no steal to stop.  They would not believe the lie.  After all, “you can’t fool all of the people all of the time.” 

Such divisions within the Republican Party threaten to devastate the GOP.  A party that has won the popular vote only once in the last eight presidential elections can ill afford its present fracture.  Trump’s lie threatens to fatally handicap the party of Honest Abe.  As Lincoln warned years ago, “A house divided against itself cannot stand.”

The Seductions and Confusions of Genealogical Research

For a long time, I thought that researching family history was a dubious pastime. Also one fraught with peril, when undertaken for the purposes of ancestor-glorification and ego-gratification.  Should you have a forebear by whom you set great store – as my Aunt May did by Philip Alston, for example – you may well learn many disreputable things about him, of which owning slaves is only one.

That didn’t stop May from pursuing pedigrees on my behalf.  I remember being told as a teenager that she had filled out a chart in my name, detailing a lineage that would qualify me to join not only the Daughters of the American Revolution but also the Daughters of the Confederacy.  This was not how I pictured my future and I told my father, none too politely, to forget it.

Yet somehow this document survived – I found it among the other papers in the Pile. Labeled D.A.R. ANCESTRAL CHART, it diagrams a branch of my father’s family, starting with his name, Richard Griffin Banks, and working backwards in time through a Major Edwin Banks and a Dr. Richard G. Banks. 

This wasn’t the kind of rabbit hole I had any intention of going down.  Until for some reason it was.  Richard Griffin Banks is an unusual name.  Maybe I wasn’t ready to track my father on a genealogy website, but why not just Google him and see what I found?  Several hours later I was following the Internet trail of a Confederate Army Surgeon named Richard Griffin Banks.  Could this be my father’s great-grandfather,  the Dr. Richard G. Banks from the Ancestral Chart?

As my morning slipped away, I pursued Dr. Banks through 38 entries in my search results.  I learned that he was a trustee of a public school in Hampton.  I learned that at one point he became embroiled in a dispute involving a school budget which caused him to be assaulted with “horse whip and pistol” by C.J.D. Pryor, a teacher at the school. 

At that point I clawed my way out of the ancestry rabbit hole for the time being – but not before taking note of a line in the Richard Griffin Banks entry on the “Deceased Banks . . .” website:  “Unclear why he was born before the marriage date of parents.”  

What started as an idle pastime – Googling my father’s name – produced several surprises. It was of no particular consequence to learn that my great-great-grandfather may have been born out of wedlock.  But I was shocked to come across the information that he had owned 7 slaves.  It wasn’t surprising that my planter ancestors would have been slaveholders, but this great-great-grandfather was a doctor.  I didn’t know — though I have since learned — that households owning small numbers of slaves were not unusual; nearly half of the Southerners who owned slaves held fewer than five.

According to a website compiling “All Deceased Banks & Bankses Persons of European Origin in the U.S. . .” Dr. Banks’ Hampton, Virginia, house was burned down during the Civil War and the family was forced to flee, saving only a pair of silver candlesticks.  This colorful detail comes from the records of a Mrs. James Banks and may or may not be apocryphal. (And if it IS true, what became of those candlesticks?)

I take note of the qualifying “of European Origin” in the webpage title.  In the 1840 census, Dr. Banks’s household consisted of “1 white male, 1 white female, and 7 slaves.”  In 1840, enslaved men and women were not listed by surname. But if they were eventually assigned the last name of Banks, as was common, it must have seemed important to the compiler of the genealogy to exclude them from the white Bankses.  


Read more about Ann's Confederates In My Closet on her website. 

What Will be the Terms of Racial Forgiveness in America?

Memorial "Dedicated to those known and unknown who lost their lives in the Elaine Massacre" of September 30-October 7, 1919. Dedicated September 29, 2019.


Dietrich Bonhoeffer, the German martyr and theologian who fought against Hitler and the Nazis in his native country during the 1930s and early 1940s, defined "cheap grace" as grace without cost, as "grace we bestow on ourselves." I hear many whites today defend themselves against the fearsome appellation of "racist" by announcing that they would never think of practicing the kind of behavior shown in recent years across this country by white nationalists.  Having taken that position to convince themselves they are not racists, they nonetheless keep only white friends, attend virtually all-white religious institutions and social clubs, and live in all-white neighborhoods, while their business partners, associates, and contacts are routinely white. When we whites vindicate ourselves from racism this way, we commit the equivalent of bestowing exoneration upon ourselves.

The hurdle is much higher if we whites are to have any right eventually to receive actual forgiveness from African-Americans for our white racism, both inherited and practiced. So what are the criteria to begin the process - merely to begin the process - that can help lead us whites toward a realistic chance for such forgiveness and, ultimately, toward a path of racial reconciliation with our Black brothers and sisters?

First, we must thoroughly acknowledge, through demonstrable and sincere acceptance, the unvarnished racial prejudice and history of 400 years in the United States - without any deflective excuse from filiopietism (excessive veneration of ancestors, the past, and tradition), too often employed to ameliorate the effects of those 400 years.

We whites must, by unfettered actions and resolute ideology, accept the truth of our illegitimate and evil white American domination of African-Americans, as manifested in our "damaged heritage" - that essence of American white racism, consisting of evidence, passed on from generation to generation, of ingrained, prejudicial customs and traditions, sometimes codified into law, and historically combined with not infrequent, gratuitous, and often severe violence and repression perpetrated by American whites against African-Americans. We whites know what has happened through our racial subjugation of African-Americans, and we have no defensible or legitimate reason whatsoever to excuse it by benefit of prideful mythologies and endless genealogies.

In a related thought, the philosopher Soren Kierkegaard wrote 170 years ago that one generation does not learn anything "genuinely human" from a past generation; in other words, we have to learn anew for ourselves those qualities which constitute the "genuinely human." Yet the qualities that can lead us whites to the "genuinely human," and so obviate our adherence to racism, are not the ones we have normally employed in our treatment of Blacks - rather, we have habitually relied on customs, traditions, skin color, accents, and history. But the "genuinely human" is deeper, more fundamental, instinctual: it wills a connection between us (Black and white) to understand, to empathize, to reconcile, to love, to co-inhere, to step into another's shoes and be that person. There can be no reservation about the need for whites to take this additional and highly significant step.

Still, why does any white in America need to be forgiven for past racism?


Because we carry a self-destructive legacy we routinely do not even recognize, derived from white privilege and pure white domination. Since this legacy has been with us so long, and has been so thoroughly ours, most white Americans cannot perceive or appreciate its corrosive effects or consequences for others and ourselves.  While it may not have been apparent, whites have always known, but have been unable to admit, that in truth we can only be cleansed of that legacy and its evil nature by those we made our victims: the Black brothers and sisters among us. We simply do not have the resources to truly forgive ourselves for the accumulation of such evil, displayed through the white privilege and pure white domination we foisted upon African-Americans for generations.

Notwithstanding the indispensability of this cleansing of racial legacy, the steps called for here must be unilateral steps taken by whites, without expectation of anything immediate in return, solely to demonstrate a sincere and personal mission to empathize, to reconcile, to love, and to reach out to those we have egregiously and continuously harmed in body, spirit, and mind as an inhuman cudgel of national white policy and practice.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179852 https://historynewsnetwork.org/article/179852 0
60 Years Later: The Enduring Legacy of the Bay of Pigs Fiasco

 

 

 

On April 17, 1961, a CIA-trained force of fourteen hundred Cuban exiles was captured or killed within seventy-two hours of landing at the Bay of Pigs. In the aftermath of this fiasco, critics often asked how someone as smart as John F. Kennedy could have approved what some have described as the “perfect failure.” But there was a certain inevitability about the entire Bay of Pigs operation. Kennedy hoped to deliver on one of the key promises of his presidential campaign – to remove the cancerous communist growth 90 miles from Key West. Kennedy was determined to reverse Dwight Eisenhower’s “lethargic” foreign policy and saw a chance to do so within three months of his inauguration. A successful overthrow of Castro would have been a signal that American complacency had been replaced with a renewed “vigor,” a favorite term of the New Frontier. Toppling Castro would fulfill Kennedy’s inaugural pledge to “pay any price, bear any burden . . . support any friend, oppose any foe to assure the survival and the success of liberty.” More directly, it would fulfill his promise to “let every other power know that this hemisphere intends to remain the master of its own house.”

There were several consequences stemming from Kennedy’s failure at the Bay of Pigs, some significant, others less so. Allen Dulles was removed as CIA Director seven months after the failure of Operation Zapata. Kennedy told Dulles, “Under a parliamentary system of government it is I who would be leaving office . . . but under our system it is you who must go.” While it has become one of the main talking points of the post-Bay of Pigs, pro-Kennedy narrative, it is nonetheless true that Kennedy became more suspicious of expert advice, including from the military and the intelligence community. Kennedy’s speechwriter and alter ego Ted Sorensen recalled Kennedy telling him, “I got where I am by not trusting experts. But this time I put all my faith in the experts and look what happened.”

Another repercussion of the Bay of Pigs turned out to be a boon for future historians: President Kennedy secretly installed a tape-recording system in the White House to make sure that he, and he alone, would have important discussions “on the record.” Some advisors who favored the invasion claimed in off-the-record discussions with reporters that they had opposed it, and this duplicity irked Kennedy. He apparently intended to use the recordings to write a memoir someday.

But the most important consequence of the failure at the Bay of Pigs was Kennedy’s decision to intensify covert efforts to topple the Castro regime. Operating under the codename “Operation Mongoose,” the President placed his brother, Attorney General Robert Kennedy, in charge of the effort, straining the concept of plausible deniability to the breaking point. Mongoose was designed, according to Robert Kennedy, to “stir things up” with “espionage, sabotage, general disorder.” But it also involved eliminating Castro by any means necessary: at least eight attempts were made on Castro’s life, with the CIA enlisting the help of American organized crime to do its bidding.

Attorney General Robert Kennedy viewed the Bay of Pigs as an “insult that had to be redressed” and he pressured a sclerotic bureaucracy to ensure that Mongoose received all the funding it needed to carry out its campaign to topple the Castro government. The assassination element of the campaign saw the CIA develop a variety of means to eliminate Castro including various poisons and exploding seashells designed to lure the curious scuba diving dictator to his death.

Many veterans of the Second World War considered assassination to be a legitimate weapon. The United States military had targeted the commander of the Imperial Japanese Navy, Admiral Isoroku Yamamoto, when decoded intercepts revealed his flight plans, while the British had trained the assassins of the “Butcher of Prague,” SS-Gruppenfuhrer Reinhard Heydrich. The Cold War’s ever-present threat of “mutual assured destruction” lent further credence to the idea that assassination was a legitimate “tool” in the nation’s arsenal.                              

Operation Mongoose continued for the entirety of the Kennedy presidency, despite Kennedy’s “no invasion” pledge to Nikita Khrushchev during the missile crisis of October 1962.  Mongoose became one of the largest covert operations in the CIA’s history, involving some 400 agents and an annual budget of over $50 million. Kennedy’s successor, Lyndon Johnson, shut Mongoose down in April 1964, later observing that the United States had operated a Murder, Inc., in the Caribbean.                                                                                                  

Operation Mongoose was publicly revealed in 1975 by a Senate committee chaired by Idaho Senator Frank Church that examined abuses of power by the CIA and the FBI. Of the one hundred and eighty-nine million Americans alive in November 1963, only a handful of high-ranking U.S. government officials were aware that assassination had been adopted as a tool of American foreign policy by Dwight Eisenhower and John F. Kennedy. These 1975 revelations would fuel the already flourishing Kennedy assassination conspiracy complex, providing endless and fruitless leads for those intent on proving that Lee Harvey Oswald did not act alone. Sadly, this proved to be one of the most enduring legacies of the “perfect failure” at the Bay of Pigs.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179856 https://historynewsnetwork.org/article/179856 0
Holocaust Remembrance 80 Years After the Beginning of Hitler's Campaign of Genocide

Ruins of the Great Choral Synagogue of Riga, Latvia, which was burned by Nazis with Jewish victims inside on July 4, 1941.

 

 

We’ve got a lot on our collective plates and minds, coping with a pandemic and the frustrating partisan politics that continue to stall progress.

 

It would be natural for us at this time to embrace the rites of spring and avoid the haunting historical memories and anniversaries of the Holocaust atrocities that began 80 years ago in Europe, and which continued beyond the end of World War II in 1945.

 

Even as we are solemnly reminded each April to “Never Forget.”

 

But my purpose and mission are to unearth these horrors anew. Tragically, and inexcusably, most Americans remain ignorant of both the locations of these terrible crimes and the men, among the worst killers of the Holocaust, responsible for them. Horrifyingly, recent surveys in this country indicate that over 40% of Americans, and 66% of our millennials, cannot even say what Auschwitz was. In 2021, how is this possible or acceptable?

 

On June 22, 1941, Hitler unleashed his armies on the Soviet Union, thereby forever altering the nature of both World War II and history itself. In the remainder of that year, the Holocaust would claim almost 500,000 victims, innocent people shot to death by Nazi Einsatzgruppen killing squads as the Wehrmacht’s four invading armies raced northeast into the Baltics, eastward through Belarus towards Moscow, southeast into Ukraine, and farther southeasterly towards the Crimea and the Caucasus.

 

The Nazis were quick to reach Lithuania. From June 25-29, 5,000 people were murdered at Kaunas, in what would be among the first of more than 250 such mass executions in Lithuania. At the Ninth Fort, also in Kaunas, on Nov. 25 and Nov. 29, Karl Jager ordered the shooting deaths of almost 5,000 more Jews. Arguably the most horrific killing site in the country was in the Ponary Forest (now Paneriai) outside the capital of Vilnius, where from July 1941 to August 1944 some 100,000 people, including over 70,000 Jews, mostly from the city, were murdered.

 

Nazi forces entered Riga, Latvia, on July 1, and within three weeks over 6,000 Jews had been killed in targeted actions against them. On July 4, the Great Choral Synagogue in Riga was burned to the ground with Jews locked inside. Herberts Cukurs, “The Hangman of Riga,” and Viktors Arajs organized collaborationist killing squads who helped Friedrich Jeckeln murder over 25,000 Jews on Nov. 25 and Dec. 8 at the Rumbula Forest outside Riga. Thousands of Jews were similarly murdered at Skede and Liepaja, the most notorious killing sites in Latvia outside Riga.

 

The Nazi massacre of 33,771 Jews in the ravine at Babi Yar, outside Kiev in Ukraine, on Sept. 29-30, 1941, was the worst two-day killing spree of the war. Overall, more than 100,000 Jews would be killed there under the orders of Kurt Eberhard, Paul Blobel, and Otto Rasch. The murders represented one of the largest mass killing actions of the early months of the invasion of the Soviet Union.

 

Only the massacres carried out by Romanian forces against the Jews of the Black Sea port of Odessa would numerically surpass the Babi Yar killings. From Oct. 22-24, 1941, almost 35,000 Jews would be shot or burned alive in locked buildings. Over 100,000 people would be annihilated in the area through the early winter months of 1942 under the orders of the Romanians Marshal Ion Antonescu and Gheorghe Alexianu, both of whom would later be executed as war criminals.

 

These anniversaries serve as grim reminders of the early months of the Holocaust against the peoples of eastern Europe. While many Americans are reminded of the Japanese attack at Pearl Harbor on Dec. 7, 1941, most have no clue that in Nazi-occupied Poland, on Dec. 8, the Germans began operating the first of six death camps, at Chelmno, where some 360,000 Jews from the Lodz ghetto would be taken to be killed in mobile gas vans and buried deep within a forest setting. The first phase of killing at Chelmno would last until April 11, 1943, coinciding with the deadliest phase of the Holocaust. The camp would resume its operations once more from June 1944-January 1945.

 

As awful as these murders were in 1941, the worst was yet to come. The following year would mark the zenith in frenzied Nazi killing in the death camps at Belzec, Sobibor, Treblinka, Majdanek and Auschwitz-Birkenau (which was already killing people in 1941). The 80th anniversary of the Final Solution at those sites will be solemnly remembered next year.

 

The war and mass murders inside the Baltics and Soviet Union 80 years ago may seem to be only a distant memory today within our country. The Nazis' premeditated attempt to obliterate peoples from the earth is today, eight decades and at least four generations afterwards, increasingly forgotten and ignored. Yet the evils of intolerance, racism, prejudice, and the horrors of ethnic cleansing that combined to produce the destruction of some 11 million people during the Holocaust live on, and even worse, remain pervasive today.

 

The Nazi goal was to eradicate these peoples from the face of the earth and then to remove all traces of the instruments of their destruction: the camps were to be destroyed, the ground plowed over, all previously buried victims exhumed, burned, and reduced to ashes, and no traces or records of the slaughter left behind. In essence, the goal was to leave no memory of the victims for future generations.

 

Those of us alive today have an obligation to remember what happened, as well as the reasons why it happened and was allowed to happen, because we have the responsibility to at least try to prevent it from happening again. The very fact that the dehumanization and destruction of “undesirables” continues in our own time should serve as a gruesome reminder that Nazi ideology is still alive and well, and has been improved upon since 1945.

 

We must see our lives as inextricably linked to both the past and the future, so that all peoples, individually and collectively, may know a world without genocide.

 

As we prepare to enjoy the remainder of our spring and impending summer months, let us pause to remember those in the Eastern Europe of 1941 who, through no fault of their own, were denied the same seasonal joys. Let us commit ourselves to remembering and learning from that difficult year, and to educating ourselves, our children, and successive generations, so that our voices and memory will open new futures of hope, of restraint, and of justice. There is no such thing as a lesser person.

 

 

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179855 https://historynewsnetwork.org/article/179855 0
Senators Who Made an Impact, Despite First Being Appointed (Not Elected)

Harry Byrd, while Governor of Virginia, photographed ca. 1928

 

 

Ronald L. Feinman is the author of Assassinations, Threats, and the American Presidency: From Andrew Jackson to Barack Obama (Rowman & Littlefield Publishers, 2015).  A paperback edition is now available.

 

The US Senate, since the beginning of the 117th Congress this January, has seen a grand total of 1,994 members in its 232-year history.

Among them, 202 Senators have been appointed since the adoption of the 17th Amendment in 1913, which provided for direct popular election of Senators.

Therefore, it is common to think of appointed Senators as just temporary replacements, waiting for the next regularly scheduled election for that Senate seat, or until the next even-year election. This has often been true.

But several have ended up being major historical figures in Senate and political history.

This article is the first of two to examine the historical significance of twelve US Senators who, despite being originally appointed rather than elected, made a difference in American history.

 

Charles McNary (R-Oregon) was appointed in May 1917, and then was elected to the Senate in November 1918, serving until his death in February 1944.  He was chosen by the Oregon Governor for the vacancy due to his support of women’s suffrage and Prohibition, two policies that were established by constitutional amendments ratified before the 1920 national election.  He was Chair of the Senate Agriculture Committee from 1926-1933, and held the position of Senate Minority Leader during Franklin D. Roosevelt’s New Deal from 1933 until 1944, longer than any Republican has held that post. 

He was perceived as a “progressive” Republican who supported much of the New Deal and, as World War II came closer, defense measures, including Selective Service military conscription in 1940 and the Lend-Lease Act in 1941. A westerner, he supported the development of hydroelectric power, including the Grand Coulee and Bonneville Dams, as public works projects.  He was the primary promoter of the proposed McNary-Haugen Farm Relief Bill, twice vetoed by Republican President Calvin Coolidge in the 1920s, which might have staved off or alleviated the effects of the Depression on agriculture.  McNary was the Vice Presidential running mate of Wendell Willkie in 1940. In an odd footnote, had the duo been elected over FDR and Henry Wallace, they might have become the first president and vice president to both die in office, as McNary died of a brain tumor in February 1944, and Willkie of a heart attack in October 1944.  My book, Twilight of Progressivism: The Western Republican Senators and the New Deal (Johns Hopkins University Press, 1981), treats McNary as a leading figure in that group, which cooperated with FDR on many New Deal initiatives.

 

Carter Glass (D-Virginia) was appointed in November 1919, and then was elected to the Senate in November 1920, serving until his death in May 1946.  Glass had earlier served in the House of Representatives from 1902-1918, chairing the House Banking Committee from 1913-1918, and was appointed by President Woodrow Wilson for 14 months as Secretary of the Treasury from December 1918 until his appointment to the Senate. 

He served as Senate Appropriations Committee Chairman from 1933 until his death in 1946, and was also President Pro Tempore of the US Senate from 1941-1945.  He also helped to establish the Federal Reserve Banking System under Wilson, and was the author of the Glass-Steagall Act that set up the Federal Deposit Insurance Corporation under FDR’s New Deal in 1933.  However, as a staunch supporter of states’ rights, he opposed much of the New Deal, supported Jim Crow segregation laws, and advocated the disenfranchisement of African Americans in his state and nationally.

 

Gerald Nye (R-North Dakota) was appointed to the Senate in November 1925, and was elected to three full terms before he was defeated in 1944.  He was termed a “progressive” Republican, and my book on the subject included an interview with Nye conducted in March 1971, his last interview with a historian before his death a few months later.

Nye became noted for his investigation of the Teapot Dome scandal and for helping to create Grand Teton National Park.  He supported much of the New Deal until later breaking with the President, but became most controversial as a leading isolationist spokesman. This included heading the Nye Committee in 1934-1935, which investigated the munitions industry, and promoting the view that America could have avoided entrance into World War I. He was a leading advocate of the neutrality laws passed by Congress in the mid-1930s.  Nye was accusatory toward Jews in the film industry, leading to charges of antisemitism, and was a major critic of both Great Britain and the 1940 Republican Presidential nominee Wendell Willkie.  He was also an active speaker on radio and at rallies of the America First Committee in 1940-1941, the leading organization attempting to keep America out of World War II. Nye told me, thirty years after Pearl Harbor, that he believed Roosevelt had plotted to get America into that war.  Nye was even ridiculed by Dr. Seuss for his isolationist views and his vehement rhetoric and oratorical manner.

 

Arthur Vandenberg (R-Michigan) was appointed to the Senate in March 1928, after a career in journalism as an editor and publisher in Grand Rapids, and was then elected to four terms, dying in office in April 1951.  Originally supportive of President Herbert Hoover, he would support much of the early New Deal of FDR, but then became part of the conservative coalition that opposed the 1937 Supreme Court “packing” plan and the pro-labor Wagner Act, and was an isolationist in foreign policy until after the Japanese attack on Pearl Harbor in December 1941.

His position on foreign policy changed radically as a result, and he became an internationalist, announcing his widely hailed transformation in a Senate speech in January 1945.  He became a promoter of the United Nations, and as chair of the Senate Foreign Relations Committee from 1947-1949 cooperated in bipartisan fashion with President Harry Truman on the Truman Doctrine, the Marshall Plan, and the formation of the North Atlantic Treaty Organization.  Vandenberg was President Pro Tempore of the Senate during the 80th Congress (1947-1949), placing him two heartbeats away from the Presidency, and was a “favorite son” candidate for the White House in 1940 and 1948.  The Senate Reception Room displays a portrait of Vandenberg, one of a very select group of seven legislators rated by the Senate as the most prominent in its history.

 

Harry F. Byrd, Sr. (D-Virginia) was appointed to the Senate in 1933, and served 32 years.  Previously, he had been Virginia Governor from 1926-1930 after a career as a newspaper publisher and two stints in the Virginia State Senate.  His state political machine dominated Virginia politics for a half century, enforcing literacy tests and poll taxes to deny the franchise to African Americans. He became a leader in the conservative coalition against the New Deal, and, both as Governor and in the Senate, opposed any racial desegregation, advocating “massive resistance” to the 1954 Supreme Court decision in Brown v. Board of Education.

In foreign policy, however, Byrd was an internationalist who supported FDR, and after World War II he was a leader on the Senate Armed Services Committee. He later became Chairman of the Senate Finance Committee.  Byrd refused to endorse President Truman in 1948 or Democratic nominee Adlai Stevenson in 1952, and was always a thorn in the side of Dwight D. Eisenhower—refusing to support the Interstate Highway System—and of Lyndon B. Johnson—opposing the Civil Rights Act of 1964.  Byrd received 15 electoral votes in 1960, from Mississippi, Alabama, and Oklahoma, in the election that made John F. Kennedy President.  His greatest legacy was the creation of the Shenandoah National Park, Skyline Drive, the Blue Ridge Parkway, and the Virginia state park system.

 

Ralph Flanders (R-Vermont) was appointed to the Senate in November 1946, and then was elected to two full terms, serving until the first days of 1959.  He had a career as a mechanical engineer and industrialist, and was President of the Boston Federal Reserve Bank for two years before his Senate career. He served on the Joint Economic Committee, an investigatory and advisory body, and on the Finance and Armed Services Committees.  He promoted public housing, higher education spending, and the Civil Rights Act of 1957 under President Dwight D. Eisenhower.

He promoted arms control in foreign policy, and came to national attention as the major Senate critic of Republican Senator Joseph McCarthy of Wisconsin, whose Red Scare tactics from 1950-1954 Flanders saw as reckless rhetoric and behavior.  An early and strong critic of McCarthy, Flanders said on March 9, 1954 that McCarthy was misdirecting America’s efforts at fighting communism overseas, and causing a loss of respect for America in the world community.  His Senate address was a scathing criticism of McCarthy, hailed by many, but attacked by critics as supporting the Communist cause.  Flanders introduced a resolution on June 11, 1954, condemning the conduct of McCarthy and calling for his censure for flagrant abuse of power. The US Senate censured McCarthy on December 2, 1954. Republicans split evenly on the motion, but the total vote was a landslide of 67-22, and McCarthy never recovered from the censure.  Flanders became a national hero, and a profile in courage to many millions of Americans.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/blog/154488 https://historynewsnetwork.org/blog/154488 0
Political Precedent for the Trump Cult of Personality

 

 

The term “Trumpism,” alluding to a cult of personality surrounding the 45th president, has penetrated the American vernacular. So much about Donald Trump and his presidency has been unprecedented, but in this case the phenomenon is not new: a cult of personality also engulfed Ronald Reagan. Although these men are very different from one another in character, their cults of personality share similar qualities. Neither man was always truthful, both made serious mistakes, and both appeals were tinged with racism.

A political cult of personality means strong admiration of and devotion to a leader, who frequently spreads his fame widely through mass media. Followers become enamored to the point of idolizing the leader while overlooking or ignoring his shortcomings. This characterizes the public life of both Trump and Reagan.

Familiar to millions of Americans through his movie roles and his hosting of the weekly General Electric Theater on Sunday night television, Ronald Reagan began his political career on October 27, 1964, a week before the election, with a nationally televised speech on behalf of Republican presidential candidate Barry Goldwater. The speech was filled with false claims about the overbearing U.S. government and unverified anecdotes, all in support of Reagan’s view that government needed to get out of the way of the economic freedom of the American people. Reagan falsely claimed that farmers who did not cooperate with federal government programs could be imprisoned, and that the Federal Reserve Board planned inflation.

Reagan also said, “We were told four years ago that seventeen million people went to bed hungry each night. Well, that was probably true. They were all on a diet.” Because of the cult of personality he was building, this particularly callous and inaccurate remark was overlooked. When Reagan spoke, more than 36 million Americans were living in poverty, nearly one-fifth of the country. Following the formula of that speech, Reagan won the California governorship two years later by a landslide and would go on to win the presidency twice by equally impressive margins. The Reagan cult of personality enabled him to remain popular with his followers even when he violated his own conservative principles. Throughout his political career, Reagan railed against big government deficit spending, but when the national debt rose by 189 percent on his watch, he suffered no political consequences. When Reagan admitted to misleading Americans during the Iran-Contra scandal, his popularity dipped temporarily, but bounced back by the end of his presidency.

Donald Trump, like Reagan, gained fame with the American public through show business. Trump starred in a reality television show called The Apprentice. Many Americans assumed that Trump was the “boss” starring in his own program, but in reality Trump was an actor employed by a television production company, just as Reagan was an actor employed by the General Electric Company. The Apprentice gave Trump a favorable celebrity status leading toward a political cult of personality. While Reagan launched his political career with a televised speech, Trump began his with a nationally televised accusation that Barack Obama should not be president because he was not a natural-born U.S. citizen. With no proof other than his words, Trump claimed to have investigators in Hawaii uncovering evidence that Obama was not born there as his birth certificate indicated. “They couldn’t believe what they’re finding,” Trump asserted. Several years later, shortly before winning the presidency, Trump finally acknowledged that Obama is a U.S. citizen.

When Trump announced his presidential candidacy, he declared, “Sadly the American dream is dead.” The campaign slogan became “Make America Great Again.” That is not unlike Reagan’s decrying big government for destroying our freedom, and the Reagan 1984 campaign theme “Morning in America” is not very different in meaning from the Trump 2016 slogan. Like Reagan, Trump deviated from facts to support political points. Examples are legion, from Trump’s assertion that he saw thousands of Muslims on 9/11 cheering the collapse of the twin towers to his claim that Obamacare was imploding. One difference, however, is that Reagan’s factual deviations usually served to buttress his political points, while Trump’s often served to boost himself, from the false claim to have graduated at the top of his class from the Wharton School at the University of Pennsylvania to the boast of being a “very stable genius.” That arrogance was not in Reagan’s character.

Trump’s personality cult protected him to some extent, as Reagan’s did him. Trump’s popularity was never as high as Reagan’s, but his approval ratings always remained in the middle 40s, never dropping precipitously as in the cases of Nixon and Carter, for example. That is despite numerous scandals, including the Russia investigation, and a poorly handled pandemic that killed hundreds of thousands of Americans. In the end, 74 million Americans voted for Trump. The cult of personality remained intact.

Another and more sinister similarity in the Reagan and Trump cults of personality is white racism. Both men saw an opportunity to advance their political careers by appealing to white voters in a racially prejudicial way. In his 1966 campaign for governor, Reagan appealed to white voters disgusted with the “beatniks, radicals, and filthy speech advocates,” as he termed them. In his 1976 campaign for the Republican presidential nomination, Reagan frequently told the “welfare queen” story about a woman on welfare who allegedly defrauded the U.S. Government of $150,000. The story was significantly embellished, but was in keeping with Reagan’s political views; he once called welfare recipients a “faceless mass waiting for a handout.” He did not mention race, but the implication that the welfare queen was black was abundantly clear. In his 1980 presidential campaign, Reagan, after winning the nomination, traveled to Mississippi to deliver a speech glorifying states’ rights to a white audience at the Neshoba County Fair. States’ rights has long been the cry of white Southerners fighting civil rights, and Neshoba County is the site where three civil rights workers were infamously killed in 1964.

Donald Trump’s appeal to white racism has been more blatant. In August 2017, when a counter-protester was killed at the Unite the Right rally in Charlottesville, Trump said that “you also had people that were very fine people on both sides.” One side had neo-Nazis, Ku Klux Klansmen, and Alt-Right people. In the last presidential campaign, Trump appealed to racism in numerous ways in his attempt to win re-election. For example, he condemned NASCAR for banning the Confederate flag. He condemned Black Lives Matter and predicted that the “beautiful suburbs” would be destroyed by low-income housing if Biden won. He blamed big city Democrats and their black voters for stealing the election, ignoring the fact that he lost battleground states because too many whites in the suburbs deserted him.

Two recent presidents have had cults of personality, although such cults are antithetical to democracy. Those cults enabled both men to win their party’s nomination and the general election. They gave both the luxury of deviating from truthfulness, enabling Reagan to survive a severe scandal and Trump to be incompetent and scandalous while maintaining a significant base of popularity. This also indicates something ominous about America. If a candidate with a cult of personality develops a large number of devoted followers who believe he or she can do no wrong, it could make white supremacy or other malignant elements of politics seem permissible, with unknown consequences for democracy.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179849 https://historynewsnetwork.org/article/179849 0
Life During Wartime 532

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/blog/154489 https://historynewsnetwork.org/blog/154489 0
Roundup Top Ten for April 9, 2021

The Meaning of the Democrats’ Spending Spree

by Keeanga-Yamahtta Taylor

Joe Biden supported a balanced budget amendment in 1995, ran as the "establishment" candidate in the Democratic primaries, and has been a regular advocate of bipartisanship. So why is his administration proposing the massive American Rescue Plan Act, and showing a willingness to act without securing Republican cooperation? A tour of recent history can explain. 

 

Our Greatest Libraries are Melting Away

by David Farrier

Ice core samples from the Greenland ice sheet are a physical archive of the long sweep of human history, and demonstrate the connections between humanity's past and future. 

 

 

Without Asian American Studies, We Can’t Understand American Racism

by Min Hyoung Song

The establishment of Asian American Studies and ethnic studies programs has been essential to putting Asian American scholars (and scholars of Asian Americans) in position to engage the mass media around events like the Atlanta shootings. As those programs are under fire, it's time to recognize their value. 

 

 

What Manhattan Beach’s Racist Land Grab Really Meant

by Alison Rose Jefferson

Debates over the redress of past racial injustice must acknowledge that some past actions have harmed communities in ways that can't be repaired, including the loss of space for communal leisure and of equal access to everyday pleasures.

 

 

A Poem That Shows How to Remember the Holocaust

by James Loeffler and Leora Bilsky

"Lemkin’s anguished text also explains why the world had already begun to forget the Holocaust. Genocide represents more than a large-scale physical assault on human bodies, he suggests; it is also an attack on the very existence of minority cultures. In a genocide, books are burned and memories are extinguished."

 

 

“Taxpayer Dollars”: The Origins of Austerity’s Racist Catchphrase

by Camille Walsh

The rhetoric of protecting "taxpayer dollars" hinges on a selective interpretation of who pays taxes that reinforces the privilege of affluent whites to have government follow their preferences. 

 

 

Higher Education's Racial Reckoning Reaches Far Beyond Slavery

by Davarian L. Baldwin

American universities have grown in harmony with American racism throughout their history, from building on land appropriated from Native Americans to accommodating Jim Crow to promoting social science theories that justified segregation and directly encouraging gentrification through real estate purchasing. 

 

 

The World the Suez Canal Made

by Aaron Jakes

"The purpose of the Suez Canal, from the perspective of both the Egyptian state and its European investors, was not simply to render the world more interconnected and international transport more efficient, but to extract transit fees from the ships passing through it."

 

 

Restoring the People’s Universities

by Alejandra Marchevsky and Jeanne Theoharis

"We see this trend across the nation: when students of color finally began to gain access to higher education, disinvestment and the shrinking of educational opportunity followed."

 

 

Biden’s Plan for Central America Is a Smokescreen

by Aviva Chomsky

The Biden plan for Central America revives the Cold War formula of business-friendly economic development and militarized security in the name of stopping migration toward the US. This, the author argues, amounts to doubling down on failed policies that have driven migration for decades.

 

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179845 https://historynewsnetwork.org/article/179845 0
Hidden Stories of Jewish Resistance in Poland

 

 

 

In 1959, writing about the Holocaust, scholar Mark Bernard highlighted that Jewish resistance was almost always considered a miracle: ethereal, beyond the scope of research. Still today, this impression generally persists. And yet, Jewish defiance was everywhere during the war, carried out in a multitude of ways, by all types of people.

 

I first encountered this phenomenon several years ago, when I accidentally came across a collection of Yiddish writing by and about young Polish-Jewish women who rebelled against the Nazis. These “ghetto girls” paid off Gestapo guards, hid revolvers in marmalade jars, and built underground bunkers. They flung homemade explosives and blew up German trains. I was stunned. Why had I – a Jewish writer from a survivor family, not to mention a trained historian who held a Ph.D. in feminist art — never heard this side of the story?

 

And so began my research. As I discovered, due to preconceived notions of gender, the girls’ educations, and the lack of evident markers of their Jewishness (i.e., circumcision), women played a critical role in the Jewish underground in Poland. But when I set out to write their story and sought a chronological context, it quickly became apparent that there was none. No comprehensive history of the men in the underground existed either. Sure, excellent academic biographies and case studies of rebellions in particular ghettos and camps had been published, but there were no recent English books that relayed the tale of Jewish resistance in the country as a whole. As much as I was baffled by the ferocious female fighters, I was equally baffled by the entire Jewish effort in Poland, the epicenter of the bloodshed, where 3 million Jews (90% of the pre-war population) were savagely murdered. The truth was, though I’d heard of the Warsaw ghetto uprising, I had no idea what actually happened. I certainly had no idea of the scope of Jewish revolt. 

 

Holocaust scholars have debated what “counts” as an act of Jewish resistance. Many adopt the broadest definition: any action that affirmed the humanity of a Jew; any solitary or collaborative deed that even unintentionally defied Nazi policy or ideology, including simply staying alive. Others feel that too general a definition diminishes those who risked their lives to actively defy a regime, and that there is a distinction between resistance and resilience. The rebellious acts that I discovered among Jewish women and men in Poland, my country of focus, ran the gamut, from those entailing complex planning and elaborate forethought, like setting off large quantities of TNT, to those that were spontaneous and simple, even slapstick-like, involving costumes, dress-up, biting and scratching, wiggling out of Nazis’ arms. Some were one-offs; some were organized movements. For many, the goal was to rescue Jews; for others, to die with dignity and leave a legacy of dignity.

 

As guerrilla fighters, the Polish-Jewish resistance took only a handful of Nazi casualties and achieved a relatively minuscule victory in terms of military success, but the effort was much more significant than I’d known. Over 90 European ghettos had armed Jewish resistance units. In Poland, where many of these were located, the units comprised “ghetto fighters” who used found objects (like pipes), manufactured items (such as homemade explosives), and smuggled-in weapons (including pistols and revolvers) to engage in spontaneous or, more often, organized anti-Nazi assaults. Most of these underground operatives were young, in their twenties and even teens, and had been members of youth movements, which now formed the core structures of resistance cadres. Ghetto fighters were combatants as well as editors of underground bulletins and social activists. The Warsaw Ghetto Uprising, I learned, was youth-driven, and strategically planned over months. Most accounts agree that about 750 young Jews participated. (Roughly 180 of them were women.)  

 

Some Jews fought inside the ghettos, but 30,000 (ten percent of them women) fled their towns and cities and enlisted in forest-based partisan units; many carried out sabotage and intelligence missions. ‘The Avengers,’ a Jewish-led detachment outside Vilnius, blew up German trains, vehicles, bridges, and buildings. They used their bare hands to rip down telephone poles, telegraph wires, and train tracks. Other Polish Jews joined Soviet, Lithuanian, and Polish-run detachments or foreign resistance units; still others worked with the Polish underground, often disguised as non-Jews, concealing their identity even from their fellow rebels.

 

Alongside military-style organizations, Jews organized rescue operations to help fellow Jews escape, hide, or live on the Aryan side as Christians. Vladka Meed, a Jewish woman from Warsaw in her early 20s, printed fake documents, distributed Catholic prayer books, and paid Christian Poles fees for hiding Jews in their homes; she also helped save Jewish children by sneaking them out of the ghetto and placing them with non-Jewish families. In Poland, rescue networks supported roughly 10,000 Jews in hiding in Warsaw alone; they also operated in Krakow. Mordechai Paldiel, the former director of the Righteous Gentiles Department at Yad Vashem, Israel’s largest Holocaust memorial, was troubled that Jewish rescuers never received the same recognition as their Gentile counterparts. In 2017 he authored Saving One’s Own: Jewish Rescuers During the Holocaust, a tome about Jews who organized large-scale rescue efforts across Europe. Poland, he claims, had only a small number of such efforts, but they were still significant.

 

All these accompanied daily acts of defiance: smuggling food across ghetto walls, creating art, playing music, hiding, even humor. Jews resisted morally, spiritually, and culturally, in public and intimate ways, by distributing Jewish books, telling jokes during transports to relieve fear, hugging barrack-mates to keep them warm, writing diary entries, and setting up soup kitchens. Mothers kept their children alive and propagated the next Jewish generation, in and of itself an anti-Nazi act. Jews resisted by escaping or by taking on false Christian identities. Roughly 30,000 Jews survived by dyeing their hair blond, adopting a Polish name and patron saint, curbing their gesticulations and other Jewish-seeming habits, and “passing.”

 

I was fascinated by this widespread resistance effort, but equally by its absence from current understandings of the war. Of all the legions of Holocaust tales, what had happened to this one?

 

While I researched the lives of Jewish rebels, I simultaneously probed the trajectory of their tales. As I came to find, though there were waves of interest in Jewish defiance over the decades, the resistance narrative was more often silenced for both personal and political reasons that differed across countries and communities. The history of the Jewish underground has generally been suppressed in favor of a “myth of passivity.” Holocaust narratives were shaped by the need to build a new homeland (Israel), the fear of exposing wartime allegiances (Poland), and redefining identity (USA). Early post-war interest in partisans turned into a 1970s focus on “everyday resilience.” A barrage of 1980s Holocaust publications flooded out earlier tales.

 

Many fighters who survived kept their stories hidden. Many women were treated with disbelief; relatives accused others of having fled to fight instead of staying to look after their parents; still others were charged with sleeping their way to safety. Sometimes family members silenced them, as they feared that opening up old wounds would tear them apart. Many hushed their tales due to oppressive survivors’ guilt: they felt that compared to others, they’d “had it easy.”

 

Then, there was coping. Women in particular felt a cosmic responsibility to mother the next generation of Jews. They wanted to create a normal life for their children and for themselves. They did not want to be “professional survivors.” Like so many refugees, they attempted to conceal their pasts and start afresh. The fighters’ formidable tales were buried with their traumas, but both stayed close to the surface, waiting to burst out.

 

The Warsaw Ghetto Uprising began in April 1943, on the first night of Passover. In her groundbreaking book, We Remember with Reverence and Love: American Jews and the Myth of Silence After the Holocaust, 1945–1962, Hasia Diner explains that Passover, a holiday where Jews celebrate liberty, became the time around which American Jews commemorated the Holocaust. However, the uprising element was forgotten. When my book comes out this April, I hope to bring the revolt to the fore once again. I cannot think of Polish Jewry without it; theirs is a story of persistent resistance and profound courage.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179779 https://historynewsnetwork.org/article/179779 0
What Comes Next?

Poster images by Amanda Phingbodhipakkiya, from I Still Believe in Our City, a recent public art campaign for the New York City Commission on Human Rights. 

 

 

 

“Until we address the discrimination and harassment against Asian Americans today, they will become deeply entrenched in the fabric of our nation, causing unimaginable harm and suffering and taking decades to undo,” Manjusha P. Kulkarni, Executive Director of the Asian Pacific Policy & Planning Council, explained in a recent written statement submitted to the House Subcommittee on the Constitution, Civil Rights, and Civil Liberties. On March 18, activists, scholars, and artists across the Asian American and Pacific Islander community provided testimony on the increase in anti-Asian hate speech and violence since March of 2020. These attacks aligned with former president Donald Trump’s use of phrases like “Chinese virus” on Twitter and in public statements. Stop AAPI Hate—a coalition of activists and scholars maintaining a database that contains nearly 3,800 documented incidents of verbal and physical abuse—has continued the legacy of Asian Americans pursuing protection by presenting evidence of racism.

 

But what comes next?

 

Forty-two years ago, Asian Americans spoke to legislators in DC as consultants on civil rights issues still faced by the AAPI community long after the legislative milestones of the 1960s. On May 8 and 9, 1979, the U.S. Commission on Civil Rights held its first hearing on specific rights violations encountered by Asian Americans. It coincided with increasing representation of Asian Americans in politics and Congress’s passage of Public Law 95-419, which designated the week of May 4th as Asian American Pacific Islander Week. Just a few days earlier, President Jimmy Carter declared, “We have succeeded in removing the barriers [for Asian Americans] to full participation in American life.” Refugees from Southeast Asia fleeing the wreckage of the Vietnam War were also resettling in the US, adding diversity to the AAPI community, and, Carter declared admiringly, "their successful integration into American society and their positive and active participation in our national life demonstrates the soundness of America’s policy of continued openness to peoples from Asia and the Pacific." Carter’s praise for the AAPI community and their “enormous contributions to our science, arts, industry, government, and commerce” bolstered the idea of Asian Americans as the model minority who had overcome adversity to achieve the American Dream.

 

However, for those who appeared before the Commission, Carter’s comments did more harm than good. He described Asian Americans as economic drivers for the United States whose rewards were acceptance and economic comfort—an idea that glazed over the challenges they faced. Dr. Ling-Chi Wang, then an assistant professor in Asian American Studies at the University of California at Berkeley, provided a historical overview of Asian American experiences that clashed with Carter’s simplistic characterizations. “I just want to add,” Wang stated during his testimony, “that current popular beliefs, held most firmly by government agencies—that Asians have no problems, that Asians have made it, that Asians take care of their own problems, and that Asians are too proud to seek government assistance—are but persistent manifestations of the highly institutionalized government attitude toward Asian Americans of benign neglect.”

 

This neglect stemmed from a history of exploitation by the government and white employers. “Almost without exception,” Wang continued, “each economic crisis was accompanied by an anti-Asian movement… each Asian group was imported to meet a concrete demand for cheap labor, and each was subsequently excluded by law when each was no longer perceived to be needed or when it was no longer politically and economically expedient to continue its utilization.” Racist policies excluded Asian immigrants, making them expendable, rendering them as outsiders, and making them easy to scapegoat in different crises. As Wang charged, the model minority myth “absolves the government of any responsibility of protecting the civil rights of Asian Americans and assigns Asian Americans to a permanent status of being neglected.”

 

Others presented evidence of the damage from more than a century of anti-Asian sentiment. Challenges faced by Asian Americans ranged from limited access to health services, lack of bilingual educational resources, and poverty—social problems also encountered by other communities. Participants in the consultation offered solutions such as promoting more representation in the federal government, directing more money to community grants, and developing a set of criteria for identifying civil rights violations specific to Asian Americans. There was hope—particularly after the movement for reparations for Japanese Americans who survived incarceration during World War II—that with more attention, Asian American civil rights would progress.

 

But in 1986, the Commission on Civil Rights issued a disturbing report. “Recent Activities Against Citizens and Residents of Asian Descent” noted an uptick in attacks on Asian Americans. Economic competition from Japan spurred a reinvigorated anti-Japanese movement in the US during the early 1980s. In 1982, a Japanese American state legislator in California reported that someone had spray-painted the word “Jap” on his garage door while a group called the White American Resistance Movement distributed anti-Asian pamphlets throughout San Francisco. The report connected these incidents to the death of Vincent Chin, a Chinese American draftsman who was murdered by two white men who had recently been laid off from an auto plant in Detroit. They blamed Chin for the layoffs—thinking he was Japanese—and brutally beat him.

 

The recent deadly, racially motivated attacks on Asian American women in the Atlanta metro area have brought attention to the historic trend highlighted by Wang during his 1979 testimony. In May of 2020, the Commission on Civil Rights promised to prosecute civil rights violations and hold public hearings on anti-Asian hate, but these initiatives have largely languished despite calls from the AAPI community for legislators to take verbal and physical abuse seriously.

 

The promises made by the Commission in 1979 did not save Vincent Chin. And now—as Wang predicted—the COVID-19 pandemic is the latest in a long list of crises that produced violent anti-Asian attacks. The Chinese Exclusion Act and other historic moments are crucial to understanding where the nation is today, but historians have more contemporary examples to draw from. The pleas for help in 1979 before the Commission on Civil Rights largely went unanswered. Today, holding public officials accountable for the promises they will undoubtedly make to the AAPI community after recent hearings depends upon forcing Americans to reckon with a cycle of perpetual scapegoating and the racist language that makes it possible.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179780 https://historynewsnetwork.org/article/179780 0
Pamela, Randolph and Winston: The Wartime Discord of the Churchills

Pamela Digby, photographed in 1938, before her brief courtship and tumultuous marriage (1939-1945) with Randolph Churchill. 

 

 

 

In the spring of 1941, Averell Harriman, Roosevelt’s special envoy to Britain, started an affair. That both he and the woman in question were married was not a huge problem; there were different rules in wartime. What was more complicated was the identity of the woman’s husband: Randolph Churchill, the British prime minister’s adored, spoiled, turbulent son.

 

Randolph had started the Second World War desperate for two things. He wanted to be wherever the fighting was fiercest, and was anxious to find a wife who would bear his child. Both were ways of pleasing his father, who placed an outsized premium on physical bravery, and was obsessed with the idea of building a powerful political dynasty. Randolph, he felt, had a duty to make sure the line was continued.

 

Randolph’s first ambition was stymied by the artificial calm that followed Hitler’s invasion of Poland, as well as his father’s reluctance to get him reposted. He was more successful in achieving the second. In the course of a fortnight he proposed to, and was rejected by, eight women. Then he met Pamela Digby.

 

Pamela had wide, deep-blue eyes, pink flushed cheeks and auburn hair streaked with a patch of white. Some saw her as “a red headed bouncing little thing regarded as a joke by her contemporaries,” but beneath the plumpness and a forced air of jollity was an adamantine desire to escape her dull, provincial life in Dorset.

 

Winston embraced her immediately. Pamela was soon an essential part of the Churchill family, especially once Winston became prime minister and they moved to Downing Street. She had an uncanny instinct for sensing what people needed, and then giving it to them almost before they had realized themselves. She was a source of support for an embattled Winston, and a much-needed confidante for his wife, the lonely Clementine. In October 1940 she gave birth to a boy, named, inevitably, Winston.

 

The only problem was her husband. Randolph was charming, clever, generous and funny. Most of the time. He was also rude, arrogant and incapable of understanding why marriage should stop him sleeping with other women. All of these qualities were exacerbated when he drank, which he did uncontrollably.

 

Bills and arguments mounted up. When Randolph behaved appallingly, or ran up debts he couldn’t cover, it was to his parents that Pamela ran for help. They, increasingly, took her side, which was another source of friction in an already fraught web of relationships.

 

It was after Randolph finally got his posting abroad that the problems really began. In January 1941, his Commando unit set sail for Egypt. Before their ship had even docked on the other side of the Atlantic he’d lost more at the gaming table than he could possibly pay back. Once Pamela had fixed the financial disaster her wayward husband had forced upon her, she deftly, single-mindedly, began to fashion a new, independent life for herself. Before the end of spring she had started sleeping with Harriman.

 

Randolph was incandescent when he discovered his wife’s infidelity. This was largely because he was convinced that his father had at the very least tolerated, and at worst actually encouraged, an affair that was being conducted beneath his nose. After all, the situation presented a clear political advantage to Winston. And so although Pamela did not create the tensions that run between father and son – they had a long history of their own – her actions brought matters to a head.

 

Winston and Randolph’s bond had always had an almost romantic intensity. Winston was obsessed with his son, claiming that he would not be able to continue leading the country if anything were to happen to him; Randolph was devoted to his father. They had spent the last decade living in each other’s pockets: drinking, plotting, gambling, talking and quarrelling. But this closeness masked some profound difficulties.

 

Throughout his life Randolph had struggled to find a way of marrying the outsized expectations Winston had thrust onto his shoulders with the need to provide his father with the asphyxiating loyalty he demanded. Every time Randolph tried to fashion an opportunity for himself, or attempted to assert an independent position, he found himself accused of sabotage. He had been Winston’s most passionate defender during his time in the wilderness, an unfailing source of affection and reassurance. And yet when Winston formed his government in 1940, there was no place for Randolph. All of this had lain under the surface for years, now it erupted.

 

Volatile, unable to control their emotions, the two men launched into rows that frightened anybody who witnessed them. Winston became so angry that Clementine feared he would have a heart attack; Randolph stormed out of rooms in tears, swearing that he would never see his father again.

 

Although a fragile peace was restored, it could not last. Randolph was unable to reconcile the deep animal love he bore for his father with what he regarded as Winston’s treachery. Nor could he understand why his parents continued to show Pamela such open affection. Winston reacted violently to his son’s reproaches. Wrapped up in his own consuming sense of destiny, and never able to read what was going on in anybody else’s heart, he did not see that he had done anything wrong. As Pamela moved serenely from one affair to the next, father and son fought, again and again, opening deeper and deeper wounds.

 

Randolph and Pamela’s divorce was confirmed in 1945. Randolph could survive this, but the damage to his relationship with Winston was irreparable. They would never recapture the intimacy they had enjoyed before the war. Randolph had married Pamela to make his father happy, and yet he only succeeded in alienating the man he loved more than anybody else on the planet.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179782 https://historynewsnetwork.org/article/179782 0
Economic Justice and Political Stability Require More Progressive Taxation

Income Tax filers, 1920

 

 

The invasion of the Capitol on January 6th is a sign of deep anger at the course of American life among what is usually called the “white working class.”  Alongside it stands the protest of black America over continuing racism and poverty.  People with little else in common count perceived economic unfairness among their complaints.  What can we as citizens of a democracy do about it?  Significant reforms, such as those usually ascribed to the left in the Biden administration, are going to cost money.  A return to progressive tax rates through meaningful tax reform will be part of the solution.

 

Economists with a sense of history point out that inequalities began to grow around 1980, starting with the Reagan tax cuts.  Emmanuel Saez and Gabriel Zucman of the University of California, Berkeley, have done a service to the republic by methodically tracing what has happened to equality over the last 40 years.  Their book, The Triumph of Injustice (2020), is contentious, but it sets out uncomfortable facts that bear upon a solution and provides a complementary analytical tool.

 

They trace and measure the working, middle, and upper class divisions in American society all the way to the super-rich and the top 400 families.  Fully 50% — half — of the American people are classified as working class, with annual income on average of $18,500.  They earned 10% of national income in 1980; 40 years later, only 12%.  Most gains in the economy due to technological progress and globalization went to the upper 10%.  The next 40% of the people are middle class, earning an average of $75,000.  The last 10% of the people are reckoned as upper class or the rich, earning $220,000 annually.  But they have divisions, too.  The top 1% earn $1.5 million per year.  They earned as much (10% of national income) as the whole working class in 1980; 40 years later, their share had grown to 20% (pp. 3-7).  At the very top are 400 families of the super-rich, including Warren Buffett, who earned $3.2 billion in 2015, and paid taxes of $1.8 million (a rate of 0.055%) (p. 129).  Buffett is honorable in that he openly admits that he should pay a higher rate, which he has famously stated is less than his secretary’s.

 

Reversal of this pattern is absolutely vital to a sense of fairness in America.  We have already had one insurgency.  But won’t the rich, especially the very rich, resist any proposal that increases their taxes?  Money is their property.  A tax is an appropriation of private property for public benefit.  People within democracies are resistant to taxes until convinced of their necessity and justice.  This country began in a tax revolt.  How can we convince the 400, from Jeff Bezos (net worth $179 billion) to Alice Walton ($62 billion), to share?

 

The principle of progressive taxation was established historically.  The Constitution did not originally provide for an income tax, but it distinguished between indirect taxes (like customs duties) and direct ones (like taxes on land).  Customs duties or tariffs were understood as taxes on consumption, which are regressive, but the citizenry in the early republic were so nearly equal — most were owners and cultivators of farms — that the slightly increased price of foreign imports was bearable.  For 100 years the principal revenue of the U.S. federal government was drawn from tariffs.

 

Great national crises have been the settings for the introduction of a progressive income tax.  During the Civil War, the first income tax was introduced — as a direct tax — to meet the threat to the Union.  Its rates were gradually reduced until the 1890s, at the height of the industrial Gilded Age, when the Supreme Court ruled that the government had no right to impose a direct tax.  That defect was removed by the 16th amendment (1913), one of the high achievements of the Progressive Era (1905-15).  A regulated, orderly capitalistic economy was steadily established by the Interstate Commerce Commission, the Sherman and Clayton Anti-Trust Acts, the Federal Reserve System, the Federal Trade Commission, later the Securities and Exchange Commission, and the income tax.

 

Initial rates for the tax were quite modest (7% for the top bracket) but U.S. entry into the Great War increased the top marginal rate to 67% to counter war profiteering.  An estate tax was also established at 10% for the largest bequests, which grew to 20% by the late 1920s.

 

The great expansion of the income tax came with the supreme crises of the Depression and the Second World War.  In the 1930s, with business in ruins for many owners and many workers reduced to poverty, President Franklin D. Roosevelt aimed to confiscate remaining excessive incomes. The top marginal rate rose to 79% in 1936.  Roosevelt argued that in American democracy no one should, after taxes, have an income of more than $25,000 (equivalent now to about $1,000,000).  The purpose was plainly to redistribute income to create a more equitable society.  Roosevelt explained in his 1937 inaugural address, “The test of our progress is not whether we add more to the abundance of those who have much.  It is whether we provide enough for those who have too little.”  During the war, the top marginal rate rose to a maximum of 94%.  This progressive rate fell slowly in the 1950s and ’60s (even in Nixon’s time it was 70%), producing the most equitable, and hence just, U.S. society yet in the industrial age.

 

This achievement from about 1936 to 1980 has largely been undone by tax cuts (the top rate is now 23%) and by the rise of an immense tax-dodging industry.  Neoliberal economics argues that the optimal tax rate on capital should fall to zero, and that capital gains revenue should be replaced by higher taxes on labor income or consumption (p. 99).  Reversal of this 40-year pattern is vital to bringing the working class into the promise of America and the very rich into recognition of their obligations to a democracy.

 

If you want peace, work for justice.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179781 https://historynewsnetwork.org/article/179781 0
Richard Minear Reflects on Teaching History, Including Teaching Vietnamese History during the Vietnam War  

 

 

Dr. Minear grew up in Newton outside of Boston and his wife was born in Northampton.  He taught at Ohio State University from 1967 to 1970. “That was prime Vietnam time,” he said. “Columbus, Ohio is distinctly not New England. Ohio State is a huge school.” He also taught at the University of Massachusetts Amherst from 1971 to 2008.

Dr. Minear graciously answered questions via phone about history, his career, teaching, and what he is doing now while in retirement.

 

Did your education in your early life prepare you to eventually pursue a career as a historian?

Yes and no. I didn’t set out just to learn languages, but in this neck of the woods, being competent in Japanese, Chinese, or Vietnamese for that matter is a prerequisite.

Did you think you would travel to so many continents and experience different cultures?

By then, I had lived in Germany for two or three years, I had lived in Sweden for six months. When I was an eight-year-old, I went to a Swedish school while my father was on sabbatical. This was 1958-59: my brother was on a Fulbright scholarship in Germany and I was in Heidelberg for my junior year. My parents were in Holland in that spring. European winters are god awful, with not much sunshine. That spring, because of my dad's interest, he took my brother and me on a week-tour, and we hit Istanbul, Palestine, then Israel on the Jordanian side – Palestine/Israel was my dad's turf – New Testament theology. Then we went to Rome on the way home back to Europe. I had had a lot of travel.

Which Japanese island have you been to the most?

There are four major islands in Japan. I spent most of my time on the main island. UMass has a sister university, Hokkaido University, where I spent a summer and part of another year. My first three years were in Kyoto, with a little time in Tokyo.

Is there a historical event that captivates you most?

I was born in 1938. I learned about World War II primarily after the fact. My first ‘political’ memory was of the atomic bomb while I was up in Vermont. I got on a boat on the lake and the sirens started going off and I remember a bonfire. That's part of my background in Japan and a natural focus or interest.

I was too young to serve in Korea, and I used the educational deferment, which got me through 1968, by which time I was 30 and married. The accident of chronology kept me out of the military during Vietnam, and Vietnam had a major impact on everything that I did afterwards. Ohio State has a quarter system, which means three ‘semesters’ each year. I was teaching six courses that year and I quickly ran out of Japan courses. In the third quarter of the first year, which would have been spring of 1968, it dawned on me that there was no course on Vietnam at Ohio State.

Here was a major university without a course on Vietnam and the war, and I proposed a course. Even though I had never taken a course on Vietnam myself, my Asia background gave me some kind of entree. I taught the course in the springs of 1968, 1969, and 1970, and by then Vietnam had gotten very big. I brought in guest lecturers. Ever since, it has had a major effect on my politics and my thinking, both from having watched it and from having read materials on it. I taught about Vietnam at UMass throughout the seventies, and it has had an effect on all of my teaching.

Was the effect related to how people perceived the Vietnam War or based on how you approached teaching and explaining about it?

It very quickly dawned on me that this was more than textbook stuff. I had students who graduated from my class and went into the military. One of the faculty members at Ohio State, who was also an ROTC instructor, gave a single lecture in my course along with a colleague. It later dawned on me that they were only free to give the Pentagon line. He went back to Vietnam in late spring 1968, and a couple of weeks later he was killed. Students were graduating from the course and then going to Vietnam, and student populations were wrapped up in the anti-war movement. That gives a sense of urgency, a certain seriousness, to what you do in the classroom.

Growing up in the 1940s and 50s, the high school and college education I had was pretty straight-line and celebratory of the American master narrative. My involvement with teaching about Vietnam and reading about Vietnam basically knocked me off that master narrative.

What influenced your interest in history?

It all looks different in retrospect than in prospect. A while ago, I looked back at my high school yearbook and several people had said, “You're going to make a great professor!” They were way ahead of me!

My background was liberal arts: English, history, and language. I knew I didn't want to go into theology. I was a history major as an undergraduate, although it wasn't much of a major. I spent my junior year abroad in Heidelberg, and we went to classes, but there was no attendance, no grading, and no exams. It was great for languages and for other purposes, and then came graduate school. I can remember Christmas time in 1960, after I graduated from college, with my family in the living room. The question was, “What will Richie do next year?”

There wasn't any drive on my part or any consuming interest driving me to history, but once I got there, I was not sorry.

I had seen enough of German and European history, which is what my undergraduate major was mainly about, to realize that it was a pretty trampled, congested field, and somebody had told me about two-year programs in Asian studies. Yale had one, Harvard had one, and Berkeley. It was only two years, what could go wrong? That's what got me into graduate school and into the Japanese language.

How did the perspectives acquired through your education influence your career as a historian?

I think the steering was more from the outside world than from the education itself. The education that I got made possible the things that happened, but I think it was more the stuff outside of the classroom. Vietnam happened, and it had a major impact on my teaching. From 1964 to 1966 I was in Japan as a Fulbright graduate student, and by the time I got back, the war was a much bigger topic in this country; by the late 1960s it was heating up.

One of the fortunate things for me, first at Ohio State and then here in Massachusetts, was that I was the only Japan historian, which meant that, except for rare occasions, I wasn't team-teaching or preparing my students to take an advanced course in the subject with someone else. I had an unusual independence when it came to coverage.

One of the major problems in history teaching is the compulsion, whether felt or actual, to cover the field: teaching Japanese history means you cover Japan from A to Z, or the United States from A to Z, or Germany. If there are others in your department who are likely to get those students the next year, if your teaching has to cover what colleagues expect it to cover, that is one thing; but I never faced that issue. That's extraordinary freedom, and it has been important for me all the way through. The standard introductory courses are large, with discussion sections taught by graduate students. All the way through, I was able to lead my own discussion sections, so I rarely taught more than 60 people in one course. It was a Monday and Wednesday lecture and a discussion. I led the discussion sections and got to know the students as a result. It was important for me, not for them, to know where they were coming from.

How different were students’ specialties when they came to your survey course?

Here at UMass, the Japan survey was open to everybody so the history majors were a small part of that. I didn't really register which students were history majors or engineering or the sciences. This had an impact on my sense of audience, so I could give them some kind of perspective. One of your questions has to do with pedagogy; what we ought to be teaching and how to teach it.

Every year I had one and often two Japan survey courses taught to people who would never have another Japan course and who came from all parts of the university. This kind of shaped my ideas about teaching. We often think of covering the field, and in my experience, we don't teach history, we talk about history. We should teach a habit of mind, not a list of facts. This may have changed since my student days, but I'm not sure. Students can take our courses without really getting a sense of what it means to think historically.

Nowadays, things are different. Back when I was studying Asia, there were a handful of Asian experts at a dozen major universities. Nowadays, the US has many Asian historians. It wasn't true back then.

The name I knew was Edwin O. Reischauer. He had gotten into the field early and was a major figure in the beginnings of Japanese studies. Later, Kennedy appointed him U.S. Ambassador to Japan. He was one of the names that attracted me to Harvard, but as soon as I got there he left, and I left before he got back.

We rarely teach about the history of the field. We rarely teach about who the historians are, who Reischauer was, or what the American Japanists were doing in the World War II era. I didn't have to cover more than six to eight people to cover the field. This was after Vietnam had shaped my thinking. Those scholars had come through World War II and its patriotic fervor, when Japan was the enemy. Hence, they had a certain take on Japan. Some of them had a negative experience of Japan, but certainly the idea of ‘an American nation spreading democracy’ was shared almost across the spectrum.

Part of this is teaching about the background of the field and part of it is more practical – in my syllabi, this is after I had gotten my feet on the ground, after teaching about Vietnam for a while – I gave biographical information on every author we encountered in the course, and I included myself. Date of birth, educational background. Every discussion session, once a week, started out with a quiz. The first question was, “Who is the author?” and another question likely was “When was this written?” and maybe a third question was, “Where was it published?” Was it Life magazine or was it the Harvard Journal? That kind of questioning.

It underlined for the students that a major, major part of history is analyzing sources. Who is this person and why are they saying these things about Vietnam and Japan in World War II? Who is his or her audience? The emphasis for me in teaching, yes, the subject was Japan, but the underlying goal throughout was to get people to read critically, to think critically, not just about the authors we read, but also about me. At the end of the course, say “OK, this was Professor Minear’s course, who is he and where is he coming from? Then factor that in.” How many history courses today have biographical data on the professor and everybody else?

And the continued emphasis in discussions, lectures: Who is this author? When did he write? Was it before the Tet Offensive or after? For what audience? It makes the students into players rather than audience members.

In your opinion, what is the purpose of history? Who are its intended consumers, and does the historian have a social responsibility?

I think for everyone, it’s different! With the audience, something we tend to forget is – in my case, I began graduate study as a 21-year-old, and I think that’s true for many of the folks. The sense of audience then is nonexistent; you are just trying to get through the next exam, get your Masters, and decide whether to go on. But once you get past that and into your thesis, your audience is the three or four guys – and they were all men back then – on your thesis committee, all of whom were distinguished academics. I can remember thinking for a while in my thirties that my audience for my writing wasn’t anybody at UMass. My audience was 30 or 40 Japan experts like me, scattered around the country but limited to the ‘in-group’ of the real experts. I can remember thinking, at that stage, that maybe my audience was, in part, historians like me in Japan. If I was really good, they might learn something about Japan from what I had to say.

In my teaching and publishing, it gradually got me away from that kind of hyper-professional focus on specialists and into what was useful for non-experts, the students who I was teaching in my courses or the general readers. Each historian has a different path to follow and maybe everyone has different expectations and a different take on this.

My ‘5 minutes of fame’ was Dr. Seuss Goes to War: The World War II Editorial Cartoons of Theodor Seuss Geisel, and soon after it came out, I gave a talk in Dr. Seuss’s adopted hometown in California, at the University of California, San Diego (UCSD). They had posters around the campus with Dr. Seuss and my name on them. I knew one of the Japanists at UCSD. I bumped into him after the talk, and he said, “I saw this poster. I knew it wasn’t you, because you are a Japan person.” The idea that a Japan person would write about Dr. Seuss didn’t compute, and yet that book got me on Good Morning America and All Things Considered. Teaching about writing got me into E.B. White, and I did an essay tracking the changes across the various editions of his book, The Elements of Style. The idea that a Japanist could do Dr. Seuss and E.B. White…

Back on Japan and speaking to the Japanese: my second book was Victors' Justice: The Tokyo War Crimes Trial. I was writing it in anger, in the middle of the Vietnam War. The Tokyo trial was the Pacific counterpart of Nuremberg. When you look at the trial in retrospect, it was heavily a propaganda operation, and it had a serious impact. In that sense, I was writing a counter piece arguing that the trial wasn’t exactly an exercise in justice. That gets translated into Japanese, and it reinforces what the hard right in Japan was saying about the war and about the Tokyo trial – that it was a put-up job. They were coming at it from a political position diametrically opposed to mine: the context makes a huge difference. They loved the book. The Japanese have a proverb to the effect that if something about Japan is big news abroad, it tends to feed back into Japan: the Japanese press sits up and takes notice. I guess it’s much less so for the United States, partly because of size, reach, and influence. What other people say matters [in Japan].

I have done a lot of translations of Hiroshima survivor accounts and more recently translations of "ephemera" (pamphlets, wall posters) produced by Japan's left-wing activists. It’s fascinating how stuff that you do for one audience can be read very differently by another audience.

I think maximum clarity about your own politics, your own stance, your own commitments – not simply clarity, but not hiding your politics – gives your readers enough material, whether it’s a biographical squib on a syllabus or a translator’s introduction to a translation. That gives your audience some clue as to who you are and where you are coming from.

One of the first major translations I did was of a World War II battleship epic, Requiem for Battleship Yamato. The battleship sailed out at the end of the war into the Okinawa campaign on essentially a suicide mission. What were they going to do with the battleship? They turned it into a floating platform that would maybe have some minor effect on the battle; without air cover, it would be destroyed rapidly. The author was one of the officers on Yamato, one of the roughly three hundred crew members who survived out of 3,332. He wrote his account. We tend to look down on military history, but it was a stunning, gruesome, yet gorgeous account of his own experience, of truth-seeking. I showed a draft to two colleagues, a European historian and a Canadian classicist. The classicist said to me, “Any classicist (of whatever tradition) would appreciate Requiem for Battleship Yamato.”

No matter which classics (whether you're a classical scholar in the European tradition, the Indian tradition, or the Chinese tradition), there’s horror on one hand and human nobility on the other, but also underlying human need, a common humanity. I think that’s part of what we owe to the public and to our kids, to get across with our work.

When I started teaching the Vietnam course, I very quickly found a classic Vietnamese poem, The Tale of Kieu, written before the French takeover of Vietnam. It’s beautiful, utterly unconnected to the war, and yet… Kieu is a woman who undergoes great suffering, largely not of her own devising, and yet survives. The author is Vietnam's Shakespeare. I can remember one fellow here at UMass in the Vietnam course who had to read it, a guy who had served in Vietnam. After I had him write a paper, he came up and said, “I feel closer to Kieu than I ever felt with any Vietnamese.”

If you approach Japan, China, Vietnam, or Russia through classics and poetry, it becomes a little harder to accept unthinkingly what used to be in the textbooks and the press, what used to be in the newspapers and comics. I used to do a lecture on a Sergeant Rock comic book, Ali My: it was the story of a U.S. operation in the war, and Ali My is an anagram of Mỹ Lai. A gruesome American massacre of Vietnamese civilians gets transmuted into a heroic battle.

How many of us grew up reading comics and war comics? Somebody needs to study videogames for their images. Who is the ‘other,’ who is the bad guy, how are they depicted, what are the gender dynamics? Videogames are having a far greater impact on our kids than any teacher in a classroom.

Who is our audience and what do we know about our audience? What de-programming needs to be in place? One of the major influences on my intellectual development was Orientalism by Edward Said. That book blew my mind! I was already coming off of Vietnam disillusioned. Said's book takes the entire tradition of European and American thinking about the Arab world and points out what a coherent, self-congratulating, and denigrating constellation it is. When the book came out, the Journal of Asian Studies commissioned essay reviews from three Asia experts: one on Japan, one on China, and one on India. I was the Japan person, and almost everything that Edward Said says about orientalism transposes beautifully onto prewar and wartime American thinking about Japan.

You’re inside a tradition and you can’t see it as a tradition, because it’s the world; but when somebody points it out from outside the tradition, or from a position within, when somebody nails it so beautifully, you can say, a-ha. This is a world view. This is a coherent system, and we need to re-examine all of it.

There was true excitement there. What we ought to be doing in teaching is somehow to convey that excitement, that possibility, to the folks who are in classrooms. And then to say: OK, who is Edward Said? Where’s he coming from? And who am I – either as a professor or as a student – and where am I coming from? How does this all factor into how I read Edward Said and how I look at American or European hang-ups about Japan or the Orient? It’s a game of mirrors, but it’s a deadly serious game of trying to be aware – not simply of what the tradition is, the matrix of what’s handed down, but also of myself, how I’m reacting, and how I’m contributing in one way or another to its perpetuation or to the challenge.

It’s only when you’re getting into it at that level, that order of operation, that you begin to see what a fascinating and difficult and impossible task we all have. But that’s where it goes back to the syllabus – biographical sketches of all of the authors, and the dates when they’re writing: who is this person, when was she writing, where was this published? For the most part, we just don’t make our students aware that there is this whole level of thinking. How many times have you run into people who said, “I had history in high school and I hated it”? Don’t blame the teachers; they’re doing the best they can, given the constraints of SATs and covering the waterfront and all that.

Part of the problem is history is not exciting for most people because they don’t see it for what it is. They can get into a historical novel because in one way it comes alive, but when you read a book like Said’s Orientalism, all of a sudden, the whole board game shifts. The whole perspective gets challenged in ways that can only be useful.

Who writes history? By and large, of course, it’s the victors, but we don’t know who the victors are until much later. They cover stuff up. It has to be uncovered by oddballs like historians who don’t buy into the master narrative.

A story about Vietnam: when I was teaching the Vietnam course in the mid- to late 70s, the class size was smaller; there was less interest after a while. It was a class of maybe 40 kids, and we got two-thirds of the way through, and I said, “OK, you’ve got some play here in the last several weeks. What topics would you like to cover?”

I listed several possibilities, including Mỹ Lai. After the class, one of the guys who had been sitting in the back all semester said, “Well, Professor, if you were going to cover Mỹ Lai, I’d be happy to answer questions.” He had been in Lt. Calley's platoon at Mỹ Lai. He did two class periods and took us through his training. He had been through Vietnamese language training. Your jaw drops.

 

What have you been doing since your retirement?

I retired in 2008, when I was 69. Since then, I’ve published three or four book-length translations. I’ve kept some of that going and I’m doing a little bit now. I’m still living in Amherst. I stopped teaching cold turkey and haven’t gone back to part-time teaching. It’s been 12 or 13 years now since I gave a talk. For a while, with the Dr. Seuss book, I was giving talks on a regular basis, but I did stop, and I’ve been happily [retired].

Amherst is a neat place to retire; it’s a beautiful fit. The town is close to the hills and the roads are good for biking. I do a 25-30-mile ride when I go out. I hike in the hills; I bike north and south along the Connecticut. There’s an online journal, The Asia Pacific Journal: Japan Focus, and my most recent stuff is there, including, as I mentioned, a Japanese leftist pamphlet about a Japanese massacre of Chinese forced laborers in the summer of 1945. I’ve kept a toe – or two toes – in. I loved teaching while I was doing it, but I’m happy not to be doing it now.

I’ve always been active.

I have two sons, now in their 50s, but for a while we did triathlons as a team. I swam, one of them biked, and one of them ran. If you posted a great time, you could qualify for the Ironman. We weren’t in that category, but it was fascinating just to see how fit some of the folks were.

For many students then and now, martial arts offers a way into Japanese culture. One of my students from 20 years ago sat in on and took one of my courses. She's now an MMA practitioner, in the top ten in her weight category.

You take them where they are: try to figure out where they’re at and what you can do that might be useful, not in terms of profession, but in thinking about Japan, about life, about what it means to be human. Those folks are maybe less likely to doubt the basic humanity of the Vietnamese or the Japanese. Martial arts practitioners—or fans of anime, or Zen meditators—have an advantage. One toe in the door.

A Personal and Family History of Encountering Prejudice and Intolerance

Anti-Asian sentiment is nothing new in America. Consider the Chinese Exclusion Act of 1882 and the acts that preceded it, especially those that barred Chinese women from living and working in the United States, and the laws that followed, which relegated any person of Asian descent to second-class citizenship. There has always been a deep-seated fear of what white supremacists call the yellow peril, a concept that some historians believe originated as far back as the Greco-Persian Wars. William Randolph Hearst is the villain who, in the early 1900s, popularized the yellow peril in his newspapers as a major selling tool in the era of yellow journalism. Whether or not he believed it, he claimed that America was under threat of an invasion from Japan: hence what he called the yellow peril. We should never forget Hearst and the role he played in creating this deepest of inhumane prejudices. The threat of a Japanese invasion is long gone, but the fear of Asians, of their look and skin color, remains deeply ingrained in America's collective psyche.

It is easy to review, law by law, how Asians in the mass have suffered because of discrimination, but that would be nothing new. I am here to tell you about my life, a personal history if you will, as I inadvertently became part of the wider Asian community in many different countries. My late wife was from Saigon, Vietnam. We met in Saigon, married in Hong Kong, and lived in London, Washington and New York. We had three mixed-race children, two boys and a girl, now thriving adults. I have three grandchildren, boys who, because of their antecedents, are part of the Asian continuum.

We lived in dynamic cities. As a soon-to-be-married couple, we had a tough time in Saigon. Unmarried and still courting, we did not live together. When we appeared in public as a couple, Vietnamese soldiers mocked and chided us, accusing me of usurping their women and calling my future wife a whore. We found it better to walk separately, with me behind her, and never to hold hands or otherwise touch in public. Incidental, but no less important, are the memories my wife had of being chased through the streets of Saigon by French soldiers from Africa. From an early age she understood what it meant to be sexually harassed.

We thought life in Hong Kong would be better, and for the most part it was, but prejudice tailed us everywhere we went. In the 1960s, Hong Kong was a progressive, dynamic city with many mixed-race couples. European mixed-race couples were more easily accepted than I was as an American with an Asian wife. I should note that during the Vietnam War there was no love for Americans; the war was not very popular in Southeast Asia, of which Hong Kong was a part. Do I attribute the prejudice we felt to my being an American, for some reason easily detectable from the way I walked, looked, dressed? To a degree, yes, but it was mostly because we were a couple. A Chinese doctor friend said with a smile that many Asian men could not understand why a beautiful Asian woman, particularly one from Vietnam, would consort with a pale-faced, big-nosed American. Beyond that popular descriptive utterance, he had no answer as to why prejudice should be part of anyone's life. I knew it was not part of his.

Life in London, Washington and New York, at least on the surface, seemed to hold less prejudice than either Saigon or Hong Kong, yet it still existed in many forms, especially for my wife and then for my mixed-race children as they grew and we established ourselves on Long Island. The outward expression of the prejudice we experienced was the hard stares of people who viewed us as beings out of the ordinary. Most of Long Island then was conservative and not very progressive, so seeing a mixed couple, often with their mixed-race children, out for a meal in public was strange indeed, enough so that most people could not help staring, even for a moment. We were uncomfortable, but we did nothing to stop the stares. We learned that keeping to ourselves in public was the best defense, though at times I wanted to strike out physically against their stupidity, as I once did in a movie theater in Hong Kong when we faced a crowd of teenagers who attacked us verbally for being a couple.

My wife worked for years on Long Island helping settle Vietnamese, Lao and Cambodian refugees. She served as a court interpreter, helping many refugees navigate proceedings in a language that, for the most part, they did not speak or understand. She worked with them to traverse the intricacies of the benefits they were due. The prejudice those new immigrants felt knew no bounds, but neither they nor she ever complained publicly. Getting and holding jobs, making life work, no matter how trivial what they did may have seemed, was more important than registering a complaint about a life they were trying to understand and survive. Many of these former refugees, now adults, made it through to the new world of opportunity in America. When they first arrived, intolerance, though a concern, was not an issue. In time, they ignored, but never forgot, the unreasonable hatred they knew as newcomers to our so-called hallowed land. For many years hate crimes were not an issue. Now that they are, living their lives to the full and educating their children about the faults of hate and intolerance works best for them in our current climate.

I am a Caucasian Jew, my family from Lithuania and Russia. That is normally enough for full-bore bigotry. I grew up in a diverse neighborhood in Brooklyn and felt almost no prejudice. It was not until college that I suffered for being not only Jewish, but a New York Jew, a condition I survived with added strength into adulthood. My wife was South Vietnamese, part Chinese and a Buddhist. By the sheer force of her personality she was able to overcome much of the racial intolerance that permeated Long Island, but she never understood why some people did not like her because she looked different. My mixed-race children, today all worldly adults, are part of a unique fabric that is more like an abstract quilt: they are white and Vietnamese, but with those other fragments blended in. It was quite a mix, and a serious burden for young children to carry. As children they knew they looked different. Every day in their preteen and young teen years they knew the slings and arrows of racism. "Chink" was the epithet that intolerant and mostly ignorant kids and teenagers usually hurled at my sons. My oldest son knew he looked different; he turned to judo in the hope that he could defend himself if attacked. My daughter simply said yes when asked if she knew bigotry, but she did not elaborate.

As a family we never talked about hatred and racial intolerance, but I know this: what my wife and children went through informs my children's lives to this day. In a backward sort of way, the experience of bigotry taught them to be better husbands, a better wife and better parents. They are better people for what they learned, for what many other people have never known and, sadly, will never understand.

Paying the Price: Our Veterans and the Burden of Parkinson's Disease

Parkinson’s disease is the world’s fastest growing brain disease, even faster than Alzheimer’s.  The number affected worldwide has doubled in the past 25 years and, absent change, will double again in the coming generation.  In the U.S., 1.1 million Americans bear its burden, up 35% in just the past decade.  The toll is especially great on veterans; 110,000 have the debilitating disease.

 

Veterans are at high risk for at least three reasons.  First, many were exposed to toxic herbicides like Agent Orange during the Vietnam War and other conflicts.  Richard Stewart is one of those affected.  He is a former Green Beret who served as a platoon leader in Vietnam for the U.S. Army’s famous 101st Airborne Division.  He, like thousands of other veterans and millions of Vietnamese, was often soaked by the 45 million liters of Agent Orange (“pretty nasty stuff” in his words) that were sprayed in the country.  The chemical, which derived its name from the orange stripe on the barrels in which it was stored, killed vegetation and crops and contributed to birth defects, cancer, and Parkinson’s disease.  Today, Stewart lives in upstate New York with his wife, a “flower child who peacefully protested the war.”  He still walks 2.5 miles and does 200 push-ups daily, is a member of local veterans’ groups, and says, “I only have Parkinson’s.  A lot of people are worse off.”

 

Herbicides are not the only chemicals contributing to Parkinson’s disease among veterans.  Trichloroethylene, or TCE, is another.  TCE has been used to decaffeinate coffee, clean silicon wafers, and remove grease.  The military used the dangerous chemical to clean engines and vehicles.  At Marine Corps Base Camp Lejeune in Jacksonville, North Carolina, TCE and 70 other chemicals poisoned the base and its water supply for 25 years.  Over one million service members, their spouses, and children were exposed to its toxic effects, leading to miscarriages, birth defects, cancer—and Parkinson’s disease.  Many drank contaminated water or inhaled TCE that had evaporated into their homes, like radon, from polluted groundwater.  The consequences of that exposure are still being felt 30 years later.

 

Finally, head trauma contributes to Parkinson’s disease.  A single head injury causing loss of consciousness or memory loss can triple the risk of Parkinson’s.  Repeated head trauma raises the risk even further.  These injuries are common in the military.  According to the U.S. Department of Defense, nearly 400,000 service members have had a traumatic brain injury since 2000.  Another eight million veterans have likely experienced such an injury.  Of those with moderate or severe injury, one in fifty will develop Parkinson’s within 12 years.

 

So what can we do to help our veterans? The first and most important step is to prevent those who serve from ever developing the disease. Banning harmful pesticides and chemicals like TCE, as the Environmental Protection Agency has proposed, is an important step. We also need to clean up contaminated sites throughout the country, many of which are located on current or former military bases. In addition, service members must have proper equipment to minimize the risk of head injury.

 

Next, we need to advocate for those who have already been harmed. Veterans who have Parkinson’s and were exposed to Agent Orange are now eligible for disability compensation and health care. Some efforts have been made to help those whose Parkinson’s is tied to their service at Camp Lejeune. But these efforts are insufficient and have excluded many who have been injured. For example, in 2019, the U.S. Navy denied civil claims from about 4,500 people harmed at Camp Lejeune.

 

We also need more research to prevent, measure, and treat the condition. Despite the growth of Parkinson’s over the past decade, funding for the condition from the National Institutes of Health, adjusted for inflation, has actually decreased.

 

Anyone anywhere with Parkinson’s should receive the care that they need.  The Veterans Health Administration has long had dedicated centers to research and treat Parkinson’s.  However, not every veteran lives near one of these centers.  Telemedicine is one way to expand the reach of care, but some veterans do not have internet access.  Others need in-person in-home care and support.  Increased access and novel care models can help ensure that no one suffers in silence.

 

Finally, we lack better treatments for Parkinson’s disease. The most effective medication for the condition is now 50 years old, and we have had no major therapeutic breakthroughs this century. The economic burden of Parkinson’s disease is over $50 billion per year; federal and foundation research support is less than 1% of that total. That will not get the job done. We must increase our research efforts tenfold to change the course of Parkinson’s, as we did for polio, HIV, and COVID-19.

 

Veterans have served and sacrificed too much to have Parkinson’s be their fate. 

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179786 https://historynewsnetwork.org/article/179786 0
Teachers, Keep Hope about the Minds You Influence

Professor Donald Treadgold of the University of Washington. 


How, and to what degree, does a teacher impact a student? I doubt we will ever be able to gauge the matter. I believe it may boil down to hopefulness and pessimism and the moral imperative of choosing between them. Surely each of us, professional educators and laymen alike, impacts the lives of the people whose paths we cross, but we often don't know which ones, or to what degree. We must remain hopeful as an article of faith.

I once had a memorable professor at the University of Washington, Donald Warren Treadgold, an eminent scholar of the Soviet Union. Let's be kind and just say that this man, a cold warrior extraordinaire, knew his own mind. He was more than a little famous for that. Professor Treadgold was a prolific author, and his work was known throughout the world. He wrote Lenin and His Rivals; The Great Siberian Migration; Twentieth Century Russia; The West in Russia and China (two volumes); and Freedom: A History.

When I first wandered into one of Professor Treadgold's classes, almost all of my study had been of western Europe. I was a stranger in a strange land in my attempts to learn about Russia and the Soviet Union. Considering myself a hotshot, I plunged forward. But wait. The rub was that Professor Treadgold attempted to teach me a great deal that I found myself resisting at every turn. It all took place within the constraints of academic etiquette, but make no mistake, this was a slugging match. And it was a mismatch, for he knew so much, and I knew so little. I considered him an old relic. He considered me a dopey, misguided, poorly informed idealist. I dug in. He persisted.

Throughout the following years, my memory of him remained fresh. I continued to remember his disdain for my viewpoints, his deep learning, his patient demeanor, and the overall gentleness of his character. And as the decades passed, I found myself incorporating much of what he had vainly tried to teach me. It dripped into me, consciously and subconsciously. I never swallowed it whole, but the slow drip never stopped. I can now firmly say that he had as great an impact on me, both morally and intellectually, as any person I have known.

One day, many years later, I was pecking away at my computer. Suddenly, for no conscious reason, I googled his name. I found that he had died of leukemia two years earlier. Stunned, I gazed out my window. The sun was going down and it looked cold outside. The streets were empty. I placed both hands over my face and sobbed like a little child.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179783 https://historynewsnetwork.org/article/179783 0
The Roundup Top Ten for April 1, 2021

The Painful History of the Georgia Voting Law

by Jason Morgan Ward

The new wave of vote suppression bills, like the one in Georgia, reflects a less obvious but important aspect of Jim Crow law: the use of superficially race-neutral language to keep specific groups from voting. The danger is that courts today will similarly fail to see these bills for what they are. 

 

Mitch McConnell is Wrong. The Filibuster is, in Fact, Racist

by Keisha N. Blain

"Try as he might, McConnell cannot erase the historical record. To use his own words, 'There's no dispute among historians about that'."

 

 

Working with Histories that Haunt Us

by Marius Kothor

The author responds to a recent essay on the traumatic aspects of archival research. As a political exile from Togo, she found that her identity and experience converged with subject matter she couldn't study at a remove. 

 

 

Government has Always Picked Winners and Losers

by David M.P. Freund

Government action has always been tied to economic growth, and policy has always chosen winners and losers. Policies proposed by the Biden administration as part of the COVID recovery aren't inserting the government into the market; they're changing the parties favored by government policy. 

 

 

The Problem with Confederate Monuments

by Karen L. Cox

"I also believe it’s important that I, a Southern white woman, write and speak about this topic with blunt honesty. Monument defenders cannot dismiss me as a Northern liberal who has invaded the region to tell them what to do. I’ve grown up here, too."

 

 

Teaching Controversial History: Four Moves

by Jonathan Wilson

A reflection on the work of teaching controversial subjects argues that it's essential to respect students' autonomy and provide them with the tools with which to change their own minds. 

 

 

Who's Afraid of Antiracism?

by Chelsea Stieber

Recent books in different genres shed light on the limits of the French governing ideal of republican universalism for a society where racism is real and historically significant. 

 

 

Paleo Con

by Daniel Immerwahr

Why do the lifestyles of paleolithic hunter-gatherers repeatedly pop up as foils for western capitalist modernity? 

 

 

The Lack of Federal Voting Rights Protections Returns Us to the Pre-Civil War Era

by Kate Masur

New vote suppression bills in multiple states threaten to return the United States not to the Jim Crow era but to the period before the Civil War and Reconstruction when civil and political rights were protected or denied according to state politics. 

 

 

America’s Longest War Winds Down

by Andrew Bacevich

Public fatigue over the ongoing War on Terror must not allow political leaders to do what they seem to want most to do: avoid taking responsibility or learning lessons.

 

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179777 https://historynewsnetwork.org/article/179777 0
Telling the Story of the "Ghost Children" of Germans who Plotted Against Hitler


I was excited.

 

I recognized the tell-tale signs of discovery after a decades-long career of sharing under-told stories from history with children and teens. Excitement, yes, and a sense that I’d found my next book. But this was something more. My latest discovery literally took my breath away.

 

Not only had I unearthed an episode of World War II history that had barely been told beyond Germany, I’d found one that was anchored by a little-known diary kept by a child eyewitness to the events. And she was still alive. And so were other “children” from this history. And I was going to be able to meet and interview them.

 

Their stories were intimately intertwined with a much better-known occurrence: the Valkyrie coup attempt of July 20, 1944, which began with Claus von Stauffenberg’s effort to kill Hitler at the dictator’s Wolfsschanze, or Wolf’s Lair, hideaway. Stauffenberg’s explosive device killed four men, but not Hitler, and the associated coup crumbled.

 

Within hours Stauffenberg and three associates had been executed by firing squad. Theirs were among the earliest deaths in a wave of retribution that would claim over 150 lives. But this trail of vengeance, embedded within the greater horror of the Holocaust and as gruesome and unjust as it was, marked only the starting point for Hitler’s revenge.

 

That’s where the children came in. And the diary, and two research trips to Europe, and six years of alternating work and fermentation along the path toward creating Ensnared in the Wolf’s Lair: Inside the 1944 Plot to Kill Hitler and the Ghost Children of His Revenge (National Geographic Kids: 2021).

 

Children remained top of mind throughout my work—those who had experienced the past and those who would learn about it through my book. I was also concerned about the adults that my eyewitnesses had become. These people were now in their 80s or older. Most had dodged and battled shadows of trauma for years, even a lifetime, and I didn’t want my project to become another triggering event. I would need to employ patience and discretion during our shared backward glance.

 

Context is everything when writing for children and teens. I assume a baseline knowledge of, well, nothing. The trick is to feather in a framework of understanding without overpowering the history-driven narrative. In order for my readers to empathize with the child protagonists in this book, they had to understand a wealth of background: how Hitler rose to power, the shifting nature of German resistance to his rule, World War II history, and the events of Valkyrie itself.

 

Opening chapters offer this context, but whenever possible I share information through the lens of the families that would become intertwined in Valkyrie and the punishments that followed. I particularly drew on the childhood memories of my interview subjects. I hoped readers would feel even more connected to history when viewing it through the eyes of an earlier cohort of children and young adults.

 

When I write, I aim to make history feel personal, urgent, and irresistible. For this project I relied on historical facts, photographs, and eyewitness recollections to draw my audience into the drama of the events. I needed readers to comprehend how challenging and daring—and likely doomed—it was for the fathers of these children to try to overthrow Hitler’s regime.

 

Twelve-year-old Christa von Hofacker began keeping a diary shortly after her father’s arrest in Paris for his role in the attempted coup. No sooner did her father’s fate become uncertain than her mother’s did, too. And her older sister’s. And her older brother’s. All three family members were taken away by Gestapo agents with minimal explanation soon after her father’s arrest. Christa and her two younger siblings remained at home under a patchwork of care that included an unwelcome Nazi state nurse.

 

Readers of Ensnared in the Wolf’s Lair know all about Christa and her family by the time these relatives start to go missing. They’ve been following them since the opening pages of the book with a growing sense of anxiety that makes the shock of their arrests that much more personal and distressing.

 

I was able to build these bridges between readers and subjects because of the direct connections I’d been fortunate enough to establish with Christa. Using interviews, correspondence, and access to her diary, I could share her perspective through her childhood writings as well as in statements made with a lifetime’s worth of lived experience and insight.

 

Similarly, I was able to introduce readers to the memories of Berthold von Stauffenberg, the oldest son of Hitler’s would-be assassin, who was ten years old in 1944. During my second research trip to Germany I was fortunate enough to meet and interview him, as well as Friedrich-Wilhelm (Friewi) von Hase, a retired professor who had been seven when his family became entangled in the aftermath of the failed coup.

 

Friewi’s older sister, Maria-Gisela, is now in her mid-nineties, but her memories of those years remained fresh during our conversations. She was twenty in 1944 and was among the older family members swept up in early Gestapo arrests. These relatives became pawns for use as leverage against the conspirators, faced interrogations of their own, and were generally terrorized by their extra-judicial captivities.

 

Younger family members experienced their own terror after they were removed from their parentless homes and spirited away to a remote compound for weeks or months on end. These girls and boys came to be known as the “ghost children” and were offered next to no explanation for their detentions or their fates.

 

The children spent weeks and even months in suspended animation at their hideaway in the Harz Mountains of central Germany. After older relatives began to be released from prison, they found it next to impossible to locate the missing youngsters, both because there was a war going on and because the children’s surnames had been changed to obscure their identities. Christa, Berthold, and twelve others remained in captivity when Allied forces reached the property on April 12, 1945.

 

These events are still living history for the children and young adults who have carried them into their senior years. They are also captured in the historical record through memoirs and letters from other eyewitnesses. My goal as an author was to transport new generations back to this era without losing the immediacy of those personal connections. Direct quotations from interviews, written accounts, and Christa’s diary are essential to that work, but so are my own memories from the research.

 

During my investigative travels I had seen the Wolf’s Lair and explored the grounds of the wartime complex where Hitler had been attacked. I’d visited the Borntal, too—the Harz Mountain property where Christa and other children were detained—and I’d even been permitted to wander the abandoned interior of one of their former residences. I also had memories of seeing the cities the children had known, the places where conspirators had been executed, the room where the coup had failed.

 

I infused my text with those details, too—and the emotions that accompanied them, from amazement at the grandeur of the lives these families had once led, to the paralyzing contrasts of the situations that followed, to the gritty terror of the places where the accused had been killed.

 

Lives are built around memories, emotions, and personal connections. So are histories. For me the best way to engage readers in the past is to bring it to life, to make it fresh, to create such tangible connections through text and illustrations that readers almost fall into the pages of the book and travel back in time.

 

I try to personalize the reading experience even further by capturing stories from the past that resonate in current times. Whether it’s the tutorial on the power of propaganda, or the account of the rise of a demagogue, or the conveyed terror of family separations, readers aren’t just learning about events from the past. They’re learning how to live in the present.

 

That’s something to get excited about, too.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179728 https://historynewsnetwork.org/article/179728 0
HNN Will Be OFF This Thursday and Friday (April 1 and 2) HNN will be taking the end of this week off (April 1 and 2). We will not be emailing a newsletter on Friday morning, but will post the week's Roundup Top Ten, along with a slate of new op-ed essays, on Sunday. HNN will resume normal news posts on Monday, April 5. 

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179768 https://historynewsnetwork.org/article/179768 0
Ammon Bundy's Ongoing Religious War

Ammon Bundy's arrest for failure to appear was triggered by his refusal to wear a mask into the courthouse. 

Still from Ammon Bundy YouTube. 


Ammon Bundy, the right-wing malcontent behind the 2016 armed takeover of Oregon’s Malheur Wildlife Refuge and now the leader of a western anti-mask movement, believes he’s doing God’s work.

 

Coming from a long line of religiously inspired men who have been “called” to defend the US Constitution, Bundy has varied in his focus, from rebelling against public land ranching regulations to protesting COVID-19 safety protocols. But in his view, these are all forms of government tyranny and affronts to constitutional rights. Arrested for the fourth time on March 15, 2021, Bundy was taken into custody for failing to appear at his hearing on past trespassing charges. Because he refused to wear a mask into the courtroom, thereby missing his trial, he was apprehended outside amid a throng of other protesters.

 

Bundy’s crusade has been a long time in the making, but in the last year he successfully established a coalition of supporters that is broad, diverse, and a serious threat to federal law. His group is called the People’s Rights Network. Like the Oath Keepers and Proud Boys, it includes members who see the current government as a threat to perceived rights and are committed to defend their ideas of personal liberty, by force if necessary.

 

So what has taken Ammon Bundy, who first came to prominence during the 2014 armed standoff in Nevada over his father’s unpaid grazing fees and trespassing cattle, into the life of an anti-government militant? The answer is a libertarian worldview and his take on Mormonism. Bundy’s ideology parallels the thinking of certain leaders in the Church of Jesus Christ of Latter-day Saints, who have had a history of cynicism toward government. He also shares with them a tradition of theo-constitutionalism: venerating the Constitution as a sacred document. The paradox here is that Bundy believes he is upholding the Constitution and fulfilling his religious duties in his acts of lawlessness.

 

His impetus has roots in the early Church. After the founder and first prophet of Mormonism, Joseph Smith (1805-1844), was murdered, Brigham Young (1801-1877) assumed the reins of the Church and brought the Latter-day Saints into the Great Basin. By that point, Young and his brethren were disgusted with the US government after the years of mistreatment and bigotry they had faced. But the Mormon people kept great faith in the Constitution. While still an apostle of Smith, Young said, “I find no fault with the Constitution or laws of our country; they are good enough. It is the abuse of those laws which I despise, and which God, good men and angels abhor.” He later avowed, “Corrupt men cannot walk these streets with impunity, and if that is alienism to the Government, amen to it. The Constitution of the United States we sustain all the day long, and it will sustain and shield us, while the men who say we are aliens, and cry out ‘Mormon disturbance,’ will go to hell…. But to proceed; the principal evil is in the rulers, or those who profess to be rulers, and in the dispensers of the law…”

 

Young was not just the leader of the Church; like Smith, he was a prophet. Although he was not as prolific in his revelations as other Mormon prophet/presidents, these statements are memorialized in the Journal of Discourses and the History of the Church, texts not part of official Church doctrine, but significant, especially to those with radical leanings. Over the years, many Church prophets echoed Young’s sentiments, from Wilford Woodruff (1807-1898) to Ezra Taft Benson (1899-1994), reinforcing the idea that the Constitution is good, but not those who govern under it. Benson took that idea further, declaring that the Mormon people had a religious obligation to protect the Constitution, even if this meant violence. In 1979 he declared, “I say to you with all the fervor of my soul that God intended men to be free. Rebellion against tyranny is a righteous cause. It is an enormous evil for any man to be enslaved to any system contrary to his own will. For that reason men, 200 years ago, pledged their lives, fortunes, and sacred honor. No nation which has kept the commandments of God has ever perished, but I say to you that once freedom is lost, only blood – human blood – will win it back.”

 

So this is where things get treacherous. If the Constitution is sacred, but those overseeing it are evil, then who determines and upholds the law of the land? Bundy has come to think that this is his duty—a chilling certainty that is likely to escalate during the current administration. As vaccines are more widely administered and mask mandates therefore become less of a concern, Bundy’s focus will return to his original cause. The new Secretary of the Interior, Deb Haaland, is now charged with overseeing public lands, including the place where Ammon Bundy’s father, Cliven, continues to illegally graze his cows. The patriarch Bundy and his most infamous son share the conviction that the federal government has no constitutional right or power to own land; hence the land belongs to those white people who have occupied and used it, and the requirement of grazing fees paid to the Bureau of Land Management is unconstitutional. Although Cliven has repeatedly lost his appeals in federal court, and currently owes over a million dollars in fines and fees, the old rancher’s cows still roam the same BLM land, years after the Nevada armed standoff. To Ammon, mask mandates and grazing regulations are the same thing—affronts to constitutional rights. Law and common good be damned—he sees both as tyranny. He is determined to protect the Constitution, even by unconstitutional means. Where the next action will be taken—Nevada, Oregon, or somewhere else on the 600,000,000 acres of American public lands—remains to be seen.

 

In 2018, Bundy spoke before an audience in South Jordan, UT, during an event advertised as a “power packed four hours, with an LDS [Latter-day Saints] perspective, but practical info for all true friends of liberty.” He told them about his father’s dream, in which people approached a large building. Inside the building sat a golden calf, a biblical reference to a false idol, which Cliven understood to symbolize the American court of law. Ammon explained that the dream meant people are putting their faith in judges who do not have their best interests at heart. “You can’t worship the golden calf, you have to have faith in God,” he told the audience. “When you know for certain that those are your own rights, you do not allow them to be questioned. And I know that’s a strong thing I’m saying. But when you do that, then your friends need to come and protect you also. And it’s a duty of ours to do that.” Four months later, PeoplesRights.org was registered, a year and two months before the pandemic brought America to a screeching halt. COVID-19 gave him a cause that fit a long ongoing narrative. The pandemic swelled the ranks of People’s Rights through conspiracy theory and righteous anger, but the network wasn’t invented in response to it.

Ammon Bundy has been looking for his next battle since the takeover of Malheur, when he led a group of heavily armed militia to occupy government buildings in Harney County, Oregon. During that takeover, one man, LaVoy Finicum, was shot and killed by police. Ammon now has his own militia, the People’s Rights Network, an army of over 50,000 members in 50 states, according to the organization’s website. He recently finished a recruitment tour in Utah, talking God, liberty, and the need for vigilante action—antidotes to golden calves. His campaign is part of a long arc, and we shouldn’t expect his rebellion to end as COVID-19 dies down. Bundy’s attention will return to public land battles, where the first insurgencies began. It wasn’t COVID-19 that spurred the formation of the People’s Rights Network and inspired Bundy’s mission; it was a deeply rooted sense of righteousness, Cliven’s dream, and a version of Mormon ideology.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179726 https://historynewsnetwork.org/article/179726 0
America Does Have an "Original Sin": A Response to James Goodman


Scrolling through all the clickbait on CNN’s website these days, I was genuinely intrigued when I saw James Goodman’s headline “It’s Time to Stop Calling Slavery America’s ‘Original Sin’.” I was, however, left disappointed, as I often am, once I clicked through and read the piece.

 

While I agree with Goodman’s conclusion that slavery is NOT America’s original sin, I disagree with how he arrives at it. Goodman begins by critiquing the theological origins of the idea of “original sin,” rejecting the concept as irrelevant to our polity and seeing it as an unnecessary confusion between our secular state and Christianity. Indeed, American scholars can sometimes be quick to dismiss the importance of religion in American society. If this were simply a response to the recent corrupt entanglement of evangelicalism with the American right, it might be more understandable, but Goodman’s reaction is illustrative of a much larger trend among American scholars that tends to ignore religion (and especially conservative religion) and its influences in the hope that it will simply fade away. We can find the seeds of this in American life even among liberal theologians shortly after the end of the First World War, when Harry Emerson Fosdick, a prominent liberal Baptist theologian, intellectual, and modernist, asked, “Shall the Fundamentalists Win?”

 

To understand American history, and the American present, we must also understand the lasting influence of American religion. The idea of “original sin” is, in fact, an apt metaphor for what continues to plague American society. It is just that slavery is a symptom of this original sin, not the first sin itself.

 

But before I address America’s original sin, I first want to defend the use of the metaphor. In 1967, in what is now a seminal article in the sociology of religion, the esteemed scholar Robert Bellah published “Civil Religion in America” in Daedalus, the Journal of the American Academy of Arts and Sciences. In this article, Bellah demonstrates how American politicians, as well as the American people, have practiced a “civil religion” that has allowed politicians to use and co-opt the Deism of the American founding generation as a unifying religiopolitical force, one that acknowledges a “divine providence” generic enough for Americans of most religious persuasions to accept. Politicians used and continue to use the rhetoric of American Civil Religion because it works, and the vast majority of Americans accept it. For those who doubt, witness the recent use of religious language to describe the U.S. Capitol during the armed insurrection against President Joe Biden’s election: American politicians and citizens alike were aghast that Trump’s supporters would violate the “sacredness” of the Capitol Building and America’s political institutions under the Constitution. Political crimes against the “spirit” of democracy truly are sins to the American people because of American Civil Religion.

 

Further, Goodman misunderstands the concept of original sin. While he is correct that in Christianity the concept means that an original actor committed the sin, he is incorrect in asserting that those who inherit the guilt of this sin are not responsible for it. While many Christian faith traditions reject the doctrine of original sin outright (it is, after all, originally a doctrine of the Catholic Church), those Christians who do hold to it believe that the original sin tainted all of humanity, to the point that it ensured continued sinfulness among all peoples; it also meant that without a redeemer, all those who inherited this sin would be condemned to hell for its seriousness.

 

Translated to American Civil Religion, such an original sin would mean that all of America is condemned for the nation’s (in this case, white America’s) crimes. Without redemption, the entire country is on the course to hell. In American Civil Religion, however, Abraham Lincoln, the Great Emancipator, has (in the eyes of many, though certainly not all) functioned as the redeemer, the divine agent of providence who would finally free us from the scourge of this original sin. Yet as I look around American society, we still appear to be damned. Racial inequality remains long after both emancipation and the Civil Rights era. Unarmed black men and women continue to be shot by our law enforcement officers, who have been socialized to believe that the next person they contact might try to kill them, and even more so if that person fails to possess white skin. It is quite apparent that the damnation of America’s original sin is still with us.

 

America’s original sin, however, is not slavery, it is settler colonialism. As Goodman finally acknowledges in his piece, slavery would not be America’s original sin, because America’s sinfulness begins with the dispossession of indigenous lands. Settler colonialism, however, as an organizing principle, encompasses not just the theft of lands from the indigenous peoples of the Americas, but it also includes the forced labor of slavery, both chattel and otherwise, in order to bolster the budding capitalistic project of imperialism. After all, the original form of British American colonialism was carried out by the Virginia Company, incorporated under the British crown in order to carry out the Empire’s wishes and to make its original shareholders a significant profit.

 

John Winthrop of the Massachusetts Bay Company and colony, in a letter to both potential supporters and detractors, described the right of the Puritans to take indigenous lands as a reaction to the closure of the English commons. The rich in England had enclosed the land that commoners had used for centuries to grow crops and graze upon in order to sustain themselves, and this natural right (granted by God, he believed) gave way to a civil right to make such land private by the act of erecting fences and gates. Believing that God had shown him and the Puritans a new commons, they argued that they could use any land not currently used and occupied to European standards by the indigenous peoples of America. While Winthrop made this argument, much of his letter is spent arguing against those who had warned that it was immoral to take indigenous lands because indigenous people had occupied it for centuries. While Winthrop thought he had the right to take indigenous lands, some back home in England thought the theft of the land was sinful. And they were right.


Settler colonialism continues to damn America. Today, in the midst of the COVID-19 pandemic, the virus kills indigenous and African American people at higher rates than white Americans. Where I live, on the Navajo Nation, many of my students are among the 30% of the Navajo population that in 2021 still does not have electricity or running water at home. Some of my students charge their cell phones, tablets, and laptops in their cars overnight and use mobile hotspots to turn in their homework, but they cannot take a shower in their own homes and have to cook on camp stoves. With 57-hour weekend lockdowns to stop the spread of COVID-19, many could not even keep perishable food cold over the weekend because they lacked access to ice.

As part of settler colonialism, indigenous communities are denied their full sovereignty. Recently, two different indigenous nations in South Dakota shut down their borders to non-tribal members in order to keep the virus from infiltrating their communities, since the massively underfunded Indian Health Service would be quickly and completely overwhelmed by the arrival of COVID-19 on their reservations. In response, South Dakota Governor Kristi Noem threatened both tribal governments with lawsuits if they did not allow traffic to flow freely in and out of their reservations. The requirements of settler colonialism and capitalism demanded the death of more indigenous people in order to keep money and “liberty” flowing. Settler colonialism is America’s original sin, and it continues unabated to this very day. America will continue to find itself in hell—or at least, in purgatory—until it repents from and seeks redemption for its actual original sin. Additionally, American scholars will continue to misunderstand American society and culture if they fail to take stock of American civil religion and the spiritual beliefs of Americans in general. Original sin, after all, must be cleansed by a redeemer, but first, we must acknowledge our complicity in the sins of our ancestors.


Editor’s Note: HNN excerpted the essay by James Goodman referred to here in our Roundup of op-eds in February.

https://historynewsnetwork.org/article/179725
Will the Supreme Court Uphold the NCAA's Version of Amateurism?

EA Sports has, to the chagrin of many gamers, not produced an NCAA-licensed football video game since 2013. The video game market is just one area of dispute over the right of collegiate athletes to compensation for the commercial use of their names, images, or likenesses (NIL). Image: Sports Gamers Online.


On March 31, 2021, the U. S. Supreme Court will hear the case of NCAA v. Alston.  It is an antitrust case in which the NCAA argues that the property rights of Division I basketball and FBS football athletes should be dismissed because college athletes are amateurs.  If the NCAA wins, college athletes will continue to be deprived of financial benefit from the commercial use of their names, images, and likenesses (NILs).  The NCAA argues that it must control what players are paid in order to protect their amateurism.  Shawn Alston, as part of a class action suit, argues that the NCAA is violating antitrust law and that property rights belong to athletes as they do to all other college students, who should be able to profit from them.


As I have just completed a book, The Myth of the Amateur:  A History of College Athletic Scholarships (University of Texas Press, 2021), I decided to initiate an amicus brief for the U. S. Supreme Court, challenging the NCAA’s perpetuation of the myth of college amateurism.  I asked five other historians who have written on the history of intercollegiate athletics to join me in the amicus brief. Taylor Branch wrote a piece in the Atlantic Monthly (October 2011) about the exploitation of college athletes under NCAA rules.  He is better known for the Pulitzer Prize he won for Parting the Waters, the first book in his trilogy on Martin Luther King.  Richard Crepeau is a historian at the University of Central Florida, who has written on Roman Catholic athletes as well as a recent history of the National Football League.  Sarah Fields, a lawyer and historian at the University of Colorado, Denver, has written a book about female competitors in contact sports and one on sports celebrity and the law.  Jay Smith is a French history scholar at the University of North Carolina, Chapel Hill, but has written on the decades-long disgrace of the notorious athletic-academic scandal at his institution.  John Thelin is a prominent educational historian at the University of Kentucky, who wrote a history of college athletic reform attempts.


I was a history major and a baseball and basketball player at Northwestern University decades ago and was given a scholastic scholarship, but I was told if I did not keep a “B” average it would turn into an athletic scholarship.  I was not good enough to profit from my NIL, nor did anyone on my teams know that it was possible.  Later I did my doctoral work at the University of Wisconsin, writing my dissertation on an athletic conference.  I then joined the Penn State University faculty shortly after Joe Paterno became head football coach.  I have been interested in how athletes have been paid, not only when I was an undergraduate, but when I began researching intercollegiate athletics.  That took me back to the original college contest in America. It took place in 1852, nine years before the Civil War.  It occurred because a railroad magnate wanted to make money from sponsoring a crew meet between Harvard and Yale.  He told the Yale captain that if he would get Harvard athletes to agree, he would pay all expenses for an eight-day vacation for the crews at New Hampshire’s largest lake, Lake Winnipesaukee.  From that time to the present, athletes have often been paid in one form or another.


Today, a major question is the paying of athletes for their property rights to the use of their names, images, and likenesses.  The U. S. Supreme Court will soon be pondering the question of whether the NCAA’s denial of such property rights to athletes is a horizontal antitrust violation under the Sherman Antitrust Act of 1890.  The issue might first have arisen in the early twentieth century, more than a century ago.  Probably the most prominent player among the big-time football schools of Harvard, Yale, and Princeton was James Hogan, a 29-year-old who was paid in a variety of ways.  He had his tuition paid, lived in the most luxurious of Yale’s dormitories, ate free meals at the University Club, profited from scorecard sales at Yale baseball games, and was given a vacation in Cuba when the season was over.  But more important to the Supreme Court case, Hogan profited from his name and image with every pack of American Tobacco Company “Hogan” cigarettes sold in New Haven.  It was legal then to profit from one’s NIL.


In 2021, the U. S. Supreme Court will hear of the many ways that big-time college athletes are paid.  There are more than a dozen ways in which athletes are legally paid beyond a full athletic scholarship without violating “amateurism” under NCAA rules.  For instance, Olympic swimming gold medalist Katie Ledecky was awarded more than $300,000 in the last Olympics, and she remained eligible to swim for Stanford University.  An international gold medal swimmer, Joseph Schooling of Singapore, was given $700,000 for beating Michael Phelps in the 100-meter butterfly.  He then swam for the University of Texas as an amateur.  The NCAA also allows athletes to draw thousands of dollars from two multi-million dollar funds, the Student Assistance Fund and the Academic Enhancement Fund.  In addition, federal Pell Grants for needy athletes increase their funding.  The NCAA also allows money to go to the conference athlete of the year, to a team’s most-improved or most valuable player, and to bowl-bound or March Madness players in the form of freebies worth hundreds of dollars.  This is not the amateurism that the NCAA claims it is protecting.  Yet the NCAA opposes a player gaining a portion of the revenue made from the sale of game jerseys displaying his or her name and number (or from video games licensed by the NCAA).  That property right is off limits, and a player who seeks to capitalize on it will be classified as a professional and lose eligibility.


Our amicus brief points out the NCAA’s hypocrisy by quoting Taylor Branch: “no legal definition of amateur exists, and any attempt to create one in enforceable law would expose its repulsive and unconstitutional nature.”  According to Branch, “without logic or practicality or fairness to support amateurism, the NCAA’s final retreat is to sentiment.”  We also point out the historical falsity of the NCAA’s claim that amateurism is central to college sport.  NCAA amateurism, originally opposed to any athlete being paid in any form, has been modified so drastically that athletes can now be paid in a variety of ways.  What sets college athletics apart from “professional” sports is not that intercollegiate sports are amateur, but that they are part of institutions of higher education.  College sports are, at least nominally, intended to be educational, while professional sports are not.  Before a decision is made in NCAA v. Alston, the Supreme Court justices should read a brilliant article in the Harvard Law Review published three decades ago.  “Judicial invalidation of the amateurism principle,” its legal experts stated, “may actually allow the NCAA to place more emphasis on academic values in its members’ sports program.”  Six knowledgeable historians agree.

https://historynewsnetwork.org/article/179720
Can Abolition of Nuclear Weapons Overcome the Opposition?

White House vigil, June 2006. Photo: moi 84, CC BY-SA 3.0


Given that nuclear war would mean the virtual annihilation of life on earth, it’s remarkable that many people continue to resist building a nuclear weapons-free world.  Is the human race suicidal?

Before jumping to that conclusion, let’s remember that considerably more people favor abolishing nuclear weapons than oppose it.  Public opinion surveys—ranging from polls in 21 nations worldwide during 2008 to recent polls in Europe, Japan, and Australia—have shown that large majorities of people in nearly all the nations surveyed favor the abolition of nuclear weapons by international agreement.  In the United States, where the public was polled in September 2019 about the UN Treaty on the Prohibition of Nuclear Weapons, 49 percent of respondents expressed approval of the treaty, 32 percent expressed disapproval, and 19 percent said they didn’t know.

Nevertheless, surprisingly large numbers of people remain unready to take the step necessary to prevent the launching of a war that would turn the world into a charred, smoking, radioactive wasteland.  Why?

Their reasons vary.  Die-hard militarists and nationalists usually view weapons as vital to securing their goals.  Others are the employees of the large nuclear weapons industry and have a vested interest in retaining their jobs.  In the United States, that enterprise has long been very substantial, and the Trump administration, through massive infusions of federal spending, has succeeded in fostering its greatest expansion since the end of the Cold War.  According to a December 2020 article in the Los Angeles Times: “Roughly 50,000 Americans are now involved in making nuclear warheads at eight principal sites stretching from California to South Carolina.  And the three principal U.S. nuclear weapons laboratories . . . have said they are adding thousands of new workers at a time when the overall federal workforce is shrinking.”  Members of these groups are unlikely to change their minds about the importance of retaining nuclear weapons.

But another constituency resistant to the abolition of nuclear weapons, and probably the largest, comprises people whose position could be changed.  They view nuclear weapons as a deterrent to a military attack—and especially a nuclear attack—upon their nation.  And their fear of external aggression is often inflamed by hawkish politicians, defense contractors, and the commercial mass media that whip up public hysteria about enemies abroad.

Of course, it’s not at all clear that nuclear deterrence actually works.  If it did, the U.S. government, with its vast nuclear arsenal, wouldn’t be as worried as it is about Iran obtaining nuclear weapons or fomenting war.  Indeed, if U.S. officials really believed that possession of nuclear weapons reduced the likelihood of nuclear and other kinds of war, they would be welcoming the proliferation of nuclear weapons around the globe.  Unfortunately, though, as they apparently recognize, the presence of nuclear weapons makes the world even more dangerous than it already is.

Even so, the advocates of nuclear deterrence make a very legitimate point about the reality of international affairs.  It is a dangerous world, and people have good reason to fear external aggression.  Although nuclear weapons provide an inadequate response to the dangers of military attack, there is considerable justification for people to be concerned about the security of their nation.

But what if the danger of external aggression were diminished?  In those circumstances, wouldn’t a substantial portion of the people concerned about national defense come around to supporting a nuclear weapons-free world?

Developing a stronger international security system would provide a useful way to foster this shift in attitudes.

The launching of the United Nations in 1945 raised hopes for the creation of an international entity that, in the words of the UN charter, would save humanity “from the scourge of war.”  And, in subsequent decades, this world organization, unlike any individual nation, did attain widespread legitimacy in world affairs, particularly for its humanitarian accomplishments and for the fairness of its decisions on global issues.  Nevertheless, the major nations—reluctant to give up the dominant power that they had traditionally exercised in international affairs—saw to it that the United Nations was denied the authority and resources that would enable it to develop an effective international security system. 

If, however, the United Nations were granted that authority and those resources, thereby providing nations with safeguards against external aggression, that would do a great deal to allay the fears of many people who cling to nuclear weapons.  And that, in turn, would transform the popular support for the abolition of nuclear weapons that currently exists into massive support for it—support that would be so overwhelming that even the nuclear powers might find it difficult to resist.

It is possible, of course, that hammering away relentlessly at nuclear dangers will be sufficient to finally convince the governments of nations—even the governments of the nuclear powers—to abolish nuclear weapons. 

Nevertheless, people who want to end the nightmare of nuclear destruction that has haunted the world since 1945 should consider widening the popular appeal of nuclear weapons abolition by strengthening the UN’s ability to provide international security.

https://historynewsnetwork.org/article/179722
Red Flags on the Map: What Soviet Kids Learned about the United States


My most treasured possession from more than a quarter century of travel in Asia cost me more to frame than to buy. It is a laminated Soviet-era middle school historical map of the United States for the period from 1870 to the beginning of World War I in 1914. I picked it up for less than $5.00 at a bookstore in Bishkek, the capital of Kyrgyzstan, on the last day of my first working trip to Central Asia in December 1995, four years after the collapse of the Soviet Union.

In the mid-1990s, there were no direct flights from Bishkek to European hub airports, and I didn’t relish the challenge of negotiating Moscow’s Sheremetyevo. My return flight to Frankfurt left from Almaty, the then-capital of Kazakhstan, a four-hour nighttime drive from Bishkek. Around midnight, the guards at the Kyrgyz-Kazakh border cursorily examined my passport, then waved the car through. With an icy wind blowing in from the north, they were more interested in returning to the warm stove in their guard post (and perhaps another shot of vodka) than going through my suitcase.  At Almaty airport, however, the customs officials were alert, keen to explore what this foreigner was trying to take out of the country. One pulled out the map and unfolded it on the table.

“The export of valuable cultural artifacts, including historical maps, is strictly prohibited,” he solemnly informed me. In the dimly lit customs hall, I could not see his face to guess whether he was serious or just trying to solicit a bribe from an anxious traveler. I summoned up my limited Russian vocabulary and explained that this map was neither rare nor valuable. Maps like it had hung on schoolroom walls throughout the Soviet Union for decades. Perhaps he remembered it from his school years?

The officer shrugged. If I really wanted to take it, he said, then he would have to charge a special export license fee. We chatted amiably for a few more minutes before I came up with the winning argument.

“What do you think I’m going to do with a historical map?” I asked. “Use it to plan an invasion of the United States?” The officer and his colleagues got the joke. To save face, they confiscated a couple of knick-knacks, and sent me on my way.

It was not until two years later, after returning from a Fulbright Fellowship in Kyrgyzstan and more than a year of Russian classes, that I studied the map in detail. It focuses on the US economy and population, with symbols showing the expansion of the railroad network and mineral deposits—coal (угля), oil (нефти), and iron ore (железной руды). The most prominent symbols on the map, larger than the circles denoting the population size of cities, are red flags. They pop up in cities and rural areas, with most in the northeast and Midwest.


Below each flag was a year, and that was my clue. Cities such as Buffalo, Pittsburgh, Philadelphia, Chicago, and St. Louis had two years listed, 1877 and 1886—these were the years of railroad strikes that were brutally suppressed. Pittsburgh, 1892—the Homestead Strike. Ludlow, Colorado, 1913—the miners’ strike. And so on. The legend identified the red flags collectively as “labor movement” (рабочее движение—rabocheye dvizheniye).

The history of the United States was being taught to Soviet schoolchildren through the ideological lens of strikes and labor unrest. Shaded areas indicated Native American reservations, where indigenous peoples were, in the Soviet view, being oppressed and exploited. An inset map, entitled “Imperialist Aggression,” showed Central America, the Caribbean and East Asia, with nasty-looking black arrows indicating US military operations.

The map is interesting, not only because of the contrast it offers to the maps used to teach American middle-school students about their country’s history, but because of the not-so-subtle message it sends about borders. State borders are marked, but the states are simply classified into two groups: those that were part of the Union before 1870, and those that joined later, with no text explaining when or why they became states. The real borders on this map, as represented by the red flags and shaded areas, are class borders—between the capitalist bosses, supported by the federal, state, and local governments, and the working classes, between settlers and ranchers and the Native Americans, confined to their reservations.

It is a selective, ideological history, although perhaps no more so than the history of the Soviet Union as taught to US students (if it was taught at all). Working in Central Asia from the mid-1990s to 2012, I met many people whose view of US history was largely a catalog of class warfare, strikes, and racial oppression. Lessons learned in middle and high school were amplified and reinforced by newspapers, TV, and films. In his March 1983 speech to the National Association of Evangelicals, President Ronald Reagan famously described the Soviet Union as the “Evil Empire,” a label that fit well with the preconceptions of many Americans at that time. They might be surprised to learn that many Soviet citizens also thought of the US as an evil empire, even if they did not use that phrase. In both societies, education and the media shaped the mental maps of their citizens.

https://historynewsnetwork.org/article/179694
Inequality, Labor Unrest, and Police Brutality in Early 20th Century Spokane, Washington: Jess Walter on His New Historical Novel "The Cold Millions"


Spokane, Washington. 1909. The City Council bans downtown speeches to curb labor agitation. The Industrial Workers of the World (the IWW—the “Wobblies”) organizes a mass protest against this restriction of free speech. Local police under notorious Spokane Police Chief John Sullivan brutally break up the nonviolent labor protest. Hundreds of union supporters are arrested and jailed. Many are injured. IWW firebrand “Rebel Girl” Elizabeth Gurley Flynn arrives on the scene to secure release of the jailed workers and to organize for the IWW. She is just 19 years old and pregnant, yet she courageously organizes working people in her travels around the Northwest. She later becomes a leading suffragist and one of the co-founders of the American Civil Liberties Union. In union circles, she is exalted still for her leadership, humanity, and bravery.

Celebrated Spokane novelist Jess Walter brings to life this fraught history in his new historical novel, The Cold Millions, a titular reference to the many poor and forgotten souls of early 20th century America. With a cast of real and fictional characters, he takes on issues from more than a century ago that resonate today including intolerance, income inequality, police brutality, violence, and human rights. At the same time, the novel plumbs emotional depths as it explores the complexities of friendship, sacrifice, betrayal, lust, cruelty, and love.

The story unfolds through the perspective of two orphaned and jobless young men, the Irish American Dolan brothers from Montana, who seek new lives in the metropolis of Spokane. Police jail the idealistic older brother Gig, 23, who embraces the promises of the IWW, while younger brother Rye, 16, yearns only for a modicum of stability and a home. Yet it’s Rye who accompanies Gurley Flynn on her fiery campaign for workers even as he becomes enmeshed in the dark schemes of a wealthy Spokane mining magnate. Other characters include a burlesque actress and her performing cougar, a hired assassin, anti-union scabs, hoboes, labor organizers, a crusading attorney, and more.

Mr. Walter’s extensive historical research is on full display in The Cold Millions. In the creation of his novel, he pored over period newspapers, maps, diaries, letters, postcards, and more. The novel captures the mood and rhythm of the time, the arcane language, the passion of average people for fairness and justice, as well as the moments of debauchery and humor. Walter’s writing conveys his affection for his hometown of Spokane with full awareness of its fraught history, a reflection of the larger checkered history of the United States.

Mr. Walter is best known for his literary novels, including Beautiful Ruins and The Financial Lives of the Poets, the National Book Award finalist The Zero, and Citizen Vince, winner of the Edgar Award for best novel. He also wrote a critically acclaimed book of short stories, We Live in Water, and his short fiction has appeared in Harper's, McSweeney's, and Playboy, as well as The Best American Short Stories and The Best American Nonrequired Reading. He began his writing career as a reporter for the Spokesman-Review and wrote a nonfiction volume, Ruby Ridge (originally entitled Every Knee Shall Bend). He lives with his wife Anne and children, Brooklyn, Ava, and Alec, in Spokane.

Mr. Walter generously responded by email to a series of questions on his writing career and his new novel.


Robin Lindley: Thank you for connecting with me Mr. Walter, and congratulations on your powerful new historical novel, The Cold Millions. Before getting to your new book, I’m also interested in your writing career. You have a background in journalism and a career as a prominent literary writer. Did you want to be a writer when you were young? What drew you to a writing career?

Jess Walter: I wanted to be a writer as long as I can remember. I created a family magazine with my siblings when I was six or seven (called Reader’s Indigestion) and was the editor of my junior high and high school newspapers. I read voraciously and used to visit the library as a 13-year-old, imagining where my future novels would go.

In college, I was a young father, and so I had to switch from majoring in English and creative writing to journalism, so that I could support my young family. But that seven-year detour into newspapers made me a better writer, I think, and certainly a better citizen.

Robin Lindley: How does your experience in journalism inform your writing now?

Jess Walter: Journalism informs my writing in many ways, I think: certainly the ability to research, and to publish without fear or a kind of preciousness. You don’t come back from a newspaper assignment saying that the “muse didn’t strike.” Likewise, you learn a directness and an economy of style that translates well to fiction. As an early newspaper editor once told me, “You write beautiful descriptions. Now pick one.” But the biggest attribute that I gained from journalism, I would say, is a keen sense of curiosity, and the tools to satisfy it. I think I’m a more outward-looking novelist, with an understanding of systems and institutions, because I worked for newspapers.

Robin Lindley: What sparked your career as a novelist? Are there certain writers that have influenced your work?

Jess Walter: Hmm, I think of a spark as something external, but a novelist is his or her own spark. You just read and write. Every day. I’ve written pretty much every day since I was a teenager. I wrote fiction for fifteen years before I had much success at it. I wrote a nonfiction book, two unpublished novels, dozens of short stories and was a ghostwriter before I published my first novel.

My fiction didn’t support me until my seventh book, and still, I am incredibly lucky that it supports me at all. As for influences, there are so many it’s hard to know where to start. From the top, I’d go with: Joan Didion, Kurt Vonnegut Jr., Don DeLillo and Gabriel Garcia Marquez. 

Robin Lindley: Some of my favorites too. You’re praised for novels that are always different. As America’s Librarian Nancy Pearl has said, “Jess never writes the same book.” How do you see the arc of your writing career?

Jess Walter: Ha! Well, first let me just say that Nancy is a dream reader and a wonderful writer. But isn’t it strange that the anomaly is the person who “never writes the same book”? Shouldn’t that be the case for more writers? I would rather ask, “Why do so many writers keep digging the same hole?” As for me, when I finish a book, I’m ready to do something different. I strive to get better as a novelist, and I think I get better by trying new things. But once I get going on a project, honestly, I don’t think about any of that. I just let the story dictate its genre, style and tone. If I concentrate simply on writing the next book I want to read, the rest takes care of itself.

Robin Lindley: It seems that most of your books involve moments in history. How does history play a role in your work? Did you enjoy history as a student?

Jess Walter: I did, and I do. But other than The Cold Millions, I wouldn’t say that my writing is particularly tied to historical moments. In fact, I would say, like the journalist I was, I’m more drawn to the contemporary.

I was at Ground Zero in the days after the terror attacks of 9/11 and wrote a dark satirical novel about our reaction to those attacks (The Zero), and I wrote a farcical family drama about the financial crisis of 2008 (The Financial Lives of the Poets.) Even this historical novel rose out of my desire to address contemporary issues like income inequality and political and social unrest. With Citizen Vince, I chose to write about the 1980 presidential election in part because of its significance in swinging American politics so firmly to the right over the next forty years. So I guess I would say my interest in history is really about how it impacts the present moment.

Robin Lindley: Thanks for those insights. Now, to go to your new, highly praised novel, The Cold Millions, what inspired this particular book? 

Jess Walter: It’s difficult to distill so many years of thought and research and writing into a few impulses of inspiration, but I’ll try.

Early on, I felt the political and social echoes of the last Gilded Age in our current economic climate, and I hoped to write about issues like inequality and nonviolent protest without being didactic. I also wanted to write a kind of labor Western, to collide those genres, the social novel and the adventure story, around the real free speech protests of 1909-10, and to recreate the thriving, boisterous Spokane that I found in old newspapers and postcards.

I was also taken by the figure of Elizabeth Gurley Flynn, and hoped to renew interest in her amazing life, while at the same time echoing the youthful activists that I saw leading the current political fights for sensible gun and climate legislation, and against police brutality against African Americans.

There were many novelists who inspired me, too, from Tolstoy to Steinbeck to E.L. Doctorow’s Ragtime to William Kennedy’s Ironweed.

And finally, a big part of the novel was personal for me. I’m a first-generation college graduate from a working-class family. Both grandfathers were itinerant workers in the 1930s, and my dad worked for 40 years for Kaiser Aluminum, rising to president of his Steelworkers Union local. My dad has Alzheimer’s now, and is at the end of his life, and I wanted to honor his steadfast belief in unions.

Growing up, the fairness and egalitarianism of labor was as close as my family had to a religion. I saw this early period of labor as a kind of origin story, filled with idealism and courage, before the unions became tainted by corruption and Communism became connected to the brutal regimes of the twentieth century.

Robin Lindley: The novel is filled with history and you have a gift for evoking this age. What was your research process for the book? Did you find especially useful archives and other resources?

Jess Walter: I read dozens of books from and about that period, correspondence and academic papers, pored over maps and railroad schedules, but most of my research, honestly, was done bent over microfilm, reading old newspapers.

The Spokane Library was very helpful, especially its Northwest Room, and I took several trips to the Seattle Library and to the library at Washington State University. Research is incredibly helpful until it isn’t. At some point, the novelist just has to create, and to imagine. You become fluent in a period and then you can allow the characters you’ve conjured to drive the action.

Robin Lindley: What are a few things you’d like readers to know about Spokane in 1909?

Jess Walter: If you can imagine the railroad in 1900 as the equivalent of the internet today—connecting the world in ways it hadn’t before—you can see how Spokane was one of the fastest-growing and most thriving cities in the United States at that time. Every northern railroad line pinched together in Spokane, before spreading out to Portland, Seattle, Vancouver. The incredible wealth from the area’s mining, timber and agriculture flowed through the city. Like Seattle, it was doubling in size every six or seven years, but unlike Seattle, it was known for being an island of sophistication in an empty part of the world, with great hotels and restaurants and one of the best theater scenes in the West, including the largest stage in the world.

Robin Lindley: I’m a native of Spokane but never knew of the 1909 Free Speech Movement and the labor strife then. It’s fascinating and now more people will know about this past thanks to your novel. How did you come upon this overlooked campaign for workers?

Jess Walter: I can’t remember how I first came across the free speech action in Spokane, but I think it was in the morgue of my old newspaper. Perhaps I was grabbing files on Tom Foley (I covered his last election in 1994) when I pulled the file on Elizabeth Gurley Flynn and noted her story and set it aside as a topic for a later novel. The sheer audacity of Gurley Flynn and the ahead-of-its-time inclusivity of the IWW seemed remarkable to me.

A few years later, I read that Dashiell Hammett had worked as a Pinkerton detective out of Spokane, investigating labor figures in Montana (the roots of his novel Red Harvest), and I began looking for ways to bring that period to life. For years, I gathered articles and books and mulled over how to tell the story.

Robin Lindley: When I was younger, the Industrial Workers of the World, the Wobblies, were seen as bomb-throwing radicals, but you found a different story. What did you learn?

Jess Walter: Well, at times, there were bomb-throwing radicals and anarchists in the IWW, but usually the violence came in reaction to the IWW. The union was radical, definitely, pushing for a complete overhaul of capitalism, but it also preached nonviolence. Some members pushed for more direct action, like sabotage and general strikes, but it was actually the IWW’s pacifism that caused it to run afoul of the U.S. government, when the union objected to our entry into World War I.

There was awful violence involving the Wobblies, in Everett, in Centralia, in Butte, Montana, but almost always that violence came from the other side, from vigilantes or detectives who had infiltrated the IWW. In fact, the free speech actions in the Northwest were the first successful nonviolent protests in U.S. history, a model for civil rights activists decades later. 

Robin Lindley: The Free Speech Movement occurred in 1909, a decade before the better-known Seattle General Strike. What did workers gain from the Spokane Movement?

Jess Walter: They were very much connected. By 1919, the IWW’s profile had been greatly diminished, but it was still seen as the most radical labor organization in the United States. The Seattle strike was groundbreaking because of its breadth, because more traditional unions took part in it: dockworkers and unions affiliated with the AFL. But city officials fighting the strike used the Wobblies as socialist bogeymen to try to turn public perception against this huge, broad social movement.

Robin Lindley: A central character of The Cold Millions is Elizabeth Gurley Flynn, a young labor activist—a real person—who spoke on behalf of workers and the poor. What are a few things you’d like readers to know about her?

Jess Walter: I write about Gurley Flynn at a fascinating time in her life. (She would go on to become a founding member of the ACLU, the chairwoman of the Communist Party USA, be jailed for her activism, and become a civil rights activist, among other things.) But in 1909, she was a fiery 19-year-old labor activist and suffragist who had been speaking in factories and rough work camps for three years, known as the East Side Joan of Arc and, by the establishment New York Times, as a “she-dog of anarchy.” I marveled at a pregnant 19-year-old, ten years before she could even vote, traveling west by herself to fight for workers’ rights against corrupt police and company goons.

Robin Lindley: You humanize real characters in your book such as the “Rebel Girl” Gurley Flynn and the brutal Spokane Police Chief John Sullivan. How do you create the fictional presence and world of a real character?

Jess Walter: There is a fine balance, I think. To make them come alive like the other characters, you have to treat them as fictional creations, inventing dialogue, motivations and actions. But I feel a responsibility to the historical figures, as well, and so, with all of those characters, I tried to research them, and to keep the invention to a real minimum. For instance, most of the speeches that Gurley Flynn gives in the novel come from accounts of her actual speeches, in newspaper stories and books. 

Robin Lindley: You tell much of the story through the eyes of a couple of young Irish-American vagabonds from Montana who are drawn to Spokane. Were they based on real people? How did you choose this point of view?

Jess Walter: Gig and Rye are entirely fictional characters. But their story parallels many of the hobos working at that time. And their sense of adventure comes from stories my grandfather used to tell about his own hoboing days a generation later in the 1930s.

Robin Lindley: And you etch the age through a range of characters including a Pinkerton detective who sees Spokane as “a box of misery” and “a syphilitic town” that metastasized, a hired killer, an actress who performs with her cougar, a righteous lawyer, wealthy tycoons, and more. Were there historical models for these characters?

Jess Walter: Other than Fred Moore, who was an actual labor lawyer who moved from Spokane to other free speech protests around the West, they are all fictional characters burnished by my research into the time. 

Robin Lindley: The brutality of the Spokane police, jailers, and anti-union thugs may stun some readers. What was the city like in 1909 for the poor, the dispossessed, the nonwhite?

Jess Walter: About like it was everywhere. Maybe the one difference was that the city was teeming with itinerant workers because of its location as a hiring center for mining, timber and agriculture jobs. Many of these were recent immigrants from Central Europe, and they suffered through waves of abhorrent racism and xenophobia, as immigrants as varied as the Chinese and the Irish had previously, and as Native Americans and African Americans continuously faced. The Spokane Police, during this period, were accused of everything from brutalizing traveling workers to shaking down the city’s brothels, again, not unlike police in other cities.

Robin Lindley: You also capture the arcane language and idioms of the period. How did you come to learn these expressions and obscure words?

Jess Walter: It was great fun, immersing myself in the language of the newspapers, the IWW speakers and singers, the Pinkerton detectives and others. Much of it came from newspapers and Wobbly accounts of the free speech protests in Spokane. In capturing the way a 60-something-year-old Pinkerton detective might sound, I read old mysteries to find words that had disappeared from the lexicon, like “the morbs” (a morbid feeling of unease) and “lobcocked” (bothered or blocked from action) … that language, in particular, began to feel like some missing link between Western and Hardboiled literature.

Robin Lindley: You present an unsparing account of Spokane history, including an account of atrocities against Native people. What did you learn about treatment of Native people?

Jess Walter: This is another thing I feel like I’ve always known. I grew up on the river, near Plantes Ferry and the horse slaughter camp, where in 1858, eight hundred native ponies were ordered shot by Col. George Wright as punishment and warning to the Spokane tribe. In the 1970s, when I was a kid, people were still finding bleached horse bones along that shoreline.

I live now just across from what used to be Ft. Wright, near the confluence of the Spokane River and a stream that for 120 years was called Hangman Creek, named for the spot where Wright had tribal leaders hanged when they came to beg for peace.

My family lived for a few years on a ranch bordering the Spokane Indian Reservation, where the tribe was forcibly relocated. Anyone who doesn’t understand the brutal history of the treatment of Native Americans in the place they live is just not paying attention. And not just Spokane. Seattle, Yakima, Manhattan: how many of us live in cities named for the people from whom the land was brutally taken?

Robin Lindley: Your book is a tribute to human rights, the rights of assembly and free speech, and the struggle to preserve those rights, along with a recognition that all people regardless of social station or wealth or race, deserve access to justice and equal rights. Were you thinking of those values as you wrote The Cold Millions?

Jess Walter: Definitely. And I’d add one more, the old-fashioned idea of brotherhood, the kind that Gig and Rye share, and also the kind that they share with Jules and with Gurley Flynn and the leaders of the IWW.

Ten years before I was born, in 1955, about one in three Americans belonged to a union. Now that number is less than one in ten. And, not coincidentally, the middle class has eroded and the gap between the wealthy and the poor is as wide as it was in 1909. The book is an elegy for labor idealism and perhaps a suggestion for the road back.

Robin Lindley: Are there other books and resources you’d recommend to help readers better understand the history behind The Cold Millions?

Jess Walter: Oh, so many. The book has an Acknowledgments section that is chock full of books that I used in my research. But I will suggest one that gives a broad sense of the labor wars of that period in the Northwest, Big Trouble by J. Anthony Lukas.

Robin Lindley: It’s clear from your work that you love Spokane despite its checkered history. You’re a native of the city and you still live there. I recall that the late, great Spokane artist Harold Balazs told me that friends asked him why he never moved from Spokane to an arts mecca like New York City or LA. He said, “You bloom where you’re planted.” It seems you share that strong sense of place.

Jess Walter: Ha, please point me to the American city that doesn’t have a checkered history, and I will move there. Every city is born, as Spokane was, through some combination of brutality toward its Native people and the destruction of its natural resources.

I think some people in Seattle look with condescension at Spokane because it’s poor. But equating a poor city with a bad one is rank snobbery. In fact, I would argue that there’s something more fundamentally wrong with a city where a teacher or a police officer can never dream of affording a home. I happen to like Spokane’s grubbiness, its weirdness, its rough edges. Harold’s answer to that question is terrific, like everything about Harold, but I kind of wish he’d have just said, “Go pound sand.”

Robin Lindley: Outsiders may see Spokane as a conservative bastion in a county that voted for Trump and is represented by a right-wing member of Congress, but the city also has growing arts, literary and higher education communities. Perhaps voting patterns don’t reflect the entire reality of the city. How do you see the social and political evolution of Spokane since 1909? Are younger people there now interested in social and political change?

Jess Walter: The city itself is quite liberal, went for Biden by almost 20 points, and has a city council with a 6-1 progressive bent. Because of the Spokane Valley and its more rural areas, Spokane County did tip for Trump, by about 4 points, half the margin of 2016.

But I think it’s misleading to think of Spokane as just another part of red Eastern Washington. The real divide is between urban and rural, like everywhere in the United States. And Spokane’s politics has always been far more complex than the West Side of the state wants to imagine. Even in Spokane’s more conservative periods, a Democrat, Tom Foley, represented the region and rose to Speaker of the House. And Spokane had a Black mayor, Jim Chase, a decade before Seattle did.

As for young people, I think, like everywhere, they are more engaged than I’ve ever seen them, and personally, I can’t wait for them to take the wheel.

Robin Lindley: As we today face a politically divided country, a deadly pandemic, a political insurrection, and a history of systemic racism, among other issues, where do you find hope?

Jess Walter: Wow, that’s a hard question. I like what Kafka says: “There is infinite hope … but not for us.” Still, deep inside, I cling to an old-fashioned kind of humanism, and the belief in what Lincoln called the better angels of our nature. But, as a novelist, you’d better keep track of the devils, too, because they make for better characters. 

Robin Lindley: You have a gift for breathing life into history, Mr. Walter, and for blessing each of your characters with a sense of presence and humanity. Is there anything you’d like to add about your writing or your new epic novel and its resonance now?

Jess Walter: Thank you! No, those were wonderful questions. 

Robin Lindley: Thank you for your thoughtful words and generosity, Mr. Walter. And congratulations on your epic historical novel The Cold Millions and the stellar praise you’re receiving. Well deserved, indeed.

Robin Lindley is a Seattle writer and attorney, and the features editor of the History News Network. His articles have appeared in many periodicals. He can be reached at robinlindley@gmail.com

Historians for Peace and Democracy Present Free Resources for History Educators

Historians for Peace and Democracy (H-PAD) is a national organization of progressive historians. As part of our mission to foster education on campuses and in communities, encourage activism, and facilitate networking with organizations that work for peace and justice, we are making a series of new resources available for use. They are totally free, so they fit your budget! These resources include a Virtual Speakers Bureau, short videos in the Liberating History series, and a syllabus on sanctions.

H-PAD launched its Virtual Speakers Bureau in March 2021. Forty outstanding professional historians, activists, and independent scholars have volunteered to speak to classes, campuses, community-based groups, and other organizations. No honorarium is required or expected, just a mutually agreed-upon date, time, and topic. The presentations can be tailored to meet both parties’ interests, expertise, convenience, and needs. H-PAD has organized speakers bureaus in the past, but the current widespread use of video conferencing technology allows us to extend the invitation beyond our own locales to include organizations across the United States and around the world. If you would like to learn about the speakers and how to invite them, please click here.

The new Sanctions Syllabus was developed by Renate Bridenthal, Molly Nolan, and Prasannan Parthasarathi, three members of the H-PAD Empire Working Group. It dissects “economic sanctions – their forms, legality, and effectiveness, their history across the twentieth century and their current deployment, as well as blowback from and resistance to them.” The syllabus offers definitions, examples, and links to a wealth of articles, books, and films. It examines the use and impact of sanctions against Cuba, Venezuela, Iraq, Iran, Russia, China, apartheid South Africa, and Israel, with a particular discussion of the Boycott, Divestment, and Sanctions (BDS) campaign. To access the syllabus, click here.

We’ve recently expanded into video and audio production, too. Our Liberating History series features lightning video lectures of 3-4 minutes. In “Black Panthers Against Patriarchy,” Robyn Spencer discusses why so many Black women saw the Black Panther Party as a place of feminist empowerment. In another new episode, Prasannan Parthasarathi puts “India’s Far Right in Historical Perspective,” explaining its origins in the country’s caste system and the ideology of Hindu nationalism. And be sure to check out our earlier Liberating History episodes as well: Irene Gendzier on the roots of Trump’s Middle East policy, Donna Murch on crack and mass incarceration, and Ellen Schrecker on McCarthyism past and present.

We encourage you to use, and share, these resources. At H-PAD we believe in using history to empower people to confront systems of hierarchy and oppression. If you’d like to collaborate in making that happen, please join us!    

Can Biden Fulfill JFK's Incomplete Promise of a Peace Presidency?

JFK and Nikita Kruschev at the 1961 Vienna Summit. Talks there led toward the 1963 Partial Test Ban Treaty. NARA record: 3951647

President Joe Biden and his advisers appear to have studied the lessons of Franklin Roosevelt’s presidency. Several executive orders have undone some of the damage wrought by President Donald Trump. The passage of the American Rescue Plan Act of 2021 provides aid to poor and working people and investment in the country’s infrastructure. In addition, Biden has spoken out in support of a union vote by Amazon workers in Bessemer, Alabama.

It would be wise for Biden and his advisers to study the lessons of the Kennedy and Johnson administrations, second only to the Roosevelt administration in achieving domestic reforms. Like Biden, John Kennedy was supported by unions and articulated pro-union views. One year into his presidency, Kennedy signed Executive Order 10988 establishing collective bargaining for federal employees.

Kennedy’s more dramatic shift to progressive positions came in June 1963 with a speech to the nation advocating civil rights and a speech at American University advocating peaceful coexistence with the Soviet Union. The shifts in policy these speeches represented led to the passage of the Civil Rights Act of 1964 and the Partial Test Ban Treaty, signed in August 1963 by the U.S., the Soviet Union, and the United Kingdom.

Peace advocates found plenty to criticize in the foreign policies of the Kennedy and Johnson administrations. Prior to 1963, Kennedy’s relationship with Cuba and its ally the Soviet Union was confrontational, leading to the 1962 Cuban missile crisis. The escalation of the U.S. war in Vietnam by the Kennedy and Johnson Administrations and Johnson’s invasion of the Dominican Republic are among the ways that the shift to a more peaceful foreign policy fell short. That Johnson undermined his own domestic goals by his expansion of the Vietnam War is well known.

In the Cuban missile crisis, Kennedy stepped back from the brink of nuclear war and reached an agreement with the Soviet Union that included a promise by the U.S. to cease its effort to overthrow the Cuban government. In his American University speech, Kennedy called for rethinking the cold war.

The ideas that Kennedy articulated in his American University speech remain relevant. Kennedy declared:

“So, let us not be blind to our differences--but let us also direct attention to our common interests and to the means by which those differences can be resolved. And if we cannot end now our differences, at least we can help make the world safe for diversity. For, in the final analysis, our most basic common link is that we all inhabit this small planet. We all breathe the same air. We all cherish our children's future. And we are all mortal.”

The world system is much changed from what it was in the 1960s. During the Kennedy and Johnson years, the U.S. Gross Domestic Product (GDP) was nearly half the world total. Even with that economic clout, attempting to maintain U.S. hegemony undermined domestic reform goals. Today, the U.S. share is about 25 percent of world GDP. When one takes into account the many social benefits not measured by GDP, the U.S. position is weaker still.

The attempt to maintain U.S. dominance with outsized military spending – 37 percent of the world total in 2015 – has led to a series of endless unsuccessful wars. The damage inflicted on millions of people in other countries and on the tens of thousands of U.S. people involved in these conflicts is both sad and unnecessary. It’s time to return to JFK’s concept of paying attention to “our common interests,” resolving our differences peacefully, and making “the world safe for diversity.”

Progressives, unions, and the left are seeking to achieve domestic reforms – the passage of the For the People Act of 2021 to protect voting rights, the Protecting the Right to Organize Act, Medicare for All, the $15 minimum wage, and the Green New Deal. There are two ways to fund social programs and move to a more equal society. First, we need to increase taxes on the wealthy and corporations. The Ultra-Millionaires Tax Act proposed by Senators Elizabeth Warren and Bernie Sanders is a first step. Second, we need to shift funding from the military budget to social needs. To accomplish the latter goal means emphasizing peace advocacy.

President Biden needs to step back from his attack on Russian President Vladimir Putin. The intelligence reports Biden is reviewing are political, not scientific, documents. The military-industrial complex is now the military-industrial-intelligence complex. The so-called intelligence community is part of that larger complex and is seeking to consolidate the new cold war with both Russia and China.

One of Kennedy’s virtues was his ability to set aside advice from foreign policy and defense experts and to think independently. It helped that he had a sense of humor and, despite his high position, could show some modesty. After the Bay of Pigs failure, he commented to aides, “It’s just like Eisenhower. The worse I do, the more popular I get.” Whatever the truth of the charges of Russian interference in U.S. elections, Biden should remember that the U.S. intervened openly in the 1996 Russian election, helped overthrow the elected Ukrainian government in 2014, and has a long record of interfering in other countries’ internal affairs.

President Barack Obama took some steps away from the new cold war campaign with the New START Treaty of 2011. He also took a step toward relaxing the blockade against Cuba. Biden should follow up on Obama’s Cuba initiative by ending the sixty-year-old blockade of Cuba.

Returning to the themes of Kennedy’s American University speech could lead Biden to make lasting contributions to world peace.

On disarmament, Kennedy said: “Our primary long-range interest in Geneva . . . is general and complete disarmament--designed to take place by stages, permitting parallel political developments to build the new institutions of peace which would take the place of arms.”

About the United Nations and disarmament, Kennedy said: “we seek to strengthen the United Nations, to help solve its financial problems, to make it a more effective instrument for peace, to develop it into a genuine world security system--a system capable of resolving disputes on the basis of law, of insuring the security of the large and the small, and of creating conditions under which arms can finally be abolished.”

Today, a peace presidency would ensure access to vaccines against the coronavirus by the neediest nations. It would lend full support to the United Nations and the World Health Organization. It would shift funding from armaments to domestic needs and aiding the world’s needy. It would put an end to the new cold war and seek ways to cooperate with Russia and China. It would put an end to our endless wars and support Palestinian rights. It would set our sights, once again, on world disarmament.

"What the Black Man Wants": Frederick Douglass's Answers Still Resonate

In April of 1865, Frederick Douglass addressed the Annual Meeting of the Massachusetts Anti-Slavery Society. At forty-seven years of age, six foot one, streaks of gray emerging in his hair, Douglass still radiated strength. He stood resolute before an audience of abolitionists with whom he was popular and respected. With his intense gaze afire in triumph and alertness to an hour of opportunity, trimly bearded but still rakishly handsome, his fierce countenance attracted admirers of many stripes. As always, in this venue, he was interrupted often by applause, laughter, and shouts of approval, as he presented his powerful arguments with clever word play steeped in American popular culture: the Bible, Shakespeare, and the already sacred rhetoric of the Founders. In his prime, Douglass presented perhaps the most striking public figure ever to stride across the American political stage, every bit as compelling as iconic politicians of the television age like JFK, Ronald Reagan, Bill Clinton, or Barack Obama.

Douglass advanced his case with a series of questions.

WHAT IS FREEDOM? Speaking out against a discriminatory labor policy instituted by the Union Army ostensibly to "prepare ex-slaves to better handle freedom," Douglass called the right to choose one's own employment essential. But the bedrock of true freedom for the freedman would be "immediate, unconditional, and universal enfranchisement."

Why suffrage? Some will ask WHY DO YOU WANT IT? Invoking the language of the Declaration, Douglass simply demanded what was his by right. Any deprivation of a natural right reduced the "nature" of men. Voting represented a symbol of equality. As a result of the American founding and democratic evolution, the idea of universal suffrage defined American citizenship. In other political cultures, the denial of the "elective franchise" might do no great violence to a man. But, in our system, Douglass argued, disfranchisement equaled inferiority.

And there were practical reasons. Beyond the principle of equal rights, it was in the interest of the Federal Government to empower and enlist their Black allies in the ongoing fight to stamp out treason and perpetuate unified constitutional government. Presciently, Douglass anticipated the reluctance of the South to accept the verdict of the war, predicting the United States government would find itself an occupying force in a "strange land" surrounded by a "hostile spirit" struggling to maintain order and authority.

HOW WILL YOU WIN THE PEACE WITHOUT THE BLACK MAN? Where will you find the strength to overcome the persistent spirit of the Southern rebellion? The North would need their wise and faithful Black allies, who understood clearly the war and its ultimate aim from the beginning much better than the North. African Americans had voted with their bodies, had been impervious to danger, and supported the cause of Union and freedom stubbornly and courageously as the war hung in the balance--and, truly, Douglass asserted, going forward, they represented "our only friends in the South."

To the question of INFERIORITY, Douglass acknowledged the disadvantaged political condition of Black people in America, but, asserting once again a natural right claim, he denied inferiority in any original, natural, or practical sense--pronouncing African Americans equal to "anybody on this globe." Douglass reminded his audience that slavery and oppression, historically, did not equal a racial condition but rather a function of circumstance. Were not the "blue-eyed and fair-haired Anglo-Saxons considered inferior by the haughty Normans"? "You were down then," Douglass reminded his fellow abolitionists to howls of laughter and applause. "You are up now. I am glad you are up, and I want you to be glad to help us up also."

"The story of our inferiority is an old dodge," Douglass continued. A rationale to explain political interest. When our "Manifest Destiny demanded a slice of Mexico,” we hinted the Mexicans were an inferior race. When Russia coveted parts of the Ottoman Empire, or the British wanted more authority in Ireland, the people in their way were an inferior race. "You say we are ignorant; I admit it." But if African Americans knew enough to be hung, they knew enough to vote. If they knew enough to fight for the flag, they knew enough to vote. If they knew enough to pay taxes, they knew enough to vote. With another call back to the American Revolution, Douglass proclaimed to his Boston audience, "taxation and representation should go together." And, of course, never one to pass up a swipe at the immigrants from the Emerald Isle, "if he knows as much when he is sober as an Irishman knows when he is drunk, he knows enough to vote, on good American principles."

WHAT DOTH IT PROFIT A NATION IF IT GAIN THE WHOLE WORLD, AND LOSE ITS SOUL? In addition to a practical need for African Americans to accomplish a successful reconstruction of the South, what about HONOR? What about JUSTICE? Douglass: You asked African Americans to "incur the enmity of their masters." You induced us to "turn against the South in favor of the North; to shoot down the Confederacy and uphold the American flag." The white people of the South will hate us for generations. "You have called upon us to expose ourselves to all the subtle machinations of their malignity for all time." DO YOU NOW INTEND TO SACRIFICE YOUR FRIENDS IN FAVOR OF YOUR ENEMIES? WILL YOU GIVE YOUR ENEMIES THE RIGHT TO VOTE AND TAKE IT AWAY FROM YOUR FRIENDS? We responded to your call to arms (like we did in 1776 and 1812). "In time of trouble we are citizens. Shall we be citizens in war, and aliens in peace?"

Noting a proliferation of benevolence societies to aid African Americans, Douglass observed, "the American people are disposed to be more generous than just." But, once again asserting a natural right claim, Douglass wondered, now that you are inarguably aware "we are men," will you deny us the "possession of all our rights?" Repudiating the poor substitutes of benevolence, pity, or sympathy, Douglass simply demanded justice.

WHAT SHALL WE DO WITH THE NEGRO? "Do nothing with us," Douglass suggested. Leave African Americans alone. Give them a chance to be men. "If you see him on his way to school, leave him alone; don't disturb him," Douglass entreated. Similarly, if you saw a Black man having dinner at a hotel, or casting a ballot, or practicing his craft, just let him be. Allow him to pursue his inalienable rights in peace. If the Black man failed, surely it would be the fault of his Maker and perhaps give the lie to the universal principle of the American founding. But, Douglass was certain, if given a chance, if unbound, if allowed to succeed on his own, the Black man would prove himself equipped for citizenship and success just as much as the white man. The war [in which 200,000 African Americans served with distinction in the Union Army] swept away a "great many delusions," Douglass reminded them.

What does Douglass have to say to 2021? Should we just leave the Black man alone? Do nothing?

Conscientious historians fault modern conservatives for misusing the above paragraph to distort or even troll the African American civil rights cause over the last four decades. It is a fair point. When Douglass advised “do nothing,” he envisioned an American government that permitted African Americans to join the body politic and be subject to equal protection and due process under the law as first class citizens. When Douglass declared, “just let him alone,” he clearly imagined and advocated a passive partnership between the government of the United States and African American citizens in which access to education, the right to vote, equal employment opportunity, and public accommodations were open and equal to all. 

But, instead, what Douglass feared most came to pass. After an attempt to honor their Black allies and establish justice for all, the North ultimately chose to make common cause with the white South, reneging on the promises of Reconstruction and the rhetoric of equality. The failed revolution did, at least, succeed in amending the Constitution. African Americans gained full citizenship and suffrage on paper, but, over the course of the next three decades, Douglass watched his victorious coalition of 1865 slowly but surely betray their “faithful friends.” By the time of his death in 1895, the United States had abandoned reform, left the South to the vagaries of white rule, and was fully engaged in the long nightmare of Jim Crow segregation that would last into the 1960s.

Almost a century after Douglass’s speech in Boston, the March on Washington, with Martin Luther King speaking before the Lincoln Memorial, symbolized for us a rededication to our founding principles. African American leaders once again called upon the American people to honor the Creed. After a century of discrimination, oppression, and intimidation, in a very different moment, Congress passed the Civil Rights Act of 1964 and the 1965 Voting Rights Act. Unlike the failures of Reconstruction, the twentieth-century moment represented a seismic cultural shift and a great leap forward. At the very least, a great down payment on living out “all men are created equal.”

An Aside. We should be honest about the progress achieved since our great civil rights moment. We merit praise, not scorn, for sincere repentance and 55 years of tangible achievements in cultural integration and racial unity American style. Almost inconceivably, we now live in a world in which myriad African American icons and heroes populate the uppermost elite echelons of our society: Oprah, LeBron, Barack and Michelle, Tiger, Beyoncé, Ta-Nehisi Coates, Shonda Rhimes, et al. Looking back from 2021, who are our most admired historical figures from the twentieth century? Number One (with no real competition): MLK. And the pantheon certainly includes Rosa Parks, John Lewis, Malcolm X, Colin Powell, and Jackie Robinson. In sports, Muhammad Ali has come to personify absolute excellence, integrity, and courage for the vast majority of Americans. Young people today worship a whole galaxy of African American sports stars with virtually no thought to race. Same for the arts and entertainment. No high school or college American lit survey seems complete without Toni Morrison, Langston Hughes, James Baldwin, Richard Wright, Ralph Ellison, or Maya Angelou. And, if we were enumerating heroes that one half of the nation adores but the other half detests for reasons only marginally connected to race, we would add Clarence Thomas, Thomas Sowell, Condoleezza Rice, and Tim Scott.

An Aside. We should acknowledge that there are tens of millions of white Americans who would enthusiastically vote for Tim Scott, or any other conservative black candidate, over Joe Biden, or any other white male Democrat, for president of the United States. We should acknowledge that we have utterly shattered the ubiquitous assumptions of white supremacy that barred African Americans from participating in American politics solely on the basis of race just sixty years ago. They are virtually non-existent in our current political environment.

But I HEAR YOU. This is not about Oprah or Barack Obama. Our problem is George Floyd; economic disparities in income, unemployment, wealth, inheritance, poverty rates, and home ownership; disparities in health outcomes (COVID deaths) and incarceration; voter suppression; food deserts; and education. African Americans, statistically, in the aggregate, disproportionately suffer a lower quality of life in our nation compared to whites (and Hispanics and Asians).

We have succeeded in empowering the Talented Tenth of Black America. We are now happily accustomed to seeing wealthy and powerful Black people among us, enriching our culture and strengthening our economy.

But how do we achieve broader and deeper success? How can we make things better for more people? How can we make things right? Building on a half century of remarkable progress, how can we repair the residual damage resulting from a century of systemic discrimination? How can we honor our promises and live up to our founding ideals (not as white people) but as a united people? As one nation? How can we finally win the peace?

I have a few ideas—and I think they are in keeping with the principles of Douglass, Lincoln, and King (without sacrificing Jefferson and Madison). Let’s talk about some possible economic and cultural solutions in my next installment, tentatively entitled, “A Just and Lasting Peace.”

“What the Black Man Wants,” Frederick Douglass, 1865.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179721 https://historynewsnetwork.org/article/179721 0
Is History Ready to Judge the Trump Presidency?  

With the second Trump impeachment concluded, the (first) Trump presidency is officially confined to history. How should history understand the Trump presidency? Right now, we would be hard-pressed to find anyone who disagrees with the contemporary consensus that Trump shattered the norms of the presidency itself. Hovering like a specter over historical analysis, that consensus obscures other significant innovations that Donald Trump brought to the presidency. Understanding his political strategies will help historians and political scientists generate further insights into the nature of the power inherent in the office of the president and the structures that enabled him.

We know that Trump’s presidency was consequential. He single-handedly changed the presidency in several ways, from altering relationships with the press to hollowing out bureaucracies to garnering unprecedented media attention from all over the world. What makes Trump different, however, is his unusual style and methods. Take his use of social media as an example: it was effective in boosting his own political standing by stirring chaos through entertaining and inflammatory remarks on Twitter, but his Twitter account ultimately did not serve the interest of the country (as Twitter itself determined in the wake of the January 6 attack on the Capitol, with its controversial decision to suspend the president’s access to the platform). And yet future presidents may well adopt similar strategies to more traditional ends; one can only hope they will use the Twitter pulpit to pursue national interests rather than personal ones.

Another controversial president who can illuminate the unprecedented nature of the Trump presidency is George W. Bush. Although few people draw comparisons between the two, Bush, like Trump, was plagued by historically low approval ratings and controversies, from his decision to invade Iraq in 2003 to his handling of the US economy in the wake of the 2008 financial crisis. Are both presidents destined to be remembered harshly?

The Bush and Trump presidencies could not have been more different. Bush, though awkward in conducting foreign policy, more plausibly rooted his intentions in what he believed was the morally righteous thing to do. Trump, by contrast, was a tactician who applied unconventional methods in pursuit of his own political gain, regardless of the nation’s interest. In Bush’s case, it was his policies that were out of touch with reality. He was simply not savvy enough to understand the political and military complexity of invading the Middle East. Though he perhaps had a point about the danger of terrorism, given the shock the nation endured on 9/11, his misjudgment in invading Iraq, a nation for which there was no credible evidence of possessing weapons of mass destruction, was of his own making. Like Trump, he handicapped himself by politicizing his own intelligence agencies, and the nation paid the price. Unlike Trump, Bush also paid the political costs.

Bush was often depicted as a “war criminal” for the destabilization of the entire Middle East. In retrospect, at least, it seems that Bush was reacting to a genuine national emergency. Based on his course of action, we can assume that Bush was simply inept. The nation suffered from the opposite problem with Donald Trump, who apparently never acted in the interest of the nation but was so adept at controlling media narratives that he remains king of the Republican Party (where is George Bush these days?). By repeatedly calling the news media “fake news,” he discredited negative stories, a tactic that is effective in a rational-choice framework if we disregard its broader implications.

If presidents should be judged by how rationally and effectively they pursue their political interests within a limited set of options, then it should be noted that while Bush reacted out of proportion to the crises he inherited, he did not necessarily use those crises to his own advantage. Bush used the resources of his office in a more traditional sense, though at the time many thought his tactics, from the opening of Guantanamo Bay to the invasion of Iraq, approached the outer limits of presidential power. And though many might argue that his 2004 reelection indicated he had successfully sold his “wartime president” status, the victory only prolonged the festering of a crisis in Iraq that he had manufactured himself.

More money and time were wasted in the Middle East, creating a financial drain on the country that cemented his status as a controversial president, or “war criminal,” by the time he left office in 2009. It should also be noted that these were arguably bad political tactics: though Bush won reelection in 2004 as a “wartime president,” he left office with low approval and saw his own party move away from his leadership through the Tea Party. Trump, on the other hand, while unsuccessful in winning reelection, used a new method of conducting the presidency that made every scandal serve his own personal interest, retaining the loyalty of his base and command of the Republican Party.

Compared to Bush, Trump played the role of the presidency unconventionally, manufacturing crises to his own advantage and completely changing the way the presidency is conducted and, possibly, basic expectations about its function. However much controversy Bush stirred, his legacy nonetheless pales in comparison to Trump’s. And yet the Trump presidency might be an inflection point for the country, and a moment for historians to recalibrate how they judge future presidents.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179727 https://historynewsnetwork.org/article/179727 0
The Roundup Top Ten for March 26, 2021

I Don’t Want My Role Models Erased

by Elizabeth Becker

The work of women journalists covering the war in Vietnam has been obscured in remembrance of the war and its place in American history and culture. The author seeks to recover the stories of Frances FitzGerald, Kate Webb and Catherine Leroy.

Can a Grand Bargain Empower Amazon’s Workers and Limit Corporate Power?

by Nelson Lichtenstein

"Unions are weaker today than they were in the 1930s, but the idea that wages have to rise and democracy has to be revitalized, in the workplace and beyond, is returning in an echo of that era."

Letters From an American: March 23, 2021

by Heather Cox Richardson

Beginning in the 1970s, the National Rifle Association evolved into a political lobbying organization increasingly enmeshed with the conservative movement. Two recent mass shootings are a tribute to the organization's success. Congratulations.

The Battle Against D.C. Statehood is Rooted in Anti-Black Racism

by Kyla Sommers

"The continued power of Congress over the District’s affairs is rooted in this same fear of Black power and racist belief that a majority-non-White populace is incapable of independently governing itself."

The Immovable AMLO

by Humberto Beck, Carlos Bravo Regidor and Patrick Iber

"AMLO continues to decry the faults of neoliberalism, but his government is, for the most part, failing to build an effective alternative to it."

How the U.S. Tax Code Privileges White Families

by Dorothy A. Brown

The history of the married-filing-jointly tax return is one of affluent white families securing advantages through the tax code that working class families, including most Black taxpayers, were unable to realize. After the expansion of income taxation during World War II, this disparity became a significant source of inequality. 

We Need Social Science, Not Just Medical Science, to Beat the Pandemic

by Nicholas Dirks

"In order to ensure that scientific advances work not just to create new medicines but to help lead to a healthier and more just world, we need to ensure that science and social science work hand in hand as well."

The Nazi-Fighting Women of the Jewish Resistance

by Judy Batalion

"I was raised in a community of Holocaust survivors and had earned a doctorate in women’s history. Why had I never heard these stories?"

Medical Racism has Shaped U.S. Policies for Centuries

by Deirdre Cooper Owens

Medical racism is as old as America, and the COVID-19 pandemic has been no exception in terms of unequal vulnerability to disease. 

The Triangle Fire and the Fight for $15

by Christopher C. Gorham

The Triangle Shirtwaist fire inspired workplace safety regulation and advanced the cause of organized labor. It's time to remember the victims with a commitment to a federal living wage law.

]]>
Sat, 17 Apr 2021 04:43:05 +0000 https://historynewsnetwork.org/article/179718 https://historynewsnetwork.org/article/179718 0